Job Description

As the Data Engineer, you will support an intelligent and scalable platform for driving rich, measurable outcomes for our vendor partners. It will be effective, efficient, and exciting. It will be the most critical place where our vendors engage with Ingram Micro and will serve as their window into all available channels. The platform will provide an exceptional experience, incremental efficiencies, and enhanced value creation.

The Data Engineer is responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The Data Engineer will also design and develop algorithms or technical solutions to deliver product requirements. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, architects, and product managers on data initiatives and will ensure that the data delivery architecture remains consistent and optimal. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing Ingram Micro’s data architecture to support our next generation of products and data initiatives.

This role will also be responsible, jointly with Product Managers, for ensuring that proper documentation is created for each requested system change and that adequate change control, rigorous testing, and training are completed before changes move into production. This position will contribute to improving the ability of business teams to adopt, support, and promote Ingram Micro capabilities. In addition, this position will contribute to continuous improvement initiatives in data-based business solutions and business self-service.

Create and maintain optimal data pipeline architecture

  • Gather and process large, complex, raw data sets at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.) that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
  • Work with stakeholders including the Business, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Create data tools for analytics and engineering team members that assist them in building and optimizing our product into an innovative industry leader.
  • Act as a data expert, striving for greater functionality in our data systems.
  • Provide on-call Data Management support for implementation and support activities.

Data solutions development

  • Design, develop, and deliver world-class algorithmic data artifacts, including documentation and code; coordinate algorithm development with infrastructure development.
  • Work closely with our engineering and cross-functional IT teams to integrate your amazing innovations and algorithms into our products.
  • Research and apply advanced algorithms and methods involving data mining, statistical analysis, and machine learning techniques.
  • Process unstructured data into a form suitable for analysis – and then do the analysis.
  • Support business decisions with ad hoc analysis as needed.
  • Master third-party systems and interfaces, including the data available from those parties, the APIs used to obtain the data, and the limitations of those interfaces.
  • Apply subject matter expertise in designing algorithms and business logic to automate commerce process flows.
  • Apply your broad-based data development expertise to create practical and innovative solutions.
  • Efficiently implement clean, maintainable, and testable data solutions that are highly available, blazing fast, and fault tolerant.
  • Participate in agile and SDLC project execution and provide accurate work effort estimates.
  • Apply excellent communications skills, creativity, and practical knowledge to benefit our customers.

What you bring to the role:

  • Education: Bachelor's degree in Computer Science or a relevant science or math discipline with an IT emphasis is required.
  • 4 years of relevant technical experience, with at least 5+ years of experience in web services development and middleware applications; or a master’s degree plus 5–7 years of technical experience.
  • 3 years of business functional experience
  • Advanced working knowledge of SQL, experience working with relational databases and query authoring, and working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets.
  • Experience with ingestion of external third-party data
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • A plus: experience with Google Cloud Platform and its tech stack (Cloud Storage, Cloud Pub/Sub, BigQuery)
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with cloud platforms and their tools: Google Cloud, AWS, and Microsoft Azure
  • Experience with relational SQL and NoSQL databases, including Postgres, Cloud SQL, and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with stream-processing systems: Storm, Spark Streaming, Striim, etc.
  • Experience with object-oriented/functional scripting languages: Python, Java, C#, etc.
  • Experience with ETL tools: Informatica, MS SSIS, SAP Data Services, etc.
  • Experience with secure cloud services platforms for data management and integration.

Company Info.

Ingram Micro Inc.

Ingram Micro is an American distributor of information technology products and services. The company is based in Irvine, California, U.S. and has operations around the world.

Ingram Micro Inc. is currently hiring for Data Engineer positions in Irvine, CA, USA, with an average base salary of $120,000 – $190,000 per year.
