Posted on: 7 Jan 2023
Skills: Java Programming, Design of Experiment, Python Programming, Database, SQL, Azure, Scala Programming, Data science techniques, ETL/ELT technologies, NoSQL, Continuous Integration & Continuous Delivery - CI/CD, SSIS
Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will be responsible for supporting the end-to-end (E2E) management of our data, including ETL/ELT, DW/DL, data staging, and data governance, and for managing the different layers of data required to ensure successful BI & Reporting for the PEC. This role will work with multiple types of data spanning multiple functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.
How you will do it
- Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse.
- Design and build ETL/ELT processes with Azure Data Factory (ADF) and/or Python which, once deployed, will run on daily and weekly schedules.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Azure SQL and ADF.
- Develop data models that enable DataViz, Reporting and Advanced Data Analytics, striving for optimal performance across all data models.
- Maintain conceptual, logical, and physical data models along with corresponding metadata.
- Manage the DevOps pipeline deployment model, including automated testing procedures.
- Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules.
- Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities.
- Support the deployment of a global data standard for Logistics.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Support Rate Repository management as required (including Rate Card uploads to our DW).
- Other Procurement duties as assigned.
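To illustrate the kind of ETL/ELT work the responsibilities above describe, here is a minimal, hypothetical Python sketch of one pipeline step: extract raw records, apply a cleansing business rule, and load the result into a staging target. All names and sample data are illustrative, not taken from the posting; in practice the extract and load steps would be wired to source systems and Azure Synapse via ADF.

```python
def extract():
    # Stand-in for a pull from a source system (DB driver, API, or ADF copy activity);
    # here we just return in-memory sample rows.
    return [
        {"supplier": " Acme ", "spend": "1200.50", "category": "MRO"},
        {"supplier": "Beta Co", "spend": None, "category": "Travel"},
    ]

def transform(rows):
    # Example cleansing business rules: trim supplier names,
    # default missing spend to 0.0, and cast spend to float.
    cleaned = []
    for r in rows:
        cleaned.append({
            "supplier": r["supplier"].strip(),
            "spend": float(r["spend"]) if r["spend"] is not None else 0.0,
            "category": r["category"],
        })
    return cleaned

def load(rows, target):
    # Stand-in for a write to a staging table in the data warehouse.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

The same extract/transform/load shape scales up whether the orchestration layer is ADF, SSIS, or plain scheduled Python.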
What are we looking for
- Bachelor’s degree in related field (Engineering, Computer Science, Data Science or similar)
- 4+ years of relevant experience in BI Engineering, data modeling, data engineering, software engineering or other relevant roles.
- Advanced working SQL knowledge and experience working with relational databases.
- Knowledge in DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems and metadata management.
- Experience building and optimizing data pipelines, architectures, and data sets.
- Azure Data Engineering certification preferred (DP-203)
- ETL/ELT development experience (3+ years). SSIS or ADF are preferred.
- Ability to resolve ETL/ELT problems by proposing and implementing tactical/strategic solutions.
- Strong project management and organizational skills.
- Experience with object-oriented and functional scripting languages: Python, Scala, C#, etc.
- Experience with NoSQL databases is a plus to support the transition from On-Prem to Cloud.
- Excellent problem solving, critical thinking, and communication skills
- Relevant experience with Azure DevOps (CI/CD, git/repo management)
- Due to the global nature of the role, proficiency in English is a must.
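As a small illustration of the relational SQL work the requirements above call for, the sketch below uses Python's built-in sqlite3 module as a stand-in for Azure SQL to run a typical data-mart style aggregation (total spend per supplier and category). Table and column names are invented for the example.

```python
import sqlite3

# In-memory SQLite stands in for Azure SQL; the SQL itself is standard.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (supplier TEXT, category TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO spend VALUES (?, ?, ?)",
    [("Acme", "MRO", 1200.5), ("Beta Co", "Travel", 300.0), ("Acme", "MRO", 99.5)],
)

# Aggregate spend per supplier/category, largest totals first.
rows = conn.execute(
    "SELECT supplier, category, SUM(amount) AS total "
    "FROM spend GROUP BY supplier, category ORDER BY total DESC"
).fetchall()
```

The same GROUP BY pattern underpins the data marts and reporting layers mentioned in the role description.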
Johnson Controls International plc is an Irish-domiciled multinational conglomerate headquartered in Cork, Ireland, that produces fire, HVAC, and security equipment for buildings. As of mid-2019, it employed 105,000 people in around 2,000 locations across six continents.
Johnson Controls is currently hiring for Data Engineer jobs in San Pedro Garza García, Nuevo Leon, Mexico, with an average base salary of Mex$37,000 - Mex$77,000 per year.