Advanced Data Engineer

Honeywell International

Job Description

Honeywell is charging into the Industrial IoT revolution with the establishment of Honeywell Connected Enterprise (HCE), building on our heritage of invention and deep, on-the-ground industry expertise. HCE is the leading industrial disruptor, building and connecting software solutions to streamline and centralize the assets, people and processes that help our customers make smarter, more accurate business decisions. Moving at the speed of software, we are creating, innovating and delivering solutions fast, challenging the way things have always been done, piloting new ways for all of us to work, and expecting our successes to set new standards for our customers and for Honeywell.

Join a company that is transforming from a traditional industrial company to a contemporary digital industrial business, harnessing the power of cloud, big data, analytics, the Internet of Things, and design thinking. You will lead change that brings value to our customers, partners, and shareholders through the creation of innovative software and data-driven products and services. You will work with customers to identify their high-value business questions and dig through their data to find answers. You will also be responsible for working within Honeywell to identify opportunities for new growth and efficiency based on data analysis.

JOB ACTIVITIES

As an Advanced Data Engineer, you will be part of a team that delivers contemporary analytics solutions for all Honeywell business groups and functions. You will build strong relationships with leadership to effectively deliver contemporary data analytics solutions and contribute directly to business success. You will develop solutions on various database systems such as Databricks, Hive, Hadoop, and PostgreSQL.

You will identify and implement process improvements, and because you don't like to do the same thing twice, you will automate it if you can. You are always keeping an eye on scalability, optimization, and process. You have worked with Big Data before: IoT data, SQL, Azure, AWS, and a bunch of other acronyms.

You will work on a team that includes scrum masters, product owners, data architects, data engineers, data scientists, and DevOps engineers. You and your team will collaborate to build products from the idea phase through launch and beyond. The software you write makes it to production in a couple of sprints. Your team will be creating a new platform, drawing on your experience with APIs, microservices, and platform development.

YOU MUST HAVE

  • Bachelor's degree in Computer Science, Engineering, Applied Mathematics or related field
  • Minimum of 6 years of data engineering experience
  • Minimum of 4 years of experience developing and deploying complex big data ingestion jobs in Spark/Informatica BDM/Talend, bringing prototypes to production on Hadoop/NoSQL/MPP platforms.
  • Minimum of 4 years of hands-on experience with Spark, Pig/Hive, etc., and automation of data flows using Informatica, Spark, NiFi, or Airflow/Oozie.
  • Minimum of 3 years of experience developing applications that process very large amounts of data (structured and unstructured), including streaming real-time data (Spark, Scala, Kafka, Python, Spark Streaming, or other such tools).
  • Minimum of 2 years of experience working with at least one NoSQL system (HBase, Cassandra, MongoDB, etc.), with in-depth knowledge of schema design to meet requirements effectively.

WE VALUE

  • Hands-on experience with Databricks, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure Data Lake Storage)
  • Effective communication skills and succinct articulation
  • Experience in writing complex SQL statements
  • Experience in working with cloud-based deployments. Understanding of containers & container orchestration (Swarm or Kubernetes). 
  • Experience in building advanced analytics solutions with data from enterprise systems like ERPs, CRMs, Marketing tools etc.
  • Experience with dimensional modeling, data warehousing and data mining
  • Good understanding of branching, build, and deployment practices, and of CI/CD tools such as Octopus and Bamboo
  • Experience working in Agile methodologies and Scrum
  • Knowledge of software best practices, like Test-Driven Development (TDD)
  • Database performance management and API development
  • Technology upgrade oversight
  • Experience with visualization software (Tableau, Spotfire, QlikView, AngularJS, D3.js)
  • Understanding of best-in-class model and data configuration and development processes
  • Experience working with remote and global teams and cross team collaboration
  • Ability to consistently make timely decisions, even in the face of complexity, balancing systematic analysis with decisiveness

Company Info.

Honeywell International

Honeywell International Inc. is an American publicly traded, multinational conglomerate headquartered in Charlotte, North Carolina. It primarily operates in four areas of business: aerospace, building technologies, performance materials and technologies, and safety and productivity solutions.

  • Industry
    Information Technology, Aerospace
  • No. of Employees
    103,000
  • Location
    Charlotte, North Carolina, USA


Honeywell International is currently hiring for Data Scientist jobs in Atlanta, GA, USA, with an average base salary of $120,000 - $190,000 per year.
