Principal Data Engineer

Mastercard Inc.

Job Description

The Data Warehouse team is looking for a Big Data Lead Engineer to drive our mission to unlock the potential of data assets by consistently innovating, eliminating friction in how users access data from its Big Data repositories, and enforcing standards and principles in the Big Data space. The role is responsible for defining and maintaining the data architecture and the associated data maintenance, integration, and load processes for the organization. The Hadoop Data Warehouse is used by various groups to derive insights and perform Machine Learning and Data Science activities, which in turn support various revenue-generating initiatives across the organization.

Role

  • Develop high-quality, secure, and scalable data pipelines using Spark with Scala/Python/Java on Hadoop or object storage (an illustrative sketch of such a pipeline follows this list).
  • Take a lead consultative position in complex initiatives of strategic importance (e.g., cross-functional or cross-geography), including implementing the most appropriate data solutions for individual applications.
  • Design and architect data flow schemes in the Hadoop environment that are scalable, repeatable, and eliminate time-consuming steps.
  • Drive automation and efficiency in data ingestion, data movement, and data access workflows through innovation and collaboration.
  • Understand, implement, and enforce software development standards and engineering principles in the Big Data space.
  • Work closely with business stakeholders and engineering teams embedded within business units to help them build scalable products quickly.
  • Leverage new technologies and approaches to innovate with increasingly large data sets.
  • Work with the project team to meet scheduled due dates, while identifying emerging issues and recommending solutions.
  • Perform assigned tasks and production incident resolution independently.
  • Contribute ideas to help ensure that required standards and processes are in place and actively look for opportunities to enhance standards and improve process efficiency.
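
For illustration only, and not part of the role description: a minimal sketch of the kind of Spark batch pipeline referenced above, written in Scala. The HDFS landing path, column names, and Hive target table are hypothetical placeholders.

    // Illustrative sketch only; paths, columns, and table names are hypothetical.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object TransactionIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("transaction-ingest")
          .enableHiveSupport()
          .getOrCreate()

        // Read raw CSV files landed on HDFS.
        val raw = spark.read
          .option("header", "true")
          .csv("hdfs:///landing/transactions/")

        // Basic cleansing: drop malformed rows and normalise column types.
        val cleaned = raw
          .filter(col("transaction_id").isNotNull)
          .withColumn("amount", col("amount").cast("decimal(18,2)"))
          .withColumn("txn_date", to_date(col("txn_date"), "yyyy-MM-dd"))

        // Write partitioned Parquet into the warehouse zone as a Hive table.
        cleaned.write
          .mode("overwrite")
          .partitionBy("txn_date")
          .format("parquet")
          .saveAsTable("dw.transactions_cleaned")

        spark.stop()
      }
    }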

All About You

  • 10-16 years of experience in Data Warehouse related projects in a product- or service-based organization
  • Expertise in Data Engineering and in implementing multiple end-to-end DW projects in a Big Data Hadoop environment
  • Experience building data pipelines with Spark using Scala/Python/Java on Hadoop or object storage
  • Experience working with databases such as Oracle and Netezza, with strong SQL knowledge
  • Experience working on real-time data flow systems; NiFi and Kafka will be an added advantage
  • Experience automating data flow processes in a Big Data environment
  • Experience working in Agile teams
  • Strong analytical skills for debugging production issues, providing root cause analysis, and implementing mitigation plans
  • Strong verbal and written communication skills, along with strong relationship, collaboration, and organizational skills
  • Ability to multi-task across multiple projects, interface with external and internal resources, and provide technical leadership to junior team members
  • High-energy, detail-oriented, and proactive, with the ability to function under pressure independently and a high degree of initiative and self-motivation to drive results
  • Ability to quickly learn and implement new technologies and to perform POCs to explore the best solution for a given problem statement
  • Flexibility to work as a member of matrixed, diverse, and geographically distributed project teams
  • Successfully led complex cross-functional domain initiatives; expected to set project direction and anticipate and resolve bottlenecks
  • Experience participating in a major automation and/or cloud delivery effort, including supporting financial decision-making
  • Experience driving deliverables within the global database technology domains and sub-domains; expected to partner with leaders in the technical community and derive new solutions
  • Proven ability to serve as a subject matter expert and trusted advisor on future-proofing Big Data infrastructure and business strategies for cloud and on-premises environments

Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:

  • Abide by Mastercard’s security policies and practices;
  • Ensure the confidentiality and integrity of the information being accessed;
  • Report any suspected information security violation or breach; and
  • Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

Company Info.

Mastercard Inc.

Mastercard Inc. is an American multinational financial services corporation headquartered in the Mastercard International Global Headquarters in Purchase, New York. The Global Operations Headquarters is located in O'Fallon, Missouri, a municipality of St. Charles County, Missouri.

  • Industry
    Financial services
  • No. of Employees
    24,000
  • Location
    Purchase, Harrison, NY, USA

Mastercard Inc. is currently hiring for Principal Data Engineer positions in Pune, Maharashtra, India, with an average base salary of ₹90,000 - ₹250,000 per month.
