Data Integration Engineer

Wells Fargo

Job Description

At Wells Fargo, we are looking for talented people who will put our customers at the center of everything we do. We are seeking candidates who embrace diversity, equity and inclusion in a workplace where everyone feels valued and inspired.

Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you.

Technology sets IT strategy; enhances the design, development, and operations of our systems; optimizes the Wells Fargo infrastructure; provides information security; and enables Wells Fargo's global customers to have 24-hours-a-day, 7-days-a-week banking access through in-branch, online, ATM, and other channels.

Our mission is to deliver stable, secure, scalable, and innovative services at speeds that delight our customers and unleash the potential of our employees.

The Enterprise Functions Technology (EFT) group provides technology solutions and support for the Risk, Audit, Finance, Marketing, Human Resources, Corporate Properties, and Stakeholder Relations business lines. In addition, EFT provides unique technology solutions and innovation for Wells Fargo Technology, Enterprise Shared Services, and Enterprise Data Management. This combined portfolio of applications and tools is continually engineered to meet the challenges of stability, security, scalability, and speed.

Within EFT, the Corporate Risk Technology group helps all Wells Fargo businesses identify and manage risk. We focus on three key risk areas: credit risk, operational risk, and market risk. We help our management and Board of Directors identify and monitor risks that may affect multiple lines of business, and we take appropriate action when business activities exceed the company's risk tolerance.

We are seeking a Hadoop Senior Software Engineer and Data Integration Developer (Senior Software Engineer) to join a team that provides reporting data warehouse solutions for Risk and Compliance across all levels of the organization. The successful candidate will apply a strong understanding of data management, the design and construction of dimensional data models, and the end-to-end delivery of data integration solutions. This position will partner with Data Modelers, report developers, and Business Intelligence Analysts to design, develop, and deploy information solutions in the enterprise risk domain.


Primary responsibilities for this role include:

  • Work closely with Data Modelers to structure data requirements into a data warehouse model for reporting and analytics
  • Design, develop, and debug ETL integration code to meet defined source-to-target mappings
  • Handle complex ETL requirements and design

  • Work with Business Stakeholders to build database objects to meet desired output
  • Build, maintain and govern the Hadoop data layer to ensure data consistency and single version of the truth
  • Complete Unit Testing to ensure low probability of defects and that data matches with back-end systems
  • Ensure solutions are highly usable, scalable, and maintainable
  • Understand the impacts of data layer performance factors and collaborate with the Data Architect to implement mitigating physical modeling solutions
  • Perform code and design reviews to ensure performance, maintainability, and standards
  • Work with the Data Architect to define data security roles, groups and policies
  • Enforce Hadoop design standards, reusable objects, tools, best practices, and related development methodologies for the organization
  • Partner with Business Stakeholders and developers to ensure deliverables meet business expectations
  • Suggest solutions for development or process improvements
  • Tune dataset design for optimal performance
  • Develop, execute, and troubleshoot complex report SQL
  • Work comfortably on multiple, complex issues and projects

Required Qualifications

  • 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • 4+ years of ETL (Extract, Transform, Load) Programming experience
  • 3+ years of Hadoop / Big Data development experience
  • 2+ years of Talend ETL development experience
  • 3+ years of Java or Python experience
  • 3+ years of experience with databases such as Oracle, DB2, SQL Server, or Teradata

Desired Qualifications

  • An industry-standard technology certification
  • Strong verbal, written, and interpersonal communication skills
  • A BS/BA degree or higher in science or technology
  • Knowledge and understanding of software development life cycle (SDLC): all phases and types of testing
  • Ability to work effectively in virtual environment where key team members and partners are in various time zones and locations
  • MS SQL server experience
  • Knowledge and understanding of Selenium software testing framework
  • 2+ years of experience with continuous integration development technologies such as IBM uDeploy (UrbanCode Deploy), Jenkins, etc.
  • Knowledge and understanding of GitHub

Company Info.

Wells Fargo

Wells Fargo & Company is an American multinational financial services company with corporate headquarters in San Francisco, California, operational headquarters in Manhattan, and managerial offices throughout the United States and internationally. The company has operations in 35 countries with over 70 million customers globally. It is considered a systemically important financial institution by the Financial Stability Board.


Wells Fargo is currently hiring for Data Integration Engineer jobs in Charlotte, NC, USA, with an average base salary of $120,000 - $190,000 / Year.
