Data Engineer (Experienced)

Salesforce, Inc.

Job Description

Job Category

Products and Technology

Job Details

If you are currently in college/grad school or have less than 2 years of experience, please check out FutureForce job opportunities at Salesforce:

https://www.salesforce.com/company/careers/university-recruiting/

Salesforce has a number of teams hiring Data Engineers with a variety of experience across, but not limited to, the following areas:

  • Legal
  • Data Intelligence
  • Monetization Strategy
  • Marketing
  • Data Science

Each team is made up of data scientists, engineers, growth analysts, and information management experts who are dedicated to driving product strategy with data-driven insights. Teams partner with executives, product managers, designers, developers, user researchers, marketers, and sales strategy team members across all Cloud businesses to discover new opportunities for growth and optimization, experiment with data, drive adoption, and provide useful insights that shape product strategy.

Open to Fully Remote, Flex (1-3 days/week in the office), or Office-Based (4-5 days/week in the office)

Role Description:

A Data Engineer is responsible for designing, developing, and maintaining all parts of the data pipeline to build curated datasets that drive insights through data science, reporting, and analytics.

Depending on the team, the role requires partnership with Data Scientists, Software Engineers, Data Analysts, and Information Management experts within Salesforce. This role involves making an impact by driving continuous improvements in moving, aggregating, profiling, sampling, testing, and analyzing terabytes of data.

Depending on the team, responsibilities may include:

  • Be responsible for the technical solution design, lead the technical architecture and implementation of data acquisition and integration projects, both batch and real time. Define the overall solution architecture needed to implement a layered data stack that ensures a high level of data quality and timely insights.
  • Communicate with product owners and analysts to clarify requirements. Craft technical solutions and assemble design artifacts (functional design documents, data flow diagrams, data models, etc.).
  • Build data pipelines using data processing tools and technologies, both open source and proprietary.
  • Serve the team as a subject-matter expert and mentor for ETL design and other related big data and programming technologies.
  • Identify incomplete data, improve data quality, and integrate data from multiple sources.
  • Proactively identify performance & data quality problems and drive the team to remediate them. Advocate architectural and code improvements to the team to improve execution speed and reliability.
  • Design and develop tailored data structures.
  • Rework prototypes into production-ready data flows.
  • Support Data Science research by designing, developing, and maintaining all parts of the Big Data pipeline for reporting, statistics, machine learning, and computational requirements.
  • Perform data profiling, sampling, statistical testing, and reliability testing on data.
  • Clearly articulate pros and cons of various technologies and platforms in open source and proprietary products. Implement proof of concept on new technology and tools to help the organization pick the best tools and solutions.
  • Champion operational excellence and continuous improvement with a can-do leadership attitude.
  • Apply strong SQL optimization and performance-tuning experience in a high-volume data environment that uses parallel processing. Teams use the following: SQL, Python, Airflow, AWS, Spark, Tableau, and Hadoop.
  • Participate in the team’s on-call rotation to address complex problems in real time and keep services operational and highly available.
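To make the pipeline-building responsibilities above concrete, here is a minimal, self-contained ETL sketch in Python using the standard-library sqlite3 module. All table and column names are illustrative assumptions, not Salesforce internals:

```python
import sqlite3

# Extract: raw records as they might arrive from an upstream API or file.
raw_events = [
    {"user_id": 1, "product": "Sales Cloud", "minutes": 30},
    {"user_id": 2, "product": "Sales Cloud", "minutes": 45},
    {"user_id": 1, "product": "Marketing Cloud", "minutes": 10},
    {"user_id": 3, "product": "Sales Cloud", "minutes": None},  # incomplete record
]

# Transform: drop incomplete rows before loading.
clean_events = [e for e in raw_events if e["minutes"] is not None]

# Load: write the curated rows into a SQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage (user_id INTEGER, product TEXT, minutes INTEGER)")
conn.executemany(
    "INSERT INTO usage VALUES (:user_id, :product, :minutes)", clean_events
)

# Serve: a curated aggregate for downstream reporting and analytics.
usage_by_product = dict(
    conn.execute(
        "SELECT product, SUM(minutes) FROM usage GROUP BY product ORDER BY product"
    ).fetchall()
)
# usage_by_product -> {"Marketing Cloud": 10, "Sales Cloud": 75}
```

In production, each stage would typically be a separate task orchestrated by a scheduler such as Airflow, with the transform and quality-gate logic expressed in SQL or Spark rather than in-memory Python.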

Job Requirements:

  • BS/MS degree in Computer Science, Engineering, Mathematics, Physics, or equivalent/related degree.
  • Build programmatic ETL pipelines with SQL-based technologies and platforms.
  • Solid understanding of databases and experience working with complex datasets.
  • Perform data governance, verification, and data documentation using current and future tools and platforms.
  • Work with different technologies (Python, shell scripts) and translate logic into well-performing SQL.
  • Perform tasks such as writing scripts, web scraping, and retrieving data from APIs.
  • Automate data pipelines using scheduling tools like Airflow.
  • Be prepared for changes in business direction and understand when to adjust designs.
  • At least 3 years of expert-level experience with SQL.
  • At least 3 years of experience with the AWS ecosystem.
  • Experience writing production-level SQL code and a good understanding of data engineering pipelines.
  • 2+ years of Python development experience using Pandas.
  • Experience with the Hadoop ecosystem and similar frameworks.
  • Previous projects should demonstrate technical leadership with an emphasis on data lakes, data warehouse solutions, business intelligence, big data analytics, and enterprise-scale custom data products.
  • Knowledge of data modeling techniques and high-volume ETL/ELT design.
  • Experience with version control systems (GitHub, Subversion) and deployment tools (e.g., continuous integration) required.
  • Experience working with public cloud platforms such as AWS or GCP, or data platforms such as Snowflake.
  • Familiarity with scrum/agile project management methodologies and SDLC stages required.
  • Hands-on knowledge of Salesforce.com products and functionality is a plus.
  • Ability to work effectively in an unstructured, fast-paced environment, both independently and in a team setting, with a high degree of self-management, clear communication, and commitment to delivery timelines.
  • Strong problem-solving skills, acute attention to detail, and the ability to meet tight deadlines and project plans.
  • Ability to research, analyze, interpret, and produce accurate results within reasonable turnaround times, with an iterative, rapid-prototyping mindset.
  • Tableau experience is preferred.
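The data-quality and profiling expectations above (identifying incomplete data, verifying reliability) can be sketched in plain Python. Column names and the threshold below are illustrative assumptions:

```python
from collections import Counter

# Sample rows with a mix of present and missing values.
rows = [
    {"account_id": "A1", "industry": "Tech", "arr": 1200},
    {"account_id": "A2", "industry": None, "arr": 800},
    {"account_id": "A3", "industry": "Retail", "arr": None},
    {"account_id": "A4", "industry": None, "arr": 500},
]

def null_rates(rows):
    """Return the fraction of missing (None) values per column."""
    missing = Counter()
    for row in rows:
        for col, val in row.items():
            if val is None:
                missing[col] += 1
    return {col: missing[col] / len(rows) for col in rows[0]}

rates = null_rates(rows)
# rates -> {"account_id": 0.0, "industry": 0.5, "arr": 0.25}

# A simple quality gate: flag columns whose null rate exceeds a threshold.
THRESHOLD = 0.3  # illustrative cutoff
flagged = sorted(col for col, r in rates.items() if r > THRESHOLD)
# flagged -> ["industry"]
```

In practice, checks like this would run as automated tasks in the pipeline (e.g., scheduled in Airflow), with flagged columns raising alerts rather than silently passing downstream.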

Company Info.

Salesforce, Inc.

Salesforce is an American cloud-based software company headquartered in San Francisco, California. It provides customer relationship management (CRM) service and also provides enterprise applications focused on customer service, marketing automation, analytics, and application development.

  • Industry
    Consulting, Cloud computing, Computer software
  • No. of Employees
    73,541
  • Location
    Salesforce Tower, Mission Street, San Francisco, California, USA


Salesforce, Inc. is currently hiring for Data Engineer jobs in Austin, TX, USA with an average base salary of $120,000 - $190,000 per year.
