Job Description

We are seeking a motivated and skilled Data Engineer to join our team. The ideal candidate is passionate about data engineering, possesses a strong analytical mindset, and demonstrates a keen interest in working with large-scale data systems. In this role, you will assist in designing, developing, and maintaining our data pipelines, ensuring the efficient flow of data between various systems. You will collaborate closely with cross-functional teams, including data scientists, cloud engineers, software engineers, and business analysts, to enable data-driven decision-making and deliver high-quality solutions.

Responsibilities:

  • Assist in designing, developing, and maintaining data pipelines, ensuring smooth data flow from diverse sources into our data warehouse.
  • Collaborate with data scientists and software engineers to implement efficient and scalable data processing solutions.
  • Perform data cleansing, transformation, and aggregation to ensure data integrity and accuracy.
  • Help optimize and tune data pipelines for performance and reliability.
  • Assist in monitoring and troubleshooting data-related issues to ensure smooth operations.
  • Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
  • Stay up-to-date with emerging data engineering technologies and best practices to contribute innovative ideas and improvements.
  • Document data engineering processes, procedures, and standards for future reference.

Education, Requirements and Qualifications

  • Must be clearable for DoD Public Trust or higher
  • Bachelor's degree in computer science, information systems, economics, or a related analytical field. Equivalent practical experience will also be considered.
  • Strong understanding of data engineering concepts and principles (ingestion, standardization, transformation)
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with SQL and relational databases.
  • Familiarity with data modeling and ETL (extract, transform, load) / ELT (extract, load, transform) processes.
  • Basic understanding of distributed computing and big data technologies such as Apache Hadoop, Spark, or Kafka.
  • Knowledge of cloud-based data storage and processing platforms like AWS, Azure, or Google Cloud is a plus.
  • Excellent problem-solving and analytical skills.
  • Strong communication and teamwork abilities.
  • Attention to detail and commitment to delivering high-quality work.
  • Ability to work in a fast-paced environment and handle multiple tasks simultaneously.

Nice to have

  • Familiarity with DevOps scripting and CI/CD tooling (e.g., GitLab CI/CD)
  • AWS Cloud Practitioner or AWS Associate Level Certification
  • Experience automating tasks in cloud or on-premises environments

Company Info.

Credence Management Solutions, LLC

Credence has been consistently rated one of the Top Places to Work (per the Washington Post), is one of the largest privately held government contractors, and is proud to be one of the fastest-growing privately held firms in the U.S. over the last decade (per Inc. 5000). Our ability to perform exceptionally, implementing new solutions, efficiencies, and savings across 220 U.S. Government programs, is due to our obsession with exceeding customer expectations.

  • Industry
    Artificial Intelligence, Cybersecurity
  • No. of Employees
    1,172
  • Location
    Vienna, Virginia, USA
