Key Skills
Apache Airflow, AWS, AWS Glue, Data Modeling, Data Visualization, DICOM, ETL/ELT technologies, Lambda, Machine learning techniques, Oracle, Postgres, Python Programming, SQL
Job Description
Position Responsibilities
- Design and manage large-scale data systems with serverless, event-driven cloud technology such as AWS Glue, AWS Batch, and AWS Lambda
- Work closely with software engineers to build ETL pipelines around our data sources, applications, deployments, and integrations
- Support Machine Learning Engineers and Data Scientists to automate and optimize data discovery, access, and feature enrichment
- Support remote data ETL for deployments to partners and clients
- Collaborate with clinicians, product managers, researchers, regulatory experts, and IT stakeholders to tackle complex problems on a mission-driven product team
- Participate in data modeling, schema design, and SQL development
- Ingest and aggregate data from both internal and external data sources to build our world-class datasets
- Be involved in testing and fixing of new or enhanced solutions for data products and reports, including automating ETL testing
- Assist with the development and review of technical and end user documentation including ETL workflows, research, and data analysis
- Build monitoring dashboards and automate data quality testing
- Own meaningful parts of our service, have an impact, grow with the company
Requirements
Location
- Remote, or Fort Collins (Northern Colorado area) preferred
- Relocation assistance available
Education
- Bachelor's Degree in computer science or related field, or equivalent practical experience
Experience
- 5+ years’ experience in data analytics and/or data management
- Proficient in Python
- Experience in SQL, data modeling and managing databases
- Experience in building and optimizing ETL pipelines
- Experience with AWS or other cloud technology
- Proficient in data analysis and visualization
- Experience working with medical data (DICOM, HL7, EHR)
- Experience with Apache Spark and other big data tools
- Experience working with Apache Airflow
- Experience working with Postgres, Oracle, and MS SQL Server
ISO/QMS Requirements
- Understand and practice all requirements of EN ISO 13485:2016 and ISO 13485:2016 (MDSAP), including 21 CFR 820, the QMS Manual, Process Flows, and Work Instructions. Experience with IEC 62304:2006 is a plus.
- Comply with applicable regulatory requirements (including but not limited to MDSAP participating countries and CE Marking).
- Support Internal and External audits.
Benefits
Our people love working here because they are challenged by unique and difficult problems in a nimble startup environment, backed by more than five years of proven performance.
- Health, vision, and dental insurance
- Company paid Short Term Disability
- Company paid Long Term Disability
- Company paid employee Life Insurance
- Up to 6% 401k match
- Unlimited PTO
- $150 monthly health and fitness stipend
Company Info.
Enlitic
Enlitic is a healthcare technology company that specializes in deep learning and artificial intelligence (AI) for medical image analysis. Enlitic uses deep learning algorithms to analyze medical images such as X-rays, CT scans, and MRIs. The company's AI software can detect anomalies and identify patterns in medical images, allowing doctors to make more accurate diagnoses and treatment decisions.
Enlitic is currently hiring for Senior Data Platform Engineer positions in Fort Collins, CO, USA, with an average base salary of $115,000 - $135,000 per year.