Apache Airflow, Dask (Python library), data pipelines, data science techniques, databases, GitHub, infrastructure as code, machine learning techniques, NumPy, Python programming, Spark programming, teamwork
We are a young startup looking for a passionate candidate who can contribute to our mission to deploy impactful solutions to the most important issue of our generation. Our focus is on developing climate adaptation & resilience tools for multiple sectors, with agriculture as our initial market. We are looking for a Data Engineer to join the team designing robust and scalable data processes from the ground up. Experience with infrastructure is a bonus.
This person will help the Data Engineering group at ClimateAi build pipelines that ingest a variety of weather and climate datasets and make them available in useful ways to our customers. We work with a range of data types, from hyper-structured n-dimensional data at terabyte scale to heterogeneous sensor data, and need to deliver high-quality information both to data scientists (ML research) and to operational products. This individual should be able to work remotely and be supremely self-driven, with a strong sense of teamwork and collaboration. The ideal candidate will have a deep passion for designing and building data systems from the ground up, and an unwavering commitment to quality.
The ideal candidate will be comfortable working on open-ended projects, both individually and in a fast-paced team environment. We hire people who are collaborative, adaptable, communicate well, and love to learn. Expect to give and receive constructive criticism, as we are constantly re-evaluating our hypotheses and our products.
MAIN RESPONSIBILITIES:
DESIRED SKILLS AND EXPERIENCE:
THE FOLLOWING SKILLS ARE CONSIDERED STRONG PLUSES BUT ARE NOT REQUIRED:
WHAT WE OFFER YOU:
ClimateAi is the world's first enterprise climate resilience tech platform, specializing in adaptation for agriculture, food, and beverage companies.