Data Ops Developer

Geotab Inc.

Job Description

Geotab is advancing security, connecting commercial vehicles to the internet and providing web-based analytics to help customers better manage their fleets. Geotab’s open platform and Geotab Marketplace®, offering hundreds of third-party solution options, allow both small and large businesses to automate operations by integrating vehicle data with their other data assets. Processing billions of data points a day, Geotab leverages data analytics and machine learning to improve productivity, optimize fleets through the reduction of fuel consumption, enhance driver safety and achieve strong compliance with regulatory changes.

Our team is growing and we’re looking for people who follow their passion, think differently and want to make an impact. Ours is a fast-paced, ever-changing environment. Geotabbers accept that challenge and are willing to take on new tasks and activities - ones that may not always be described in the initial job description. Join us for a fulfilling career with opportunities to innovate, great benefits, and our fun and inclusive work culture. Reach your full potential with Geotab. To see what it’s like to be a Geotabber, check out our blog and follow us @InsideGeotab on Instagram. Join our talent network to learn more about job opportunities and company news.

Who you are:

We are always looking for amazing talent who can contribute to our growth and deliver results! Geotab is seeking a Data Ops Developer to develop and maintain data pipelines. If you love technology and are keen to join an industry leader, we would love to hear from you!

What you'll do:

The Data Ops Developer is the foundation of Geotab’s data engineering capabilities. They are responsible for developing scalable and reusable pipelines that minimize the time from insight to production. In this role, they collaborate continuously with data analysts and data scientists to design innovative pipelines using new tools and frameworks. They also work closely with the Data Strategist and Data Quality experts to ensure all data assets are delivered with the highest quality, with the right schema and in the right location.

How you'll make an impact:

  • Design, optimize and maintain SQL queries used for creating data products.
  • Advance your SQL knowledge to ensure queries leverage the most recent language features and optimizations.
  • Deploy and maintain ETL/ELT pipelines using SQL, Python and Airflow.
  • Design and publish reusable pipeline templates (e.g. templates for data integration, derived metrics, reporting, custom runs).
  • Collaborate with data analysts and data scientists to develop complex pipelines involving big data tools (e.g. Spark, Apache Beam, GKE, Kafka, Docker).
  • Lead optimization of pipelines based on requirements and pipeline performance.
  • Contribute to the development of data integration connectors to extract external data.
  • Manage pipeline releases through Git and CI/CD.
  • Ensure metadata is captured and stored across the pipeline lifecycle (i.e. creation, execution, deprecation/update).
  • Support remediation of issues within production pipelines.
  • Collaborate with data quality analysts and specialists to ensure all pipelines include automatic quality checks.
  • Recommend features and enhancements to infrastructure and pipeline framework.
  • Contribute to the migration of data assets and pipelines from legacy data structures.
  • Participate in a 24x7 on-call rotating schedule.

What you'll bring to the role:

  • Post-secondary degree with a specialization in Computer Science, Software or Computer Engineering, or a related field.
  • 3-5 years of experience in Data Engineering or a similar role.
  • 3-5 years of experience with SQL.
  • 3-5 years of experience building ETL/ELT production pipelines in Python.
  • Knowledge of data management fundamentals and data modeling principles.
  • Experience with Big Data environments (e.g. Google BigQuery) is an asset.
  • Experience with CI/CD processes and tools, such as GitLab runners or Jenkins, is required.
  • Knowledge of workflow orchestration tools is required (e.g. Apache Airflow).
  • Knowledge of Linux and command line commands is an asset.
  • Experience working in a cloud based infrastructure, especially Google Cloud Platform, is an asset.
  • Previous experience with Python package development is highly regarded.
  • Excellent oral and written communication skills.
  • Strong analytical skills with the ability to solve problems and make well-judged decisions.
  • Highly organized and able to manage multiple tasks and projects simultaneously.
  • Entrepreneurial mindset and comfortable in a flat organization.
  • Must stay current with technology and have the flexibility to adapt to growing technology and market demands.
  • Strong team player with the ability to engage with all levels of the organization.

If you got this far, we hope you're feeling excited about this role! Even if you don't feel you meet every single requirement, we still encourage you to apply.

Please note:

Geotab does not accept agency resumes and is not responsible for any fees related to unsolicited resumes. Please do not forward resumes to Geotab employees.

For roles requiring access to the customer database, candidates must have resided in the US for at least the last three years, and must successfully pass a comprehensive background check, which includes a drug test, as well as a credit check.

Why job seekers choose Geotab

Flex working arrangements

Home office reimbursement program

Baby bonus & parental leave top up program

Online learning and networking opportunities

Electric vehicle purchase incentive program

Competitive medical and dental benefits

Retirement savings program

The above are offered to full-time permanent employees only

How we work

At Geotab, we have adopted a flexible hybrid working model, with systems, functions, programs and policies in place to support both in-person and virtual work. However, you are welcomed and encouraged to come into our beautiful, safe, clean offices as often as you like. When working from home, you are required to have a reliable internet connection of at least 50 Mbps download / 10 Mbps upload. Virtual work is supported with cloud-based applications, collaboration tools and asynchronous working. The health and safety of employees is a top priority. We encourage work-life balance and keep the Geotab culture going strong with online social events, chat rooms and gatherings. Join us and help reshape the future of technology!

Company Info.

Geotab Inc.

Geotab Inc. is a privately held company that provides telematics hardware solutions, which it presents as Internet of Things devices. These devices feed its software-as-a-service analytics platform.

  • Industry
    Telematics
  • No. of Employees
    500
  • Location
    Oakville, ON, Canada


Geotab Inc. is currently hiring for this role in Atlanta, GA, USA, with an average base salary of $126,000 - $246,300 per year.
