Big Data Engineer - Ho Chi Minh

GeoComply

Job Description

This role sits within the Big Data team and works closely with business analysts, developers, and product owners.

The primary responsibility is to contribute to GeoComply’s success by collecting, storing, processing, and analyzing very large data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

The role’s overall focus is to implement world-class, scalable data storage that improves the overall performance of GeoComply products.

We are looking for an experienced Big Data Engineer to join the team and help us build world-class, scalable, and efficient data storage that improves the overall performance of GeoComply products. If you enjoy working with large data sets, finding best-in-class solutions to satisfy business requests, and solving challenging problems, you are very welcome!

As a Big Data Engineer with GeoComply, you will help design and develop a world-class data management platform. You will maintain open communication with your team members and cross-functional stakeholders.

What You Will Be Doing:

  • Selecting and integrating the Big Data tools and frameworks required to provide requested capabilities.
  • Implementing ETL processes that move data from OLTP databases to an OLAP database and a Data Lake using event streaming platforms such as Kafka (see the sketch after this list).
  • Developing and maintaining robust data pipelines that transform large data sets and support various use cases with high performance.
  • Monitoring performance and advising on any necessary infrastructure changes.
  • Defining data retention and data governance policies and frameworks.
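
As one concrete illustration of the ETL work above, the sketch below shows a minimal PySpark Structured Streaming job that reads change events from a Kafka topic, parses them, and appends them to a data-lake path. This is only a hedged sketch under stated assumptions, not GeoComply's actual pipeline: the broker address, topic name, event schema, and storage paths are hypothetical, and running it assumes Spark with the Kafka connector package available.

    # Minimal sketch of a streaming ETL step: Kafka -> parse -> data lake.
    # All names (broker, topic, schema, paths) are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    spark = SparkSession.builder.appName("oltp-to-lake-etl").getOrCreate()

    # Schema of the hypothetical change events published by the OLTP side.
    event_schema = StructType([
        StructField("order_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("created_at", TimestampType()),
    ])

    # Read raw change events from Kafka (requires the spark-sql-kafka package).
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
           .option("subscribe", "orders")                     # assumed topic
           .load())

    # Decode the message value and project the fields needed downstream.
    parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(from_json(col("json"), event_schema).alias("e"))
              .select("e.*"))

    # Append the transformed records to the lake as Parquet; Delta Lake is a
    # drop-in replacement for the format where its libraries are installed.
    query = (parsed.writeStream
             .format("parquet")
             .option("path", "s3a://example-lake/orders/")    # assumed path
             .option("checkpointLocation", "s3a://example-lake/_chk/orders/")
             .outputMode("append")
             .start())

    query.awaitTermination()

The same pattern extends to multiple topics and to an OLAP sink, with Kafka providing replayable ingestion between the OLTP source and the analytical stores.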

Bonus points if you:

  • Have experience with Delta Lake technology.
  • Have good working knowledge of the ELK stack (Elasticsearch, Kibana).

About You:

  • At least 3-5 years of experience with the Java and Python programming languages.
  • At least 3-5 years of experience with Big Data frameworks such as Java Spring, Kafka Streams, and Spark Streaming.
  • 1-2 years of experience with one of the following cloud analytics platforms: AWS, Azure, or GCP.
  • Experience with AWS cloud services such as S3, Glue, Kinesis, Redshift, and SageMaker preferred.
  • Experience with the Neo4j graph database management system preferred.
  • Experience with big data and streaming data frameworks (Hadoop/PySpark/Kafka or equivalent) preferred.
  • Experience with large-scale deployment and performance tuning.
  • Experience with schema design and dimensional data modeling.
  • Experience with relational and non-relational databases (e.g., MySQL, MongoDB).
  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Fluent written and spoken English.
  • Strong analytical and problem-solving skills.

Company Info.

GeoComply

GeoComply was founded in 2011 and provides fraud prevention and cybersecurity solutions that detect location fraud and help verify a user's true digital identity. Our award-winning products are based on the technologies developed for the highly regulated and complex U.S. online gaming and sports betting market, as well as streaming video broadcasters and the online banking, payments and cryptocurrency industries.

  • Industry
    Information Technology, Cybersecurity
  • No. of Employees
    450
  • Location
    Vancouver, BC, Canada


GeoComply is currently hiring for Big Data Engineer jobs in Ho Chi Minh City, Vietnam, with an average base salary of ₫18,400,000 - ₫48,600,000 / Month.
