Location: United States
Experience: 4-6 years
Posted: 22 Apr 2024
In this role you will build very large, scalable platforms using cutting-edge data technologies. This is not a "maintain the existing platform" or "make minor tweaks to the current code base" kind of role. We...
Skills: Apache Hive, Apache Kafka, B2B, Effective communication skills, Java Programming, NoSQL, Presto, Python Programming, Scala Programming, SPARK Programming, SQL

Location: Canada
Skills: Apache Flink, Apache Hive, Apache Kafka, B2B, Java Programming, NoSQL, Python Programming, Scala Programming, SPARK Programming, SQL

Location: Lisbon, Portugal
Posted: 21 Apr 2024
The Role
Veeva Link helps the life sciences industry connect with key people to improve research and care. It enables professionals to find the right people for, e.g., clinical trials,...
Skills: Amazon RedShift, AWS, Continuous Integration & Continuous Delivery - CI/CD, Data Engineering, Data Lake, Data pipelines, Data science techniques, DevOps, Google Cloud Platform (GCP), Lakehouse, Machine learning techniques, MLflow, PySpark, Python Programming, SPARK Programming

Location: Barcelona, Spain; London, UK; Zürich, Switzerland
Experience: 2-4 years
Capco is a global technology and management consultancy dedicated to the financial services industry. We combine innovative thinking with unrivalled industry knowledge to offer our clients...
Skills: Ansible, Apache Kafka, AWS, Big Data Technology, Continuous Integration & Continuous Delivery - CI/CD, Data Architecture, Data Lake, Data pipelines, Data science techniques, Design, DevSecOps, Effective communication skills, ETL/ELT technologies, JavaScript, Kubernetes-K8s, Project management, PySpark, Python Programming, R Programming, Scala Programming

Location: Kuala Lumpur, Malaysia
Capco, a Wipro company, is a global technology and management consultancy specializing in driving digital transformation in the financial services industry. With a growing client portfolio...
Skills: Azure, Big Data Technology, Continuous Integration & Continuous Delivery - CI/CD, Data Engineering, Data Modeling, Database, DevOps, Oracle, PostgreSQL, Teradata

Location: İstanbul, Türkiye; Ankara, Türkiye; İzmir, Türkiye
If you are a techie, you belong in our Technology Team, which builds scalable, high-performance platforms for our customers using up-to-date and efficient technologies. We are all working with...
Skills: 3D data processing, Apache Airflow, Apache Cassandra, Apache Flink, Apache Hive, Apache Kafka, Big Data Technology, Data Engineering, DevOps, Docker, Effective communication skills, Git, GitLab, Google Cloud Platform (GCP), Kubernetes-K8s, Linux Operating system, NoSQL, Postgres, REST API, RESTful, Scala Programming, SPARK Programming, SQL, Unix Operating system, Writing

Location: Vietnam
Posted: 20 Apr 2024
Skills: Apache Hadoop, Apache Hive, AWS, Azure, Azure CosmosDB, Big Data Technology, Continuous Integration & Continuous Delivery - CI/CD, Data Analytics, Data Architecture, Data Warehousing, Design, HBase, IT, Kanban, Neo4J, OLAP, PostgreSQL, REDIS, Scrum Agile Methodology, SQL, SQL Server

Location: Estonia
This position is needed to build and maintain highly scalable, reliable, and efficient data pipelines that will empower both inbound and outbound messaging stacks, as well as other internal...
Skills: Artificial intelligence, Apache Kafka, AWS, Azure, Big Data Technology, Data Engineering, Data pipelines, Data Warehousing, Database, Docker, DynamoDB, Elasticsearch, ETL frameworks, Google Cloud Platform (GCP), Infrastructure as code, Java Programming, Kubernetes-K8s, Machine learning techniques, PySpark, Python Programming, Scala Programming, SPARK Programming, Spark-SQL