Job Description
As a Senior or Lead Big Data DevOps Engineer, you will be working with a team responsible for setting up, scaling, and maintaining Big Data infrastructure and tools in private and public cloud environments.
Main Responsibilities:
- Driving efficiency improvements across the Big Data infrastructure.
- Coordinating cross-team infrastructure and Big Data initiatives.
- Leading Big Data related architecture and design efforts.
- Ensuring availability, efficiency, and reliability of the Big Data infrastructure.
- Building and supporting tools for operational tasks.
- Evaluating, designing, and deploying monitoring tools.
- Designing and implementing disaster recovery and business continuity (DR/BC) practices and procedures.
- Providing on-call support for production systems.
Requirements:
- 7+ years of experience working with Hadoop, preferably open source.
- 3+ years of experience leading a Big Data, DevOps, SRE, DBA, or development team.
- Experience setting up and running Hadoop clusters of 1000+ nodes.
- Solid knowledge of NoSQL databases, preferably Cassandra or ScyllaDB.
- Experience running and troubleshooting Kafka.
- Working knowledge of at least one of: Terraform, Ansible, SaltStack, Puppet.
- Proficiency in shell scripting.
Nice to have:
- Experience with Prometheus.
- Experience managing Snowflake.
- Solid knowledge of Graphite and Grafana.
- Python or Perl scripting skills.
- Experience with installing and managing Aerospike.
- DBA experience with one of: PostgreSQL, MySQL, MariaDB.
Company Info
Zeta Global
Zeta Global Holdings Corp. is a data-driven marketing technology company. Zeta offers companies a suite of multichannel marketing tools focused on creating, maintaining, and monetizing customer relationships. Zeta Global has the industry's third-largest data set (2.4B+ identities), after Google and Facebook, powered by demographic, locational, behavioral, transactional, and predictive signals.
-
Industry
Computer software
-
No. of Employees
1,300
-
Location
New York, NY, USA