BigData Multi-Cloud DevOps Platform Engineer

Experian

Job Description

  • Delivering innovative CI/CD solutions using a modern technology stack, including Jenkins and Terraform.
  • Automating the building and configuration of Big Data infrastructure using DevOps tools.
  • Designing and configuring monitoring systems (e.g. ELK, Prometheus, Cloudera Manager, Splunk, Dynatrace, or CloudWatch).
  • Managing application containerisation using Docker and Kubernetes — a highly critical responsibility.
  • Configuring and supporting API and open-source integrations.
  • Responsible for implementation and ongoing administration of Big Data infrastructure.
  • Leading the thinking on automating repetitive tasks (release, monitoring) to enable a better engineering experience, and guiding these efforts across the team.
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Big Data Environments.
  • Collaborating with multiple teams to perform updates, patches, version upgrades when required.
  • General operational expertise: good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
  • Help to shape enterprise solutions to allow better serviceability, monitoring and alerting to meet the increasing business requirements and demands.
  • Keeping track of the latest release versions and updating running workloads consistently across the stack.
  • Basic understanding of on-premises and cloud network architectures.
  • Working with an agile team to develop, test, and maintain APIs
  • Assisting in the collection and documentation of user requirements, the development of user stories, and estimation.
  • Providing 24x7 on-call support of production systems on a rotation basis with other team members.

Qualifications

  • Typically requires a bachelor's degree (in Computer Science or related field) or equivalent.
  • 5-8 years of total experience
  • 2+ years of experience in DevOps using Agile best practices
  • Experience in developing solutions utilising Terraform.
  • 2+ years of Linux (Red Hat) experience as a system administrator or user.
  • Sound knowledge of delivering solutions in the cloud with AWS, Azure, or Oracle Cloud.
  • Cloud platforms (IaaS/PaaS): AWS, Azure, Oracle Cloud, GCP, or VMware.
  • Strong hands-on scripting skills in Bash/shell and Python are mandatory.
  • Experience working with DevOps and Continuous Integration tools and technologies, including Docker, Ansible, Bamboo, or Jenkins, plus Terraform.
  • Good to have: experience with tools and applications such as Eclipse, IntelliJ, JIRA, Confluence, Bitbucket, Git, and Artifactory.
  • Awareness of operational support needs, with strong documentation and automation skills.
  • Automating deployments and monitoring/alerting tasks using Terraform.
  • Good to have: experience working with open-source products.
  • Strong problem-solving, creative thinking, and analytical skills; effective oral and written communication.
  • Experience working with geographically distributed teams
  • Ability to handle conflicting priorities.
  • Ability to learn, adapt, and be receptive to change.

Company Info.

Experian

Experian unlocks the power of data to create opportunities for consumers, businesses and society. During life’s big moments – from buying a home or car, to sending a child to college, to growing a business exponentially by connecting it with new customers – we empower consumers and our clients to manage data with confidence so they can maximize every opportunity.

Experian is currently hiring for Big Data DevOps Engineer jobs in Hyderabad, Telangana, India, with an average base salary of ₹90,000 - ₹250,000 per month.
