Hadoop Administrator (Freelancer)

Mactores

Job Description

Mactores is a team of emerging technologists, data engineers, and data scientists driven by excellence in technology to solve critical business problems. We help bridge complex business challenges with technical expertise. Our specialization in AI, Fast Data, Industrial IoT, and Cloud lets us collaborate with organizations that view technology as a strategic driver of success.

As a Hadoop Administrator, you will bring 4+ years of experience in Hadoop engineering and administration, including deploying, managing, and monitoring production-grade Hadoop clusters for large deployments. You are well versed in Hadoop frameworks such as HBase, Hive, and Apache Spark. Experience with DataOps, automation, and AWS would be a big plus.

If you love to solve problems using your skills, then come join Team Mactores. We have a casual and fun office environment that actively steers clear of rigid corporate culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What you will do

  • Implement Hadoop services and handle ongoing administration of Hadoop infrastructure on AWS EC2 or Amazon EMR.
  • Deploy new Hadoop clusters on public cloud services such as Amazon Web Services and expand existing environments.
  • Work with data delivery teams to set up new Hadoop users, including creating Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
  • Perform automated migration of Hadoop clusters from one region to another.
  • Tune the performance of Hadoop clusters and Hadoop MapReduce routines.
  • Screen Hadoop cluster job performance and carry out capacity planning.
  • Monitor Hadoop cluster connectivity and security.
  • Manage and review Hadoop log files.
  • Manage and monitor file systems.
  • Team diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
  • Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
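The user-onboarding responsibility above typically boils down to a short, repeatable procedure. Below is a minimal dry-run sketch of those steps; the username `jdoe`, realm `EXAMPLE.COM`, and Hive host `hive-host` are illustrative assumptions, and the script only prints the commands (they require a live, Kerberized cluster to actually run).

```shell
#!/usr/bin/env bash
# Hypothetical sketch: onboarding a new Hadoop user.
# NEW_USER, REALM, and the Hive host below are assumed example values.
NEW_USER="jdoe"
REALM="EXAMPLE.COM"

run() {
  # Dry-run helper: print each command instead of executing it,
  # since every step needs a real, Kerberized Hadoop cluster.
  echo "+ $*"
}

# 1. Create the Linux account (repeat on each relevant node, or via LDAP).
run useradd -m -G hadoop "$NEW_USER"

# 2. Create a Kerberos principal for the user.
run kadmin -q "addprinc -randkey ${NEW_USER}@${REALM}"

# 3. Provision an HDFS home directory and hand ownership to the user.
run hdfs dfs -mkdir -p "/user/${NEW_USER}"
run hdfs dfs -chown "${NEW_USER}:hadoop" "/user/${NEW_USER}"

# 4. Smoke-test access as the new user: HDFS listing, then a Hive query.
run sudo -u "$NEW_USER" hdfs dfs -ls "/user/${NEW_USER}"
run sudo -u "$NEW_USER" beeline \
  -u "jdbc:hive2://hive-host:10000/;principal=hive/_HOST@${REALM}" \
  -e "SHOW DATABASES;"
```

In practice these steps are usually wrapped in a configuration-management tool (Puppet, Chef) rather than run by hand, so that onboarding stays consistent across cluster nodes.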

What we are looking for

  • Bachelor's degree in Information Technology, Computer Science, or another relevant field.
  • General operational expertise such as good troubleshooting skills, understanding of systems capacity, bottlenecks, basics of memory, CPU, OS, storage, and networks.
  • 4+ years of experience with Hadoop ecosystem tools such as HBase, Hive, Pig, and Mahout.
  • 2+ years of experience deploying Hadoop clusters: adding and removing nodes, keeping track of jobs, monitoring critical parts of the cluster, configuring NameNode high availability, scheduling and configuring the cluster, and taking backups.
  • Good knowledge of Linux, as Hadoop runs on Linux.
  • Familiarity with open source configuration management and deployment tools such as Puppet or Chef and Linux scripting.

You will be preferred if you have

  • A Cloudera or Hortonworks Certified Engineer credential.
  • An AWS Big Data Pro certification.
  • Extensive migration and troubleshooting experience.

Company Info.

Mactores

  • Industry
    Information Technology
  • No. of Employees
    30
  • Location
    701 5th Ave, Seattle, Washington, USA

Mactores is currently hiring for Software Engineer jobs in Mumbai, Maharashtra, India with an average base salary of ₹840,000 - ₹2,160,000 / Year.
