Hadoop Administrator

Experience: 5 - 10 years

Salary: 8 - 15 Lacs

Posted: 2 days ago | Platform: GlassDoor

Work Mode: On-site

Job Type: Full Time

Job Description

We are looking for an experienced Hadoop Administrator to design, build, and maintain our Hadoop ecosystem based on the Open Data Platform (ODP) framework. The ideal candidate will have hands-on experience in the installation, configuration, tuning, and administration of key Hadoop components, including HDFS, YARN, Hive, Spark, Ranger, and Oozie.

This role involves end-to-end platform setup, ongoing maintenance, performance optimization, and support for enterprise big data workloads.

Key Responsibilities

  • Install, configure, and manage Hadoop clusters and ecosystem components (HDFS, YARN, Hive, Spark, Zookeeper, Oozie, etc.) on ODP-compliant distributions.
  • Build and deploy Hadoop stacks from scratch, including hardware sizing, capacity planning, and architecture design.
  • Implement cluster high availability (HA), backup/recovery, and disaster recovery strategies.
  • Manage user access, security policies, and Kerberos/Ranger configurations.
  • Perform cluster performance tuning, troubleshooting, and log analysis to ensure system stability.
  • Monitor system health and optimize resource utilization using Ambari, Cloudera Manager, or other monitoring tools.
  • Automate cluster operations using shell scripts or Python for deployment, maintenance, and patching (see the illustrative sketch after this list).
  • Collaborate with data engineering and infrastructure teams for upgrades, migrations, and platform integrations.
  • Maintain detailed documentation for architecture, configurations, and operational runbooks.
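
As a minimal illustration of the kind of automation referenced above (a sketch only, assuming a Linux host with the hdfs command-line client on the PATH and the standard output format of hdfs dfsadmin -report), a small Python health check might look like the following:

    import re
    import subprocess
    import sys

    def hdfs_report():
        """Run 'hdfs dfsadmin -report' and return its text output."""
        result = subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    def datanode_counts(report):
        """Parse live/dead DataNode counts from the report summary.

        Assumes the 'Live datanodes (N):' / 'Dead datanodes (N):'
        summary lines emitted by recent Hadoop releases.
        """
        live = re.search(r"Live datanodes \((\d+)\)", report)
        dead = re.search(r"Dead datanodes \((\d+)\)", report)
        return (int(live.group(1)) if live else 0,
                int(dead.group(1)) if dead else 0)

    if __name__ == "__main__":
        live, dead = datanode_counts(hdfs_report())
        print(f"Live DataNodes: {live}, Dead DataNodes: {dead}")
        # A non-zero exit status lets cron or a monitoring agent flag dead nodes.
        sys.exit(1 if dead else 0)

A script of this shape can be scheduled via cron and extended to cover YARN, Hive, and other services; the exact checks and thresholds would depend on the cluster in question.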

Required Skills & Experience

  • 5–10 years of experience in Hadoop ecosystem administration.
  • Proven experience building Hadoop clusters from scratch using ODP distributions (Hortonworks, Cloudera, or similar).
  • Strong expertise in:
      • HDFS, YARN, Hive, Spark, Zookeeper, Oozie
      • Ambari or Cloudera Manager (installation, service management, and monitoring)
      • Kerberos, Ranger, or Sentry for security and authorization
  • Proficiency in Linux system administration, shell scripting, and configuration management.
  • Experience with performance tuning, capacity planning, and troubleshooting in production environments.
  • Familiarity with HA configurations, NameNode failover, and cluster scaling.

Education

  • Regular (full-time) MCA or a Bachelor’s degree in Computer Science, Information Technology, or Engineering, or equivalent experience.

Nice to Have

  • Experience with cloud-based Hadoop environments (AWS EMR, Azure HDInsight, GCP Dataproc).
  • Exposure to containerized big data platforms (Kubernetes, Docker).
  • Knowledge of automation tools (Ansible, Terraform, Puppet).
  • Experience with Kafka, Airflow, or NiFi for data pipeline integration.
  • Understanding of data governance, auditing, and monitoring best practices.

Job Types: Full-time, Permanent

Pay: ₹800,000.00 - ₹1,500,000.00 per year

Work Location: In person
