Hadoop Developer

4 - 6 years

4 - 6 Lacs

Posted: 1 week ago | Platform: Foundit

Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities:

  • Develop, test, and deploy Hadoop-based data processing workflows using tools like MapReduce, Hive, Pig, and Spark.
  • Design and implement ETL/ELT pipelines to ingest and process large volumes of structured and unstructured data.
  • Write efficient Hive queries, optimize MapReduce jobs, and develop Spark applications using Scala, Java, or Python (a brief illustrative sketch follows this list).
  • Work with HDFS for storage management and data ingestion strategies.
  • Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
  • Monitor and troubleshoot Hadoop jobs and cluster performance issues.
  • Ensure data quality, data governance, and security compliance in big data solutions.
  • Maintain documentation for code, processes, and workflows.
  • Participate in code reviews, testing, and deployment activities.
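
The sketch below is a minimal illustration of the kind of Spark batch work described above, written in Python (PySpark). The database, table, column names, and HDFS paths (raw_db.web_events, user_id, event_date, the curated output path) are hypothetical placeholders chosen for the example, not details of this role's actual environment.

```python
# Illustrative sketch only: a minimal PySpark batch job that reads a Hive table,
# aggregates events per user per day, and writes partitioned Parquet back to HDFS.
# All table, column, and path names are assumptions made for this example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-event-aggregation")
    .enableHiveSupport()          # allows reading/writing Hive-managed tables
    .getOrCreate()
)

# Read raw events from a Hive table and aggregate per user per day.
daily_counts = (
    spark.table("raw_db.web_events")
    .where(F.col("event_date") == "2024-01-01")
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write results as partitioned Parquet on HDFS for downstream Hive access.
(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("hdfs:///warehouse/curated/daily_event_counts")
)

spark.stop()
```

The same pipeline could equally be written in Scala or Java; in practice such a job would typically be packaged and submitted to the cluster with spark-submit and scheduled via a workflow tool such as Oozie.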

Qualifications and Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience as a Hadoop Developer or Big Data Engineer.
  • Strong experience with Hadoop ecosystem components such as HDFS, MapReduce, Hive, Pig, HBase, Oozie, Sqoop, and Flume.
  • Proficient in programming languages such as Java, Scala, or Python for developing big data applications.
  • Experience with Apache Spark for batch and stream processing is highly desirable.
  • Familiarity with data modeling, schema design, and query optimization techniques in big data environments.
  • Knowledge of Linux/Unix systems and shell scripting.
  • Experience working with cloud-based big data platforms (AWS EMR, Azure HDInsight, Google Dataproc) is a plus.
  • Good problem-solving skills and ability to work in a collaborative Agile environment.

Desirable Skills:

  • Experience with real-time data streaming tools like Kafka or Storm (see the streaming sketch after this list).
  • Knowledge of NoSQL databases such as HBase, Cassandra, or MongoDB.
  • Familiarity with DevOps and CI/CD pipelines for big data workflows.
  • Understanding of data security and privacy best practices in big data environments.
  • Excellent communication and teamwork skills.
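
As a rough illustration of the Kafka-based streaming work mentioned above, the sketch below consumes a Kafka topic with Spark Structured Streaming in Python. The broker address, topic name, JSON schema, and HDFS paths are assumptions made for the example, and running it additionally requires the spark-sql-kafka connector package on the Spark classpath.

```python
# Illustrative sketch only: reading a Kafka topic with Spark Structured Streaming
# and appending parsed events to HDFS. Broker, topic, schema, and paths are
# hypothetical placeholders, not details of this role's actual environment.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("clickstream-ingest").getOrCreate()

# Expected shape of each JSON message on the topic.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
    StructField("ts", TimestampType()),
])

# Subscribe to the Kafka topic; the 'value' column carries the JSON payload as bytes.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Parse the JSON payload into typed columns.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append parsed events to HDFS; the checkpoint makes the stream restartable.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///streams/clickstream")
    .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```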

Teamware Solutions

IT Services and IT Consulting

Chennai, Tamil Nadu
