GCP Hadoop Developer/Lead

Experience: 5 - 10 years

Salary: 0 Lacs

Location: Pune, Bengaluru

Posted: 1 day ago | Platform: Naukri

Skills Required

PySpark, BigQuery, Hadoop, GCP, Python, Airflow, Kafka, Dataproc, SQL, Hive, Cloud Composer, Spark, Google Cloud Platform

Work Mode

Hybrid

Job Type

Full Time

Job Description

Job Summary

We are seeking a highly skilled Hadoop Developer / Lead Data Engineer to join our data engineering team based in Bangalore or Pune. The ideal candidate will have extensive experience with Hadoop ecosystem technologies and cloud-based big data platforms, particularly Google Cloud Platform (GCP). This role involves designing, developing, and maintaining scalable data ingestion, processing, and transformation frameworks to support enterprise data needs.

Minimum Qualifications

- Bachelor's degree in Computer Science, Computer Information Systems, or a related technical field.
- 5-10 years of experience in software engineering or data engineering, with a strong focus on big data technologies.
- Proven experience implementing software development life cycles (SDLC) in enterprise environments.

Technical Skills & Expertise

Big Data Technologies:
- Expertise in the Hadoop platform, Hive, and related ecosystem tools.
- Strong experience with Apache Spark (using SQL, Scala, and/or Java).
- Experience with real-time data streaming using Kafka.

Programming Languages & Frameworks:
- Proficient in PySpark and SQL for data processing and transformation.
- Strong coding skills in Python.

Cloud Technologies (Google Cloud Platform):
- Experience with BigQuery for data warehousing and analytics.
- Familiarity with Cloud Composer (Airflow) for workflow orchestration.
- Hands-on experience with Dataproc for managed Spark and Hadoop clusters.

Responsibilities

- Design, develop, and implement scalable data ingestion and transformation pipelines using Hadoop and GCP services (a minimal example of such a pipeline is sketched after this description).
- Build real-time and batch data processing solutions leveraging Spark, Kafka, and related technologies.
- Ensure data quality, governance, and lineage by implementing automated validation and classification frameworks.
- Collaborate with cross-functional teams to deploy and operationalize data analytics tools at enterprise scale.
- Participate in production support and on-call rotations to maintain system reliability.
- Follow established SDLC practices to deliver high-quality, maintainable solutions.

Preferred Qualifications

- Experience leading or mentoring data engineering teams.
- Familiarity with CI/CD pipelines and DevOps best practices for big data environments.
- Strong communication skills with an ability to collaborate across teams.
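For orientation, the sketch below illustrates the kind of batch pipeline this role describes: a PySpark job running on Dataproc that reads from a Hive table, aggregates it, and loads the result into BigQuery. It is only an illustration under assumed names: the Hive table sales.raw_orders, the BigQuery table analytics.daily_orders, and the staging bucket are hypothetical, and it presumes the spark-bigquery connector is available on the cluster.

# Illustrative sketch only (not part of the posting). Assumes a Dataproc
# cluster with Hive metastore access and the spark-bigquery connector on the
# classpath. Table and bucket names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-orders-to-bigquery")
    .enableHiveSupport()               # read from the Hadoop/Hive warehouse
    .getOrCreate()
)

# Ingest: raw orders landed in Hive by an upstream ingestion job.
orders = spark.table("sales.raw_orders")           # hypothetical Hive table

# Transform: aggregate revenue and order counts per day and region.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("revenue"),
        F.count("*").alias("order_count"),
    )
)

# Load: write the aggregate to BigQuery via the spark-bigquery connector.
(
    daily.write
    .format("bigquery")
    .option("table", "analytics.daily_orders")      # hypothetical dataset.table
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("overwrite")
    .save()
)

In practice, a job like this would typically be submitted to Dataproc (for example with gcloud dataproc jobs submit pyspark) and scheduled from Cloud Composer (Airflow); a streaming variant of the same pattern would read from Kafka with Spark Structured Streaming instead of a Hive table.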

Company

Onix