GCP Data Engineer | 4+ years

Experience: 4 years

Salary: ₹4 - 20 Lacs per annum

Posted: 1 week ago | Platform: Glassdoor


Work Mode: On-site

Job Type: Full-time

Job Description

Job Summary:

We are seeking a results-driven GCP Data Engineer with over 4 years of hands-on experience building and optimizing data pipelines and architectures on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in data integration, transformation, and modeling, with a focus on delivering scalable, efficient, and secure data solutions. This role requires a deep understanding of GCP services, big data processing frameworks, and modern data engineering practices.

Key Responsibilities:

  • Design, develop, and deploy scalable and reliable data pipelines on Google Cloud Platform.
  • Build data ingestion processes from various structured and unstructured sources using Cloud Dataflow, Pub/Sub, BigQuery, and other GCP tools.
  • Optimize data workflows for performance, reliability, and cost-effectiveness.
  • Implement data transformations, cleansing, and validation using Apache Beam, Spark, or Dataflow.
  • Work closely with data analysts, data scientists, and business stakeholders to understand data needs and translate them into technical solutions.
  • Ensure data security and compliance with company and regulatory standards.
  • Monitor, troubleshoot, and enhance data systems to ensure high availability and accuracy.
  • Participate in code reviews, design discussions, and continuous integration/deployment processes.
  • Document data processes, workflows, and technical specifications.

Required Skills:

  • Minimum 4 years of experience in data engineering with at least 2 years working on GCP.
  • Strong proficiency in GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, Cloud Functions, and Vertex AI (preferred).
  • Hands-on experience in SQL, Python, and Java/Scala for data processing and transformation.
  • Experience with ETL/ELT development, data modeling, and data warehousing concepts.
  • Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
  • Solid understanding of data security, IAM, encryption, and compliance within cloud environments.
  • Experience with performance tuning, workload management, and cost optimization in GCP.

Preferred Qualifications:

  • GCP Professional Data Engineer Certification.
  • Experience with real-time data processing using Kafka, Dataflow, or Pub/Sub.
  • Familiarity with Terraform, Cloud Build, or infrastructure-as-code tools.
  • Exposure to data quality frameworks and observability tools.
  • Previous experience in an agile development environment.

Job Types: Full-time, Permanent

Pay: ₹473,247.51 - ₹2,000,000.00 per year

Schedule:

  • Monday to Friday

Application Question(s):

  • Mention Your Last Working Date

Experience:

  • Google Cloud Platform: 4 years (Preferred)
  • Python: 4 years (Preferred)
  • ETL: 4 years (Preferred)

Work Location: In person
