Cloud Data Engineer (GCP)

3 - 6 years

13 - 15 Lacs

Posted: 1 day ago | Platform: LinkedIn

Apply

Work Mode

On-site

Job Type

Full Time

Job Description

About The Opportunity

We’re a rapid-growth enterprise AI platform provider in the cloud services & SaaS sector, empowering Fortune 500 customers to modernize data pipelines, automate knowledge work, and unlock new revenue streams with Generative AI. Backed by AI researchers and Google Cloud architects, we deliver production-grade solutions on GCP at scale. Join our hybrid team in Pune or Mumbai to shape the next generation of agentic AI-driven data products.

Role & Responsibilities
  • Architect, develop, and maintain end-to-end ETL pipelines on GCP leveraging BigQuery, Dataflow, Cloud Composer, Dataform, and Pub/Sub.
  • Build and optimize secure, high-performance RESTful and gRPC APIs to expose analytics and ML features to internal and external consumers.
  • Implement cost-effective, resilient data workflows through partitioning, autoscaling, and advanced monitoring with Cloud Monitoring/Stackdriver.
  • Automate infrastructure provisioning and deployments using Terraform, Cloud Build, and Cloud Deploy, enforcing Infrastructure-as-Code best practices.
  • Embed data quality and governance via schema enforcement, versioned contracts, and automated regression testing.
  • Collaborate closely with product managers, data scientists, and SRE teams to meet SLAs and deliver measurable business impact.
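As a flavor of the partitioning and cost-control work described above, here is a minimal sketch of generating a date-partitioned, clustered BigQuery table DDL in Python (the table and field names are illustrative, not taken from this posting):

```python
def partitioned_table_ddl(table: str, partition_field: str, cluster_fields: list[str]) -> str:
    """Build a BigQuery DDL statement for a date-partitioned, clustered table.

    Partitioning on a date/timestamp column lets BigQuery prune the bytes
    scanned per query, which is the main cost-control lever for large tables.
    """
    clustering = ", ".join(cluster_fields)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  event_ts TIMESTAMP,\n"   # hypothetical columns for illustration
        f"  user_id STRING,\n"
        f"  payload JSON\n"
        f")\n"
        f"PARTITION BY DATE({partition_field})\n"
        f"CLUSTER BY {clustering}"
    )

ddl = partitioned_table_ddl("analytics.events", "event_ts", ["user_id"])
print(ddl)
```

In practice a statement like this would be executed via the BigQuery client or managed declaratively through Dataform or Terraform rather than assembled by hand.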

Skills & Qualifications

Must-Have
  • 3-6 years designing and implementing large-scale data solutions on Google Cloud Platform (BigQuery, Composer, Dataflow, Dataform, Pub/Sub).
  • Strong proficiency in Python and SQL for building robust data pipelines and analytics queries.
  • Expertise in DevOps workflows: Git, CI/CD (Cloud Build, Cloud Deploy), containerization, and Infrastructure-as-Code (Terraform).
  • Proven experience developing and tuning high-throughput REST/gRPC APIs for data services.
  • Deep understanding of data partitioning, optimization, and monitoring using Cloud Monitoring/Stackdriver.
  • Solid knowledge of data quality frameworks, schema management, and automated testing in pipeline workflows.
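To make the last bullet concrete, a minimal schema-enforcement check of the kind used in pipeline data-quality gates might look like this in pure Python (field names and types are illustrative, not tied to any specific framework):

```python
# Expected schema for an incoming row; in production this would typically be
# a versioned contract (e.g. JSON Schema or a BigQuery table schema).
EXPECTED_SCHEMA = {"user_id": str, "amount": float, "country": str}

def validate_row(row: dict) -> list[str]:
    """Return a list of violations; an empty list means the row conforms."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in row:
            errors.append(f"missing field: {field}")
        elif not isinstance(row[field], expected_type):
            errors.append(f"bad type for {field}: {type(row[field]).__name__}")
    return errors

assert validate_row({"user_id": "u1", "amount": 9.5, "country": "IN"}) == []
assert validate_row({"user_id": "u1", "amount": "9.5"}) == [
    "bad type for amount: str",
    "missing field: country",
]
```

Checks like this usually run as an automated step in the pipeline (e.g. a Cloud Composer task), routing failing rows to a dead-letter table instead of breaking downstream consumers.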

Preferred

  • Experience integrating or operating Elasticsearch/OpenSearch for log and metric search.
  • Familiarity with streaming frameworks such as Kafka or Flink and performance benchmarking on GCP.
  • Exposure to Kubernetes or GKE for container orchestration and production scaling.

Benefits & Culture Highlights
  • Hybrid work model with flexible hours and collaborative office spaces in Pune and Mumbai.
  • Continuous learning opportunities: certifications, hackathons, and AI/cloud training programs.
  • Inclusive, innovation-driven culture emphasizing work-life balance and career growth.
Skills: Generative AI, CI/CD, Git, Data, Azure, DevOps, Python, LangChain, Kubernetes, Elasticsearch, Dataflow, RAG, LLMs, MLOps, API Development, API Platform, GCP, Cloud, Advanced Pipelines
