GCP Platform Engineer

7 - 12 years

27 - 37 Lacs

Posted: 22 hours ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

Role: GCP Platform Engineer

Location:

Experience: 7 - 12 years

Position Type: Full Time

Mode: Work from Office

Role & responsibilities

Platform Build & Engineering:

  • Architect, deploy, and configure GCP data platform components (Dataproc clusters, Composer environments, Dataflow workers, GCS storage controls, IAM policies); see the provisioning sketch after this list.
  • Build Terraform-driven reusable templates and configuration blueprints for platform provisioning.
  • Implement security & governance controls: CMEK, VPC Service Controls, org policies, workload identity, IAM least privilege.
  • Define platform standards: autoscaling rules, cluster profiles, dependency packaging, environment provisioning workflows.
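To give a concrete feel for the provisioning work above (which this role would normally express through Terraform templates rather than scripts), here is a minimal Python sketch using the google-cloud-dataproc client. The project id, region, cluster name, and machine types are placeholder assumptions, not values from this posting.

```python
from google.cloud import dataproc_v1 as dataproc

# Placeholder values for illustration only -- a real platform would source
# these from Terraform variables or environment-specific configuration.
PROJECT_ID = "my-project"        # assumption: illustrative project id
REGION = "us-central1"           # assumption: illustrative region
CLUSTER_NAME = "demo-cluster"    # assumption: illustrative cluster name


def create_cluster() -> None:
    """Create a small Dataproc cluster and wait for the operation to finish."""
    client = dataproc.ClusterControllerClient(
        client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
    )
    cluster = {
        "project_id": PROJECT_ID,
        "cluster_name": CLUSTER_NAME,
        "config": {
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-4"},
        },
    }
    operation = client.create_cluster(
        request={"project_id": PROJECT_ID, "region": REGION, "cluster": cluster}
    )
    result = operation.result()
    print(f"Cluster created: {result.cluster_name}")


if __name__ == "__main__":
    create_cluster()
```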

L3 Troubleshooting & Developer Enablement:

  • Serve as platform L3 escalation for engineering teams, debugging Dataproc/YARN execution failures, Dataflow backlog or performance bottlenecks, and Composer scheduler or DAG dependency errors.
  • Provide reference architectures, runbooks, and onboarding documentation.
  • Enable self-service platform usage via templates, automation, and tooling.

Automation, CI/CD & Tooling:

  • Build internal automation to provision environments using IaC and scripting.
  • Integrate CI/CD pipelines (Cloud Build, GitHub Actions) for Composer DAGs, Dataflow templates, and platform configurations; see the DAG deployment sketch after this list.
  • Develop frameworks for dependency packaging, deployment workflows, and observability.
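One common way to implement the Composer CI/CD item is a pipeline step that copies DAG files into the environment's dags/ prefix in its GCS bucket. A minimal sketch follows; the bucket name and local folder are illustrative assumptions, and a Cloud Build or GitHub Actions job would simply run this script after tests pass.

```python
import pathlib

from google.cloud import storage

# Assumptions for illustration: the Composer environment's bucket name and a
# local dags/ folder checked into the repository.
COMPOSER_BUCKET = "us-central1-my-env-bucket"  # hypothetical bucket name
LOCAL_DAG_DIR = pathlib.Path("dags")


def deploy_dags() -> None:
    """Upload every local DAG file into the Composer environment's dags/ prefix."""
    bucket = storage.Client().bucket(COMPOSER_BUCKET)
    for path in LOCAL_DAG_DIR.rglob("*.py"):
        blob = bucket.blob(f"dags/{path.relative_to(LOCAL_DAG_DIR)}")
        blob.upload_from_filename(str(path))
        print(f"Uploaded {path} -> gs://{COMPOSER_BUCKET}/{blob.name}")


if __name__ == "__main__":
    deploy_dags()
```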

Performance, Reliability & Platform Evolution:

  • Monitor platform health using Cloud Monitoring, tune performance, and scale capacity; see the monitoring sketch after this list.
  • Optimize cost across clusters/jobs, introduce budget controls and quotas.
  • Participate in platform roadmap, architecture modernization and platform upgrades.
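For the Cloud Monitoring bullet, here is a minimal sketch of pulling recent time series with the google-cloud-monitoring client. The project id and metric filter are assumptions chosen for illustration; real health checks would target Dataproc, Dataflow, or Composer metrics and feed alerting policies.

```python
import time

from google.cloud import monitoring_v3

PROJECT_ID = "my-project"  # assumption: illustrative project id


def read_recent_cpu(minutes: int = 30) -> None:
    """List time series for a sample metric over the last `minutes` minutes."""
    client = monitoring_v3.MetricServiceClient()
    now = int(time.time())
    interval = monitoring_v3.TimeInterval(
        {
            "end_time": {"seconds": now},
            "start_time": {"seconds": now - minutes * 60},
        }
    )
    results = client.list_time_series(
        request={
            "name": f"projects/{PROJECT_ID}",
            "filter": 'metric.type = "compute.googleapis.com/instance/cpu/utilization"',
            "interval": interval,
            "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
        }
    )
    for series in results:
        # Print the monitored resource type and how many data points came back.
        print(series.resource.type, len(series.points))


if __name__ == "__main__":
    read_recent_cpu()
```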

Preferred candidate profile

Core GCP Data Services Expertise:

  • Dataproc – cluster config, init-actions, HA setups, autoscaling, debugging YARN.
  • GCS – IAM policy design, lifecycle management, CMEK, cross-project sharing.
  • Dataflow – Flex Templates, worker sizing, backlog handling, tuning pipelines.
  • Composer – Airflow deployments, dependency management, scaling, DAG troubleshooting.
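As a reference point for the Composer item above, here is a minimal Airflow DAG (Airflow 2.4+) of the kind such a platform team would deploy and troubleshoot. The dag_id, schedule, and task are hypothetical and purely illustrative.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative DAG: one daily task with a small retry policy (names are hypothetical).
with DAG(
    dag_id="platform_smoke_test",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    BashOperator(
        task_id="echo_ok",
        bash_command="echo 'platform reachable'",
    )
```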

Engineering Skills:

  • Terraform (preferred) or IaC experience.
  • Python or Shell scripting for automation.
  • CI/CD integration for platform workloads.
  • Spark/Hadoop understanding for job failure troubleshooting.

Leadership & Collaboration:

  • Ability to lead platform roadmap execution while remaining hands-on.
  • Strong collaboration and communication skills across architecture, security & engineering teams.
  • Ownership mindset and documentation capability.

Preferred Certifications

  • Google Professional Data Engineer
  • Google Cloud Architect
  • Google Associate Cloud Engineer

Fint Solutions

Financial Technology

Finlandia
