Sr Data Platform Engineer (GCP Focus)

Experience: 5-10 years

Salary: 7-12 Lacs

Posted: 3 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Role: Senior Data Platform Engineer (GCP Focus)
Key Responsibilities
Data Pipeline Automation & Engineering
  • Design, build, and automate scalable, production-grade data pipelines on GCP using core services such as Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
  • Develop and implement continuous integration and continuous deployment (CI/CD) workflows for data processing and analytics pipelines using tools like Cloud Build, GitHub Actions, or Jenkins.
  • Implement reusable data frameworks, templates, and libraries to standardize pipeline deployments, configuration management, and promote "Infrastructure as Code" principles.
  • Orchestrate complex, multi-stage ETL/ELT pipelines across diverse data sources and environments, ensuring efficient resource utilization and low latency.
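For candidates unfamiliar with the orchestration work described above, here is a minimal, standard-library-only sketch of a dependency-ordered ETL flow. The task names and graph are illustrative only; in a real Cloud Composer deployment the same ordering would be expressed with Airflow operators (e.g. `extract >> transform >> load`).

```python
from graphlib import TopologicalSorter

# Hypothetical three-stage ELT dependency graph: each task maps to the
# set of tasks it depends on. graphlib resolves the execution order the
# same way an Airflow scheduler would for a Cloud Composer DAG.
DAG = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

def execution_order(dag):
    """Return a valid topological execution order for the task graph."""
    return list(TopologicalSorter(dag).static_order())

print(execution_order(DAG))  # ['extract', 'transform', 'load']
```

In production the individual stages would be Dataflow jobs or BigQuery operators rather than plain functions; the scheduling concept is the same.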
Data Quality, Testing & Reliability
  • Implement and manage automated data validation, schema checks, and anomaly detection using industry-leading tools like Great Expectations, dbt tests, or custom Python frameworks.
  • Integrate quality gates directly into CI/CD workflows to ensure early issue detection and continuously improve overall data reliability (Data Reliability Engineering - DRE principles).
  • Schedule, monitor, and optimize data workflows, ensuring strict adherence to data delivery SLAs.
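As an illustration of the "custom Python frameworks" option mentioned above, the sketch below shows a schema check and a simple z-score anomaly check of the kind that could gate a CI/CD stage. The schema, column names, and thresholds are all hypothetical, not taken from the posting; Great Expectations or dbt tests would replace this in a mature setup.

```python
import statistics

# Hypothetical expected schema for a pipeline's output rows.
EXPECTED_SCHEMA = {"id": int, "amount": float}

def check_schema(rows, schema=EXPECTED_SCHEMA):
    """Pass only if every row has exactly the expected columns and types."""
    return all(
        set(row) == set(schema)
        and all(isinstance(row[col], typ) for col, typ in schema.items())
        for row in rows
    )

def check_no_outliers(values, z_threshold=3.0):
    """Pass only if no value lies more than z_threshold standard
    deviations from the mean (a basic anomaly-detection gate)."""
    if len(values) < 2:
        return True
    mean, stdev = statistics.mean(values), statistics.stdev(values)
    if stdev == 0:
        return True
    return all(abs(v - mean) / stdev <= z_threshold for v in values)
```

A CI/CD quality gate would simply fail the build when either check returns `False`, surfacing bad data before it reaches downstream consumers.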
Monitoring & Observability
  • Set up and maintain proactive monitoring, logging, and automated alerting for all data pipelines and platform components using Cloud Monitoring and Cloud Logging (formerly Stackdriver), Prometheus, or Grafana.
  • Develop and maintain comprehensive dashboards to track critical metrics, including data health, SLA adherence, and pipeline operational performance.
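One concrete example of an "SLA adherence" dashboard metric mentioned above, sketched in plain Python with invented run records. In practice the scheduled/finished timestamps would come from Airflow metadata or Cloud Monitoring, not inline literals.

```python
from datetime import datetime, timedelta

def sla_adherence(runs, sla=timedelta(hours=1)):
    """Fraction of pipeline runs that finished within the SLA window.

    runs: list of (scheduled, finished) datetime pairs (hypothetical
    shape chosen for this sketch). Returns 1.0 for an empty history.
    """
    if not runs:
        return 1.0
    on_time = sum(1 for scheduled, finished in runs
                  if finished - scheduled <= sla)
    return on_time / len(runs)
```

A dashboard panel would plot this value per pipeline per day and alert when it drops below an agreed threshold.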
Data Governance & Metadata Management
  • Integrate and manage data assets, schemas, and metadata within Google Data Catalog or equivalent metadata management platforms.
  • Enforce robust governance policies, including data lineage tracking, strict access control (IAM), and compliance standards for sensitive data.
Required Skills & Experience
  • 5+ years of professional experience in data engineering, data platform operations, or a similar cloud-native technical role.
  • Strong expertise in the Google Cloud Platform (GCP) data stack: BigQuery, Dataflow, Cloud Composer, Pub/Sub, Cloud Functions, and Cloud Build.
  • High proficiency in Python, SQL, and general automation scripting.
  • Hands-on experience with CI/CD principles and tools, including GitOps and Infrastructure as Code (IaC) using Terraform or Cloud Deployment Manager.
  • Proven experience with data quality and testing frameworks such as Great Expectations, dbt, or PyTest.
  • Working knowledge of observability, logging, and monitoring frameworks for high-volume data systems.
  • Familiarity with metadata management, data lineage tools, and establishing data governance policies.

Company: Infinite | Advertising Services | Danvers, MA