Full Stack Data Engineer

5 - 8 years

6 - 10 Lacs

Posted: 3 hours ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Key Responsibilities
- Design, build, and maintain end-to-end GCP data pipelines (batch and streaming).
- Ensure data platform uptime and performance in line with defined SLAs/SLOs.
- Develop and optimize ETL/ELT workflows using Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
- Manage and enhance data lakes and warehouses using BigQuery and Cloud Storage.
- Implement streaming data solutions using Pub/Sub, Dataflow, or Kafka.
- Build data APIs and microservices for data consumption using Cloud Run, Cloud Functions, or App Engine.
- Define and enforce data quality, governance, and lineage using Data Catalog and Cloud Data Quality tools.
- Collaborate with DevOps to build CI/CD pipelines, infrastructure as code, and automated monitoring for data workflows.
- Participate in incident management, root cause analysis (RCA), and change control processes following ITIL best practices.
- Proactively identify optimization opportunities to improve cost efficiency, latency, and reliability.
- Mentor junior engineers and ensure adherence to engineering best practices.
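To give a feel for the ETL and data-quality work described above, here is a minimal pure-Python sketch of a transform-and-validate step of the kind a Dataflow pipeline would run. The record schema and the names `clean_record`, `is_valid`, and `run_batch` are hypothetical; a real pipeline would express the same steps with the Apache Beam SDK (`beam.Map`, `beam.Filter`).

```python
from datetime import datetime, timezone


def clean_record(raw: dict) -> dict:
    """Normalize one raw event into a warehouse-friendly shape.
    (Hypothetical schema; in Dataflow this would be a beam.Map step.)"""
    return {
        "user_id": str(raw["user_id"]).strip(),
        "amount": round(float(raw.get("amount", 0)), 2),
        "event_ts": datetime.fromtimestamp(int(raw["ts"]), tz=timezone.utc).isoformat(),
    }


def is_valid(rec: dict) -> bool:
    """Simple data-quality gate (a beam.Filter step in a real pipeline)."""
    return bool(rec["user_id"]) and rec["amount"] >= 0


def run_batch(raw_events: list[dict]) -> list[dict]:
    """Batch equivalent of: events | Map(clean_record) | Filter(is_valid)."""
    return [rec for rec in map(clean_record, raw_events) if is_valid(rec)]
```

The same clean/validate pair can back both the batch and streaming paths, which keeps quality rules in one place.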

Technical Skills & Experience
Programming: Python (advanced), SQL (expert), shell scripting.
GCP Data Stack:
- Data Storage: BigQuery, Cloud Storage
- Data Processing: Dataflow (Apache Beam), Dataproc (Spark), Cloud Composer (Airflow)
- Streaming: Pub/Sub, Kafka (optional)
- Orchestration: Cloud Composer / Airflow
- APIs & Services: Cloud Run, Cloud Functions, API Gateway
- Monitoring: Cloud Logging and Cloud Monitoring (formerly Stackdriver), or Prometheus/Grafana
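As one small concrete point of the streaming side of this stack: Pub/Sub push subscriptions deliver the message with its payload base64-encoded inside a JSON envelope. A minimal decoder (hypothetical function name; a Cloud Run service would apply this to the HTTP request body):

```python
import base64
import json


def decode_push_message(body: dict) -> dict:
    """Decode the JSON payload of a Pub/Sub push delivery.

    Push bodies wrap the message as {"message": {"data": "<base64>", ...}},
    so the data field must be base64-decoded before JSON parsing.
    """
    data = base64.b64decode(body["message"]["data"])
    return json.loads(data)
```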

DevOps: Terraform, Git, Docker, Kubernetes, Cloud Build / Jenkins.
Data Visualization: Looker Studio, Power BI, or Tableau.
Version Control & CI/CD: GitHub, GitLab, or Bitbucket pipelines.

Key Competencies
- Proven experience in managed service delivery under strict SLAs (availability, latency, and resolution timelines).
- Strong understanding of GCP networking, IAM, and cost management.
- Strong knowledge of incident, change, and problem management using ITSM tools (e.g., ServiceNow, Jira).
- Ability to lead on-call operations in a 16/5 support model, coordinating with global teams across time zones.
- Understanding of data security, governance, and compliance.
- Excellent communication and client-handling skills.

Good to Have
Certifications:
- Google Professional Data Engineer
- Google Professional Cloud Architect
Experience with data observability tools (Monte Carlo, Databand, or similar).
Exposure to MLOps pipelines on GCP (Vertex AI / AI Platform).
