GCP Data Engineer

Experience: 8 - 13 years

Salary: 25 - 35 Lacs

Posted: 2 hours ago | Platform: Naukri


Work Mode: Remote

Job Type: Full Time

Job Description

Skill: Lead Data Engineer (Python & GCP)

Experience Level:

10+ years of relevant IT experience

Key Responsibilities:

  • Design, develop, test, and maintain scalable ETL data pipelines using Python.
  • Architect enterprise solutions with technologies such as Kafka, multi-cloud services, auto-scaling with GKE, load balancers, APIGEE proxy API management, DBT, LLMs where needed in the solution, redaction of sensitive information, and DLP (Data Loss Prevention).

  • Work extensively with Google Cloud Platform (GCP) services such as:
      - Dataflow for real-time and batch data processing
      - Cloud Functions for lightweight serverless compute
      - BigQuery for data warehousing and analytics
      - Cloud Composer (built on Apache Airflow) for orchestration of data workflows
      - Google Cloud Storage (GCS) for managing data at scale
      - IAM for access control and security
      - Cloud Run for containerized applications
  • Demonstrate experience in the following areas:
      - API framework: Python FastAPI
      - Processing engine: Apache Spark
      - Messaging and streaming data processing: Kafka
      - Storage: MongoDB, Redis/Bigtable
      - Orchestration: Airflow
      - Deployments to GKE and Cloud Run

  • Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
  • Implement and enforce data quality checks, validation rules, and monitoring.
  • Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
  • Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
  • Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
  • Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:

  • 10+ years of hands-on experience in Python for backend or data engineering projects.
  • Strong working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer).
  • Solid understanding of data pipeline architecture, data integration, and transformation techniques.
  • Experience with version control systems such as GitHub and knowledge of CI/CD practices.
  • Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Composer DAGs.
  • Strong SQL experience with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
  • Experience with data migrations from on-premises data sources to cloud platforms.

Good to Have (Optional Skills):

  • Experience working with the Snowflake cloud data platform.
  • Hands-on knowledge of Databricks for big data processing and analytics.
  • Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:

  • Excellent problem-solving and analytical skills.
  • Strong communication skills and ability to collaborate in a team environment.

Education:

  • Bachelor's degree in Computer Science, a related field, or equivalent experience.

Mobilution It Systems

Information Technology

