Lead Data Engineer with GCP

10 - 15 years

20 - 35 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode

Hybrid

Job Type

Full Time

Job Description

Job Overview:

We are looking for a skilled and motivated Lead Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for requirements gathering, solution design and architecture, and the development and maintenance of robust, scalable ETL (Extract, Transform, Load) and ELT data pipelines. The role involves working directly with customers through the discovery phase, designing and architecting solutions using various GCP services, implementing data transformations and data ingestion, ensuring data quality and consistency across systems, and providing post-delivery support.

Experience Level:

10 to 15 years of relevant IT experience

Notice Period: Please apply only if you can join immediately.

Key Responsibilities:

  • Design, develop, test, and maintain scalable ETL data pipelines using Python.
  • Architect enterprise solutions using technologies such as Kafka, multi-cloud services, auto-scaling with GKE, load balancers, Apigee API management, DBT, LLMs where the solution requires them, redaction of sensitive information, and DLP (Data Loss Prevention).
  • Work extensively on Google Cloud Platform (GCP) services such as:
    • Dataflow for real-time and batch data processing
    • Cloud Functions for lightweight serverless compute
    • BigQuery for data warehousing and analytics
    • Cloud Composer for orchestration of data workflows (on Apache Airflow)
    • Google Cloud Storage (GCS) for managing data at scale
    • IAM for access control and security
    • Cloud Run for containerized applications

Required Skills:

  • 10 to 15 years of hands-on experience in Python for backend or data engineering projects.
  • Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
  • Solid understanding of data pipeline architecture, data integration, and transformation techniques.
  • Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
  • Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Cloud Composer DAGs.
  • Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
  • Experience in data migrations from on-premises data sources to cloud platforms.
