Senior & Lead Data Engineer (Python, GCP) | Top MNC | Hyderabad

6 - 11 years

30 - 45 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Senior Data Engineer

Experience: 6–11 years

Location: Hyderabad

Employment Type: Full Time

About the Role:

Key Responsibilities:

  • Design, develop, and maintain data pipelines and ETL workflows using Python and GCP services.
  • Work extensively on GCP components, including:
    • Dataflow – real-time and batch data processing
    • BigQuery – data warehousing and analytics
    • Cloud Functions – serverless compute
    • Cloud Composer (Airflow) – workflow orchestration
    • Google Cloud Storage (GCS) – scalable data storage
    • IAM – access control and security
    • Cloud Run – containerized applications
  • Build and optimize data pipelines for performance, scalability, and reliability.
  • Implement data ingestion, validation, and transformation from diverse sources.
  • Perform data quality checks, auditing, and monitoring.
  • Work with tools and technologies such as:
    • Apache Spark for data processing
    • Kafka for streaming data pipelines
    • FastAPI for building APIs
    • MongoDB and Redis/Bigtable for data storage
    • Airflow / GCP Composer DAGs for orchestration
  • Collaborate with cross-functional teams to understand data needs and deliver data solutions.
  • Use GitHub for version control and participate in CI/CD pipeline deployments.
  • Write complex SQL queries for data validation and extraction (SQL Server, Oracle, PostgreSQL).
  • Maintain documentation, including pipeline design and data flow diagrams.
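As a flavor of the ingestion–validation–transformation work described above, here is a minimal, hypothetical Python sketch. The record schema, field names, and quality rules are invented for illustration; in this role such logic would typically run inside a Dataflow job or a Composer-orchestrated task rather than a standalone script.

```python
# Hypothetical ETL sketch: ingest -> validate -> transform.
# Schema and rules are illustrative only, not from the job posting.
from datetime import datetime, timezone


def validate(record: dict) -> bool:
    """Basic data-quality check: required fields present, amount non-negative."""
    return (
        isinstance(record.get("user_id"), str)
        and record.get("user_id") != ""
        and isinstance(record.get("amount"), (int, float))
        and record["amount"] >= 0
    )


def transform(record: dict) -> dict:
    """Normalize the record and stamp the processing time (UTC)."""
    return {
        "user_id": record["user_id"].strip().lower(),
        "amount_cents": int(round(record["amount"] * 100)),
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }


def run_pipeline(records: list) -> tuple:
    """Split input into transformed good rows and rejected bad rows."""
    good, bad = [], []
    for r in records:
        (good if validate(r) else bad).append(r)
    return [transform(r) for r in good], bad


if __name__ == "__main__":
    rows = [
        {"user_id": " Alice ", "amount": 12.5},
        {"user_id": "", "amount": 3.0},      # rejected: empty user_id
        {"user_id": "bob", "amount": -1.0},  # rejected: negative amount
    ]
    ok, rejected = run_pipeline(rows)
    print(len(ok), len(rejected))  # 1 2
```

Keeping validation and transformation as small pure functions makes the same logic easy to unit-test locally and to drop into an Airflow task or Beam `DoFn` later.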

Required Skills:

  • 7–10 years of experience in Python for backend or data engineering.
  • Strong knowledge of GCP services (Dataflow, BigQuery, Cloud Functions, Cloud Composer, GCS).
  • Hands-on experience with Apache Spark, Kafka, Redis, FastAPI, and Airflow.
  • Expertise in SQL (SQL Server, Oracle, or PostgreSQL).
  • Experience with GitHub and CI/CD pipelines.
  • Experience with data migrations from on-prem to cloud.
  • Strong understanding of data pipeline architecture, ETL/ELT processes, and data integration.
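To illustrate the SQL data-validation skill listed above, here is a small hypothetical quality check: counting duplicate keys and missing foreign keys. SQLite (via Python's built-in `sqlite3`) stands in for SQL Server/Oracle/PostgreSQL, and the `orders` table and its columns are invented for the example.

```python
# Hypothetical data-quality checks in SQL. SQLite stands in for the
# databases named in the posting; table and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'c1', 10.0),
        (1, 'c1', 10.0),   -- duplicate order_id
        (2, NULL, 5.0),    -- missing customer_id
        (3, 'c2', 7.5);
""")

# Count surplus rows per duplicated key (cnt - 1 extras per group).
dup_count = conn.execute("""
    SELECT COALESCE(SUM(cnt - 1), 0) FROM (
        SELECT COUNT(*) AS cnt
        FROM orders
        GROUP BY order_id
        HAVING COUNT(*) > 1
    )
""").fetchone()[0]

# Count rows violating a NOT NULL expectation on customer_id.
null_count = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
).fetchone()[0]

print(dup_count, null_count)  # 1 1
```

The same `GROUP BY … HAVING` and `IS NULL` patterns translate directly to PostgreSQL, SQL Server, or BigQuery for pipeline audit queries.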

Good to Have (Optional):

  • Experience with Snowflake, Databricks, or GKE deployments.
  • Familiarity with Azure Data Factory (ADF) or other Azure tools.
  • Exposure to Cloud Run for deployments.

Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Ability to work in a fast-paced, agile environment.
