Data Engineer - Python & GCP || Face to Face Interview || Contract Job || 8-10 Years Experience

8 years

Posted: 1 week ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Contractual

Job Description

About Client:

Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, it has revenue of $1.8B and 35,000+ associates worldwide, and specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media.


Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.


Interview Details: Face-to-face interview

Job Title: Data Engineer - Python & GCP

Experience Level: 8-10 years

Job Location: Hyderabad

Budget: 1,50,000 per month

Job Type: Contract

Work Mode: Hybrid

Notice Period: Immediate joiners

Client: CMMI Level 5



We are looking for a skilled and motivated Senior Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team.

The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines.

The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
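As an illustrative sketch only (not the client's actual codebase), an ETL pipeline of the kind described above reduces to three stages; here they run over in-memory data, whereas a real pipeline would read from GCS and load into BigQuery via the Google Cloud client libraries:

```python
# Minimal ETL sketch: extract rows from CSV text, transform/cleanse
# them, and load into an in-memory sink. Field names are hypothetical.
import csv
import io

def extract(csv_text):
    """Parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Cleanse: strip whitespace, drop rows missing an id, cast amounts."""
    cleaned = []
    for row in rows:
        if not row.get("id", "").strip():
            continue  # data-quality rule: id is required
        cleaned.append({
            "id": row["id"].strip(),
            "amount": float(row.get("amount", 0) or 0),
        })
    return cleaned

def load(rows, sink):
    """Append transformed rows to a destination (a list stands in for a table)."""
    sink.extend(rows)
    return len(rows)

raw = "id,amount\n 1 ,10.5\n,3.0\n2,7\n"
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In a Cloud Composer deployment, each stage would typically become an Airflow task so that retries and monitoring apply per stage rather than to the pipeline as a whole.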


Key Responsibilities:


  • Design, develop, test, and maintain scalable ETL data pipelines using Python.
  • Work extensively with Google Cloud Platform (GCP) services such as:
    ○ Dataflow for real-time and batch data processing
    ○ Cloud Functions for lightweight serverless compute
    ○ BigQuery for data warehousing and analytics
    ○ Cloud Composer for orchestration of data workflows (based on Apache Airflow)
    ○ Google Cloud Storage (GCS) for managing data at scale
    ○ IAM for access control and security
    ○ Cloud Run for containerized applications
  • Hands-on experience in the following areas:
    ○ API framework: Python FastAPI
    ○ Processing engine: Apache Spark
    ○ Messaging and streaming data processing: Kafka
    ○ Storage: MongoDB, Redis/Bigtable
    ○ Orchestration: Airflow
  • Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
  • Implement and enforce data quality checks, validation rules, and monitoring.
  • Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
  • Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
  • Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
  • Document pipeline designs, data flow diagrams, and operational support procedures.
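The data-quality responsibility above can be sketched as a small set of validation rules run over each batch. The rule names, fields, and thresholds here are hypothetical, purely to show the pattern of quarantining failing records rather than silently dropping them:

```python
# Hypothetical data-quality checks for a batch of pipeline records.
# Each rule returns the ids of failing records so they can be
# quarantined and reported downstream.

def check_not_null(records, field):
    """Flag records where a required field is missing or empty."""
    return [r["id"] for r in records if not r.get(field)]

def check_range(records, field, lo, hi):
    """Flag records whose numeric field falls outside [lo, hi]."""
    return [r["id"] for r in records
            if not (lo <= r.get(field, lo) <= hi)]

def run_checks(records):
    """Run all rules and return a {rule_name: failing_ids} report."""
    return {
        "email_not_null": check_not_null(records, "email"),
        "age_in_range": check_range(records, "age", 0, 120),
    }

batch = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 29},
    {"id": 3, "email": "c@example.com", "age": 130},
]
report = run_checks(batch)
```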


Required Skills:


  • 8–10 years of hands-on experience in Python for backend or data engineering projects.
  • Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
  • Solid understanding of data pipeline architecture, data integration, and transformation techniques.
  • Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
  • Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Composer DAGs.
  • Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
  • Experience in data migrations from on-premises data sources to cloud platforms.
  • Experience working with Snowflake cloud data platform.
  • Experience in deployments in GKE, Cloud Run.
  • Hands-on knowledge of Databricks for big data processing and analytics.
  • Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.


Additional Details:


  • Excellent problem-solving and analytical skills.
  • Strong communication skills and ability to collaborate in a team environment.
