Posted: 3 days ago | Platform: LinkedIn

Work Mode: On-site
Job Type: Full Time

Job Description

Job Summary:

We are seeking a Senior Data Engineer to join a Global Data Analytics & Insights team focused on transforming enterprise data management and democratizing access to insights. This role leads a large-scale data migration from external agency-managed environments into an internal Google Cloud Platform (GCP) ecosystem. You will drive discovery, define and execute the migration strategy, rebuild ingestion pipelines, and ensure secure, compliant, always-on access for internal teams and approved external partners. The work directly supports global analytics initiatives by delivering scalable, high-quality, governed data products.


Roles and Responsibilities

  • Lead discovery and deep analysis of existing third-party data warehouses (e.g., Snowflake or hybrid setups), including data assets, lineage, dependencies, and integration patterns.
  • Define and execute end-to-end migration strategy to move data assets and workflows into internal GCP environments.
  • Migrate third-party API-based ingestion flows so that inbound data lands in GCP data lakes and warehouses (a minimal sketch follows this list).
  • Partner with product lines, business stakeholders, and downstream consumers (dashboards, CRM, analytics) to capture requirements and ensure smooth data consumption.
  • Design, build, and maintain scalable ingestion and transformation pipelines using Python, SQL, PySpark, and DBT/Dataform.
  • Develop and manage GCP-native data solutions using BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Dataproc, etc.
  • Implement strong data governance, security, access controls, and compliance practices using GCP security capabilities.
  • Integrate DevSecOps quality and security checks (e.g., SonarQube, FOSSA) into CI/CD pipelines, and respond to findings.
  • Orchestrate workflows with Apache Airflow/Astronomer and provision infrastructure via Terraform (IaC best practices).
  • Monitor and optimize performance, scalability, reliability, and cost-efficiency of pipelines and storage.
  • Promote engineering best practices, reusable patterns, automation, and continuous improvement across teams.
  • Produce clear documentation and communicate technical decisions effectively to technical and non-technical audiences.
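
Several of these responsibilities converge on one recurring pattern: pull data from a partner API and land it in BigQuery. The sketch below is illustrative only and not this team's actual pipeline; the endpoint URL, table ID, and record shape are assumptions made for the example.

```python
# A minimal ingestion sketch: fetch records from a hypothetical third-party
# API and land them in a BigQuery staging table. Endpoint, table ID, and
# record shape are illustrative assumptions.
import requests
from google.cloud import bigquery

API_URL = "https://api.example-partner.com/v1/records"  # hypothetical endpoint
TABLE_ID = "my-project.staging.partner_records"         # hypothetical table

def ingest_batch() -> None:
    # Fetch one batch of records from the partner API.
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    rows = resp.json()  # assumed to be a list of flat JSON objects

    # Stream the rows into the staging table; the table is assumed to
    # already exist with a matching schema.
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert errors: {errors}")

if __name__ == "__main__":
    ingest_batch()
```

A production version would add pagination, authentication, retries, and dead-lettering, and would typically land raw payloads in a lake bucket before curation.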


Required Skills:

  • Expert proficiency in Python (NumPy, Pandas), SQL, and PySpark.
  • Strong production-grade GCP experience with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Cloud Run, and Cloud Data Fusion/Dataprep.
  • Proven ability to build scalable ELT/ETL ingestion and curation pipelines.
  • Hands-on experience with DBT and/or Dataform for transformations.
  • Workflow orchestration expertise with Apache Airflow and/or Astronomer (see the DAG sketch after this list).
  • Proven experience integrating and managing third-party APIs for data ingestion.
  • CI/CD and DevOps exposure (e.g., Tekton) and strong GitHub-based version control.
  • Infrastructure as Code with Terraform.
  • Solid knowledge of data governance, encryption, masking, access management, and cloud security best practices.
  • Strong understanding of modern data ecosystems: data warehouses/lakes, metadata, meshes/fabrics, and analytics/AI use cases.
  • Experience with Agile delivery, user stories, and cross-functional collaboration.
  • Willingness to work on-site from the Chennai office.
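
As a companion to the orchestration requirement above, here is a minimal Airflow sketch, assuming Airflow 2.4+; the DAG ID, task commands, and file paths are hypothetical, not details from this posting.

```python
# A minimal orchestration sketch, assuming Airflow 2.4+: a daily DAG that
# runs the ingestion step and then a dbt transformation. DAG ID, commands,
# and paths are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="partner_data_ingestion",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    # Land raw data from the third-party API (see the ingestion sketch above).
    ingest = BashOperator(
        task_id="ingest_partner_api",
        bash_command="python /opt/pipelines/ingest_batch.py",  # hypothetical path
    )

    # Curate the landed data in BigQuery with dbt models.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )

    ingest >> transform
```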


Preferred Skills:

  • Direct experience with Snowflake migration, exploration, and optimization.
  • Experience with Java.
  • Familiarity with Master Data Management (MDM) concepts/tools.
  • Exposure to additional security/performance tools (e.g., Checkmarx, Dynatrace).
  • Working knowledge of GDPR and data privacy impact on architecture.


Required Experience:

  • 5–7 years in Data Engineering or Software Engineering with data-intensive systems.
  • 2+ years building and deploying production-scale cloud data platforms on GCP.
  • Demonstrated leadership in delivering migration or large data engineering programs.
  • Strong track record of optimizing compute/storage cost and performance in cloud environments.


Education:

  • Bachelor’s degree in Computer Science, Information Technology, Data Analytics, or related field (or equivalent practical experience).
