Python Developer / Data Engineer: Talend to Python Migration

3 - 5 years

7 - 15 Lacs

Posted: 2 weeks ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

About the Role

We are seeking experienced Python Developers/Data Engineers to join our team for a strategic migration project, moving enterprise ETL workloads from Talend to a modern Python-based ingestion framework on Google Cloud Platform (GCP). You will work closely with architects, platform engineers, and automation teams to design, build, and optimize scalable data pipelines, leveraging Airflow, Cloud Run, Dataproc, and BigQuery.

Key Responsibilities

  • ETL Migration & Development

Analyze existing Talend jobs/modules and design equivalent Python-based ETL workflows.

Develop, test, and deploy Python scripts/functions for ingestion, preprocessing, file management, and data loading (SFTP to GCS, file unzip, preprocessing, raw/harmonized table loads, DQ checks, view creation).

Implement orchestration using Airflow DAGs and Cloud Run jobs, ensuring modular, maintainable, and scalable solutions.

Integrate with automation tools (Automic) for job triggering, monitoring, and error handling.
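The ingestion flow described above (land a file, unzip it, run staged loads) can be sketched as a modular pipeline of named steps. This is a minimal illustration, not the team's actual framework: local paths stand in for SFTP/GCS, and the step names are hypothetical.

```python
# Sketch of a modular ingestion pipeline: each stage is a named,
# replaceable function, so stages can be logged and retried individually.
# Assumptions: local filesystem paths stand in for SFTP/GCS transfers.
import logging
import zipfile
from pathlib import Path

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def unzip_file(archive: Path, dest: Path) -> list[Path]:
    """Extract a landed archive into a staging directory."""
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        names = zf.namelist()
        zf.extractall(dest)
    return [dest / name for name in names]

def run_pipeline(steps):
    """Run ordered (name, callable) steps, logging each stage."""
    results = {}
    for name, fn in steps:
        log.info("running step: %s", name)
        results[name] = fn()
    return results
```

In an Airflow deployment, each step would typically become its own task (or Cloud Run job) rather than a function call, so failures re-run only the failed stage.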

  • Platform Configuration & Optimization

Configure and tune serverless workloads (Cloud Run, Dataproc, BigQuery) for performance, stability, and cost efficiency (e.g., retry strategies, timeout settings, CPU/memory sizing, autoscaling).

Apply best practices for resource allocation, parallelism, and batch processing in Python and PySpark jobs.
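Bounded parallelism with fixed-size batches, as mentioned above, can be sketched with the standard library alone. The worker count, batch size, and `process_partition` body are illustrative placeholders, not values from the actual platform.

```python
# Sketch of batched, bounded-parallel processing.
# Assumptions: `process_partition` stands in for real work (e.g. loading
# one batch of files); worker count and batch size are illustrative.
from concurrent.futures import ThreadPoolExecutor
from itertools import islice

def batched(items, size):
    """Yield fixed-size batches so memory use stays bounded."""
    it = iter(items)
    while batch := list(islice(it, size)):
        yield batch

def process_partition(batch):
    # Placeholder for real per-batch work.
    return sum(batch)

def run_batches(items, size=100, workers=4):
    """Process batches in parallel with a bounded worker pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_partition, batched(items, size)))
```

Capping `max_workers` (and, on Cloud Run, matching it to the configured CPU allocation) keeps parallelism from overwhelming memory limits; the same batching idea maps to partition counts in PySpark.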

  • Data Quality & Observability

Implement data quality checks, logging, and monitoring for traceability and operational excellence.

Ensure robust error handling, retry logic, and alerting for ETL pipelines.
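Robust retry logic with alerting, as called for above, often takes the shape of an exponential-backoff wrapper. A minimal sketch, assuming `alert` would post to a real monitoring channel and with delays shortened for illustration:

```python
# Sketch of retry-with-backoff for a pipeline step.
# Assumptions: `alert` is a hypothetical hook (e.g. a pager/Slack call);
# real base delays would be seconds, not zero.
import logging
import time

log = logging.getLogger("etl")

def with_retries(fn, attempts=3, base_delay=1.0, alert=None):
    """Call fn, retrying with exponential backoff; alert on final failure."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                if alert:
                    alert(exc)
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))
```

In practice Airflow's task-level `retries`/`retry_exponential_backoff` settings or Cloud Run job retry counts cover the orchestration layer; an in-code wrapper like this is useful for transient errors inside a single step (e.g. flaky SFTP reads).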

  • Security & Compliance

Work with IAM, service accounts, and Secret Manager for secure credential management and access control.

Adhere to data governance, privacy, and compliance requirements.

  • Collaboration & Documentation

Collaborate with cross-functional teams (platform, DevOps, data governance) to align on requirements and deliverables.

Document migration processes, configuration parameters, and operational procedures.

Required Skills & Experience

  • 3+ years of experience in Python development, with a focus on data engineering and ETL pipeline design.
  • Hands-on experience with GCP services: Cloud Run, Dataproc, BigQuery, Cloud Composer (Airflow), Cloud Storage.

  • Strong understanding of Talend ETL concepts; experience migrating Talend jobs to Python is highly desirable.
  • Experience with orchestration tools (Airflow, Automic) and automation of data workflows.
  • Proficiency in containerization (Docker), CI/CD, and artifact management.
  • Familiarity with data quality frameworks, logging, and monitoring tools.
  • Solid grasp of IAM, service accounts, and cloud security best practices.
  • Excellent problem-solving, communication, and documentation skills.

Preferred Qualifications

  • Experience with PySpark, serverless Spark jobs, and advanced resource tuning.
  • Exposure to enterprise data platforms, large-scale migrations, and cloud-native architectures.
  • Knowledge of GCP networking, firewall rules, and hybrid connectivity.
