ETL Developer

Experience: 3 years
Salary: 0 Lacs
Posted: 13 hours ago | Platform: LinkedIn
Work Mode: Remote
Job Type: Part Time

Job Description

Primary Title: ETL Developer (Data Integration Engineer) — REMOTE (India)

About The Opportunity

We are a fast-growing company in the HR Tech & talent solutions sector, delivering scalable data platforms that power analytics, reporting, and business decision-making. We run cloud-first data warehouses and production ETL workflows to unify HR, talent, and operational data for global customers. Join a distributed engineering team delivering high-quality, reliable pipelines and measurable business impact.

Role & Responsibilities

  • Design, build and maintain scalable ETL/ELT pipelines to ingest, transform and load data from multiple source systems into the central data warehouse.
  • Implement robust data transformations using SQL, Python/Scala and ETL tools (Informatica/SSIS/Talend/Azure Data Factory/AWS Glue) with clear lineage and documentation.
  • Develop and operate workflow orchestration (Airflow or native cloud schedulers), ensuring reliable scheduling, retry logic and alerting for production jobs; a minimal sketch follows this list.
  • Optimize query and pipeline performance: partitioning, indexing, Spark tuning, and cost-efficient cloud resource usage.
  • Implement data quality checks, monitoring, logging and automated alerts to maintain pipeline reliability and SLA adherence.
  • Collaborate with analysts, data scientists and product teams to translate business requirements into performant data models and documentation; support release & CI/CD processes.
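
For context on the orchestration and data-quality responsibilities above, here is a minimal sketch assuming Apache Airflow 2.4+ with the TaskFlow API; the DAG name, schedule, sample records and alert address are illustrative placeholders, not details taken from this posting.

  from datetime import datetime, timedelta

  from airflow.decorators import dag, task

  # Retry and alerting defaults applied to every task in the DAG.
  default_args = {
      "retries": 3,                              # retry transient failures
      "retry_delay": timedelta(minutes=10),
      "email_on_failure": True,                  # assumes SMTP alerting is configured
      "email": ["data-alerts@example.com"],      # hypothetical alert address
  }

  @dag(
      dag_id="hr_events_daily",                  # hypothetical pipeline name
      schedule="0 2 * * *",                      # daily at 02:00
      start_date=datetime(2024, 1, 1),
      catchup=False,
      default_args=default_args,
  )
  def hr_events_daily():
      @task
      def extract():
          # In production this would pull rows from a source system (API, DB, files).
          return [{"employee_id": 1, "event": "hire"}, {"employee_id": 2, "event": "exit"}]

      @task
      def transform(rows):
          # Apply business transformations before loading into the warehouse.
          return [{**r, "event": r["event"].upper()} for r in rows]

      @task
      def quality_check(rows):
          # Simple data quality gate: failing the task triggers retries and alerts.
          if not rows or any("employee_id" not in r for r in rows):
              raise ValueError("Quality check failed: empty batch or missing employee_id")
          return rows

      @task
      def load(rows):
          # Replace with a warehouse load (Snowflake/Redshift/BigQuery) in practice.
          print(f"Loaded {len(rows)} rows")

      load(quality_check(transform(extract())))

  hr_events_daily()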

Skills & Qualifications

  • Must-Have: 3+ years hands-on ETL development experience; advanced SQL skills and production experience with at least one ETL tool (Informatica, SSIS, Talend, ADF or AWS Glue).
  • Must-Have: Strong scripting skills in Python or Scala for data transformation, automation and troubleshooting.
  • Must-Have: Practical experience with cloud data warehouses (Snowflake, Redshift, BigQuery) and knowledge of data modelling (star/snowflake schemas); see the modelling example after this list.
  • Must-Have: Experience operating workflows/orchestrators (Airflow) and applying data quality, monitoring and alerting patterns in production.
  • Preferred: Familiarity with Spark (batch/streaming), CI/CD for data pipelines, Git, and container-based deployment.
  • Preferred: Exposure to large-scale data ingestion patterns, cost optimisation in cloud environments and prior remote/distributed team experience.
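
As a rough illustration of the warehouse and data-modelling skills above, the sketch below creates a partitioned, clustered star-schema fact table, assuming BigQuery and the google-cloud-bigquery client; the dataset, table and column names are hypothetical.

  from google.cloud import bigquery

  client = bigquery.Client()  # uses application-default credentials

  # A star-schema fact table: numeric measures plus foreign keys to dimension
  # tables (dim_candidate, dim_job). Partitioning by event_date and clustering
  # by job_key keeps scans and cost down for date-bounded, per-job queries.
  ddl = """
  CREATE TABLE IF NOT EXISTS hr_mart.fact_hiring_events (
    event_id          INT64,
    candidate_key     INT64,  -- FK -> hr_mart.dim_candidate
    job_key           INT64,  -- FK -> hr_mart.dim_job
    event_date        DATE,
    time_to_hire_days INT64
  )
  PARTITION BY event_date
  CLUSTER BY job_key
  """

  client.query(ddl).result()  # blocks until the DDL statement completes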

Benefits & Culture Highlights

  • Fully remote role across India with flexible hours and a focus on outcomes over face-time.
  • Opportunity to work on cloud-first data platforms, modern ETL tooling and influence data architecture decisions.
  • Learning allowance, technical training and clear career progression paths within a fast-scaling product and engineering organisation.

We’re seeking pragmatic, quality-driven ETL engineers who care about data accuracy, performance and delivering reliable production systems. If you thrive in remote teams and enjoy solving complex data integration challenges, we’d like to hear from you.

Skills: ETL, data, SQL
