Senior ETL Developer (Complex Data Flows)

3 years

25 - 30 Lacs

Posted: 1 day ago | Platform: Indeed


Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Senior ETL Developer – Complex Data Flows & High-Performance Systems
Location: Gurugram
Experience: 3+ years
Department: Data Engineering / Analytics

Job Summary:
We are looking for a highly skilled ETL Developer with extensive experience in managing complex data flows, designing scalable pipelines, and optimizing performance in large data environments. The ideal candidate has hands-on experience with ETL tools, strong knowledge of PostgreSQL, and the ability to manage environments running thousands of concurrent processes (e.g., 5,000+ backend processes in a single PostgreSQL setup).
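
For a sense of scale, a high-concurrency setup like this is commonly monitored through PostgreSQL's pg_stat_activity view, which holds one row per server process. A minimal Python sketch using psycopg2 follows; the connection parameters are placeholders, not real values.

  import psycopg2

  # Placeholder connection parameters; a real deployment would read these
  # from configuration or environment variables.
  conn = psycopg2.connect(host="localhost", dbname="warehouse", user="etl_user")

  with conn, conn.cursor() as cur:
      # pg_stat_activity has one row per server process, so grouping by
      # state shows how many backends are active, idle, or background work.
      cur.execute("""
          SELECT coalesce(state, 'background') AS state, count(*)
          FROM pg_stat_activity
          GROUP BY 1
          ORDER BY 2 DESC;
      """)
      for state, n in cur.fetchall():
          print(f"{state}: {n} processes")

  conn.close()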

Key Responsibilities:

  • Design, build, and maintain robust ETL pipelines for large-scale and complex datasets
  • Develop, optimize, and troubleshoot SQL queries in PostgreSQL, including working with high-concurrency environments
  • Work with PostgreSQL instances running 5,000+ concurrent processes, or setups of similar scale
  • Manage data ingestion from multiple sources, ensuring data integrity, consistency, and availability
  • Monitor data workflows, identify bottlenecks, and apply performance tuning (see the tuning sketch after this list)
  • Collaborate with data architects, analysts, and stakeholders to define and fulfill data requirements
  • Ensure data quality, validation, and reconciliation across systems
  • Create and maintain documentation for data processes, models, and architecture
  • Ensure ETL pipelines meet security, privacy, and compliance standards
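
As an illustration of the tuning work above, a common first step is ranking statements by average execution time using the pg_stat_statements extension. The sketch below assumes that extension is installed and PostgreSQL 13+ (older releases name the column mean_time); connection parameters are placeholders.

  import psycopg2

  # Placeholder connection parameters.
  conn = psycopg2.connect(host="localhost", dbname="warehouse", user="etl_user")

  with conn, conn.cursor() as cur:
      # Rank recorded statements by average execution time; mean_exec_time
      # is the PostgreSQL 13+ column name (older releases use mean_time).
      cur.execute("""
          SELECT calls, round(mean_exec_time::numeric, 1) AS avg_ms, query
          FROM pg_stat_statements
          ORDER BY mean_exec_time DESC
          LIMIT 5;
      """)
      for calls, avg_ms, query in cur.fetchall():
          print(f"{avg_ms} ms avg over {calls} calls: {query[:60]}")

  conn.close()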

Required Skills & Experience:

  • 3+ years of experience in ETL development and complex data workflows
  • Strong hands-on experience with PostgreSQL, including optimization at scale
  • Proven ability to manage and process data across massively parallel systems (e.g., environments with 5,000+ concurrent processes)
  • Proficient in SQL, PL/pgSQL, and performance tuning
  • Experience with ETL tools such as Talend, Apache NiFi, Informatica, or Airflow (a minimal Airflow sketch follows this list)
  • Familiarity with big data ecosystems (Hadoop, Spark, Kafka) is a plus
  • Strong understanding of data modeling, warehousing, and data governance
  • Excellent analytical, debugging, and problem-solving skills
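
For a flavour of the pipeline tooling above, here is a minimal Airflow sketch of an extract-transform-load flow. The DAG id and sample data are illustrative only, and the schedule argument assumes Airflow 2.4+ (earlier versions use schedule_interval).

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract():
      # Pull raw rows from a source system (placeholder data).
      return [{"id": 1, "amount": "42.50"}]

  def transform(ti):
      rows = ti.xcom_pull(task_ids="extract")
      # Normalize types before loading.
      return [{**row, "amount": float(row["amount"])} for row in rows]

  def load(ti):
      rows = ti.xcom_pull(task_ids="transform")
      print(f"would load {len(rows)} rows into the warehouse")

  with DAG(
      dag_id="orders_etl",             # illustrative name
      start_date=datetime(2024, 1, 1),
      schedule="@hourly",              # 'schedule' requires Airflow 2.4+
      catchup=False,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      transform_task = PythonOperator(task_id="transform", python_callable=transform)
      load_task = PythonOperator(task_id="load", python_callable=load)
      extract_task >> transform_task >> load_task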

Preferred Qualifications:

  • Experience in cloud platforms (AWS, GCP, or Azure)
  • Familiarity with DevOps and CI/CD practices for data pipelines
  • Exposure to real-time streaming data processing (see the consumer sketch after this list)
  • Knowledge of scripting languages (Python, Bash, etc.)
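
As a small illustration of the streaming exposure mentioned above, here is a minimal consumer built on the kafka-python client; the topic name and broker address are placeholders.

  import json

  from kafka import KafkaConsumer

  consumer = KafkaConsumer(
      "orders-events",                     # placeholder topic
      bootstrap_servers="localhost:9092",  # placeholder broker
      value_deserializer=lambda b: json.loads(b.decode("utf-8")),
      auto_offset_reset="earliest",
  )

  for message in consumer:
      event = message.value
      # In a real pipeline this would be validated and written to PostgreSQL.
      print(f"offset={message.offset} event={event}")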

Education:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field

Job Types: Full-time, Permanent

Pay: ₹2,500,000.00 - ₹3,000,000.00 per year

Ability to commute/relocate:

  • Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Preferred)

Application Question(s):

  • Are you serving your notice period? If yes, what is your last working day?
  • What is your expected CTC (ECTC)? Please mention it here.
