Senior ETL Developer

5 - 7 years


Posted: 1 day ago | Platform: Foundit


Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Senior Data Engineer

We are seeking an experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines and cloud-based data platforms. The ideal candidate will have strong hands-on experience with ETL/ELT development, data modeling, and modern data engineering tools. This role will focus on building reliable, high-performance data solutions that support analytics, operations, and business decision-making.

Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines for batch and near-real-time data ingestion.
  • Develop and optimize data integration workflows using tools such as Matillion, Informatica, dbt, ADF, Airflow, Glue, or similar.
  • Implement and maintain data warehouse and lakehouse architectures (Snowflake, Databricks, Redshift, Synapse).
  • Create and maintain logical and physical data models (star, snowflake, 3NF, data vault).
  • Write efficient SQL and Python code for data processing, transformation, and automation.
  • Ensure data quality, validation, and observability through testing frameworks and monitoring tools.
  • Collaborate with data architects, analysts, and business teams to deliver high-quality data solutions.
  • Support performance tuning, cost optimization, and environment management within cloud platforms.
  • Implement and follow data governance, metadata management, and security best practices.
  • Contribute to CI/CD processes, version-controlled development, and deployment automation.
  • Document data flows, pipeline designs, and system-level technical artifacts for reference and audit.

Required Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 5+ years of experience in data engineering.
  • Strong expertise in SQL, Python, and data transformation techniques.
  • Hands-on experience with modern cloud platforms (AWS, Azure, or GCP).
  • Proficient in at least one cloud-native data warehouse (e.g., Snowflake, Redshift, Databricks, BigQuery, Synapse).
  • Experience with ETL/ELT tools (e.g., Matillion, Informatica, Talend, dbt, AWS Glue, Azure Data Factory (ADF)).
  • Familiarity with Python/Scala for data processing.
  • Understanding of data governance, metadata management, and security practices.
  • Experience with CI/CD for data pipelines and version-controlled deployments.

Preferred Qualifications

  • Certifications in Snowflake, AWS, Azure, or GCP are a plus.
  • Experience with data cataloging tools (e.g., Alation, Collibra, Apache Atlas).
  • Exposure to real-time data streaming technologies (e.g., Kafka, Kinesis).
  • Understanding of AI/ML integration within data platforms.
  • Exposure to BI tools such as Tableau or Sigma is a plus.

Soft Skills

  • Strong communication skills with the ability to work cross-functionally.
  • Proactive, detail-oriented, and structured in problem solving.
  • Ability to mentor junior engineers and contribute to team development.
  • Demonstrated ownership and accountability in delivering data solutions.
