ETL Lead - Data Pipeline

Experience

8 years

Posted: 2 weeks ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

We are seeking a highly skilled ETL Lead to design, develop, and manage our data integration pipelines and ensure smooth data operations across the organization. The role requires deep expertise in ETL processes, advanced SQL, Snowflake, Python scripting, and DBT (Data Build Tool). You will lead a team of data engineers and collaborate closely with business stakeholders to deliver robust data solutions.

Key Responsibilities

  • Design, develop, and maintain ETL workflows to extract, transform, and load data from various sources into the data warehouse (Snowflake).
  • Lead the end-to-end delivery of data integration projects, including planning, technical architecture, development, and deployment.
  • Develop and optimize complex SQL queries for data extraction, aggregation, and reporting.
  • Build data transformation models and pipelines using DBT and ensure data accuracy and consistency.
  • Write Python scripts for data processing, automation, and integration with external APIs or systems.
  • Monitor ETL jobs, troubleshoot performance issues, and implement best practices for scalability and reliability.
  • Mentor and guide junior team members in ETL development, coding standards, and data governance.
  • Collaborate with data analysts, business users, and product teams to understand requirements and translate them into technical solutions.
  • Ensure adherence to data security, privacy, and compliance policies.

Required Skills & Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
  • 8+ years of experience in ETL development and data engineering.
  • Strong expertise in ETL tools and processes.
  • Advanced proficiency in SQL, including query optimization and performance tuning.
  • Hands-on experience with Snowflake, including schema design, SnowSQL, and Snowflake utilities.
  • Proficient in Python for scripting and data manipulation.
  • Experience with DBT for data transformation and modeling.
  • Knowledge of data warehousing concepts, data governance, and data quality frameworks.
  • Experience working in Agile environments and managing end-to-end project delivery.
  • Excellent problem-solving and communication skills.
  • Snowflake certification(s).
  • Experience integrating data from cloud platforms (AWS, Azure, GCP).
  • Familiarity with CI/CD practices and version control (Git).
  • Exposure to BI tools such as Tableau or Power BI.
(ref:hirist.tech)
