Posted: 2 days ago | Platform: Shine


Work Mode: Remote

Job Type: Full Time

Job Description

Role Overview: ETL Developer

This role requires hands-on technical expertise, problem-solving skills, and the ability to collaborate with cross-functional teams to deliver scalable and efficient data solutions.

Key Responsibilities:

  • ETL Development & Data Integration:

    • Design, develop, and optimize ETL pipelines using Snowflake and dbt (Data Build Tool).
    • Implement transformations using Jinja templates and SQL models in dbt.
    • Ensure data is cleansed, validated, and transformed accurately to meet business requirements.
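As a sketch of what one such pipeline component might look like, here is a minimal incremental dbt model combining a Jinja config block with SQL; the source, table, and column names (`raw.orders`, `order_id`, and so on) are illustrative assumptions, not part of this role's actual codebase:

```sql
-- models/staging/stg_orders.sql — minimal dbt incremental model sketch.
-- All object names here are hypothetical.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    cast(order_ts as timestamp) as order_ts,
    amount
from {{ source('raw', 'orders') }}
{% if is_incremental() %}
  -- on incremental runs, only pull rows newer than what is already loaded
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

The `is_incremental()` guard is what lets the same model run as a full refresh or a delta load without code changes.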
  • Data Modeling & Architecture:

    • Work with stakeholders to design and implement star schema / snowflake schema and other optimized data models.
    • Support data warehousing initiatives and ensure alignment with data architecture best practices.
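A star schema of the kind described above pairs a central fact table with surrounding dimension tables; a minimal Snowflake DDL sketch (table and column names are illustrative assumptions):

```sql
-- Illustrative star schema: one fact table keyed to a dimension.
create table dim_customer (
    customer_key number autoincrement primary key,
    customer_id  varchar,
    region       varchar
);

create table fact_sales (
    order_id     varchar,
    customer_key number references dim_customer (customer_key),
    order_date   date,
    amount       number(12, 2)
);
```

Queries then join the narrow fact table to dimensions only when descriptive attributes are needed, which keeps scans and joins cheap.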
  • Performance Optimization:

    • Monitor, troubleshoot, and optimize SQL queries, dbt models, and Snowflake pipelines for better efficiency.
    • Implement best practices for performance tuning, query optimization, and cost management in Snowflake.
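Two common Snowflake tuning levers of the sort referenced above, shown as a sketch; the table, column, and warehouse names are assumptions:

```sql
-- Cluster a large table on a frequently filtered column to improve
-- micro-partition pruning (hypothetical table/column names).
alter table fact_sales cluster by (order_date);

-- Suspend an idle warehouse quickly to control credit spend
-- (hypothetical warehouse name).
alter warehouse etl_wh set auto_suspend = 60 auto_resume = true;
```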
  • Automation & Scripting:

    • Leverage Python for automating ETL workflows, data quality checks, and operational tasks.
    • Integrate dbt with orchestration frameworks (e.g., Airflow, dbt Cloud, or equivalent).
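A minimal sketch of the kind of Python data-quality gate that might run between ETL steps; the check rules and field names are illustrative assumptions, not a prescribed framework:

```python
# Hypothetical batch-level quality gate: required fields and key uniqueness.
def run_quality_checks(rows, key="order_id", required=("order_id", "amount")):
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                failures.append(f"row {i}: missing {field}")
        k = row.get(key)
        if k in seen:
            failures.append(f"row {i}: duplicate {key}={k}")
        seen.add(k)
    return failures

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": None},  # duplicate key and missing amount
]
print(run_quality_checks(batch))
# → ['row 1: missing amount', 'row 1: duplicate order_id=1']
```

A gate like this can be wrapped in an Airflow task so a failing batch blocks downstream loads instead of propagating bad data.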
  • Data Governance & Quality:

    • Implement data validation frameworks, audit checks, and reconciliation processes.
    • Maintain documentation of ETL workflows, data models, and transformations for transparency and governance.
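The reconciliation processes mentioned above typically compare aggregates between source and target; a minimal Python sketch, with metric names and tolerance chosen purely for illustration:

```python
# Hypothetical source-to-target reconciliation on aggregate metrics.
def reconcile(source_totals, target_totals, tolerance=0.0):
    """Return {metric: (source, target)} for every metric outside tolerance."""
    mismatches = {}
    for metric, src_value in source_totals.items():
        tgt_value = target_totals.get(metric)
        if tgt_value is None or abs(src_value - tgt_value) > tolerance:
            mismatches[metric] = (src_value, tgt_value)
    return mismatches

src = {"row_count": 1000, "amount_sum": 52340.50}
tgt = {"row_count": 1000, "amount_sum": 52300.00}
print(reconcile(src, tgt, tolerance=0.01))
# → {'amount_sum': (52340.5, 52300.0)}
```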
  • Collaboration & Support:

    • Partner with business analysts, data scientists, and BI developers to provide high-quality, reliable datasets.
    • Provide production support for ETL jobs, ensuring timely resolution of issues.

Required Skills (Mandatory):

  • Snowflake:

    • Advanced knowledge of Snowflake features (Warehouses, Schemas, Cloning, Micro-partitioning, Streams, Tasks).
    • Experience in query optimization, performance tuning, and cost-effective scaling in Snowflake.
  • dbt (Data Build Tool):

    • Strong experience in developing and maintaining dbt models.
    • Proficiency in Jinja templating and SQL transformations in dbt.
    • Knowledge of dbt testing frameworks and deployment practices.
  • SQL Expertise:

    • Advanced SQL programming skills for data transformation, analysis, and performance optimization.
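The dbt testing frameworks referenced above are usually declared in a schema YAML file alongside the models; a minimal sketch, where the model and column names are illustrative assumptions:

```yaml
# models/staging/schema.yml — hypothetical dbt tests.
version: 2
models:
  - name: stg_orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: customer_id
        tests:
          - relationships:
              to: ref('dim_customer')
              field: customer_id
```

Running `dbt test` then compiles each declaration into a SQL assertion that fails the build when violated.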

Secondary Skills (Preferred):

  • Python:

    • Strong scripting and automation capabilities.
    • Experience in integrating Python scripts into ETL workflows.
    • Familiarity with data manipulation libraries (e.g., Pandas, PySpark).
  • Familiarity with cloud platforms (AWS/Azure/GCP) and orchestration tools like Airflow.
  • Knowledge of version control (Git/GitHub/GitLab) and CI/CD practices.
  • Exposure to data visualization tools (Tableau, Power BI) is a plus.
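As a sketch of the pandas-style data manipulation listed above, a small cleanse-and-aggregate step; the column names and rules are illustrative assumptions:

```python
import pandas as pd

# Hypothetical ETL step: dedupe on the business key, drop incomplete rows,
# then aggregate amounts per region.
raw = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "region":   ["east", "west", "west", None],
    "amount":   [10.0, 20.0, 20.0, 5.0],
})

clean = (
    raw.drop_duplicates(subset="order_id")   # keep first row per order_id
       .dropna(subset=["region"])            # drop rows missing a region
)
totals = clean.groupby("region")["amount"].sum()
print(totals.to_dict())
# → {'east': 10.0, 'west': 20.0}
```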
