Senior Data Engineer – Data Build Tool

Experience Required: 8 years


Posted: 1 day ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description


About the Role:

We are seeking a Senior Data Engineer with deep experience in DBT (Data Build Tool) to join our data team. You will be responsible for building scalable and maintainable data pipelines, transforming raw data into actionable insights, and helping shape the future of our data architecture and governance practices.


Key Responsibilities:

● Design, develop, and maintain data pipelines using DBT, SQL, and orchestration tools like Airflow or Prefect

● Collaborate with data analysts, scientists, and stakeholders to understand data needs and deliver clean, well-modeled datasets

● Optimize DBT models for performance and maintainability

● Implement data quality checks, version control, and documentation standards in DBT

● Work with cloud data warehouses like Snowflake, BigQuery, Redshift, or Databricks

● Own and drive best practices around data modeling (Kimball, Star/Snowflake schemas), transformation layers, and CI/CD for data

● Collaborate with cross-functional teams to integrate data from various sources (APIs, third-party tools, internal services)

● Monitor and troubleshoot data pipelines and ensure timely delivery of data to business stakeholders

● Mentor junior engineers and contribute to team growth and development
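
To illustrate the DBT responsibilities above (transformation layers, data quality checks, and documentation), a minimal sketch follows. The model and column names (fct_orders, stg_orders, order_id) are hypothetical examples, not part of this role's actual stack:

```sql
-- models/marts/fct_orders.sql  (hypothetical model)
-- A simple transformation-layer model: staging models feed a fact table.
select
    o.order_id,
    o.customer_id,
    o.order_date,
    sum(i.amount) as order_total
from {{ ref('stg_orders') }} as o
join {{ ref('stg_order_items') }} as i
    on o.order_id = i.order_id
group by 1, 2, 3
```

```yaml
# models/marts/schema.yml  (hypothetical file)
# Documentation and data quality checks live alongside the model.
version: 2
models:
  - name: fct_orders
    description: "One row per order with its summed line-item total."
    columns:
      - name: order_id
        description: "Primary key of the order."
        tests:
          - unique
          - not_null
```

Running `dbt run` builds the model and `dbt test` executes the declared checks; both steps fit naturally into the Git-based version control and CI/CD practices called out above.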


Required Skills:

● 7+ years of experience in Data Engineering or related fields

● 4+ years of hands-on experience with DBT (Core or Cloud)

● Strong SQL skills and experience with modular data modeling

● Experience with ELT/ETL pipelines using orchestration tools like Airflow, Dagster, Prefect, or similar

● Solid understanding of data warehouse architecture and performance tuning

● Proficient with one or more cloud platforms: AWS, GCP, or Azure

● Familiarity with version control (Git), CI/CD pipelines, and testing frameworks in data engineering

● Experience working with structured and semi-structured data formats (e.g., JSON, Parquet)

● Excellent communication and documentation skills


Preferred Qualifications:

● Experience with DataOps practices and monitoring tools

● Familiarity with Python or Scala for data processing

● Exposure to Looker, Tableau, or other BI tools

● Knowledge of data governance, cataloging, or lineage tools (e.g., Great Expectations, Monte Carlo, Atlan)
