DBT Engineer

5 - 9 years

18 - 22 Lacs

Posted: 5 days ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: DBT Developer - Pune

About Us

Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry — projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: DBT Developer - Pune

Location: Pune

Work Mode: Hybrid (3 days WFO - Tues, Wed, Thurs)

Shift Time: 12:30 PM to 9:30 PM

Job Summary

We are seeking a skilled and detail-oriented DBT Engineer to join our cross-functional Agile team. In this role, you will be responsible for designing, building, and maintaining modular, reliable data transformation pipelines using dbt (Data Build Tool) in a Snowflake environment. You will collaborate closely with backend and frontend engineers, product managers, and analysts to create analytics-ready data models that power application features, reporting, and strategic insights. This is an exciting opportunity for someone who values clean data design, modern tooling, and working at the intersection of engineering and business.

Key Responsibilities

  • Design, build, and maintain scalable, modular dbt models and transformation pipelines using dbt Core; dbt Cloud experience is good to have.
  • Understand dbt architecture thoroughly, with experience writing Python operators in dbt flows and strong experience writing Jinja code, macros, seeds, etc.
  • Write SQL to transform raw data into curated, tested datasets in Snowflake.
  • Apply data modeling techniques such as data vault and dimensional modeling (Kimball/Inmon).
  • Collaborate with full-stack developers and UI/UX engineers to support application features that rely on transformed datasets.
  • Work closely with analysts and stakeholders to gather data requirements and translate them into reliable data models.
  • Enforce data quality through rigorous testing, documentation, and version control in dbt.
  • Participate in Agile ceremonies (e.g., stand-ups, sprint planning) and manage tasks using Jira.
  • Integrate dbt into CI/CD pipelines and support automated deployment practices.
  • Monitor data performance and pipeline reliability, and proactively resolve issues.
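To give a flavor of the day-to-day work described above, here is a minimal sketch of a dbt model combining SQL and Jinja — an incremental Snowflake model that calls a user-defined macro. The source `raw`, table `orders`, and macro `cents_to_dollars` are hypothetical placeholders, not part of this posting:

```sql
-- models/staging/stg_orders.sql
-- Illustrative dbt model: incremental materialization keyed on order_id.
-- Assumes a source named "raw" with an "orders" table and a project macro
-- cents_to_dollars() are defined elsewhere in the dbt project.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_ts,
    {{ cents_to_dollars('amount_cents') }} as amount_usd
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is already loaded
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

Models like this are built with `dbt run` and wired into downstream marts via `{{ ref(...) }}`.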

Mandatory Qualifications & Skills

  • 3-5 years of experience in data engineering or analytics engineering, with a focus on SQL-based data transformation.
  • Hands-on production experience using dbt Core or dbt Cloud as a primary development tool.
  • Strong command of SQL and a solid understanding of data modeling best practices (e.g., star/snowflake schema).
  • Proven experience with Snowflake as a cloud data warehouse.
  • Python skills for data pipeline integration or ingestion.
  • Familiarity with Git-based version control workflows.
  • Strong communication and collaboration skills, with the ability to work across engineering and business teams.
  • Experience working in Agile/Scrum environments and managing work using Jira.
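As an example of the testing discipline these qualifications point to, a singular dbt test is simply a SQL file under `tests/` that fails if it returns any rows. The model name `stg_orders` here is a hypothetical placeholder:

```sql
-- tests/assert_no_duplicate_order_ids.sql
-- Singular dbt test: dbt executes this query during `dbt test` and
-- fails the test if the query returns one or more rows.
select
    order_id,
    count(*) as n
from {{ ref('stg_orders') }}
group by order_id
having count(*) > 1
```

Generic tests (`unique`, `not_null`, etc.) declared in a model's YAML file cover the common cases; singular tests like this one express bespoke assertions.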

Nice-to-Have Skills

  • Knowledge of data orchestration tools (e.g., Apache Airflow) is a big plus.
  • Exposure to CI/CD pipelines and integrating dbt into automated workflows.
  • Experience with cloud platforms such as AWS.
  • Familiarity with Docker and container-based development.
  • Understanding of how data is consumed in downstream analytics tools (e.g., Looker, Tableau, Power BI).

Preferred Experience

  • A track record of building and maintaining scalable dbt projects in a production setting.
  • Experience working in cross-functional teams involving developers, analysts, and product managers.
  • A strong sense of ownership, documentation habits, and attention to data quality and performance.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
