Senior Data Engineer (Snowflake & dbt)

Experience: 5 years

Salary: ₹18 - 22 Lacs per annum

Posted: 1 day ago | Platform: GlassDoor

Work Mode: On-site

Job Type: Full Time

Job Description

Logic Pursuits is seeking an experienced Senior Data Engineer to lead the design and implementation of scalable, high-performance data pipelines using Snowflake and dbt. You will define architectural best practices, optimize data platforms, and mentor junior engineers while collaborating with clients to deliver robust, production-grade data solutions.

Key Responsibilities

  • Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
  • Design layered data models (staging, intermediate, marts / medallion architecture) aligned with dbt best practices.
  • Lead ingestion of structured and semi-structured data (APIs, flat files, and cloud storage such as Azure Data Lake and AWS S3) into Snowflake.
  • Optimize Snowflake performance and cost (warehouse sizing, clustering, materializations, query profiling, credit monitoring).
  • Leverage advanced dbt features: macros, packages, custom tests, sources, exposures, and documentation.
  • Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory integrated with CI/CD pipelines.
  • Define and enforce data governance & compliance (RBAC, secure data sharing, encryption).
  • Collaborate with analysts, data scientists, architects, and stakeholders to deliver validated business-ready datasets.
  • Mentor junior engineers, lead code/architecture reviews, and establish reusable frameworks.
  • Manage end-to-end project delivery in a client-facing consulting environment.
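
As a rough illustration of the orchestration responsibility listed above, the sketch below shows an Airflow DAG that runs dbt build against a Snowflake project on a daily schedule. It assumes Airflow 2.x and the dbt CLI are available; the project path, DAG id, and schedule are hypothetical placeholders, not details taken from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical location of the dbt project (and its profiles.yml) on the worker.
    DBT_PROJECT_DIR = "/opt/dbt/analytics"

    with DAG(
        dag_id="dbt_snowflake_daily",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",             # Airflow 2.4+ keyword; older versions use schedule_interval
        catchup=False,
    ) as dag:
        # "dbt build" runs models and their tests together; dbt derives the dependency
        # graph (staging -> intermediate -> marts) from ref() calls inside the project.
        dbt_build = BashOperator(
            task_id="dbt_build",
            bash_command=(
                f"dbt build --project-dir {DBT_PROJECT_DIR} "
                f"--profiles-dir {DBT_PROJECT_DIR}"
            ),
        )

In practice this single task is often split into separate run and test steps, or replaced by a dbt Cloud job trigger, depending on how the CI/CD pipeline is wired up.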

Required Qualifications

  • 5–8 years of data engineering experience, with 3+ years hands-on with Snowflake & dbt in production.
  • Strong expertise in SQL & Snowflake (performance optimization, clustering, cost management).
  • Solid experience in dbt development: modular model design, macros, tests, documentation, Git.
  • Workflow orchestration skills: dbt Cloud, Airflow, or Azure Data Factory.
  • Experience ingesting data from APIs, cloud storage, and databases.
  • Knowledge of dimensional modeling, SCDs, and medallion architecture (Bronze/Silver/Gold layers).
  • Proficiency in working with data formats (Parquet, JSON, CSV, etc.).
  • Programming: Python & SQL (strong), Jinja (nice-to-have).
  • Exposure to ETL/ELT pipeline design, data quality frameworks, and AI/ML pipelines (good to have).
  • Experience with data security & governance (dbt tests, encryption, RBAC, secure sharing).
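
As a rough illustration of the ingestion and data-format requirements above, the sketch below uses the snowflake-connector-python package to copy Parquet files from a cloud-storage stage into a Snowflake landing table. The connection parameters, warehouse, stage, and table names are hypothetical placeholders.

    import os

    import snowflake.connector

    # Hypothetical connection; credentials are read from environment variables here,
    # while a production setup would more likely use key-pair auth or a secrets manager.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",  # hypothetical warehouse
        database="RAW",       # hypothetical database
        schema="SALES",       # hypothetical schema
    )

    try:
        with conn.cursor() as cur:
            # Copy Parquet files from an external stage (backed by S3 or Azure Data Lake)
            # into a raw landing table, matching columns by name rather than position.
            cur.execute("""
                COPY INTO RAW.SALES.ORDERS_RAW
                FROM @RAW.SALES.ORDERS_STAGE
                FILE_FORMAT = (TYPE = PARQUET)
                MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
            """)
    finally:
        conn.close()

From a landing table like this, downstream dbt staging and mart models would typically take over the transformation work.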

Soft Skills & Leadership

  • Strong client-facing communication & stakeholder management skills.
  • Ability to manage multiple priorities in fast-paced delivery cycles.
  • Proven project ownership: design, implementation, monitoring.
  • Experience mentoring junior engineers & driving best practices.
  • Agile ways of working (sprints, scrum ceremonies, estimation).

Education & Certifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • Certifications like Snowflake SnowPro Advanced or dbt Certified Developer are a plus.

Job Type: Full-time

Pay: ₹1,800,000.00 - ₹2,200,000.00 per year
