Data Engineer - Snowflake DBT

3 - 5 years

12 - 15 Lacs

Posted: 17 hours ago | Platform: GlassDoor

Work Mode

On-site

Job Type

Part Time

Job Description

Role : Data Engineer (Snowflake, DBT)

Location : Hyderabad

Experience : 3 to 5 years

Must have :

Experience in SSIS

Experience in cloud storage (Azure Data Lake, AWS S3, or GCS)

Experience in SQL and Python

Experience in Snowflake and DBT or Matillion

Description :

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt or Matillion, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.
Key Responsibilities :

1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources including relational databases, APIs, cloud storage, and flat files into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) to enable reliable and reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.

Core Competencies :
Data Engineering and ELT Development :
  • Building robust and modular data pipelines using dbt.
  • Writing efficient SQL for data transformation and performance tuning in Snowflake.
  • Managing environments, sources, and deployment pipelines in dbt.
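The layered (staging → mart) pipeline style described above can be sketched in plain Python; table and column names here are hypothetical, and in dbt each step would be its own SQL model materialized in Snowflake:

```python
# Minimal sketch of a layered staging -> mart transformation.
# RAW_ORDERS stands in for a raw source table (names are illustrative).

RAW_ORDERS = [
    {"order_id": "1", "amount": "100.50", "status": "SHIPPED"},
    {"order_id": "2", "amount": "75.00", "status": "cancelled"},
    {"order_id": "3", "amount": "20.25", "status": "shipped"},
]

def stg_orders(raw):
    """Staging layer: cast and normalize, one row per source row."""
    return [
        {
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "status": r["status"].lower(),
        }
        for r in raw
    ]

def mart_revenue_by_status(stg):
    """Mart layer: business-facing aggregate built only from staging."""
    totals = {}
    for row in stg:
        totals[row["status"]] = totals.get(row["status"], 0.0) + row["amount"]
    return totals

revenue = mart_revenue_by_status(stg_orders(RAW_ORDERS))
```

Keeping the mart dependent only on the staging layer (never on raw sources) is what makes the assets reusable across downstream models.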

Cloud Data Platform Expertise :
  • Strong proficiency with Snowflake : warehouse sizing, query profiling, data loading, and performance optimization.
  • Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.

Technical Toolset :
Languages & Frameworks :
  • Python : For data transformation, notebook development, automation.
  • SQL : Strong grasp of SQL for querying and performance tuning.

Best Practices and Standards :
  • Knowledge of modern data architecture concepts including layered architecture (e.g., staging, intermediate, and mart layers, or medallion architecture).
  • Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).

Security & Governance :
Access and Permissions :
  • Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling.
  • Familiar with data privacy policies (GDPR basics), encryption at rest/in transit.
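As a sketch of the Snowflake RBAC model mentioned above, the snippet below generates GRANT statements for a read-only role; the database, schema, and role names are hypothetical, while the GRANT grammar follows Snowflake's SQL:

```python
# Illustrative generator for Snowflake RBAC grants: a reader role needs
# USAGE on the database and schema before SELECT on tables is effective.

def reader_grants(db, schema, role):
    return [
        f"GRANT USAGE ON DATABASE {db} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {db}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {db}.{schema} TO ROLE {role};",
    ]

stmts = reader_grants("ANALYTICS", "MARTS", "REPORTING_READER")
```

In a role hierarchy, such a reader role would then be granted to broader roles (e.g., a transformer role) rather than to users directly.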

Deployment & Monitoring :
DevOps and Automation :
  • Version control using Git, experience with CI/CD practices in a data context.
  • Monitoring and logging of pipeline executions, alerting on failures.
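The monitoring-and-alerting pattern above can be sketched as a small wrapper that logs each attempt, retries a bounded number of times, and raises an alert on final failure; `send_alert` here is a stand-in for a real channel (email, Slack, PagerDuty):

```python
# Sketch of run-with-alerting for a pipeline step: bounded retries,
# per-attempt logging, and an alert surfaced on final failure.

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_alerting(step, name, retries=2, send_alert=print):
    for attempt in range(1, retries + 2):
        try:
            result = step()
            log.info("step %s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            log.warning("step %s failed on attempt %d: %s", name, attempt, exc)
    send_alert(f"ALERT: step {name} failed after {retries + 1} attempts")
    return None

# A step that always fails exhausts its retries and triggers one alert.
alerts = []
run_with_alerting(lambda: 1 / 0, "load_orders", retries=1,
                  send_alert=alerts.append)
```

Orchestrators such as Airflow provide retries and alerting callbacks natively; this sketch only shows the shape of the behavior.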

Soft Skills :
Communication & Collaboration :
  • Ability to present solutions and handle client demos/discussions.
  • Work closely with onshore and offshore teams of analysts, data scientists, and architects.
  • Ability to document pipelines and transformations clearly.
  • Basic Agile/Scrum familiarity working in sprints and logging tasks.
  • Comfort with ambiguity, competing priorities and fast-changing client environment.
