Posted: 1 day ago
On-site | Full-time
Job Description
We are looking for an experienced, results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines on Snowflake using dbt or Matillion, and will work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client’s organization.
Key Responsibilities
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources including relational databases, APIs, cloud storage and flat files into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) to enable reliable and reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Optimize performance in dbt/Snowflake through clustering, query profiling, materialization strategies, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
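As a minimal sketch of the layered modelling pattern in point 3 (raw → staging → mart) and the dbt-style data testing in point 5, the snippet below uses SQLite as a stand-in for Snowflake; all table and column names are illustrative, not taken from the posting.

```python
import sqlite3

# SQLite stands in for Snowflake here; the layering idea is the same.
con = sqlite3.connect(":memory:")
con.executescript("""
    -- Raw layer: data as it might land from a source system
    CREATE TABLE raw_orders (id INTEGER, amount TEXT, order_date TEXT);
    INSERT INTO raw_orders VALUES (1, '100.50', '2024-01-05'),
                                  (2, '75.00',  '2024-01-06'),
                                  (2, '75.00',  '2024-01-06');  -- duplicate row

    -- Staging layer: type casting and deduplication
    CREATE VIEW stg_orders AS
    SELECT DISTINCT id, CAST(amount AS REAL) AS amount, order_date
    FROM raw_orders;

    -- Mart layer: business-level aggregate built only on staging models
    CREATE VIEW mart_daily_revenue AS
    SELECT order_date, SUM(amount) AS revenue
    FROM stg_orders
    GROUP BY order_date;
""")

# A simple data test in the spirit of dbt's built-in uniqueness test
dupes = con.execute(
    "SELECT id FROM stg_orders GROUP BY id HAVING COUNT(*) > 1"
).fetchall()
assert dupes == [], "stg_orders.id should be unique"

print(con.execute("SELECT * FROM mart_daily_revenue ORDER BY order_date").fetchall())
# → [('2024-01-05', 100.5), ('2024-01-06', 75.0)]
```

In dbt itself, each layer would be its own version-controlled SQL model and the uniqueness check a declarative test in a YAML schema file.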
Required Qualifications
Experience building and deploying dbt models in a production environment.
Expert-level SQL and a strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
Experience with Git, CI/CD, and deployment workflows in a team setting.
Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.
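The orchestration tools named above (dbt Cloud, Airflow, Azure Data Factory) all solve the same core problem: running each model only after its upstream dependencies. A minimal sketch of that scheduling logic, using Python's standard-library graphlib with hypothetical model names:

```python
from graphlib import TopologicalSorter

# Hypothetical model dependency graph, in the style of a dbt DAG:
# each model maps to the set of upstream models it selects from.
deps = {
    "stg_orders":         {"raw_orders"},
    "stg_customers":      {"raw_customers"},
    "mart_daily_revenue": {"stg_orders"},
    "mart_customer_360":  {"stg_orders", "stg_customers"},
}

# An orchestrator schedules these so that every model runs strictly
# after all of its upstream dependencies have completed.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and monitoring on top, but the dependency-resolution step is the same idea.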
Core Competencies:
- Data Engineering and ELT Development
- Cloud Data Platform Expertise

Technical Toolset:
- Languages & Frameworks
Job Type: Full-time
Pay: ₹1,200,000.00 - ₹1,500,000.00 per year
Work Location: In person
HIRINGHOUSE TECHNOLOGIES PVT LTD
Location: Hyderabad, Telangana, India