About Us:
We are a Data and Analytics Consulting organization that leverages the modern practices of descriptive, predictive, and prescriptive analytics to extract value from business data. Our mission is to help forward-thinking organizations translate their data into actionable insights through analytics to achieve better business outcomes. Whether you need to visualize your financial data, analyze your customers’ journey or behavior, evaluate your marketing campaign performance, quantify risk in your business processes, or enhance audit quality through deeper insights, our data analytics expertise can help.
Role: Data Engineer (Snowflake & DBT)
- Experience: 3 - 5 Years
- Budget: 12-15 LPA
- Location: Hyderabad (Work from office)
- Skills: Snowflake, DBT or Matillion (Matillion-DPC is highly preferred), SSIS, SQL, Python, cloud storage (Azure Data Lake, AWS S3, or GCS), Matillion architecture
- Notice Period: We are looking for immediate joiners only (Max. 15 days)
- Working mode and days: This is a 5-day work-from-office role in Hyderabad
- Interview rounds: There are 2 rounds of interviews in the process
Education
- Bachelor’s or master’s degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro or DBT Certified Developer are a plus.
Please note the mandatory or most preferred skill set for this role:
- Must have experience in Snowflake, DBT or Matillion (Matillion-DPC is highly preferred), SSIS, SQL, and Python.
Important Job Insights
- These are urgent positions; we need immediate joiners only (max. 15 days). Candidates with a 30-day notice period will be screen-rejected.
- Excellent communication skills are required for all roles.
- Please mention only your official notice period.
- We can add a joining bonus of 1 lakh for an exceptionally strong candidate.
- No flexibility in the budget
- Relocation is not an option for this role.
We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team.
The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and DBT or Matillion, and will be able to work effectively in a consulting setup.
In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client’s organization.
Key Responsibilities
- Design and implement scalable ELT pipelines using DBT on Snowflake, following industry-accepted best practices.
- Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.
- Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) and enable reliable and reusable data assets (see the model sketch after this list).
- Leverage orchestration tools (e.g., Airflow, DBT Cloud, or Azure Data Factory) to schedule and monitor data workflows.
- Apply DBT best practices: modular SQL development, testing, documentation, and version control.
- Optimize DBT/Snowflake performance through clustering, query profiling, materialization, partitioning, and efficient SQL design.
- Apply CI/CD and Git-based workflows for version-controlled deployments.
- Contribute to a growing internal knowledge base of DBT macros, conventions, and testing frameworks.
- Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
- Write well-documented, maintainable code using Git for version control and CI/CD processes.
- Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
- Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
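As an illustration of the layered DBT-on-Snowflake pattern these responsibilities describe, below is a minimal sketch of a staging model feeding an incremental mart model. All names here (the raw source, stg_orders, fct_orders, the columns) are hypothetical, and the materialization and clustering settings are indicative only, not a prescribed implementation.

-- models/staging/stg_orders.sql (hypothetical staging model)
with source as (
    select * from {{ source('raw', 'orders') }}  -- assumed raw-layer source
),
renamed as (
    select
        order_id,
        customer_id,
        cast(order_ts as timestamp) as ordered_at,
        order_total
    from source
)
select * from renamed

-- models/marts/fct_orders.sql (hypothetical incremental mart model)
{{ config(
    materialized='incremental',
    unique_key='order_id',
    cluster_by=['ordered_at']  -- Snowflake clustering key
) }}
select
    order_id,
    customer_id,
    ordered_at,
    order_total
from {{ ref('stg_orders') }}
{% if is_incremental() %}
-- on incremental runs, only pick up rows newer than what is already loaded
where ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}

Choosing an incremental materialization with a cluster_by hint is one example of the performance levers mentioned above; the right materialization strategy and clustering keys depend on query patterns and data volume.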
Required Qualifications
- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and DBT or Matillion (Matillion-DPC is highly preferred)
- Experience building and deploying DBT models in a production environment.
- Expert-level SQL and a strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
- Familiarity with data quality and validation techniques such as DBT tests and DBT docs (see the test sketch after this list).
- Experience with Git, CI/CD, and deployment workflows in a team setting
- Familiarity with orchestrating workflows using tools like DBT Cloud, Airflow, or Azure Data Factory
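To make the data-quality expectation concrete, here is a minimal sketch of a DBT singular test; fct_orders refers to the hypothetical model sketched earlier, and the rule itself (no future-dated orders) is an assumed example. DBT treats any rows returned by the query as a test failure.

-- tests/assert_no_future_orders.sql (hypothetical singular test)
select
    order_id,
    ordered_at
from {{ ref('fct_orders') }}
where ordered_at > current_timestamp()  -- flag future-dated rows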
Skills: dbt, data engineering, snowflake