Data Engineer (Snowflake, dbt, SQL, Azure Data Lake)

3 - 10 years

13 - 22 Lacs P.A.

Hyderabad, Telangana, India

Posted: 2 weeks ago | Platform: LinkedIn


Skills Required

data, sql, azure, consulting, engineering, ai, automation, architecture, migrate, sap, design, tuning, clustering, query optimization, orchestration, airflow, git, agile, python, scripting, integration

Work Mode

On-site

Job Type

Full Time

Job Description

Note: If shortlisted, we'll contact you via WhatsApp and email. Please monitor both and respond promptly. This role is based in Hyderabad; candidates willing to relocate are welcome to apply.

Location: Hyderabad
Work Mode: Work From Office
Salary: ₹13,00,000 – ₹22,00,000 INR per annum
Joining Time / Notice Period: Immediate – 30 days

About The Client

A top-tier tech consulting firm specializing in data engineering, AI, and automation. With deep expertise in digital transformation and cloud solutions, the company helps businesses make smarter, data-driven decisions and optimize operations.

Job Purpose

We are seeking an experienced, detail-oriented Data Engineer to join a growing data engineering team. This role involves building and optimizing scalable ELT pipelines using Snowflake and dbt, working on cloud data architecture, and collaborating with analysts, architects, and other engineers to deliver validated, business-ready datasets.

Key Responsibilities

- Build and maintain ELT pipelines using dbt on Snowflake
- Migrate and optimize SAP Data Services (SAP DS) jobs to cloud-native platforms
- Design and manage layered data architectures (staging, intermediate, mart)
- Apply performance-tuning techniques such as clustering, partitioning, and query optimization
- Use orchestration tools such as dbt Cloud, Airflow, or Control-M
- Develop modular SQL, write tests, and follow Git-based CI/CD workflows
- Collaborate with data analysts and scientists to gather requirements and document solutions
- Contribute to knowledge sharing through reusable dbt components and Agile ceremonies

Must-Have Skills

- 3–10 years of data engineering experience
- Strong hands-on experience with Snowflake, dbt, SQL, and Azure Data Lake
- Basic proficiency in Python for scripting and automation
- Experience with SAP DS for legacy system integration
- Understanding of data modeling (preferably dimensional/Kimball)
- Familiarity with RBAC, GDPR, and data privacy best practices
- Exposure to Git-based version control and CI/CD
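For candidates unfamiliar with dbt, the layered architecture mentioned above (staging → intermediate → mart) is typically expressed as chained dbt models on Snowflake. A minimal sketch with hypothetical source and model names (`raw.orders`, `stg_orders`, `fct_daily_orders`), shown as two files; dbt requires one model per file:

```sql
-- models/staging/stg_orders.sql
-- Staging layer: rename and type-cast raw columns; no business logic.
-- The source "raw.orders" is a hypothetical example for illustration.
select
    order_id::number        as order_id,
    customer_id::number     as customer_id,
    order_ts::timestamp_ntz as ordered_at,
    amount::number(12, 2)   as order_amount
from {{ source('raw', 'orders') }}

-- models/marts/fct_daily_orders.sql
-- Mart layer: business-ready daily aggregate built on the staging model
-- via dbt's ref(), which also wires up the dependency graph.
select
    ordered_at::date  as order_date,
    count(*)          as order_count,
    sum(order_amount) as total_amount
from {{ ref('stg_orders') }}
group by 1
```

In practice, each model would also carry schema tests (e.g. `not_null`, `unique` on `order_id`) declared in an accompanying YAML file, which is the kind of "modular SQL plus tests" workflow the responsibilities describe.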