7 - 12 years
14 - 27 Lacs
Posted: 6 days ago
On-site
Full Time
Job Overview:
If you are passionate about data architecture, ELT best practices, and the modern cloud data stack, we'd like to meet you.
Key Responsibilities:
Pipeline Design & Orchestration: Build and maintain robust, scalable data pipelines using Apache Airflow, including incremental and full-load strategies, retries, and logging (see the Airflow sketch after this list).
Data Modelling & Transformation: Develop modular, tested, and documented transformations in dbt, ensuring scalability and maintainability.
Snowflake Development: Design and maintain the Snowflake warehouse, optimize schemas, implement performance tuning (clustering keys, warehouse scaling, materialized views), manage access control, and use streams & tasks for automation (a streams-and-tasks sketch follows this list).
Data Quality & Monitoring: Implement validation frameworks (null checks, type checks, threshold alerts) and automated testing for data integrity and reliability (see the validation sketch below).
Collaboration: Partner with analysts, data scientists, and business stakeholders to translate requirements into scalable technical solutions.
Performance Optimization: Tune query performance and job execution efficiency, with continuous monitoring of pipeline runs.
Infrastructure Automation: Use Terraform or similar IaC tools to provision and manage Snowflake, Airflow, and related environments.
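For a concrete picture of the orchestration work described above, here is a minimal sketch of an incremental daily load in Apache Airflow with retries and logging, handing off to dbt for transformation. The DAG, table, and dbt selector names are illustrative assumptions, not an existing pipeline.

```python
# A minimal sketch of an incremental Airflow DAG: retries, logging, and a
# dbt run/test step. All names here (DAG id, table, dbt selector) are
# hypothetical placeholders for illustration.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)

default_args = {
    "retries": 3,                         # retry transient failures
    "retry_delay": timedelta(minutes=5),
}

def extract_increment(data_interval_start=None, data_interval_end=None, **_):
    """Pull only rows that changed inside the scheduled interval."""
    log.info("Loading rows between %s and %s",
             data_interval_start, data_interval_end)
    # ... fetch from the source system and stage into the warehouse here ...

with DAG(
    dag_id="orders_incremental_load",     # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(
        task_id="extract_increment",
        python_callable=extract_increment,
    )
    # dbt handles the modelled transformations; the selector is illustrative.
    transform = BashOperator(
        task_id="dbt_run_and_test",
        bash_command="dbt run --select staging+ && dbt test --select staging+",
    )
    extract >> transform
```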
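The streams & tasks automation mentioned under Snowflake Development could look like the following sketch, which issues the DDL through the snowflake-connector-python driver. The account, warehouse, and object names are assumptions; in practice these statements might equally live in migration scripts or Terraform.

```python
# A sketch of Snowflake streams & tasks for change-based automation.
# Credentials and object names are hypothetical; use a secrets manager
# rather than hard-coded passwords in real pipelines.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# Capture inserts/updates/deletes on a source table.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# A scheduled task that merges captured changes, but only runs
# when the stream actually has new data.
cur.execute("""
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO curated_orders (order_id, amount, loaded_at)
      SELECT order_id, amount, loaded_at
      FROM orders_stream
      WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK merge_orders RESUME")  # tasks are created suspended
conn.close()
```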
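Finally, a minimal sketch of the validation framework described above: SQL-based null, sign, and freshness checks with simple thresholds. The tables, columns, and thresholds are assumptions chosen for illustration; a production version would route failures to an alerting channel.

```python
# Threshold-based data quality checks, run before downstream models
# consume a table. Table/column names and limits are illustrative.
CHECKS = [
    # (description, SQL returning a single count, max allowed count)
    ("null order ids",
     "SELECT COUNT(*) FROM raw_orders WHERE order_id IS NULL", 0),
    ("negative amounts",
     "SELECT COUNT(*) FROM raw_orders WHERE amount < 0", 0),
    ("stale rows older than 2 days",
     "SELECT COUNT(*) FROM raw_orders "
     "WHERE loaded_at < DATEADD(day, -2, CURRENT_TIMESTAMP())", 100),
]

def run_checks(conn) -> None:
    """Raise if any check exceeds its threshold; conn is a warehouse
    connection such as the snowflake.connector one sketched above."""
    cur = conn.cursor()
    failures = []
    for name, sql, threshold in CHECKS:
        count = cur.execute(sql).fetchone()[0]
        if count > threshold:
            failures.append(f"{name}: {count} rows (allowed {threshold})")
    if failures:
        # In production this might page on-call or post to Slack instead.
        raise ValueError("data quality checks failed: " + "; ".join(failures))
```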
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
7–10 years of experience in data engineering, with strong hands-on expertise in:
Proficiency in SQL and Python (Spark experience is a plus).
Experience building and managing pipelines on AWS, GCP, or Azure.
Strong understanding of data warehousing concepts and ELT best practices.
Familiarity with version control (Git) and CI/CD workflows.
Exposure to infrastructure-as-code tools like Terraform for provisioning Snowflake or Airflow environments.
Excellent problem-solving, collaboration, and communication skills, with the ability to lead technical projects.
Good to have:
Experience with streaming data pipelines (Kafka, Kinesis, Pub/Sub).
Exposure to BI/analytics tools (Looker, Tableau, Power BI).
Knowledge of data governance and security best practices.
Mayur Chhatbar (Proprietor of KD Servicess)
Location: Ahmedabad, Gujarat, India