Job Description
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
A minimum of 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
SUMMARY:
The ideal candidate will have strong hands-on experience with Snowflake for data warehousing, table management, SQL development, and validating DBT transformations. The candidate should support scalable data pipelines, data modeling, and data governance practices. Experience with Python, Airflow, and AWS services is a plus for enhancing data processing, ETL workflows, and automation. You will work within a cross-functional team, contributing to reliable data delivery, efficient data workflows, and Agile development processes.

ROLES AND RESPONSIBILITIES:
- Manage and maintain Snowflake tables, including creating, updating, and optimizing them.
- Test and validate DBT queries in Snowflake before migrating them to DBT pipelines (see the sketch at the end of this description).
- Write advanced SQL queries and apply strong data warehousing architecture techniques.
- Work with Python and PySpark (nice to have) for large dataset processing, data transformation, and exporting data in multiple formats.
- Use Python/PySpark (nice to have) to support Snowflake workflows and feed data into analytical tools such as Tableau.
- Develop and manage ETL and data pipelines using Airflow (nice to have; see the sketch at the end of this description).
- Design and set up pipelines using strong data modeling, data governance, and cataloging practices.
- Contribute to CI/CD pipelines for deploying DBT changes.
- Work with AWS services such as IAM, S3, SNS, SQS, API Gateway, Lambda, DynamoDB, and EKS (nice to have; see the sketch at the end of this description).
- Apply knowledge of the Private/Capital Markets industry domain (nice to have).
- Participate in Agile ceremonies and sprint planning, and collaborate with cross-functional teams on delivery.

PROFESSIONAL AND TECHNICAL SKILLS:
- Degree in Engineering or a related field.
- 5-8 years of experience in data engineering or software development.
- Strong hands-on expertise with Snowflake, SQL, and data warehousing architecture.
- Experience testing and validating DBT transformations in Snowflake.
- Strong SQL proficiency for data transformation and analytics.
- (Nice to have) Python programming skills for data processing and manipulation.
- (Nice to have) Experience with Airflow for ETL orchestration and pipeline management.
- (Nice to have) Familiarity with AWS services including IAM, S3, SNS, SQS, API Gateway, Lambda, DynamoDB, and EKS.
- Understanding of CI/CD processes for data pipelines and DBT deployments.
- Knowledge of data modeling, data governance, and cataloging.
- Experience working in Agile environments.

SOFT SKILLS:
- Strong problem-solving and analytical skills.
- Excellent verbal and written communication skills.
- Ability to work in Agile, cross-functional teams.
- Ownership mindset; proactive and self-driven.

ADDITIONAL INFORMATION:
- The candidate should have experience relevant to Snowflake data engineering.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
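
The following sketch illustrates the DBT validation responsibility above: running a candidate query in Snowflake and sanity-checking the result before promoting it into a DBT model. It is a minimal sketch assuming the snowflake-connector-python package; the connection values, table, and query are hypothetical placeholders, not details of this role.

    # Minimal sketch: run a candidate transformation query in Snowflake and
    # sanity-check the result before promoting it into a DBT model.
    # All object names and connection values are hypothetical placeholders.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="ANALYTICS_DB",   # hypothetical database
        schema="STAGING",          # hypothetical schema
    )

    # Hypothetical duplicate-key check on a staging table.
    candidate_sql = """
        SELECT trade_id, COUNT(*) AS n
        FROM raw_trades
        GROUP BY trade_id
        HAVING COUNT(*) > 1
    """

    try:
        cur = conn.cursor()
        cur.execute(candidate_sql)
        duplicates = cur.fetchall()
        # An empty result means the uniqueness assumption holds and the
        # query can be promoted into a DBT model (or a DBT unique test).
        print(f"duplicate keys found: {len(duplicates)}")
    finally:
        conn.close()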
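
For the Airflow responsibility, here is a minimal Airflow 2.x DAG sketch with one extract task feeding one load task; the DAG id, schedule, and task callables are hypothetical.

    # Minimal sketch of an Airflow 2.x DAG: a daily extract step followed
    # by a load step. All names here are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull data from the source system")  # placeholder logic

    def load():
        print("load the extract into Snowflake")  # placeholder logic

    with DAG(
        dag_id="daily_snowflake_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # run extract before load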
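
For the AWS responsibility, a small boto3 sketch showing the S3 side of an export workflow (uploading an exported file to a bucket); the bucket, key, and file names are hypothetical.

    # Minimal sketch: upload an exported data file to S3 with boto3.
    # Bucket, key, and file names are hypothetical placeholders; credentials
    # are resolved by boto3's normal credential chain (e.g., an IAM role).
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="trades_extract.parquet",  # hypothetical local export
        Bucket="example-data-lake-bucket",  # hypothetical bucket
        Key="exports/trades/trades_extract.parquet",
    )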