Hybrid
Full Time
Key Responsibilities:
- Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines.
- Design and implement pipelines adhering to 2NF/3NF normalization standards.
- Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
- Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs.
- Raise CAB requests via Carrier's change process and manage production deployments.
- Provide UAT support and ensure smooth transition of finalized pipelines to support teams.
- Create and maintain comprehensive technical documentation for traceability and handover.
- Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration.
- Optimize complex SQL queries, perform performance tuning, and ensure data ops best practices.

Requirements:
- Strong hands-on experience with Snowflake
- Expert-level SQL skills and deep understanding of data transformation
- Solid grasp of data architecture and 2NF/3NF normalization techniques
- Experience with cloud-based data platforms and modern data pipeline design
- Exposure to AWS data services such as S3, Glue, Lambda, and Step Functions (preferred)
- Proficiency with ETL tools and experience working in Agile environments
- Familiarity with the Carrier CAB process or similar structured deployment frameworks
- Proven ability to debug complex pipeline issues and enhance pipeline scalability
- Strong communication and collaboration skills
Salary: Not disclosed