Posted: 2 months ago
Work from Office
Full Time
Requirements:
- At least 5 years of experience designing and developing data pipeline assets.
- At least 5 years of experience with a columnar MPP cloud data warehouse (Snowflake, Azure Synapse, or Redshift).
- 4 years of experience with ETL tools such as Azure Data Factory, Fivetran, or DBT.
- Experience with Git and Azure DevOps.
- Experience with Agile, Jira, and Confluence.
- Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
- Working knowledge of Azure architecture and Data Lake.
- Willingness to contribute to documentation (e.g., mapping, defect logs).
- Able to generate functional specs for code migration, or to ask the right questions to obtain them.
- Hands-on programmer with a thorough understanding of performance-tuning techniques.
- Experience handling large data volume transformations (on the order of 100 GB monthly).
- Able to create solutions / data flows to suit requirements.
- Produces timely documentation, e.g., mapping, UTR, defect / KEDB logs.
- Self-starter and quick learner; able to understand and probe for requirements.

Tech experience expected:
- Primary: Snowflake, DBT (development and testing)
- Secondary: Python, ETL or any other data processing tool
- Nice to have: domain experience in Healthcare

Soft skills: good oral and written communication, a good team player, proactive and adaptive.