Posted: 3 weeks ago | Platform: Hybrid | Full Time
Design, develop, and maintain robust ELT/ETL data pipelines to load structured and semi-structured data into Snowflake.
Implement data ingestion workflows using tools like Azure Data Factory, Informatica, DBT, or custom Python/SQL scripts.
Write and optimize complex SQL queries, stored procedures, views, and UDFs within Snowflake.
Use Snowpipe for continuous data ingestion and manage tasks, streams, and file formats for near-real-time processing (an illustrative ingestion sketch follows this list).
Optimize query performance using techniques like clustering keys, result caching, materialized views, and pruning strategies (a tuning and modeling sketch also follows this list).
Monitor and tune warehouse sizing and usage to balance cost and performance.
Design and implement data models (star, snowflake, normalized, or denormalized) suitable for analytical workloads.
Create logical and physical data models for reporting and analytics use cases.
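To make the ingestion responsibilities concrete, below is a minimal Snowflake SQL sketch of Snowpipe loading JSON into a raw table, with a stream and a scheduled task promoting new rows into a typed table. All object names (raw.events_raw, raw.events_stage, analytics.events, transform_wh), the Azure path, and the JSON attributes are illustrative assumptions, not details taken from this posting; cloud event notifications and the storage integration for the stage are omitted.

```sql
-- Assumed objects: schemas RAW and ANALYTICS, warehouse TRANSFORM_WH, an Azure container.
-- None of these names come from the posting; credentials/storage integration omitted.

-- Landing table for semi-structured JSON events
CREATE TABLE IF NOT EXISTS raw.events_raw (
    payload VARIANT
);

-- External stage over the Azure container
CREATE STAGE IF NOT EXISTS raw.events_stage
    URL = 'azure://<storage-account>.blob.core.windows.net/events/';

-- Snowpipe: continuously copies newly arriving files into the landing table
CREATE PIPE IF NOT EXISTS raw.events_pipe
    AUTO_INGEST = TRUE
AS
    COPY INTO raw.events_raw
    FROM @raw.events_stage
    FILE_FORMAT = (TYPE = 'JSON');

-- Stream tracks rows the pipe has added since the last consumption
CREATE STREAM IF NOT EXISTS raw.events_stream ON TABLE raw.events_raw;

-- Typed target table for analytics
CREATE TABLE IF NOT EXISTS analytics.events (
    event_id   STRING,
    event_type STRING,
    event_ts   TIMESTAMP_NTZ
);

-- Task runs every 5 minutes, but only when the stream actually has new data
CREATE TASK IF NOT EXISTS raw.load_events_task
    WAREHOUSE = transform_wh
    SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW.EVENTS_STREAM')
AS
    INSERT INTO analytics.events (event_id, event_type, event_ts)
    SELECT payload:event_id::STRING,
           payload:event_type::STRING,
           payload:event_ts::TIMESTAMP_NTZ
    FROM raw.events_stream;

ALTER TASK raw.load_events_task RESUME;
```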
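The tuning and modeling duties can be sketched the same way: a small star schema with a clustered fact table, a materialized view for a frequent aggregate, and warehouse settings that balance cost against performance. Table, view, and warehouse names are assumed for illustration only, and materialized views require Snowflake Enterprise Edition or higher.

```sql
-- Assumed names throughout; a star-schema and tuning sketch, not an actual model.

-- Conformed customer dimension
CREATE TABLE IF NOT EXISTS analytics.dim_customer (
    customer_key  NUMBER AUTOINCREMENT,
    customer_id   STRING,
    customer_name STRING,
    region        STRING
);

-- Fact table clustered on order_date so date-filtered queries prune micro-partitions
CREATE TABLE IF NOT EXISTS analytics.fct_orders (
    order_id     STRING,
    customer_key NUMBER,
    order_date   DATE,
    amount       NUMBER(12,2)
)
CLUSTER BY (order_date);

-- Materialized view pre-aggregates a frequent reporting query (Enterprise Edition feature)
CREATE MATERIALIZED VIEW IF NOT EXISTS analytics.mv_daily_sales AS
    SELECT order_date,
           SUM(amount) AS total_amount,
           COUNT(*)    AS order_count
    FROM analytics.fct_orders
    GROUP BY order_date;

-- Keep the reporting warehouse small and quick to suspend to control credit spend
ALTER WAREHOUSE reporting_wh SET
    WAREHOUSE_SIZE = 'SMALL'
    AUTO_SUSPEND   = 60
    AUTO_RESUME    = TRUE;
```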
Brillio