Experience: 8 - 10 years
Compensation: 6 - 10 Lacs
Posted: 1 week ago
Work from Office
Full Time
Role Summary:
As a Snowflake Subject Matter Expert (SME), you will be responsible for architecting, developing, and optimizing data warehouse and lakehouse solutions using the Snowflake platform.
Your expertise will enable the data team to build a modern, scalable, and high-performance data platform that supports business analytics, data science, and reporting needs.
You will work closely with data architects, engineers, and business stakeholders to ensure robust data ingestion, storage, and access frameworks that meet security, compliance, and performance standards.
Key Responsibilities:
- Design and implement Snowflake data warehouse and lakehouse architectures.
- Develop and optimize Snowflake SQL, stored procedures, and data transformation logic.
- Manage Snowflake account configurations, resource monitors, virtual warehouses, and access controls (see the sketch after this list).
- Integrate Snowflake with Databricks, Azure Data Factory, and cloud storage (ADLS/S3/GCS).
- Implement best practices for data partitioning, clustering, and caching for performance optimization.
- Participate in data ingestion automation, metadata management, and pipeline monitoring.
- Collaborate with security and governance teams to enforce RBAC, encryption, and compliance policies.
- Contribute to CI/CD automation for Snowflake deployments and pipeline orchestration.
- Provide technical leadership, mentoring, and knowledge sharing across teams.
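To make the warehouse-management and performance items above concrete, the snippet below is a minimal Snowflake SQL sketch of that kind of work. The monitor, warehouse, and table names (analytics_rm, analytics_wh, sales_fact) are hypothetical, and the quota, size, and clustering columns would depend on actual workloads.

    -- Hypothetical credit guardrail plus an auto-suspending virtual warehouse.
    CREATE OR REPLACE RESOURCE MONITOR analytics_rm
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    CREATE WAREHOUSE IF NOT EXISTS analytics_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      AUTO_SUSPEND = 300          -- suspend after 5 idle minutes to save credits
      AUTO_RESUME = TRUE
      RESOURCE_MONITOR = analytics_rm;

    -- Clustering a large fact table so queries filtering on date/region prune micro-partitions.
    ALTER TABLE sales_fact CLUSTER BY (order_date, region);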
Required Skills & Qualifications:
- 8+ years of experience in Data Engineering, with 3+ years in Snowflake.
- Deep expertise in Snowflake features.
- Strong SQL and ETL/ELT development skills.
- Experience with data modeling (Kimball/Inmon/Dimensional).
- Hands-on experience with:
  - Databricks or PySpark for transformation
  - Azure Data Factory / Airflow / dbt for orchestration
  - Cloud storage (Azure Data Lake, S3, or GCS)
- Knowledge of data governance, RBAC, encryption, and compliance frameworks (GDPR, HIPAA, etc.); an RBAC sketch follows this list.
- Familiarity with CI/CD pipelines (Azure DevOps, GitHub Actions, Jenkins).
- Strong problem-solving and communication skills.
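As one illustration of the RBAC point above, the grants below show a common role-per-schema pattern in Snowflake. It is a sketch only; the role, database, schema, and user names (analyst_role, analytics_db, reporting, report_user) are hypothetical.

    -- Hypothetical read-only analyst role scoped to one reporting schema.
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA analytics_db.reporting TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics_db.reporting TO ROLE analyst_role;
    GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics_db.reporting TO ROLE analyst_role;
    GRANT ROLE analyst_role TO USER report_user;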
Preferred / Good To Have:
- Experience with Snowpipe and auto-ingestion (see the sketch below).
- Exposure to Delta Lake and Unity Catalog.
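For the Snowpipe item above, a minimal auto-ingest setup looks roughly like the following. The stage, integration, table, and pipe names (raw_stage, s3_int, raw.events, events_pipe) are hypothetical; a storage integration and a cloud-side event notification to the pipe are assumed to be configured separately.

    -- Hypothetical external stage over cloud storage (assumes the storage integration exists).
    CREATE STAGE IF NOT EXISTS raw_stage
      URL = 's3://example-bucket/events/'
      STORAGE_INTEGRATION = s3_int;

    -- Pipe that auto-ingests new files from the stage into a raw landing table
    -- (e.g., a table with a single VARIANT column for the JSON payload).
    CREATE PIPE IF NOT EXISTS events_pipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO raw.events
      FROM @raw_stage
      FILE_FORMAT = (TYPE = 'JSON');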
Certifications:
- SnowPro Core / Advanced Architect / Data Engineer
- Azure / AWS Data Engineer certifications.
Skills: Azure, Python, SQL, Snowflake
Zorba Consulting