Snowflake Subject Matter Expert - Data Warehousing

8 - 10 years

6 - 10 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Role Summary:

As a Snowflake Subject Matter Expert (SME), you will be responsible for architecting, developing, and optimizing data warehouse and lakehouse solutions using the Snowflake platform.

Your expertise will enable the data team to build a modern, scalable, and high-performance data platform that supports business analytics, data science, and reporting needs.

You will work closely with data architects, engineers, and business stakeholders to ensure robust data ingestion, storage, and access frameworks that meet security, compliance, and performance standards.

Key Responsibilities:

- Design and implement Snowflake data warehouse and lakehouse architectures.

- Develop and optimize Snowflake SQL, stored procedures, and data transformation logic.

- Manage Snowflake account configurations, resource monitors, virtual warehouses, and access controls.

- Integrate Snowflake with Databricks, Azure Data Factory, and cloud storage (ADLS/S3/GCS).

- Implement best practices for data partitioning, clustering, and caching for performance optimization.

- Participate in data ingestion automation, metadata management, and pipeline monitoring.

- Collaborate with security and governance teams to enforce RBAC, encryption, and compliance policies.

- Contribute to CI/CD automation for Snowflake deployments and pipeline orchestration.

- Provide technical leadership, mentoring, and knowledge sharing across teams.
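To illustrate the kind of work the responsibilities above describe, here is a minimal Snowflake SQL sketch of an incremental transformation pipeline using Streams and Tasks. All object names (raw_orders, orders_curated, transform_wh, etc.) are hypothetical placeholders, not part of this role's actual environment:

```sql
-- Assumes a landing table RAW_ORDERS and a curated target ORDERS_CURATED (hypothetical names).
-- Capture row-level changes on the landing table as a change stream.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- A scheduled task that merges new rows into the curated table,
-- running only when the stream actually has pending changes.
CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO orders_curated t
  USING orders_stream s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

-- Tasks are created in a suspended state; resume to start the schedule.
ALTER TASK merge_orders RESUME;
```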

Required Skills & Qualifications:

- 8+ years of experience in Data Engineering, with 3+ years in Snowflake.

- Deep expertise in Snowflake features:

  • Performance tuning & query optimization
  • Time Travel, Zero-Copy Cloning, Streams, Tasks, and Materialized Views
  • Data sharing and data marketplace

- Strong SQL and ETL/ELT development skills.

- Experience with data modeling (Kimball/Inmon/Dimensional).

- Hands-on experience with:

  • Databricks or PySpark for transformation
  • Azure Data Factory / Airflow / dbt for orchestration
  • Cloud storage (Azure Data Lake, S3, or GCS)

- Knowledge of data governance, RBAC, encryption, and compliance frameworks (GDPR, HIPAA, etc.).

- Familiarity with CI/CD pipelines (Azure DevOps, GitHub Actions, Jenkins).

- Strong problem-solving and communication skills.
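The Snowflake features listed above (Time Travel, Zero-Copy Cloning, clustering) can be sketched in a few lines of SQL; the table and column names here are illustrative only:

```sql
-- Zero-Copy Clone: an instant, metadata-only copy, e.g. for a dev/test environment.
CREATE TABLE orders_dev CLONE orders;

-- Time Travel: query the table as it looked 30 minutes ago.
SELECT * FROM orders AT(OFFSET => -60 * 30);

-- Clustering key: helps Snowflake prune micro-partitions on common filter columns.
ALTER TABLE orders CLUSTER BY (order_date, region);
```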

Preferred / Good To Have:

- Experience with Snowpipe and auto-ingestion.

- Exposure to Delta Lake and Unity Catalog.
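As a reference point for the Snowpipe auto-ingestion item, a minimal sketch follows; the stage URL, storage integration, and table names are placeholder assumptions:

```sql
-- External stage over cloud storage (ADLS/S3/GCS); URL and integration are placeholders.
CREATE OR REPLACE STAGE landing_stage
  URL = 's3://example-bucket/landing/'
  STORAGE_INTEGRATION = s3_int;

-- Snowpipe with auto-ingest: cloud storage event notifications trigger the COPY automatically.
CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders
  FROM @landing_stage
  FILE_FORMAT = (TYPE = 'PARQUET')
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```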

Certifications:

- SnowPro Core / Advanced Architect / Data Engineer

- Azure / AWS Data Engineer certifications.

Skills: Azure, Python, SQL, Snowflake



Pune, Chennai, Bengaluru