Snowflake Architect

Experience: 12 - 17 years

Salary: 30 - 37 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Remote

Job Type: Full Time

Job Description

Role Overview:
We are seeking an experienced Snowflake Architect with hands-on expertise in Snowpipe and a strong overall background in Data Architecture. The ideal candidate will have 12-15 years of experience designing, implementing, and optimizing large-scale data solutions, including at least 5 years dedicated to the Snowflake cloud data platform.

Key Responsibilities

  • Architect, design, and implement Snowflake-based data solutions for enterprise clients across BFSI and other domains.
  • Lead end-to-end data migration and modernization projects, transitioning legacy environments to Snowflake.
  • Expertly configure and optimize Snowpipe for automated, scalable data ingestion from diverse sources (a minimal configuration sketch follows this list).
  • Define and implement data governance, security, and compliance protocols leveraging Snowflake features (RBAC, masking, etc.).
  • Collaborate with business stakeholders and technical teams to translate requirements into high-performance data models and workflows.
  • Develop reusable frameworks and accelerators for Snowpipe-based batch and real-time ingestion pipelines.
  • Monitor and troubleshoot Snowpipe jobs, optimizing costs, performance, and reliability.
  • Document best practices, technical architectures, and implementation approaches for Snowflake-centric ecosystems.
  • Mentor and guide junior staff and engineers in Snowflake, Snowpipe, and cloud-native ETL development.
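
A minimal sketch of the Snowpipe configuration and monitoring pattern referenced in the list above, issued through the snowflake-connector-python driver. All object names (INGEST_DB, RAW.EVENTS, EVENTS_STAGE, EVENTS_PIPE, the S3 path, and the S3_INT storage integration) and the connection parameters are hypothetical placeholders; the target table, the storage integration, and the bucket's event notifications are assumed to be provisioned separately.

    # Hedged sketch: create and monitor an auto-ingest Snowpipe feed from S3.
    # INGEST_DB, RAW.EVENTS, EVENTS_STAGE, EVENTS_PIPE, S3_INT and the bucket
    # path are hypothetical; credentials should come from secure configuration.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account",
        user="your_user",
        password="your_password",
        role="SYSADMIN",
        warehouse="INGEST_WH",
        database="INGEST_DB",
        schema="RAW",
    )
    cur = conn.cursor()

    # External stage over the cloud bucket (storage integration assumed to exist).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS EVENTS_STAGE
          URL = 's3://example-bucket/events/'
          STORAGE_INTEGRATION = S3_INT
          FILE_FORMAT = (TYPE = 'JSON')
    """)

    # Auto-ingest pipe: cloud event notifications trigger the COPY automatically.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS EVENTS_PIPE AUTO_INGEST = TRUE AS
          COPY INTO RAW.EVENTS FROM @EVENTS_STAGE
    """)

    # Basic monitoring for alerting: pipe status and the last day of load history.
    cur.execute("SELECT SYSTEM$PIPE_STATUS('EVENTS_PIPE')")
    print(cur.fetchone()[0])

    cur.execute("""
        SELECT FILE_NAME, STATUS, FIRST_ERROR_MESSAGE
        FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
            TABLE_NAME => 'EVENTS',
            START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())))
    """)
    for row in cur.fetchall():
        print(row)

    cur.close()
    conn.close()

With AUTO_INGEST = TRUE, Snowflake loads files when the cloud provider's event notifications fire (for example, S3 to SQS) rather than on a fixed schedule, which is what makes the ingestion automated and scalable.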

Required Skills & Experience

  • 12-15 years of overall Data Architecture experience, including high-volume data warehouse and lakehouse platforms.
  • Minimum 5 years of hands-on experience with Snowflake:
    • Advanced data modeling, schema design, and performance tuning
    • Snowflake account setup, security, and data sharing
    • Deep understanding of Snowflake architecture, scaling, and compute pattern optimization
  • Strong expertise in Snowpipe:
    • Building, scheduling, and monitoring ingestion pipelines
    • Integrating Snowpipe with cloud storage (AWS S3, Azure Blob Storage, Google Cloud Storage, etc.)
    • Automating ingestion and alerting for failures/success
  • Proficiency in ETL/ELT tools: Informatica, Talend, Matillion, dbt, and Azure/AWS-native services
  • Solid scripting/programming skills: SQL, plus Python or Scala (for UDFs and orchestration)
  • Cloud data platform knowledge (Azure, AWS, GCP) with a focus on security, roles, and data governance (a hedged governance sketch follows this list).
  • Excellent communication skills for client-facing discussions and solution presentations.
  • Experience with data migration, modernization, and CI/CD for data pipelines.
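
A similarly hedged sketch of the Snowflake RBAC and dynamic data masking features named above, again via the Python connector. The GOVERNANCE_ADMIN role, the ANALYST_FULL and ANALYST_RESTRICTED roles, the CORE_DB.CRM.CUSTOMERS table, and the EMAIL_MASK policy are hypothetical, and the executing role is assumed to hold the relevant CREATE and GRANT privileges.

    # Hedged sketch: role-based access control plus dynamic masking for PII.
    # All role, table, and policy names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account", user="your_user", password="your_password",
        role="GOVERNANCE_ADMIN", warehouse="ADMIN_WH",
        database="CORE_DB", schema="CRM",
    )
    cur = conn.cursor()

    # RBAC: give a restricted analyst role read-only access to the schema.
    cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_RESTRICTED")
    cur.execute("GRANT USAGE ON DATABASE CORE_DB TO ROLE ANALYST_RESTRICTED")
    cur.execute("GRANT USAGE ON SCHEMA CORE_DB.CRM TO ROLE ANALYST_RESTRICTED")
    cur.execute("GRANT SELECT ON TABLE CORE_DB.CRM.CUSTOMERS TO ROLE ANALYST_RESTRICTED")

    # Dynamic data masking: only privileged roles see raw e-mail addresses.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS EMAIL_MASK
          AS (val STRING) RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() IN ('ANALYST_FULL', 'SYSADMIN') THEN val
               ELSE '***MASKED***' END
    """)
    cur.execute("""
        ALTER TABLE CORE_DB.CRM.CUSTOMERS
          MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK
    """)

    cur.close()
    conn.close()

The masking policy is evaluated at query time against CURRENT_ROLE(), so a single table can serve both restricted and privileged consumers without duplicating data.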

Preferred/Bonus Skills

  • Snowflake certification (SnowPro Core/Advanced).
  • Experience with Kafka or streaming data integration.
  • Experience with BFSI or regulatory-compliance data projects.
  • Experience architecting solutions for high-availability and disaster recovery scenarios.

Education

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline.

Moder Solutions | Technology Consulting | Innovation City
