Onix Datametica is Looking for a Snowflake GCP Lead

Experience: 5–10 years

Salary: 5–15 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Snowflake Lead / Lead Snowflake Data Engineer

Summary

Snowflake Lead

Key Responsibilities

Technical Leadership

  • Lead the end-to-end architecture, design, and implementation of Snowflake-based data solutions.
  • Define best practices for Snowflake development, ELT patterns, security, CI/CD, and performance optimization.
  • Provide technical guidance to data engineers and BI developers.

Data Engineering & Development

  • Design and implement scalable data models (3NF, dimensional modeling, Data Vault).
  • Build and optimize Snowflake objects: warehouses, schemas, tables, streams, tasks, snowpipes, UDFs.
  • Develop and maintain ELT/ETL pipelines using tools such as DBT, Airflow, Matillion, Informatica, ADF, or equivalent.
  • Implement data ingestion frameworks using cloud storage (S3/GCS/ADLS) and integration technologies (Kafka, APIs).
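The stream/task-based incremental pattern behind these pipeline responsibilities can be sketched in plain Python (no Snowflake connection; the table data, key names, and `_action` convention are hypothetical illustrations of a MERGE-style upsert):

```python
# Minimal sketch of a MERGE-style upsert, the kind of logic a Snowflake
# stream/task pipeline applies when loading change records into a target
# table. All data and key names here are hypothetical.

def merge_changes(target: dict, changes: list, key: str = "id") -> dict:
    """Apply insert/update/delete change records to a target keyed by `key`."""
    for change in changes:
        action = change.get("_action", "upsert")
        row_key = change[key]
        if action == "delete":
            # MERGE ... WHEN MATCHED THEN DELETE
            target.pop(row_key, None)
        else:
            # WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT
            row = {k: v for k, v in change.items() if k != "_action"}
            target[row_key] = row
    return target

# Hypothetical usage: an existing target table plus one captured change batch
orders = {1: {"id": 1, "amount": 100}, 2: {"id": 2, "amount": 50}}
batch = [
    {"id": 2, "amount": 75},         # update existing row
    {"id": 3, "amount": 20},         # insert new row
    {"id": 1, "_action": "delete"},  # delete existing row
]
orders = merge_changes(orders, batch)
print(sorted(orders))  # [2, 3]
```

In Snowflake itself the same step is usually one `MERGE` statement driven by a stream on the staging table; the Python version only makes the row-level semantics explicit.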

Performance & Optimization

  • Optimize warehouse sizing, query performance, micro-partitioning, clustering, caching, and cost management.
  • Monitor and fine-tune workloads to ensure high reliability and efficiency.
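On the cost-management side of warehouse sizing, credit consumption roughly doubles with each warehouse size. The credits-per-hour figures below follow Snowflake's published doubling scheme, but the dollar rate is a hypothetical example value, not a quoted price:

```python
# Rough warehouse cost estimator. Credits/hour follow Snowflake's
# published doubling scheme (XS=1, S=2, M=4, L=8, XL=16); the price
# per credit below is a hypothetical example rate.
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def monthly_cost(size: str, hours_per_day: float, price_per_credit: float = 3.0) -> float:
    """Estimate monthly cost for a warehouse running `hours_per_day` each day."""
    return CREDITS_PER_HOUR[size] * hours_per_day * 30 * price_per_credit

# A Medium warehouse running 8 hours/day at a $3/credit example rate:
print(round(monthly_cost("M", 8)))  # 2880
```

Even this crude model shows why right-sizing matters: stepping from M to L doubles the bill for the same wall-clock hours unless queries finish proportionally faster.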

Security & Governance

  • Implement Snowflake security best practices including RBAC, data masking, encryption, and access policies.
  • Collaborate with data governance teams on metadata, cataloging, lineage, and quality frameworks.
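The RBAC-driven masking mentioned above can be sketched in plain Python. A Snowflake masking policy returns either the raw or a redacted value depending on the querying role; the role names and the partial-masking rule here are hypothetical:

```python
# Minimal sketch of a role-driven masking policy, mirroring how a
# Snowflake masking policy returns raw or redacted values by role.
# Role names and the partial-masking rule are hypothetical.

PRIVILEGED_ROLES = {"SECURITY_ADMIN", "DATA_STEWARD"}

def mask_email(value: str, current_role: str) -> str:
    """Return the raw email for privileged roles, a partly masked one otherwise."""
    if current_role in PRIVILEGED_ROLES:
        return value
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain

print(mask_email("alice@example.com", "ANALYST"))       # a***@example.com
print(mask_email("alice@example.com", "DATA_STEWARD"))  # alice@example.com
```

In Snowflake this logic would live in a `CREATE MASKING POLICY` body keyed on `CURRENT_ROLE()` and be attached to the column, so callers never see the branch at all.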

Collaboration & Delivery

  • Work closely with data science, analytics, and application teams to deliver high-quality datasets.
  • Participate in sprint planning, solution reviews, and documentation.
  • Communicate complex technical concepts to non-technical stakeholders.

Required Skills & Qualifications

  • 7–12+ years of experience in data engineering, with at least 3–5 years hands-on Snowflake experience.
  • Strong SQL engineering background and expert-level Snowflake SQL understanding.
  • Experienced in designing and implementing large-scale cloud data architectures.
  • Strong proficiency with at least one cloud platform: AWS, Azure, or GCP.
  • Hands-on experience with data orchestration and transformation tools (DBT, Airflow, ADF, Matillion, etc.).
  • Solid understanding of data modeling, warehousing concepts, and modern data architecture.
  • Experience with version control, CI/CD pipelines, and DevOps practices.
  • Knowledge of Python/Scala/Java for data pipeline development is a plus.
  • Experience with data security, governance, and compliance frameworks.

Datametica

IT Services and IT Consulting

New York, NY
