Snowflake Data Engineer


Posted: 3 weeks ago | Platform: LinkedIn


Work Mode

Remote

Job Type

Full Time

Job Description

About BayRock Labs

At BayRock Labs, we pioneer innovative tech solutions that drive business transformation. As a leading product engineering firm based in Silicon Valley, we provide full-cycle product development, leveraging cutting-edge technologies in AI, ML, and data analytics. Our collaborative, inclusive culture fosters professional growth and work-life balance. Join us to work on ground-breaking projects and be part of a team that values excellence, integrity, and innovation. Together, let's redefine what's possible in technology.

We are seeking a highly skilled Snowflake Data Engineer to design, develop, and optimize our enterprise data foundation, specifically for our production-level AI applications built on Snowflake Cortex. This role is crucial for ensuring the AI agents receive clean, aggregated, and optimized data efficiently.

Key Responsibilities

  • Snowflake Architecture & Design: Design and implement scalable and high-performance data models (e.g., Data Vault, Dimensional Modeling) within Snowflake, specifically structuring data for AI/ML consumption.
  • Data Aggregation & Optimization: Lead the effort to reduce our existing columns down to the necessary, non-duplicated, and optimized feature set required by the AI Agents.
  • ETL/ELT Development: Develop robust and performant ELT pipelines using Snowpipe, Tasks, Streams, and Dynamic Tables to aggregate data from diverse sources into Snowflake.
  • Performance Tuning: Optimize Snowflake queries, clustering keys, and warehouse sizing to ensure low latency data retrieval for real-time agent workflows and baseline report generation.
  • Collaboration: Work closely with the AI/ML Agent Developers to expose data via optimized views, UDFs, and Stored Procedures that can be easily called by Snowpark or Cortex Analyst tools.
  • Data Governance: Ensure data quality, lineage, and adherence to security policies (e.g., Row Access Policies, Data Masking) within the Snowflake environment.
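To make the aggregation responsibility above concrete, here is a minimal pure-Python sketch of rolling raw event rows up into one feature row per entity, the kind of shape an AI agent would consume. All names and the schema (`user_id`, `amount`, `channel`) are invented for illustration; in this role the equivalent logic would live in Snowflake ELT objects such as Dynamic Tables or Snowpark transformations, not in application code.

```python
from collections import defaultdict

def aggregate_features(raw_events):
    """Roll raw event rows up into one aggregated feature row per user.

    raw_events: iterable of dicts like
        {"user_id": "u1", "amount": 10.0, "channel": "web"}
    Returns a dict keyed by user_id with count, sum, and distinct-channel
    features -- a stand-in for a clean, aggregated feature table.
    """
    features = defaultdict(
        lambda: {"event_count": 0, "total_amount": 0.0, "channels": set()}
    )
    for ev in raw_events:
        f = features[ev["user_id"]]
        f["event_count"] += 1
        f["total_amount"] += ev["amount"]
        f["channels"].add(ev["channel"])
    return dict(features)
```

The point of the sketch is the shape of the output, not the mechanics: many raw rows in, one compact, non-duplicated row per entity out.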

Required Skills & Qualifications

  • Expert-level proficiency in Snowflake architecture, optimization, and advanced features (e.g., Streams, Dynamic Tables, Time Travel).
  • Deep expertise in SQL and data modeling for high-volume, complex datasets.
  • Strong hands-on experience with Python and Snowpark for custom data transformation logic.
  • Proven ability to perform data cleansing, feature engineering, and dimensionality reduction (pruning the column set down to useful features).
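The "reducing columns" qualification above can be sketched in a few lines: drop columns that are constant (no signal) or exact duplicates of an earlier column. This is a hypothetical stdlib illustration of the idea only; the function name and sample data are invented, and in practice this would be done over Snowflake tables with SQL or Snowpark rather than in-memory dicts.

```python
def reduce_columns(rows):
    """Return the column names worth keeping from a list of row dicts.

    Drops columns that are constant across all rows, or that are exact
    value-for-value duplicates of an earlier column -- a toy version of
    reducing a wide table to a non-duplicated feature set.
    """
    if not rows:
        return []
    seen = {}   # tuple of column values -> first column with those values
    keep = []
    for col in rows[0].keys():
        values = tuple(row[col] for row in rows)
        if len(set(values)) <= 1:
            continue            # constant column carries no signal
        if values in seen:
            continue            # exact duplicate of an earlier column
        seen[values] = col
        keep.append(col)
    return keep
```

A real pipeline would add near-duplicate and correlation checks, but the keep/drop decision structure is the same.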

The Pay Range For This Role Is

25-30 INR per year (Remote, India)
