About BayRock Labs
At BayRock Labs, we pioneer innovative tech solutions that drive business transformation. As a leading product engineering firm based in Silicon Valley, we provide full-cycle product development, leveraging cutting-edge technologies in AI, ML, and data analytics. Our collaborative, inclusive culture fosters professional growth and work-life balance. Join us to work on ground-breaking projects and be part of a team that values excellence, integrity, and innovation. Together, let's redefine what's possible in technology.

We are seeking a highly experienced Senior Data Scientist with a minimum of 6 years of hands-on experience to lead the development and deployment of advanced machine learning solutions. The ideal candidate is an expert in predictive modeling, time-series forecasting, and recommendation systems, with a strong focus on native and efficient ML development within the Snowflake Data Cloud (Snowpark). This is a critical MLOps role requiring deep expertise in managing the entire model lifecycle using tools like MLflow, and a proven ability to apply advanced modeling techniques, including Reinforcement Learning (RL), to solve high-impact business challenges.

Key Responsibilities
Advanced Model Development & Leadership
- Design, develop, and implement production-ready machine learning models for core business challenges, including:
  - Prediction (e.g., customer churn, risk, conversion).
  - Forecasting (e.g., demand, resource planning) using advanced time-series methods.
  - Recommendation systems (e.g., content, product matching).
- Lead Advanced Modeling Initiatives: Research, prototype, and implement cutting-edge techniques such as Reinforcement Learning (RL) for sequential decision-making problems (e.g., dynamic pricing, inventory optimization).
- Apply expert-level knowledge of deep learning and other complex algorithms to
drive innovation and competitive advantage.
- Lead the entire model development lifecycle, from ideation and feature engineering
to deployment and monitoring.
Native Snowflake ML & MLOps Excellence
- Snowflake Native ML Development: Drive efficient and native ML development by utilizing Snowpark (Python, Scala, or Java) and Snowflake ML Functions (e.g., FORECAST, Model Registry) for data processing, model training, and inference directly within the Snowflake Data Cloud.
- MLOps and Production Engineering: Own and automate end-to-end ML pipelines,
ensuring scalability, low latency, and high reliability.
- Implement rigorous model optimization and performance tuning to ensure
maximum efficiency and minimal cost within the Snowflake compute environment.
- Utilize MLflow (or similar tools like the Snowflake Model Registry) for
comprehensive experiment tracking, model versioning, and governance.
Data Strategy & Collaboration
- Expertly leverage Snowflake for large-scale data wrangling, feature engineering, and
high-performance data preparation.
- Collaborate closely with Data Engineers to establish and integrate a robust,
production-ready Feature Store into the ML workflow.
- Conduct A/B testing and rigorous experimental design to scientifically validate the
business impact of deployed models and features.
Core Technical Expertise (Must Haves)
- Expert Proficiency in Python and its Data Science ecosystem (Pandas, NumPy, Scikit-learn, TensorFlow, etc.).
- Expert Proficiency in SQL and experience optimizing queries for cloud data
warehouses.
- Deep hands-on experience with Snowflake for data preparation and ML, including
Snowpark.
- Proven experience with MLOps tools and practices, especially MLflow for model
lifecycle management.
- Advanced expertise in implementing and tuning predictive models, time-series, and
recommendation systems.
Advanced and Cloud Skills
- Practical experience in Advanced Modeling beyond classical ML (e.g., Deep
Learning, Bayesian methods).
- Demonstrated experience with Reinforcement Learning (RL) algorithms (e.g., Q-Learning, Policy Gradients) and frameworks (e.g., Stable-Baselines, Ray) for real-world application.
- Experience with cloud computing platforms (AWS, Azure, or GCP) for
infrastructure and model deployment.
The Pay Range For This Role Is
25 - 30 INR per year (Remote, India)