Data Engineer

Experience: 3 years

Posted: 1 day ago | Platform: LinkedIn

Work Mode: Remote

Job Type: Part Time

Job Description

Mercor is partnering with a cutting-edge AI research lab to hire a Senior Data/Analytics Engineer with expertise in DBT and Snowflake's Cortex CLI. In this role, you will build and scale Snowflake-native data and ML pipelines, leveraging Cortex's emerging AI/ML capabilities while maintaining production-grade DBT transformations. You will work closely with data engineering, analytics, and ML teams to prototype, operationalise, and optimise AI-driven workflows, defining best practices for Snowflake-native feature engineering and model lifecycle management. This is a high-impact role within a modern, fully cloud-native data stack.


Responsibilities

  • Design, build, and maintain DBT models, macros, and tests following modular data modeling and semantic best practices.
  • Integrate DBT workflows with the Snowflake Cortex CLI (see the sketch after this list), enabling:
      ◦ Feature engineering pipelines
      ◦ Model training and inference tasks
      ◦ Automated pipeline orchestration
      ◦ Monitoring and evaluation of Cortex-driven ML models
  • Establish best practices for DBT–Cortex architecture and usage patterns.
  • Collaborate with data scientists and ML engineers to productionise Cortex workloads in Snowflake.
  • Build and optimise CI/CD pipelines for dbt (GitHub Actions, GitLab, Azure DevOps).
  • Tune Snowflake compute and queries for performance and cost efficiency.
  • Troubleshoot issues across DBT artifacts, Snowflake objects, lineage, and data quality.
  • Provide guidance on DBT project governance, structure, documentation, and testing frameworks.
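
As a flavour of the Cortex-facing work above, the following is a minimal sketch only: it scores text with a built-in Cortex SQL function from a Python step rather than through the Cortex CLI itself, and it assumes the snowflake-connector-python package, placeholder credentials, and a hypothetical STG_REVIEWS table.

```python
import snowflake.connector

# Placeholder connection parameters -- replace with real account configuration.
conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

try:
    cur = conn.cursor()
    # Score review text with Snowflake's built-in Cortex SENTIMENT function and
    # persist the result as a table a downstream DBT model could build on.
    # STG_REVIEWS and REVIEW_SENTIMENT are hypothetical table names.
    cur.execute(
        """
        CREATE OR REPLACE TABLE REVIEW_SENTIMENT AS
        SELECT
            REVIEW_ID,
            REVIEW_TEXT,
            SNOWFLAKE.CORTEX.SENTIMENT(REVIEW_TEXT) AS SENTIMENT_SCORE
        FROM STG_REVIEWS
        """
    )
finally:
    conn.close()
```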


Required Qualifications

  • 3+ years of experience with DBT Core or DBT Cloud, including macros, packages, testing, and deployments.
  • Strong expertise with Snowflake (warehouses, tasks, streams, materialised views, performance tuning).
  • Hands-on experience with Snowflake Cortex CLI, or strong ability to learn it quickly.
  • Strong SQL skills; working familiarity with Python for scripting and DBT automation (see the sketch after this list).
  • Experience integrating DBT with orchestration tools (Airflow, Dagster, Prefect, etc.).
  • Solid understanding of modern data engineering, ELT patterns, and version-controlled analytics development.
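
To illustrate the Python-for-DBT-automation expectation, here is a minimal sketch assuming the dbt CLI is installed and a profiles.yml already targets the Snowflake account; the selector tag cortex_models is hypothetical.

```python
import subprocess
import sys


def run_dbt(args: list[str]) -> None:
    """Run a dbt CLI command and fail fast on a non-zero exit code."""
    result = subprocess.run(["dbt", *args], check=False)
    if result.returncode != 0:
        sys.exit(result.returncode)


if __name__ == "__main__":
    # Build (run + test) only the models tagged for the Cortex pipeline,
    # then regenerate documentation artifacts.
    run_dbt(["build", "--select", "tag:cortex_models"])
    run_dbt(["docs", "generate"])
```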


Nice-to-Have Skills

  • Prior experience operationalising ML workflows inside Snowflake.
  • Familiarity with Snowpark and Python UDFs/UDTFs (a minimal sketch follows this list).
  • Experience building semantic layers using DBT metrics.
  • Knowledge of MLOps / DataOps best practices.
  • Exposure to LLM workflows, vector search, and unstructured data pipelines.
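
For the Snowpark item above, this is a minimal sketch of registering and applying a Python UDF; it assumes the snowflake-snowpark-python package, placeholder connection parameters, and a hypothetical STG_TICKETS staging table.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, udf
from snowflake.snowpark.types import StringType

# Placeholder connection parameters -- replace with real account configuration.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()


# Register a simple server-side UDF that normalises free-text labels.
@udf(name="normalise_label", return_type=StringType(),
     input_types=[StringType()], replace=True)
def normalise_label(raw: str) -> str:
    return (raw or "").strip().lower()


# Apply the UDF to a hypothetical staging table and preview the result.
df = session.table("STG_TICKETS").with_column(
    "LABEL_CLEAN", normalise_label(col("LABEL_RAW"))
)
df.show()

session.close()
```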


Why Join

  • You will be an hourly contractor through Mercor, working 20–40 hours per week with flexibility.
  • Direct opportunity to build next-generation Snowflake AI/ML systems with Cortex.
  • High-impact ownership of DBT and Snowflake architecture across production pipelines.
  • Work alongside top-tier ML engineers, data scientists, and research teams.
  • Fully remote, high-autonomy environment focused on innovation, velocity, and engineering excellence.


We consider all qualified applicants without regard to legally protected characteristics and provide reasonable accommodations upon request.
