Analytics Engineer (Remote)

7 - 12 years

7 - 14 Lacs

Posted: Just now | Platform: Naukri


Work Mode: Remote

Job Type: Full Time

Job Description

Crossing Hurdles

Position: Analytics Engineer

Type: Full Time

Compensation: 7 - 14 Lacs

Location: Remote

Commitment:

Role & Responsibilities

  • Design, build, and maintain DBT models, macros, and tests following modular and semantic data modeling best practices.
  • Integrate DBT workflows with Snowflake Cortex CLI for feature engineering, model training, inference, orchestration, and evaluation.
  • Build and scale Snowflake-native data and ML pipelines leveraging Cortex AI/ML capabilities.
  • Collaborate closely with data engineers, analytics engineers, and ML teams to operationalize AI-driven workflows.
  • Establish best practices for DBT + Cortex architecture, governance, and usage patterns.
  • Optimize Snowflake queries and compute for performance and cost efficiency.
  • Build and maintain CI/CD pipelines for DBT using tools like GitHub Actions, GitLab, or Azure DevOps.
  • Troubleshoot issues across DBT artifacts, Snowflake objects, data lineage, and data quality checks.
  • Maintain clear documentation for data models, pipelines, testing frameworks, and architectural decisions.
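
To illustrate the testing responsibility above: DBT's built-in generic tests (such as `not_null` and `unique`) pass when the check query returns zero rows. The sketch below mimics that contract in plain Python; it is illustrative only, not DBT's actual implementation, and the table and column names are hypothetical.

```python
# Illustrative sketch of the column checks that DBT generic tests
# (not_null, unique) automate. In DBT, a test passes when its query
# returns zero rows; these helpers follow the same convention.
# Table and column names below are hypothetical.

def not_null(rows, column):
    """Return rows where `column` is NULL (empty list = test passes)."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return duplicated values of `column` (empty list = test passes)."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(v)
        seen.add(v)
    return dupes

# Hypothetical model output to test against
orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},
]

print(not_null(orders, "customer_id"))  # one failing row (NULL customer_id)
print(unique(orders, "order_id"))       # one duplicated order_id
```

In a real project these checks would be declared in a schema YAML file against the model's columns and run with `dbt test`, not hand-written.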

Requirements:

  • 3+ years of experience with DBT Core or DBT Cloud, including macros, packages, testing, and deployments.
  • Strong hands-on expertise with Snowflake (warehouses, tasks, streams, materialized views, performance tuning).
  • Hands-on experience with Snowflake Cortex CLI or strong ability to learn and adopt it quickly.
  • Strong SQL skills with working familiarity in Python for scripting and DBT automation.
  • Experience integrating DBT with orchestration tools such as Airflow, Dagster, or Prefect.
  • Solid understanding of modern data engineering principles, ELT patterns, and version-controlled analytics workflows.
  • Strong problem-solving skills and clear communication; fluency in English.

Preferred:

  • Experience operationalizing ML workflows within Snowflake.
  • Familiarity with Snowpark and Python UDFs/UDTFs.
  • Experience building semantic layers using DBT metrics.
  • Exposure to MLOps or DataOps best practices.
  • Experience with LLM workflows, vector search, or unstructured data pipelines.

Application Process:

  • Apply for the job role.
  • Await the official message/email from our recruitment team (typically within 1-2 days).

Crossing Hurdles

Consulting

Atlanta
