Data Engineer Analyst – ELT & Feature Store

0 - 2 years

7 - 11 Lacs P.A.

Gurugram, Haryana

Posted: 4 days ago | Platform: Indeed

Skills Required

SQL, PostgreSQL, query tuning, TypeScript, Python, ORM, Git, Docker, Airflow, data quality checks, schema drift detection, forecasting, machine learning, analytics, data engineering, scripting, orchestration, documentation, query writing

Work Mode

On-site

Job Type

Full Time

Job Description

Job Title: Data Engineer / Analyst – ELT & Feature Store
Type: Full-Time
Industry: IT / Software
Location: Gurgaon, Haryana, India (Required)
Experience: 2 Years (Required)

Job Summary:
We are seeking a hands-on Data Engineer / Analyst to build and optimize ELT pipelines and feature stores that power forecasting and machine learning workflows. You'll transform multi-source datasets into trusted, auditable data products while ensuring data quality, performance, and documentation standards.

Key Responsibilities:
- Build scalable ELT pipelines and star-schema models using TypeScript + Drizzle ORM.
- Implement data quality checks (nulls, schema drift, outliers) and post-COVID re-weighting logic.
- Develop and manage feature stores (e.g., holiday events, promo tracking, customer segments).
- Optimize ingestion and transformation performance using chunked uploads and materialized views.
- Maintain up-to-date data dictionaries, ERDs, and lineage documentation.

Required Skills:
- Advanced SQL & PostgreSQL tuning
- TypeScript and/or Python
- dbt-style modular transformations
- Git, Docker, and CI/CD basics

Good to Have:
- Experience with Airflow or Airbyte
- Familiarity with Parquet or other columnar formats
- BI-level validation and support

Job Type: Full-time
Pay: ₹700,000.00 - ₹1,100,000.00 per year

Application Question(s):
- Do you have at least 2 years of professional experience in a data engineering or analytics role?
- Are you proficient in writing and optimizing advanced SQL queries?
- Have you worked with PostgreSQL and performed query tuning?
- Are you experienced in TypeScript or Python for data scripting and transformation?
- Have you built ELT pipelines or transformations using dbt-style (modular, version-controlled) approaches?
- Have you implemented data quality checks such as null checks, schema drift detection, or outlier handling?
- Are you familiar with data orchestration tools like Airflow or Airbyte?

Work Location: In person
Application Deadline: 07/06/2025
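For candidates wondering what the data-quality checks mentioned above (null checks, schema drift detection, outlier handling) might look like in practice, here is a minimal TypeScript sketch. All names and thresholds are illustrative assumptions, not the employer's actual code:

```typescript
// Illustrative sketch of row-level data-quality checks: null rates,
// schema drift against an expected column list, and z-score outliers.
type Row = Record<string, unknown>;

interface QualityReport {
  nullRate: Record<string, number>; // fraction of null/undefined values per expected column
  driftedColumns: string[];         // expected columns missing from the data (schema drift)
  outliers: number[];               // indices of rows whose numeric value is an outlier
}

function checkQuality(
  rows: Row[],
  expectedColumns: string[],
  numericColumn: string,
  zThreshold = 3
): QualityReport {
  // Null checks: count null/undefined per expected column.
  const nullRate: Record<string, number> = {};
  for (const col of expectedColumns) {
    const nulls = rows.filter((r) => r[col] === null || r[col] === undefined).length;
    nullRate[col] = nulls / rows.length;
  }

  // Schema drift: expected columns absent from the first row's keys.
  const seen = new Set(Object.keys(rows[0] ?? {}));
  const driftedColumns = expectedColumns.filter((c) => !seen.has(c));

  // Outliers: simple z-score test on one numeric column.
  const values = rows
    .map((r) => r[numericColumn])
    .filter((v): v is number => typeof v === "number");
  const mean = values.reduce((a, b) => a + b, 0) / values.length;
  const std = Math.sqrt(
    values.reduce((a, v) => a + (v - mean) ** 2, 0) / values.length
  );
  const outliers: number[] = [];
  rows.forEach((r, i) => {
    const v = r[numericColumn];
    if (typeof v === "number" && std > 0 && Math.abs(v - mean) / std > zThreshold) {
      outliers.push(i);
    }
  });

  return { nullRate, driftedColumns, outliers };
}
```

In a real pipeline these checks would typically run inside the ELT orchestration layer (e.g. as an Airflow task) and fail the run or quarantine rows when thresholds are breached.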