Senior Data Engineer (ML)

Experience: 5 years

Posted: 3 weeks ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

This role is for one of Weekday's clients.

Min Experience: 5 years | Job Type: Full-time

We are seeking a Senior Data Engineer (ML Platform) to help design and build a no-code interface that enables users to develop predictive models for complex supply chain challenges. In this role, you will work closely with platform and data science teams to orchestrate end-to-end workflows for supervised learning and time-series forecasting, ensuring a seamless and high-performance user experience.

Requirements

Why This Role Now:

The ML workspaces are being re-architected to efficiently manage large-scale data volumes while optimizing both system throughput and predictive performance. This involves enhancing ETL pipelines, increasing parallel processing efficiency, and creating a unified workflow that supports powerful yet user-friendly machine learning operations.

What Success Looks Like in the First 3-6 Months:

  • Optimize ETL and data storage pipelines to handle large-scale datasets (up to 25 million time series).
  • Work with a modern technology stack leveraging distributed computing, advanced ML algorithms, and MLOps tools.
  • Gain end-to-end exposure across the ML product lifecycle — from data ingestion to analytics and user-facing dashboards.
  • Take ownership of the ML workspace platform to drive continuous improvement and deliver exceptional user experience.

What Makes This Role Exciting:

You'll contribute to developing one of the most advanced products in the platform ecosystem, shaping how users interact with ML workspaces and helping simplify predictive modeling for real-world business problems.

Key Responsibilities:

  • Apply best practices in software engineering and agile methodologies to build high-quality, scalable data solutions.
  • Research and prototype new frameworks and technologies to enhance the data and ML infrastructure.
  • Own and evolve the product architecture, ensuring continuous improvement and technical excellence.
  • Design and implement robust data pipelines to support data processing and analytical workflows.
  • Ensure data quality and consistency through effective cleaning, transformation, and validation.
  • Manage and optimize data flow lifecycles using domain-specific data schemas and efficient data storage techniques (SQL and NoSQL).
  • Collaborate with cross-functional teams to integrate, deploy, and maintain distributed data systems.

Must-Haves:

  • 5-10 years of experience in product or data engineering roles.
  • Proven expertise in designing and implementing data pipelines and ETL workflows.
  • Strong proficiency in Python, SQL, and REST API development.
  • Hands-on experience with modern data tools such as Airflow, Kafka, and Snowflake.
  • Solid understanding of distributed computing frameworks (Spark, Dask, Ray, etc.).
  • Familiarity with Docker, Kubernetes, and CI/CD pipelines.
  • Strong grounding in software engineering fundamentals and data modeling.

Nice-to-Haves:

  • Exposure to MLOps practices and tools, including monitoring and managing data drift.
  • Understanding of the end-to-end lifecycle of machine learning projects.
