Senior Data Engineer

Experience: 8 years

Work Mode: Remote

Job Type: Full Time

Job Description

Senior Data Engineer

Location: Remote

Company: Emeritus

About Emeritus

At Emeritus, we're committed to teaching you the skills of the future by making high-quality education accessible and affordable. We achieve this by collaborating with more than 80 top-tier universities across the United States, Europe, Latin America, Southeast Asia, India, and China. Our short courses, degree programs, professional certificates, and senior executive programs help you learn new skills and transform your life, company, and organization. With our unique model of state-of-the-art technology, curriculum innovation, and hands-on instruction from senior faculty, mentors, and coaches, we've educated more than 350,000 individuals across 80+ countries.

Role Overview

We are looking for a Senior Data Engineer (5–8 years of experience) to design, build, and optimize data platforms and streaming pipelines that power data products, analytics, and AI/ML initiatives at Emeritus. You will work closely with data scientists, ML engineers, product managers, and business stakeholders to build robust, scalable, and reliable data systems, including LLM-powered applications.

Key Responsibilities

- Design, build, and maintain scalable data pipelines (batch and streaming) to support analytics, personalization, and AI/ML use cases.
- Develop, deploy, and maintain microservices and data APIs using Python frameworks such as FastAPI or Flask.
- Build and optimize distributed data processing jobs using technologies such as Apache Spark, Flink, or Beam (with a strong focus on streaming workloads).
- Design and manage data models, warehouses, and data marts; write efficient SQL for analytics and production workloads.
- Implement and maintain event-driven architectures using pub/sub or messaging systems.
- Set up and maintain CI/CD pipelines for data and application deployments.
- Containerize services using Docker and manage deployments on Kubernetes-based infrastructure.
- Orchestrate complex data workflows using tools such as Apache Airflow or Argo Workflows (a minimal sketch follows this list).
- Collaborate with ML and product teams to build and productionize LLM-powered applications (agents, RAG-based systems, etc.).
- Work with vector databases and retrieval systems for RAG and semantic search use cases.
- Ensure data quality, reliability, observability, and documentation across all pipelines and services.
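For context on the workflow orchestration responsibility above, the following is a minimal sketch of a daily batch pipeline using Apache Airflow's TaskFlow API (assuming Airflow 2.4+, where the `schedule` argument is used). The DAG name, task logic, and data are illustrative only and not part of the role description.

```python
# Illustrative sketch only: a hypothetical daily extract/transform/load DAG.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_enrollment_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw enrollment events from a hypothetical source system.
        return [{"learner_id": 1, "course": "data-eng-101"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize course codes before loading.
        return [{**r, "course": r["course"].upper()} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to the warehouse; here we just log.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


daily_enrollment_pipeline()
```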
