Data Engineer

Experience

2 years

Salary

0 Lacs

Posted: 5 hours ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

What You’ll Do:

  • Architect and sustain scalable data pipelines to continuously ingest user event data into our centralized data warehouse (an illustrative sketch follows this list).
  • Engineer canonical datasets and key performance metrics that enable the tracking of user growth, engagement, retention, and revenue.
  • Design robust, fault-tolerant ingestion systems to ensure high availability and data consistency during processing.
  • Ensure the security, integrity, and compliance of data according to industry and company standards.
  • Partner closely with cross-functional teams including Infrastructure, Data Science, Product, Marketing, Finance, and Research to understand data needs and deliver scalable solutions.
  • Set up and manage data pipelines for the streaming platform, integrating data from product events, internal tools, and third-party platforms such as Google Ads Manager, Meta Ads Manager, and YouTube Ads Manager into a single analytics layer.
  • Plan and implement event tracking frameworks across Android, iOS, Web, and TV platforms.
  • Build and maintain A/B testing and experimentation pipelines across all platforms.
  • Set up data pipelines for internal company tools, and create dashboards for tracking and analysis.
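
To give a flavor of the pipeline work above, here is a minimal sketch of a daily ingestion job using Apache Airflow, one of the schedulers this role touches. The DAG id, schedule, and load step are hypothetical placeholders rather than details from this posting.

```python
# Illustrative sketch only: a minimal Airflow 2.x DAG that loads one day of
# user-event data into the warehouse. Names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_events_to_warehouse(ds: str, **_) -> None:
    """Stub load step: in practice this would pull the day's event export
    from object storage (e.g. S3) and bulk-load it into the warehouse."""
    print(f"Loading user events for {ds} into analytics.user_events")


with DAG(
    dag_id="ingest_user_events",     # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # one run per day of event data
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_events",
        python_callable=load_events_to_warehouse,
    )
```

A production pipeline would add retries, alerting, idempotent loads, and data-quality checks on top of this skeleton.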


We’re looking for experience with:

  • Around 2 years of experience as a Data Engineer or in a similar role involving the implementation and maintenance of complex software systems.
  • Proficiency in at least one language commonly used within Data Engineering, such as Python, Scala, or Java.
  • Hands-on experience with distributed data processing frameworks such as Hadoop, Apache Spark, Apache Flink, and Apache Beam; distributed storage systems like HDFS and S3; and query engines like Presto/Trino.
  • Managing data migration and complex workflows using ETL schedulers like Apache Airflow, AWS Glue, Dagster, or Prefect.
  • Implementation of real-time streaming and messaging systems such as Apache Kafka, AWS Kinesis, Google Pub/Sub, or Apache Pulsar (see the sketch after this list).
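
As a concrete illustration of the streaming stack above, here is a minimal PySpark Structured Streaming sketch that reads user events from a Kafka topic and appends them to a Parquet sink. The broker address, topic, schema, and paths are assumptions made for the example, and the job expects the spark-sql-kafka connector package to be on the classpath.

```python
# Illustrative sketch only: consume JSON user events from Kafka with
# Spark Structured Streaming and append them to Parquet on S3.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("user-event-stream").getOrCreate()

# Assumed event schema: one JSON record per Kafka message.
event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_name", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "user_events")                # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://warehouse/user_events/")                     # hypothetical sink
    .option("checkpointLocation", "s3a://warehouse/_chk/user_events/")  # required for streaming writes
    .outputMode("append")
    .start()
)
query.awaitTermination()
```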


Bonus Points:

  • Experience working in fast-moving startup environments.
  • Ability to thrive in high-ownership, high-speed, hands-on problem-solving environments.


Compensation:
