Big Data Engineer

Experience

5 - 8 years

Salary

9 - 14 Lacs

Posted: 1 month ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

About Tarento: Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Big Data Engineer (Connected Scooter Platform)

About the Role: We are building a cutting-edge connected vehicle platform for our client's next-generation scooters. This role is critical in designing and managing the data pipelines that ingest, process, and deliver telematics and sensor data from thousands of connected scooters to the cloud in real time.

Key Responsibilities:
- Develop and maintain robust ETL pipelines using PySpark and AWS Glue to handle high-volume streaming and batch data from connected scooters (an illustrative sketch follows this description).
- Design and implement scalable data processing frameworks to cleanse, transform, and enrich raw vehicle telemetry data.
- Work closely with IoT and telematics teams to understand data schemas and ensure reliable data ingestion from edge devices to the AWS cloud.
- Optimize and tune big data jobs for performance and cost efficiency within AWS infrastructure.
- Implement monitoring and logging for data pipelines to ensure data quality, reliability, and compliance.
- Collaborate with Data Scientists, Analysts, and Backend Engineers to deliver clean and usable datasets for analytics, predictive maintenance, and OTA updates.

Must Have Skills:
- Strong hands-on experience with PySpark and Spark-based ETL processing.
- Proficiency in AWS Glue, S3, Lambda, and related AWS data services.
- Solid understanding of Big Data architectures, distributed computing principles, and performance tuning.
- Experience with data lake design, schema management, and partitioning strategies.
- Familiarity with JSON, Parquet, and other modern data formats used for IoT telemetry data.
- Strong problem-solving and debugging skills for large-scale data workflows.

Good to Have:
- Knowledge of connected vehicle or IoT data ecosystems.
- Experience with data governance and security best practices in the cloud.
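To illustrate the kind of work described above, here is a minimal sketch of a PySpark batch job that reads raw JSON telemetry, cleanses it, and writes date-partitioned Parquet. All paths, column names, and the schema are hypothetical examples rather than the client's actual data model; a production AWS Glue job would typically wrap similar logic in a GlueContext/DynamicFrame workflow.

from pyspark.sql import SparkSession, functions as F, types as T

spark = SparkSession.builder.appName("scooter-telemetry-etl").getOrCreate()

# Hypothetical telemetry schema; real field names depend on the device firmware.
schema = T.StructType([
    T.StructField("scooter_id", T.StringType()),
    T.StructField("event_ts", T.StringType()),      # ISO-8601 timestamp string from the device
    T.StructField("speed_kmh", T.DoubleType()),
    T.StructField("battery_pct", T.DoubleType()),
    T.StructField("lat", T.DoubleType()),
    T.StructField("lon", T.DoubleType()),
])

# Read raw JSON telemetry landed in S3 (hypothetical bucket and prefix).
raw = spark.read.schema(schema).json("s3://example-bucket/raw/telemetry/")

# Cleanse: parse timestamps, drop records missing key fields, deduplicate,
# and derive a partition column.
clean = (raw
         .withColumn("event_ts", F.to_timestamp("event_ts"))
         .filter(F.col("scooter_id").isNotNull() & F.col("event_ts").isNotNull())
         .dropDuplicates(["scooter_id", "event_ts"])
         .withColumn("event_date", F.to_date("event_ts")))

# Write curated data as Parquet, partitioned by date.
(clean.write
      .mode("append")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/telemetry/"))

Partitioning on event_date reflects the "partitioning strategies" point above: queries over a time window only touch the relevant partitions, which keeps both Spark job runtimes and S3 scan costs down.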
