Data Engineer

Experience: 3 years
Salary: 0 Lacs
Posted: 1 day ago | Platform: LinkedIn
Work Mode: On-site
Job Type: Full Time

Job Description

Company Introduction

Coditas is a new-age, offshore product development organization offering services pertaining to the entire software development life cycle. Headquartered in Pune, Coditas works with clients across the globe. We attribute our organic growth to an engineering-driven culture and steadfast philosophies around writing clean code, designing intuitive user experiences, and letting the work speak for itself.


Job Description

We are looking for data engineers who have the right attitude, aptitude, skills, empathy, compassion, and hunger for learning. You will build products in the data analytics space. We value a passion for shipping high-quality data products, an interest in the data products space, and curiosity about the bigger picture of building a company, its products, and its people.


Roles and Responsibilities

● Develop and manage robust ETL pipelines using Apache Spark (Scala) (see the sketch after this list)
● Understand Spark concepts, performance optimization techniques, and governance tools
● Develop highly scalable, reliable, and high-performance data processing pipelines to extract, transform, and load data from various systems to the Enterprise Data Warehouse/Data Lake/Data Mesh
● Collaborate cross-functionally to design effective data solutions
● Implement data workflows using AWS Step Functions for efficient orchestration; leverage AWS Glue and Glue crawlers for seamless data cataloging and automation
● Monitor, troubleshoot, and optimize pipeline performance and data quality
● Maintain high coding standards and produce thorough documentation
● Contribute to high-level (HLD) and low-level (LLD) design discussions
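
For illustration only, below is a minimal sketch of the kind of Spark (Scala) ETL job described above. The OrdersEtl object, the S3 paths, and the orders columns are hypothetical assumptions for the example, not details taken from this posting.

// Minimal illustrative Spark (Scala) ETL job: extract raw JSON, apply basic
// cleansing and an aggregate, and load partitioned Parquet to a data lake zone.
// All paths and column names below are hypothetical.
import org.apache.spark.sql.{SparkSession, functions => F}

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .getOrCreate()

    // Extract: read raw JSON landed in S3 (hypothetical path)
    val raw = spark.read.json("s3://example-raw-bucket/orders/")

    // Transform: drop records without an order_id and derive an order_date column
    val cleaned = raw
      .filter(F.col("order_id").isNotNull)
      .withColumn("order_date", F.to_date(F.col("order_ts")))

    // A simple daily revenue aggregate as an example downstream table
    val dailyRevenue = cleaned
      .groupBy("order_date")
      .agg(F.sum("amount").alias("total_revenue"))

    // Load: write cleansed records to the curated zone, partitioned for query pruning
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/")

    // Write the aggregate alongside it
    dailyRevenue.write
      .mode("overwrite")
      .parquet("s3://example-curated-bucket/daily_revenue/")

    spark.stop()
  }
}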


Technical Skills

● Minimum 3 years of progressive experience building solutions in Big Data environments
● Strong ability to build robust and resilient data pipelines that are scalable, fault tolerant, and reliable in terms of data movement
● 3+ years of hands-on expertise in Python, Spark, and Kafka
● Strong command of AWS services such as EMR, Redshift, Step Functions, AWS Glue, and Glue crawlers
● Strong hands-on capabilities in SQL and NoSQL technologies
● Sound understanding of data warehousing, modeling, and ETL concepts
● Familiarity with High-Level Design (HLD) and Low-Level Design (LLD) principles
● Excellent written and verbal communication skills
