Posted: 1 day ago | On-site | Full Time
Company Introduction
Coditas is a new-age, offshore product development organization, offering services
pertaining to the entire software development life cycle. Headquartered in Pune, Coditas
works with clients across the globe. We attribute our organic growth to an
engineering-driven culture and steadfast philosophies around writing clean code, designing
intuitive user experiences, and letting the work speak for itself.
Job Description
We are looking for data engineers with the right attitude, aptitude, skills, empathy, compassion, and hunger for learning to build products in the data analytics space. The ideal candidate has a passion for shipping high-quality data products, an interest in the data products space, and curiosity about the bigger picture: building a company, developing products, and growing its people.
Roles and Responsibilities
● Develop and manage robust ETL pipelines using Apache Spark (Scala)
● Understand Spark concepts, performance optimization techniques, and governance tools
● Develop a highly scalable, reliable, and high-performance data processing pipeline to
extract, transform and load data from various systems to the Enterprise Data
Warehouse/Data Lake/Data Mesh
● Collaborate cross-functionally to design effective data solutions
● Implement data workflows using AWS Step Functions for efficient orchestration
● Leverage AWS Glue and Glue crawlers for seamless data cataloging and automation
● Monitor, troubleshoot, and optimize pipeline performance and data quality
● Maintain high coding standards and produce thorough documentation
● Contribute to high-level (HLD) and low-level (LLD) design discussions
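To illustrate the extract-transform-load flow the responsibilities above describe, here is a minimal, hypothetical sketch in plain Python. All names and data are invented for illustration; a production pipeline in this role would use Spark (Scala) and AWS services rather than in-memory structures:

```python
# Minimal ETL sketch: extract raw records, transform them with a basic
# data-quality gate, and load them into a target store. Everything here is
# illustrative; "warehouse" is a dict standing in for a data warehouse.

def extract(source: list[dict]) -> list[dict]:
    """Pull raw records from a source system (here, an in-memory list)."""
    return list(source)

def transform(records: list[dict]) -> list[dict]:
    """Normalize fields and drop records that fail a basic quality check."""
    cleaned = []
    for r in records:
        if r.get("amount") is None:  # quality gate: skip incomplete rows
            continue
        cleaned.append({"id": r["id"], "amount": round(float(r["amount"]), 2)})
    return cleaned

def load(records: list[dict], target: dict) -> None:
    """Write records into the target keyed by id."""
    for r in records:
        target[r["id"]] = r

source = [{"id": 1, "amount": "10.456"}, {"id": 2, "amount": None}]
warehouse: dict = {}
load(transform(extract(source)), warehouse)
```

The same three-stage shape carries over to Spark jobs: `extract` becomes a read from a source system, `transform` a set of DataFrame operations with quality checks, and `load` a write to the warehouse or lake.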
Technical Skills
● Minimum 3 years of progressive experience building solutions in Big Data
environments.
● Strong ability to build robust, resilient data pipelines that are scalable, fault tolerant, and reliable in terms of data movement.
● 3+ years of hands-on expertise in Python, Spark and Kafka.
● Strong command of AWS services such as EMR, Redshift, Step Functions, and AWS Glue (including Glue crawlers).
● Strong hands-on capabilities with SQL and NoSQL technologies.
● Sound understanding of data warehousing, modeling, and ETL concepts
● Familiarity with High-Level Design (HLD) and Low-Level Design (LLD) principles
● Excellent written and verbal communication skills.