Posted: 2 months ago
Platform: Hybrid
Full Time
Data Developer

Location: Bangalore / Pune
Experience: 5+ years
Job Type: Full-time

Job Description:
We are seeking a skilled Senior / Junior engineer to join our team in preventing financial crime. The ideal candidate will have strong expertise in Python, PySpark, SQL, and ETL development, along with experience working with AWS, Kafka, APIs, Airflow, and CI/CD pipelines.

Key Responsibilities:
- Develop and optimize ETL pipelines for large-scale data processing
- Work with Big Data technologies such as Hadoop data lakes, PySpark, SQL, and scripting languages (Python, Shell scripting)
- Build and maintain CI/CD pipelines for seamless deployment
- Ensure solutions meet industry standards for usability, performance, reliability, and scalability
- Implement DevOps best practices and work with Airflow (on-prem and AWS)
- Collaborate with cross-functional teams to deliver high-quality solutions

Primary Skills: Python, PySpark, SQL, ETL
Secondary Skills (any one is mandatory): AWS, Kafka, API development (REST/gRPC), Airflow (on-prem and AWS), CI/CD
Good to Have: Strong Python coding skills, SQL, data modeling, advanced Spark, experience with financial-domain projects
Company: Consulting Krew
Location: Pune, Bengaluru
Salary: 5.0 - 15.0 Lacs P.A.
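The responsibilities above centre on PySpark ETL work. As a minimal illustrative sketch of that kind of pipeline (the S3 paths, column names, and aggregation below are hypothetical placeholders, not part of the role or this posting):

```python
# Minimal PySpark ETL sketch: read raw transaction records, clean and
# aggregate them, and write a partitioned output dataset.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

# Extract: load raw CSV files from a (hypothetical) landing zone.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-bucket/landing/transactions/")
)

# Transform: drop malformed rows, derive a date column, and
# aggregate amounts per account per day.
daily_totals = (
    raw.dropna(subset=["account_id", "amount", "txn_ts"])
    .withColumn("txn_date", F.to_date("txn_ts"))
    .groupBy("account_id", "txn_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Load: write the result as Parquet, partitioned by date, for downstream jobs.
(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-bucket/curated/daily_account_totals/")
)

spark.stop()
```

In a production setting, a job like this would typically be scheduled from an Airflow DAG and promoted through the CI/CD pipeline mentioned in the posting.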