Posted: 15 hours ago
Platform: On-site
Full Time
We are looking for a hands-on technologist with deep expertise in Python and a strong background in data engineering, cloud platforms, and modern development practices. You will play a key role in building scalable, high-performance applications and data pipelines that power critical business functions. You will be instrumental in designing and developing high-performance data pipelines from relational to graph databases, and in leveraging Agentic AI for orchestration. You'll also define APIs using AWS Lambda and containerised services on AWS ECS. Join us on an exciting journey where you'll work with cutting-edge technologies, including Generative AI, Agentic AI, and modern cloud-native architectures, while continuously learning and growing alongside a passionate team.

- Thrive in a fast-paced, ever-evolving environment with shifting priorities.
- Demonstrated ability to quickly learn and integrate new technologies and frameworks.
- Strong problem-solving mindset with the ability to juggle multiple priorities effectively.

Core Responsibilities
- Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark.
- Define and implement smart data pipelines from RDBMS to graph databases.
- Build and expose APIs using AWS Lambda and ECS-based microservices.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Write clean, efficient, and scalable code following best practices.
- Troubleshoot, debug, and optimise applications for performance and reliability.
- Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows where required.
- Ensure security, compliance, and observability across all development activities.

Required Skills
- Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming.
- Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune).
- Hands-on experience with cloud platforms; AWS and/or Azure is a must.
- Proficiency in PySpark or similar data ingestion and processing frameworks.
- Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git.
- Strong understanding of CI/CD, version control, and agile development practices.
- Excellent communication and collaboration skills.

Desirable Skills
- Experience with Agentic AI, machine learning, or LLM-based systems.
- Familiarity with Apache Iceberg or similar modern data lakehouse formats.
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform or Ansible.
- Understanding of microservices architecture and distributed systems.
- Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack).
- Experience working in Agile/Scrum environments.

Minimum Qualifications
- 6 to 8 years of hands-on experience in Python development and data engineering.
- Demonstrated success in delivering production-grade software and scalable data solutions.
Sourced Group