Pune, Maharashtra, India
Not disclosed
On-site
Full Time
About the Role:
We are looking for a Technical Project Manager who combines solid hands-on technical expertise in Python programming and Amazon Web Services (AWS) with proven experience managing software development projects. You will lead cross-functional teams to deliver scalable, secure, and reliable cloud-based applications, ensuring alignment with business goals and technology strategy.

Key Responsibilities:
- Manage the end-to-end software development lifecycle for technical projects using Agile methodologies.
- Lead architecture discussions, technical design reviews, and coding best practices.
- Collaborate with engineering teams to ensure high-quality deliverables within timelines and budgets.
- Own project planning, sprint execution, resource allocation, risk management, and stakeholder reporting.
- Actively contribute to Python codebases, helping debug, review, or develop features when needed.
- Coordinate with DevOps and cloud engineers to maintain robust CI/CD pipelines on AWS.
- Communicate project progress, issues, and risks clearly to both technical and non-technical stakeholders.
- Ensure adherence to security, compliance, and governance standards across AWS infrastructure.

Must-Have Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of software development experience, including strong hands-on Python programming.
- 8+ years of experience managing software or cloud projects as a Technical Project Manager.
- Deep understanding of AWS services (EC2, S3, Lambda, RDS, CloudFormation, IAM, etc.).
- Experience with Agile/Scrum methodologies and tools (JIRA, Confluence, etc.).
- Solid understanding of CI/CD processes, infrastructure as code (e.g., Terraform or CloudFormation), and DevOps practices.
- Excellent communication, leadership, and team management skills.

Nice-to-Have Skills:
- AWS certification (e.g., AWS Certified Solutions Architect or DevOps Engineer).
- Experience working with microservices, API design, or event-driven architectures.
- Familiarity with Docker, Kubernetes, and serverless frameworks.
- Background in data engineering, ML pipelines, or cloud-native applications.
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Overview:
We are seeking a highly skilled Python Developer with strong expertise in Amazon Web Services (AWS) and Infrastructure as Code (IaC) tools such as CloudFormation and Terraform. The ideal candidate will have hands-on experience building and maintaining scalable cloud-native solutions using services such as EC2, S3, and Lambda. If you're passionate about automation, cloud architecture, and delivering reliable solutions in a fast-paced environment, we'd love to hear from you.

Key Responsibilities:
- Design, develop, and maintain Python-based applications and automation scripts.
- Implement and manage cloud infrastructure using AWS CloudFormation and Terraform.
- Build and deploy scalable, secure, and cost-effective cloud solutions using AWS services (EC2, S3, Lambda, etc.).
- Collaborate with DevOps, Cloud Engineering, and Application Development teams to streamline deployment and delivery processes.
- Troubleshoot issues across development, test, and production environments.
- Write unit and integration tests to ensure code quality and reliability.
- Participate in architecture reviews and code reviews, and contribute to best practices.

Required Qualifications:
- 5+ years of Python development experience, with a strong understanding of core programming concepts and object-oriented design.
- Hands-on experience with AWS, including services such as EC2, S3, Lambda, IAM, and CloudFormation.
- Experience with Terraform, including module development and state management.
- Solid understanding of DevOps principles, CI/CD pipelines, and cloud security best practices.
- Familiarity with Git, Jenkins, or similar version control and CI tools.
- Strong problem-solving skills and the ability to work independently or within a team.
- Excellent written and verbal communication skills.
Pune, Maharashtra, India
Not disclosed
On-site
Full Time
Job Title: Python Databricks Developer
Location: Pune
Experience: 6 to 10 Years
Employment Type: Full-Time

Job Summary:
We are looking for a skilled and experienced Python Databricks Developer proficient in developing scalable data pipelines, data transformation logic, and cloud-native analytics solutions using Python, Databricks, and AWS services. The ideal candidate should have strong data engineering experience and be comfortable working in fast-paced, Agile environments.

Key Responsibilities:
- Design and develop scalable ETL pipelines and data workflows using Databricks (PySpark) and Python.
- Work on large-scale data ingestion, processing, and transformation from various sources.
- Leverage AWS services (e.g., S3, Glue, Lambda, Redshift, EMR) for data storage, orchestration, and compute.
- Optimize performance of Spark jobs and Databricks notebooks for large-scale data operations.
- Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions.
- Implement best practices for data quality, data governance, and security.
- Participate in code reviews, testing, and deployment of data solutions in DevOps-driven environments.
- Create and maintain technical documentation, data dictionaries, and process flows.

Required Skills & Experience:
- Strong hands-on programming experience in Python.
- Minimum 4+ years of experience working with Databricks, especially with PySpark and Delta Lake.
- Experience building and managing data pipelines and ETL processes in cloud environments, particularly AWS.
- Solid understanding of distributed computing concepts and Spark performance optimization.
- Hands-on experience with AWS services such as S3, Glue, Lambda, Redshift, Athena, and CloudWatch.
- Experience with version control (e.g., Git), CI/CD tools, and workflow orchestration tools such as Airflow or Databricks Jobs.
- Knowledge of data modeling, data warehousing, and data lake architectures.
- Strong problem-solving skills and the ability to work independently or in a team.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certification in AWS or Databricks is a plus.
- Experience working in Agile environments with Jira or similar tools.
- Familiarity with SQL and NoSQL databases is a plus.