Jobs
Interviews

15579 Containerization Jobs - Page 27

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 years

0 Lacs

India

Remote

Job Title: Senior .NET Developer
Experience: 6–10 Years
Location: Remote (Working Hours: 5 AM – 3 PM IST)
Employment Type: Full-Time

About the Role: We are seeking a seasoned and self-motivated .NET Developer with 6–10 years of experience to design, develop, and maintain scalable applications. The ideal candidate will have a strong background in backend development using Microsoft technologies and should be comfortable working independently in a remote, fast-paced environment aligned with early IST working hours.

Key Responsibilities:
- Design, build, and maintain efficient, reusable, and reliable code using .NET technologies (.NET Core/.NET Framework, C#).
- Develop RESTful APIs and integrate with front-end components.
- Collaborate with cross-functional teams including Product, QA, and DevOps.
- Optimize applications for performance, scalability, and maintainability.
- Participate in code reviews and provide constructive feedback.
- Troubleshoot and resolve complex technical issues in production and development environments.
- Write clean, testable, and well-documented code.

Required Skills:
- Strong proficiency in C#, .NET Core/.NET Framework.
- Experience with Entity Framework, LINQ, and ADO.NET.
- Proficient in designing and consuming RESTful APIs and Web Services (SOAP/REST).
- Solid knowledge of SQL Server, including stored procedures and performance tuning.
- Hands-on experience with Azure or other cloud platforms is a plus.
- Familiarity with CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitHub Actions.
- Good understanding of SOLID principles, OOP, and software architecture patterns (MVC, MVVM, etc.).
- Strong debugging, analytical, and problem-solving skills.

Nice to Have:
- Experience with front-end frameworks like Angular or React.
- Knowledge of Microservices architecture.
- Exposure to containerization tools (Docker, Kubernetes).

Posted 3 days ago

Apply

2.0 years

0 Lacs

Kochi, Kerala, India

On-site

About the Role
We are looking for a System Architect with deep technical expertise in .NET Core, PostgreSQL, and React to lead the architecture and design of our next-generation software systems. In this role, you will define how the backend, frontend, and database layers work together to create scalable, secure, and maintainable solutions. You’ll work closely with developers, product managers, and DevOps to ensure our architecture supports fast development cycles and high-quality releases.

Key Responsibilities
- Design scalable and maintainable architectures using .NET Core for backend services, PostgreSQL for data storage, and React for frontend applications.
- Define system-level architecture including service boundaries, data flow, API design, and deployment strategies.
- Create and maintain architecture documentation, diagrams, and technical specifications.
- Lead technology decisions, including frameworks, libraries, design patterns, and infrastructure.
- Review and guide implementation to ensure it aligns with architectural goals.
- Design and optimize relational data models using PostgreSQL.
- Ensure security, performance, and scalability are embedded in every layer of the system.
- Collaborate with frontend and backend developers, DevOps, and QA to ensure a smooth development lifecycle.
- Advocate for best practices in software engineering, code quality, testing, and deployment.

Requirements
- 5+ years of software development experience, with at least 2+ years in a software/system architect role.
- Strong hands-on experience with .NET Core (C#) in designing APIs and backend systems.
- Solid knowledge of PostgreSQL including schema design, indexing, and performance tuning.
- Experience with React and understanding of modern frontend architecture.
- Deep understanding of software design principles and architectural patterns (e.g., microservices, clean architecture, layered architecture).
- Familiarity with containerization (e.g., Docker) and CI/CD processes.
- Knowledge of RESTful APIs, authentication (OAuth/JWT), and API versioning.
- Excellent communication skills and ability to translate business requirements into technical designs.

Nice to Have
- Experience with cloud platforms (e.g., Azure, AWS).
- Familiarity with GraphQL, WebSockets, or real-time data systems.
- Exposure to DevOps tools like GitHub Actions, Azure DevOps, or Jenkins.
- Experience in Agile development environments.

What We Offer
- Competitive salary based on experience
- Flexible hours
- Supportive, collaborative, and skilled team
- Opportunities for professional growth and training
- Access to the latest tools and tech stack
- Overseas travel

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kochi, Kerala, India

On-site

About Mitsogo | Hexnode:
Mitsogo is a global organization that highly values the contributions of each employee. Our ability to attract top talent is a testament to our commitment to fostering a sense of belonging for everyone. We recognize the rapid evolution of technology and society that impacts our industry, and we prioritize equipping our employees with diverse opportunities and empowering them with a wide range of skills. Hexnode, the Enterprise software division of Mitsogo Inc., was founded to simplify how people work. Operating in over 100 countries, Hexnode UEM empowers organizations in diverse sectors. Fueling the transformation to a seamless ecosystem of connected tools, Hexnode is revolutionizing the enterprise software and cybersecurity landscape.

Job Overview:
We are seeking a skilled DevOps Lead with hands-on experience in Ansible, scripting languages, and AWS. This role requires someone with hands-on expertise who can quickly adapt and provide the leadership necessary to guide teams effectively. The DevOps Lead will play a key role in managing ongoing projects, mentoring junior team members, and ensuring streamlined workflows for optimal performance.

Responsibilities:
- Lead a team of engineers and technical specialists in the design, implementation, and maintenance of cloud-based infrastructure.
- Oversee the development and execution of automation workflows using Ansible and scripting languages (e.g., Python, Bash) to improve operational efficiency.
- Take charge of cloud infrastructure provisioning, configuration, and deployment using AWS services, ensuring high availability, scalability, and security.
- Collaborate with cross-functional teams to define project requirements, develop solutions, and ensure successful project delivery.
- Mentor and guide junior engineers, providing technical expertise and fostering professional development.
- Troubleshoot and resolve complex infrastructure and automation issues in a timely manner.
- Ensure that team activities align with the company's strategic goals, and maintain best practices in coding, system design, and infrastructure management.
- Monitor and optimize system performance, reliability, and security of cloud infrastructure.
- Contribute to the development and improvement of operational processes, including CI/CD pipelines and infrastructure-as-code practices.

Requirements:
- Experience: 5-9 years of hands-on experience in cloud infrastructure, automation, or DevOps-related roles.
- Scripting Languages: Strong expertise in Python, Bash, or similar scripting languages for automation and tool development.
- Ansible: In-depth experience with Ansible for configuration management, automation, and orchestration tasks.
- AWS: Proven experience working with Amazon Web Services (AWS), including services such as EC2, S3, Lambda, CloudFormation, RDS, and VPC.
- CI/CD Pipelines: Knowledge of continuous integration and deployment processes using tools such as Jenkins, GitLab, or similar.
- Containerization and Orchestration: Familiarity with Docker, Kubernetes, and container orchestration tools.
- Leadership: Strong leadership and mentoring skills, with the ability to manage and inspire a team of engineers.
- Problem-Solving: Excellent troubleshooting, diagnostic, and performance optimization skills.
- Collaboration: Strong interpersonal and communication skills to work effectively across teams and stakeholders.
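The automation responsibilities above lean heavily on Python scripting against AWS. Purely as an illustration (not part of the posting), here is a minimal boto3 sketch that reports running EC2 instances missing an "Owner" tag; the region, tag key, and credential setup are assumptions.

```python
# Minimal illustrative sketch: flag running EC2 instances that lack an "Owner" tag.
# Assumptions: boto3 is installed, AWS credentials are configured, and the
# region/tag key below are placeholders, not values from the job posting.
import boto3

def untagged_instances(region: str = "ap-south-1", required_tag: str = "Owner") -> list[str]:
    ec2 = boto3.client("ec2", region_name=region)
    flagged = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if required_tag not in tags:
                    flagged.append(instance["InstanceId"])
    return flagged

if __name__ == "__main__":
    print("Instances missing the tag:", untagged_instances())
```

A script like this would typically run on a schedule (for example from a CI job or Lambda) as one small piece of the kind of operational automation the role describes.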

Posted 3 days ago

Apply

3.0 years

10 - 27 Lacs

Mysore, Karnataka, India

On-site

About The Opportunity A leading IT services and digital integration firm specializing in enterprise application connectivity and automation. We empower clients across banking, retail, healthcare, and manufacturing to drive digital transformation through scalable API-led architectures and robust integration frameworks. Join our team to architect, build, and optimize mission-critical MuleSoft solutions that streamline business processes and amplify system interoperability. Role & Responsibilities Design, develop, and deploy integration solutions on MuleSoft Anypoint Platform, including APIs, connectors, and Mule flows. Leverage DataWeave to perform complex data transformations between JSON, XML, CSV, and database formats. Collaborate with enterprise architects, business analysts, and QA teams to gather requirements, define integration patterns, and ensure end-to-end system reliability. Implement CI/CD pipelines using tools like Jenkins or GitLab to automate build, test, and deployment processes for Mule applications. Troubleshoot, debug, and optimize existing integrations, monitoring performance metrics and resolving defects in production. Document technical designs, code standards, and runbooks; participate in peer code reviews and mentor junior developers on best practices. Skills & Qualifications Must-Have 3+ years of hands-on experience developing with MuleSoft Anypoint Platform (Mule 3.x/4.x, Studio). Strong proficiency in DataWeave scripting for data mapping and transformation tasks. Solid understanding of API design principles (RAML/OpenAPI) and experience configuring Mule API Gateway. Working knowledge of Java or JavaScript for custom component development and scripting. Experience integrating with SOAP/REST web services, JMS queues, JDBC databases, and cloud endpoints. Familiarity with version control systems (Git) and CI/CD automation tools (Jenkins, GitLab CI). Preferred MuleSoft Certified Developer (Mule 4) or MuleSoft Certified Integration Architect credentials. Exposure to containerization (Docker) and orchestration (Kubernetes) for Mule application deployment. Experience with cloud platforms (AWS, Azure) and their integration services (API Gateway, EventHub). Background in event-driven architecture and microservices design patterns. Domain experience in finance, healthcare, or e-commerce integration projects. Benefits & Culture Highlights Competitive compensation with performance-based incentives and annual reviews. On-the-job training, certification support, and access to an extensive technical learning library. Collaborative, innovation-driven environment with cross-functional teams and knowledge-sharing forums. Skills: azure,gitlab ci,mulesoft,java,dataweave,mule api gateway,jms,docker,mule 4,mulesoft anypoint platform,jenkins,soap,rest,javascript,openapi,raml,git,api design principles,kubernetes,jdbc,aws
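The core of this role is data mapping between JSON, XML, and CSV with DataWeave. DataWeave itself is Mule-specific, so purely as a language-neutral illustration of the transformation task described, here is a small Python sketch that flattens JSON records into CSV; the field names and file paths are hypothetical.

```python
# Conceptual illustration only: flatten a list of JSON order records into CSV.
# This stands in for the JSON-to-CSV mapping a DataWeave script would perform;
# the field names ("id", "customer", "total") and file paths are made up.
import csv
import json

def json_to_csv(json_path: str, csv_path: str) -> None:
    with open(json_path, encoding="utf-8") as fh:
        records = json.load(fh)  # expects a list of flat JSON objects

    fieldnames = ["id", "customer", "total"]
    with open(csv_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        for rec in records:
            writer.writerow({k: rec.get(k, "") for k in fieldnames})

if __name__ == "__main__":
    json_to_csv("orders.json", "orders.csv")
```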

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Job Description
Role purpose: Automation Developer with strong knowledge of Java, Selenium, and open-source development, responsible for the development and maintenance of an automation framework used by multiple teams. The candidate should lead the framework transition and create the roadmap and overall delivery plan.

Technical skills:
- Advanced knowledge of Java, Python, JavaScript, Groovy
- Java ecosystem: Apache Maven, Gradle, Spring (Core, Data, MVC, Web), Lombok
- Selenium, Concordion, test automation and test strategy, Spock, JUnit, Mockito, HTML, CSS, XPath
- Git, Linux, Atlassian stack, including X-Ray test management software for JIRA, Passbolt API
- GitHub, Renovate, Ansible, shell scripting and Unix tools
- IDE: IntelliJ, IntelliJ plugin development
- CI/CD: Jenkins, Jenkins Job DSL, GitHub Actions
- DB: SQL, Oracle Database, Oracle SQL Developer, JDBC, JPA, Hibernate
- Networking and protocol skills: SSH, HTTP, REST, JSON, OpenAPI, SOAP, JAX-WS, XML, WSDL, SOAP-UI
- WebLogic, Apache Tomcat
- Kubernetes, Helm, containerization (e.g., Docker), AWS ecosystem
- Spring Boot, microservices
- Infrastructure-as-Code, GitOps
- GitHub Actions & Runners, ArgoCD
- OpenTelemetry (Java APIs), Grafana, Postgres

Additional skills:
- Communication skills; proficiency in spoken and written English
- Scrum
- Willingness to work on agent-facing applications with a German frontend and to dig through historic, non-translated German documentation as needed
- Willingness to train, onboard, and support less technical users
- Handling diverse stakeholders and getting to know their business domains
- Working with a very large, diverse system stack, including multiple legacy systems
- Sense of responsibility, ownership, and initiative to refactor and continuously improve

VOIS Equal Opportunity Employer Commitment India
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society.
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 3 days ago

Apply

12.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Job Description
Vice President, Full-Stack Engineer I

At BNY, our culture allows us to run our company better and enables employees’ growth and success. As a leading global financial services company at the heart of the global financial system, we influence nearly 20% of the world’s investible assets. Every day, our teams harness cutting-edge AI and breakthrough technologies to collaborate with clients, driving transformative solutions that redefine industries and uplift communities worldwide. Recognized as a top destination for innovators and champions of inclusion, BNY is where bold ideas meet advanced technology and exceptional talent. Together, we power the future of finance – and this is what it’s all about. Join us and be part of something extraordinary.

We’re seeking a future team member for the role of Vice President, Full-Stack Engineer I to join our Data Solution & SVCS Platform team. This role is located in Pune.

In this role, you’ll make an impact in the following ways:
- Design, develop, and maintain backend services and RESTful APIs using Java (Spring Boot or similar frameworks)
- Work closely with product, QA, operations, and business stakeholders to understand requirements and translate them into technical solutions
- Build and manage CI/CD pipelines to enable automated testing, deployment, and continuous delivery
- Participate actively in SAFe Agile ceremonies including PI planning, sprint planning, and retrospectives
- Collaborate with DBAs and data architects to ensure performant and accurate data handling within the Client Data Masters platform
- Contribute to front-end development as needed using modern JavaScript frameworks
- Ensure code quality and maintainability through code reviews, automated tests, and clean design
- Support production systems, troubleshoot issues, and implement sustainable fixes

To be successful in this role, we’re seeking the following:
- 10–12 years of full stack development experience with a strong emphasis on backend/API development
- Proficient in Java, Spring/Spring Boot, and RESTful API design
- Hands-on experience with CI/CD tools like Jenkins, GitLab CI, or GitHub Actions
- Experience working in the Scaled Agile Framework (SAFe) or similar large-scale agile delivery environments
- Familiarity with front-end technologies (React, Angular, HTML, CSS, JavaScript)
- Strong understanding of relational databases (e.g., Oracle, PostgreSQL) and ability to write complex SQL queries
- Experience with source control systems (Git) and build automation
- Excellent communication and collaboration skills for working with cross-functional teams
- Proficiency in Jira, Confluence, and similar collaboration and project tracking tools

Preferred Qualifications
- Strong DB performance tuning, data modeling, and ETL experience
- Exposure to client data domains in financial services or regulated environments
- Experience with microservices, containerization (Docker), and orchestration (Kubernetes)
- Familiarity with cloud environments (AWS, Azure, or GCP)
- Knowledge of messaging platforms (Kafka, MQ) and event-driven architecture

At BNY, our culture speaks for itself; check out the latest BNY news at the BNY Newsroom and on BNY LinkedIn.

Here Are a Few of Our Recent Awards
- America’s Most Innovative Companies, Fortune, 2025
- World’s Most Admired Companies, Fortune, 2025
- “Most Just Companies”, Just Capital and CNBC, 2025

Our Benefits and Rewards
BNY offers highly competitive compensation, benefits, and wellbeing programs rooted in a strong culture of excellence and our pay-for-performance philosophy.
We provide access to flexible global resources and tools for your life’s journey. Focus on your health, foster your personal resilience, and reach your financial goals as a valued member of our team, along with generous paid leaves, including paid volunteer time, that can support you and your family through moments that matter. BNY is an Equal Employment Opportunity/Affirmative Action Employer - Underrepresented racial and ethnic groups/Females/Individuals with Disabilities/Protected Veterans.

Posted 3 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Position Overview
We are seeking an experienced Senior DevOps Lead to spearhead our DevOps transformation and lead a high-performing team of DevOps engineers. This role combines strategic leadership with hands-on technical expertise, requiring someone who can manage complex cloud and hybrid infrastructure while ensuring security compliance and operational excellence.

Key Responsibilities
Team Mentoring
- Lead and mentor a team of 5-8 DevOps engineers, fostering growth and technical excellence
- Establish DevOps best practices, standards, and processes across the organization
- Collaborate with cross-functional teams including development, security, and operations
- Drive continuous improvement initiatives and innovation in DevOps practices

DevOps & Platform Management
- Design and implement scalable CI/CD pipelines for cloud and hybrid environments
- Manage containerized applications using Kubernetes orchestration
- Architect and maintain infrastructure as code using Terraform
- Implement monitoring and observability solutions using Grafana and Prometheus
- Manage deployment strategies including blue-green, canary, and rolling deployments

Security Operations (SecOps)
- Integrate security practices into DevOps workflows (DevSecOps)
- Coordinate and manage Vulnerability Assessment and Penetration Testing (VAPT) activities
- Ensure timely closure of security vulnerabilities and compliance issues
- Implement security scanning and monitoring tools in CI/CD pipelines
- Collaborate with security teams to maintain security posture

Compliance & Audit Management
- Lead ISO audit preparations and manage audit cycles
- Ensure compliance with industry standards and regulatory requirements
- Maintain documentation for audit trails and compliance reporting
- Implement controls and processes to meet ISO 27001, SOC 2, and other relevant standards
- Coordinate with internal and external auditors

Technical Leadership
- Hands-on implementation and troubleshooting of complex infrastructure issues
- Design cloud architecture solutions on the AWS platform
- Manage Helm charts and Kubernetes cluster operations
- Optimize system performance and cost management
- Provide technical guidance for architectural decisions

Required Qualifications
Technical Skills
- Containerization & Orchestration: Expert-level Kubernetes administration and Docker containerization
- Cloud Platforms: Advanced AWS services knowledge (EC2, EKS, RDS, S3, IAM, VPC, etc.)
- Infrastructure as Code: Proficiency in Terraform for infrastructure provisioning and management
- Package Management: Experience with Helm for Kubernetes application deployment
- Monitoring & Observability: Hands-on experience with Grafana dashboards and Prometheus metrics
- CI/CD Tools: Jenkins, GitLab CI, GitHub Actions, or similar platforms
- Scripting: Python, Bash, or PowerShell for automation
- Version Control: Git workflows and branching strategies

Security & Compliance
- Experience with vulnerability management and VAPT coordination
- Knowledge of security frameworks (OWASP, NIST, etc.)
- ISO 27001, SOC 2, or similar compliance experience
- Security tools integration (SAST, DAST, SCA)
- Incident response and security monitoring

Leadership & Management
- 8+ years of DevOps experience
- Experience with agile methodologies and project management
- Strong communication and stakeholder management skills
- Budget management and vendor relationship experience

Preferred Qualifications
- AWS Certified Solutions Architect or DevOps Engineer certification
- Certified Kubernetes Administrator (CKA) or Certified Kubernetes Security Specialist (CKS)
- Terraform Associate or Professional certification
- Experience with multi-cloud or hybrid cloud environments
- Knowledge of GitOps practices (Jenkins, ArgoCD, Flux)
- Experience with service mesh technologies (Istio, Linkerd)
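The posting centres on Kubernetes operations and Prometheus/Grafana observability. As a hedged illustration of the day-to-day tooling involved (not the employer's actual setup), the sketch below uses the official Kubernetes Python client to list pods that are not in a healthy phase; it assumes a local kubeconfig pointing at a reachable cluster.

```python
# Illustrative sketch: list pods that are not Running/Succeeded across all namespaces.
# Assumes the `kubernetes` Python client is installed and ~/.kube/config points
# at a reachable cluster; nothing here is specific to this employer's environment.
from kubernetes import client, config

def unhealthy_pods() -> list[tuple[str, str, str]]:
    config.load_kube_config()  # use config.load_incluster_config() when running inside a pod
    v1 = client.CoreV1Api()
    problems = []
    for pod in v1.list_pod_for_all_namespaces(watch=False).items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            problems.append((pod.metadata.namespace, pod.metadata.name, phase))
    return problems

if __name__ == "__main__":
    for ns, name, phase in unhealthy_pods():
        print(f"{ns}/{name}: {phase}")
```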

Posted 3 days ago

Apply

4.5 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a highly skilled Senior Software Engineer with deep expertise in Java and AWS to join our team. The successful candidate will play a critical role in designing and implementing robust data integration solutions, enhancing system performance, and leveraging modern cloud technologies to drive business innovation. Responsibilities Design and develop Java-based connectors, REST APIs, and components to enable seamless data integration between systems Collaborate with cross-functional teams to analyze business requirements and deliver technical solutions Implement data pipelines and ETL/ELT processes to ensure accurate and consistent data flow Optimize performance and efficiency of data workflows by utilizing AWS services and Java best practices Contribute to code reviews, testing, and deployment to ensure high-quality deliverables Ensure compliance with project requirements, deadlines, and technical standards Stay updated on emerging trends and advancements in Java, cloud computing, and data-driven technologies Provide technical guidance and mentorship to junior engineers Requirements 4.5+ years of experience in software development with a focus on Java and frameworks like Spring, Hibernate, JUnit Strong understanding of RESTful APIs, data integration patterns, and data modeling techniques Knowledge of SQL, Python or Bash scripting, and version control tools like Git Proficiency in building and maintaining data ingestion, transformation, and ETL/ELT processes Competency in leveraging AWS services like EC2, S3, and Lambda for cloud-based solutions Ability to work effectively in Agile environments and collaborate across diverse teams Strong analytical and problem-solving skills to troubleshoot and optimize complex systems Nice to have Background in big data tools such as Apache Kafka or Apache Spark Familiarity with containerization solutions like Docker and Kubernetes Expertise in data governance, data quality assurance, and security standards Understanding of Snowflake data warehousing concepts and query performance tuning Certifications in Java or AWS technologies
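The role is Java-centric, but it also lists Python scripting and AWS services such as S3 among the requirements. Purely as an illustration of the ETL-style step described (read raw data from S3, transform it, write it back), here is a small boto3 sketch; the bucket names and object keys are placeholders.

```python
# Illustration only: a tiny S3-to-S3 transform step of the kind an ETL pipeline runs.
# Bucket names and object keys are placeholders, not values from the posting.
import csv
import io
import json
import boto3

def csv_to_json(source_bucket: str, source_key: str,
                dest_bucket: str, dest_key: str) -> int:
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=source_bucket, Key=source_key)["Body"].read()
    rows = list(csv.DictReader(io.StringIO(body.decode("utf-8"))))
    s3.put_object(
        Bucket=dest_bucket,
        Key=dest_key,
        Body=json.dumps(rows).encode("utf-8"),
        ContentType="application/json",
    )
    return len(rows)

if __name__ == "__main__":
    count = csv_to_json("raw-bucket", "input/data.csv", "curated-bucket", "output/data.json")
    print(f"Converted {count} rows")
```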

Posted 3 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are looking for a Lead Back-end Developer to join our team. You will play a key role in designing and implementing back-end solutions using Java and related technologies. If you are passionate about software development and eager to lead a team, we encourage you to apply. Responsibilities Develop, enhance and maintain code Build back-end microservices and REST APIs Conduct unit testing Perform code reviews Adhere to best practices including code review and CI Participate in SCRUM ceremonies Engage in estimation and planning sessions Mentor junior developers and collaborate with peers Requirements Bachelor's degree in computer science or related field 8+ years of software development experience Hands-on experience with Java and Spring Framework Understanding of APIs and microservices architecture Proficiency in Kafka Nice to have Master's degree in computer science Experience in financial services, particularly wealth management Familiarity with CI/CD practices Knowledge of cloud platforms Experience with containerization technologies

Posted 3 days ago

Apply

4.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

At IF MedTech, we are dedicated to revolutionizing healthcare through cutting-edge medical device design, development, and pilot manufacturing. Our global team collaborates with experts across medical, engineering, business, and research domains to bring innovative solutions that enhance healthcare and improve lives. Join us in our mission to drive innovation and make a global impact in the medical technology sector.

Responsibilities:
- Develop and optimize Android-native components using Java/Kotlin.
- Collaborate with the Flutter developer to integrate Android modules into cross-platform builds.
- Ensure secure data handling, storage, and permissions in compliance with HIPAA.
- Support responsive and accessible UI integration for mobile and web interfaces.
- Design, develop, and maintain backend services using Python FastAPI and Java.
- Implement RESTful and GraphQL APIs for seamless data flow between devices, apps, and servers.
- Apply Java frameworks such as Spring Boot where applicable for service development.
- Optimize backend services for high-volume, low-latency medical data processing.
- Work with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB).
- Design efficient schemas and implement secure data access patterns.
- Ensure database architecture supports scalability, redundancy, and compliance.
- Deploy and manage applications on AWS (EC2, S3, RDS, Lambda, API Gateway, etc.).
- Set up CI/CD pipelines for automated build, test, and deployment.
- Use Docker for containerization and reproducible environments.
- Monitor and optimize cloud resource usage for cost efficiency.
- Adhere to ISO 13485 and HIPAA standards for medical software development.
- Participate in code reviews, unit testing, and automated quality checks.
- Maintain proper documentation as per medical device software lifecycle requirements.

Qualifications:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- Experience: 2–4 years in full-stack development with proven backend and Android expertise.
- Must-have: Proficiency in Java for Android and backend services.
- Plus: Knowledge of Spring/Spring Boot.
- Strong in Python (FastAPI).
- Proficiency in AWS services and deployment pipelines.
- Experience with relational and NoSQL databases.
- Familiarity with HIPAA and ISO 13485 guidelines.
- Hands-on with Git, Docker, CI/CD workflows.
- Ability to work independently and collaborate effectively.
- Strong problem-solving, adaptability, and attention to detail.

Join IF MedTech to drive innovation in healthcare technology and develop software solutions that transform lives worldwide!
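Since the backend stack here is Python FastAPI, a minimal sketch of the kind of REST endpoint described may be useful; the resource name, fields, and in-memory storage are hypothetical and leave out the HIPAA-grade persistence and authentication a real service would need.

```python
# Minimal FastAPI sketch of a REST endpoint for device readings.
# The model fields and in-memory list are hypothetical; a real service would add
# authentication, validation rules, and a proper database per HIPAA/ISO 13485.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Device readings API (illustrative)")

class Reading(BaseModel):
    device_id: str
    metric: str
    value: float

_readings: list[Reading] = []  # stand-in for a real datastore

@app.post("/readings", status_code=201)
def create_reading(reading: Reading) -> Reading:
    _readings.append(reading)
    return reading

@app.get("/readings")
def list_readings() -> list[Reading]:
    return _readings

# Run locally with: uvicorn main:app --reload
```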

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Python Full-Stack Developer - Bangalore/Pune

About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients in the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Python Full-Stack Developer
Location: Bangalore/Pune
Work Mode: Hybrid (3 days WFO - Tues, Wed, Thurs)
Shift Time: 12.30 PM to 9.30 PM

Job Summary:
We are seeking a skilled and motivated Python Full-Stack Developer with a strong emphasis on backend development to join our growing team. This role is ideal for someone who thrives in an Agile environment and is eager to contribute to building robust, scalable web applications and data-driven systems. The ideal candidate will have hands-on experience with Snowflake, Python web frameworks, and a solid understanding of object-oriented programming principles. You'll collaborate closely with cross-functional teams to design, develop, and maintain full-stack solutions that power critical business functions.

Key Responsibilities
- Design, develop, and maintain backend services using Python and frameworks such as Flask, FastAPI, or Django.
- Integrate and manage data pipelines with Snowflake and SQL-based sources.
- Build and maintain basic frontend components using React (beginner level acceptable).
- Apply object-oriented design principles to build maintainable and scalable codebases.
- Collaborate with other engineering teams to ensure smooth delivery and deployment of features.
- Troubleshoot and resolve performance issues, bugs, and technical challenges.
- Write clear, maintainable code and contribute to code reviews and documentation.

Mandatory Qualifications & Skills
- Proficiency in Python with a strong understanding of object-oriented programming concepts.
- Hands-on experience with Pandas for data manipulation.
- Proficient in SQL and working knowledge of Snowflake.
- Exposure to Python web frameworks such as Flask, FastAPI, or Django.
- Basic experience with React and frontend development principles.
- Familiarity with Agile/Scrum methodologies and tools such as Jira.
- Strong analytical and problem-solving skills, with the ability to communicate technical concepts clearly.

Nice-to-Have Skills
- Experience with AWS cloud services.
- Familiarity with Git version control workflows.
- Exposure to CI/CD pipelines and modern deployment practices.
- Understanding of Software Development Life Cycle (SDLC) processes.
- Experience with containerization tools like Docker.

Preferred Experience
- 3–5 years of professional experience in software development.
- Background in data-centric applications, SaaS platforms, or enterprise web systems is a plus.
- Prior experience working in cross-functional Agile teams.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
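Given the stack named above (Python web frameworks plus Pandas over SQL/Snowflake data), here is a minimal, illustrative Flask sketch that serves an aggregated view of a DataFrame. The sample records and the /summary endpoint are assumptions; a real implementation would read from Snowflake or another SQL source rather than an in-memory frame.

```python
# Illustrative Flask + Pandas sketch: expose a simple aggregation as JSON.
# The sample records and the /summary route are made up; in practice the frame
# would come from Snowflake or another SQL source rather than a literal list.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

SAMPLE_TRADES = pd.DataFrame(
    [
        {"desk": "rates", "notional": 1_000_000},
        {"desk": "rates", "notional": 250_000},
        {"desk": "fx", "notional": 400_000},
    ]
)

@app.route("/summary")
def summary():
    # Group by desk and total the notional amounts, returned as {"desk": total}.
    totals = SAMPLE_TRADES.groupby("desk")["notional"].sum()
    return jsonify(totals.to_dict())

if __name__ == "__main__":
    app.run(debug=True)
```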

Posted 3 days ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Driven by the passion to improve the quality of people’s lives, WS Audiology continues to grow as a market leader in the hearing aid industry. With our commitment to increase penetration in an underserved hearing care market, we want to accelerate our business transformation in order to reach more people, more effectively. As an engineer, you will work alongside a team of dedicated and skilled professionals, all driven by the mission to “build a secure, scalable, and reliable remote fitting solution that enables WSA to deliver remote care to its customers.”

What you will do
- Take an active role in the entire development lifecycle, starting from backlog refinement to implementation and deployment.
- Collaborate closely with team members to influence architecture and make key technical decisions.
- Design and implement scalable, secure, and high-performing solutions aligned with business and technology strategies.
- Write clean, maintainable, and well-tested code using unit and integration tests.
- Troubleshoot, resolve bugs, and optimize performance across services and applications.
- Contribute to continuous improvement by identifying and addressing technical debt.
- Discover, evaluate, and implement new ideas and technologies.

What you bring
- BE/B.Tech degree in Computer Science, Software Engineering, or a related field.
- 3-5 years of hands-on experience in software development, with good expertise in .NET and C#.
- Good understanding of Object-Oriented Programming (OOP), SOLID principles and common design patterns.
- Exposure to designing and building microservices and REST APIs.
- Exposure to working with Microsoft Azure cloud services and Microservices Architecture.
- Exposure to containerization (Docker, Kubernetes).
- Interest or exposure to mobile application development using .NET MAUI, Native Android, or iOS.
- Willingness to contribute to web development using React (prior experience is a plus).

At WS Audiology, we provide innovative hearing aids and hearing health services. Together with our 12,000 colleagues in 130 countries, we invite you to help unlock human potential by bringing back hearing for millions of people around the world. With us, you will become part of a truly global company where we care for one another, welcome diversity and celebrate our successes. Sounds wonderful? We can't wait to hear from you.

WS Audiology is an equal-opportunity employer and committed to creating an inclusive employee experience for all. Regardless of race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, we firmly believe that our work is at its best when everyone feels free to be their most authentic self.

Posted 3 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a highly skilled and motivated Senior Software Engineer specializing in Java and AWS to join our dynamic team. In this role, you will play a pivotal part in designing and implementing cutting-edge solutions that drive data integration and cloud-based application development. If you are passionate about creating scalable, secure, and efficient systems, we would love to hear from you. Responsibilities Design and develop Java-based connectors, REST APIs, and key components to facilitate seamless data integration between applications and platforms Collaborate with cross-functional teams to assess business requirements and define technical approaches Implement ETL pipelines and processes within Java applications to handle robust data workflows Optimize workflows and application performance by leveraging AWS services and adherence to development best practices Support deployment, integration, and troubleshooting of AWS cloud-native architectures Conduct code reviews and participate in quality assurance to ensure maintainable and high-quality deliverables Stay informed about emerging technologies and trends, incorporating new strategies into development processes Requirements Proficiency in Java and its frameworks such as Spring, Hibernate, and JUnit Strong understanding of RESTful APIs, data modeling concepts, and integration patterns Knowledge of database systems including SQL, and scripting capabilities with Python and Bash Background in data ingestion, transformation, ETL/ELT pipelines, and workflows Experience with Git or other version control systems in Agile development environments Skills in problem-solving, troubleshooting, and analytical reasoning Familiarity with AWS services such as EC2, S3, and Lambda Nice to have Understanding of big data technologies like Apache Kafka and Apache Spark Knowledge of containerization tools such as Docker and Kubernetes Awareness of data governance, security, and quality standards Familiarity with Snowflake data warehousing and query optimization techniques Showcase of certifications in service-based frameworks like AWS or Java

Posted 3 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Senior QA Automation Engineer App Store QA Automation Engineer – Are you passionate about test automation and eager to drive quality through advanced automated testing strategies? Join Deloitte's App Store team as a Senior QA Automation Engineer and lead the development of comprehensive automated testing frameworks for our internal app marketplace platform. Work You'll Do As a Senior QA Automation Engineer (Senior Consultant, USI), you will lead the design, development, and execution of automated testing strategies for Deloitte's internal app marketplace. You'll collaborate with global product, development, and DevOps teams to build robust automated testing frameworks that ensure delivery of secure, high-performing features with minimal manual intervention. Key Responsibilities: Test Planning & Strategy : Analyze business and functional requirements to develop comprehensive test plans and strategies. Review plans with technical leads and stakeholders to ensure alignment. Test Case Development : Write detailed, comprehensive test cases and scenarios based on requirements and user stories for both manual and automated execution. Automated Testing : API Automation : Design and implement automated test suites for RESTful APIs and microservices using tools like REST Assured, Postman/Newman, or similar frameworks. UI Automation : Develop and maintain end-to-end automated UI test frameworks using Selenium, Cypress, Playwright, or similar tools. Framework Development : Build and maintain scalable, reusable automated testing frameworks that support parallel execution and integrate with CI/CD pipelines. Test Execution : Execute end-to-end test cases, including manual and automated tests, as part of each sprint. Conduct functional, regression, integration, performance, and security testing as defined in the test strategy. Defect Management : Identify, document, and log defects with clear steps to reproduce, severity, and environment details. Work with development teams to ensure timely resolution. CI/CD Integration : Integrate automated tests into CI/CD pipelines to ensure continuous quality checks on every build and deployment. Reporting & Analysis : Analyze test results, generate quality metrics, and provide regular, automated reports to stakeholders. Collaboration & Mentorship : Collaborate with product managers, developers, and DevOps engineers to ensure full test coverage. Lead test automation initiatives, establish best practices, and mentor team members. Documentation : Maintain clear and detailed documentation of QA processes, standards, and test cases. 
Required Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field 4+ years of experience in software quality assurance or testing roles with a strong focus on test automation Advanced experience with test automation tools and frameworks (e.g., Selenium, Cypress, Playwright, REST Assured, TestNG, JUnit) Strong understanding of QA methodologies, software development life cycle (SDLC), and Agile processes Extensive experience testing web applications, APIs, and backend services through automated frameworks Programming skills in Java, Python, JavaScript, or C# with object-oriented design principles Familiarity with defect tracking and test management tools (e.g., GitHub Projects, Zephyr) Excellent analytical, problem-solving, and communication skills Ability to work effectively with geographically distributed teams and stakeholders across time zones Strong understanding of version control systems (Git) and branching strategies Experience with containerization (Docker) and cloud platforms Preferred Qualifications: Experience with cloud-based platforms (AWS, Azure, or GCP) Exposure to performance and security testing tools and automation frameworks Knowledge of CI/CD pipelines and integration of automated tests Experience with digital marketplaces, app stores, or SaaS platforms Relevant certifications (e.g., ISTQB, CSTE) Experience with microservices architecture testing Understanding of database testing and SQL Knowledge of mobile automation testing (Appium, etc.)
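The posting lists Python among the accepted automation languages and puts API test automation front and centre. As a hedged sketch of that pattern (not Deloitte's actual framework), here is a pytest-style API check using the requests library; the base URL and the /apps resource are placeholders.

```python
# Illustrative pytest sketch for REST API test automation with `requests`.
# BASE_URL and the /apps resource are placeholders, not a real marketplace API.
import requests

BASE_URL = "https://example.internal/api"  # placeholder endpoint

def test_list_apps_returns_ok_and_json():
    resp = requests.get(f"{BASE_URL}/apps", timeout=10)
    assert resp.status_code == 200
    assert resp.headers["Content-Type"].startswith("application/json")
    assert isinstance(resp.json(), list)

def test_create_app_is_validated():
    resp = requests.post(f"{BASE_URL}/apps", json={"name": ""}, timeout=10)
    # An empty name should be rejected by server-side validation.
    assert resp.status_code in (400, 422)
```

Checks like these would normally run inside the CI/CD pipeline on every build, which is exactly the integration the description asks for.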

Posted 3 days ago

Apply

10.0 - 15.0 years

0 Lacs

India

Remote

Job Title: Senior .NET Architect - Part-Time
Remote – India
Shift Timing: 7:30 PM – 11:30 PM IST (Monday to Friday)
Commitment: 4 hours/day, 5 days/week
Duration: Contract-Based

About The Role
We are looking for an experienced and highly skilled Senior .NET Architect to join our team on a part-time basis. This is a remote role with a fixed evening shift, ideal for someone looking to contribute to impactful projects in a flexible engagement. As a .NET Architect, you will play a critical role in designing scalable, secure, and high-performing architecture for enterprise-grade applications. You must be available for regular video calls to actively discuss and collaborate on architectural decisions with the development team and stakeholders.

Key Responsibilities
- Define, design, and oversee implementation of scalable .NET architecture for web and cloud-based applications.
- Collaborate closely with developers, product managers, and other stakeholders during evening hours.
- Lead architectural discussions over video calls; ensure clear communication of design decisions.
- Review code and provide mentorship to development teams.
- Identify and resolve architectural risks and issues.
- Ensure best practices in software development, including performance optimization, security, and maintainability.
- Keep documentation of architecture decisions and design artifacts up to date.

Required Skills & Qualifications
- 10-15 years of experience in software development, with at least 4 years in an architecture or lead role.
- Strong expertise in .NET 4.8/.NET Core 6/7/8, C#, and ASP.NET MVC/Web API.
- Proficient in microservices architecture, RESTful APIs, and distributed systems.
- Experience with cloud platforms (Azure preferred).
- Solid understanding of DevOps practices and CI/CD pipelines; containerization (Docker/Kubernetes) is a plus.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Comfortable working in a remote, collaborative team environment.
- Must have a reliable internet connection and a suitable environment for video conferencing.

Good To Have
- Exposure to front-end frameworks like Angular or React.
- Experience with database architecture (SQL Server, NoSQL, etc.).
- Previous experience working with global teams.

Posted 3 days ago

Apply

6.0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Profile: We are looking for a skilled and experienced DevOps Engineer with 6+ years of expertise to join our IT team. The ideal candidate will be responsible for architecting, deploying, and managing cloud infrastructure (preferably in GCP , with experience in AWS and Azure as well), setting up CI/CD pipelines, managing containerized environments, and ensuring robust system performance and security. What we are looking for in you Strong experience in architecting, deploying, and managing scalable, highly available, secure, and cost-effective cloud solutions in GCP (preferred), AWS, and Azure. Proficiency in Infrastructure as Code (IaC) using tools like Terraform or CloudFormation Solid hands-on experience in Linux system administration, troubleshooting, performance tuning, and scripting (Bash/Python). Strong understanding of server-level operations, process management, and file systems in Linux environments. Hands-on experience in designing and implementing CI/CD pipelines to automate build, test, and deployment processes. Expertise with CI/CD tools such as Jenkins, GitLab CI/CD, Bitbucket Pipelines, and GitHub Actions. Proficient in Docker for containerization and experienced in managing containerized environments. Exposure to Kubernetes or other orchestration tools to ensure application scalability, high availability, and reliability. Ability to set up monitoring and alerting systems (e.g., Prometheus, Grafana, Cloud Monitoring) for proactive issue detection. Experience with centralized logging solutions (e.g., ELK Stack, Loki, Cloud Logging) to facilitate efficient troubleshooting. Basic experience managing on-premises servers and understanding networking fundamentals (DNS, Load Balancing, Firewalls, VPN). Ability to work in hybrid environments integrating cloud and on-premises infrastructure. Implementation of cloud security best practices, including IAM policies, VPC security, encryption, and access control. Familiarity with compliance standards such as ISO, GDPR, SOC2, and applying security frameworks in deployments. Proven ability to collaborate across cross-functional teams and communicate effectively with stakeholders. Technical leadership experience, mentoring junior team members and driving DevOps culture and best practices within the team. Key Skills Cloud Infrastructure Management Server Administration DevOps CI/CD Tools Containerization Scripting On-Premise & Hybrid Infrastructure Networking Fundamentals Monitoring & Logging Good to have Infrastructure as Code (IaC) Orchestration Cloud Security & Compliance
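Monitoring and alerting with Prometheus and Grafana is called out above. As a small, generic illustration (not this employer's setup), the sketch below uses the prometheus_client library to expose a custom health metric that a Prometheus server could scrape; the port and the disk-usage check are assumptions.

```python
# Illustrative custom Prometheus exporter using prometheus_client.
# The scrape port (8000) and the disk-usage check are arbitrary examples.
import shutil
import time

from prometheus_client import Gauge, start_http_server

DISK_FREE_BYTES = Gauge("node_root_disk_free_bytes", "Free bytes on the root filesystem")

def collect_forever(interval_seconds: int = 30) -> None:
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        DISK_FREE_BYTES.set(shutil.disk_usage("/").free)
        time.sleep(interval_seconds)

if __name__ == "__main__":
    collect_forever()
```

A Grafana panel or alert rule would then be built on top of whatever metrics an exporter like this publishes, which is the proactive-detection workflow the description refers to.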

Posted 3 days ago

Apply

6.0 years

0 Lacs

Ernakulam, Kerala, India

On-site

Job Profile: We are looking for a skilled and experienced DevOps Engineer with 6+ years of expertise to join our IT team. The ideal candidate will be responsible for architecting, deploying, and managing cloud infrastructure (preferably in GCP , with experience in AWS and Azure as well), setting up CI/CD pipelines, managing containerized environments, and ensuring robust system performance and security. What we are looking for in you Strong experience in architecting, deploying, and managing scalable, highly available, secure, and cost-effective cloud solutions in GCP (preferred), AWS, and Azure. Proficiency in Infrastructure as Code (IaC) using tools like Terraform or CloudFormation Solid hands-on experience in Linux system administration, troubleshooting, performance tuning, and scripting (Bash/Python). Strong understanding of server-level operations, process management, and file systems in Linux environments. Hands-on experience in designing and implementing CI/CD pipelines to automate build, test, and deployment processes. Expertise with CI/CD tools such as Jenkins, GitLab CI/CD, Bitbucket Pipelines, and GitHub Actions. Proficient in Docker for containerization and experienced in managing containerized environments. Exposure to Kubernetes or other orchestration tools to ensure application scalability, high availability, and reliability. Ability to set up monitoring and alerting systems (e.g., Prometheus, Grafana, Cloud Monitoring) for proactive issue detection. Experience with centralized logging solutions (e.g., ELK Stack, Loki, Cloud Logging) to facilitate efficient troubleshooting. Basic experience managing on-premises servers and understanding networking fundamentals (DNS, Load Balancing, Firewalls, VPN). Ability to work in hybrid environments integrating cloud and on-premises infrastructure. Implementation of cloud security best practices, including IAM policies, VPC security, encryption, and access control. Familiarity with compliance standards such as ISO, GDPR, SOC2, and applying security frameworks in deployments. Proven ability to collaborate across cross-functional teams and communicate effectively with stakeholders. Technical leadership experience, mentoring junior team members and driving DevOps culture and best practices within the team. Key Skills Cloud Infrastructure Management Server Administration DevOps CI/CD Tools Containerization Scripting On-Premise & Hybrid Infrastructure Networking Fundamentals Monitoring & Logging Good to have Infrastructure as Code (IaC) Orchestration Cloud Security & Compliance

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:
- Implement and maintain Continuous Integration/Continuous Deployment (CI/CD) pipelines using tools like Jenkins.
- Support and maintain various DevOps tools, including Jenkins, Terraform and GCP products like GCE, GKE, BigQuery, Pub/Sub, Alerting, Monitoring.
- Create and manage infrastructure as code using Terraform for efficient and scalable deployments.
- Collaborate with development and operations teams to ensure smooth integration and deployment processes.
- Build and manage images using tools like Packer and Docker, and implement image rotation strategies.
- Monitor and respond to alerts and incidents, troubleshooting production issues promptly.
- Ensure end-to-end infrastructure management, including configuration management, monitoring, and security.
- Implement and enforce security standards and best practices for the DevOps environment.
- Provide technical support and guidance to development teams on DevOps processes and tools.
- Stay up-to-date with the latest DevOps trends and technologies, recommending improvements as needed.
- Document processes, procedures, and configurations for future reference.
- Implement and support GitOps practices, and configure repositories to follow best practices such as codeowners and webhooks.

Requirements
To be successful in this role, you should meet the following requirements:
- Proficient in scripting and automation using languages like Bash, Python, or Groovy.
- Proven experience as a DevOps Engineer or similar role, with a focus on CI/CD, Jenkins, Terraform, GCP, and relevant DevOps tools.
- Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes (GKE or any equivalent) and Helm.
- Fair knowledge of disaster recovery and backups, with the ability to troubleshoot production issues.
- Familiarity with ICE compliance and standards, and the ability to guide development teams.
- Strong understanding of end-to-end infrastructure management and security standards.
- Experience with image creation and rotation using tools like Packer and Docker.
- Knowledge of banking industry processes and regulations is a plus.
- Excellent problem-solving and communication skills.
- Ability to work independently and in a team, with a flexible and adaptable mindset.
- Strong attention to detail and ability to prioritize tasks effectively.
- Experience managing on-prem IKP or VM Unix-based workloads and maintenance would be a plus.
- Familiarity with connectivity patterns using Jenkins to run various pipelines interacting with Google-managed or unmanaged resources.
- Knowledge of HSBC internal controls related to CI/CD, including tool integration and adoption for PDP.

You’ll achieve more when you join HSBC.
www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking a highly skilled and motivated Senior Software Engineer specializing in Java and AWS to join our dynamic team. In this role, you will play a pivotal part in designing and implementing cutting-edge solutions that drive data integration and cloud-based application development. If you are passionate about creating scalable, secure, and efficient systems, we would love to hear from you. Responsibilities Design and develop Java-based connectors, REST APIs, and key components to facilitate seamless data integration between applications and platforms Collaborate with cross-functional teams to assess business requirements and define technical approaches Implement ETL pipelines and processes within Java applications to handle robust data workflows Optimize workflows and application performance by leveraging AWS services and adherence to development best practices Support deployment, integration, and troubleshooting of AWS cloud-native architectures Conduct code reviews and participate in quality assurance to ensure maintainable and high-quality deliverables Stay informed about emerging technologies and trends, incorporating new strategies into development processes Requirements Proficiency in Java and its frameworks such as Spring, Hibernate, and JUnit Strong understanding of RESTful APIs, data modeling concepts, and integration patterns Knowledge of database systems including SQL, and scripting capabilities with Python and Bash Background in data ingestion, transformation, ETL/ELT pipelines, and workflows Experience with Git or other version control systems in Agile development environments Skills in problem-solving, troubleshooting, and analytical reasoning Familiarity with AWS services such as EC2, S3, and Lambda Nice to have Understanding of big data technologies like Apache Kafka and Apache Spark Knowledge of containerization tools such as Docker and Kubernetes Awareness of data governance, security, and quality standards Familiarity with Snowflake data warehousing and query optimization techniques Showcase of certifications in service-based frameworks like AWS or Java

Posted 3 days ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: .Net Architect (.Net Core) Experience: 9+ Years (Minimum 5 years in .NET Core Development) Location: Coimbatore / Chennai Mandatory Skills: .NET Core, .NET MVC, Web API, LINQ, SQL Server, Project Architecture & Documentation (HLD/LLD), Azure (App Services, AKS), CI/CD Pipelines. JD: Required Skills • .NET Core, ASP.NET MVC / ASPX, C#/VB.NET, Web API • LINQ, Entity Framework / ADO.NET • Strong grounding in object-oriented programming (OOP), SOLID principles, and architectural design patterns (e.g., Repository, Dependency Injection) • Deep expertise in SQL Server and database design • Hands-on experience with Azure services: Azure App Services, AKS, Azure SQL, Blob Storage, Azure AD, Key Vault • CI/CD automation using Azure DevOps, GitHub Actions, or TFS • Strong documentation skills – HLD/LLD creation and architectural artifacts • Front-end integration: HTML5, CSS3 (basic familiarity) Good to Have / Preferred Skills • Experience collaborating directly with client technical teams • Familiarity with third-party tools: Telerik, DevExpress • Exposure to Agile management tools: JIRA, TFS • Working knowledge of cloud-native architecture, containerization (Docker), Helm charts, and YAML • Knowledge of testing practices, static code analysis tools, and performance monitoring Soft Skills & Attributes • Strong analytical, problem-solving, and communication skills • Excellent email and professional communication etiquette • Flexible and quick to learn new tools and frameworks • Strong ownership mindset and a passion for delivering quality solutions • Able to work independently or as part of a team in a dynamic environment Mandatory Technologies • .NET Core • ASP.NET MVC • Web API • LINQ • SQL Server • Project Architecture & Documentation (HLD/LLD) • Azure (App Services, AKS) • CI/CD Pipelines
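One concrete pattern behind the Azure skills listed above is pulling application secrets from Key Vault via the ambient identity rather than storing connection strings in config. The role is .NET-centric, but purely as an illustration (shown in Python for brevity; the C# SDK follows the same shape), the sketch below uses a hypothetical vault URL and secret name.

```python
# Illustration of the Azure Key Vault access pattern named in this listing.
# The vault URL and secret name are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://my-app-vault.vault.azure.net"  # hypothetical vault

def get_connection_string() -> str:
    """Fetch a SQL connection string from Key Vault using the ambient identity
    (managed identity on App Services/AKS, or a developer login locally)."""
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=VAULT_URL, credential=credential)
    return client.get_secret("SqlConnectionString").value
```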

Posted 3 days ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Education And Experience Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context. Key Responsibilities Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development. Required Skills & Qualifications Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Preferred Skills & Qualifications Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 308597
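The responsibilities above include orchestrating and scheduling pipelines with tools such as Apache Airflow. Airflow DAGs are defined in Python even when the pipeline logic itself is Scala/Spark, so as a minimal, illustrative sketch (the DAG id, task names, and spark-submit commands are hypothetical placeholders), a daily two-step DAG might look like this:

```python
# Minimal sketch of the Airflow-style orchestration this listing describes:
# a daily DAG with an ingest step followed by a transform step.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",        # `schedule_interval` on older Airflow 2.x releases
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_events",
        bash_command="spark-submit --class com.example.Ingest /opt/jobs/pipeline.jar {{ ds }}",
    )
    transform = BashOperator(
        task_id="transform_events",
        bash_command="spark-submit --class com.example.Transform /opt/jobs/pipeline.jar {{ ds }}",
    )
    ingest >> transform  # transform runs only after ingest succeeds
```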

Posted 3 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Education And Experience Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context. Key Responsibilities Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development. Required Skills & Qualifications Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Preferred Skills & Qualifications Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 308597

Posted 3 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About the firm: Sustainability lies at the core of Stantech AI. Our vision is to empower organizations to derive actionable insights, effectuating a smarter way of working. We operate on the premise that each client is unique and as such requires their own idiosyncratic solutions. Putting this principle into practice, we deliver tailor-made solutions to digitalize, optimize, and strategize fundamental processes underpinning client organizations. For more information, please refer to our website: www.stantech.ai Job Description: As a Senior Software Engineer at Stantech AI, you will play a pivotal role in designing, developing, and maintaining enterprise-grade backend services and APIs that cater to the unique needs of our clients. You will be a key member of our engineering team and will contribute to the success of projects by leveraging your expertise in Python, SQL, and modern DevOps practices. Key Responsibilities: Design, develop, and maintain high-performance backend applications and RESTful APIs using the Python FastAPI framework. Optimize and maintain relational databases with SQL (data modeling, query optimization, and sharding) to ensure data integrity and scalability. Create, configure, and manage CI/CD pipelines using GitLab CI for automated build, test, and deployment workflows. Collaborate with cross-functional teams (data scientists, frontend engineers, DevOps) to gather requirements and deliver robust, scalable, and user-friendly solutions. Participate in architectural and technical decisions to drive innovation, ensure reliability, and improve system performance. Conduct code reviews, enforce best practices, and mentor junior engineers. Troubleshoot, diagnose, and resolve production issues in a timely manner. Stay up to date with industry trends, emerging technologies, and best practices. Bonus: Hands-on experience with server-level configuration and infrastructure, such as setting up load balancers, API gateways, and reverse proxies. Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. Minimum 3 years of professional experience in backend development, with strong expertise in Python and SQL. Proven track record of building and maintaining CI/CD pipelines using GitLab CI. Familiarity with containerization and orchestration technologies: Docker, Kubernetes. Solid understanding of software development lifecycle (SDLC) best practices, design patterns, and version control (Git). Excellent problem-solving, debugging, and communication skills. Ability to work independently and collaboratively in a fast-paced environment. Plus: Experience with front-end technologies (HTML, CSS, JavaScript) and cloud platforms (AWS, GCP, Azure). Financial Package: Competitive salary of ₹10–20 Lakhs per annum, contingent on qualifications and experience.
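Because the role centers on FastAPI services backed by SQL, here is a minimal, illustrative sketch of that combination: a single read endpoint over a hypothetical orders table, using SQLite purely to keep the example self-contained. Table, column, and file names are placeholders, not Stantech's actual schema.

```python
# Minimal sketch of a FastAPI read endpoint backed by a (hypothetical) SQL table.
# Connection handling is simplified for brevity.
import sqlite3
from typing import Optional

from fastapi import FastAPI, HTTPException

app = FastAPI(title="orders-service")

def fetch_order(order_id: int) -> Optional[dict]:
    """Look up a single order row; returns None when it does not exist."""
    with sqlite3.connect("app.db") as conn:
        conn.row_factory = sqlite3.Row
        row = conn.execute(
            "SELECT id, customer, total FROM orders WHERE id = ?", (order_id,)
        ).fetchone()
    return dict(row) if row else None

@app.get("/orders/{order_id}")
def get_order(order_id: int):
    order = fetch_order(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order

# Run locally with: uvicorn main:app --reload   (assuming this file is main.py)
```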

Posted 3 days ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together, the offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Education And Experience Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field. 3–6 years of hands-on experience in Scala development, preferably in a data engineering or data pipeline context. Key Responsibilities Collaborate with business analysts and stakeholders to gather and analyze requirements for data pipeline solutions. Design, develop, and maintain scalable data pipelines using Scala and related technologies. Write clean, efficient, and well-documented Scala code for data ingestion, transformation, and processing. Develop and execute unit, integration, and end-to-end tests to ensure data quality and pipeline reliability. Orchestrate and schedule data pipelines using tools such as Apache Airflow, Oozie, or similar workflow schedulers. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Participate in code reviews, provide constructive feedback, and adhere to best practices in software development. Document technical solutions, data flows, and pipeline architectures. Work closely with DevOps and Data Engineering teams to deploy and maintain solutions in production environments. Stay current with emerging technologies and industry trends in big data and Scala development. Required Skills & Qualifications Strong proficiency in Scala, including functional programming concepts. Experience building and maintaining ETL/data pipelines. Solid understanding of data structures, algorithms, and software engineering principles. Experience with workflow orchestration/scheduling tools (e.g., Apache Airflow, Oozie, Luigi, or similar). Familiarity with distributed data processing frameworks (e.g., Apache Spark, Kafka, Flink). Proficiency in writing unit and integration tests for data pipelines. Experience with version control systems (e.g., Git). Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Preferred Skills & Qualifications Experience with cloud platforms (AWS, Azure, or GCP) and related data services. Knowledge of SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB). Familiarity with containerization and orchestration tools (Docker, Kubernetes). Exposure to CI/CD pipelines and DevOps practices. Experience with data modeling and data warehousing concepts. Knowledge of other programming languages (e.g., Python, Java) is a plus. Experience working in Agile/Scrum environments. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 308597

Posted 3 days ago

Apply

4.0 - 5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Summary: We are seeking a highly skilled Sr. Software Developer with expertise in Full Stack Development, Python, and JavaScript. The ideal candidate should have 4-5 years of experience in software development, a strong understanding of end-to-end software development processes, and the ability to translate business requirements into technical solutions. Exposure to Android and iOS app development is an added advantage. Key Responsibilities: Design, develop, test, and deploy scalable and robust full-stack applications. Work with cross-functional teams to gather and analyze business requirements and translate them into technical specifications. Write clean, maintainable, and efficient code using Python and JavaScript frameworks. Build responsive web applications and RESTful APIs. Collaborate in architecture and design discussions, proposing optimal solutions. Conduct code reviews, provide mentorship to junior developers, and ensure best coding practices. Troubleshoot, debug, and optimize application performance. Stay up to date with emerging technologies, tools, and industry trends. If required, contribute to the development of mobile applications for Android and iOS platforms. Document development processes, code changes, and technical specifications. Required Skills & Qualifications: Bachelor’s/Master’s degree in Computer Science, Information Technology, or a related field. 4-5 years of experience in full-stack software development. Proficient in Python and JavaScript (Node.js, React.js, or Angular). Strong understanding of database technologies (SQL/NoSQL). Hands-on experience in developing RESTful APIs and integrating third-party services. Solid understanding of the software development lifecycle (SDLC) and agile methodologies. Ability to gather and analyze requirements and provide practical solutions. Excellent problem-solving, analytical, and communication skills. Good to have: experience in developing Android and iOS mobile applications. Good to have: experience working with containerization technologies like Docker and Kubernetes. Preferred Skills: Familiarity with cloud platforms (AWS, Azure, or GCP). Experience with CI/CD pipelines, version control (Git), and DevOps practices. Knowledge of security best practices in software development.
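One requirement above is integrating third-party services behind RESTful APIs. As an illustrative sketch only, the snippet below shows a small Python client with timeouts and automatic retries on transient failures; the endpoint URL and response shape are hypothetical placeholders.

```python
# Minimal sketch of the "integrating third-party services" requirement: a small
# REST client with a timeout and retries on transient failures (HTTP 429/5xx).
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def build_session() -> requests.Session:
    """Session that retries transient failures with exponential backoff."""
    retry = Retry(total=3, backoff_factor=0.5, status_forcelist=[429, 500, 502, 503, 504])
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    return session

def fetch_exchange_rate(base: str, quote: str) -> float:
    """Call a hypothetical third-party rates API and return a single rate."""
    session = build_session()
    resp = session.get(
        "https://api.example.com/v1/rates",   # placeholder endpoint
        params={"base": base, "quote": quote},
        timeout=10,
    )
    resp.raise_for_status()
    return float(resp.json()["rate"])         # placeholder response field
```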

Posted 3 days ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies