5.0 - 10.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Programming languages: Java 11+ (must), frontend Vue.js (must), CSS/HTML, Bootstrap
Architecture: microservice architecture
Databases: MySQL, Neo4j
Monitoring tools: Grafana
Software management tools: Maven
APIs: RESTful APIs, message queues (JMS)
Frameworks: Spring Boot, Robot Framework, Groovy, Cucumber
CI/CD: Bitbucket, GitLab, Jenkins, Ansible, Artifactory
Server ops: management of Red Hat Linux / CentOS 7
Scripting: JavaScript, Shell, Bash, Makefile
Containerization and container platforms: Docker, Kubernetes, microservices architecture, Dockerfile, Docker Compose
Posted 1 week ago
8.0 - 13.0 years
32 - 40 Lacs
Pune
Hybrid
Job Summary: We are seeking a highly experienced and hands-on Boomi Integration Consultant to lead and guide cross-functional teams in designing, building, and delivering enterprise-grade integration solutions using the Boomi Integration Platform (Boomi AtomSphere). This role combines technical leadership, solution architecture, and mentoring with strong hands-on development expertise. The ideal candidate will be instrumental in defining best practices, ensuring solution quality, and enabling integration success across the organization and client engagements.

Key Responsibilities:
- Lead technical discovery, design workshops, and architecture discussions to define scalable integration strategies using Boomi.
- Design and develop robust, reusable Boomi processes, orchestrations, and connector configurations.
- Define and establish integration patterns, best practices, reusable assets, and governance standards for the team.
- Guide and mentor internal or client-side development teams, conduct code reviews, and ensure adherence to architecture principles.
- Collaborate closely with business stakeholders, solution architects, and third-party vendors to gather requirements and align on delivery goals.
- Provide technical leadership in Boomi solution architecture, including API management, EDI, master data flows, event-driven patterns, and cloud-to-cloud/on-premises integrations.
- Troubleshoot complex integration issues, optimize performance, and provide strategic direction on Boomi environments and deployment models.
- Lead PoCs, perform fit-gap analysis, and support Boomi platform evaluations and roadmap planning.
- Own the end-to-end integration lifecycle, including design, implementation, documentation, testing, and support.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 7-14 years of overall experience.
- 5+ years of experience in integration architecture and development, with 3+ years of deep hands-on experience in Boomi AtomSphere.
- Proven ability to lead integration teams and drive project delivery in a consulting or enterprise environment.
- Expertise in Boomi connectors, API components, map functions, flow controls, and process orchestration.
- Strong command of integration concepts including asynchronous messaging, REST/SOAP services, pub-sub models, and data transformation.
- Hands-on experience with authentication protocols (OAuth2, JWT, SAML), security best practices, and governance frameworks.
- Familiarity with enterprise systems like Salesforce, SAP, NetSuite, Oracle, etc.
- Knowledge of database integration (SQL/NoSQL) and file-based integrations (SFTP/CSV/XML/EDI).
- Comfortable working in Agile teams, using DevOps pipelines, Git, and CI/CD tools.

Preferred Qualifications:
- Boomi Professional Developer or Architect Certification.
- Previous experience in a consulting or client-facing role.
- Familiarity with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker/Kubernetes).
- Understanding of monitoring tools (e.g., Boomi Molecule metrics, Datadog, Splunk) and error-tracing strategies.
- Experience in data quality management, master data integration, or event-driven architecture is a plus.
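For context on the integration skills this role lists, here is a minimal Python sketch of an OAuth2 client-credentials flow against a protected REST endpoint. The URLs, credentials, and resource are hypothetical placeholders, not any specific Boomi or client API:

```python
import requests

# Hypothetical endpoints for illustration; real URLs and credentials
# come from the target system's documentation.
TOKEN_URL = "https://auth.example.com/oauth2/token"
API_URL = "https://api.example.com/v1/orders"

def fetch_token(client_id: str, client_secret: str) -> str:
    """OAuth2 client-credentials grant: exchange app credentials for a bearer token."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def get_orders(token: str) -> list:
    """Call a protected REST resource with the bearer token."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = fetch_token("my-client-id", "my-client-secret")
    print(get_orders(token))
```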
Posted 1 week ago
6.0 - 11.0 years
19 - 22 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities
.NET Core / .NET 6+: Built scalable, modular services. Used async programming, performance tuning, and reusable code.
C#: Strong in LINQ, async/await, error handling, and memory use. Fixed major issues and built shared code libraries.
REST APIs (ASP.NET Core / Web API): Created secure APIs with JWT/OAuth2, versioning, and clear responses. Used Swagger for testing.
SQL Server / PostgreSQL: Designed databases, wrote complex queries, and improved performance with indexing. SQL Server required; PostgreSQL is a plus.
CI/CD & Git: Experience with Azure DevOps or GitHub Actions. Can set up and fix pipelines, handle rollbacks, and manage environments.
Agile (Scrum/Kanban): Worked in sprints, did story estimates, code reviews, and managed own tasks.
Cloud (AWS or Azure): Deployed apps on AWS or Azure using basic cloud services. No certification needed.
Microservices: Helped build or manage microservices. Understands how they connect, handle errors, and stay reliable.
Frontend Basics (React / JS / TS): Can read and debug frontend code to fix API issues. Doesn't need to build UI.
Nice-to-Have Skills:
Docker / Kubernetes: Used Docker to package .NET apps. Some knowledge of Kubernetes is a plus.
ASP.NET MVC / WCF: Basic understanding of older .NET tech for legacy projects.
Certifications: AWS or Azure certs are a bonus but not required.
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Pune
Work from Office
Role & responsibilities

Position Details:
Role: Data Engineer
Location: Pune, Baner (Work from Office)
Experience: 6+ years
Working Days: Monday to Friday (9:30 AM to 6:30 PM)
Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or a related field
Company Website: www.avigna.ai
LinkedIn: Avigna.AI

Key Responsibilities:
- Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics.
- Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data.
- Work with Python, PySpark, and Azure services to support data modeling, automation, and predictive insights.
- Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools.
- Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends.
- Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates.

Skills & Experience Required:
- Strong proficiency in Python, PySpark, and cloud-native data tools
- Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory)
- Hands-on experience with Microsoft Fabric (preferred or good to have)
- Working knowledge of Power BI and building interactive dashboards for business insights
- Familiarity with CI/CD practices for automated deployments
- Exposure to machine learning integration into data workflows (nice to have)
- Strong analytical and problem-solving skills with attention to detail

Good to Have:
- Experience with KQL (Kusto Query Language)
- Background in simulation models or mathematical modeling
- Knowledge of Power Platform integration (Power Pages, Power Apps)

Benefits:
- Competitive salary.
- Health insurance coverage.
- Professional development opportunities.
- Dynamic and collaborative work environment.
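For a flavor of the PySpark work described above, a minimal bronze-to-silver transformation in the medallion style; the table paths and column names are illustrative, and Delta support is assumed from the Fabric or Databricks runtime:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative paths and columns; a Fabric/Databricks runtime
# provides the Spark session and Delta format out of the box.
spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: raw ingested events, stored as-is.
bronze = spark.read.format("delta").load("Tables/bronze_events")

# Silver: deduplicated, typed, validated records.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_ts").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("Tables/silver_events")
```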
Posted 1 week ago
4.0 - 6.0 years
10 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Looking for immediate joiners (up to 30 days' notice). Experience: 4 to 6 years. Location: Bangalore, Hyderabad & Chennai.

We are looking for a skilled Python Developer responsible for managing DevOps and infrastructure for the AI/ML space. Your primary focus will be setting up the tech stack and dependency software deployment, GPU infrastructure configuration and commissioning, and building container applications.
- Build and automate the MDLC lifecycle using ADO (Azure DevOps).
- Manage BAU and operational issues and queries to ensure component deployments are stable and operational in lower and production environments.
- Manage and set up alerting across environments.
- Monitor application performance and look for opportunities to improve through infra tuning and scaling.
- Investigate and close security issues and observations during pipeline promotion.
- Good knowledge of Jenkins, Azure DevOps, or CI/CD (Vx-Pipeline).
- In-depth knowledge of configuration and automation tools like Rundeck and Ansible.
- Good scripting knowledge using Python, Groovy, and Shell.
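As an example of the kind of Python infrastructure scripting this role involves, a small GPU health-check sketch that shells out to nvidia-smi; NVIDIA drivers are assumed to be installed, and the 90% alert threshold is an arbitrary illustration:

```python
import subprocess

def gpu_stats():
    """Poll per-GPU utilization and memory via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.strip().splitlines():
        idx, util, used, total = [v.strip() for v in line.split(",")]
        yield {"gpu": int(idx), "util_pct": int(util),
               "mem_used_mb": int(used), "mem_total_mb": int(total)}

if __name__ == "__main__":
    for g in gpu_stats():
        # 90% is an illustrative alerting cut-off, not a standard.
        flag = "ALERT" if g["util_pct"] > 90 else "ok"
        print(f"GPU {g['gpu']}: {g['util_pct']}% util, "
              f"{g['mem_used_mb']}/{g['mem_total_mb']} MiB [{flag}]")
```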
Posted 1 week ago
3.0 - 5.0 years
5 - 10 Lacs
Ahmedabad
Work from Office
We are a premium IT company that delivers truly outstanding products and solutions. We provide our customers with data-centric solutions such as data cleaning, data visualization, data modeling, data forecasting, and data reporting. Data is a hidden asset that helps businesses uncover new information such as threats, demand, supply, and productivity. We have the experience and capability to handle your large data in real time, helping you achieve better outcomes. Our aim is to improve productivity, performance, and reliability in your business and reduce overhead.

Responsibilities:
- Design and develop robust, efficient, and scalable software applications using Golang.
- Write clean, maintainable, and well-documented code following best practices and coding standards.
- Participate in code reviews to ensure code quality, identify issues, and provide constructive feedback to team members.
- Optimize software performance and improve scalability through code refactoring, caching techniques, and other optimization strategies.
- Troubleshoot and debug software issues, identify root causes, and implement effective solutions.
- Stay up to date with industry trends, emerging technologies, and best practices in Golang development, and share knowledge with the team.
- Mentor and provide guidance to junior developers, helping them grow their skills and expertise in Golang development.
- Contribute to the continuous improvement of development processes, tools, and methodologies.
- Have knowledge of and experience in managing cloud databases.

Mandatory requirements:
- Strong proficiency in Golang, including experience with concurrency, data structures, and algorithms.
- Experience with web development frameworks.
- Familiarity with microservices architecture and related technologies.
- Knowledge of relational and NoSQL databases and experience with database integration using Golang.
- Understanding of RESTful API design principles and experience in building and consuming APIs.
- Excellent problem-solving skills and ability to troubleshoot and debug complex issues.
- Strong communication skills and ability to work effectively in a collaborative team environment.
- Experience in a cloud environment (AWS, Azure, GCP).
- Knowledge of containerization and orchestration (Docker, Kubernetes).
- Familiarity with CI/CD pipelines.

What We Offer:
- Competitive compensation package.
- Supportive and collaborative team culture.
- Opportunities for continued learning and career development.

Work Schedule: 5 days a week
Location: Ahmedabad
Posted 1 week ago
14.0 - 24.0 years
20 - 55 Lacs
Chennai, Tamil Nadu, India
On-site
Primary Skills: Java Backend, Spring Boot, Microservices, CI/CD Pipeline, Containerization, SQL, Any Cloud (People Management is Mandatory)

Job Description: VP - Back End Engineering

JOB SUMMARY
The Chapter Lead - Backend Development is a hands-on developer role focusing on back-end development, accountable for people management and capability development of the Chapter members. Responsibilities in detail are:

RESPONSIBILITIES
- Oversee the execution of functional standards and best practices and provide technical assistance to the members of the Chapter.
- Be responsible for the quality of the code repository where applicable.
- Maintain exemplary coding standards within the team, contributing to code base development and code repository management.
- Perform code reviews to guarantee quality and promote a culture of technical excellence in Java development.
- Function as a technical leader and active coder, setting and enforcing domain-specific best practices and technology standards.
- Allocate technical resources and personal coding time effectively, balancing leadership with hands-on development tasks.
- Maintain a dual focus on leadership and hands-on development, committing code while steering the chapter's technical direction.
- Oversee Java backend development standards within the chapter across squads, ensuring uniform excellence and adherence to best coding practices.
- Harmonize Java development methodologies across the squad, guiding the integration of innovative practices that align with the bank's engineering strategies.
- Advocate for the adoption of cutting-edge Java technologies and frameworks, driving the evolution of backend practices to meet future challenges.

Strategy
- Oversee the execution of functional standards and best practices and provide technical assistance to the members of the Chapter.
- Act as a conduit for the wider domain strategy, for example technical standards.
- Prioritize and make available capacity for technical debt.
- This role is about capability building; it is not to own applications or delivery.
- Actively shape and drive the bank-wide engineering strategy and programmes to uplift standards and steer the technological direction towards excellence.
- Act as a custodian for Java backend expertise, providing strategic leadership to enhance skill sets and ensure the delivery of high-performance banking solutions.

Business
- Experienced practitioner and hands-on contributor to the squad delivery for their craft (e.g., engineering).
- Responsible for balancing skills and capabilities across teams (squads) and hives in partnership with the Chief Product Owner & Hive Leadership, and in alignment with the fixed capacity model.
- Responsible for evolving the craft towards improving automation, simplification, and innovative use of the latest market trends.
- Collaborate with product owners and other tech leads to ensure applications meet functional requirements and strategic objectives.

Processes
- Promote a feedback-rich environment, utilizing internal and external insights to continuously improve chapter operations.
- Adopt and embed the Change Delivery Standards throughout the lifecycle of the product/service.
- Ensure roles, job descriptions, and expectations are clearly set, and provide periodic feedback to the entire team.
- Follow the chapter operating model to ensure a system exists to continue building the capability and performance of the chapter.
- The Chapter Lead role may vary based upon the specific chapter domain it is leading.

People & Talent
- Accountable for people management and capability development of the Chapter members.
- Review metrics on capabilities and performance across the area, maintain an improvement backlog for the Chapter, and drive continual improvement.
- Focus on the development of people and capabilities as the highest priority.

Risk Management
- Responsible for effective capacity risk management across the Chapter with regard to attrition and leave plans.
- Ensure the chapter follows the standards with respect to risk management as applicable to the chapter domain.
- Adhere to common practices to mitigate risk in the respective domain.
- Design and uphold a robust risk management plan, with contingencies for succession and role continuity, especially in critical positions.

Governance
- Ensure all artefacts and assurance deliverables meet the required standards and policies.

Regulatory & Business Conduct
- Ensure a comprehensive understanding of and adherence to local banking laws, anti-money-laundering regulations, and other compliance mandates.
- Conduct business activities with a commitment to legal and regulatory compliance, fostering an environment of trust and respect.

Key Stakeholders
- Chapter Area Lead
- Sub-domain Tech Lead
- Domain Architect
- Business Leads / Product Owners

Other Responsibilities
- Champion the company's broader mission and values, integrating them into daily operations and team ethos.
- Undertake additional responsibilities as necessary, ensuring they contribute to the organisation's strategic aims and adhere to Group and other relevant policies.

Qualification Requirements & Skills
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field, with preference given to advanced degrees.
- 10 years of professional Java development experience, including a proven record in backend system architecture and API design.
- At least 5 years in a leadership role managing diverse development teams and spearheading complex Java projects.
- Proficiency in a range of Java frameworks such as Spring, Spring Boot, and Hibernate, and an understanding of Apache Struts.
- Proficient in Java, with solid expertise in core concepts like object-oriented programming, data structures, and complex algorithms.
- Knowledgeable in web technologies; able to work with HTTP, RESTful APIs, JSON, and XML.
- Expert knowledge of relational databases such as Oracle, MySQL, and PostgreSQL; experience with NoSQL databases like MongoDB and Cassandra is a plus.
- Familiarity with DevOps tools and practices, including CI/CD pipeline deployment, containerisation technologies like Docker and Kubernetes, and cloud platforms such as AWS, Azure, or GCP.
- Solid grasp of front-end technologies (HTML, CSS, JavaScript) for seamless integration with backend systems.
- Strong version control skills using tools like Git/Bitbucket, with a commitment to maintaining high standards of code quality through reviews and automated tests.
- Exceptional communication and team-building skills, with the capacity to mentor developers, facilitate technical skill growth, and align team efforts with strategic objectives.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Ability to work effectively in a fast-paced, dynamic environment.

Role-Specific Technical Competencies
- Hands-on Java Development
- Leadership in System Architecture
- Database Proficiency
- CI/CD
- Container Platforms: Kubernetes / OCP / Podman
Posted 1 week ago
7.0 - 12.0 years
16 - 27 Lacs
Hyderabad
Work from Office
Role Summary: We are looking for an experienced and dynamic Lead Azure DevOps Engineer to join our growing DevOps team. As a lead, you will play a pivotal role in managing, designing, and implementing end-to-end DevOps solutions, with a strong focus on Azure DevOps and cloud infrastructure. The ideal candidate should possess deep expertise in CI/CD pipeline creation, automation, and cloud security best practices, alongside hands-on experience with Azure cloud services. This is a leadership role that will involve collaborating with cross-functional teams to streamline development workflows, improve system reliability, and ensure the continuous delivery of high-quality solutions.

Key Responsibilities:
- Azure DevOps Services & Cloud Infrastructure: Manage and optimize Azure DevOps Services, implement cloud infrastructure solutions using Azure resources, and ensure seamless integration with existing systems.
- CI/CD Pipeline Development: Design, implement, and maintain scalable CI/CD pipelines in Azure DevOps to automate application deployment and enhance software delivery processes.
- Scripting & Automation: Develop and maintain scripts using PowerShell, Bash, or Python to automate repetitive tasks and system configuration and enhance operational efficiency.
- Security Best Practices: Implement and enforce security best practices and policies within Azure cloud environments to protect data and applications. Ensure that all systems comply with the latest security protocols.
- Version Control Systems (Git): Manage code versioning using Git and collaborate with development teams to streamline version control workflows, including branching, merging, and pull request processes.
- Infrastructure as Code (IaC): Lead efforts to define and deploy infrastructure using tools like Terraform, ARM templates, and Azure CLI, ensuring infrastructure is repeatable, scalable, and maintainable.
- Monitoring & Logging: Oversee the integration of monitoring, logging, and alerting tools to proactively detect and respond to issues in the system. Continuously optimize monitoring workflows for greater efficiency and insight.
- Problem-Solving & Troubleshooting: Lead efforts to diagnose and resolve complex technical issues related to the Azure cloud environment, DevOps pipelines, and application deployment.
- Collaboration & Communication: Foster strong collaboration between development, operations, and security teams. Act as a subject-matter expert and provide guidance to junior engineers and cross-functional teams on best practices for Azure DevOps and cloud environments.
- Documentation & Reporting: Create and maintain clear documentation on all DevOps processes, procedures, and guidelines. Report regularly on the status of pipelines, deployments, and infrastructure.

Required Skills & Qualifications:
- Azure DevOps Expertise: Proven experience working with Azure DevOps Services, including pipeline creation, release management, and environment management.
- Cloud Infrastructure: Hands-on experience in Azure Cloud, including managing and deploying cloud resources and services.
- CI/CD Pipeline Creation: Strong experience in designing, building, and managing continuous integration and continuous delivery (CI/CD) pipelines.
- Scripting & Automation: Advanced proficiency in scripting languages such as PowerShell, Bash, or Python for task automation and infrastructure management.
- Security Best Practices: Strong understanding of cloud security principles, including identity and access management (IAM), network security, and compliance best practices within Azure.
- Version Control: Solid knowledge of version control systems such as Git, including experience with branching strategies, pull requests, and merge workflows.
- Infrastructure as Code (IaC): Proficiency in using Infrastructure as Code tools, including Terraform, ARM templates, or Azure CLI, to manage cloud resources.
- Monitoring & Logging: Familiarity with monitoring and logging platforms such as Azure Monitor, Application Insights, or third-party solutions to maintain application performance and availability.
- Problem-Solving & Troubleshooting: Ability to identify, analyze, and resolve technical issues across a variety of systems and platforms.
- Communication & Teamwork: Exceptional communication skills, with a collaborative approach to problem-solving and decision-making in cross-functional teams.

Preferred Qualifications:
- Certifications: Azure certifications (e.g., Azure DevOps Engineer Expert, Azure Solutions Architect) are a plus.
- Agile Experience: Previous experience working in Agile/Scrum environments is beneficial.
- Containerization & Orchestration: Familiarity with Docker, Kubernetes, or other container orchestration technologies is a plus.
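To illustrate the Python side of the scripting requirement, a minimal sketch using the Azure SDK for Python to inventory resource groups; it assumes azure-identity and azure-mgmt-resource are installed and credentials are available (e.g., via az login or a managed identity), and the subscription ID is a placeholder:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Placeholder; in practice this often comes from an environment variable.
SUBSCRIPTION_ID = "<your-subscription-id>"

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Inventory resource groups -- a typical building block for tagging,
# compliance, or cost-governance automation.
for rg in client.resource_groups.list():
    print(f"{rg.name}: {rg.location}, tags={rg.tags}")
```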
Posted 1 week ago
12.0 - 20.0 years
35 - 40 Lacs
Navi Mumbai
Work from Office
Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- Leadership role in shaping and mentoring the next generation of engineers.
- Supportive and collaborative team culture.
- Flexible working environment.
- Competitive compensation and professional growth opportunities.
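For a flavor of the streaming work described, a minimal Spark Structured Streaming sketch that ingests a Kafka topic into parquet. It is shown in PySpark for brevity (the role emphasizes Scala/Java, where the API shape is the same); broker, topic, and paths are placeholders, and the spark-sql-kafka connector must be on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Read the raw Kafka records; key/value arrive as binary and are cast to strings.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Land the stream as parquet; the checkpoint enables exactly-once recovery.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/orders")
    .option("checkpointLocation", "/checkpoints/orders")
    .start()
)
query.awaitTermination()
```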
Posted 1 week ago
4.0 - 9.0 years
5 - 14 Lacs
Kolkata, Hyderabad, Bengaluru
Work from Office
Role & responsibilities

Key Responsibilities:
- Architect and implement AI/ML/GenAI pipelines, automating end-to-end workflows from data ingestion to model deployment and monitoring.
- Develop scalable, production-grade APIs and services using FastAPI, Flask, or similar frameworks for AI/LLM model inference.
- Design and maintain containerized AI applications using Docker and Kubernetes.
- Operationalize Large Language Models (LLMs) and other GenAI models via cloud-native deployment (e.g., Azure ML, AWS SageMaker, GCP Vertex AI).
- Manage and monitor model performance post-deployment, applying MLOps and LLMOps concepts including model versioning, A/B testing, and drift detection.
- Build and maintain CI/CD pipelines for rapid and secure deployment of AI solutions using tools such as GitHub Actions, Azure DevOps, and GitLab CI.
- Implement security, governance, and compliance standards in AI pipelines.
- Optimize model-serving infrastructure for speed, scalability, and cost efficiency.
- Collaborate with AI researchers to translate prototypes into robust production-ready solutions.

Shift: 11 AM - 8 PM
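As a sketch of the serving pattern named above, a minimal FastAPI inference endpoint; the model call is stubbed out, since in practice it would wrap a loaded model or an LLM client:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-inference")

class PredictRequest(BaseModel):
    text: str

class PredictResponse(BaseModel):
    label: str
    score: float

def run_model(text: str) -> tuple:
    # Stand-in for a real model call (a loaded transformer, an LLM client, etc.).
    return ("positive", 0.97) if "good" in text.lower() else ("negative", 0.55)

@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    label, score = run_model(req.text)
    return PredictResponse(label=label, score=score)

# Run with: uvicorn main:app --host 0.0.0.0 --port 8000
```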
Posted 1 week ago
10.0 - 20.0 years
35 - 40 Lacs
Navi Mumbai
Work from Office
Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- Leadership role in shaping and mentoring the next generation of engineers.
- Supportive and collaborative team culture.
- Flexible working environment.
- Competitive compensation and professional growth opportunities.
Posted 1 week ago
7.0 - 9.0 years
12 - 20 Lacs
Thane
Work from Office
We are seeking a skilled and proactive Medical Device Cybersecurity Engineer to join our team. This position plays a critical role in ensuring the cybersecurity and regulatory compliance of our connected medical devices throughout the product lifecycle. The ideal candidate has hands-on experience in threat modeling, managing third-party software components, performing vulnerability scans and penetration testing, and collaborating across cross-functional teams to integrate robust cybersecurity controls in accordance with FDA and global regulatory requirements.

Key Responsibilities:
- Perform and maintain comprehensive threat modeling (e.g., STRIDE) for embedded and connected medical devices.
- Perform regular vulnerability scans, penetration testing, and static/dynamic analysis using tools such as Kali Linux, Metasploit, Wireshark, NMAP, Fortify, Nessus, or similar.
- Develop and update cybersecurity risk assessments as part of the overall risk management process (including CVSS scoring).
- Define, implement, and document security controls based on threat model outcomes.
- Manage and maintain the Software Bill of Materials (SBOM) in compliance with FDA premarket and post-market guidance and global standards (e.g., NTIA, NIST).
- Support secure software development lifecycle (SDLC) practices, including secure coding reviews.
- Conduct cybersecurity surveillance for new threats, advisories, CVEs, and zero-day vulnerabilities that may impact devices post-market.
- Triage and assess reported vulnerabilities, coordinate remediation, and update documentation accordingly.
- Support preparation of cybersecurity documentation for FDA submissions (e.g., premarket submissions, 510(k), PMA), including security risk management reports and architecture diagrams.
- Ensure compliance with applicable FDA standards (e.g., ISO 14971, IEC 62304, ANSI/AAMI SW96:2023).
- Collaborate with Quality, Regulatory, and Engineering to ensure cybersecurity is integrated across the product lifecycle.
- Collaborate with software, hardware, and systems teams to guide cybersecurity design and testing.

Qualifications Required:
- Bachelor's or Master's degree in Computer Engineering, Cybersecurity, Electrical Engineering, or a related field.
- 5-7 years of experience in embedded systems or medical device cybersecurity.
- Strong working knowledge of SBOM, SOUP, vulnerability scanning tools, penetration testing, and threat modeling methodologies.
- Familiarity with relevant regulations and standards (e.g., FDA Cybersecurity Guidance, NIST SP 800-53/30/218, ANSI/AAMI SW96:2023).
- Experience with secure development tools and CI/CD environments.

Preferred:
- Certified Ethical Hacker (CEH), CISSP, CSSLP, or similar certification.
- Experience with connected devices (IoMT), wireless protocols (BLE, Wi-Fi), and cloud security principles.
- Familiarity with DevSecOps practices and security tools integration.
Posted 1 week ago
6.0 - 10.0 years
15 - 20 Lacs
Mumbai, Thane, Navi Mumbai
Work from Office
Position Requirements:
- 5+ years of experience in iOS development.
- Experience with Swift 5 and code versioning tools such as Git.
- Experience with SwiftUI and the Combine framework.
- Experience with RESTful APIs to connect to backend services.
- Experience with Apple's design principles and interface guidelines.
- Experience with the MVVM design pattern.
- Must have built at least one SDK.
- Experience with object-oriented programming and protocol-oriented programming.
- Experience with unit and UI testing will be an added advantage.
- Excellent written and verbal communication and time management skills.

Role & Responsibilities:
- Lead the development process for assigned features and functionalities within the iOS application.
- Design and implement clean, maintainable, and efficient Swift code.
- Collaborate with designers, product managers, and other developers to translate user stories and product requirements into technical solutions.
- Write unit and integration tests to ensure code quality and stability.
- Identify and resolve bugs and performance issues.
- Stay up to date with the latest trends and technologies in iOS development, including SwiftUI, ARKit, and Core ML.
- Mentor and guide junior developers within the team.

Education: B.E. in Computer Science/IT (or any other engineering discipline)
Experience: 5+ years
Work timings: 2:00 PM to 11:00 PM IST
Posted 1 week ago
15.0 - 20.0 years
25 - 35 Lacs
Gurugram
Hybrid
We're looking for a Senior Engineering Manager who expects more from their career. It's a chance to extend and improve our client's Software Engineering Department, and an opportunity to work with a market-leading business to explore new opportunities and influence global retailers. As a Senior Engineering Manager, you will be responsible for leading and inspiring multiple engineering teams to deliver high-quality, innovative software products that drive business growth. You will set the technical direction, build high-performing teams, and foster a culture of engineering excellence.

Role & Responsibilities
As a Senior Engineering Manager, you will lead multiple engineering teams, fostering a culture of technical excellence, agility, and innovation. Your responsibilities will include:
- Defining & Leading the Engineering Vision – Drive the technical strategy, platform evolution, and modernization of our software ecosystem. Influence cloud adoption, API-first development, automation, and DevOps maturity.
- Owning Sub-Domain Level OKRs – Work closely with Product, Design, UX, and Commercial teams to co-own objectives, ensuring that engineering efforts align with business goals.
- Managing High-Performing Global Teams – Lead squads across India, the UK, and the US, ensuring delivery of secure, scalable, and high-quality software solutions in an agile environment.
- Hiring & Talent Development – Build, mentor, and grow top-tier engineering talent, fostering a culture of learning, inclusion, and career growth.
- Technical Excellence & Architecture Alignment – Collaborate with Architects and Principal Engineers to establish best practices in distributed systems, microservices, cloud-native design, API strategy, and front-end development.
- Driving Automation & DevOps Best Practices – Champion CI/CD pipelines, cloud transformation, and platform engineering principles, ensuring robust observability, security, and cost efficiency.
- Engineering Governance & Innovation – Continuously improve engineering processes, team productivity, and technical debt reduction, driving a shift-left approach and best-in-class software development methodologies.
- Stakeholder Collaboration – Engage with Product, UX, and business leaders to influence platform roadmaps, balancing speed, scalability, and reliability.
- Hands-On Leadership – Stay close to technology, providing technical mentorship, contributing to architectural decisions, and ensuring engineering best practices.

What You'll Bring
- 15+ years of experience in software engineering, with at least 3+ years leading global teams.
- Proven experience as a Senior Engineering Manager, Lead Engineer, or Tech Manager, managing complex engineering projects.
- Strong expertise in distributed systems, cloud architecture (GCP & Azure), microservices, API design, and scalable platform engineering.
- In-depth knowledge and hands-on experience with .NET, Python, Spark, Git (GitLab), Docker, Kubernetes, and cloud development (GCP & Azure).
- Experience working with React.js.
- Strong knowledge of DevOps, CI/CD pipelines, observability, and cloud security best practices.
- Ability to drive engineering strategy, process improvements, and high-velocity agile execution.
- Experience hiring, mentoring, and leading global teams across multiple time zones.
- Excellent stakeholder management, communication, and decision-making skills, working cross-functionally with PMs, UX, and business leaders.
- Passion for continuous learning, innovation, and staying ahead of technology trends.
Posted 1 week ago
10.0 - 15.0 years
20 - 35 Lacs
Chennai
Remote
Job Summary
We are seeking a highly skilled Senior Technical Architect with expertise in Databricks, Apache Spark, and modern data engineering architectures. The ideal candidate will have a strong grasp of Generative AI and RAG pipelines and a keen interest in (or working knowledge of) Agentic AI systems. This individual will lead the architecture, design, and implementation of scalable data platforms and AI-powered applications for our global clients. This high-impact role requires technical leadership, cross-functional collaboration, and a passion for solving complex business challenges with data and AI.

Key Responsibilities
- Lead architecture, design, and deployment of scalable data solutions using Databricks and the medallion architecture.
- Guide technical teams in building batch and streaming data pipelines using Spark, Delta Lake, and MLflow.
- Collaborate with clients and internal stakeholders to understand business needs and translate them into robust data and AI architectures.
- Design and prototype Generative AI applications using LLMs, RAG pipelines, and vector stores.
- Provide thought leadership on the adoption of Agentic AI systems in enterprise environments.
- Mentor data engineers and solution architects across multiple projects.
- Ensure adherence to security, governance, performance, and reliability best practices.
- Stay current with emerging trends in data engineering, MLOps, GenAI, and agent-based systems.

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline.
- 10+ years of experience in data architecture, data engineering, or software architecture roles.
- 5+ years of hands-on experience with Databricks, including Spark SQL, Delta Lake, Unity Catalog, and MLflow.
- Proven experience in designing and delivering production-grade data platforms and pipelines.
- Exposure to LLM frameworks (OpenAI, Hugging Face, LangChain, etc.) and vector databases (FAISS, Weaviate, etc.).
- Strong understanding of cloud platforms (Azure, AWS, or GCP), particularly in the context of Databricks deployment.
- Knowledge of or interest in Agentic AI frameworks and multi-agent system design is highly desirable.

Technical Skills
- Databricks (incl. Spark, Delta Lake, MLflow, Unity Catalog)
- Python, SQL, PySpark
- GenAI tools and libraries (LangChain, OpenAI, etc.)
- CI/CD and DevOps for data
- REST APIs, JSON, data serialization formats
- Cloud services (Azure/AWS/GCP)

Soft Skills
- Strong communication and stakeholder management skills
- Ability to lead and mentor diverse technical teams
- Strategic thinking with a bias for action
- Comfortable with ambiguity and iterative development
- Client-first mindset and consultative approach
- Excellent problem-solving and analytical skills

Preferred Certifications
- Databricks Certified Data Engineer / Architect
- Cloud certifications (Azure/AWS/GCP)
- Any certifications in AI/ML, NLP, or GenAI frameworks are a plus
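To make the RAG retrieval step concrete, a toy, self-contained Python sketch: a hash-based stand-in replaces a real embedding model, and an in-memory array replaces a vector store such as FAISS or Weaviate, so only the ranking mechanics are meaningful:

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy embedding for illustration only; retrieval quality here is
    meaningless because the vectors carry no semantics."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

documents = [
    "Delta Lake provides ACID transactions on data lakes.",
    "MLflow tracks experiments, models, and deployments.",
    "Unity Catalog centralizes governance for data and AI assets.",
]
doc_vecs = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list:
    """Rank documents by cosine similarity to the query embedding."""
    q = embed(query)
    scores = doc_vecs @ q  # unit vectors, so dot product = cosine similarity
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# The retrieved passages would then be stuffed into the LLM prompt --
# the "augmented generation" half of RAG.
print(retrieve("How do I govern data assets?"))
```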
Posted 1 week ago
5.0 - 8.0 years
25 - 30 Lacs
Indore, Chennai
Work from Office
We are seeking a Senior Python DevOps Engineer to develop Python services and build CI/CD pipelines for AI/data platforms. Must have strong cloud, container, and ML workflow deployment experience.
Required Candidate Profile: Experienced Python DevOps engineer with expertise in CI/CD, cloud, and AI platforms. Skilled in Flask/FastAPI, Airflow, MLflow, and model deployment on Dataiku and OpenShift.
Posted 1 week ago
6.0 - 8.0 years
7 - 9 Lacs
Hyderabad, Bengaluru, Thiruvananthapuram
Hybrid
Job Title: DevOps Engineer
Job Location: Pan India (Hybrid Mode)
Salary Budget: As per industry standards
Immediate joiners are preferred.

Role & responsibilities
- Build and deploy a highly scalable Kubernetes (K8s) container platform.
- Automate the build of containerized systems using CI/CD pipelines, GitOps tools, and Helm charts.
- Set up and install containerized API gateway components on the Kubernetes platform.
- Troubleshoot and resolve technical issues related to the container platform infrastructure.
- Perform health checks and validations to ensure smooth and successful installation of the platform.
- Perform upgrades and operating system updates to ensure compliance with governance and security standards.
- Monitor platform components and ensure platform stability.

Preferred candidate profile
- 6 years of experience in DevOps engineering roles is mandatory.
- Strong hands-on experience with Kubernetes, containerization (Docker), and Helm.
- Proficiency in building CI/CD pipelines using tools such as Jenkins, GitLab CI, CircleCI, or similar.
- Familiarity with GitOps tools like ArgoCD or Flux for Kubernetes resource management.
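As an example of the health checks and validations mentioned above, a short sketch using the official Kubernetes Python client (pip install kubernetes); the namespace is a placeholder and a reachable cluster config is assumed:

```python
from kubernetes import client, config

# load_kube_config() reads ~/.kube/config; inside a pod you would
# use config.load_incluster_config() instead.
config.load_kube_config()
v1 = client.CoreV1Api()

NAMESPACE = "default"  # placeholder

# Simple post-install validation: flag any pod not Running/Succeeded.
for pod in v1.list_namespaced_pod(NAMESPACE).items:
    phase = pod.status.phase
    status = "OK" if phase in ("Running", "Succeeded") else "CHECK"
    print(f"[{status}] {pod.metadata.name}: {phase}")
```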
Posted 1 week ago
12.0 - 18.0 years
35 - 45 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Summary: We are seeking an experienced Amazon Connect Architect with 12 to 15 years of experience to design, develop, and implement scalable and reliable cloud-based contact center solutions using Amazon Connect and AWS ecosystem services. You will play a key role in translating business needs into technical solutions and lead implementation across clients or business units.

Key Responsibilities:
- Architect and design contact center solutions using Amazon Connect and AWS services like Lambda, Lex, DynamoDB, S3, and CloudWatch.
- Lead the end-to-end implementation and configuration of Amazon Connect.
- Integrate Amazon Connect with CRMs (Salesforce, ServiceNow, etc.), ticketing systems, and third-party tools.
- Define call flows, IVR designs, routing profiles, and queue configurations.
- Implement Contact Lens, real-time metrics, and historical reporting.
- Collaborate with cross-functional teams (developers, business analysts, project managers).
- Create technical documentation, diagrams, and handoff materials.
- Stay updated on AWS best practices and new Amazon Connect features.
- Provide technical leadership and mentorship to development and support teams.

Required Skills:
- Proven experience designing and deploying Amazon Connect solutions.
- Strong hands-on knowledge of AWS Lambda, IAM, S3, DynamoDB, Kinesis, and CloudFormation.
- Experience with Amazon Lex and AI/ML for voice bots.
- Proficiency in programming/scripting (JavaScript, Node.js).
- Familiarity with CRM integrations, especially Salesforce Service Cloud Voice.
- Understanding of telephony concepts (SIP, DID, ACD, IVR, CTI).
- Experience with CI/CD pipelines and version control (Git).
- Strong documentation and communication skills.

Preferred Skills:
- AWS Certified Solutions Architect or Amazon Connect accreditation.
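For a feel of the Lambda integration named above, a minimal handler as Amazon Connect would invoke it from a contact flow. It is shown in Python (the posting also lists Node.js); the event field names follow the documented Connect request shape but should be verified against your flow, and the VIP routing logic is purely illustrative:

```python
def lambda_handler(event, context):
    """Invoked from an Amazon Connect contact flow.

    Connect passes contact attributes under Details.ContactData and
    flow-configured parameters under Details.Parameters; the response
    must be a flat, string-keyed map the flow can read back.
    """
    details = event.get("Details", {})
    contact = details.get("ContactData", {})
    params = details.get("Parameters", {})

    phone = contact.get("CustomerEndpoint", {}).get("Address", "")
    # Illustrative lookup: route "VIP" callers to a priority queue.
    is_vip = phone.endswith("0000") or params.get("tier") == "vip"

    return {
        "customerTier": "vip" if is_vip else "standard",
        "routeToQueue": "priority" if is_vip else "general",
    }
```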
Posted 1 week ago
8.0 - 13.0 years
22 - 30 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
LOCATION: PAN INDIA
Experience: 8+ years
Support Model: 24x7 rotational

Role Overview:
- Handle service delivery and ensure performance across all Amazon Connect support areas.
- Oversee overall support operations, enhancements, and system updates.
- Act as the primary escalation point for incidents.
- Manage SLAs and ensure service standards are met.
- Identify process gaps and implement improvements.
- Lead and mentor junior engineers.
- Maintain relationships with internal and external stakeholders.

Skills Required:
- Deep hands-on experience with Amazon Connect
- Strong knowledge of AWS Lambda, DynamoDB, S3
- In-depth understanding of contact flows, queues, routing profiles, quick connects, telephony configuration, and call routing
- Strong troubleshooting skills in WebRTC and voice issues
- Experience with CloudWatch, Connect metrics, and CI/CD pipelines
- Experience integrating with Salesforce (Service Cloud Voice)
- Good documentation and process improvement capability
- Strong leadership and communication skills
Posted 1 week ago
3.0 - 8.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
LOCATION: PAN INDIA

Experience: 3-5 years
Support Model: 24x7 rotational
Role Overview:
- Provide support on Amazon Connect-related incidents and user issues.
- Handle basic troubleshooting of voice, call routing, and UI-based configurations.
- Support change announcements and basic deployment activities.
- Coordinate with L2/L3 engineers for escalation.
- Maintain documentation and update the knowledge base.
Skills Required:
- Hands-on experience with Amazon Connect (basic flows, routing, and settings)
- Exposure to AWS Lambda, S3, DynamoDB
- Basic understanding of WebRTC and voice troubleshooting
- Familiarity with CloudWatch and Connect metrics
- Willingness to learn Salesforce integration (Service Cloud Voice)
- Strong willingness to work in a support model and take ownership

Experience: 5-8 years
Support Model: 24x7 rotational
Role Overview:
- Provide L2-level support for Amazon Connect and associated AWS services.
- Address incidents and troubleshoot system or telephony-related issues.
- Support service delivery and ensure announced changes are implemented.
- Maintain SLAs and escalate where required.
- Contribute to documentation and improvement plans.
- Support deployment through the CI/CD pipeline.
Skills Required:
- Strong hands-on experience with Amazon Connect
- Working knowledge of Lambda, DynamoDB, S3
- Good understanding of call flows, routing, and WebRTC troubleshooting
- Familiarity with CloudWatch, Connect metrics, and CI/CD
- Exposure to Salesforce integration helpful (Service Cloud Voice)
- Ability to work independently on issue resolution
- Good communication and support handling
Posted 1 week ago
4.0 - 7.0 years
4 - 9 Lacs
Pune
Hybrid
Hiring for a Java Developer at our Pune location!
Job Title: Java Developer - B2C Web Applications (Cloud, Angular)
Location: Pune (Kharadi)
Experience: 4+ years
Employment Type: Full-time
Work Mode: Hybrid

Job Summary: We are seeking a skilled and motivated Java Developer with over 4 years of experience in developing modern B2C web applications and backend services within enterprise environments. The ideal candidate should have a strong background in Java-based APIs, batch/scheduled services in a cloud setup, and front-end development with Angular 12+. Experience with enterprise standards, Agile methodologies, and DevOps tools such as GitLab and Jira is essential.

Key Responsibilities:
- Design, develop, and maintain Java-based APIs and batch/scheduled services deployed in cloud environments (AWS, Azure, or GCP).
- Collaborate with cross-functional teams to build scalable and robust B2C applications.
- Troubleshoot and address defects, resolve security vulnerabilities, and handle production triaging and issue resolution.
- Participate in Agile/Scrum ceremonies, ensuring timely delivery of high-quality code.
- Contribute to the automation of workflows and CI/CD pipelines using GitLab.
- Ensure adherence to enterprise coding standards and best practices.
- Work closely with QA and DevOps teams to ensure seamless integration and deployment.

Primary Skills:
- Strong hands-on experience in Java, Spring Boot, and REST APIs.
- Experience with batch processing (e.g., Spring Batch, Quartz).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Proficiency in using Git/GitLab for source control and pipeline automation.
- Agile/Scrum methodology experience with Jira as a task-tracking tool.

Secondary Skills:
- Front-end development using Angular 12+.
- Understanding of responsive UI design and integration with REST APIs.
- Knowledge of unit testing frameworks (e.g., JUnit, Jasmine/Karma for Angular).

Nice to Have:
- Experience in microservices architecture.
- Knowledge of containerization tools (Docker, Kubernetes).
- Exposure to monitoring/logging tools (e.g., ELK stack, Prometheus, Grafana).

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong communication and collaboration skills.
- Problem-solving attitude with attention to detail and a proactive approach.

Interested candidates can share their updated CV with premkumar.m@kiya.ai
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Description
Job Title: Offshore Data Engineer
Base Location: Bangalore
Work Mode: Remote
Experience: 5+ years

Job Description: We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment.

Key Responsibilities:
- Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow.
- Develop robust data ingestion and transformation pipelines using Python and SQL.
- Integrate Kafka for real-time data streams alongside batch workloads.
- Optimize pipeline performance and manage costs within GCP services.
- Work closely with data analysts, data architects, and product teams to gather and understand data requirements.
- Manage and monitor BigQuery datasets, tables, and partitioning strategies.
- Implement error handling, resiliency, and observability mechanisms across pipeline components.
- Collaborate with DevOps teams to enable automated delivery (CI/CD) for data pipeline components.

Required Skills:
- 5+ years of hands-on experience in Data Engineering or Software Engineering.
- Proficiency in Python and SQL.
- Good understanding of Java (for reading or modifying codebases).
- Experience building ETL pipelines with Apache Beam and Google Cloud Dataflow.
- Hands-on experience with Apache Kafka for stream processing.
- Solid understanding of BigQuery and data modeling on GCP.
- Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Composer, etc.).

Good to Have:
- Experience building reusable ETL libraries or framework components.
- Knowledge of data governance, data quality checks, and pipeline observability.
- Familiarity with Apache Airflow or Cloud Composer for orchestration.
- Exposure to CI/CD practices in a cloud-native environment (Docker, Terraform, etc.).

Tech stack: Python, SQL, Java, GCP (BigQuery, Pub/Sub, Cloud Storage, Cloud Composer, Dataflow), Apache Beam, Apache Kafka, Apache Airflow, CI/CD (Docker, Terraform)
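As a sketch of the Beam/Dataflow work described, a minimal batch pipeline from Cloud Storage to BigQuery; the bucket, project, dataset, and schema are placeholders, and running on Dataflow would additionally require the usual --runner=DataflowRunner project/region/temp_location options:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # defaults to the local DirectRunner

def parse_csv(line: str) -> dict:
    """Turn a 'user_id,amount' CSV row into a BigQuery-ready dict."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText(
            "gs://my-bucket/orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_csv)
        | "FilterValid" >> beam.Filter(lambda r: r["amount"] > 0)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```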
Posted 1 week ago
2.0 - 5.0 years
3 - 6 Lacs
Pune
Hybrid
Job Summary: We are seeking an experienced and dynamic Python Project Manager to lead and deliver complex software projects while guiding and mentoring a high-performing technical team. This role blends project leadership with technical expertise, making it ideal for someone with a solid background in Python development who also excels at managing people, timelines, and client expectations.

Key Responsibilities:
- Lead end-to-end project execution with a focus on Python-based development solutions.
- Manage the planning, coordination, and delivery of software projects, ensuring high quality and timely execution.
- Provide hands-on technical guidance in designing, developing, and deploying Python applications (e.g., using Flask, Django, FastAPI).
- Define project scope, schedules, and resource plans in collaboration with stakeholders.
- Conduct code reviews, set development standards, and support best practices in coding and architecture.
- Serve as a technical and strategic advisor to internal teams and clients.
- Facilitate Agile ceremonies, including sprint planning, daily standups, and retrospectives.
- Proactively identify project risks and issues, and implement mitigation strategies.
- Collaborate with DevOps, QA, UI/UX, and data teams to ensure integration, testing, and deployment are seamless.
- Deliver regular updates to senior leadership and stakeholders on project status and team performance.

Required Skills and Qualifications:
- Minimum of 2 years of experience in a technical leadership or project management role, demonstrating the ability to guide teams and deliver successful outcomes.
- Strong proficiency in Python and frameworks such as Django, Flask, or FastAPI.
- Experience designing RESTful APIs and working with databases (SQL/NoSQL).
- Proven ability to lead software engineering teams and manage project life cycles.
- Strong understanding of Agile/Scrum methodologies and project management tools (e.g., Jira, Trello).
- Hands-on experience with DevOps tools, CI/CD pipelines, and cloud platforms (AWS, GCP, or Azure).
- Exceptional communication and stakeholder management skills.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- PMP, PMI-ACP, or Scrum certifications are a plus.

Preferred Qualifications:
- Experience in data engineering, machine learning, or AI projects.
- Familiarity with front-end technologies (e.g., JavaScript, React) is a plus.
- Prior experience working in a client-facing or consulting environment.

Company Website: https://flynautsaas.com/
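To ground the Python framework requirement, a minimal Flask REST sketch of the kind of API this role oversees; the in-memory task store is illustrative only:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store to keep the sketch self-contained.
TASKS = [{"id": 1, "title": "Sprint planning", "done": False}]

@app.get("/tasks")
def list_tasks():
    return jsonify(TASKS)

@app.post("/tasks")
def create_task():
    payload = request.get_json(force=True)
    task = {"id": len(TASKS) + 1, "title": payload["title"], "done": False}
    TASKS.append(task)
    return jsonify(task), 201

if __name__ == "__main__":
    app.run(debug=True)
```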
Posted 1 week ago
6.0 - 8.0 years
13 - 20 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant – GCP Platform Engineer

We are looking for a GCP Platform Engineer (Lead Consultant level) to drive the design, implementation, and management of scalable and secure cloud-native platforms. The ideal candidate will have deep expertise in Google Cloud Platform (GCP) and hands-on experience in building automated, resilient, and highly available infrastructure environments.

Responsibilities:
- Design and implement infrastructure solutions on GCP using best practices for scalability, availability, and security.
- Develop and maintain Infrastructure as Code (IaC) using tools like Terraform and Deployment Manager.
- Build and manage CI/CD pipelines, container orchestration platforms (GKE), and automated deployment processes.
- Implement and monitor logging, alerting, and observability frameworks across platform components.
- Collaborate with development, DevOps, and security teams to align platform engineering with product and business goals.
- Troubleshoot complex platform issues and performance bottlenecks, and optimize cloud spend (FinOps).
- Ensure compliance, governance, and risk management are embedded into platform delivery.
- Mentor junior engineers and participate in technical leadership forums and design reviews.

Qualifications we seek in you!
Minimum Qualifications / Skills:
- Bachelor's degree in Information Technology, Computer Science, or a related field.
- Strong knowledge of GCP core services: Compute, Networking, IAM, GKE, Cloud Functions, Cloud Build, etc.
- Proficiency in scripting (Python, Bash) and configuration management tools.
- Experience with CI/CD tooling (Jenkins, GitLab CI, Cloud Build) and monitoring tools (Prometheus, Stackdriver).
- Hands-on experience with containerization and Kubernetes in production environments.
- Expertise in Infrastructure as Code using Terraform or similar tools.
- Solid understanding of networking, security controls, and cloud-native architecture patterns.

Preferred Qualifications / Skills:
- GCP certifications (e.g., Professional Cloud DevOps Engineer, Cloud Architect).
- Experience with multi-cloud or hybrid cloud environments.
- Exposure to Agile/DevOps methodologies and SRE principles.
- Previous experience in a consulting or client-facing role.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
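As one concrete example of GCP platform scripting, a minimal Pub/Sub publish sketch using the official google-cloud-pubsub client; the project and topic IDs are placeholders, and application default credentials are assumed:

```python
from google.cloud import pubsub_v1

# Placeholders; requires `pip install google-cloud-pubsub` and
# application default credentials (gcloud auth application-default login).
PROJECT_ID = "my-project"
TOPIC_ID = "platform-events"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# publish() returns a future; result() blocks until the server acks
# and yields the message ID. Extra kwargs become message attributes.
future = publisher.publish(topic_path, b"deployment-complete", env="prod")
print(f"Published message {future.result()}")
```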
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 week ago