Jobs
Interviews

2429 AWS Cloud Jobs - Page 16

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Requirements (Qualifications): Bachelor's or Master's degree with 3+ years of strong Python development experience.

Core skills:
- OOP concepts: functions, classes, decorators
- Python, with experience in any one of the frameworks Flask/Django/FastAPI
- Python libraries (Pandas, TensorFlow, NumPy, SciPy)
- AWS Cloud experience
- Docker, Kubernetes, and microservices
- Postgres/MySQL
- Git, SVN, or any code repository tool
- Design patterns
- SQLAlchemy or any other ORM (Object Relational Mapper) library

EY | Building a better working world. EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
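Decorators, listed among the core skills above, are a staple Python interview topic. As a minimal illustration (not part of the posting), a decorator wraps a function to add behaviour without changing its body; the `timed` name below is invented for this sketch:

```python
import functools
import time

def timed(func):
    """Decorator that records how long the wrapped function took to run."""
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.last_elapsed = time.perf_counter() - start
        return result
    return wrapper

@timed
def fibonacci(n: int) -> int:
    """Naive recursive Fibonacci, used here only as a workload."""
    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(10))  # → 55
```

Note `functools.wraps`: without it, the decorated function would report its name as `wrapper`, which breaks introspection and debugging.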

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Big Data QA Test Engineer at our company, you will play a crucial role in ensuring the quality and reliability of our Big Data systems. A minimum of 6 years of experience is required; the role is based in Hyderabad with a hybrid work setup. We are seeking candidates who can join on an immediate to 15-day notice period.

Your primary responsibilities will include:
- Conducting QA testing on Big Data systems
- Utilizing AWS cloud services
- Implementing Python scripts for testing
- Collaborating with DevOps teams to enhance system performance

We have multiple openings for Big Data QA/Test Engineers, specifically targeting senior professionals. The ideal candidates will possess a strong background in AWS Cloud, Big Data technologies, Python programming, and DevOps practices. Certification as an AWS Data Engineer is essential. We welcome applicants with any graduation qualification who demonstrate proficiency in QA testing, AWS cloud services, AWS data engineering, DevOps practices, Python programming, and Big Data technologies. If you are passionate about ensuring the quality and functionality of Big Data systems, we encourage you to apply for this challenging yet rewarding position.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Dindigul, Tamil Nadu

On-site

As a Software Trainer specializing in Full Stack Development, Front-End, Python, AWS Cloud, Testing, Java, PHP, Web Development, and Networking, your role involves developing and delivering comprehensive training programs tailored to learners at different levels. You will create detailed curricula and training materials, conduct interactive training sessions, and evaluate trainee progress through various assessments. Staying updated with the latest programming trends and incorporating new technologies into training programs is essential. Providing one-on-one support, collaborating with industry professionals, and continuously improving training strategies based on feedback are also part of your responsibilities.

To qualify for this position, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a Full Stack Developer with proficiency in Java, Python, PHP, and AWS Cloud is required. Previous teaching or training experience in a technical subject is highly desirable. A strong understanding of front-end technologies such as HTML, CSS, JavaScript, and related frameworks is necessary, as are excellent communication and presentation skills, the ability to explain complex technical concepts clearly, and strong organizational and time-management skills.

Key skills for this position include:
- Proficiency in programming languages such as Java, Python, and PHP
- Familiarity with front-end technologies such as HTML, CSS, JavaScript, React.js, and Angular
- Expertise in back-end frameworks such as Django and Spring Boot
- Knowledge of cloud computing, specifically AWS
- Experience with software testing methodologies
- Database management using MySQL and PostgreSQL
- Version control with Git
- Understanding of basic and advanced networking principles

If you meet these qualifications and are passionate about training and mentoring individuals in Full Stack Development, Front-End, Python, AWS Cloud, Testing, Java, PHP, Web Development, and Networking, we encourage you to apply. For inquiries or to apply, please contact franchise@elysiumacadmey.org or call 78457 36974. This is a full-time or part-time, in-person position.

Posted 2 weeks ago

Apply

12.0 - 16.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Group Manager role involves managing a team to establish and implement new or revised application systems and programs in coordination with the Technology Team. The main responsibility is to drive applications systems analysis and programming activities. As a manager, you will manage multiple teams, carry out personnel duties, provide strategic influence, monitor budget management, and ensure essential procedures are followed. It is essential to integrate knowledge of applications development with the overall technology function to achieve established goals.

You will be expected to provide evaluative judgement in complex situations, influence and negotiate with senior leaders, assess risks in business decisions, recruit and retain teams, manage stakeholders, and follow corporate mandates in product/project management. Thought leadership is required to bring efficiencies across the Software Development Life Cycle (SDLC) by planning, tracking progress, raising risks, and taking corrective actions. Additionally, you should lead agile best practices, improve communication, ensure adherence to IT risk & controls, and provide superior customer service.

The ideal candidate should have 12+ years of experience in software development with expertise in web applications, enterprise integration, and data & analytics. Proficiency in the Java, Spring Boot, Microservices, Angular/React, and Oracle/PostgreSQL technology stack, along with RESTful APIs, Kafka messaging, Elasticsearch, NoSQL databases, and caching solutions, is necessary. Expertise in designing and optimizing software solutions, troubleshooting, site reliability engineering, test-driven development, authentication, authorization, and security, as well as familiarity with the AWS cloud, is also required. A Bachelor's degree/University degree or equivalent experience is a minimum requirement; a Master's degree is preferred.

Citi is an equal opportunity and affirmative action employer, inviting all qualified interested applicants to apply for career opportunities. If you require a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

15 - 20 Lacs

Jaipur

Work from Office

Solution Sales Specialist
Experience: 3+ Years
Location: India (Jaipur)

About Celebal Technologies
Celebal Technologies is a globally recognised Data, AI, and Cloud solutions company. As a Microsoft Partner of the Year, we help enterprises modernise, automate, and innovate using cutting-edge technologies, including Azure, Databricks, Power Platform, SAP, AWS, and Generative AI. We serve large enterprise clients across North America, APAC, EMEA, and India in industries like Manufacturing, Retail & CPG, BFSI, and Energy.

Role Overview
We are seeking a high-performing Solution Sales Specialist who can bridge the gap between technology and business. You will be responsible for identifying strategic opportunities, proposing tailored solutions, and supporting clients through the entire sales lifecycle, from discovery to proposal, solutioning, and closure. This is a consultative, value-driven sales role in which you will work across Celebal's key offerings in Data Engineering, Cloud Modernisation, Enterprise Automation, and AI-led transformation.

Key Responsibilities
- Own the end-to-end business development and sales lifecycle, from prospecting to deal closure.
- Drive engagement with enterprise and strategic accounts, identifying customer pain points and aligning Celebal's solutions accordingly.
- Understand client requirements, business processes, and use cases to propose fit-for-purpose solutions in collaboration with pre-sales and delivery teams.
- Lead the preparation of technical and commercial proposals, including Statements of Work (SoWs), RFP responses, presentations, and commercial pricing.
- Champion solution themes around Gen AI, Data Modernisation, App Re-engineering, Power Platform, SAP, Data Governance, and Intelligent Automation.
- Consult with customers to demonstrate technical capabilities and solution value; act as a trusted advisor throughout the sales cycle.
- Drive account mapping and pipeline planning, leveraging CRM and market intelligence to identify upsell and cross-sell opportunities.
- Stay updated on industry trends, partner capabilities (Microsoft, Databricks, AWS), and technology developments to influence client conversations.
- Collaborate with Marketing and Inside Sales to drive campaigns, events, and ABM strategies targeting CXO personas.
- Maintain accurate sales forecasts and contribute to quarterly business planning.
- Own CRM updates and all mandatory internal reporting.

Desired Skills & Experience
- 3+ years of consultative B2B sales experience, preferably in IT services, cloud solutions, or analytics.
- Proven track record in enterprise solution selling and exceeding revenue targets.
- Strong understanding of technologies such as Azure, Databricks, SAP, Power Platform, AI/ML, Data Lakehouse, and automation platforms.
- Excellent communication, presentation, and negotiation skills, with the ability to translate complex technical concepts into compelling business outcomes.
- Experience drafting proposals, SoWs, pricing models, and solution documents independently.
- Familiarity with partner ecosystems (Microsoft, Databricks, SAP, AWS) and co-selling models is a strong plus.
- Entrepreneurial mindset with experience in high-growth or startup environments.
- Strong business acumen and understanding of industry-specific challenges.
- Travel as required to client locations and industry and partner events.
- Work across flexible time zones and with global teams.

Why Join Celebal?
- Be part of a fast-growing digital transformation leader with global enterprise impact.
- Work with cutting-edge technologies and Fortune 500 clients.
- High-visibility role with cross-functional collaboration and leadership access.
- Flexible work culture, performance-driven growth, and global exposure.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

1. Understanding of various SDLC methodologies (Agile, Waterfall, V-model).
2. Experience in programming or scripting languages (Java, JavaScript, etc.).
3. Experience in designing automation frameworks (POM, Cucumber BDD, etc.).
4. Generate an automation test strategy for applications deployed on the cloud (public and private).
5. Automate scripts using a given Selenium-based framework, a Serenity-based BDD framework, or any open-source tools.
6. Working experience in testing RESTful web services and APIs using frameworks or tools like Postman, REST API clients, etc.
7. Experience with VAPT (vulnerability assessment and penetration testing).
8. Knowledge of performance testing using JMeter or other open-source tools.
9. Perform risk analysis of project and automation deliverables and work towards mitigation plans.
10. Analyze requirements, perform impact and regression analysis, and communicate with stakeholders on the need for changes in requirements.
11. Mentor and manage testing team members to meet project goals.
12. Experience with various testing types: functional, regression, UI/usability, integration, etc.
13. Awareness of Generative AI for testing.
14. Awareness of estimation techniques for automation testing effort (design and execution).
15. Experience in testing and test planning of CI/CD-based solutions through tools like Azure DevOps.
16. Hands-on experience using ADO or other test management tools.
17. Good exposure to the AWS cloud platform.
18. Well versed with CSV (Computer System Validation) and GxP processes.
19. Experience preparing and reviewing CSV documents such as Test Protocol, Test Plan, Test Strategy, IQ/OQ/PQ scripts, RTM, and Test Summary Report.
20. Experience working on CAPAs, incident management, FMEA risk assessment, functional risk assessment, etc.
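The Page Object Model (POM) mentioned in point 3 keeps each page's locators and actions in one class so tests stay readable when the UI changes. A rough sketch of the pattern, with a stubbed driver standing in for Selenium's WebDriver so the example is self-contained (all names here are illustrative):

```python
class FakeDriver:
    """Stand-in for a Selenium WebDriver, so the sketch runs without a browser."""
    def __init__(self):
        self.fields = {}

    def type(self, locator, text):
        self.fields[locator] = text

    def read(self, locator):
        return self.fields.get(locator, "")

class LoginPage:
    """Page Object: locators and actions for one page live in one class."""
    USERNAME = "input#username"
    PASSWORD = "input#password"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        return self  # returning self lets tests chain page actions

driver = FakeDriver()
page = LoginPage(driver).login("qa_user", "secret")
print(driver.read(LoginPage.USERNAME))  # → qa_user
```

With a real framework, only `FakeDriver` would change; test code against `LoginPage` stays the same, which is the point of the pattern.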

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Develop, optimize, and maintain scalable, responsive web applications using ReactJS, JavaScript, and TypeScript.
- Build and manage reusable components, hooks, and state management using Redux or the Context API.
- Write clean, maintainable, and efficient HTML and CSS (SASS/SCSS) for modern UI designs.
- Ensure cross-browser compatibility and mobile-first responsive design.
- Optimize application performance by implementing lazy loading, memoization, and code-splitting techniques.
- Work closely with backend teams (Node.js, APIs, GraphQL, REST) to integrate the frontend with backend services.
- Implement unit and integration testing using Jest, React Testing Library, or Cypress.
- Follow Agile methodologies and participate in sprint planning, code reviews, and retrospectives.
- Stay updated with the latest React ecosystem, frontend trends, and best practices.

Required Skills & Qualifications:
- 5-10 years of professional experience in ReactJS and frontend development.
- Strong expertise in TypeScript, JavaScript (ES6+), and modern frontend frameworks.
- Hands-on experience with Redux, the Context API, hooks, and state management.
- Proficiency in HTML5, CSS3, SCSS, and Bootstrap.
- Knowledge of RESTful APIs, GraphQL, and third-party API integration.
- Experience in performance optimization and handling large-scale applications.
- Familiarity with Webpack, Babel, ESLint, and modern frontend build tools is an added advantage.
- Experience with CI/CD pipelines and AWS cloud deployment.
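Memoization, one of the performance techniques this listing asks about, means caching a pure function's results so repeated calls with the same inputs skip the computation; it is the idea behind React's `React.memo` and `useMemo`. The concept is language-agnostic, so here is a small Python illustration using only the standard library (the `slow_square` name and the call counter are invented for the sketch):

```python
import functools

call_count = 0

@functools.lru_cache(maxsize=None)
def slow_square(n: int) -> int:
    """Pretend-expensive computation; the cache makes repeat calls free."""
    global call_count
    call_count += 1  # count how many times the body actually runs
    return n * n

print(slow_square(12), slow_square(12))  # same answer both times: 144 144
print(call_count)  # → 1  (the second call was served from the cache)
```

The React equivalents apply the same trade-off: spend memory on a cache to avoid recomputing (or re-rendering) when inputs have not changed.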

Posted 2 weeks ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Engineering Services Practitioner
Project Role Description: Assist with end-to-end engineering services to develop technical engineering solutions to solve problems and achieve business objectives. Solve engineering problems and achieve business objectives using scientific, socio-economic, and technical knowledge and practical experience. Work across structural and stress design, qualification, configuration, and technical management.
Must have skills: 5G Wireless Networks & Technologies
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Job Title: 5G Core Network Ops Senior Engineer

Summary: We are seeking a skilled 5G Core Senior Network Engineer to join our team. The ideal candidate will have extensive experience with Nokia 5G Core platforms and will be responsible for fault handling, troubleshooting, session and service investigation, configuration review, performance monitoring, security support, change management, and escalation coordination.

Roles and Responsibilities:
1. Fault Handling & Troubleshooting:
- Provide Level 1 (T1) support for 5G Core SA network functions in a production environment.
- Analyze alarms from NetAct/Mantaray or external monitoring tools.
- Correlate events using NetScout, Mantaray, and PM/CM data.
- Troubleshoot and resolve complex issues related to registration, session management, mobility, policy, charging, DNS, IPsec, and handover.
- Handle node-level failures (AMF/SMF/UPF/NRF/UDM/UDR/SDL/PCF/CHF/Flowone; Nokia EDR restarts, crashes, overload).
- Troubleshoot the 5G Core database, UDM, UDR, SDL, provisioning, Flowone, CHF (charging), and PCF (policy).
- Perform packet tracing (Wireshark) or core traces (PCAP, logs), and Nokia PCMD trace capture and analysis.
- Perform root cause analysis (RCA) and implement corrective actions.
- Handle escalations and provide timely resolution.
2. Session & Service Investigation:
- Trace subscriber issues (5G attach, PDU session, QoS).
- Use tools like EDR, Flow Tracer, and Nokia Cloud Operations Manager (COM).
- Correlate user-plane drops, abnormal releases, and bearer QoS mismatches.
- Work on preventive measures with the L1 team for health checks and backups.
3. Configuration and Change Management:
- Create MOPs for required changes; validate MOPs with Ops teams and stakeholders before rollout/implementation.
- Maintain detailed documentation of network configurations, incident reports, and operational procedures.
- Support software upgrades, patch management, and configuration changes.
- Maintain documentation for known issues, troubleshooting guides, and standard operating procedures (SOPs).
- Audit NRF/PCF/UDM configuration and databases.
- Validate policy rules, slicing parameters, and DNN/APN settings.
- Support integration of new 5G Core nodes and features into the live network.
4. Performance Monitoring:
- Use KPI dashboards (NetAct/NetScout) to monitor 5G Core KPIs, e.g. registration success rate, PDU session setup success, latency, throughput, and user-plane utilization.
- Proactively detect degrading KPI trends.
5. Security & Access Support:
- Application support for Nokia EDR and CrowdStrike.
- Assist with certificate renewals, firewall/NAT issues, and access failures.
6. Escalation & Coordination:
- Escalate unresolved issues to L3 teams, TAC, and OSS/Core engineering.
- Work with L3 and care teams for issue resolution.
- Ensure compliance with SLAs and contribute to continuous service improvement.
7. Reporting:
- Generate daily/weekly/monthly reports on network performance, incident trends, and SLA compliance.

Technical Experience and Professional Attributes:
- 5-9 years of experience in the telecom industry with hands-on experience in 5G Core.
- Mandatory experience with the Nokia 5G Core SA platform.
- Solid understanding of 5G packet core network protocols and interfaces such as N1, N2, N3, N6, N7, N8, GTP-C/U, and HTTPS, including the ability to trace and debug issues.
- Hands-on experience with 5GC components: AMF, SMF, UPF, NRF, AUSF, NSSF, UDM, PCF, CHF, UDR, SDL, Nokia EDR, provisioning, and Flowone.
- Troubleshooting and configuration experience with the 5G Core database, UDM, UDR, SDL, provisioning, Flowone, CHF (charging), and PCF (policy).
- In-depth understanding of 3GPP call flows for 5G SA and 5G NSA, call routing, number analysis, system configuration, data roaming, and knowledge of telecom standards, e.g. 3GPP, ITU-T, and ANSI.
- Familiarity with policy control mechanisms, QoS enforcement, and charging models (event-based, session-based).
- Hands-on experience with Diameter, HTTP/2, REST APIs, and SBI interfaces.
- Strong analytical and troubleshooting skills.
- Proficiency in monitoring and tracing tools (NetAct, NetScout, PCMD tracing) and log management systems (e.g. Prometheus, Grafana).
- Knowledge of network protocols and security (TLS, IPsec).
- Excellent communication and documentation skills.

Educational Qualification: BE/BTech; 15 years full time education

Additional Information:
- Nokia certifications (e.g. NCOM, NCS, NSP, Kubernetes).
- Experience with the Nokia 5G Core platform, NCOM, NCS, Nokia private cloud and public cloud (AWS preferred), and cloud-native environments (Kubernetes, Docker, CI/CD pipelines).
- Cloud certifications (AWS) / experience on AWS Cloud.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS, Python, Spark, and Kafka for ETL)!

Responsibilities
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud services such as Amazon AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, assess the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!
Minimum Qualifications
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of the Change & Incident Management process.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
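The Extract-Transform-Load flow this role centres on can be sketched in miniature with plain Python; here `csv` and an in-memory `sqlite3` database stand in for an S3 source and a Redshift target, so every name and the threshold rule are illustrative, not from the posting:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (stand-in for reading a file landed in S3).
raw = "id,amount\n1,10\n2,25\n3,5\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and keep only orders at or above a threshold.
cleaned = [(int(r["id"]), int(r["amount"])) for r in rows if int(r["amount"]) >= 10]

# Load: write into a warehouse table (stand-in for Redshift).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, amount INTEGER)")
db.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)

total = db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # → 35
```

At production scale the same three stages map onto the stack the listing names: Glue or Spark jobs for extract/transform, Kafka for streaming input, and Redshift for the load step.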

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Amazon Web Services (AWS)
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Role Overview: We are seeking an experienced Lead Cloud DevOps Engineer to drive cloud infrastructure, automation, and security initiatives for a strategic banking client in the Netherlands. The ideal candidate will have deep expertise in AWS, Infrastructure as Code (IaC), certificate and identity management, and DevOps practices. This role requires both hands-on execution and technical leadership to guide implementation teams and ensure alignment with enterprise standards.

Must-Have Skills:
- 6-8+ years of experience in AWS cloud architecture and engineering
- Expertise in Infrastructure as Code using Terraform or CloudFormation
- Proven experience in certificate management (e.g. AWS ACM, Secrets Manager)
- AWS cloud certifications (Solutions Architect Associate/Professional preferred)
- Strong knowledge of logging and monitoring with AWS CloudWatch and CloudTrail
- Experience leading small DevOps/infra teams and setting technical direction

Good-to-Have Skills:
- Deep understanding of SSO protocols (OpenID Connect, SAML)
- Experience configuring SSO + MFA using Microsoft Entra ID (formerly Azure AD)
- Hands-on with app registrations, permissions, and certificate rotation in Azure
- Proficiency in Azure DevOps: YAML-based CI/CD pipelines and repo branching strategies
- Python scripting for automation
- Integrating and resolving SonarCloud tasks (SAST)

Key Responsibilities:
- Lead the design and delivery of secure, scalable AWS infrastructure
- Define and enforce DevOps best practices across project teams
- Architect and automate IaC deployments and certificate lifecycle management
- Guide the setup of secure authentication flows using SSO and MFA
- Review and optimize CI/CD pipelines and ensure DevSecOps integration
- Act as a technical mentor and cloud SME, collaborating with cross-functional teams

Preferred Certifications:
- AWS Certified Solutions Architect - Professional
- Terraform Associate (optional)
- Azure Security Engineer Associate / Identity and Access Administrator (good to have)

Additional Information:
- The candidate should have a minimum of 5 years of experience in Amazon Web Services (AWS).
- This position is based at our Bengaluru office.
- 15 years of full time education is required.

Qualification: 15 years full time education

Posted 2 weeks ago

Apply

4.0 - 7.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Job Description
We are looking for an experienced engineer to design and orchestrate agentic workflows leveraging LangChain and LangGraph for AI-powered solutions. The role involves building reusable components, ensuring scalability, and integrating context-aware logic for data transformation in real-time marketing environments.

Your Impact
- Build and orchestrate agentic workflows using LangChain and LangGraph.
- Implement logic for context-aware agents to handle marketing-specific data transformation tasks.
- Develop reusable components for data ingestion, transformation, enrichment, and validation.
- Ensure reliability, scalability, and traceability across agent workflows.

Qualifications
Your Skills & Experience:
- Strong hands-on experience with LangChain and LangGraph.
- Proficient in Python, FastAPI, and data transformation techniques.
- Experience with IDEs like VS Code and PyCharm and relevant Python libraries.
- Proficiency with AWS cloud services.
- Understanding of agent orchestration, memory management, and tools for LLM-powered workflows.
- Exposure to vector databases (e.g. FAISS, Chroma) and embedding-based retrieval.
- Prior experience with GenAI, Copilot, or RAG-based systems.
- Familiarity with marketing data models and real-time customer journeys is a plus.

Set Yourself Apart With
- Proven expertise in implementing solutions using the LangChain and LangGraph frameworks.
- Strong proficiency in Python programming, the FastAPI framework, and advanced data transformation techniques.
- Experience using development environments like VS Code and PyCharm, with deep knowledge of essential Python libraries.

A Tip from the Hiring Manager: Ideal candidates will have prior experience in resource management from a consulting/technology firm.

Additional Information
- Gender-neutral policy
- 18 paid holidays throughout the year
- Generous parental leave and new-parent transition program
- Flexible work arrangements
- Employee Assistance Programs to support your wellness and well-being

Company Description
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses by designing the products and services their customers truly value.
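The agentic workflow orchestration this role describes can be illustrated without the LangGraph API itself: a graph of nodes that each read and update shared state, executed in a defined order. This plain-Python toy (all function names invented for illustration, and a simple linear pipeline rather than LangGraph's `StateGraph`) mirrors the ingestion, transformation, and validation components mentioned above:

```python
# Each node takes the shared state dict, mutates it, and returns it.
def ingest(state):
    """Pull raw records (hard-coded here; an agent might call an API)."""
    state["records"] = [" Alice ", "BOB", ""]
    return state

def transform(state):
    """Normalize: strip whitespace, drop empties, title-case names."""
    state["records"] = [r.strip().title() for r in state["records"] if r.strip()]
    return state

def validate(state):
    """Flag whether every record survived cleaning in a usable form."""
    state["valid"] = all(r.isalpha() for r in state["records"])
    return state

# Edges: a fixed execution order (LangGraph lets these branch conditionally).
PIPELINE = [ingest, transform, validate]

def run(state=None):
    state = state or {}
    for node in PIPELINE:
        state = node(state)
    return state

result = run()
print(result["records"], result["valid"])  # → ['Alice', 'Bob'] True
```

Frameworks like LangGraph add what this sketch lacks: conditional edges, per-node retries, checkpointed state, and LLM-driven decisions about which node runs next.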

Posted 2 weeks ago

Apply

4.0 - 7.0 years

15 - 20 Lacs

Gurugram

Work from Office

Job Description We are looking for an experienced engineer to design and orchestrate agentic workflows leveraging LangChain and LangGraph for AI-powered solutions The role involves building reusable components, ensuring scalability, and integrating context-aware logic for data transformation in real-time marketing environments, Your Impact Build and orchestrate agentic workflows using LangChain and LangGraph, Implement logic for context-aware agents to handle marketing-specific data transformation tasks, Develop reusable components for data ingestion, transformation, enrichment, and validation, Ensure reliability, scalability, and traceability across agent workflows Qualifications Your Skills & Experience: Strong hands-on experience with LangChain and LangGraph, Proficient in Python, FastAPI, and data transformation techniques, Experience with IDEs like VSCode and PyCharm and relevant Python libraries, Proficiency with AWS Cloud Services, Understanding of agent orchestration, memory management, and tools for LLM-powered workflows, Exposure to vector databases (e-g , FAISS, Chroma) and embedding-based retrieval, Prior experience with GenAI, CoPilot, or RAG-based systems, Familiarity with marketing data models and real-time customer journeys is a plus Set Yourself Apart With Proven expertise in implementing solutions using LangChain and LangGraph frameworks, Strong proficiency in Python programming, FastAPI framework, and advanced data transformation techniques, Experienced in using development environments like VSCode and PyCharm, with deep knowledge of essential Python libraries, A Tip from the Hiring Manager: Ideal candidates will have prior experience in resource management from a consulting/technology firm, Additional Information Gender-Neutral Policy 18 paid holidays throughout the year, Generous parental leave and new parent transition program Flexible work arrangements Employee Assistance Programs to help you in wellness and well being Company Description 
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.
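The node-based, context-aware workflow this role describes can be sketched in plain Python. LangGraph itself models workflows as graphs of nodes that share state; the sketch below uses the same idea without the library, and all node names, fields, and transformation rules are illustrative assumptions, not taken from the posting:

```python
# Minimal sketch of a node-based agent workflow over shared state, in the
# spirit of LangGraph's graph-of-nodes model (plain Python; node names and
# transformation rules are hypothetical).

def ingest(state):
    # Pull raw marketing records into the shared state.
    state["records"] = [{"email": " ALICE@EXAMPLE.COM ", "spend": "120.5"}]
    return state

def transform(state):
    # Context-aware normalization: trim/lower-case emails, cast spend to float.
    state["records"] = [
        {"email": r["email"].strip().lower(), "spend": float(r["spend"])}
        for r in state["records"]
    ]
    return state

def validate(state):
    # Keep only records that pass a basic validation rule.
    state["valid"] = [r for r in state["records"] if "@" in r["email"]]
    return state

def run_workflow(state, nodes):
    # Orchestrate nodes in order, threading the shared state through each.
    for node in nodes:
        state = node(state)
    return state

result = run_workflow({}, [ingest, transform, validate])
print(result["valid"])  # [{'email': 'alice@example.com', 'spend': 120.5}]
```

A real implementation would replace the linear `run_workflow` loop with LangGraph's conditional edges, so an agent can route records to different transformation nodes based on context.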

Posted 2 weeks ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

bengaluru

Work from Office

Project Role: Engineering Services Practitioner
Project Role Description: Assist with end-to-end engineering services to develop technical engineering solutions to solve problems and achieve business objectives. Solve engineering problems and achieve business objectives using scientific, socio-economic, technical knowledge and practical experience. Work across structural and stress design, qualification, configuration and technical management.
Must have skills: 5G Wireless Networks & Technologies
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Job Title: 5G Core Network Ops Senior Engineer

Summary: We are seeking a skilled 5G Core Senior Network Engineer to join our team. The ideal candidate will have extensive experience with Nokia 5G Core platforms and will be responsible for fault handling, troubleshooting, session and service investigation, configuration review, performance monitoring, security support, change management, and escalation coordination.

Roles and Responsibilities:
1. Fault Handling & Troubleshooting: Provide Level 1 (T1) support for 5G Core SA network functions in a production environment. Analyze alarms from NetAct/Mantaray or external monitoring tools. Correlate events using NetScout, Mantaray, and PM/CM data. Troubleshoot and resolve complex issues related to registration, session management, mobility, policy, charging, DNS, IPsec and handover. Handle node-level failures (AMF/SMF/UPF/NRF/UDM/UDR/SDL/PCF/CHF/FlowOne, Nokia EDR restarts, crashes, overload). Handle troubleshooting on the 5G Core database, UDM, UDR, SDL, provisioning, FlowOne, CHF (charging), and PCF (policy). Perform packet tracing (Wireshark) or core tracing (PCAP, logs), plus Nokia PCMD trace capture and analysis. Perform root cause analysis (RCA) and implement corrective actions. Handle escalations and provide timely resolution.
2. Session & Service Investigation: Trace subscriber issues (5G attach, PDU session, QoS). Use tools like EDR, Flow Tracer, and Nokia Cloud Operations Manager (COM). Correlate user-plane drops, abnormal releases, and bearer QoS mismatches. Work on preventive measures with the L1 team for health checks and backups.
3. Configuration and Change Management: Create an MOP for required changes and validate it with Ops teams and stakeholders before rollout/implementation. Maintain detailed documentation of network configurations, incident reports, and operational procedures. Support software upgrades, patch management, and configuration changes. Maintain documentation for known issues, troubleshooting guides, and standard operating procedures (SOPs). Audit NRF/PCF/UDM etc. configuration and database. Validate policy rules, slicing parameters, and DNN/APN settings. Support integration of new 5G Core nodes and features into the live network.
4. Performance Monitoring: Use KPI dashboards (NetAct/NetScout) to monitor 5G Core KPIs, e.g. registration success rate, PDU session setup success, latency, throughput, and user-plane utilization. Proactively detect degrading KPI trends.
5. Security & Access Support: Provide application support for Nokia EDR and CrowdStrike. Assist with certificate renewal, firewall/NAT issues, and access failures.
6. Escalation & Coordination: Escalate unresolved issues to L3 teams, TAC, and OSS/Core engineering. Work with L3 and care teams for issue resolution. Ensure compliance with SLAs and contribute to continuous service improvement.
7. Reporting: Generate daily/weekly/monthly reports on network performance, incident trends, and SLA compliance.

Technical Experience and Professional Attributes: 5-9 years of experience in the telecom industry with hands-on experience in 5G Core. Mandatory experience with the Nokia 5G Core SA platform. Solid understanding of 5G Core packet core network protocols and interfaces such as N1, N2, N3, N6, N7, N8, GTP-C/U, and HTTPS, including the ability to trace and debug issues. Hands-on experience with 5GC components: AMF, SMF, UPF, NRF, AUSF, NSSF, UDM, PCF, CHF, UDR, SDL, Nokia EDR, provisioning and FlowOne. Hands-on troubleshooting and configuration experience on the 5G Core database, UDM, UDR, SDL, provisioning, FlowOne, CHF (charging), and PCF (policy). In-depth understanding of 3GPP call flows for 5G SA and 5G NSA, call routing, number analysis, system configuration, call flow, data roaming, and configuration, plus knowledge of telecom standards, e.g. 3GPP, ITU-T and ANSI. Familiarity with policy control mechanisms, QoS enforcement, and charging models (event-based, session-based). Hands-on experience with Diameter, HTTP/2, REST APIs, and SBI interfaces. Strong analytical and troubleshooting skills. Proficiency in monitoring and tracing tools (NetAct, NetScout, PCMD tracing) and log management systems (e.g., Prometheus, Grafana). Knowledge of network protocols and security (TLS, IPsec). Excellent communication and documentation skills.

Educational Qualification: BE/BTech, 15 years full time education

Additional Information: Nokia certifications (e.g., NCOM, NCS, NSP, Kubernetes). Experience with the Nokia 5G Core platform, NCOM, NCS, Nokia private cloud and public cloud (AWS preferred), and cloud-native environments (Kubernetes, Docker, CI/CD pipelines). Cloud certifications (AWS) / experience on AWS Cloud.

Qualification: 15 years full time education
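The performance-monitoring duty above comes down to computing success-rate KPIs from raw counters and flagging degrading trends before they become incidents. A minimal sketch in plain Python (the counter values, the 95% threshold, and the three-interval window are illustrative assumptions, not figures from the posting):

```python
# Sketch: compute a registration success rate KPI from counters and flag a
# degrading trend across reporting intervals. Threshold and window size
# are hypothetical tuning values.

def registration_success_rate(attempts, successes):
    """KPI as a percentage; None when there was no traffic in the interval."""
    if attempts == 0:
        return None
    return 100.0 * successes / attempts

def degrading(kpi_series, threshold=95.0, window=3):
    """True if the KPI stayed below threshold for the last `window` intervals."""
    recent = [k for k in kpi_series[-window:] if k is not None]
    return len(recent) == window and all(k < threshold for k in recent)

hourly = [registration_success_rate(a, s)
          for a, s in [(1000, 990), (1000, 940), (1000, 930), (1000, 910)]]
print(degrading(hourly))  # last three intervals sit below 95% -> True
```

In practice the counters would come from NetAct/NetScout exports or a Prometheus query rather than hard-coded tuples.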

Posted 2 weeks ago

Apply

12.0 - 18.0 years

30 - 45 Lacs

pune, bengaluru

Hybrid

About the Role
We are seeking a highly skilled Cloud Solution Architect to define, design, and drive innovative cloud strategies and solutions. The role requires a strong background in cloud architecture, AI-driven cloud capabilities, and pre-sales expertise. As a trusted advisor, you will collaborate with stakeholders, hyperscaler partners, and customers to deliver scalable, secure, and future-ready cloud solutions that align with business objectives.

Key Responsibilities
Develop and execute a cloud technology strategy aligned with business goals and the innovation roadmap. Architect and define multi-cloud, hybrid, edge, and sovereign cloud solutions, ensuring scalability, compliance, and security. Lead the development of an AI-powered cloud portfolio, enabling automation, predictive operations, and intelligent scaling. Collaborate with hyperscalers (AWS, Azure, Google Cloud) and sovereign cloud partners to co-create market-leading solutions. Partner with sales and pre-sales teams to design cloud architectures that address customer needs and business requirements. Translate complex technical concepts into clear business narratives for C-level stakeholders and executive leadership. Deliver compelling executive-level presentations, proposals, and solution roadmaps. Evaluate and recommend emerging cloud technologies and investment opportunities. Provide thought leadership through whitepapers, blogs, and external industry engagements. Identify and build strategic partnerships to strengthen the cloud and AI services portfolio. Engage in government and industry initiatives such as IPCEI, GovTech, and Bitkom.

Must-Have Qualifications
Bachelor's/Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in cloud architecture, cloud strategy, or related areas. Mandatory experience in pre-sales, solution design, and customer engagement. Proven success in defining and executing cloud technology strategies. Hands-on experience in developing AI-driven capabilities for cloud infrastructure (automation, predictive analytics, intelligent scaling). Cloud certifications (at least one of the following): AWS Certified Solutions Architect - Professional, Microsoft Azure Solutions Architect Expert, Google Cloud Professional Cloud Architect. Strong knowledge of multi-cloud, edge computing, and cloud sovereignty principles. Excellent presentation and PowerPoint skills with experience engaging executive stakeholders. Strong interpersonal, communication, and collaboration skills across technical and business teams.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 9 Lacs

bengaluru

Work from Office

About this opportunity: As a Solution Integrator at Ericsson, you will play a key role in ensuring the stability, performance, and reliability of critical applications. You will support a variety of systems, drive root cause analysis, and collaborate across teams to enhance service delivery in a fast-paced telecom environment.

What You Will Do
Provide L2/L3 application support by managing incidents, service requests, and operational tasks. Perform root cause analysis and troubleshooting to resolve application and system issues effectively. Execute operational acceptance testing (OAT) to ensure systems meet performance and stability standards. Manage application operations across platforms including Windows, Linux, iOS, and Android. Work with various databases and tools, ensuring seamless application performance and monitoring. Utilize your expertise in Kubernetes and the ELK stack to improve application deployment and observability. Communicate clearly in written and spoken English to collaborate with internal teams and external stakeholders. Apply your telecom domain knowledge, particularly in RAN (Radio Access Network) and Core Network, to support Ericsson's technology landscape. Engage with cloud platforms such as Microsoft Azure and AWS; prior experience is a plus.

What You Bring
Bachelor's degree in Computer Science, Information Technology, or a related field. 2-4 years of hands-on experience in application support, troubleshooting, and operational maintenance. Strong working knowledge of Kubernetes and the ELK stack (Elasticsearch, Logstash, Kibana). Experience with cloud platforms like MS Azure and Amazon AWS. Solid understanding of telecom systems, especially in the RAN and Core Network domains. Familiarity with ITIL frameworks; ITIL certification is preferred. Exposure to MongoDB Atlas or similar NoSQL databases is desirable. Proficiency in scripting or coding is an added advantage.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

12 - 17 Lacs

mumbai, chennai, bengaluru

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role
Develop, design and implement Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC). Full life cycle Oracle EPM Cloud implementation. Creating forms, OIC integrations, and complex business rules. Understanding dependencies and interrelationships between various components of Oracle EPM Cloud. Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities where it will enhance the current process within the entire Financials ecosystem. Collaborate with FP&A to facilitate the planning, forecasting and reporting process for the organization. Create and maintain system documentation, both functional and technical.

Your Profile
Experience in implementation of EDMCS modules. Proven ability to collaborate with internal clients in an agile manner, leveraging design thinking approaches. Collaborate with FP&A to facilitate the planning, forecasting and reporting process for the organization. Create and maintain system documentation, both functional and technical. Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Location: Bengaluru, Chennai, Mumbai, Pune, Hyderabad
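The preferred AWS skills in the profile above (Lambda, EventBridge) usually reduce to writing small event handlers. A minimal sketch of an AWS Lambda handler consuming an EventBridge-style event (the `detail` fields and the action names are hypothetical; this illustrates the handler shape, not the team's actual integration):

```python
# Sketch of an AWS Lambda handler for an EventBridge-style event.
# Lambda invokes handler(event, context); the "detail" payload fields
# here are made up for illustration.
import json

def handler(event, context):
    # EventBridge delivers the custom payload under the "detail" key.
    detail = event.get("detail", {})
    entity = detail.get("entity", "unknown")
    action = detail.get("action", "noop")
    # A real function would call downstream services (e.g. EPM REST APIs);
    # here we just return a structure Lambda can serialize as the result.
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": f"{action}:{entity}"}),
    }

event = {"detail": {"entity": "cost-center-42", "action": "sync"}}
print(handler(event, None)["body"])  # {"processed": "sync:cost-center-42"}
```

Wired to an EventBridge rule, the same handler would fire automatically whenever a matching event (say, a metadata change) is published to the bus.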

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

india, bengaluru

Work from Office

Job Title: Staff UI Developer (ReactJS + AWS)

Role Overview: This role is ideal for someone who thrives at the intersection of UI/UX design, cloud integration, and secure development practices. You will help create responsive, scalable front-end applications that provide real-time threat visibility and control to our global customer base.

About the role
Design and implement intuitive, performant, and secure UI components using React and modern JavaScript frameworks. Build interfaces that help visualize threat detections, incident timelines, and investigation workflows in our platform. Integrate with backend services through RESTful and GraphQL APIs, ensuring robustness and responsiveness. Work with OAuth 2.0 / OpenID Connect for secure user authentication and integrate with IAM platforms. Collaborate closely with backend engineers, security analysts, and product managers to deliver features aligned with cybersecurity use cases. Ensure code adheres to secure coding standards, and proactively identify and fix UI-level vulnerabilities (e.g., XSS, CSRF). Leverage AWS services (such as S3, CloudFront, Cognito, Lambda) for front-end deployment and scaling. Participate in design reviews, threat modeling sessions, and agile ceremonies.

About You
5-8 years of experience in UI development with expertise in React, JavaScript, Node.js, HTML5, and CSS3. Strong integration experience with REST APIs and GraphQL. Hands-on experience with OAuth2, OpenID Connect, and Identity and Access Management (IAM) solutions. Proficiency in AWS cloud services related to front-end and application security. Demonstrated ability to write secure code and remediate common vulnerabilities (e.g., input validation, secure session handling). Good understanding of JSON, data formats, and interaction with backend services and databases. Excellent debugging, problem-solving, and collaboration skills.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 6 Lacs

hyderabad

Work from Office

1. Understanding of various SDLC models (Agile, Waterfall, V-model).
2. Analyze requirements, perform impact analysis and regression analysis, and communicate with stakeholders on the need for changes in requirements.
3. Strong experience in manual and exploratory testing.
4. Good experience with various testing types like functional, regression, UI/usability, and integration testing.
5. Ideas on Generative AI for testing.
6. Good experience in various testing techniques.
7. Experience in testing and test planning of CI/CD-based solutions through tools like Azure DevOps.
8. Experience in programming (Java/Python, etc.).
9. Experience in designing automation frameworks (POM, Cucumber BDD, etc.).
10. Automate scripts using a given Selenium-based and/or Playwright-based framework with BDD implementation.
11. Working experience with or knowledge of any low-code automation tools.
12. Working experience in testing RESTful web services and APIs using frameworks or tools like Postman, REST API clients, etc.
13. Hands-on experience using ADO or other test management tools.
14. Good exposure to the AWS cloud platform.
15. Experience working in GxP-related implementations.
16. Well versed in GxP terminology, with hands-on experience in IQ/OQ protocols, test plan documents, RTM, OQ execution, TSR, etc.
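The Page Object Model (POM) named in item 9 separates locators and page actions from test logic, so tests talk to pages rather than raw driver calls. A minimal sketch in plain Python with a stubbed driver standing in for Selenium or Playwright (all class names, locators, and behaviors here are illustrative):

```python
# Sketch of the Page Object Model: the page class owns locators and actions;
# the test never touches the driver directly. StubDriver is a stand-in for
# a real Selenium/Playwright driver; locators and behavior are hypothetical.

class StubDriver:
    """Pretend browser: records actions and serves canned page state."""
    def __init__(self):
        self.actions = []
        self.logged_in = False

    def fill(self, locator, value):
        self.actions.append(("fill", locator, value))

    def click(self, locator):
        self.actions.append(("click", locator))
        if locator == "#login-btn":
            self.logged_in = True

class LoginPage:
    # Locators live in one place, so a UI change means one edit.
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#login-btn"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill(self.USERNAME, user)
        self.driver.fill(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

driver = StubDriver()
LoginPage(driver).login("qa_user", "secret")
print(driver.logged_in)  # True
```

With Cucumber/BDD on top, each step definition would call page methods like `LoginPage.login`, keeping Gherkin steps free of locators.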

Posted 2 weeks ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

gurugram

Hybrid

Software Engineer II (Python)

As a Software Engineer II, you are an experienced builder who can work independently across the full stack to deliver production-grade software. You take ownership of features from ideation to deployment, and you play a key role in maintaining the health and reliability of the systems you build. You understand the "why" behind the work, connecting technical decisions to customer outcomes and business value. You demonstrate good judgment when working through ambiguity and elevate the quality of the systems and team around you. You're responsible for designing and delivering moderately complex features and services, often navigating evolving requirements and unclear boundaries. You understand how services fit together and are comfortable working across APIs, databases, cloud infrastructure, and front-end components. You contribute actively to design discussions, identify opportunities for simplification, and make pragmatic choices that balance speed and sustainability. You support a culture of code quality, mentorship, and operational ownership. You are expected to use AI-powered development tools to improve your productivity, especially for repetitive, boilerplate, or testing-related tasks. You incorporate these tools thoughtfully and are responsible for validating the accuracy and security of the output. You are fully accountable for ensuring your code is thoroughly tested and production-ready, including unit, integration, end-to-end, and any needed manual validation. You help maintain the health of our pipelines, observability tooling, and engineering process. You also begin to work more directly with product and business stakeholders, particularly on features or flows that touch customer-facing tools or internal operations. You help clarify scope, translate user needs into technical solutions, and provide insight into trade-offs and timelines.
You represent engineering in cross-functional conversations and take responsibility for delivering outcomes, not just output. What you’ll do: Own end-to-end delivery of features or services, including design, implementation, testing, deployment, and operations. Use AI-based development tools to improve speed, reduce boilerplate, and boost development quality. Collaborate with product, design, and business stakeholders to scope and prioritize work that delivers impact. Identify edge cases and failure scenarios, and build systems that handle them gracefully. Participate in and lead technical discussions, providing feedback on design and implementation. Write clear, well-tested, and maintainable code — and help others do the same. Take full responsibility for testing your changes at all levels (unit, integration, e2e, and manual). Monitor and maintain services in production, contributing to on-call rotations and incident response. Continuously improve code quality, developer tooling, CI/CD pipelines, and engineering processes. Mentor early-career engineers and help raise the team’s technical bar. Requirements: Proficient in one or more modern languages and frameworks (e.g., TypeScript, Java, Python, PHP), preferably Python. Solid working knowledge of AWS; expected to independently build, deploy, and debug applications using common services, and contribute to infrastructure and deployment pipelines. Experience with distributed systems, RESTful APIs, and full-stack development. Familiarity with CI/CD pipelines, monitoring tools, and DevOps best practices. Ability to write high-quality, testable, and well-documented code. Comfort using AI development tools thoughtfully and critically. Strong collaboration skills and ability to work independently in a fast-paced environment.
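The testing ownership this role stresses, from unit through manual validation, starts with small units that are fully assertable. A minimal illustration in Python (the pricing function and its discount rule are invented for the example, not part of the role):

```python
# Sketch: a small, well-specified unit covered by happy-path, edge, and
# failure-case tests, illustrating "your code is thoroughly tested".
# The discount rule is hypothetical.

def order_total(subtotal, coupon=None):
    """Apply a flat 10% discount for coupon 'SAVE10'; reject negatives."""
    if subtotal < 0:
        raise ValueError("subtotal must be non-negative")
    if coupon == "SAVE10":
        return round(subtotal * 0.9, 2)
    return round(subtotal, 2)

# Unit tests: happy path, coupon path, zero edge case, failure case.
assert order_total(100.0) == 100.0
assert order_total(100.0, "SAVE10") == 90.0
assert order_total(0.0, "SAVE10") == 0.0
try:
    order_total(-1.0)
except ValueError:
    pass
else:
    raise AssertionError("negative subtotal must raise")
```

The same discipline scales up: integration tests exercise the function behind its API, and end-to-end tests exercise the deployed flow.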

Posted 2 weeks ago

Apply

9.0 - 14.0 years

20 - 35 Lacs

kolkata, hyderabad, bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS, Python, Spark, Kafka for ETL)!

Responsibilities
Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services like Amazon AWS. Build data pipelines by building ETL processes (Extract-Transform-Load). Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems. Understand current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way. Coordinate with release management and other supporting teams to deploy changes in the production environment.

Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience in implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks is an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering and Cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
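The Extract-Transform-Load responsibility above can be sketched end to end in plain Python; a production version would run the same three stages as a Glue job or Spark pipeline, and the record schema and validation rule below are invented for illustration:

```python
# Minimal Extract-Transform-Load sketch in plain Python. In production the
# same stages would run on Glue/Spark with S3 and Redshift as endpoints;
# the schema and filter rule here are hypothetical.
import csv, io

RAW = "user_id,country,amount\n1,IN,250\n2,US,abc\n3,IN,75\n"

def extract(raw_csv):
    # Extract: parse the raw source into dict records.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records):
    # Transform: cast amounts to float, drop rows that fail validation.
    out = []
    for r in records:
        try:
            out.append({**r, "amount": float(r["amount"])})
        except ValueError:
            pass  # a real pipeline would route these to a dead-letter store
    return out

def load(records, warehouse):
    # Load: append clean rows to the target (a list standing in for Redshift).
    warehouse.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded)  # 2 rows survive; the non-numeric amount is dropped
```

Kafka would slot in ahead of `extract` as the streaming source, with each consumed batch flowing through the same transform and load stages.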

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

kolkata, hyderabad, bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS, Python, Spark, Kafka for ETL)!

Responsibilities
Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services like Amazon AWS. Build data pipelines by building ETL processes (Extract-Transform-Load). Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data. Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems. Understand current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security. Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way. Coordinate with release management and other supporting teams to deploy changes in the production environment.

Qualifications we seek in you!
Minimum Qualifications
Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience in implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks is an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance. Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering and Cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

10 - 12 Lacs

chennai

Work from Office

Roles and responsibilities:
Considered a subject matter expert within the discipline. Solves complex problems; takes a broad perspective to identify innovative solutions. Can work either independently or in teams; requests guidance in complex situations or when needed. Interprets challenges and recommends best practices to improve processes. Capacity to lead functional teams or projects to solve complex problems and deliver solutions. Communicates difficult concepts and negotiates with others to converge on goal-centric points of view. Provides resolution support for a wide array of issues that are complex in scope. Contributes to departmental business planning and solution design. Uses an expert-level cybersecurity knowledge base to complete tasks. Intrinsic understanding of software development life cycles. Excellent oral and written communication skills. Understanding of security-by-design principles, architecture concepts, and security frameworks (NIST, PCI, OWASP, etc.). Knowledge of current and emerging security technologies, threats, and techniques for exploiting security vulnerabilities in code or applications.

Requirements:
6+ years of experience working with systems deployed on AWS. 4+ years of technical experience in incident management for AWS Cloud solutions. 1+ years of experience with AWS incident detection and response. Demonstrated experience using Splunk for incident management and processes supported by Okta CIAM, PhishER, PagerDuty, Imperva, CrowdStrike, AWS GuardDuty, Defender for Cloud Apps, etc. Incident management (2+ years minimum). Risk management techniques (2+ years minimum). Vulnerability management. Web application firewalls such as Imperva. As a subject matter expert or stakeholder, has previously supported information security audits in any of the following frameworks or regulations: PCI DSS, NIST, SOC 1 or 2, ISO 27001, Sarbanes-Oxley (SOX) or HITRUST. Experience in analyzing threats to cloud and application components, such as findings from security assessments.

Nice to have:
Familiarity with Jira, GitHub, Okta, WordPress, Qualys VMDR, Jenkins, Rancher, Terraform, Snyk & Contrast. Familiarity with some of the following concepts: SAST (Static Application Security Testing), DAST (Dynamic Application Security Testing), SCA (Software Composition Analysis), SBOM (Software Bill of Materials), image scanning, SOAR (Security Orchestration, Automation and Response), ideally with experience in IaC (Infrastructure as Code), threat modeling, pen testing (web app, mobile, external), CSA (Cloud Security Assessment). Familiarity with Java (including npm and Maven), Docker & Kubernetes.

Posted 2 weeks ago


12.0 - 15.0 years

25 - 37 Lacs

Chennai

Work from Office

Role and Responsibilities:
- Conduct security risk assessments, monitor organizational compliance, and ensure effective prioritization and remediation of cyber risks within agreed SLAs.
- Identify cloud-related risks, assess business impacts, and develop actionable mitigation strategies aligned with governance and control measures.
- Perform audits, manage gap analyses, and ensure compliance with standards such as ISO/IEC 27001, PCI DSS, and NIS 2, including readiness and monitoring activities.
- Develop and maintain a corporate-wide business continuity plan addressing recovery and emergency response, ensuring alignment with business and regulatory requirements.
- Create, implement, and maintain security policies, procedures, and awareness training programs to enhance the organization's security posture.
- Collaborate with stakeholders, including Legal and third-party vendors, to manage security requirements, regulatory compliance, and the alignment of operational decisions with policies.
- Facilitate ongoing improvement by analyzing risks, regulatory updates, and stakeholder feedback, ensuring effective communication and presentation of security findings.

Skills & Experience:
- Extensive experience in security governance, risk, and compliance, including auditing IT systems, leading ISO 27001 certification processes, and conducting security risk assessments.
- Proven expertise in business continuity, cloud security, GRC tools, and virtualization technologies, with the ability to share technical knowledge across teams.
- Strong management and leadership skills, adept at setting goals, delegating tasks, and ensuring objectives are met in dynamic, deadline-oriented environments.
- Exceptional communication and interpersonal skills, capable of interacting with diverse groups, including executives and technical teams, and delivering effective presentations and training.
- Professional certifications such as CISSP, CISM, CRISC, or ISO 27001 Lead Implementer, combined with a relevant degree or equivalent experience in information security.
- Demonstrated ability to work independently with a proactive, results-driven mindset, while fostering team collaboration and maintaining focus on service delivery.
- Strategic thinker with a global perspective, an innovative approach, and the technical depth to lead discussions on cloud application security technologies and enterprise solutions.

Posted 2 weeks ago


8.0 - 13.0 years

25 - 40 Lacs

Chennai

Work from Office

Roles and Responsibilities:
- Work with clients to understand their data, and build data structures and pipelines based on that understanding.
- Work on the application end to end, collaborating with UI and other development teams.
- Work with various cloud providers such as Azure and AWS.
- Engineer data using the Hadoop/Spark ecosystem.
- Design, build, optimize, and support new and existing data pipelines.
- Orchestrate jobs using tools such as Oozie, Airflow, etc.
- Develop programs for cleaning and processing data.
- Build the data pipelines to migrate and load data into HDFS, either on-prem or in the cloud.
- Develop data ingestion/processing/integration pipelines effectively.
- Create Hive data structures and metadata, and load data into data lakes / big data warehouse environments.
- Optimize (performance-tune) data pipelines to minimize cost.
- Keep code version control and the Git repository up to date.
- Be able to explain the data pipeline to internal and external stakeholders.
- Build and maintain CI/CD for the data pipelines.

Preferred Qualifications:
- Bachelor's degree in computer science or a related field.
- Minimum of 5+ years of working experience with Spark and the Hadoop ecosystem.
- Minimum of 4+ years of working experience designing data streaming pipelines.
- Minimum of 3+ years of experience with NoSQL and Spark Streaming.
- Proven experience with big data ecosystem tools such as Sqoop, Spark, SQL, APIs, Hive, Oozie, Airflow, etc.
- Solid experience in all phases of the SDLC, with 10+ years of experience (plan, design, develop, test, release, maintain, and support).
- Hands-on experience using Azure's data engineering stack.
- Should have implemented projects using programming languages such as Scala or Python.
- Working experience with complex SQL data-merging techniques such as windowing functions.
- Hands-on experience with on-prem distributions such as Cloudera/Hortonworks/MapR.
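As an illustration of the SQL windowing techniques this listing asks about, here is a minimal sketch using Python's built-in sqlite3 module (SQLite supports window functions from version 3.25). The table, columns, and data are invented for the example; a real pipeline would run similar SQL in Spark SQL or Hive:

```python
import sqlite3

# In-memory database with a toy "orders" table (hypothetical data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("alice", "2024-01-01", 100.0),
     ("alice", "2024-01-05", 50.0),
     ("bob",   "2024-01-02", 75.0),
     ("bob",   "2024-01-06", 25.0)],
)

# Window function: running total per customer, ordered by date.
rows = conn.execute("""
    SELECT customer, order_date, amount,
           SUM(amount) OVER (
               PARTITION BY customer ORDER BY order_date
           ) AS running_total
    FROM orders
    ORDER BY customer, order_date
""").fetchall()

for row in rows:
    print(row)
# ('alice', '2024-01-01', 100.0, 100.0)
# ('alice', '2024-01-05', 50.0, 150.0)
# ('bob', '2024-01-02', 75.0, 75.0)
# ('bob', '2024-01-06', 25.0, 100.0)
```

Unlike a GROUP BY, the window function keeps every input row and attaches the aggregate alongside it, which is why it is the standard tool for running totals, rankings, and deduplication in pipeline SQL.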

Posted 2 weeks ago


4.0 - 8.0 years

4 - 9 Lacs

Ahmedabad

Work from Office

Job purpose: Design and implement the best-engineered technical solutions using the latest technologies and tools.

Who You Are:
- 4+ years of experience in .NET development with .NET Core, ASP.NET, C#, MVC, Web API, and SQL
- Hands-on experience with tools and technologies such as .NET MVC, .NET Core, and .NET Core microservices
- Experience with Angular is a plus
- EF Core / SQL / MongoDB, Workflow Foundation
- Design and develop web products using Microsoft .NET technologies, including API integrations
- Understand the importance of code readability and promote clean-code practices
- Ability to understand non-functional requirements such as scalability, security, application monitoring, and performance

What will excite us:
- Prior experience with Domain-Driven Design (DDD)
- Good experience with Test-Driven Development
- Good verbal and written communication skills
- Experience in containerization and CI/CD
- Keycloak as the auth service provider

What will excite you:
- Opportunity to work on large-scale enterprise solution building.
- Opportunity to explore new technologies and frameworks with accomplished solution architects and industry leaders.
- Exposure to the latest aspects of security, AI/ML, business domains, and data analytics.

Location: Ahmedabad

Posted 2 weeks ago
