
867 Lambda Expressions Jobs - Page 35

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

About The Role: Data Engineer-1 (Experience: 0-2 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data organization for Kotak Bank, managing the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics organization. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The organization is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and help build future-ready systems that can be operated by machines using AI technologies. The data platform organization is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the centre of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial organizations. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. The data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team
- Designing, implementing, and supporting a data infrastructure from scratch
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Building data platforms, data pipelines, or data management and governance tools

BASIC QUALIFICATIONS (Data Engineer / SDE in Data):
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
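The role above leans heavily on AWS services such as S3 and Lambda. As a rough illustration of the glue code involved, here is a minimal sketch of a Python Lambda handler reacting to an S3 put event; the bucket and key names are invented, and a real deployment would add boto3 calls, error handling, and IAM permissions:

```python
import json
import urllib.parse

def lambda_handler(event, context):
    """Extract bucket/key pairs from an S3 put-event notification.

    In a real pipeline this is where you would read the object with
    boto3 and hand it to the next stage (Glue, Redshift, ...).
    """
    records = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(s3.get("object", {}).get("key", ""))
        if bucket and key:
            records.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(records)}
```

The handler can be exercised locally with a fake event shaped like the S3 notification format, which is also how unit tests for such functions are usually written.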

Posted Date not available

Apply

6.0 - 10.0 years

25 - 30 Lacs

Gurugram, Bengaluru

Work from Office

The Role and Responsibilities: We are looking to hire an Associate Director in the Data Science & Data Engineering track. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data science track, you will be primarily responsible for managing and delivering analytics projects and helping teams design analytics solutions and models that consistently drive scalable, high-quality solutions. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and quickly pick up new technologies whenever required. Most of the projects involve handling big data, so you will work extensively with related technologies. You will work closely with other team members to support project delivery and ensure client satisfaction.
Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with global clients to understand their business challenges
- Exploring large-scale data and crafting models to answer core business problems
- Working with partners and principals to shape proposals that showcase our data science and analytics capabilities
- Explaining, refining, and crafting model insights and architecture to guide stakeholders through the journey of model building
- Advocating best practices in modelling and code hygiene
- Leading the development of proprietary statistical techniques, ML algorithms, assets, and analytical tools on varied projects
- Travelling to clients' locations across the globe when required, understanding their problems, and delivering appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art modelling and data science techniques in your domain

Your Attributes, Experience & Qualifications:
- Bachelor's or Master's degree in a quantitative discipline from a top academic program (Data Science, Mathematics, Statistics, Computer Science, Informatics, or Engineering)
- Prior experience in data science, machine learning, and analytics
- Passion for problem-solving through big data and analytics
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Impactful presentation skills that succinctly and efficiently convey findings, results, strategic insights, and implications
- Excellent verbal and written communication skills and complete command of English
- Willingness to travel
- Collaborative team player
- Respect for confidentiality

Technical Background (Data Science):
- Proficiency in modern programming languages (Python is mandatory; SQL, R, SAS desired) and machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet)
- Prior experience in designing and deploying large-scale technical solutions leveraging analytics
- Solid foundational knowledge of the mathematical and statistical principles of data science
- Familiarity with cloud storage, handling big data, and computational frameworks
- Valued but not required: compelling side projects or contributions to the open-source community; experience presenting at data science conferences and connections within the data science community; interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

Technical Background (Data Engineering):
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R, SAS desired)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools like Hadoop, Spark, Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems like GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile methodologies such as CI/CD, application resiliency, and security
- Valued but not required: compelling side projects or contributions to the open-source community; prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet); familiarity with containerization technologies such as Docker and Kubernetes; experience with UI development using frameworks such as Angular, Vue, or React; experience with NoSQL databases such as MongoDB or Cassandra; experience presenting at data science conferences and connections within the data science community; interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

Posted Date not available

Apply

0.0 - 5.0 years

5 - 10 Lacs

Gurugram

Work from Office

Company: Oliver Wyman. Role: Data Engineer.

Who We Are: Oliver Wyman is a global leader in management consulting. With offices in 50+ cities across 30 countries, Oliver Wyman combines deep industry knowledge with specialized expertise in strategy, finance, operations, technology, risk management, and organizational transformation. Our 4,000+ professionals help clients optimize their business; improve their IT, operations, and risk profile; and accelerate their organizational performance to seize the most attractive opportunities. Our professionals see what others don't, challenge conventional thinking, and consistently deliver innovative, customized solutions. As a result, we have a tangible impact on clients' top and bottom lines. Our clients are the CEOs and executive teams of the top global 1000 companies. Oliver Wyman is a business of Marsh McLennan [NYSE: MMC]. Follow Oliver Wyman on Twitter @OliverWyman.

Practice Overview: Practice: Data and Analytics (DNA) - Analytics Consulting. Location: Gurugram, India. At Oliver Wyman DNA, we partner with clients to solve tough strategic business challenges with the power of analytics, technology, and industry expertise. We drive digital transformation, create customer-focused solutions, and optimize operations for the future. Our goal is to achieve lasting results in collaboration with our clients and stakeholders. We value and offer opportunities for personal and professional growth. Join our entrepreneurial team focused on delivering impact globally.

Our Mission and Purpose: Mission: Leverage India's high-quality talent to provide exceptional analytics-driven management consulting services that empower clients globally to achieve their business goals and drive sustainable growth, by working alongside Oliver Wyman consulting teams.
Purpose: Our purpose is to bring together a diverse team of the highest-quality talent, equipped with innovative analytical tools and techniques, to deliver insights that drive meaningful impact for our global client base. We strive to build long-lasting partnerships with clients based on trust, mutual respect, and a commitment to delivering results. We aim to build a dynamic and inclusive organization that attracts and retains the top analytics talent in India and provides opportunities for professional growth and development. Our goal is to provide a sustainable work environment while fostering a culture of innovation and continuous learning for our team members.

The Role and Responsibilities: We have open positions ranging from Associate Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and quickly pick up new technologies whenever required. Most of the projects involve handling big data, so you will work extensively with related technologies. You will work closely with other team members to support project delivery and ensure client satisfaction.
Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
- Advocating the application of best practices in data engineering, code hygiene, and code reviews
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
- Creating and maintaining documentation to support stakeholders, and runbooks for operational excellence
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
- Travelling to clients' locations across the globe when required, understanding their problems, and delivering appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art data engineering techniques in your domain

Your Attributes, Experience & Qualifications:
- Bachelor's or Master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
- Exposure to building cloud-ready applications
- Exposure to test-driven development and integration
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Collaborative team player
- Excellent verbal and written communication skills and command of English
- Willingness to travel
- Respect for confidentiality

Technical Background:
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R, SAS desired)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools like Hadoop, Spark, Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems like GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile methodologies such as CI/CD, application resiliency, and security
- Valued but not required: compelling side projects or contributions to the open-source community; prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet); familiarity with containerization technologies such as Docker and Kubernetes; experience with UI development using frameworks such as Angular, Vue, or React; experience with NoSQL databases such as MongoDB or Cassandra; experience presenting at data science conferences and connections within the data science community; interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

Interview Process: The application process will include testing of technical proficiency, a case study, and team-fit interviews. Please include a brief note introducing yourself, what you're looking for when applying for the role, and your potential value-add to our team.

Roles and Levels: We are hiring for engineering roles across levels, from Associate Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years. In addition to the base salary, this position may be eligible for performance-based incentives. We offer a competitive total rewards package that includes comprehensive health and welfare benefits as well as employee assistance programs. Oliver Wyman is an equal-opportunity employer. Our commitment to diversity is genuine, deep, and growing.
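The data engineering track above centres on SQL, relational databases, and ETL. As a hedged sketch of that workflow, the snippet below runs a toy extract-transform-load step; SQLite stands in for the MySQL/PostgreSQL engines the role names, and the table and column names are hypothetical:

```python
import sqlite3

def run_etl(raw_rows):
    """Load raw (region, amount) rows, then aggregate totals per region.

    SQLite is used only for illustration; a production job would target
    MySQL/PostgreSQL and read its input from files or upstream systems.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", raw_rows)
    # Transform: aggregate order value per region with plain SQL.
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
    )
    totals = {region: total for region, total in cur.fetchall()}
    conn.close()
    return totals
```

For example, `run_etl([("APAC", 10.0), ("EMEA", 5.0), ("APAC", 2.5)])` returns `{"APAC": 12.5, "EMEA": 5.0}` under these assumptions.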
We're not perfect, but we're working hard right now to make our teams balanced, representative, and diverse. Marsh McLennan and its affiliates are EOE Minority/Female/Disability/Vet/Sexual Orientation/Gender Identity employers.

Posted Date not available

Apply

13.0 - 16.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Title: Cloud Data Architect. Experience: 8-16 Years. Location: Remote.

Job Description: As a Cloud Data Architect, you will oversee the design, implementation, and management of cloud-based data solutions. You will lead a team of data engineers and collaborate with stakeholders to ensure data is managed efficiently and securely and is accessible for business intelligence and analytics. Your expertise in cloud platforms, data architecture, and leadership will drive the success of our data initiatives and support data-driven decision-making across the organization. Proven experience as a Data Architect, Data Engineer, or similar role with a strong understanding of end-to-end data architecture is required.

Technical Skills:
- 8 to 16 years of working experience in data engineering
- 7+ years of experience in PySpark
- 7+ years of experience in AWS Glue
- 7+ years of experience in AWS Redshift
- 7+ years of experience in AWS CI/CD pipelines such as CodeBuild, CodeCommit, CodeDeploy, and CodePipeline
- Strong proficiency in AWS services such as S3, EC2, EMR, SNS, Lambda, Step Functions, and EventBridge
- Experience implementing automated testing platforms like PyTest
- Strong proficiency in Python, Hadoop, and Spark and/or PySpark
- Skill in writing clean, readable, commented, and easily maintainable code
- Understanding of fundamental design principles for building a scalable solution
- Skill in writing reusable libraries
- Proficiency with code versioning tools such as Git, SVN, TFS, etc.
- Bachelor's degree or higher

Interpersonal Skills:
- Excellent communication and collaboration skills
- Ability to work as part of a team
- Strong problem-solving and analytical skills
- Must have worked with US customers and should provide at least 3-4 hours of overlap with Pacific Time (PT)

Bonus points: certifications in cloud platforms.

Posted Date not available

Apply

11.0 - 16.0 years

5 - 10 Lacs

Noida

Work from Office

Java Lead Developer Position (Java + AWS): Senior Java professionals with strong experience in Java and AWS cloud platform technologies (11+ years).
- Strong experience designing and architecting enterprise-grade projects in a Java and AWS based ecosystem
- Team-handling experience with a proven track record of delivering quality end products in production, with seamless production support as needed
- Hands-on individual responsible for producing excellent-quality code, adhering to expected coding standards and industry best practices
- Must have strong experience in Java 8, multithreading, Spring Boot, and Oracle/PostgreSQL
- Must have good knowledge of Hibernate, caching frameworks, and memory management
- AWS: deployment (Docker and Kubernetes) plus common services (mainly S3, Lambda, CloudFront, API Gateway, CloudFormation, and ALBs)
- Kafka: building event-driven microservices and streaming applications
- Good to have: MongoDB and Elasticsearch knowledge
- Excellent problem-solving and troubleshooting skills
- High levels of ownership and commitment on deliverables
- Strong communication skills: should be able to interact with clients and stakeholders comfortably to probe a technical problem, provide a clear progress update, or clarify requirement specifications for the team/peers

Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Programming Language - Java - Java Multithreading; Middleware - Java Middleware - Spring Boot; Database - PostgreSQL; Beh - Communication; Programming Language - Java - Hibernate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - Amazon API Gateway; Middleware - Message-Oriented Middleware - Messaging (JMS, ActiveMQ, RabbitMQ, Kafka, SQS, ASB, etc.); DevOps/Configuration Mgmt - Containerization (Docker, Kubernetes)

Posted Date not available

Apply

5.0 - 10.0 years

20 - 25 Lacs

Noida, Pune, Bengaluru

Work from Office

Job Description: We are looking for an experienced Python Data Engineer with a strong background in Python, Pandas, AWS, and SQL to join our data engineering team. The ideal candidate should be passionate about building robust data pipelines, optimizing data workflows, and working on scalable cloud-based data solutions.

Required Skills:
- 5.5 to 8 years of hands-on experience in Python programming, with strong expertise in Pandas for data manipulation
- Solid understanding of and experience with AWS cloud services (such as S3, Lambda, Glue, Redshift, Athena, etc.)
- Proficiency in SQL, with experience querying and optimizing large datasets
- Good understanding of data warehousing concepts, data modeling, and data integration
- Strong debugging and problem-solving skills

Key Responsibilities:
- Design, develop, and maintain scalable ETL pipelines using Python and Pandas
- Work with large datasets to cleanse, transform, and prepare them for analytics and reporting
- Leverage AWS cloud services (e.g., S3, Lambda, Glue, Redshift, Athena) for data ingestion, storage, and processing
- Write efficient SQL queries to extract, manipulate, and analyze data from various relational and non-relational data sources
- Collaborate with data scientists, analysts, and other engineers to understand data needs and deliver reliable solutions
- Monitor data pipeline performance and troubleshoot issues to ensure data accuracy and integrity

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication-skills training (GL Vantage, Toastmasters), stress-management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game, and we offer discounts for popular stores and restaurants!
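The responsibilities above describe cleansing and transforming datasets with Pandas. A minimal sketch of such a cleanse step, with made-up column names, might look like this (a production pipeline would read from and write back to S3 rather than an in-memory frame):

```python
import pandas as pd

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize a hypothetical (customer, amount) dataset.

    Bad numeric values become NaN and their rows are dropped; customer
    names are trimmed and upper-cased for consistent joins downstream.
    """
    out = df.copy()
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")  # bad values -> NaN
    out = out.dropna(subset=["amount"])        # drop rows we cannot repair
    out["customer"] = out["customer"].str.strip().str.upper()
    return out.reset_index(drop=True)
```

Called on a frame with rows `(" acme ", "10")` and `("beta", "oops")`, the sketch keeps only the first row, normalized to `("ACME", 10.0)`.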

Posted Date not available

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Posting Title: Sr. Data Engineer. Band/Level: 5-2-C--C. Education/Experience: Bachelor's Degree (High School + 4 years). Employment Experience: 5-7 years.

At TE, you will unleash your potential working with people from diverse backgrounds and industries to create a safer, sustainable, and more connected world.

Job Overview: Responsible for designing and establishing the solution architecture for analytical platforms and/or data-enabled solutions. Has a broad understanding of the entire data and analytics ecosystem of tools and technologies, with hands-on experience across all three core data domains: data science, data engineering, and data visualization. Responsible for fleshing out the details of the end-to-end component architecture being implemented by the team, and contributes to the team's development velocity if necessary. Understands how an organization's data platform integrates within the overall technical architecture, the business purpose the role fulfills, and the roadmap for the lifecycle of an organization's systems. Sets the direction and establishes the approach for integrating information applications and programs.

Key Responsibilities:
- Architect, design, and build scalable data architectures, pipelines, and data products (Databricks, AWS)
- Design and develop enterprise data models; manage the full data life cycle
- Extract, transform, and integrate data from SAP ECC/S4HANA and other non-SAP environments
- Build solutions for real-time, streaming, and batch data workloads
- Define and implement data security, governance, and compliance standards
- Adapt and implement data quality and observability practices
- Perform performance tuning and cost optimization
- Support Operations teams in day-to-day business needs
- Collaborate with business and technical teams to understand requirements and deliver scalable data architecture solutions

Desired Candidate Experience: Minimum 5+ years in data architecture/data engineering roles.
Proven success in large-scale data environments, including:
- Databricks Lakehouse platform: Delta Lake & Medallion Architecture; DLT Pipelines; PySpark Workbooks; Spark SQL & SQL Warehouse; Unity Catalog (data governance, lineage); Genie (query performance, indexing); Security & Role-Based Access Control. Bonus: MLflow knowledge
- Data Modeling: ER & dimensional modeling; metadata management
- Programming: Python, SQL, PySpark
- AWS Cloud Services: S3, Lambda, EMR, Redshift, Bedrock
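The Databricks skills listed above revolve around the medallion (bronze/silver/gold) architecture. The toy sketch below illustrates that layering in plain Python; real implementations would use Delta Lake tables and DLT pipelines rather than lists of dicts, and the field names here are hypothetical:

```python
def to_silver(bronze_rows):
    """Bronze -> silver: validate and standardize raw records."""
    silver = []
    for row in bronze_rows:
        if row.get("qty") is None:   # drop records failing basic quality checks
            continue
        silver.append({"sku": str(row["sku"]).upper(), "qty": int(row["qty"])})
    return silver

def to_gold(silver_rows):
    """Silver -> gold: aggregate into a consumption-ready view."""
    gold = {}
    for row in silver_rows:
        gold[row["sku"]] = gold.get(row["sku"], 0) + row["qty"]
    return gold
```

Chaining the layers, `to_gold(to_silver(raw))` turns raw records into per-SKU totals, mirroring how each medallion layer adds quality and structure before consumption.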

Posted Date not available

Apply

3.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Job Title: Automation Developer, Lead to Cash (L2C) Process. Department: Digital Transformation / Process Automation. Reports To: Automation Lead / Engineering Manager. Location: Remote / Hybrid / Client Location (as applicable). Experience Required: 3-6 Years. Employment Type: Full-Time.

Role Summary: We are seeking a highly skilled Automation Developer to design, develop, and implement automation solutions for the Lead to Cash (L2C) process. The ideal candidate will have strong programming skills, proficiency in web automation tools, and experience with automation frameworks to enhance operational efficiency and accuracy.

Key Responsibilities:
- Design, develop, and implement robust automation solutions for L2C process optimization
- Build and maintain automation scripts to streamline workflows and reduce manual effort
- Collaborate with business analysts, QA, and development teams to gather requirements and deliver automation solutions
- Conduct unit testing and debugging using tools like Postman, REST Assured, or Insomnia
- Integrate automation solutions within AWS environments using services such as S3, SNS, and Lambda
- Utilize Git/GitHub and Jenkins for version control and CI/CD pipeline setup
- Document the design, functionality, and maintenance procedures for automation tools and scripts

Required Qualifications & Skills:
- Strong programming proficiency in Python with practical hands-on experience
- Expertise in Selenium for end-to-end web automation
- Proficiency in Robot Framework (mandatory); PyTest knowledge is a plus
- Working knowledge of SQL databases, preferably PostgreSQL
- Familiarity with manual API testing tools such as Postman, REST Assured, or Insomnia
- Experience in AWS environments, including S3, SNS, and Lambda
- Skilled in version control systems (Git, GitHub) and build automation tools (Jenkins)

Preferred Qualifications:
- Prior experience automating processes within L2C or similar enterprise workflows
- Certification in automation testing tools or cloud platforms
- Exposure to Agile methodologies and DevOps practices

Soft Skills:
- Strong problem-solving and analytical thinking
- Self-driven, with the ability to work independently and as part of a team
- Excellent communication and documentation skills
- Ability to handle multiple tasks and work under tight deadlines

Key Relationships: QA Engineers & Test Automation Team; Product Owners & Business Analysts; DevOps and Cloud Infrastructure Teams; L2C Process Owners.

Role Dimensions: Direct contributor to process efficiency and the automation of critical L2C operations. Improves the scalability and reliability of enterprise workflows. Enhances developer productivity and reduces operational risk.

Success Measures (KPIs):
- Reduction in manual L2C process execution time
- Automation script coverage and reliability
- Successful integration and deployment using CI/CD pipelines
- Reduction in bugs or issues in automation outcomes
- Business stakeholder satisfaction

Competency Framework Alignment: Automation Strategy & Execution; Technical Programming & Scripting; Cloud-Based Deployment (AWS); Quality Assurance & Testing; Operational Efficiency & Innovation.
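Much of the role above concerns automated checks that mirror manual Postman/REST Assured testing. As an illustrative sketch (the payload shape, with its order_id and status fields, is invented, not a real L2C API), a response validator in Python could look like:

```python
import json

def validate_order_response(body: str) -> list:
    """Return a list of validation errors; an empty list means the response passed.

    The expected fields and status values are hypothetical examples of the
    assertions an automated L2C API test might make.
    """
    errors = []
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return ["response is not valid JSON"]
    if "order_id" not in payload:
        errors.append("missing order_id")
    if payload.get("status") not in {"QUOTED", "ORDERED", "INVOICED", "PAID"}:
        errors.append("unexpected status: %r" % payload.get("status"))
    return errors
```

In a CI pipeline, checks like this would run after each deployment, failing the build whenever the validator returns a non-empty error list.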

Posted Date not available

Apply

3.0 - 6.0 years

4 - 8 Lacs

gurugram

Work from Office

Your Role
Proven experience with Java and Spring Boot. Strong proficiency in AWS, especially Lambda and EKS. Understanding of microservices architecture. Hands-on experience with unit testing using JUnit and Mockito. Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications:
1. Applies scientific methods to analyse and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. Builds skills and expertise in the software engineering discipline to reach standard software engineer skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Your Profile
Experience working with AWS services like RDS, Fargate, or related technologies. Familiarity with CI/CD practices and tooling (e.g., GitHub Actions, automated testing pipelines). Good communication skills. Team-handling experience is an added advantage.

What you will love about working at Capgemini
The company emphasizes team spirit, trust, and innovation, creating a positive and inclusive work environment. Employees often mention a sense of appreciation and purpose in their roles. Capgemini scores 3.7/5 for work-life balance, with many developers appreciating the flexible work hours and hybrid work model. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, and partner coverage or new-parent support via flexible work.
You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Posted Date not available

Apply

3.0 - 6.0 years

4 - 8 Lacs

hyderabad, bengaluru

Work from Office

Your Profile
You would be working on Amazon Lex and Amazon Connect; AWS services such as Lambda, CloudWatch, DynamoDB, S3, IAM, and API Gateway; Node.js or Python; contact center workflows and customer experience design; NLP and conversational design best practices; CI/CD processes and tools in the AWS ecosystem.

Your Role
Experience in designing, developing, and maintaining voice bots and chatbots using Amazon Lex. Experience integrating Lex bots with Amazon Connect for a seamless customer experience. Experience with Amazon Connect contact flows, queues, routing profiles, and Lambda integrations. Experience in developing and deploying AWS Lambda functions for backend logic and Lex fulfillment.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, and partner coverage or new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Location - Chennai (ex Madras), Bangalore, Hyderabad, Mumbai, Pune
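The Lex fulfillment Lambda mentioned above usually boils down to inspecting the interpreted intent and returning a Lex V2 response. A minimal Python sketch; the intent name and reply text are illustrative assumptions:

```python
def close(intent_name: str, message: str) -> dict:
    """Build a Lex V2 'Close' response that ends the dialog with a plain-text message."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }


def lambda_handler(event, context):
    """Lex V2 fulfillment hook: route on the intent Lex interpreted for this turn."""
    intent = event["sessionState"]["intent"]["name"]
    if intent == "CheckOrderStatus":  # hypothetical intent name
        # A real handler would look the order up (e.g., in DynamoDB) before replying.
        return close(intent, "Your order is on its way.")
    return close(intent, "Sorry, I can't help with that yet.")
```

The same Lambda can be attached as the fulfillment hook for several intents, which keeps Amazon Connect contact flows thin.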

Posted Date not available

Apply

3.0 - 7.0 years

6 - 10 Lacs

hyderabad, bengaluru

Work from Office

Your Role
You would be implementing and supporting the following Enterprise Planning & Budgeting Cloud Services (EPBCS) modules: Financials, Workforce, Capital, and Projects. Experience in Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC). Oracle EPM Cloud implementation experience in creating forms, OIC integrations, and complex business rules. Understand dependencies and interrelationships between the various components of Oracle EPM Cloud. Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities where it will enhance the current process within the entire Financials ecosystem.

Your Profile
Proven ability to collaborate with internal clients in an agile manner, leveraging design thinking approaches. Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization. Create and maintain system documentation, both functional and technical. Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, and partner coverage or new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
About Capgemini Location - Hyderabad, Bangalore, Chennai (ex Madras), Mumbai (ex Bombay), Pune

Posted Date not available

Apply

5.0 - 8.0 years

7 - 11 Lacs

noida

Work from Office

Tech Stack: Java + Spring Boot; AWS (ECS, Lambda, EKS); Drools (preferred but optional); APIGEE; API observability, traceability, and security.

Skills Required: Strong ability to understand and reengineer existing codebases, with some domain knowledge. Capability to analyze and integrate the new system's various interfaces with the existing APIs. Hands-on experience with Java, Spring, Spring Boot, AWS, and APIGEE. Familiarity with Drools is an added advantage. Ability to write and maintain JIRA stories (10-15% of the time) and keep existing technical specifications updated. Should take end-to-end ownership of the project: create the design, guide the team, and work independently on iterative tasks. Should proactively identify and highlight risks during daily scrum calls and provide regular updates.

Mandatory Competencies: Java - Core Java; Others - Microservices; Others - Spring Boot; Cloud - AWS Lambda; Cloud - Apigee; Beh - Communication and collaboration

Posted Date not available

Apply

4.0 - 9.0 years

4 - 8 Lacs

hyderabad

Work from Office

As a Senior Java/AWS Developer, you will be part of a team responsible for contributing to the design, development, maintenance, and support of ICE Digital Trade, a suite of highly configurable enterprise applications. The ideal candidate is results-oriented, self-motivated, and able to thrive in a fast-paced environment. This role requires frequent interaction with project and product managers, developers, quality assurance, and other stakeholders to ensure delivery of a world-class application to our users.

Responsibilities
Reviewing application requirements and interface designs. Contributing to the design and development of enterprise Java applications. Developing and implementing highly responsive user interface components using React concepts. Writing application interface code in JavaScript following React.js workflows. Troubleshooting interface software and debugging application code. Developing and implementing front-end architecture to support user interface concepts. Monitoring and improving front-end performance. Documenting application changes and developing updates. Collaborating with the QA team to ensure quality production code. Supporting and enhancing multiple mission-critical enterprise applications. Writing unit and integration tests for new and legacy code. Taking initiative and working independently on some projects while contributing to a large team on others. Providing second-tier production support for 24/7 applications. Following team guidelines for quality and consistency within the design and development phases of the application. Identifying opportunities to improve and optimize the application.

Knowledge and Experience
Bachelor's degree in computer science or information technology. 4+ years of full-stack development experience. In-depth knowledge of Java, JavaScript, CSS, HTML, and front-end languages. Knowledge of performance testing frameworks; proven success with test-driven development. Experience with browser-based debugging and performance testing software. Excellent troubleshooting skills. Good object-oriented concepts and knowledge of core Java and Java EE. First-hand experience with enterprise messaging (IBM WebSphere MQ or equivalent). Practical knowledge of Java application servers (JBoss, Tomcat) preferred. Working knowledge of the Spring Framework. Experience with the core AWS services and with serverless approaches using AWS resources. Experience developing infrastructure as code using CDK through efficient use of AWS services. Experience with AWS services such as API Gateway, Lambda, DynamoDB, S3, Cognito, and the AWS CLI. Experience using the AWS SDK. Understanding of distributed transactions. Track record of completing assignments on time with a high degree of quality. Experience and/or knowledge of all aspects of the SDLC methodology and related concepts and practices. Experience with Agile development methodologies preferred. Knowledge of Gradle/Maven preferred. Experience working with commodity markets or financial trading environments preferred. Open to learning and willing to participate in development using new frameworks and programming languages.

Good to Have
Knowledge of React tools including React.js, TypeScript, JavaScript ES6, Webpack, Enzyme, Redux, and Flux. Experience with user interface design. Experience with AWS Amplify, RDS, EventBridge, SNS, SQS, and SES.
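Several of the services listed (API Gateway, Lambda, DynamoDB) compose into the common serverless pattern of a Lambda behind an API Gateway proxy integration: the handler receives the request as an event dict and returns a status code, headers, and JSON body. A minimal Python sketch, with the route shape and payload purely illustrative:

```python
import json


def lambda_handler(event, context):
    """Handle an API Gateway request (Lambda proxy integration) and return JSON."""
    # Hypothetical route /trades/{id}; path parameters arrive under "pathParameters".
    trade_id = (event.get("pathParameters") or {}).get("id")
    if trade_id is None:
        return {"statusCode": 400, "body": json.dumps({"error": "missing trade id"})}
    # A real handler would fetch the record from DynamoDB via boto3; stubbed here.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": trade_id, "status": "settled"}),
    }
```

Because the handler is a plain function of an event dict, it is straightforward to unit-test without any AWS infrastructure, which suits the test-driven approach the posting asks for.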

Posted Date not available

Apply

4.0 - 9.0 years

10 - 18 Lacs

nashik, pune, mumbai (all areas)

Work from Office

Greetings from Sigma!
Experience: 4 to 6 years
Location: Pune
Notice Period: Immediate (Aug joiners)

JD: 2-3 years of hands-on experience with the AWS cloud platform. Experience working with Docker and Kubernetes. Working experience with scripting languages like Python and Bash. Experience with the Go language is good to have.

If you are interested, please send a copy of your resume along with the following details:
1. Notice Period (LWD)
2. Current CTC
3. Expected CTC
4. Current company
5. Total years of experience
6. Go language experience
7. Do you have any offer?
8. Current location
9. Preferred location

Posted Date not available

Apply