
295 Lambda Expressions Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Naukri logo

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that address the client's needs. Your primary responsibilities include: design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements; build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization; coordinate data access and security so that data scientists and analysts can easily access data whenever they need it. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data (Hadoop, Spark, Scala, Python, HBase, Hive). Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Experience developing Python and PySpark programs for data analysis, including using Python to build a custom framework for generating rules (much like a rules engine). Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark; Apache Spark DataFrames/RDDs were used to apply business transformations, with Hive context objects for read/write operations. Preferred technical and professional experience: understanding of DevOps; experience building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
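For illustration of the Spark/Hive pattern this posting describes (business transformations with DataFrames, read/write through Hive), here is a minimal PySpark sketch. The table and column names (sales_raw, amount, sale_ts, region) are hypothetical placeholders, not part of the actual engagement.

```python
# A minimal PySpark sketch: read a Hive table, apply DataFrame transformations,
# and write the result back as a managed table. All names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("sales-transform")
    .enableHiveSupport()          # lets the session read/write Hive tables
    .getOrCreate()
)

raw = spark.table("sales_raw")    # source table registered in the Hive metastore

cleaned = (
    raw.filter(F.col("amount") > 0)
       .withColumn("sale_date", F.to_date("sale_ts"))
       .groupBy("region", "sale_date")
       .agg(F.sum("amount").alias("daily_total"))
)

# Overwrite keeps the job idempotent across reruns
cleaned.write.mode("overwrite").saveAsTable("sales_daily_totals")
```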

Posted 1 week ago

Apply

3.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Position Overview: As a Software Engineer in Test II, you will play a critical role in ensuring the quality and performance of our News and Competitive Data Analysis Platform. You'll collaborate closely with product managers, engineering leads, and DevOps to understand customer needs, define test strategies, and translate product requirements into comprehensive test plans covering both manual and automated test cases. You will contribute to building robust, scalable, and efficient testing systems across our Python/Django-based microservices, React frontends, and event-driven pipelines leveraging RabbitMQ and Celery. Your responsibilities will include performance testing, system load simulations, and optimization validations, especially across high-throughput services like Solr-powered search and PostgreSQL-based data retrieval. You'll have the full power of AWS at your disposal while you focus on building and testing serverless solutions (either brand new or assisting in migrating services to serverless). As part of our core governance teams, your team will fully own its services (from code to deployment and monitoring); you'll participate in feature design and architecture discussions and regularly demo to the entire department. Key Responsibilities: Provide strong QA across multiple teams, influencing engineering culture and practices. Create and document test strategies and automated solutions for functional and non-functional testing. Work in a collaborative environment where you regularly pair, plan, and execute tasks as a team and strive to optimize your team's lead time, deployment frequency, mean time to recovery, and change failure rate. Create and review test scenarios and cases based on specifications, keep them updated, provide test estimates, and execute test cases. Analyze test results, investigate failures, and accurately record and follow through on defects to resolution. Use test management tools (e.g., JIRA, Zephyr) to track test execution. Conduct testing and validation of automated processes to ensure accuracy and reliability. Monitor application performance and respond to incidents. What you need to succeed: Bachelor's degree in Engineering, Math, or a related field with 3+ years of software QA experience. Experience in both manual and automation testing, preparing exhaustive test cases (regression, integration, system, and/or sanity). Experience with PyTest for testing APIs and back-end services. Experience in UI automation testing, preferably with Cypress (or Selenium). Experience with test data preparation, mocking, and stubbing. Familiarity with test management tools (e.g., JIRA, Zephyr). Awareness of end-to-end QA lifecycles. Awareness of continuous integration tools like CodePipeline or Jenkins. Experience with, or awareness of, performance, load, and stress testing. Good to have: AWS experience, including testing of AWS serverless stacks (Lambda, Fargate, etc.), monitoring and observability services (such as AWS CloudWatch), and data warehousing ecosystem services.
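To illustrate the PyTest-with-mocking style of API testing this posting asks for, here is a minimal sketch. The service URL and the get_article_count helper are hypothetical, not part of the actual platform.

```python
# test_articles_api.py -- a minimal PyTest sketch for an HTTP API with one
# stubbed dependency. The endpoint and helper below are hypothetical.
import requests


def get_article_count(base_url: str) -> int:
    """Tiny client helper under test: returns the article count from the API."""
    resp = requests.get(f"{base_url}/articles/count", timeout=5)
    resp.raise_for_status()
    return resp.json()["count"]


def test_article_count_parses_response(monkeypatch):
    """Stub requests.get so the test runs without a live service."""

    class FakeResponse:
        status_code = 200

        def raise_for_status(self):
            pass

        def json(self):
            return {"count": 42}

    monkeypatch.setattr(requests, "get", lambda *args, **kwargs: FakeResponse())
    assert get_article_count("http://news-platform.local") == 42
```

Run with `pytest test_articles_api.py`; the stub keeps the test hermetic, which is the usual goal of mocking and stubbing in API suites.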

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 22 Lacs

Pune

Hybrid

Naukri logo

So, what’s the role all about? The Senior Specialist Technical Support Engineer role is to deliver technical support to end users about how to use and administer the NICE Service and Sales Performance Management, Contact Analytics and/or WFM software solutions efficiently and effectively in fulfilling business objectives. We are seeking a highly skilled and experienced Senior Specialist Technical Support Engineer to join our global support team. In this role, you will be responsible for diagnosing and resolving complex performance issues in large-scale SaaS applications hosted on AWS. You will work closely with engineering, DevOps, and customer success teams to ensure our customers receive world-class support and performance optimization. How will you make an impact? Serve as a subject matter expert in troubleshooting performance issues across distributed SaaS environments in AWS. Interfacing with various R&D groups, Customer Support teams, Business Partners and Customers Globally to address CSS Recording and Compliance application related product issues and resolve high-level issues. Analyze logs, metrics, and traces using tools like CloudWatch, X-Ray, Datadog, New Relic, or similar. Collaborate with development and operations teams to identify root causes and implement long-term solutions. Provide technical guidance and mentorship to junior support engineers. Act as an escalation point for critical customer issues, ensuring timely resolution and communication. Develop and maintain runbooks, knowledge base articles, and diagnostic tools to improve support efficiency. Participate in on-call rotations and incident response efforts. Have you got what it takes? 10+ years of experience in technical support, site reliability engineering, or performance engineering roles. Deep understanding of AWS services such as EC2, RDS, S3, Lambda, ELB, ECS/EKS, and CloudFormation. Proven experience troubleshooting performance issues in high-availability, multi-tenant SaaS environments. Strong knowledge of networking, load balancing, and distributed systems. Proficiency in scripting languages (e.g., Python, Bash) and familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Excellent communication and customer-facing skills. Preferred Qualifications: AWS certifications (e.g., Solutions Architect, DevOps Engineer). Experience with observability platforms (e.g., Prometheus, Grafana, Splunk). Familiarity with CI/CD pipelines and DevOps practices. Experience working in ITIL or similar support frameworks. What’s in it for you? Join an ever-growing, market disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Requisition ID: 7554 Reporting into: Tech Manager Role Type: Individual Contributor
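Since the role centres on analysing logs and metrics with tools like CloudWatch, here is a minimal, hedged sketch of running a CloudWatch Logs Insights query with boto3. The log group name and query string are placeholders, not NICE resources.

```python
# A minimal sketch: start a CloudWatch Logs Insights query with boto3 and poll
# for the result. The log group and query are placeholders.
import time
import boto3

logs = boto3.client("logs", region_name="us-east-1")

query_id = logs.start_query(
    logGroupName="/app/recording-service",           # hypothetical log group
    startTime=int(time.time()) - 3600,                # last hour
    endTime=int(time.time()),
    queryString=(
        "fields @timestamp, @message "
        "| filter @message like /ERROR/ "
        "| sort @timestamp desc | limit 20"
    ),
)["queryId"]

# Poll until the query completes, then print matching log lines
while True:
    result = logs.get_query_results(queryId=query_id)
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)

for row in result.get("results", []):
    print({field["field"]: field["value"] for field in row})
```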

Posted 1 week ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Pune

Hybrid

Naukri logo

So, what's the role all about? At NICE, as a Senior Software Engineer you will be responsible for designing, developing, testing, and maintaining scalable and efficient Java-based applications that meet business requirements. You will collaborate closely with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality software solutions. Your role involves writing clean, well-structured, and maintainable code following best practices and coding standards. Additionally, you will debug and troubleshoot application issues, ensuring optimal performance and user experience. How will you make an impact? Coordinate with Architecture to understand and develop the platform architecture. Develop a RESTful API solution supporting both AWS and Azure. Work with AWS CloudFormation templates to extend and refine our infrastructure. Understand and define performance-level needs for the platform. Define logs and alarms, troubleshoot them, and fix issues within a defined release cadence. Integrate with multiple internal products to provide seamless CXone CCaaS offerings. Manage RBAC permissions and work with DevOps to maintain "least privilege". Develop and refine Jenkins CI/CD pipelines to deploy code, run acceptance tests, and monitor environment health. Collaborate effectively with cross-geo teams and be willing to stretch at times. Collaborate effectively with TS/TAM/NOC to address queries and concerns. Have you got what it takes? Bachelor's degree in Computer Science, or equivalent. 6+ years of experience in software development. Experience with the following software languages: NodeJS (must have), Angular 8 (must have), Java + Spring Boot (good to have). Openness to learning new technology stacks as needed. Working knowledge of AWS technologies (OpenSearch, SQS, Lambda, RDS). Experience developing with SQL Server or equivalent. Experience designing, developing, deploying, and supporting RESTful APIs. Experience troubleshooting multi-threaded applications, mining through logs to determine patterns, identify potential issues, and fix them. Experience developing services, clients, and multi-threaded software. Experience with/knowledge of agile development processes. Experience with DevOps tools and processes: Jenkins, Git, Docker. Scripting: Unix, Shell, Groovy, Python. SonarQube. Working knowledge of unit testing and test automation (mocha-chai, Cucumber, Playwright). Working knowledge of user stories and use cases. Working knowledge of object-oriented software design and design patterns. Comfortable working in a fast-paced environment. Bonus experience: telecommunications/telephony, call centers, Jira. What's in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 7472 Reporting into: Tech Manager Role Type: Individual Contributor
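The listed stack includes AWS SQS and Lambda. As an illustration only, here is a minimal boto3 sketch of sending and receiving an SQS message; the queue URL and message fields are placeholders.

```python
# A minimal sketch of producing and consuming an SQS message with boto3.
# The queue URL below is a placeholder, not a real resource.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/cx-events"  # placeholder

# Send a message
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"event": "call_completed", "contact_id": "abc-123"}),
)

# Receive and delete it (long polling for up to 10 seconds)
response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10)
for message in response.get("Messages", []):
    print("Received:", json.loads(message["Body"]))
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```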

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Noida

Work from Office

Naukri logo

Roles & Responsibilities: Proficient in Python, including GitHub and Git commands. Develop code based on functional specifications through an understanding of project code. Test code to verify it meets the technical specifications and is working as intended before submitting it for code review. Experience writing tests in Python using Pytest. Follow prescribed standards and processes as applicable to the software development methodology, including planning, work estimation, solution demos, and reviews. Read and understand basic software requirements. Assist with the implementation of a delivery pipeline, including test automation, security, and performance. Assist in troubleshooting and responding to production issues to ensure the stability of the application. Must-Have and Mandatory: Very good experience in Python, Flask, SQLAlchemy, and Pytest. Knowledge of cloud services such as AWS Lambda, S3, and DynamoDB. Database: PostgreSQL, MySQL, or any relational database. Can provide suggestions for performance improvements, strategy, etc. Expertise in object-oriented design and multi-threaded programming. Total Experience Expected: 04-06 years
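As a small illustration of the Flask + SQLAlchemy stack named above, here is a hedged sketch of a single REST endpoint backed by a database model. The Order model, route, and SQLite file are illustrative assumptions, not the project's actual code.

```python
# A minimal Flask + Flask-SQLAlchemy sketch of a REST endpoint.
# Model, route, and database URI are placeholders.
from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///orders.db"  # placeholder DB
db = SQLAlchemy(app)


class Order(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    status = db.Column(db.String(32), nullable=False, default="new")


@app.route("/orders/<int:order_id>")
def get_order(order_id: int):
    order = db.session.get(Order, order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify({"id": order.id, "status": order.status})


if __name__ == "__main__":
    with app.app_context():
        db.create_all()          # create tables for the demo database
    app.run(debug=True)
```

An endpoint like this is also a natural target for the Pytest-based testing the posting requires, using Flask's built-in test client.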

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 13 Lacs

Mumbai, Hyderabad, Pune

Work from Office

Naukri logo

Develop and productionize cloud-based services and full-stack applications utilizing NLP solutions, including GenAI models. Implement and manage CI/CD pipelines to ensure efficient and reliable software delivery. Automate cloud infrastructure using Terraform. Write unit tests, integration tests, and performance tests. Work in a team environment using agile practices. Monitor and optimize application performance and infrastructure costs. Collaborate with data scientists and other developers to integrate and deploy data science models into production environments. Work closely with cross-functional teams to ensure seamless integration and operation of services. Proficiency in JavaScript for full-stack development. Strong experience with AWS cloud services, including EKS, Lambda, and S3. Knowledge of Docker containers and orchestration tools, including Kubernetes.
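Since the role involves wiring services to models deployed behind AWS Lambda, here is a minimal boto3 sketch of invoking a Lambda function synchronously. The function name and payload are hypothetical.

```python
# A minimal sketch: invoke a deployed Lambda function synchronously with boto3
# and read its JSON response. Function name and payload are placeholders.
import json
import boto3

lambda_client = boto3.client("lambda", region_name="us-east-1")

response = lambda_client.invoke(
    FunctionName="nlp-inference",                       # placeholder function
    InvocationType="RequestResponse",                   # synchronous call
    Payload=json.dumps({"text": "Quarterly revenue grew 12%."}).encode("utf-8"),
)

result = json.loads(response["Payload"].read())
print(result)
```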

Posted 1 week ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Naukri logo

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. As an AWS Data Engineer at Kyndryl, you will be responsible for designing, building, and maintaining scalable, secure, and high-performing data pipelines using AWS cloud-native services. This role requires extensive hands-on experience with both real-time and batch data processing, expertise in cloud-based ETL/ELT architectures, and a commitment to delivering clean, reliable, and well-modeled datasets. Key Responsibilities: Design and develop scalable, secure, and fault-tolerant data pipelines utilizing AWS services such as Glue, Lambda, Kinesis, S3, EMR, Step Functions, and Athena. Create and maintain ETL/ELT workflows to support both structured and unstructured data ingestion from various sources, including RDBMS, APIs, SFTP, and Streaming. Optimize data pipelines for performance, scalability, and cost-efficiency. Develop and manage data models, data lakes, and data warehouses on AWS platforms (e.g., Redshift, Lake Formation). Collaborate with DevOps teams to implement CI/CD and infrastructure as code (IaC) for data pipelines using CloudFormation or Terraform. Ensure data quality, validation, lineage, and governance through tools such as AWS Glue Data Catalog and AWS Lake Formation. Work in concert with data scientists, analysts, and application teams to deliver data-driven solutions. Monitor, troubleshoot, and resolve issues in production pipelines. Stay abreast of AWS advancements and recommend improvements where applicable. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others. 
Required Skills and Experience Bachelor's or Master's degree in Computer Science, Engineering, or a related field Over 8 years of experience in data engineering More than 3 years of experience with the AWS data ecosystem Strong experience with PySpark, SQL, and Python Proficiency in AWS services: Glue, S3, Redshift, EMR, Lambda, Kinesis, CloudWatch, Athena, Step Functions Familiarity with data modelling concepts, dimensional models, and data lake architectures Experience with CI/CD, GitHub Actions, CloudFormation/Terraform Understanding of data governance, privacy, and security best practices Strong problem-solving and communication skills Preferred Skills and Experience Experience working as a Data Engineer and/or in cloud modernization. Experience with AWS Lake Formation and Data Catalog for metadata management. Knowledge of Databricks, Snowflake, or BigQuery for data analytics. AWS Certified Data Engineer or AWS Certified Solutions Architect is a plus. Strong problem-solving and analytical thinking. Excellent communication and collaboration abilities. Ability to work independently and in agile teams. A proactive approach to identifying and addressing challenges in data workflows. Being You Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way. What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
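The pipeline work described above spans streaming ingestion with Kinesis alongside Glue, Lambda, and Step Functions. For illustration, a minimal boto3 sketch of writing records to a Kinesis data stream follows; the stream name and record shape are placeholders.

```python
# A minimal sketch of streaming ingestion: write JSON records to a Kinesis
# data stream with boto3. Stream name and record fields are placeholders.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

events = [
    {"device_id": "sensor-1", "reading": 21.4},
    {"device_id": "sensor-2", "reading": 19.8},
]

for event in events:
    kinesis.put_record(
        StreamName="telemetry-ingest",           # placeholder stream
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["device_id"],         # keeps a device's records ordered
    )
```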

Posted 1 week ago

Apply

12.0 - 15.0 years

35 - 60 Lacs

Chennai

Work from Office

Naukri logo

AWS Solution Architect: Experience driving enterprise architecture for large commercial customers. Experience in healthcare enterprise transformation. Prior experience architecting cloud-first applications. Experience leading a customer through a migration journey and proposing competing views to drive a mutual solution. Knowledge of cloud architecture concepts. Knowledge of application deployment and data migration. Ability to design high-availability applications on AWS across Availability Zones and Regions. Ability to design applications on AWS taking advantage of disaster recovery design guidelines. Design, implement, and maintain streaming solutions using Amazon Managed Streaming for Apache Kafka (MSK). Monitor and manage Kafka clusters to ensure optimal performance, scalability, and uptime. Configure and fine-tune MSK clusters, including partitioning strategies, replication, and retention policies. Analyze and optimize the performance of Kafka clusters and streaming pipelines to meet high-throughput and low-latency requirements. Design and implement data integration solutions to stream data between various sources and targets using MSK. Lead data transformation and enrichment processes to ensure data quality and consistency in streaming applications. Mandatory technical skillset: AWS architectural concepts (designing, implementing, and managing cloud infrastructure); AWS services (EC2, S3, VPC, Lambda, ELB, Route 53, Glue, RDS, DynamoDB, Postgres, Aurora, API Gateway, CloudFormation, etc.); Kafka; Amazon MSK. Domain experience: Healthcare domain experience is required; Blues experience is preferred. Location: Pan India
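To illustrate the MSK streaming work described above, here is a minimal sketch of producing messages to a Kafka topic from Python using the kafka-python client (the client choice is an assumption; MSK speaks standard Kafka). The broker address and topic are placeholders, and real MSK clusters typically also require TLS or IAM authentication settings.

```python
# A minimal sketch of producing JSON messages to an Amazon MSK (Kafka) topic
# with kafka-python. Broker and topic names are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["b-1.example-msk.amazonaws.com:9092"],  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# The key keeps all events for one member on the same partition (ordered)
producer.send(
    "claims-events",
    key=b"member-123",
    value={"claim_id": "CLM-001", "status": "ADJUDICATED"},
)
producer.flush()
```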

Posted 1 week ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Pune

Hybrid

Naukri logo

So, what's the role all about? At NiCE, as a Senior Software Engineer you will be responsible for designing, developing, testing, and maintaining scalable and efficient Java-based applications that meet business requirements. You will collaborate closely with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality software solutions. Your role involves writing clean, well-structured, and maintainable code following best practices and coding standards. Additionally, you will debug and troubleshoot application issues, ensuring optimal performance and user experience. How will you make an impact? Coordinate with Architecture to understand and develop the platform architecture. Develop a RESTful API solution supporting both AWS and Azure. Work with AWS CloudFormation templates to extend and refine our infrastructure. Understand and define performance-level needs for the platform. Define logs and alarms, troubleshoot them, and fix issues within a defined release cadence. Integrate with multiple internal products to provide seamless CXone CCaaS offerings. Manage RBAC permissions and work with DevOps to maintain "least privilege". Develop and refine Jenkins CI/CD pipelines to deploy code, run acceptance tests, and monitor environment health. Collaborate effectively with cross-geo teams and be willing to stretch at times. Collaborate effectively with TS/TAM/NOC to address queries and concerns. Have you got what it takes? Bachelor's degree in Computer Science, or equivalent. 4+ years of experience in software development. Experience with the following software languages: NodeJS (must have), Angular 8 (must have), Java + Spring Boot (good to have). Openness to learning new technology stacks as needed. Working knowledge of AWS technologies (OpenSearch, SQS, Lambda, RDS). Experience developing with SQL Server or equivalent. Experience designing, developing, deploying, and supporting RESTful APIs. Experience troubleshooting multi-threaded applications, mining through logs to determine patterns, identify potential issues, and fix them. Experience developing services, clients, and multi-threaded software. Experience with/knowledge of agile development processes. Experience with DevOps tools and processes: Jenkins, Git, Docker. Scripting: Unix, Shell, Groovy, Python. SonarQube. Working knowledge of unit testing and test automation (mocha-chai, Cucumber, Playwright). Working knowledge of user stories and use cases. Working knowledge of object-oriented software design and design patterns. Comfortable working in a fast-paced environment. Bonus experience: telecommunications/telephony, call centers, Jira. What's in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr! Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.
Requisition ID: 7473 Reporting into: Tech Manager Role Type: Individual Contributor

Posted 1 week ago

Apply

3.0 - 5.0 years

50 - 55 Lacs

Bengaluru

Work from Office

Naukri logo

About the Opportunity Job Type: Application | 31 July 2025. Title: Senior Analyst Programmer. Department: Technology - Corporate Enablers (CFO & CE Technology). Location: Bangalore, India. Reports To: Senior Manager. Department Overview: The CFO and CE Cloud Technology function provides systems development, implementation, and support services for FIL's Corporate Enablers team. We support several functions spanning Business Finance & Management Accounting, Financial Accounting & Analytics, Taxation, Global Procurement, Corporate Treasury, and several other teams in all of FIL's international locations, including the UK, Japan, China, and India. We provide IT services to the Fidelity International businesses globally. These include development and support of business functions that underpin our financial accounting and decision making for global CFO orgs, and we implement multiple systems including ERP platforms, home-grown apps, and third-party products. We are system providers to key process lifecycles such as Procure to Pay (P2P/Global Procurement), Record to Report (R2R), Order to Cash (O2C), and Acquire to Retire (A2R). We also manage systems to enable cash management, forex trading, and treasury operations across the globe. We own warehouses that consolidate data from across the organisation's functions to provide meaningful insights. We are seeking a skilled and experienced Python Developer to join our team. The ideal candidate will have a strong background in API development and PL/SQL stored procedures, along with a good understanding of Kubernetes, AWS, and SnapLogic cloud-native technologies. This role requires deep technical expertise and the ability to work in a dynamic and fast-paced environment. Essential Skills - Must-have technical skills: Knowledge of the latest Python frameworks and technologies (e.g., Django, Flask, FastAPI). Experience with Python libraries and tools (e.g., Pandas, NumPy, SQLAlchemy). Strong experience in designing, developing, and maintaining RESTful APIs. Familiarity with API security, authentication, and authorization mechanisms (e.g., OAuth, JWT). Good experience and hands-on knowledge of PL/SQL (packages/functions/ref cursors). Experience in development and low-level design of warehouse solutions. Familiarity with Data Warehouse, Datamart, and ODS concepts. Knowledge of data normalisation and Oracle performance optimisation techniques. Good-to-have technical skills: Hands-on experience with Kubernetes for container orchestration. Knowledge of deploying, managing, and scaling applications on Kubernetes clusters. Proficiency in AWS services (e.g., EC2, S3, RDS, Lambda). Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Experience with the SnapLogic cloud-native integration platform. Ability to design and implement integration pipelines using SnapLogic. Key Responsibilities: Develop and maintain high-quality Python code for API services. Design and implement containerized applications using Kubernetes. Utilize AWS services for cloud infrastructure and deployment. Create and manage integration pipelines using SnapLogic. Write and optimize PL/SQL stored procedures for database operations. Collaborate with cross-functional teams to deliver high-impact solutions. Ensure code quality, security, and performance through best practices. Experience and Qualification: B.E./B.Tech. or M.C.A. in Computer Science from a reputed university. Total 5 to 7 years of experience with application development in Python and API development, along with Oracle RDBMS, SQL, and PL/SQL. Personal Characteristics: Excellent communication skills, both verbal and written. Strong interest in technology and its applications. Self-motivated team player. Ability to work under pressure and meet deadlines. Feel rewarded: For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working, and how you could build your future here, visit careers.fidelityinternational.com.
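To illustrate the RESTful API work with the frameworks named above (FastAPI being one of them), here is a minimal sketch. The Invoice resource and data are illustrative only.

```python
# A minimal FastAPI sketch of a RESTful endpoint. Model and data are placeholders.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="finance-api")


class Invoice(BaseModel):
    id: int
    supplier: str
    amount: float


_FAKE_DB = {1: Invoice(id=1, supplier="Acme Ltd", amount=1250.00)}  # placeholder store


@app.get("/invoices/{invoice_id}", response_model=Invoice)
def read_invoice(invoice_id: int) -> Invoice:
    invoice = _FAKE_DB.get(invoice_id)
    if invoice is None:
        raise HTTPException(status_code=404, detail="Invoice not found")
    return invoice

# Run locally with:  uvicorn main:app --reload
```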

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Job Title: Automation Developer – Lead to Cash (L2C) Process. Department: Digital Transformation / Process Automation. Reports To: Automation Lead / Engineering Manager. Location: Remote / Hybrid / Client Location (as applicable). Experience Required: 3–6 Years. Employment Type: Full-Time. Role Summary: We are seeking a highly skilled Automation Developer to design, develop, and implement automation solutions for the Lead to Cash (L2C) process. The ideal candidate will have strong programming skills, proficiency in web automation tools, and experience with automation frameworks to enhance operational efficiency and accuracy. Key Responsibilities: Design, develop, and implement robust automation solutions for L2C process optimization. Build and maintain automation scripts to streamline workflows and reduce manual effort. Collaborate with business analysts, QA, and development teams to gather requirements and deliver automation solutions. Conduct unit testing and debugging using tools like Postman, Rest Assured, or Insomnia. Integrate automation solutions within AWS environments using services such as S3, SNS, and Lambda. Utilize Git/GitHub and Jenkins for version control and CI/CD pipeline setup. Document the design, functionality, and maintenance procedures for automation tools and scripts. Required Qualifications & Skills: Strong programming proficiency in Python with practical hands-on experience. Expertise in Selenium for end-to-end web automation. Proficiency in Robot Framework (mandatory); PyTest knowledge is a plus. Working knowledge of SQL databases, preferably PostgreSQL. Familiarity with manual API testing tools such as Postman, Rest Assured, or Insomnia. Experience in AWS environments, including S3, SNS, and Lambda. Skilled in version control systems (Git, GitHub) and build automation tools (Jenkins). Preferred Qualifications: Prior experience automating processes within L2C or similar enterprise workflows. Certification in automation testing tools or cloud platforms. Exposure to Agile methodologies and DevOps practices. Soft Skills: Strong problem-solving and analytical thinking. Self-driven with the ability to work independently and as part of a team. Excellent communication and documentation skills. Ability to handle multiple tasks and work under tight deadlines. Key Relationships: QA Engineers & Test Automation Team; Product Owners & Business Analysts; DevOps and Cloud Infrastructure Teams; L2C Process Owners. Role Dimensions: Direct contributor to process efficiency and automation of critical L2C operations. Improves scalability and reliability of enterprise workflows. Enhances developer productivity and reduces operational risk. Success Measures (KPIs): Reduction in manual L2C process execution time. Automation script coverage and reliability. Successful integration and deployment using CI/CD pipelines. Reduction in bugs or issues in automation outcomes. Business stakeholder satisfaction. Competency Framework Alignment: Automation Strategy & Execution; Technical Programming & Scripting; Cloud-Based Deployment (AWS); Quality Assurance & Testing; Operational Efficiency & Innovation
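For a flavour of the Selenium-based web automation this role requires, here is a minimal Python sketch. The URL, locators, and expected elements are placeholders for whatever L2C application is actually under test, and a local browser/driver setup is assumed.

```python
# A minimal Selenium (Python) sketch: open a page, submit a search, and read a
# result element. URL and locators are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()          # assumes a local Chrome/driver setup
try:
    driver.get("https://example.com/orders")                  # placeholder URL
    search_box = driver.find_element(By.ID, "order-search")   # placeholder locator
    search_box.send_keys("ORD-001", Keys.ENTER)

    result = WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, ".result-count"))
    )
    print("Result count text:", result.text)
finally:
    driver.quit()
```

The same flow is typically wrapped as a Robot Framework or PyTest test case when integrated into the CI/CD pipeline.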

Posted 1 week ago

Apply

5.0 - 8.0 years

15 - 19 Lacs

Pune

Hybrid

Naukri logo

So, what's the role all about? We are seeking a skilled and experienced DevOps Engineer to design, produce, and test high-quality software that meets specified functional and non-functional requirements within the given time and resource constraints. How will you make an impact? Design, implement, and maintain CI/CD pipelines using Jenkins to support automated builds, testing, and deployments. Manage and optimize AWS infrastructure for scalability, reliability, and cost-effectiveness. Develop automation scripts and tools using shell scripting and other programming languages to streamline operational workflows. Collaborate with cross-functional teams (Development, QA, Operations) to ensure seamless software delivery and deployment. Monitor and troubleshoot infrastructure, build failures, and deployment issues to ensure high availability and performance. Implement and maintain robust configuration management practices and infrastructure-as-code principles. Document processes, systems, and configurations to ensure knowledge sharing and maintain operational consistency. Perform ongoing maintenance and upgrades (production and non-production). Occasional weekend or after-hours work as needed. Have you got what it takes? Experience: 5-8 years in DevOps or a similar role. Cloud expertise: proficient in AWS services such as EC2, S3, RDS, Lambda, IAM, CloudFormation, or similar. CI/CD tools: hands-on experience with Jenkins pipelines (declarative and scripted). Scripting skills: proficiency in either shell scripting or PowerShell. Programming knowledge: familiarity with at least one programming language (e.g., Python, Java, or Go). Important: scripting/programming is integral to this role and will be a key focus in the interview process. Version control: experience with Git and Git-based workflows. Monitoring tools: familiarity with tools like CloudWatch, Prometheus, or similar. Problem-solving: strong analytical and troubleshooting skills in a fast-paced environment. Knowledge of the AWS CDK. You will have an advantage if you also have: prior experience in development or automation; Windows system administration; experience with monitoring and log analysis tools; Jenkins pipeline knowledge. What's in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr! Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Requisition ID: 7318 Reporting into: Tech Manager Role Type: Individual Contributor
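Since the posting lists AWS CDK knowledge alongside infrastructure-as-code, here is a minimal, hedged CDK-in-Python sketch (assuming aws-cdk-lib v2). The stack and function names are illustrative, and the inline handler is only a stand-in for real deployment assets.

```python
# A minimal AWS CDK (v2, Python) sketch: a stack with one Lambda function.
# Stack/function names and the inline handler are placeholders.
from aws_cdk import App, Stack, Duration
from aws_cdk import aws_lambda as _lambda
from constructs import Construct


class PipelineSupportStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        _lambda.Function(
            self, "HealthCheckFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_inline(
                "def handler(event, context):\n    return {'statusCode': 200}"
            ),
            timeout=Duration.seconds(10),
        )


app = App()
PipelineSupportStack(app, "PipelineSupportStack")
app.synth()       # deploy with: cdk deploy
```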

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Okta is seeking a highly skilled Full-Stack Engineer with deep expertise in AWS Bedrock, generative AI, and modern software development to join our fast-moving team at the intersection of developer experience, machine learning, and enterprise software. As part of the Okta Business Technology (BT) team, you will build cutting-edge tools that make AI development intuitive, collaborative, and scalable. If you're passionate about building next-generation AI applications and empowering developers through innovative platforms, this is the role for you. Job Duties and Responsibilities: Design and develop full-stack applications that integrate seamlessly with Amazon Bedrock AI agents. Build scalable, production-grade AI/ML solutions using AWS Bedrock and the AWS Agent Development Kit. Implement back-end services and APIs that interact with foundation models for tasks such as automated sourcing, content generation, and prompt orchestration. Create intuitive and performant front-end interfaces using Angular that connect with GenAI capabilities. Ensure seamless integration of LLMs and foundation models into the broader application architecture. Explore and rapidly prototype with the latest LLMs and GenAI tools to iterate on new capabilities and features. Build sophisticated AI workflows using knowledge bases, guardrails, and prompt chaining/flows. Deploy and maintain enterprise-ready GenAI applications at scale. Collaborate with business analysts to understand customer needs and use cases, and work with the team to design and develop POCs to test, implement, and support solutions. Foster strong relationships with teammates, customers, and vendors to facilitate effective communication and collaboration throughout the project lifecycle. Perform in-depth analysis of requirements, ensuring compatibility and adherence to established standards and best practices. Required Skills: 7+ years of robust hands-on development and design experience. 3+ years of experience in one or more of the following areas: deep learning, LLMs, NLP, speech, conversational AI, AI infrastructure, fine-tuning, and optimization of PyTorch models. Software development experience in Python (must have) and optionally one of Go, Rust, or C/C++. Experience with at least one LLM such as Llama, GPT, Claude, Falcon, Gemini, etc. Expertise in AWS Bedrock and the AWS Agent Development Kit is mandatory. Hands-on experience with Python libraries including boto3, NumPy, Pandas, TensorFlow or PyTorch, and Hugging Face Transformers. Solid understanding of the AWS ecosystem (e.g., CloudWatch, Step Functions, Kinesis, Lambda). Familiarity with the full software development lifecycle, including version control, CI/CD, code reviews, automated testing, and production monitoring. Knowledge of ERP, HR technology, and legal business processes is an advantage. Education and Certifications: Bachelor's degree in Computer Science or a related field, or equivalent practical experience. AWS certifications (e.g., AWS Certified Solutions Architect, Machine Learning Specialty) are a plus. Experience working with large-scale generative AI or LLM-based applications. Knowledge of secure application development and data privacy practices in AI/ML workloads. This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment.
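To illustrate the Bedrock integration work described above, here is a minimal boto3 sketch of calling a foundation model through the Bedrock runtime. The model ID and request body follow Anthropic's Bedrock request schema as an assumption; the exact body format depends on the chosen model provider.

```python
# A minimal sketch of invoking a foundation model via Amazon Bedrock with boto3.
# Model ID and request schema are assumptions; prompt text is a placeholder.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Summarize our Q3 hiring pipeline."}],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",   # placeholder model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result)
```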

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Designs, develops, and supports application solutions with a focus on the HANA version of Advanced Business Application Programming (ABAP). This specialty may design, develop, and/or re-engineer highly complex application components and integrate software packages, programs, and reusable objects residing on multiple platforms. This specialty may additionally have working knowledge of SAP HANA technical concepts and architecture, data modelling using HANA Studio, ABAP Development Tools (ADT), code performance rules and guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, database procedures, text search, ALV on HANA, and consumption of HANA Live models. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 4-12 years of experience required. ABAP on HANA application developers would possess knowledge of the following topics and apply them to bring value and innovation to client engagements: SAP HANA technical concepts and architecture, data modelling using HANA Studio, ABAP Development Tools (ADT), code performance rules and guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, database procedures, text search, ALV on HANA, and consumption of HANA Live models. Designing and developing data dictionary objects, data elements, domains, structures, views, lock objects, and search helps, and formatting the output of SAP documents with multiple options. Modifying standard layout sets in SAP Scripts, Smart Forms, and Adobe Forms. Development experience in RICEF (Reports, Interfaces, Conversions, Enhancements, and Forms). Preferred technical and professional experience: Experience working in implementation, upgrade, maintenance, and post-production support projects would be an advantage. Understanding of SAP functional requirements and their conversion into technical design and development using the ABAP language for reports, interfaces, conversions, enhancements, and forms in implementation or support projects.

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office

Naukri logo

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that address the client's needs. Your primary responsibilities include: design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements; build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization; coordinate data access and security so that data scientists and analysts can easily access data whenever they need it. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data (Hadoop, Spark, Scala, Python, HBase, Hive). Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Experience developing Python and PySpark programs for data analysis, including using Python to build a custom framework for generating rules (much like a rules engine). Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark; Apache Spark DataFrames/RDDs were used to apply business transformations, with Hive context objects for read/write operations. Preferred technical and professional experience: understanding of DevOps; experience building scalable end-to-end data ingestion and processing solutions; experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
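This posting's good-to-have list includes AWS S3 and Athena. For illustration, a minimal boto3 sketch of running an Athena query and reading the result follows; the database, table, and S3 output location are placeholders.

```python
# A minimal sketch: run an Athena SQL query with boto3 and fetch the results.
# Database, table, and S3 output location are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

execution = athena.start_query_execution(
    QueryString="SELECT region, SUM(amount) AS total FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics_db"},                      # placeholder
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # placeholder
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```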

Posted 1 week ago

Apply

2.0 - 3.0 years

5 - 9 Lacs

Kochi

Work from Office

Naukri logo

Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift. Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation. Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help our business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and formatting and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services like vision, translation, etc. to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions ensuring data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD. Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data / Apache Spark environments), data libraries (e.g., Pandas, SciPy, TensorFlow, Keras, etc.), and SQL, with 2-3 years of hands-on experience working on these technologies. Experience in one of the many BI tools such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse. Additional Information: Experience working in cloud data warehouses like Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering. Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.
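As a small illustration of the data preparation described above (sourcing touchpoint data and shaping it into an analyzable format), here is a minimal pandas sketch. The file name and columns are illustrative only.

```python
# A minimal pandas sketch: load raw touchpoint records, clean them, and build
# a unified daily summary. File name and columns are placeholders.
import pandas as pd

raw = pd.read_csv("touchpoint_events.csv", parse_dates=["event_time"])  # placeholder file

cleaned = (
    raw.dropna(subset=["customer_id"])                       # drop unusable rows
       .assign(event_date=lambda df: df["event_time"].dt.date)
)

# Daily summary per channel, ready for a BI dashboard or downstream pipeline step
summary = (
    cleaned.groupby(["channel", "event_date"])
           .agg(events=("customer_id", "count"),
                unique_customers=("customer_id", "nunique"))
           .reset_index()
)
print(summary.head())
```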

Posted 1 week ago

Apply

2.0 - 3.0 years

5 - 9 Lacs

Kochi

Work from Office

Naukri logo

Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift. Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help our business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and formatting and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services like vision, translation, etc. to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions ensuring data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD. Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data / Apache Spark environments), data libraries (e.g., Pandas, SciPy, TensorFlow, Keras, etc.), and SQL, with 2-3 years of hands-on experience working on these technologies. Experience in one of the many BI tools such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse. Additional Information: Experience working in cloud data warehouses like Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering. Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.

Posted 1 week ago

Apply

3.0 - 4.0 years

5 - 9 Lacs

Kochi

Work from Office

Naukri logo

Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python, PySpark. Good-to-have skills: Redshift. Job Summary: We are seeking a highly skilled and experienced Senior Data Engineer to join our growing Data and Analytics team. The ideal candidate will have deep expertise in Databricks and cloud data warehousing, with a proven track record of designing and building scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. This role involves working collaboratively with cross-functional teams to ensure the organization leverages data as a strategic asset. Roles & Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes using Databricks and other modern tools. Architect, implement, and manage cloud-based data warehousing solutions on Databricks (Lakehouse architecture). Develop and maintain optimized data lake architectures to support advanced analytics and machine learning use cases. Collaborate with stakeholders to gather requirements, design solutions, and ensure high-quality data delivery. Optimize data pipelines for performance and cost efficiency. Implement and enforce best practices for data governance, access control, security, and compliance in the cloud. Monitor and troubleshoot data pipelines to ensure reliability and accuracy. Lead and mentor junior engineers, fostering a culture of continuous learning and innovation. Excellent communication skills. Ability to work independently and alongside clients based in Western Europe. Professional and Technical Skills: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help our business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and formatting and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services like vision, translation, etc. to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions ensuring data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD. Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data / Apache Spark environments), data libraries (e.g., Pandas, SciPy, TensorFlow, Keras, etc.), and SQL, with 3-4 years of hands-on experience working on these technologies. Experience in one of the many BI tools such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse.
Additional Information: Experience working in cloud data warehouses like Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering. Qualification: Experience: 5-8 years of experience is required. Educational Qualification: Graduation.

Posted 1 week ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Kochi

Work from Office

Naukri logo

Job Title: Data Engineer Sr. Analyst, ACS SONG. Management Level: Level 10 - Sr. Analyst. Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift. Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help our business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints and formatting and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services like vision, translation, etc. to generate outcomes that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions ensuring data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD. Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data / Apache Spark environments), data libraries (e.g., Pandas, SciPy, TensorFlow, Keras, etc.), and SQL, with 2-3 years of hands-on experience working on these technologies. Experience in one of the many BI tools such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse. Additional Information: Experience working in cloud data warehouses like Redshift or Synapse. Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; Snowflake SnowPro Core / Data Engineer; Databricks Data Engineering. About Our Company | Accenture. Qualification: Experience: 3.5-5 years of experience is required. Educational Qualification: Graduation.

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: AWS Architecture. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the entire application development process and ensuring its successful implementation. Your role will involve collaborating with cross-functional teams, managing the team's performance, and making key decisions. With your expertise in AWS Architecture, you will provide innovative solutions to problems and contribute to the success of multiple teams. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Lead the effort to design, build, and configure applications. Act as the primary point of contact for application-related matters. Oversee the entire application development process. Collaborate with cross-functional teams. Manage the team's performance. Make key decisions to ensure successful implementation. Provide innovative solutions to problems. Contribute to the success of multiple teams. Professional & Technical Skills: Must have: proficiency in AWS Architecture. Strong understanding of cloud computing principles and best practices. Experience in designing and implementing scalable and secure AWS solutions. Knowledge of AWS services such as EC2, S3, Lambda, and RDS. Hands-on experience with infrastructure-as-code tools like CloudFormation or Terraform. Good to have: experience with DevOps practices and tools; familiarity with other cloud platforms like Azure or Google Cloud. Solid grasp of software development methodologies and practices. Additional Information: The candidate should have a minimum of 7.5 years of experience in AWS Architecture. This position is based in Gurugram. 15 years of full-time education is required.

Posted 1 week ago

Apply

10.0 - 14.0 years

30 - 35 Lacs

Hyderabad

Work from Office


Title: GN - SONG - Service - Amazon Connect Platforms - Associate Director
The Customer, Sales & Service Practice | Cloud
Job Title - Amazon Connect + Level 7 (Manager) + Entity (S&C GN)
Management Level: Level 7 - Manager
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai
Must have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service and marketing to accelerate business change.
Practice: Customer Sales & Service Sales I | Areas of Work: Amazon Connect - Contact Center Transformation, Analysis and Implementation | Level: Manager | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Education Qualification (Mandatory): Post Graduation in Business Management | Years of Exp: 10-14 years
Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.
The Practice - A Brief Sketch
The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics.
You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
Lead a team responsible for designing, developing and implementing contact center transformations on platform-led solutions across cloud contact center transformation.
Business Development: Lead and manage proposals in response to client requests/RFPs across multiple market units to ensure a continuous pipeline of opportunities/projects.
Market Unit Development: Connect and work with market unit leads to identify and understand demand in the market, in terms of both skill and scale required.
People Development: Grow the practice and business by hiring across platforms and by nurturing and upskilling the existing team.
Project Delivery: Lead a team of contact center transformation consultants and engage with the client and implementation team for design sessions, requirement gathering and grooming, regular working demos to all stakeholders, and solution design and implementation activities.
Provide best-practices guidance and an implementation approach based on industry or process benchmarks.
Develop innovative, fact-based and achievable strategies and operating models after evaluating multiple strategic options.
Lead practice-specific initiatives, including creating points of view, creating reusable assets in the contact center space, performing analysis of industry research and market trends, and bringing in innovative solutions.
Bring your best skills forward to excel at the role:
Seasoned techno-functional professional with significant experience on a large-scale business/operational transformation project.
Good understanding of the contact center technology landscape.
An understanding of the AWS Cloud platform and services, with solution architect skills.
Deep expertise in Amazon Connect product features and contact-center-related AWS services such as Lex, Pinpoint, Transcribe and Comprehend.
Deep functional and technical understanding of APIs and related integration experience.
Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms.
Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
Ability to help the team implement, sell and deliver cloud contact center solutions to clients.
Excellent communication skills.
Strong program management and people management skills.
Read about us. Blogs
Your experience counts!
Bachelor's degree in a related field or equivalent experience; Post-Graduation in Business Management would be of added value.
Minimum 8 years of experience in delivering software-as-a-service or platform-as-a-service projects (preferably a mix of cloud and on-premise contact center platforms).
Hands-on experience working on the design, development and deployment of contact center solutions at scale.
Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Pinpoint, Comprehend and Transcribe.
Experience taking a lead role in building contact center applications that have been successfully delivered to customers.
What's in it for you
An opportunity to work on transformative projects with key G2000 clients.
Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge and capabilities.
Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.
About Accenture: www.accenture.com
About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle.
Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network
Accenture Capability Network | Accenture in One Word - come and be a part of our team.
Qualification
Good to have skills: AWS Lambda and Lex bots, Pinpoint, Transcribe, Comprehend
Experience: Minimum 10 year(s) of experience is required
Educational Qualification: MBA from a Tier 1 or Tier 2 institute
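For context on the Amazon Connect + Lambda integration this role centres on, here is a minimal, hedged sketch of a Lambda handler invoked from a Connect contact flow. The parameter names and the tier logic are hypothetical; the flat key/value return shape reflects how Connect consumes Lambda responses, but treat the details as an assumption to verify against the AWS documentation.

```python
def handler(event, context):
    # Amazon Connect contact flows invoke Lambda with contact context under
    # event["Details"]; the flow expects a flat key/value map in the response.
    details = event.get("Details", {})
    contact = details.get("ContactData", {})
    params = details.get("Parameters", {})

    # Caller's phone number, as surfaced by Connect on the contact record.
    phone = contact.get("CustomerEndpoint", {}).get("Address", "unknown")

    # Hypothetical business rule -- a real flow would look this up in a CRM or DynamoDB.
    tier = "gold" if params.get("vip") == "true" else "standard"

    # Keys returned here become attributes the contact flow can branch on.
    return {"customerPhone": phone, "customerTier": tier}
```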

Posted 1 week ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office


The Customer, Sales & Service Practice | Cloud
Job Title - Amazon Connect + Level 11 (Analyst) + Entity (S&C GN)
Management Level: Level 11 - Analyst
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai
Must have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Good to have skills: AWS Lambda and Lex bots, Amazon Connect
Experience: Minimum 2 year(s) of experience is required
Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute
Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service and marketing to accelerate business change.
Practice: Customer Sales & Service Sales I | Areas of Work: AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-5 years
Explore an Exciting Career at Accenture
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.
The Practice - A Brief Sketch
The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics.
You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
Work on creating business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities.
Work on creating the cloud transformation approach for contact center transformations.
Work along with Solution Architects on architecting cloud contact center technology on the AWS platform.
Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect.
Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center.
Support AWS offering leads in responding to RFIs and RFPs.
Bring your best skills forward to excel at the role:
Good understanding of the contact center technology landscape.
An understanding of the AWS Cloud platform and services, with solution architect skills.
Deep expertise in AWS contact-center-relevant services.
Sound experience in developing Amazon Connect flows, AWS Lambda functions and Lex bots.
Deep functional and technical understanding of APIs and related integration experience.
Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms.
Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
Ability to help the team implement, sell and deliver cloud contact center solutions to clients.
Excellent communication skills.
Ability to develop requirements based on leadership input.
Ability to work effectively in a remote, virtual, global environment.
Ability to take on new challenges and be a passionate learner.
Read about us. Blogs
What's in it for you
An opportunity to work on transformative projects with key G2000 clients.
Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge and capabilities.
Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.
About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com
About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, as well as the workforce of the future, helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network
Accenture Capability Network | Accenture in One Word - come and be a part of our team.
Qualification
Your experience counts!
Bachelor's degree in a related field or equivalent experience; Post-Graduation in Business Management would be of added value.
Minimum 2 years of experience in delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect Contact Center cloud solution.
Hands-on experience working on the design, development and deployment of contact center solutions at scale.
Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend and Transcribe.
Working knowledge of one of the programming/scripting languages such as Node.js, Python or Java.
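Since the role calls for hands-on Amazon Lex and Lambda work in a language such as Python, the following is a hedged sketch of a Lex V2 fulfillment Lambda that closes an intent with a plain-text message. The intent and slot names are hypothetical, and the response shape should be checked against the current Lex V2 Lambda contract.

```python
def handler(event, context):
    # Lex V2 passes the active intent under event["sessionState"]["intent"].
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}

    # Hypothetical slot lookup -- a real bot would validate and branch here.
    account_slot = slots.get("AccountNumber") or {}
    account = (account_slot.get("value") or {}).get("interpretedValue", "unknown")

    # Close the dialog and mark the intent as fulfilled with a confirmation message.
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent["name"], "state": "Fulfilled"},
        },
        "messages": [
            {
                "contentType": "PlainText",
                "content": f"Thanks, I have noted account {account}.",
            }
        ],
    }
```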

Posted 1 week ago

Apply

6.0 - 8.0 years

20 - 25 Lacs

Noida

Work from Office


What you'll be doing:
Developing cloud infrastructure assets and solutions.
Analysing business requirements and building the solution as per the plan.
Working as part of the Practice team to deliver the assets and IPs.
Working with the team to deliver Azure and AWS solutions.
What you'll bring:
Demonstrable experience in Azure and AWS, with a technical background and experience in DevOps and automation.
Demonstrable knowledge of the development process.
Good team player.
Core Technical Knowledge Required:
Good experience with Infrastructure as Code (ARM, Bicep, Terraform, PowerShell).
Azure: Azure IaaS (virtual machines, storage, networking, security); Azure Backup and Recovery Services; Azure Governance (Blueprints, policies, tagging, cost management); Azure SQL Databases (Managed Instances, PaaS, IaaS); Azure Security (Zero Trust, Defender for Cloud, Sentinel, Entra, AIP); Azure Serverless and integration (Batch, Functions, Logic Apps, Event Grid); Azure Containers (AKS, ACI, ACR).
AWS: AWS IaaS (EC2, EBS/S3, VPC, Security Groups); AWS Backup and Disaster Recovery Services; AWS Governance (Control Tower, Service Control Policies, AWS Config, tagging, Cost Explorer); AWS SQL Databases (RDS, RDS Custom, EC2-based SQL); AWS Security (Zero Trust, Security Hub, GuardDuty, IAM Identity Center, Macie); AWS Serverless and Integration (Batch, Lambda, Step Functions, EventBridge); AWS Containers (EKS, ECS, Fargate, ECR).
Total Experience Expected: 6-8 years
Certifications: Microsoft Azure/AWS DevOps Engineer; Microsoft Azure/AWS Administrator (desirable); Terraform Certified (desirable); expertise in Bicep/ARM.
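As an illustration of the governance side of this role (tagging and cost management on AWS), here is a hedged boto3 sketch that lists resources missing a required tag. The tag key is hypothetical; in practice an AWS Config rule or a Service Control Policy would usually enforce this rather than an ad-hoc script.

```python
import boto3

REQUIRED_TAG = "CostCenter"  # hypothetical governance tag key


def find_untagged_resources():
    """Return ARNs of resources visible to the tagging API that lack REQUIRED_TAG."""
    client = boto3.client("resourcegroupstaggingapi")
    untagged = []

    # The tagging API is paginated; walk every page of resource/tag mappings.
    paginator = client.get_paginator("get_resources")
    for page in paginator.paginate():
        for mapping in page["ResourceTagMappingList"]:
            tags = {t["Key"]: t["Value"] for t in mapping.get("Tags", [])}
            if REQUIRED_TAG not in tags:
                untagged.append(mapping["ResourceARN"])
    return untagged


if __name__ == "__main__":
    for arn in find_untagged_resources():
        print(arn)
```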

Posted 1 week ago

Apply

4.0 - 9.0 years

8 - 13 Lacs

Bengaluru

Work from Office


JR: R00195769
Experience: Minimum 4 year(s) of experience is required
Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute
Job Title - AWS DevOps + Level 9 (Consultant) + Entity (S&C GN)
Management Level: Level 9 - Team Lead/Consultant
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai
Must-have skills: AWS Cloud, Cloud Contact Center transformation, CI/CD pipelines, AWS DevOps, DynamoDB, S3, Lambda, Lex, IAM, VPC, CloudWatch, CloudTrail, API Gateway, Kinesis, WAF
Good to have skills: AWS Connect, Amazon Connect, Python, Bash, PowerShell or Node.js
Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.
Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance.
Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.
The Practice - A Brief Sketch
The Strategy & Consulting Global Network Song practice is aligned to the Global Network Song Practice of Accenture and works with clients across their marketing, sales and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement and customer satisfaction, and positively impacting front-end business metrics.
You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
Work on creating business cases for journey to cloud, cloud strategy, and cloud contact center vendor assessment activities.
Work on creating the cloud transformation approach for contact center transformations.
Work along with Solution Architects on architecting cloud contact center technology on the AWS platform.
Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect.
Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center.
Support AWS offering leads in responding to RFIs and RFPs.
Bring your best skills forward to excel at the role:
Good understanding of the contact center technology landscape.
An understanding of the AWS Cloud platform and services, with solution architect skills.
Deep expertise in AWS contact-center-relevant services.
Sound experience in developing Amazon Connect flows and Lex bots.
Deep functional and technical understanding of APIs and related integration experience.
Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms.
Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
Ability to help the team implement, sell and deliver cloud contact center solutions to clients.
Excellent communication skills.
Ability to develop requirements based on leadership input.
Ability to work effectively in a remote, virtual, global environment.
Ability to take on new challenges and be a passionate learner.
Configure and manage Amazon Connect instances to enhance customer service operations.
Integrate Amazon Connect with other AWS services and third-party applications.
Work closely with cross-functional teams to understand requirements and deliver solutions that align with business goals.
Troubleshoot and resolve issues related to deployment, infrastructure, and application performance.
Provide technical support and guidance to team members and stakeholders.
Maintain comprehensive documentation of systems, processes, and configurations.
Proven experience in designing and managing cloud-based (preferably AWS) infrastructure and deployment pipelines (CI/CD).
Proficiency in AWS services, including DynamoDB, S3, Lambda, Lex, IAM, VPC, CloudWatch, CloudTrail, API Gateway, Kinesis, WAF, etc.
Strong experience with DevOps tools and practices, including CI/CD and infrastructure as code (Terraform, CloudFormation, CDK, etc.), covering configuration, integration, and optimization.
Working knowledge of one of the programming/scripting languages such as Python, Bash, PowerShell or Node.js.
Knowledge of security best practices and compliance requirements related to contact centers and cloud infrastructure.
Experience in setting up cloud instances and accounts/users with security profiles, and designing applications.
Experience taking a lead role in building contact center applications that have been successfully delivered to customers.
What's in it for you
An opportunity to work on transformative projects with key G2000 clients.
Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge and capabilities.
Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization.
About Accenture:
Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.
Additional Information:
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.
About Our Company | Accenture
Qualification
Experience: Minimum 4 year(s) of experience is required
Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute
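To make the DynamoDB/Lambda portion of the stack concrete, here is a hedged sketch of a customer-lookup Lambda of the kind an Amazon Connect flow might call. The table name, key schema, and attribute names are hypothetical, and the flat string-valued response is an assumption about how the calling flow consumes it.

```python
import os

import boto3

# Hypothetical table and key names -- a sketch, not a prescribed implementation.
TABLE_NAME = os.environ.get("CUSTOMER_TABLE", "customers")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    # Phone number passed in as a contact-flow parameter (assumed name).
    phone = event.get("Details", {}).get("Parameters", {}).get("phoneNumber")
    if not phone:
        return {"found": "false"}

    # Point read against the assumed partition key.
    item = table.get_item(Key={"phoneNumber": phone}).get("Item")
    if not item:
        return {"found": "false"}

    # Return a flat, string-valued map so the contact flow can branch on it.
    return {
        "found": "true",
        "firstName": str(item.get("firstName", "")),
        "tier": str(item.get("tier", "")),
    }
```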

Posted 1 week ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Cloud Data Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Solutions Architect - Lead, you will analyze, design, code, and test multiple components of application code. You will perform maintenance, enhancements, and/or development work, contributing to the overall success of the projects.
Roles & Responsibilities:
Design and develop the overall architecture of our digital data platform using AWS services.
Create and maintain cloud infrastructure designs and architectural diagrams.
Collaborate with stakeholders to understand business requirements and translate them into scalable AWS-based solutions.
Evaluate and recommend AWS technologies, services, and tools for the platform.
Ensure the scalability, performance, security, and cost-effectiveness of the AWS-based platform.
Lead and mentor the technical team in implementing architectural decisions and AWS best practices.
Develop and maintain architectural documentation and standards for AWS implementations.
Stay current with emerging AWS technologies, services, and industry trends.
Optimize existing AWS infrastructure for performance and cost.
Implement and manage disaster recovery and business continuity plans.
Professional & Technical Skills:
Minimum 8 years of experience in IT architecture, with at least 5 years in a solutions architect role.
Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM).
Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena).
Experience with Infrastructure as Code (e.g., CloudFormation, Terraform).
Exposure to Continuous Integration/Continuous Deployment (CI/CD) pipelines.
Experience with containerization technologies (e.g., Docker, Kubernetes).
Proficiency in multiple programming languages and frameworks.
AWS Certified Solutions Architect - Professional certification required.
Additional Information:
The candidate should have a minimum of 5 years of experience in a solutions architect role.
This position is based at our Hyderabad office.
A 15 years full time education is required (Bachelor of Engineering in Electronics/Computer Science, or any related stream).
Qualification: 15 years full time education
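The big-data tooling listed above (Athena in particular) can be driven from Python. The following hedged boto3 sketch starts an Athena query, polls for completion, and returns the raw rows; the database name, table, and S3 output location are hypothetical.

```python
import time

import boto3

athena = boto3.client("athena")


def run_query(sql, database="analytics_db", output="s3://example-athena-results/"):
    """Run an ad-hoc Athena query and return the raw result rows."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]

    # Poll until Athena reports a terminal state.
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Query {qid} ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]


if __name__ == "__main__":
    # Hypothetical table from the curated layer described earlier.
    for row in run_query(
        "SELECT event_date, COUNT(*) AS n FROM daily_counts GROUP BY event_date LIMIT 10"
    ):
        print(row)
```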

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
