6.0 - 10.0 years
7 - 15 Lacs
Pune, Bengaluru
Work from Office
Role & responsibilities
Essential Skills:
Experience: 6 to 10 years
- Technical Expertise: Proficiency in AWS services such as Amazon S3, Redshift, EMR, Glue, Lambda, and Kinesis. Strong skills in SQL and experience with scripting languages such as Python or Java.
- Data Engineering Experience: Hands-on experience in building and maintaining data pipelines, data modeling, and working with big data technologies.
- Problem-Solving Skills: Ability to analyze complex data issues and develop effective solutions to optimize data processing and storage.
- Communication and Collaboration: Strong interpersonal skills to work effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
Educational Qualifications: A bachelor's degree in computer science, information technology, or a related field is typically required. Relevant AWS certifications, such as AWS Certified Data Analytics - Specialty, are advantageous.
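To make the stack above concrete, here is a minimal sketch of the kind of pipeline step this role describes: a Kinesis-triggered Lambda landing decoded events in S3. The bucket name and key layout are illustrative assumptions, not part of the posting; only the standard Kinesis event shape is relied on.

```python
import base64
import json

import boto3

s3 = boto3.client("s3")

# Hypothetical bucket name, for illustration only.
RAW_BUCKET = "example-raw-events"

def handler(event, context):
    """Decode Kinesis records and land them in S3 as newline-delimited JSON."""
    lines = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        lines.append(json.loads(payload))
    if lines:
        key = f"events/{context.aws_request_id}.json"
        s3.put_object(
            Bucket=RAW_BUCKET,
            Key=key,
            Body="\n".join(json.dumps(l) for l in lines).encode("utf-8"),
        )
    return {"records_written": len(lines)}
```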
Posted 1 week ago
2.0 - 5.0 years
2 - 6 Lacs
Vadodara
Work from Office
Collaborate with cross-functional teams on API design and implementation. Develop scalable backend solutions using Python, RDS/PostgreSQL, Oracle DB, and AWS Lambda. Collaborate with frontend, AI, and domain experts in weekly sprints.
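As an illustration of the backend work this posting describes, here is a minimal sketch of an API Gateway proxy-style Lambda querying PostgreSQL. The `items` table, environment variable names, and the note about packaging psycopg2 as a layer are assumptions for illustration.

```python
import json
import os

import psycopg2  # typically packaged as a Lambda layer

def handler(event, context):
    """Proxy-integration handler: fetch rows from PostgreSQL (RDS) and return JSON."""
    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],          # connection details from env vars
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM items ORDER BY id LIMIT 50")
            rows = [{"id": r[0], "name": r[1]} for r in cur.fetchall()]
    finally:
        conn.close()
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(rows),
    }
```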
Posted 1 week ago
5.0 - 8.0 years
18 - 30 Lacs
Hyderabad
Work from Office
AWS Data Engineer with Glue, Terraform, and Business Intelligence (Tableau) development
- Design, develop, and maintain AWS data pipelines using Glue, Lambda, and Redshift
- Collaborate with the BI team on ETL processes and dashboard creation with Tableau
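A minimal AWS Glue job skeleton of the sort this role would maintain is sketched below. The catalog database, table, filter, and output path are hypothetical; the boilerplate (GlueContext, Job, getResolvedOptions) is the standard Glue PySpark pattern.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical catalog table and output path, for illustration only.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)
df = dyf.toDF().filter("order_status = 'COMPLETE'")

# Curated Parquet output that a Tableau data source or Redshift COPY can consume.
df.write.mode("overwrite").parquet("s3://example-curated/orders/")

job.commit()
```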
Posted 2 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
What you will do
In this vital role you will be responsible for designing, building, and maintaining scalable, secure, and reliable AWS cloud infrastructure. This is a hands-on engineering role requiring deep expertise in Infrastructure as Code (IaC), automation, cloud networking, and security. The ideal candidate should have strong AWS knowledge and be capable of writing and maintaining Terraform, CloudFormation, and CI/CD pipelines to streamline cloud deployments. Please note, this is an onsite role based in Hyderabad.
Roles & Responsibilities:
AWS Infrastructure Design & Implementation
- Architect, implement, and manage highly available AWS cloud environments.
- Design VPCs, subnets, security groups, and IAM policies to enforce security best practices.
- Optimize AWS costs using reserved instances, savings plans, and auto-scaling.
Infrastructure as Code (IaC) & Automation (see the sketch after this listing)
- Develop, maintain, and enhance Terraform and CloudFormation templates for cloud provisioning.
- Automate deployment, scaling, and monitoring using AWS-native tools and scripting.
- Implement and manage CI/CD pipelines for infrastructure and application deployments.
Cloud Security & Compliance
- Enforce best practices in IAM, encryption, and network security.
- Ensure compliance with SOC 2, ISO 27001, and NIST standards.
- Implement AWS Security Hub, GuardDuty, and WAF for threat detection and response.
Monitoring & Performance Optimization
- Set up AWS CloudWatch, Prometheus, Grafana, and logging solutions for proactive monitoring.
- Implement auto-scaling, load balancing, and caching strategies for performance optimization.
- Troubleshoot cloud infrastructure issues and conduct root cause analysis.
Collaboration & DevOps Practices
- Work closely with software engineers, SREs, and DevOps teams to support deployments.
- Maintain GitOps best practices for cloud infrastructure versioning.
- Support an on-call rotation for high-priority cloud incidents.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree and 4 to 6 years of experience in computer science, IT, or a related field with hands-on cloud experience; OR
- Bachelor's degree and 6 to 8 years of experience in computer science, IT, or a related field with hands-on cloud experience; OR
- Diploma and 10 to 12 years of experience in computer science, IT, or a related field with hands-on cloud experience.
Must-Have Skills:
- Deep hands-on experience with AWS (EC2, S3, RDS, Lambda, VPC, IAM, ECS/EKS, API Gateway, etc.).
- Expertise in Terraform and CloudFormation for AWS infrastructure automation.
- Strong knowledge of AWS networking (VPC, Direct Connect, Transit Gateway, VPN, Route 53).
- Experience with Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, CodePipeline, etc.).
- Strong troubleshooting and debugging skills in cloud networking, storage, and security.
Preferred Qualifications:
Good-to-Have Skills:
- Experience with Kubernetes (EKS) and service mesh architectures.
- Knowledge of AWS Lambda and event-driven architectures.
- Familiarity with AWS CDK, Ansible, or Packer for cloud automation.
- Exposure to multi-cloud environments (Azure, GCP).
- Familiarity with HPC, DGX Cloud.
Professional Certifications (preferred):
- AWS Certified Solutions Architect - Associate or Professional
- AWS Certified DevOps Engineer - Professional
- Terraform Associate Certification
Soft Skills:
- Strong analytical and problem-solving skills.
- Ability to work effectively with global, virtual teams.
- Effective communication and collaboration with cross-functional teams.
- Ability to work in a fast-paced, cloud-first environment.
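The posting names Terraform, CloudFormation, and AWS CDK as IaC options; since the code examples here are in Python, the sketch below uses CDK v2. The stack name, bucket, and VPC sizing are illustrative assumptions, and a real deployment would add pipelines, tagging, and policy controls.

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_ec2 as ec2, aws_s3 as s3
from constructs import Construct

class CoreInfraStack(Stack):
    """Minimal IaC sketch: a two-AZ VPC and a locked-down, versioned S3 bucket."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Hypothetical network layout, for illustration only.
        ec2.Vpc(self, "AppVpc", max_azs=2)

        s3.Bucket(
            self, "LogsBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
        )

app = App()
CoreInfraStack(app, "CoreInfraStack")
app.synth()
```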
Posted 2 weeks ago
7.0 - 12.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are currently seeking a Lead Data Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Position Overview
We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
Key Responsibilities
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS, plus Kafka and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud, and AWS
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD, and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent and Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise in technical interface design
- Use of Docker
Responsibilities
- Design and implement scalable data architectures using AWS services, Confluent, and Kafka (see the producer sketch after this listing)
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent, and Kafka
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture
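A minimal sketch of the Kafka/Confluent ingestion side of such an architecture, using the confluent-kafka Python client. The broker address, topic, and event shape are placeholders; Confluent Cloud would additionally require SASL credentials in the producer config.

```python
import json

from confluent_kafka import Producer

# Hypothetical bootstrap servers; Confluent Cloud also needs security.protocol
# and sasl.* settings in this config dict.
producer = Producer({"bootstrap.servers": "broker1:9092"})

def delivery_report(err, msg):
    """Called once per message to confirm delivery or surface an error."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

event = {"order_id": 42, "status": "CREATED"}  # illustrative payload
producer.produce(
    "orders",                              # hypothetical topic name
    key=str(event["order_id"]),
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()                           # block until outstanding messages are sent
```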
Posted 2 weeks ago
2.0 - 6.0 years
1 - 5 Lacs
Noida
Work from Office
Req ID: 324014
We are currently seeking a Tableau Admin with AWS experience to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
Tableau Admin with AWS Experience
We are seeking a skilled Tableau Administrator with experience in AWS to join our team. The ideal candidate will be responsible for managing and optimizing our Tableau Server environment hosted on AWS, ensuring efficient operation, data security, and seamless integration with other data sources and analytics tools.
Key Responsibilities
- Manage, configure, and administer Tableau Server on AWS, including setting up sites and managing user access and permissions.
- Monitor server activity and performance, conduct regular system maintenance, and troubleshoot issues to ensure optimal performance and minimal downtime.
- Collaborate with data engineers and analysts to optimize data sources and dashboard performance.
- Implement and manage security protocols, ensuring compliance with data governance and privacy policies.
- Automate monitoring and server management tasks using AWS and Tableau APIs (see the sketch after this listing).
- Assist in the design and development of complex Tableau dashboards. Provide technical support and training to Tableau users.
- Stay updated on the latest Tableau and AWS features and best practices, recommending and implementing improvements.
Qualifications
- Proven experience as a Tableau Administrator, with strong skills in Tableau Server and Tableau Desktop.
- Experience with AWS, particularly with services relevant to hosting and managing Tableau Server (e.g., EC2, S3, RDS).
- Familiarity with SQL and experience working with various databases. Knowledge of data integration, ETL processes, and data warehousing principles.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Relevant certifications in Tableau and AWS are a plus.
A Tableau Administrator, also known as a Tableau Server Administrator, is responsible for managing and maintaining Tableau Server, a platform that enables organizations to create, share, and collaborate on data visualizations and dashboards. Here's a typical job description for a Tableau Admin:
1. Server Administration: Install, configure, and maintain Tableau Server to ensure its reliability, performance, and security.
2. User Management: Manage user accounts, roles, and permissions on Tableau Server, ensuring appropriate access control.
3. Security: Implement security measures, including authentication, encryption, and access controls, to protect sensitive data and dashboards.
4. Data Source Connections: Set up and manage connections to various data sources, databases, and data warehouses for data extraction.
5. License Management: Monitor Tableau licensing, allocate licenses as needed, and ensure compliance with licensing agreements.
6. Backup and Recovery: Establish backup and disaster recovery plans to safeguard Tableau Server data and configurations.
7. Performance Optimization: Monitor server performance, identify bottlenecks, and optimize configurations to ensure smooth dashboard loading and efficient data processing.
8. Scaling: Scale Tableau Server resources to accommodate increasing user demand and data volume.
9. Troubleshooting: Diagnose and resolve issues related to Tableau Server, data sources, and dashboards.
10. Version Upgrades: Plan and execute server upgrades, apply patches, and stay current with Tableau releases.
11. Monitoring and Logging: Set up monitoring tools and logs to track server health, user activity, and performance metrics.
12. Training and Support: Provide training and support to Tableau users, helping them with dashboard development and troubleshooting.
13. Collaboration: Collaborate with data analysts, data scientists, and business users to understand their requirements and assist with dashboard development.
14. Documentation: Maintain documentation for server configurations, procedures, and best practices.
15. Governance: Implement data governance policies and practices to maintain data quality and consistency across Tableau dashboards.
16. Integration: Collaborate with IT teams to integrate Tableau with other data management systems and tools.
17. Usage Analytics: Generate reports and insights on Tableau usage and adoption to inform decision-making.
18. Stay Current: Keep up-to-date with Tableau updates, new features, and best practices in server administration.
A Tableau Administrator plays a vital role in ensuring that Tableau is effectively utilized within an organization, allowing users to harness the power of data visualization and analytics for informed decision-making.
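One way the "automate server management tasks using Tableau APIs" item might look in practice is via the tableauserverclient library, which wraps the Tableau REST API. A minimal sketch follows; the server URL and credentials are placeholders, and a production script would use a personal access token rather than a password.

```python
import tableauserverclient as TSC

# Hypothetical server URL and credentials, for illustration only.
auth = TSC.TableauAuth("admin_user", "secret", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Page through all workbooks and report name and last-updated time,
    # e.g. to flag stale content for cleanup.
    for workbook in TSC.Pager(server.workbooks):
        print(workbook.name, workbook.updated_at)
```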
Posted 2 weeks ago
10.0 - 20.0 years
18 - 32 Lacs
South Goa, Pune
Hybrid
Define serverless, server-based, or microservice architectures. Lead teams on end-to-end solution delivery on Azure & AWS. Code and deliver features to high-performance and scale criteria. Mentor juniors; grow the tech vision & IP.
Required Candidate profile
NodeJs: Modules, Closures, Prototypes, Promises, Async/Await, Worker threads; Sequelize or Knex or TediousJs (desirable). MERN & MEAN stack, CI/CD. Expertise on AWS & MS Azure cloud platforms.
Posted 2 weeks ago
5.0 - 10.0 years
12 - 18 Lacs
Kolkata
Remote
Key Responsibilities:
- Design, develop, and implement AI-driven chatbots and IVAs to streamline customer interactions.
- Work on conversational AI platforms to create a seamless customer experience, with a focus on natural language processing (NLP), intent recognition, and sentiment analysis.
- Collaborate with cross-functional teams, including product managers and customer support, to translate business requirements into technical solutions.
- Build, train, and fine-tune machine learning models to enhance IVA capabilities and ensure high accuracy in responses.
- Continuously optimize models based on user feedback and data-driven insights to improve performance.
- Integrate IVA/chat solutions with internal systems such as CRM and backend databases.
- Ensure scalability, robustness, and security of IVA/chat solutions in compliance with industry standards.
- Participate in code reviews, testing, and deployment of AI solutions to ensure high quality and reliability.
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, or a related field.
- 3+ years of experience in developing IVAs/chatbots, conversational AI, or similar AI-driven systems using AWS services.
- Expert in using Amazon Lex, Amazon Polly, AWS Lambda, and Amazon Connect (see the fulfillment sketch after this listing).
- Experience with Amazon Bedrock and SageMaker is an added advantage.
- Solid understanding of API integration and experience working with RESTful services.
- Strong problem-solving skills, attention to detail, and ability to work independently and in a team.
- Excellent communication skills in English, both written and verbal.
Preferred Qualifications:
- Experience in financial services or fintech projects.
- Knowledge of data security best practices and compliance requirements in the financial sector.
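In a Lex-based IVA, fulfillment typically runs in a Lambda function that receives the recognized intent and returns a dialog action. A minimal sketch, assuming the Lex V2 event and response shapes; a real handler would branch on intent name and slot values and call backend systems.

```python
def handler(event, context):
    """Minimal Lex V2 fulfillment handler: close the intent with a plain-text reply."""
    intent = event["sessionState"]["intent"]
    # Slot values arrive under intent["slots"]; a real handler would branch on them
    # and consult a CRM or database before replying.
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent["name"], "state": "Fulfilled"},
        },
        "messages": [
            {
                "contentType": "PlainText",
                "content": f"Thanks, your {intent['name']} request has been recorded.",
            }
        ],
    }
```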
Posted 2 weeks ago
6.0 - 10.0 years
22 - 32 Lacs
Hyderabad, Gurugram
Hybrid
Position: Lead Engineer - Java
Location: Hyderabad & Gurgaon
Work Mode: Hybrid (3 days per week from office)
Experience Range: 6 to 10 years
Notice Period: Immediate joiner
Technical Skills:
1. Java, Spring Boot: Core Java, OOPs, design principles, design patterns, Spring IoC, annotations, Spring JPA
2. Prefer strong Java candidates with exposure to other languages as well, such as NodeJS / TypeScript (any one)
3. AWS services: Lambda, API Gateway, IAM, SQS (exposure to at least 2 services from the list)
4. GitLab CI/CD / DevOps experience
5. Amazon EKS; Kubernetes in general, Docker
6. Kafka eventing (could be any eventing framework)
7. Java Swing, Struts (good to have)
8. Experience of working in Agile
SEND RESUMES TO amrita.nag@areteanstech.com
Please mention the details below to apply (mandatory):
Current CTC -
Expected CTC -
Total Experience -
Exp in Spring Boot -
Exp in DevOps/CI/CD -
Exp in NodeJS/TypeScript -
Notice Period -
Location -
Posted 2 weeks ago
4.0 - 9.0 years
15 - 22 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Looking for a Full Stack Developer who is strong in React JS + AWS.
Job Position (Title): UI Developer - React + JavaScript + AWS Lambda
Experience Required: 4+ years
Location: Delhi/NCR, Pune, Hyderabad
Technical Skill Requirements: React + JavaScript + TypeScript + HTML + CSS + AWS (Lambda, DynamoDB); any database knowledge preferable
Role and Responsibilities
- Managing the complete software development process from conception to deployment
- Maintaining and upgrading the software following deployment
- Managing the end-to-end life cycle for the production of software and applications
- Overseeing and guiding the analysing, writing, building, and deployment of software
- Overseeing the automated testing and providing feedback to management during the development process
- Modifying and testing changes to previously developed programs
Required Skills
- 4+ years of experience in developing enterprise-level applications using ReactJS, NodeJS, CSS, HTML & AWS
- Experience working with AWS (Lambda, DynamoDB)
- Worked on REST APIs
- Excellent verbal and written communication and collaboration skills to effectively communicate with both business and technical teams
- Comfortable working in a fast-paced, result-oriented environment
Posted 2 weeks ago
6.0 - 9.0 years
14 - 24 Lacs
Kochi
Work from Office
Greetings from Cognizant!! #MegaWalkIn
We have an exciting opportunity for the #AWS Services role with Cognizant. Join us if you match the criteria below!!
Primary Skill: AWS Services
Experience: 6-9 years
Job Location: PAN India
Interview Day: 14 Jun 2025 - Saturday
Interview Location: Kochi
Interview Mode: Walk-in Drive (Cognizant Technology Solutions, Infopark Phase 2, Kakkanad, Kochi, Kerala 682030)
Interested candidates, apply here >> https://forms.office.com/r/t2X9WiRS9T
Regards,
Vinosha TAG-HR
Posted 2 weeks ago
7.0 - 12.0 years
20 - 32 Lacs
Hyderabad
Hybrid
Python + AWS Developer
Location: Hyderabad - Hybrid
Experience Required: 7 to 12 years
Notice: Immediate to 15 days
Primary skills: Python, AWS, Kubernetes or CI/CD; good to have: serverless frameworks (AWS SAM, Serverless Framework, etc.).
Required Candidate profile
Strong knowledge of AWS services, including Lambda, Step Functions, DynamoDB, SNS, SQS, and CloudFront.
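A common way the listed services fit together is an SQS-triggered Lambda that launches a Step Functions execution per message. A minimal sketch under that assumption; the state machine ARN is a placeholder.

```python
import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine ARN, for illustration only.
STATE_MACHINE_ARN = (
    "arn:aws:states:ap-south-1:123456789012:stateMachine:order-pipeline"
)

def handler(event, context):
    """SQS-triggered Lambda: start one Step Functions execution per message."""
    for record in event["Records"]:
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=record["body"],  # assumes the SQS message body is a JSON string
        )
    return {"started": len(event["Records"])}
```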
Posted 2 weeks ago
5.0 - 8.0 years
0 - 1 Lacs
Thane
Work from Office
As a Senior Python Developer, you will be responsible for designing, developing, and maintaining efficient and reliable Python applications. You will work collaboratively with cross-functional teams to deliver high-quality software solutions, ensure code quality, and provide technical guidance to other team members.
Responsibilities
- Design and develop robust, scalable Python applications
- Collaborate with cross-functional teams to define and implement software solutions
- Mentor and guide junior developers to ensure adherence to coding standards
- Participate in code reviews to maintain a high-quality codebase
- Identify and resolve performance and scalability issues
- Contribute to continuous improvement of development processes and practices
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5 to 8 years of professional Python development experience
- Proven track record of delivering high-quality software solutions
- Strong understanding of software architecture and design principles
- Experience with version control systems such as Git
- Excellent problem-solving and analytical skills
- Strong communication and teamwork abilities
- Knowledge of AWS is an added advantage
Skills: Python, Django, Flask, RESTful APIs, SQL and NoSQL databases, Git, Docker, AWS, test-driven development (TDD), CI/CD pipelines
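Since the skills list names Flask and RESTful APIs, here is a minimal Flask REST sketch of the style of application this role covers. The in-memory dict stands in for the SQL/NoSQL persistence the posting mentions, for illustration only.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for SQL/NoSQL persistence, for illustration only.
TASKS = {}

@app.post("/tasks")
def create_task():
    """Create a task from the JSON body and return it with its new id."""
    data = request.get_json()
    task_id = len(TASKS) + 1
    TASKS[task_id] = {"id": task_id, "title": data["title"], "done": False}
    return jsonify(TASKS[task_id]), 201

@app.get("/tasks/<int:task_id>")
def get_task(task_id):
    """Fetch a single task or return 404 if it does not exist."""
    task = TASKS.get(task_id)
    if task is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(task)

if __name__ == "__main__":
    app.run(debug=True)
```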
Posted 2 weeks ago
10.0 - 13.0 years
25 - 37 Lacs
Gurugram
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
REQUIREMENTS:
- Total experience 9+ years.
- Hands-on experience in Big Data engineering.
- Strong expertise in Apache Spark and PySpark/Python.
- Deep technical knowledge of AWS Glue (Crawler, Data Catalog).
- Hands-on working experience in Python.
- Strong working experience with AWS services, including S3, Lambda, SNS, Secrets Manager, and Athena.
- Proven experience with Infrastructure as Code using CloudFormation and Terraform.
- Solid experience in Snowflake.
- Proficiency in setting up and maintaining CI/CD pipelines with GitHub Actions.
- Familiarity with tools like Jira and GitHub.
- Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.
RESPONSIBILITIES:
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements
- Mapping decisions to requirements and translating the same to developers
- Identifying different solutions and narrowing down the best option that meets the client's requirements
- Defining guidelines and benchmarks for NFR considerations during project implementation
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it
- Understanding and relating technology integration scenarios and applying these learnings in projects
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken
- Carrying out POCs to make sure that the suggested design/technologies meet the requirements
Posted 2 weeks ago
3.0 - 6.0 years
2 - 6 Lacs
Chennai
Work from Office
Skills: AWS Lambda, Glue, Kafka/Kinesis, RDBMS (Oracle, MySQL, Redshift, PostgreSQL, Snowflake), API Gateway, CloudFormation / Terraform, Step Functions, CloudWatch, Python, PySpark
Job role & responsibilities:
Looking for a Software Engineer / Senior Software Engineer with hands-on experience in ETL projects and extensive knowledge in building data processing systems with Python, PySpark, and cloud technologies (AWS). Experience in development in the AWS cloud (S3, Redshift, Aurora, Glue, Lambda, Hive, Kinesis, Spark, Hadoop/EMR).
Required skills: Amazon Kinesis, Amazon Aurora, data warehousing, SQL, AWS Lambda, Spark, AWS QuickSight; advanced Python skills; data engineering, ETL and ELT skills; experience of cloud platforms (AWS, GCP, or Azure).
Mandatory skills: data warehouse, ETL, SQL, Python, AWS Lambda, Glue, AWS Redshift.
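As a concrete example of the ETL work this role describes, here is a minimal PySpark batch job: read raw JSON from S3, aggregate, and write partitioned Parquet that a Redshift COPY or Spectrum table could consume. All S3 paths and column names are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Hypothetical S3 locations and schema, for illustration only.
orders = spark.read.json("s3://example-raw/orders/")

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Partitioned Parquet output that downstream warehouse loads can pick up.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated/daily_revenue/"
)
```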
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.
Your Role
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication and authorization.
- Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, Azure Functions.
- Experience in using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc.
Your Profile
- Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs performance and scaling.
- Able to contribute to making architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
- Must understand networking, security, design principles and best practices in cloud.
What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as generative AI.
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 2 weeks ago
6.0 - 10.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
This is an AWS Data/API Gateway Pipeline Engineer role, responsible for designing, building, and maintaining real-time, serverless data pipelines and API services. It requires extensive hands-on experience with Java, Python, Redis, DynamoDB Streams, and PostgreSQL, along with working knowledge of AWS Lambda and AWS Glue for data processing and orchestration. The position involves collaboration with architects, backend developers, and DevOps engineers to deliver scalable, event-driven data solutions and secure API services across cloud-native systems.
Key Responsibilities
API & Backend Engineering
- Build and deploy RESTful APIs using AWS API Gateway, Lambda, Java, and Python.
- Integrate backend APIs with Redis for low-latency caching and pub/sub messaging.
- Use PostgreSQL for structured data storage and transactional processing.
- Secure APIs using IAM, OAuth2, and JWT, and implement throttling and versioning strategies.
Data Pipeline & Streaming (see the sketch after this listing)
- Design and develop event-driven data pipelines using DynamoDB Streams to trigger downstream processing.
- Use AWS Glue to orchestrate ETL jobs for batch and semi-structured data workflows.
- Build and maintain Lambda functions to process real-time events and orchestrate data flows.
- Ensure data consistency and resilience across services, queues, and databases.
Cloud Infrastructure & DevOps
- Deploy and manage cloud infrastructure using CloudFormation, Terraform, or AWS CDK.
- Monitor system health and service metrics using CloudWatch, SNS and structured logging.
- Contribute to CI/CD pipeline development for testing and deploying Lambda/API services.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Skills and Experience
- Bachelor's degree in computer science, engineering, or a related field.
- Over 6 years of experience in developing backend or data pipeline services using Java and Python.
- Strong hands-on experience with: AWS API Gateway, Lambda, DynamoDB Streams; Redis (caching, messaging); PostgreSQL (schema design, tuning, SQL); AWS Glue for ETL jobs and data transformation.
- Solid understanding of REST API design principles, serverless computing, and real-time architecture.
Preferred Skills and Experience
- Familiarity with Kafka, Kinesis, or other message streaming systems
- Swagger/OpenAPI for API documentation
- Docker and Kubernetes (EKS)
- Git, CI/CD tools (e.g., GitHub Actions)
- Experience with asynchronous event processing, retries, and dead-letter queues (DLQs)
- Exposure to data lake architectures (S3, Glue Data Catalog, Athena)
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
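A minimal sketch of the event-driven pattern this listing centers on: a Lambda triggered by DynamoDB Streams that invalidates (and optionally re-warms) a Redis cache entry per changed item. The key name "pk", the "payload" attribute, the Redis endpoint, and the 300-second TTL are illustrative assumptions; NewImage is only present when the stream view type includes new images.

```python
import os

import redis  # redis-py, bundled with the deployment package

# Hypothetical ElastiCache endpoint, for illustration only.
cache = redis.Redis(host=os.environ["REDIS_HOST"], port=6379, decode_responses=True)

def handler(event, context):
    """DynamoDB Streams handler: invalidate the cache entry for each changed item."""
    for record in event["Records"]:
        keys = record["dynamodb"]["Keys"]
        item_id = keys["pk"]["S"]           # assumes a string partition key named "pk"
        cache.delete(f"item:{item_id}")
        if record["eventName"] in ("INSERT", "MODIFY"):
            # Optionally re-warm the cache from the new image (requires a stream
            # view type that includes NewImage).
            new_image = record["dynamodb"].get("NewImage", {})
            if "payload" in new_image:
                cache.setex(f"item:{item_id}", 300, new_image["payload"]["S"])
    return {"processed": len(event["Records"])}
```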
Posted 2 weeks ago
4.0 - 9.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Skill: AWS / Amazon Connect Developer / Lead
Experience: 3 years - 10 years
Location: PAN India
Job Description:
- Strong experience in contact center development
- Experience in creating Amazon Connect (AC) flows, Lex chatbots, and Lambda functions
- Java / Node.js architect with knowledge of the AWS environment; design and develop APIs (REST and SOAP services)
- Knowledge of AWS Lambda services and familiarity with the AWS environment and ecosystem
- Knowledge of Spring, Maven, Hibernate
- Knowledge of database technologies like MySQL, SQL Server, DB2, or RDS
- Application development experience in any of Java, C#, Node.js, Python, PHP
Posted 2 weeks ago
4.0 - 9.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Skill: Amazon Connect Developer / Lead
Location: PAN India
Job Description:
1. Minimum experience 3-9 years
2. Strong experience in contact center development
3. Experience in creating Amazon Connect (AC) flows, Lex chatbots, and Lambda functions
4. Java / Node.js architect with knowledge of the AWS environment; design and develop APIs (REST and SOAP services)
5. Knowledge of AWS Lambda services and familiarity with the AWS environment and ecosystem
6. Knowledge of Spring, Maven, Hibernate
7. Knowledge of database technologies like MySQL, SQL Server, DB2, or RDS
8. Application development experience in any of Java, C#, Node.js, Python, PHP
Posted 2 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Roles and Responsibilities:
- Experience in AWS Glue
- Experience with one or more of the following: Spark, Scala, Python, and/or R
- Experience in API development with NodeJS
- Experience with AWS (S3, EC2) or another cloud provider
- Experience in data virtualization tools like Dremio and Athena is a plus (see the Athena sketch after this listing)
- Should be technically proficient in Big Data concepts
- Should be technically proficient in Hadoop and NoSQL (MongoDB)
- Good communication and documentation skills
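For the Athena item above, here is a minimal boto3 sketch: start a query, poll for completion, and print the result rows. The database, table, and results bucket are placeholders, and production code would add backoff, timeouts, and pagination.

```python
import time

import boto3

athena = boto3.client("athena")

# Hypothetical database, table, and result bucket, for illustration only.
query_id = athena.start_query_execution(
    QueryString="SELECT region, count(*) AS n FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query finishes (real code would use backoff and a timeout).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows[1:]:  # the first row is the column header
        print([col.get("VarCharValue") for col in row["Data"]])
```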
Posted 2 weeks ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Experience in modernizing applications to container-based platforms using EKS, ECS, Fargate. Proven experience in using DevOps tools during modernization. Solid experience with NoSQL databases. Should have used an orchestration engine like Kubernetes or Mesos. Java 8, Spring Boot, SQL, Postgres DB and AWS.
Secondary skills: React, Redux, JavaScript.
Experience-level knowledge of AWS deployment services (AWS Elastic Beanstalk, AWS tools & SDK, AWS Cloud9, AWS CodeStar, AWS Command Line Interface, etc.) and hands-on experience with AWS ECS, AWS ECR, AWS EKS, AWS Fargate, AWS Lambda functions, ElastiCache, S3 objects, API Gateway, AWS CloudWatch and AWS SNS.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Creative problem-solving skills and superb communication skills.
- Should have worked on at least 3 engagements modernizing client applications to container-based solutions.
- Should be expert in any of the programming languages like Java, .NET, Node.js, Python, Ruby, Angular.js
Preferred technical and professional experience
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the Development/Build/Deploy/Test life cycle
Posted 2 weeks ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Experience in modernizing applications to container-based platforms using EKS, ECS, Fargate. Proven experience in using DevOps tools during modernization. Solid experience with NoSQL databases. Should have used an orchestration engine like Kubernetes or Mesos. Java 8, Spring Boot, SQL, Postgres DB and AWS.
Secondary skills: React, Redux, JavaScript.
Experience-level knowledge of AWS deployment services (AWS Elastic Beanstalk, AWS tools & SDK, AWS Cloud9, AWS CodeStar, AWS Command Line Interface, etc.) and hands-on experience with AWS ECS, AWS ECR, AWS EKS, AWS Fargate, AWS Lambda functions, ElastiCache, S3 objects, API Gateway, AWS CloudWatch and AWS SNS.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Creative problem-solving skills and superb communication skills.
- Should have worked on at least 3 engagements modernizing client applications to container-based solutions.
- Should be expert in any of the programming languages like Java, .NET, Node.js, Python, Ruby, Angular.js
Preferred technical and professional experience
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the Development/Build/Deploy/Test life cycle
Posted 2 weeks ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Creative problem-solving skills and superb communication skills.
- Should have worked on at least 3 engagements modernizing client applications to container-based solutions.
- Should be expert in any of the programming languages like Java, .NET, …
Preferred technical and professional experience
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the Development/Build/Deploy/Test life cycle
Posted 2 weeks ago
4.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.
Your primary responsibilities include:
- Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder Collaboration and Issue Resolution: Collaborating with key stakeholders, internal and external, to understand the problems and issues with the product and features, and solving those issues as per defined SLAs.
- Continuous Learning and Technology Integration: Being eager to learn new technologies and implement the same in feature development.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Creative problem-solving skills and superb communication skills.
- Container-based solutions.
- Strong experience with Node.js and the AWS stack: AWS Lambda, AWS API Gateway, AWS CDK, AWS DynamoDB, AWS SQS.
- Experience with infrastructure as code using AWS CDK.
- Expertise in encryption and decryption techniques for securing APIs, and API authentication and authorization.
- Primarily, more experience is required on Lambda and API Gateway.
- Candidates holding the AWS Certified Cloud Practitioner / AWS Certified Developer - Associate certifications will be preferred.
Preferred technical and professional experience
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the Development/Build/Deploy/Test life cycle
Posted 2 weeks ago
5.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
- Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases.
- Process the data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala, and Big Data technologies for the various use cases built on the platform.
- Experience in developing streaming pipelines (see the sketch after this listing).
- Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Total 5-7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark / Python or Scala
- Minimum 3 years of experience on cloud data platforms on AWS
- Exposure to streaming solutions and message brokers like Kafka
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB
- Good to excellent SQL skills
Preferred technical and professional experience
- Certification in AWS, and Databricks or Cloudera Spark certified developers
- AWS S3, Redshift, and EMR for data storage and distributed processing
- AWS Lambda, AWS Step Functions, and AWS Glue to build serverless, event-driven data workflows and orchestrate ETL processes
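For the streaming-pipelines item above, here is a minimal Spark Structured Streaming sketch that reads JSON events from Kafka and appends them to S3 as Parquet. Brokers, topic, schema, and paths are illustrative assumptions, and the job needs the spark-sql-kafka connector package on the classpath.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("clickstream").getOrCreate()

# Hypothetical event schema, for illustration only.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical brokers
    .option("subscribe", "clicks")                      # hypothetical topic
    .load()
    # Kafka values arrive as bytes; cast to string and parse the JSON payload.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-curated/clicks/")
    .option("checkpointLocation", "s3://example-curated/_checkpoints/clicks/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```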
Posted 2 weeks ago