
867 Lambda Expressions Jobs - Page 21

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Req ID: 324162
We are currently seeking a Python Engineer with AWS and Java to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties
Morgan Stanley is seeking a highly skilled Senior Python Developer with over 5 years of experience to join our team in developing a state-of-the-art electronic communications surveillance system. This system will monitor all voice communications, chats, and email messages of employees across the firm, ensuring compliance and security. The ideal candidate will have a proven track record in writing high-performance, low-latency code capable of processing millions of messages daily, with expertise in Python, a solid understanding of data structures and design patterns, and familiarity with Java.

Responsibilities
- Design, develop, and implement a robust surveillance system from the ground up to monitor electronic communications in real time.
- Write high-performance, low-latency Python code to handle large-scale message processing (millions of messages per day).
- Collaborate with cross-functional teams to define system architecture and ensure scalability, reliability, and maintainability.
- Optimize data processing pipelines using Apache Kafka for real-time message streaming.
- Leverage Amazon AWS for cloud-based infrastructure, ensuring secure and efficient deployment.
- Design and maintain database schemas in PostgreSQL for efficient data storage and retrieval.
- Integrate Collibra for data governance and metadata management.
- Utilize Airflow for workflow orchestration and scheduling.
- Implement CI/CD pipelines using Jenkins and manage containerized applications with Docker.
- Use Artifactory for artifact management and dependency tracking.
- Apply advanced knowledge of data structures and design patterns to create clean, modular, and reusable code.
- Contribute to code reviews, testing, and documentation to maintain high-quality standards.
Minimum Skills Required
- Experience: 5+ years of professional software development experience, with a focus on Python.
- Technical Skills:
  - Expertise in writing high-performance, low-latency Python code for large-scale systems.
  - Strong understanding of data structures, algorithms, and design patterns.
  - Familiarity with Java for cross-language integration and support.
  - Hands-on experience with Apache Kafka for real-time data streaming.
  - Proficiency in Amazon AWS services (e.g., EC2, S3, Lambda, RDS).
  - Experience with PostgreSQL for relational database management.
  - Knowledge of Collibra for data governance (preferred).
  - Familiarity with Apache Airflow for workflow orchestration.
  - Experience with Jenkins CI for continuous integration and deployment.
  - Proficiency in Docker for containerization and Artifactory for artifact management.
- Soft Skills:
  - Strong problem-solving skills and attention to detail.
  - Ability to work independently and collaboratively in a fast-paced environment.
  - Excellent communication skills to articulate technical concepts to non-technical stakeholders.
- Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Preferred Qualifications
- Experience in financial services or compliance systems.
- Familiarity with surveillance or monitoring systems for voice, chat, or email communications.
- Knowledge of regulatory requirements in the financial industry.
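The high-throughput message screening this role describes might look, in a deliberately simplified stdlib-only sketch (no Kafka broker; the `WATCHLIST`, `flag_message` rule, and sample messages are illustrative assumptions, not the firm's actual logic), like this:

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative surveillance rule: flag any message containing a watched term.
WATCHLIST = {"insider", "guarantee", "off the record"}

def flag_message(msg: dict) -> bool:
    """Return True if the message text contains a watched term."""
    text = msg["text"].lower()
    return any(term in text for term in WATCHLIST)

def screen_batch(messages: list[dict]) -> list[dict]:
    """Screen a batch of messages in parallel, returning the flagged ones.

    In production the batch would come from a Kafka consumer poll; here
    it is a plain list so the example stays self-contained and runnable.
    """
    with ThreadPoolExecutor(max_workers=4) as pool:
        verdicts = list(pool.map(flag_message, messages))
    return [m for m, hit in zip(messages, verdicts) if hit]

if __name__ == "__main__":
    sample = [
        {"id": 1, "text": "Lunch at noon?"},
        {"id": 2, "text": "Keep this off the record."},
    ]
    print([m["id"] for m in screen_batch(sample)])  # flags message 2 only
```

At real scale the batching, backpressure, and low-latency requirements would dominate the design; this only shows the screen-and-route shape of such a pipeline.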

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Req ID: 306668
We are currently seeking a Cloud Solution Delivery Sr Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Position Overview
We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, and in leading teams and directing engineering workloads. This role requires a deep understanding of data engineering and cloud services, and the ability to implement high-quality solutions.

Key Responsibilities
Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test, and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD, and IaC for both our own tooling and as templates for downstream teams
- Using DBT projects to define re-usable pipelines

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 5+ years of experience in data engineering
- 2+ years of experience in leading a team of data engineers
- Experience in AWS cloud services
- Expertise with Python and SQL
- Experience using Git / GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification

Posted 2 months ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Pune

Work from Office

Req ID: 301930
We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Pune, Maharashtra (IN-MH), India (IN).

Position Overview
We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Key Responsibilities
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Provide technical leadership and mentorship to development teams
- Stay current with emerging technologies and industry trends

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Posted 2 months ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are currently seeking a Lead Data Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Position Overview
We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Key Responsibilities
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, EKS, Kafka, and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud, and AWS
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD, and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent
- Strong experience with Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise with technical interface design
- Use of Docker

Responsibilities
- Design and implement scalable data architectures using AWS services, Confluent, and Kafka
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent, and Kafka
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture

Posted 2 months ago

Apply

5.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Req ID: 306669
We are currently seeking a Lead Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Position Overview
We are seeking a highly skilled and experienced Lead Data/Product Engineer to join our dynamic team. The ideal candidate will have a strong background in streaming services and AWS cloud technology, leading teams and directing engineering workloads. This is an opportunity to work on the core systems supporting multiple secondary teams, so a history in software engineering and interface design would be an advantage.

Key Responsibilities
Lead and direct a small team of engineers engaged in:
- Engineering reusable assets for the later build of data products
- Building foundational integrations with Kafka, Confluent Cloud, and AWS
- Integrating with a large number of upstream and downstream technologies
- Providing best-in-class documentation for downstream teams to develop, test, and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD, and IaC for both our own tooling and as templates for downstream teams

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 5+ years of experience in data engineering
- 3+ years of experience with real-time (or near-real-time) streaming systems
- 2+ years of experience leading a team of data engineers
- A willingness to independently learn a high number of new technologies and to lead a team in learning new technologies
- Experience in AWS cloud services, particularly Lambda, SNS, S3, EKS, and API Gateway
- Strong experience with Python
- Strong experience with Kafka
- Excellent understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts both directly and through documentation
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Experience with CI pipelines
- Ability to code in a JVM language
- Understanding of GDPR and the correct handling of PII
- Knowledge of technical interface design
- Basic use of Docker

Posted 2 months ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Gurugram

Work from Office

About The Role: AWS Cloud Engineer

Required Skills and Qualifications:
- 4-7 years of hands-on experience with AWS services, including EC2, S3, Lambda, ECS, EKS, RDS/DynamoDB, and API Gateway.
- Strong working knowledge of Python and JavaScript.
- Strong experience with Terraform for infrastructure as code.
- Expertise in defining and managing IAM roles, policies, and configurations.
- Experience with networking, security, and monitoring within AWS environments.
- Experience with containerization technologies such as Docker and orchestration tools like Kubernetes (EKS).
- Strong analytical, troubleshooting, and problem-solving skills.
- Experience with AI/ML technologies and services like Textract is preferred.
- AWS certifications (AWS Developer, Machine Learning - Specialty) are a plus.

Performance Parameters and Measures:
1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback.
2. Self-Management: productivity, efficiency, absenteeism, training hours, number of technical trainings completed.
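Since the role calls for defining IAM roles and policies, a small sketch may help: IAM policies are JSON documents with a fixed statement grammar, and generating them programmatically keeps them consistent. The bucket name and the read-only action set below are illustrative assumptions, not a policy from this employer:

```python
import json

def s3_read_policy(bucket: str) -> dict:
    """Build a least-privilege IAM policy document granting read-only
    access to a single S3 bucket. The statement layout follows the
    standard IAM policy grammar (Version, Statement, Effect, Action,
    Resource); the bucket name is a placeholder.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",       # the bucket itself (ListBucket)
                    f"arn:aws:s3:::{bucket}/*",     # objects inside it (GetObject)
                ],
            }
        ],
    }

if __name__ == "__main__":
    # The JSON output can be fed to Terraform's aws_iam_policy or the CLI.
    print(json.dumps(s3_read_policy("example-logs"), indent=2))
```

Generating documents like this from code (or Terraform's `aws_iam_policy_document` data source) avoids the drift that hand-edited policies tend to accumulate.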

Posted 2 months ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Locations: Pune / Bangalore / Hyderabad / Ahmedabad / Indore

Job Responsibilities
1. Design and deploy scalable, highly available, secure, and fault-tolerant systems on AWS for the development and test lifecycle of our cloud security product.
2. Focus on building Dockerized application components by integrating with AWS EKS.
3. Modify existing applications in AWS to improve performance.
4. Passion for solving challenging issues.
5. Promote cooperation and commitment within a team to achieve common goals.
6. Examine data to grasp issues, draw conclusions, and solve problems.

Must-Have Skills
1. Demonstrated competency with the following AWS services: EKS, AppStream, Cognito, CloudWatch, Fargate Cluster, EC2, EBS, S3, Glacier, RDS, VPC, Route53, ELB, IAM, CloudFront, CloudFormation, SQS, SES, Lambda, API Gateway.
2. Knowledge of containerization hosting technologies like Docker and Kubernetes is highly desirable.
3. Experience with ECS and EKS managed node clusters, and Fargate.
4. Proficient knowledge of scripting (Linux/Unix shell scripts, Python, Ruby, etc.).
5. Hands-on experience with configuration management and deployment tools (CloudFormation, Terraform, etc.).
6. Mastery of CI/CD tools (Jenkins, etc.).
7. Experience building CI/CD pipelines and competency in Git; good working exposure with Jenkins and GitLab (GitHub, GitLab, Bitbucket).
8. Experience with DevOps services of cloud vendors (AWS/Azure/GCP, etc.) is necessary.
9. Must be from a development background.
10. Exposure to application and infrastructure monitoring tools (Grafana, Prometheus, Nagios, etc.) is an outstanding skill to have.
11. Excellent soft skills for IT professionals.
12. Familiarity with AWS IAM policies and basic guidelines.
13. Sufficient understanding of AWS network components.

Good to Have
1. Experience integrating SCM, code quality, code coverage, and testing tools for CI/CD pipelines.
2. Developer background; has worked with several code analysis tools and integrations like SonarQube, Fortify, etc.
3. Good understanding of static code analysis.

Posted 2 months ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Mumbai, Pune

Work from Office

- Design containerized and cloud-native microservices architecture
- Plan and deploy modern application platforms and cloud-native platforms
- Good understanding of Agile process and methodology
- Plan and implement solutions and best practices for process automation, security, alerting and monitoring, and availability
- Good understanding of infrastructure-as-code deployments
- Plan and design CI/CD pipelines across multiple environments
- Support and work alongside a cross-functional engineering team on the latest technologies
- Iterate on best practices to increase the quality and velocity of deployments
- Sustain and improve the process of knowledge sharing throughout the engineering team
- Keep updated on modern technologies and trends, and advocate their benefits
- Good team management skills
- Ability to drive goals/milestones while valuing and maintaining a strong attention to detail
- Excellent judgement, analytical, and problem-solving skills
- Excellent communication skills
- Experience maintaining and deploying highly available, fault-tolerant systems at scale
- Practical experience with containerization and clustering (Kubernetes/OpenShift/Rancher/Tanzu/GKE/AKS/EKS, etc.)
- Version control system experience (e.g., Git, SVN)
- Experience implementing CI/CD (e.g., Jenkins, TravisCI)
- Experience with configuration management tools (e.g., Ansible, Chef)
- Experience with infrastructure-as-code (e.g., Terraform, CloudFormation)
- Expertise with AWS (e.g., IAM, EC2, VPC, ELB, ALB, Autoscaling, Lambda)
- Container registry solutions (Harbor, JFrog, Quay, etc.)
- Operational experience (e.g., HA/backups)
- NoSQL experience (e.g., Cassandra, MongoDB, Redis)
- Good understanding of Kubernetes networking and security best practices
- Monitoring tools like Datadog, or open-source tools like Prometheus and Nagios
- Load balancer knowledge (AVI Networks, NGINX)

Location: Pune / Mumbai [Work from Office]

Posted 2 months ago

Apply

3.0 - 6.0 years

2 - 6 Lacs

Chennai

Work from Office

Skills: AWS Lambda, Glue, Kafka/Kinesis, RDBMS (Oracle, MySQL, Redshift, PostgreSQL, Snowflake), Gateway, CloudFormation/Terraform, Step Functions, CloudWatch, Python, PySpark

Job role & responsibilities: Looking for a Software Engineer / Senior Software Engineer with hands-on experience in ETL projects and extensive knowledge of building data processing systems with Python, PySpark, and cloud technologies (AWS). Experience in development on AWS Cloud (S3, Redshift, Aurora, Glue, Lambda, Hive, Kinesis, Spark, Hadoop/EMR).

Required Skills: Amazon Kinesis, Amazon Aurora, data warehouse, SQL, AWS Lambda, Spark, AWS QuickSight; advanced Python skills; data engineering, ETL and ELT skills; experience with cloud platforms (AWS, GCP, or Azure).

Mandatory skills: data warehouse, ETL, SQL, Python, AWS Lambda, Glue, AWS Redshift.
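The extract-transform-load pattern this role centers on can be sketched with stdlib components only (sqlite3 standing in for Redshift, an in-memory CSV standing in for an S3 object; the column names and the >100 filter are illustrative assumptions, not part of the posting):

```python
import csv
import io
import sqlite3

# Illustrative stand-in for a Glue/PySpark job: the same three ETL steps
# against stdlib pieces so the sketch runs anywhere.
RAW_CSV = "order_id,amount\n1,250.0\n2,99.5\n3,410.0\n"

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and keep only orders above 100."""
    return [(int(r["order_id"]), float(r["amount"]))
            for r in rows if float(r["amount"]) > 100]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: insert transformed rows and return the table's row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(load(transform(extract(RAW_CSV)), conn))  # 2 rows pass the filter
```

In a real AWS pipeline the same three stages map onto an S3 read, a Glue/PySpark transform, and a Redshift COPY; only the scale and the connectors change.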

Posted 2 months ago

Apply

4.0 - 9.0 years

3 - 7 Lacs

Coimbatore

Work from Office

Skills (AWS, mandatory): Compute, Networking, Security, EC2, S3, IAM, VPC, Lambda, RDS, ECS, EKS, CloudWatch, load balancers, Autoscaling, CloudFront, Route53, security groups, DynamoDB, CloudTrail, REST APIs, FastAPI, Node.js. Azure (overview) and GCP (overview) are optional.

Programming/IaC Skills: Python (mandatory), Chef (mandatory), Ansible (mandatory), Terraform (mandatory), Go (optional), Java (optional).

The candidate should have more than 4 years of cloud development experience and should currently be working in cloud development.

Posted 2 months ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.

This is an AWS Data/API Gateway Pipeline Engineer role, responsible for designing, building, and maintaining real-time, serverless data pipelines and API services. It requires extensive hands-on experience with Java, Python, Redis, DynamoDB Streams, and PostgreSQL, along with working knowledge of AWS Lambda and AWS Glue for data processing and orchestration. The position involves collaboration with architects, backend developers, and DevOps engineers to deliver scalable, event-driven data solutions and secure API services across cloud-native systems.

Key Responsibilities

API & Backend Engineering
- Build and deploy RESTful APIs using AWS API Gateway, Lambda, Java, and Python.
- Integrate backend APIs with Redis for low-latency caching and pub/sub messaging.
- Use PostgreSQL for structured data storage and transactional processing.
- Secure APIs using IAM, OAuth2, and JWT, and implement throttling and versioning strategies.

Data Pipeline & Streaming
- Design and develop event-driven data pipelines using DynamoDB Streams to trigger downstream processing.
- Use AWS Glue to orchestrate ETL jobs for batch and semi-structured data workflows.
- Build and maintain Lambda functions to process real-time events and orchestrate data flows.
- Ensure data consistency and resilience across services, queues, and databases.
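The DynamoDB Streams to Lambda pattern in the responsibilities above can be sketched as a minimal handler. The record shape follows the DynamoDB Streams event format; the `process_insert` routing and the `pk` attribute are illustrative assumptions:

```python
def process_insert(new_image: dict) -> str:
    """Illustrative downstream step for a newly inserted item."""
    return f"indexed order {new_image['pk']['S']}"

def handler(event: dict, context=None) -> dict:
    """AWS Lambda entry point for a DynamoDB Streams trigger.

    Each record carries an eventName (INSERT/MODIFY/REMOVE) and, for
    inserts, the new item image under dynamodb.NewImage. Here only
    inserts are routed downstream; other events are ignored.
    """
    results = []
    for record in event.get("Records", []):
        if record["eventName"] == "INSERT":
            results.append(process_insert(record["dynamodb"]["NewImage"]))
    return {"processed": len(results), "results": results}

if __name__ == "__main__":
    # Invoke locally with a hand-built event; no AWS account needed.
    sample_event = {
        "Records": [
            {"eventName": "INSERT",
             "dynamodb": {"NewImage": {"pk": {"S": "order-42"}}}},
            {"eventName": "REMOVE", "dynamodb": {}},
        ]
    }
    print(handler(sample_event))
```

Because the handler is a plain function of an event dict, it can be unit-tested without deploying, which is how resilience requirements like idempotent reprocessing are usually verified.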
Cloud Infrastructure & DevOps
- Deploy and manage cloud infrastructure using CloudFormation, Terraform, or AWS CDK.
- Monitor system health and service metrics using CloudWatch, SNS, and structured logging.
- Contribute to CI/CD pipeline development for testing and deploying Lambda/API services.

So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset, keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Over 6 years of experience developing backend or data pipeline services using Java and Python.
- Strong hands-on experience with:
  - AWS API Gateway, Lambda, DynamoDB Streams
  - Redis (caching, messaging)
  - PostgreSQL (schema design, tuning, SQL)
  - AWS Glue for ETL jobs and data transformation
- Solid understanding of REST API design principles, serverless computing, and real-time architecture.
Preferred Skills and Experience
- Familiarity with Kafka, Kinesis, or other message streaming systems
- Swagger/OpenAPI for API documentation
- Docker and Kubernetes (EKS)
- Git and CI/CD tools (e.g., GitHub Actions)
- Experience with asynchronous event processing, retries, and dead-letter queues (DLQs)
- Exposure to data lake architectures (S3, Glue Data Catalog, Athena)

Being You
Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 2 months ago

Apply

4.0 - 9.0 years

35 - 50 Lacs

Bengaluru

Work from Office

Skill: Amazon Connect Developer / Lead
Location: PAN India

Job Description:
1. Minimum experience 3-9 years.
2. Strong experience in contact center development.
3. Experience in creating Amazon Connect (AC) flows, Lex chatbots, and Lambda functions.
4. Java / Node.js architect with knowledge of the AWS environment; design and develop APIs (REST and SOAP services).
5. Knowledge of AWS Lambda services and familiarity with the AWS environment and ecosystem.
6. Knowledge of Spring, Maven, Hibernate.
7. Knowledge of database technologies like MySQL, SQL Server, DB2, or RDS.
8. Application development experience in any of Java, C#, Node.js, Python, PHP.

Posted 2 months ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Business Analysis
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Bachelor of Engineering in Electronics or any related stream

Summary: Working closely with stakeholders across departments, the Business Analyst gathers and documents requirements, conducts data analysis, and supports project implementation to ensure alignment with business objectives.

Roles & Responsibilities:
1. Collaborate with stakeholders to gather, document, and validate business and technical requirements related to AWS cloud-based systems.
2. Analyze current infrastructure, applications, and workflows to identify opportunities for migration, optimization, and cost-efficiency on AWS.
3. Assist in creating business cases for cloud adoption or enhancements, including ROI and TCO analysis.
4. Support cloud transformation initiatives by developing detailed functional specifications and user stories.
5. Liaise with cloud architects, DevOps engineers, and developers to ensure solutions are aligned with requirements and business goals.
6. Conduct gap analyses, risk assessments, and impact evaluations for proposed AWS solutions.
7. Prepare reports, dashboards, and presentations to communicate findings and recommendations to stakeholders.
8. Ensure compliance with AWS best practices and relevant security, governance, and regulatory requirements.

Professional & Technical Skills:
1. Proven experience (3+ years) as a Business Analyst, preferably in cloud computing environments.
2. Solid understanding of AWS services (EC2, S3, RDS, Lambda, IAM, etc.) and cloud architecture.
3. Familiarity with Agile and DevOps methodologies.
4. Strong analytical, problem-solving, and documentation skills.
5. Excellent communication and stakeholder management abilities.
6. AWS certification (e.g., AWS Certified Cloud Practitioner or Solutions Architect Associate) is a plus.
7. Well-developed analytical skills; rigorous but pragmatic, able to justify decisions with solid rationale.

Additional Information:
- The candidate should have a minimum of 3 years of experience as a Business Analyst.
- This position is based at our Hyderabad office.
- A Bachelor of Engineering in Electronics or any related stream is required.

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Drupal
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions with team members to brainstorm innovative solutions and ensure that the applications align with business objectives. Your role will also include reviewing design documents and providing feedback to enhance application performance and user experience, all while maintaining a focus on quality and efficiency.

You must have knowledge of: Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Stencil.js, Vue.js; React; Python; Auth0, Terraform; Azure, Azure-ChatGPT; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare, Cloudflare Workers; REST API; GitHub; web servers; SQL.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Drupal.
- Strong understanding of web development principles and best practices.
- Experience with content management systems and their implementation.
- Familiarity with front-end technologies such as HTML, CSS, and JavaScript.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role : Application Designer Project Role Description : Assist in defining requirements and designing applications to meet business process and application requirements. Must have skills : Drupal Good to have skills : NA. Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will also engage in discussions with team members to ensure that the design aligns with business objectives and technical feasibility, while continuously iterating on your designs based on feedback and testing outcomes. Your role will be pivotal in ensuring that the applications developed are user-friendly, efficient, and meet the highest standards of quality. You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Vue.js; React; Python; GenAI basics, AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare; REST API; GitHub; web servers; SQL. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate workshops and brainstorming sessions to foster innovative solutions. - Mentor junior team members to enhance their skills and knowledge.
Professional & Technical Skills: - Must-Have Skills: Proficiency in Drupal. - Strong understanding of application design principles and methodologies. - Experience with front-end technologies such as HTML, CSS, and JavaScript. - Familiarity with database management systems and data modeling. - Ability to create and maintain technical documentation. Additional Information: - The candidate should have a minimum of 12 years of experience in Drupal. - This position is based at our Mumbai office. - A 15 years full time education is required. Qualification: 15 years full time education

Posted 2 months ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Amazon Web Services (AWS) Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, ensuring that the applications meet the required standards and specifications while fostering a collaborative environment for your team members. Roles & Responsibilities: - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge. - Continuously assess and improve application performance and user experience. - A resource with six years of experience and expertise in AWS is expected to take on a variety of responsibilities that leverage their technical skills and industry knowledge.
- Cloud Architecture Design: Developing and implementing scalable and secure cloud architectures tailored to meet business needs. - Deployment and Management: Overseeing the deployment of applications and services on AWS, ensuring optimal performance and reliability. - Cost Optimization: Analyzing cloud usage and implementing strategies to optimize costs while maintaining service quality. - Security Compliance: Ensuring that all AWS services comply with security best practices and organizational policies. - Collaboration and Mentorship: Working closely with cross-functional teams and mentoring junior staff to enhance their AWS skills and knowledge. - Troubleshooting and Support: Providing technical support and troubleshooting for AWS-related issues, ensuring minimal downtime and disruption. - Continuous Learning: Staying updated with the latest AWS features and industry trends to continuously improve cloud solutions and practices. - This role is pivotal in driving cloud initiatives and ensuring that the organization maximizes its investment in AWS technologies.
Professional & Technical Skills: - Must-Have Skills: Proficiency in Amazon Web Services (AWS). - Strong understanding of cloud architecture and deployment strategies. - Experience with application lifecycle management and DevOps practices. - Familiarity with containerization technologies such as Docker and Kubernetes. - Ability to troubleshoot and resolve application issues efficiently. - A professional with six years of AWS experience possesses a robust skill set covering cloud architecture design and the deployment and management of scalable applications. - They demonstrate proficiency in core AWS services such as EC2, S3, RDS, and Lambda, enabling them to design, deploy, and manage scalable applications while optimizing for performance and cost-efficiency. - Their experience includes implementing security best practices, optimizing costs, ensuring high availability of services, and automating processes with tools such as CloudFormation and Terraform, as well as monitoring and logging with CloudWatch. - They collaborate effectively with cross-functional teams to drive project success and are adept at troubleshooting and resolving issues, keeping cloud-based systems highly available and reliable.
Additional Information: - The candidate should have a minimum of 5 years of experience in Amazon Web Services (AWS). - This position is based at our Hyderabad office. - A 15 years full time education is required. Qualification: 15 years full time education
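The responsibilities above call out CloudFormation for environment automation. A minimal sketch of that infrastructure-as-code approach follows; the resource name, retention window, and bucket configuration are illustrative, not taken from the listing:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal sketch - an encrypted S3 bucket with a lifecycle rule for cost control
Resources:
  DataBucket:                    # logical ID is illustrative
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:          # security best practice from the role description
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: AES256
      LifecycleConfiguration:    # cost-optimization lever mentioned above
        Rules:
          - Id: ArchiveOldObjects
            Status: Enabled
            Transitions:
              - StorageClass: GLACIER
                TransitionInDays: 90
```

Deployed with `aws cloudformation deploy`, a template like this keeps environment changes reviewable and repeatable, which is the point of the automation responsibilities listed.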

Posted 2 months ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Comprehensive Feature Development and Issue Resolution: Working on end-to-end feature development and solving challenges faced in the implementation. Stakeholder Collaboration and Issue Resolution: Collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolve them per the defined SLAs. Continuous Learning and Technology Integration: Being eager to learn new technologies and apply them in feature development. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Creative problem-solving skills and superb communication skills. Container-based solutions. Strong experience with Node.js and the AWS stack - AWS Lambda, AWS API Gateway, AWS CDK, AWS DynamoDB, AWS SQS. Experience with infrastructure as code using AWS CDK. Expertise in encryption and decryption techniques for securing APIs, and in API authentication and authorization. Experience with Lambda and API Gateway is the primary requirement. Candidates holding the AWS Certified Cloud Practitioner / AWS Certified Developer Associate certifications will be preferred. Preferred technical and professional experience: Experience in distributed/scalable systems. Knowledge of standard tools for optimizing and testing code. Knowledge/experience of the development/build/deploy/test life cycle.
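This role builds on Node.js, but the Lambda-behind-API-Gateway contract it emphasizes is language-agnostic. A minimal sketch of that handler shape, shown here in Python (the handler name and payload fields are illustrative assumptions):

```python
import json

def handler(event, context):
    """Minimal API Gateway (proxy integration) Lambda handler sketch.

    Under the REST proxy contract, the HTTP request body arrives as a
    JSON string in event["body"], and the handler returns a dict with
    statusCode, headers, and a string body.
    """
    try:
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        # Malformed input maps to a 400 rather than an unhandled error.
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    name = payload.get("name", "world")  # illustrative field
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Because the handler is a plain function, it can be exercised locally with a fake event dict before being wired to API Gateway (e.g., via AWS CDK).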

Posted 2 months ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark - Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (much like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
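The listing mentions a custom Python framework for generating rules, "just like a rules engine". In PySpark such predicates would be applied across DataFrames, but the core pattern can be sketched in plain Python with lambda expressions (rule names and record fields are illustrative):

```python
# Each rule pairs a label with a predicate (a lambda expression);
# records are evaluated against every rule in the table.
rules = [
    ("high_value", lambda rec: rec["amount"] > 10_000),
    ("missing_region", lambda rec: not rec.get("region")),
]

def apply_rules(record, rules):
    """Return the labels of all rules the record triggers."""
    return [label for label, predicate in rules if predicate(record)]

records = [
    {"id": 1, "amount": 25_000, "region": "APAC"},
    {"id": 2, "amount": 500, "region": ""},
]
flagged = {rec["id"]: apply_rules(rec, rules) for rec in records}
# flagged == {1: ["high_value"], 2: ["missing_region"]}
```

Keeping rules as data rather than hard-coded branches is what makes this "engine-like": new rules can be added, generated, or loaded from configuration without touching the evaluation loop.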

Posted 2 months ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the clients' needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark - Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (much like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.

Posted 2 months ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Roles & Responsibilities: 3+ years of working experience in data engineering. 'Hands-on keyboard' AWS implementation experience across a broad range of AWS services. Must have in-depth AWS development experience (containerization - Docker, Amazon EKS, Lambda, EC2, S3, Amazon DocumentDB, PostgreSQL). Strong knowledge of DevOps and CI/CD pipelines (GitHub, Jenkins, Artifactory). Scripting capability and the ability to develop AWS environments as code. Hands-on AWS experience with at least 1 implementation (preferably in an enterprise-scale environment). Experience with core AWS platform architecture, including areas such as Organizations, account design, VPC, subnet and segmentation strategies. Backup and disaster recovery approach and design. Environment and application automation. CloudFormation and third-party automation approach/strategy. Network connectivity, Direct Connect and VPN. AWS cost management and optimization. Skilled experience with Python libraries (NumPy, Pandas DataFrames).
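The Python-library requirement above (NumPy, Pandas DataFrames) boils down to loading records into a DataFrame and aggregating them. A minimal sketch, with illustrative column names and values:

```python
import pandas as pd

# Toy cost-report data; in a real pipeline this would come from S3,
# a database export, or an API rather than a literal dict.
df = pd.DataFrame({
    "service": ["EC2", "EC2", "S3", "Lambda"],
    "monthly_cost": [120.0, 80.0, 15.0, 5.0],
})

# Aggregate spend per service - the bread-and-butter groupby pattern.
cost_by_service = df.groupby("service")["monthly_cost"].sum()
# EC2 -> 200.0, Lambda -> 5.0, S3 -> 15.0
```

The same groupby/aggregate shape carries over to larger engines (Athena SQL, Spark), which is why interviews for roles like this often probe it.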

Posted 2 months ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Chennai

Work from Office

Full Stack Engineer - React.js, Node.js, Next, Nest, Express. At least 6 years of relevant experience and 8-10 years in total for a Full Stack Engineer (React/Node.js). Technical Skills Required: Cloud Platform: Amazon Web Services (AWS) - S3, IAM roles and policies, Lambda, API Gateway, Cognito user pools, CloudWatch. Programming Languages: React.js, Node.js. Databases: PostgreSQL, MongoDB, AWS DynamoDB. Scripting Languages: JavaScript, TypeScript, HTML, XML. Application Servers: Tomcat 6.0/7.0, Nginx 1.23.2. Frameworks: Next.js, Nest.js, Express.js. Version Control Systems: GitLab.

Posted 2 months ago

Apply

8.0 - 10.0 years

12 - 16 Lacs

Noida

Work from Office

We are seeking an experienced Lead Database Administrator (DBA) with a strong background in Oracle, MySQL, and AWS to join our growing team. In this role, you will be responsible for overseeing the management, performance, and security of our database environments, ensuring high availability and optimal performance. You will lead a team of DBAs and work collaboratively with various departments to support database needs across the organization. Key Responsibilities: Database Administration: Oversee and manage Oracle, MySQL, and cloud-based databases (AWS RDS, Aurora, etc.) in a production environment. Ensure high availability, performance tuning, backup/recovery, and security of all databases. Perform regular health checks, performance assessments, and troubleshooting for all database platforms. Implement database changes, patches, and upgrades in a controlled manner, ensuring minimal downtime. Cloud Infrastructure Management: Design, implement, and manage database systems on AWS, including AWS RDS, Aurora, and EC2-based database instances. Collaborate with cloud engineers to optimize database services and architecture for cost, performance, and scalability. Team Leadership: Lead and mentor a team of DBAs, providing guidance on database best practices and technical challenges. Manage and prioritize database-related tasks and projects to ensure timely completion. Develop and enforce database standards, policies, and procedures. Database Optimization: Monitor database performance and optimize queries, indexes, and database structures to ensure efficient operations. Tune databases to ensure high availability and fast query response times. Security and Compliance: Implement and maintain robust database security practices, including access controls, encryption, and audit logging. Ensure databases comply with internal and external security standards, regulations, and policies.
Disaster Recovery & Backup: Design and maintain disaster recovery plans, ensuring business continuity through regular testing and validation of backup and recovery processes. Automate database backup processes and ensure backups are performed regularly and correctly. Collaboration & Support: Work closely with development teams to provide database support for application development, data modeling, and schema design. Provide 24/7 on-call support for critical database issues or emergencies. Required Skills & Qualifications: Technical Expertise: Extensive experience in Oracle and MySQL database administration (version 11g and higher for Oracle, 5.x and higher for MySQL). Strong understanding of AWS cloud services related to database management, particularly AWS RDS, Aurora, EC2, and Lambda. Experience in database performance tuning, query optimization, and indexing. Proficient in backup and recovery strategies, including RMAN for Oracle and MySQL backup techniques. Solid understanding of database replication, clustering, and high-availability technologies. Leadership & Management: Proven experience leading and mentoring teams of DBAs. Strong project management skills, with the ability to manage multiple database-related projects simultaneously. Excellent problem-solving and analytical skills. Security: Knowledge of database security best practices, including encryption, auditing, and access control. Experience implementing compliance frameworks such as PCI DSS, GDPR, or HIPAA for database systems. Additional Skills: Strong scripting skills (e.g., Shell, Python, Bash) for automation and database maintenance tasks. Experience with database monitoring tools (e.g., Oracle Enterprise Manager, MySQL Workbench, CloudWatch). Familiarity with containerization technologies (Docker, Kubernetes) and CI/CD pipelines for database deployments is a plus. Education & Certifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
Oracle Certified Professional (OCP) and MySQL certifications preferred. AWS Certified Database - Specialty or similar AWS certification is a plus. Preferred Skills: Familiarity with other database technologies (SQL Server, PostgreSQL, NoSQL). Experience with DevOps practices and tools for database automation and infrastructure-as-code (e.g., Terraform, CloudFormation).
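The scripting requirement above (Shell/Python/Bash for automation and maintenance) often means things like backup-retention logic. A minimal sketch of that logic in stdlib Python; the retention window and dates are illustrative, and in production the result would drive, e.g., RDS or RMAN snapshot deletion:

```python
from datetime import date, timedelta

def snapshots_to_prune(snapshot_dates, keep_days=7, today=None):
    """Return snapshot dates that fall outside the retention window.

    Pure date arithmetic: anything strictly older than
    (today - keep_days) is a candidate for deletion.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=keep_days)
    return sorted(d for d in snapshot_dates if d < cutoff)

snaps = [date(2024, 1, 1), date(2024, 1, 8), date(2024, 1, 10)]
stale = snapshots_to_prune(snaps, keep_days=7, today=date(2024, 1, 12))
# stale == [date(2024, 1, 1)]
```

Keeping the pruning decision as a pure function (dates in, dates out) makes it easy to unit-test separately from the cloud API calls that actually delete snapshots.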

Posted 2 months ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Noida, Pune, Bengaluru

Work from Office

- Java, Spring Boot, Microservices, Azure, Core Java, Generics, Collections - Streams, Lambda - File Handling, Multithreading - Spring Boot experience is a must, with any RDBMS - Experience with microservices - Experience with Docker containers - Experience with CI/CD (Jenkins/AWS/Azure) tools and SCM tools (Git) - Good to have exposure to Unix/Linux ecosystems and shell commands - Strong analytical and communication skills - Participate in the scrum process and deliver stories/features according to the schedule - Collaborate with the team, leads, and stakeholders to understand the scope and design of a deliverable - Ensure good quality coding by adhering to best practices - Experience working in Agile methodology - Triage production support issues post-deployment and drive solutions as required - Able to build reusable components, frameworks and libraries which can be leveraged across applications - Work very closely with leads

Posted 2 months ago

Apply

8.0 - 14.0 years

15 - 20 Lacs

Bengaluru

Work from Office

- Technical Lead with a total IT experience of 8-12 years. - 2+ years of experience as a Technical Lead. - Strong programming knowledge in one of the following technology areas: 1. Python: Familiarity with frameworks like FastAPI or Flask, along with data libraries like NumPy and Pandas. 2. .NET: Knowledge of ASP.NET and Web API development. 3. Java: Proficiency with Spring or Spring Boot. - Experience with any one of the following cloud platforms and services: 1. Azure: Azure App Service or Azure Functions, Azure Storage 2. AWS: Elastic Beanstalk, Lambda, S3 - Experience with at least one of the following databases: Oracle, Azure SQL, SQL Server, Cosmos DB, MySQL, PostgreSQL, or MongoDB. - A minimum of 3 months' experience in developing GenAI solutions using any LLMs and deploying them on cloud platforms. - Lead, mentor, and manage a team of developers to deliver complex IT solutions.

Posted 2 months ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Design, develop, and maintain scalable and efficient Python applications using frameworks like FastAPI or Flask. Develop, test, and deploy RESTful APIs to interact with front-end services. Integrate and establish connections between various relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB) using tools such as SQLAlchemy. Solid understanding of relational and NoSQL databases and the ability to establish and manage connections from Python applications. Write clean, maintainable, and efficient code, following coding standards and best practices. Leverage AWS cloud services for deploying and managing applications (e.g., EC2, Lambda, RDS, S3, etc.). Troubleshoot and resolve software defects, performance issues, and scalability challenges.
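The database-connection skill described above follows the same connect/execute/fetch pattern regardless of engine. A minimal sketch using the stdlib sqlite3 module in place of the production engines listed (table and column names are illustrative; a real service would point a connection or SQLAlchemy engine at MySQL/PostgreSQL):

```python
import sqlite3

# In-memory database stands in for a real server; the DB-API calls
# below (execute, executemany, fetchall) are the portable part.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO users (name) VALUES (?)",  # parameterized, never string-formatted
    [("ada",), ("grace",)],
)
rows = conn.execute("SELECT name FROM users ORDER BY name").fetchall()
# rows == [("ada",), ("grace",)]
conn.close()
```

Parameterized queries (the `?` placeholders) rather than string formatting are the habit worth noting here; they carry over unchanged to the MySQL and PostgreSQL drivers.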

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.