6.0 - 11.0 years
20 - 30 Lacs
Hyderabad, Bengaluru
Hybrid
Notice Period: Immediate to 15 days max.
Virtusa JD: 8+ years of experience in data engineering, specifically in cloud environments like AWS. Do not share Data Science profiles. Proficiency in Python and PySpark for data processing and transformation tasks. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.
Technical Skills: Deep understanding of ETL concepts and best practices. Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with Data Warehousing and Big Data technologies, specifically within AWS.
Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications.
Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
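The ETL duties above center on record-level transformation. As a rough illustration only (the schema, field names, and cleansing rules below are hypothetical, not taken from the posting), the core of a Glue/PySpark job often reduces to a pure transformation applied per record:

```python
from datetime import datetime

def normalize_record(raw: dict) -> dict:
    """Normalize one raw event into a warehouse-friendly shape.

    Hypothetical schema: 'order_id', 'amount', 'order_date' are
    illustrative field names, not from the job posting.
    """
    return {
        "order_id": str(raw["order_id"]).strip(),
        # Cast currency strings like "1,234.50" to float
        "amount": float(str(raw["amount"]).replace(",", "")),
        # Standardize dates to ISO-8601 for partitioning downstream
        "order_date": datetime.strptime(
            raw["order_date"], "%d/%m/%Y"
        ).date().isoformat(),
    }

if __name__ == "__main__":
    rec = normalize_record(
        {"order_id": " A17 ", "amount": "1,234.50", "order_date": "05/03/2024"}
    )
    print(rec)
```

In a real Glue or PySpark job this kind of function would typically be applied via `rdd.map` or wrapped as a UDF; keeping it a pure function makes it easy to unit-test outside the cluster.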
Posted 3 weeks ago
7.0 - 9.0 years
7 - 17 Lacs
Pune
Remote
Requirements for the candidate: The role will require deep knowledge of data engineering techniques to create data pipelines and build data assets. At least 4+ years of strong hands-on programming experience with PySpark / Python / Boto3, including Python frameworks and libraries, following Python best practices. Strong experience in code optimization using Spark SQL and PySpark. Understanding of code versioning, Git repositories, and JFrog Artifactory. AWS architecture knowledge, especially S3, EC2, Lambda, Redshift, CloudFormation, etc., and able to explain the benefits of each. Code Refactoring of Legacy Codebase: Clean, modernize, and improve readability and maintainability. Unit Tests/TDD: Write tests before code, ensure functionality, catch bugs early. Fixing Difficult Bugs: Debug complex code, isolate issues, resolve performance, concurrency, or logic flaws.
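The unit-tests/TDD and refactoring requirements above can be pictured with a small, invented example (the function and test data are hypothetical, not from the posting): write the assertions first, then the minimal implementation that satisfies them.

```python
def dedupe_keep_latest(rows: list, key: str, version: str) -> list:
    """Keep only the highest-'version' row per 'key' value.

    A typical refactoring target: the legacy form of such logic is
    often a nested O(n^2) loop; a dict keyed on the business key
    makes it a single O(n) pass.
    """
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[version] > latest[k][version]:
            latest[k] = row
    return list(latest.values())

# Tests written first, TDD-style (plain asserts; pytest would collect
# these if wrapped in test_* functions)
rows = [
    {"id": 1, "v": 1, "val": "old"},
    {"id": 1, "v": 2, "val": "new"},
    {"id": 2, "v": 1, "val": "only"},
]
result = dedupe_keep_latest(rows, key="id", version="v")
assert {r["id"]: r["val"] for r in result} == {1: "new", 2: "only"}
assert dedupe_keep_latest([], key="id", version="v") == []
```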
Posted 3 weeks ago
6.0 - 11.0 years
15 - 25 Lacs
Kolkata, Hyderabad, Bengaluru
Work from Office
Responsibilities: Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS. Build data pipelines by building ETL (Extract-Transform-Load) processes. Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems. Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security. Perform unit testing on the modified software to ensure that new functionality works as expected while existing functionality continues to work the same way. Coordinate with release management and other supporting teams to deploy changes to the production environment.
Qualifications we seek in you! Minimum Qualifications: Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks will be an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills: Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering & Cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes.
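The fault-tolerance and availability responsibilities in this posting usually start with something as simple as a retry-with-backoff wrapper around flaky downstream calls. The sketch below is generic, not from the posting; the delay values are illustrative.

```python
import time

def with_retries(fn, max_attempts: int = 3, base_delay: float = 0.1):
    """Call fn(); on failure, retry with exponential backoff.

    base_delay doubles each attempt (0.1s, 0.2s, 0.4s, ...). A
    production pipeline would also add jitter and log each failure.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # exhausted: surface the error to the scheduler
            time.sleep(base_delay * (2 ** attempt))

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky():
        # Simulated transient failure: succeeds on the third call
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient")
        return "ok"
    print(with_retries(flaky))
```

Services like Glue and Step Functions provide built-in retry policies; a hand-rolled wrapper like this is mainly useful inside custom Python steps.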
Posted 3 weeks ago
8.0 - 13.0 years
30 - 40 Lacs
Noida, Hyderabad
Hybrid
Job Title: Data Engineer Location : Noida / Hyderabad (Hybrid 3 days/week) Shift Timings : 2:30 PM to 10:30 PM IST Start Date : Immediate / July 2025 Experience : 8+ years Tech Stack : AWS, Python, PySpark, EMR, Athena, Glue, Lambda, EC2, S3, Git, Data Warehousing, Parquet, Avro, ORC Job Description : We're hiring experienced Data Engineers with a strong background in building scalable data pipelines using AWS and PySpark. You'll work with distributed systems, big data tools, and analytics services to deliver solutions for high-volume data processing. Key Responsibilities : Build and optimize PySpark applications Work with AWS services: EMR, Glue, Lambda, Athena, etc. Implement data modeling and warehousing concepts Collaborate with teams using Git and CI/CD pipelines Utilize formats like Parquet, Avro with compression techniques Apply only if you can join within 2 weeks or are an immediate joiner. To Apply : Send your resume to vijay.s@xebia.com with the following details: Full Name Total Experience Current CTC Expected CTC Current Location Preferred Location (Noida/Hyderabad) Notice Period / Last Working Day Primary Skills LinkedIn Profile
Posted 3 weeks ago
2.0 - 7.0 years
20 - 35 Lacs
Bengaluru
Hybrid
Role & responsibilities: Handle service delivery and ensure performance across all Amazon Connect support areas. Oversee overall support operations, enhancements, and system updates. Act as the primary escalation point for incidents. Manage SLAs and ensure service standards are met. Identify process gaps and implement improvements. Lead and mentor junior engineers. Maintain relationships with internal and external stakeholders. Skills Required: Deep hands-on experience with Amazon Connect. Strong knowledge of AWS Lambda, DynamoDB, S3. In-depth understanding of contact flows, queues, routing profiles, quick connects, telephony configuration, and call routing. Strong troubleshooting skills for WebRTC and voice issues. Experience with CloudWatch, Connect Metrics, CI/CD pipelines. Experience integrating with Salesforce (Service Cloud Voice). Good documentation and process-improvement capability. Strong leadership and communication skills.
Posted 3 weeks ago
8.0 - 13.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering. Service Line: Strategic Technology Group.
Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be Polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding. End-to-end contribution to technology-oriented development projects. Providing solutions with minimum system requirements, in Agile mode. Collaborate with Power Programmers, the Open Source community, and tech user groups. Custom development of new platforms, solutions, and opportunities. Work on large-scale digital platforms and marketplaces. Work on complex engineering projects using cloud-native architecture. Work with innovative Fortune 500 companies on cutting-edge technologies. Co-create and develop new products and platforms for our clients. Contribute to Open Source and continuously upskill in the latest technology areas. Incubate tech user groups.
Technical and Professional: Must have: Java, Microservices, Spring Boot (with knowledge of cloud implementation - AWS, Azure, or GCP, Kubernetes/Docker), design skills (design patterns), data structures, Angular (any front-end tech stack), CI/CD, Lambda.
Preferred Skills: Technology-Cloud Platform-AWS Database; Technology-Java-Java - ALL; Technology-UI & Markup Language-Angular JS/Angular 1.x-Angular 2; Technology-Reactive Programming-React JS; Technology-Algorithms and Data Structures-Algorithms and Data Structures - ALL; Technology-Java-Spring Boot; Technology-Cloud Platform-AWS Networking Services-AWS Transit Gateway; Technology-Full stack-Java Full stack; Foundational-SDLC-Architecture Designing
Posted 3 weeks ago
6.0 - 11.0 years
25 - 40 Lacs
Bengaluru
Remote
Key Responsibilities:
C++: 6-8+ years of working experience in C++ programming, memory management, and file I/O and streams concepts.
Multithreading: Strong understanding of multithreading (creating and managing threads, synchronization mechanisms such as mutexes and condition variables) and kernel-level concepts.
Linux: Good understanding of developing and triaging on Linux (command-line tools, POSIX, processes, networking).
Unit Test: Good understanding of writing unit tests for the developed application.
Coding Test: Evaluate coding tests and C++ coding standards.
Architecture: Strong understanding of building applications in a C++ environment.
Good-to-have skills:
SCM Tool & IDE: Good exposure to Agile and Scrum methodologies, Git, and Confluence; ability to integrate the IDE with a source-control system such as ClearCase and to set up a Linux IDE.
Web Application: Good understanding of developing web applications on a C++ platform.
Project Exposure: Strong understanding of projects and the SDLC process.
Troubleshooting: Experience in debugging and troubleshooting.
Performance Optimization: Reducing memory allocations, optimizing loops, and using inline functions.
Troubleshooting: Experience in debugging, troubleshooting, and performance optimization (e.g., reducing memory allocations, optimizing loops, and using inline functions).
Docker & Containers: Good understanding of Docker and containers for deployment.
Soft Skills:
Communication: Concise and articulate written and verbal communication.
Interpersonal Skills: Maintaining positive relationships through empathy, active listening, and emotional intelligence.
Attitude: A positive attitude to be more adaptable, collaborative, and able to overcome challenges effectively.
Decision Making: Understanding the factors that influence decision making and employing appropriate strategies and techniques.
Collaboration: Working together with others to achieve a common goal or objective.
Posted 3 weeks ago
5.0 - 10.0 years
22 - 37 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Inviting applications for the role of Senior Principal Consultant-Data Engineer, AWS! Locations: Bangalore, Hyderabad, Kolkata. Responsibilities: Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka. Integrate structured and unstructured data from various data sources into data lakes and data warehouses. Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift). Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness. Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms. Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost. Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud-based services such as Amazon AWS. Build data pipelines by building ETL (Extract-Transform-Load) processes. Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs. Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements. Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems. Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost, require minimal maintenance, and provide high availability with improved security. Perform unit testing on the modified software to ensure that new functionality works as expected while existing functionality continues to work the same way. Coordinate with release management and other supporting teams to deploy changes to the production environment.
Qualifications we seek in you! Minimum Qualifications: Experience in designing and implementing data pipelines, building data applications, and data migration on AWS. Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift. Experience with Databricks will be an added advantage. Strong experience in Python and SQL. Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift. Advanced programming skills in Python for data processing and automation. Hands-on experience with Apache Spark for large-scale data processing. Experience with Apache Kafka for real-time data streaming and event processing. Proficiency in SQL for data querying and transformation. Strong understanding of security principles and best practices for cloud-based environments. Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment. Strong communication and collaboration skills to work effectively with cross-functional teams. Preferred Qualifications/Skills: Master's degree in Computer Science, Electronics, or Electrical Engineering. AWS Data Engineering & Cloud certifications, Databricks certifications. Experience with multiple data integration technologies and cloud platforms. Knowledge of Change & Incident Management processes.
Posted 3 weeks ago
2.0 - 5.0 years
3 - 9 Lacs
Ahmedabad
Work from Office
Sr. AWS Developer * AWS Lambda * SFTP and S3 * Python, Java, JavaScript * Terraform, API Gateway, IAM roles. Annual bonus.
Posted 3 weeks ago
4.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Hybrid
Job Title: AWS Engineer Experience: 4-8 Years Location: Bengaluru (Hybrid, 2-3 Days Onsite per Week) Employment Type: Full-Time Notice Period: Only Immediate to 15 Days Joiners Preferred Job Description: We are looking for an experienced AWS Engineer to join our dynamic data engineering team. The ideal candidate will have hands-on experience building and maintaining robust, scalable data pipelines and cloud-based architectures on AWS. Key Responsibilities: Design, develop, and maintain scalable data pipelines using AWS services such as Glue, Lambda, S3, Redshift, and EMR. Collaborate with data scientists and ML engineers to operationalize machine learning models using AWS SageMaker. Implement efficient data transformation and feature engineering workflows. Optimize ETL/ELT processes and enforce best practices for data quality and governance. Work with structured and unstructured data using Amazon Athena, DynamoDB, RDS, and similar services. Build and manage CI/CD pipelines for data and ML workflows using AWS CodePipeline, CodeBuild, and Step Functions. Monitor data infrastructure for performance, reliability, and cost-effectiveness. Ensure data security and compliance with organizational and regulatory standards. Required Skills: Strong experience with AWS data and ML services. Solid knowledge of ETL/ELT frameworks and data modeling. Proficiency in Python, SQL, and scripting for data engineering. Experience with CI/CD and DevOps practices on AWS. Good understanding of data governance and compliance standards. Excellent collaboration and problem-solving skills.
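Serverless pieces of pipelines like the ones described here are commonly Lambda functions triggered by S3 events. A minimal, hypothetical handler is sketched below; the routing logic is invented for illustration, while the event shape follows the standard S3 notification format.

```python
def handler(event, context):
    """Minimal AWS Lambda handler for S3 put notifications.

    Extracts bucket/key from each record; a real function would then
    read the object with boto3 and transform it or start a Glue job.
    """
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Placeholder for real work (e.g., boto3 get_object + transform)
        processed.append(f"s3://{bucket}/{key}")
    return {"status": "ok", "objects": processed}

if __name__ == "__main__":
    # Invoke locally with a trimmed-down sample S3 event
    sample_event = {
        "Records": [
            {"s3": {"bucket": {"name": "raw-data"},
                    "object": {"key": "dt=2025-01-01/part-0.parquet"}}}
        ]
    }
    print(handler(sample_event, None))
```

Because the handler is a plain function of `(event, context)`, it can be unit-tested locally with sample events, which fits the CI/CD practices the posting asks for.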
Posted 3 weeks ago
5.0 - 10.0 years
16 - 30 Lacs
Hyderabad
Work from Office
Dear Candidate, Greetings of the day! We have an urgent requirement for a .NET Full Stack Developer with one of our esteemed clients; please find the requirement below:
Position Details: Skillset: .NET Core, AWS Services (S3, EC2, Lambda, SQS, and CloudWatch). Experience: 5 to 10 Years. Location (Job): Hyderabad. Work from Office: 5 days a week. Interview Location: Hyderabad. Interview Dates: 18th & 19th July 2025. Interview Mode: Face to Face. Notice Period: Immediate to 30 Days.
Walk-in Interview Details: Interview Dates: 18th & 19th July 2025. Interview Location: Hyderabad. Job Location: Hyderabad.
Required Skills & Qualifications: Full Stack Development: Proven experience using .NET technologies (e.g., .NET 4.6.1, .NET Core 3, ASP.NET Web API 2). Cloud Technologies: Experience with AWS services like S3, Lambda, CloudWatch, and EC2.
If you are interested and available for the walk-in interview, please send your updated resume to Sneha.k@precisiontechcorp.com. We look forward to hearing from you soon! Thanks & Regards, Sneha K, Sneha.k@precisiontechcorp.com
Posted 3 weeks ago
3.0 - 8.0 years
10 - 18 Lacs
Pune
Work from Office
Required Skills & Qualifications: Strong hands-on experience with AWS Glue, AWS Lambda, and Azure Data Services. Experience with Databricks for large-scale data processing and test validation. Proficiency in PySpark and Python scripting for test automation and data validation. Strong SQL skills for data validation and transformation testing. Familiarity with cloud-native monitoring and logging tools (e.g., CloudWatch, Azure Monitor). Understanding of data warehousing concepts, data lakes, and batch/streaming architectures. Experience with CI/CD pipelines and automated testing frameworks is a plus.
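Much of the data-validation work described above boils down to reconciliation checks between source and target datasets. A simplified, framework-free sketch follows; the table names and tolerance handling are invented for illustration.

```python
def reconcile_counts(source_counts: dict, target_counts: dict,
                     tolerance: int = 0) -> list:
    """Compare per-table row counts between source and target systems.

    Returns a list of human-readable mismatch messages; an empty list
    means the loads reconcile within the given tolerance.
    """
    issues = []
    for table, expected in source_counts.items():
        actual = target_counts.get(table)
        if actual is None:
            issues.append(f"{table}: missing in target")
        elif abs(actual - expected) > tolerance:
            issues.append(f"{table}: expected {expected}, got {actual}")
    return issues

if __name__ == "__main__":
    src = {"orders": 1000, "customers": 250}
    tgt = {"orders": 998, "customers": 250}
    print(reconcile_counts(src, tgt))
    print(reconcile_counts(src, tgt, tolerance=5))
```

In practice the count dicts would come from SQL queries against the source database and the lake/warehouse (e.g., via Athena or Databricks), and the check would run inside a CI pipeline.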
Posted 3 weeks ago
5.0 - 8.0 years
9 - 19 Lacs
Pune, Chennai
Work from Office
AWS + Python Senior Engineer. Experience: 5-8 Years. Location: Pune, Chennai. Skills: Python, SQL, Git, AWS (Lambda, ECS, EC2, S3).
Desired Competencies (Technical/Behavioral Competency)
Must-Have: Good knowledge in Python. Good knowledge in REST API development. Good knowledge in AWS services Lambda, ECS, EC2, S3. Good knowledge in Agile concepts. Good knowledge in problem solving and SQL.
Good-to-Have: Good knowledge of Git. Good knowledge in PL/SQL. Good knowledge in Vue JS. Good knowledge in AWS networking. Good knowledge in unit testing using pytest.
Responsibility of / Expectations from the Role: Develop Python code as per requirements. Develop REST APIs as per requirements. Work in an Agile team.
Posted 3 weeks ago
7.0 - 12.0 years
15 - 25 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Essential Skills: 2+ years of experience in AWS. 4+ years of in-depth knowledge in core Java, Spring, and Hibernate. 2+ years of experience in Spring Boot. 2+ years of experience in RESTful HTTP services design. 2+ years of experience in JavaScript, jQuery, Bootstrap, HTML5, CSS3. Good to have: Exposure to Microservices, Docker, Kubernetes, and cloud deployment.
Posted 3 weeks ago
4.0 - 9.0 years
0 Lacs
Indore, Hyderabad, Pune
Work from Office
Role: Digital Python - Python Developer for AWS platform
Required Technical Skill Set: Implement features, sub-components, and services leveraging AWS services where possible; data parsing and processing using Python; test, troubleshoot, bug fix, deploy.
Desired Experience Range: 5-10 years
Location of Requirement: Pune (Sahyadri Park), Hyderabad (Adibatla), Indore
Desired Competencies (Technical/Behavioral Competency)
Must-Have: Hands-on experience in Python and AWS services
Good-to-Have: S3, IAM, Lambda, CloudFormation
Responsibility of / Expectations from the Role:
1. Should be able to interact with the business and understand the requirements
2. Build technological capabilities and mentor the team
3. Experience with Agile methodologies - Scrum, continuous integration
4. Attention to detail; desire and ability to work in a multi-distributed team environment
5. Ability to excel in short timeframes under short sprints
6. Strong communication and documentation skills
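The "data parsing and processing using Python" duty can be pictured as something like the following, purely illustrative sketch (the required `id` field is an invented example, not from the posting):

```python
import json

def parse_events(lines):
    """Parse newline-delimited JSON, separating good rows from bad.

    Returns (records, errors): malformed JSON, or rows missing the
    hypothetical required 'id' field, go to the error list for triage
    rather than crashing the whole batch.
    """
    records, errors = [], []
    for i, line in enumerate(lines):
        try:
            obj = json.loads(line)
            if "id" not in obj:
                raise KeyError("id")
            records.append(obj)
        except (json.JSONDecodeError, KeyError):
            errors.append((i, line))
    return records, errors

if __name__ == "__main__":
    good, bad = parse_events(['{"id": 1}', "not json", '{"noid": 2}'])
    print(good)
    print(bad)
```

Routing bad rows to a separate error sink (a dead-letter S3 prefix, for instance) is a common pattern so one malformed record does not fail the job.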
Posted 4 weeks ago
12.0 - 17.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Work Location: Bangalore. Experience: 10+ yrs.
Required Skills: Experience with AWS cloud and AWS services such as S3 buckets, Lambda, API Gateway, SQS queues; experience with batch job scheduling and identifying data/job dependencies; experience with data engineering using the AWS platform and Python; familiar with AWS services like EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway; familiar with software DevOps CI/CD tools such as Git, Jenkins, Linux, and shell scripting.
Thanks & Regards, Suganya R, suganya@spstaffing.in
Posted 4 weeks ago
4.0 - 9.0 years
0 - 3 Lacs
Hyderabad, Bengaluru
Hybrid
Dear Candidate, Season's Greetings! Interested candidates: please share your updated resume along with your photo, PAN card, and PF member service history.
Role: Python Data Engineer. Work Nature: Contract to Hire (3rd-party payroll). Work Location: Pan India. Total Experience: 4+ yrs. Immediate joiners only. Email: tuppari.pradeep@firstmeridianglobal.com
Job Description: PySpark and Scala. Other data engineering skills like SQL. Good knowledge of PySpark execution, optimization, and Python.
Interested candidates: please share your updated resume along with your photo, PAN card, and PF member service history to the mentioned email: tuppari.pradeep@firstmeridianglobal.com
Details required: Full Name as per Govt Proofs; Contact Number; Alternate Contact Number; Email ID; Date of Birth; Father's Name; Total Experience; Relevant Experience; PAN Card Number; Current CTC; Expected CTC; Current Work Location; Preferred Location; Open for Relocation (Yes/No); Current Company Name; Notice Period; Mode of Employment (Contract/Permanent); If Contract, please provide payroll company.
Do you know anyone who could be interested in this profile? Feel free to share this email with them. Regards, Pradeep, tuppari.pradeep@firstmeridianglobal.com
Posted 4 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Looking for a Full Stack Developer who can join immediately. Job Position (Title): UI Full Stack Engineer. Experience Required: 6+ Years. Job Type: FTE. Location: Bangalore (Domlur).
Technical Skill Requirements: React JS, Redux, TypeScript, JavaScript, JS utility libraries, CSS/Flex/Grid, HTML; Node.js, Express JS, Microservice Design Patterns; AWS - Lambda, API Gateway, DynamoDB, S3, SQS, SNS, CloudWatch, EC2, CI/CD.
Role and Responsibilities: Managing the complete software development process from conception to deployment. Maintaining and upgrading the software following deployment. Managing the end-to-end life cycle to produce software and applications. Overseeing and guiding the analysing, writing, building, and deployment of software. Overseeing automated testing and providing feedback to management during the development process. Modifying and testing changes to previously developed programs.
Required Skills: 6+ years of experience developing enterprise-level applications using Node.js, React JS, TypeScript, JavaScript, HTML, CSS, and AWS. Experience working with AWS is a must. Should have good experience with a database like MongoDB or MySQL. Proficient in unit testing. Worked on REST APIs and CI/CD. Excellent verbal and written communication and collaboration skills to effectively communicate with both business and technical teams. Comfortable working in a fast-paced, result-oriented environment.
Posted 1 month ago
7.0 - 12.0 years
30 - 45 Lacs
Pune
Remote
Role: Python Full Stack Developer. Must-have skills: ReactJS, Node.js, STRONG Python, Flask, Django, NumPy, FastAPI, Cloud (AWS preferred). Exp: 6 to 10 Yrs. Location: Remote. NP: Immediate joiners, max 30 days.
Posted 1 month ago
6.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Your Role: Knowledge of cloud computing using AWS services like Glue, Lambda, Athena, Step Functions, S3, etc. Knowledge of the programming languages Python/Scala. Knowledge of Spark/PySpark (core and streaming), with hands-on experience transforming data using streaming. Knowledge of building real-time or batch ingestion and transformation pipelines. Works in the area of Software Engineering, which encompasses the development, maintenance, and optimization of software solutions/applications:
1. Applies scientific methods to analyse and solve software engineering problems.
2. He/she is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. The software engineer builds skills and expertise in his/her software engineering discipline to reach the standard skill expectations for the applicable role, as defined in Professional Communities.
5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.
Your Profile: Working experience and strong knowledge of Databricks is a plus. Analyze existing queries for performance improvements. Develop procedures and scripts for data migration. Provide timely scheduled management reporting. Investigate exceptions regarding asset movements.
What you will love about working at Capgemini: We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons.
Capgemini serves clients across industries, so you may get to work on varied data engineering projects involving real-time data pipelines, big data processing, and analytics. You'll work extensively with AWS services like S3, Redshift, Glue, Lambda, and more.
Posted 1 month ago
8.0 - 13.0 years
7 - 14 Lacs
Pune, Mumbai (All Areas)
Hybrid
Job Title: Lead Data Engineer Location: Mumbai / Pune Experience: 8+ yrs Job Summary: We are seeking a technically strong and delivery-focused Lead Engineer to support and enhance enterprise-grade data and application products under the Durables model. The ideal candidate will act as the primary technical interface for the client, ensuring high system availability, performance, and continuous improvement. This role requires a hands-on technologist with strong team management experience, cloud (AWS) expertise, and excellent communication skills to handle client interactions and drive technical decisions. Key Responsibilities: Support & Enhancement Leadership Act as the primary technical lead for support and enhancement of assigned products in the Durable portfolio. Ensure incident resolution, problem management, and enhancement delivery within agreed SLAs. Perform root cause analysis (RCA) and provide technical solutions to recurring issues. Should design data engineering solutions end to end. Ability to come up with scalable and modular solutions. Experience working in Agile implementations. Technical Ownership Provide technical direction and architectural guidance for improvements, optimizations, and issue resolutions. Drive best practices in code performance tuning, ETL processing, and cloud-native data management. Lead the modernization of legacy data pipelines and applications by leveraging AWS services (Glue, Lambda, Redshift, S3, EMR, Athena, etc.). Leverage PySpark, SQL, Python, and other tools to manage big data processing pipelines efficiently. Client Engagement Maintain high visibility with client stakeholders, act as a trusted technical advisor. Proactively identify and suggest improvements or innovation areas to improve business outcomes. Participate in daily stand-ups, retrospectives, and client presentations; communicate technical concepts clearly. 
Team & Delivery Management: Lead a cross-functional team of engineers, ensuring effective task allocation, mentorship, and upskilling. Monitor team performance, support capacity planning, and ensure timely, high-quality deliveries. Ensure adherence to governance, documentation, change-management practices, and high availability. Process & Quality Assurance: Implement and ensure compliance with engineering best practices including CI/CD, version control, and automated testing. Define support procedures and documentation standards, and ensure knowledge transition and retention within the team. Identify risk areas and dependencies, and propose mitigation strategies. Required Skills & Qualifications: 8+ years of experience in Data Engineering / Application Support and Development. 4+ years of strong hands-on expertise in the AWS ecosystem (Glue, Lambda, Redshift, S3, Athena, EMR, CloudWatch). Proficiency in PySpark, SQL, and Python, and in handling big data pipelines. Strong application debugging skills across batch and near-real-time systems. Good knowledge of incident lifecycle management, RCA, and performance optimization. Proven experience leading engineering teams (3+ years), preferably in support/enhancement environments. Excellent communication skills with proven client-facing capabilities. Strong documentation and process-adherence mindset. Experience with tools like JIRA, Confluence, Git, Jenkins, or any CI/CD pipeline. Good to Have: Experience in on-prem to AWS migration projects. Familiarity with legacy tech and its interaction with modern cloud stacks. Good knowledge of designing Hive tables with partitioning for performance.
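The "Hive tables with partitioning" point above rests on the convention that partition column values map to directory names in storage. A tiny sketch of that convention (bucket, table, and column names are hypothetical):

```python
def partition_path(base: str, partitions: dict) -> str:
    """Build a Hive-style partition directory path, e.g.
    s3://bucket/table/year=2025/month=07/.

    Partition pruning works because engines like Hive, Spark, and
    Athena can skip whole directories when a query filters on these
    columns, which is the performance win the posting alludes to.
    """
    parts = "/".join(f"{k}={v}" for k, v in partitions.items())
    return f"{base.rstrip('/')}/{parts}/"

if __name__ == "__main__":
    print(partition_path("s3://warehouse/orders", {"year": 2025, "month": "07"}))
```

In a real pipeline you would rarely build these paths by hand; `DataFrame.write.partitionBy("year", "month")` in Spark produces the same layout, but knowing the convention matters when debugging or registering external tables.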
Posted 1 month ago
Experience: 7-12 years
Salary: 14-24 Lacs
Locations: Hyderabad, Ahmedabad, Chennai
Work Mode: Hybrid
Job Title: UI Lead

Key Responsibilities:
- Develop and maintain web applications using React, Node.js, TypeScript, and SCSS.
- Write and maintain unit tests using Jest and react-test-renderer.
- Collaborate with backend and DevOps teams to integrate APIs and ensure seamless functionality.
- Implement infrastructure components leveraging AWS CDK, Lambda, and API Gateway.
- Participate in code reviews, performance optimization, and deployment support.
- Follow best practices for responsive design, security, and accessibility.

Required Skills:
- Strong hands-on experience with React, Node.js, and TypeScript.
- Proficiency in SCSS and modern frontend development workflows.
- Working knowledge of Jest with react-test-renderer.
- Exposure to AWS services such as CDK, Lambda, and API Gateway.
- Solid understanding of RESTful APIs and cloud-native application architecture.

Additional Requirements:
- Ability to work independently and collaboratively in a fast-paced environment.
- Strong problem-solving skills and attention to detail.
- Excellent communication and team collaboration abilities.

Regards,
Vibha
Posted 1 month ago
Experience: 3-5 years
Salary: 22-30 Lacs
Location: Hyderabad
Work Mode: Hybrid
Key Skills: AWS, Java, Kafka, Spring Boot, Python, Lambda

Roles and Responsibilities:
- Application Development: Design, develop, test, and maintain Spring Boot applications.
- Microservices Architecture: Build and integrate microservices using Spring Boot.
- API Development: Develop and manage RESTful APIs.
- Cloud Deployment: Deploy and manage Spring Boot applications on AWS.
- AWS Infrastructure: Utilize AWS services such as EC2, S3, Lambda, and others.
- Database Management: Work with relational and NoSQL databases.
- CI/CD: Implement and maintain CI/CD pipelines for automated deployments.
- Troubleshooting and Monitoring: Identify and resolve issues in production environments, and monitor application performance.
- Collaboration: Collaborate with other developers, product managers, and stakeholders.

Skills Required:
- Programming Languages: Java, Python, SQL.
- Frameworks: Spring Boot, Spring Framework.
- Cloud Platforms: AWS (EC2, S3, Lambda, etc.).
- Databases: Relational (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB).
- API Development: RESTful APIs.
- Microservices: Experience with microservices architecture.
- CI/CD: Experience with CI/CD tools and pipelines.
- Tools: Docker, Kubernetes.
- Agile Development: Experience with Agile methodologies.
- Troubleshooting and Debugging: Strong debugging and problem-solving skills.
- Communication: Excellent communication and collaboration skills.

Experience: 3-5 years
Education: Bachelor's degree in a related technical/business area or equivalent work experience.
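For the AWS Lambda skill listed in this posting, a minimal hedged sketch of a Python handler; the event shape (API Gateway proxy-style) and the response body are illustrative assumptions, not part of the role description:

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler: reads a name from a JSON request body
    and returns an API Gateway proxy-style JSON response."""
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation (no AWS needed); context is unused here, so None suffices
resp = lambda_handler({"body": json.dumps({"name": "dev"})}, None)
print(resp["statusCode"], resp["body"])
```

The same handler signature is what a deployed function would expose; only the event source (API Gateway, S3, Kafka, etc.) changes the shape of `event`.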
Posted 1 month ago
Experience: 5-10 years
Salary: 5-12 Lacs
Location: Mumbai
Work Mode: Work from Office
Greetings from Future Focus Infotech!!!

We have multiple opportunities.

AWS Developer
Experience: 5+ years

Skills:
- Proficiency in the Databricks platform, its management, and optimization.
- Strong experience in AWS Cloud, particularly in data engineering and administration, with expertise in Apache Spark, S3, Athena, Glue, Kafka, Lambda, Redshift, and RDS.
- Proven experience in data engineering performance tuning, with analytical understanding in business and program contexts.
- Solid experience in Python development, specifically in PySpark within the AWS Cloud environment, including experience with Terraform.
- Knowledge of databases (Oracle, SQL Server, PostgreSQL, Redshift, MySQL, or similar) and advanced database querying.
- Experience with source control systems (Git, Bitbucket) and Jenkins for build and continuous integration.

Location: Mumbai
Job Type: This is a permanent position with Future Focus Infotech Pvt Ltd, and you will be deputed to our client.

A small glimpse of Future Focus Infotech Pvt Ltd (Company URL: www.focusinfotech.com).

If you are interested in the above opportunity, send your updated CV and the details below to reema.b@focusinfotech.com:
- Total Years of Experience:
- Current CTC:
- Expected CTC:
- Notice Period:
- Current Location:
- Available for interview on weekdays:
- PAN Card:

Thanks & Regards,
Reema
reema.b@focusinfotech.com
8925798887
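The ETL and performance-tuning skills in this posting can be illustrated with a tiny cleansing transform; this is plain Python standing in for the equivalent PySpark DataFrame filter/withColumn operations, and the column names and records are made up:

```python
def clean_orders(rows):
    """Drop records missing an order_id, de-duplicate on order_id, and
    normalize country codes - mirroring a typical PySpark cleansing step."""
    seen = set()
    out = []
    for row in rows:
        oid = row.get("order_id")
        if oid is None or oid in seen:  # drop nulls and duplicates
            continue
        seen.add(oid)
        out.append({**row, "country": row.get("country", "").strip().upper()})
    return out

raw = [
    {"order_id": 1, "country": " in "},
    {"order_id": None, "country": "us"},  # dropped: missing key
    {"order_id": 1, "country": "in"},     # dropped: duplicate of the first
]
print(clean_orders(raw))
# [{'order_id': 1, 'country': 'IN'}]
```

In PySpark the same intent would typically be expressed with `dropna`, `dropDuplicates`, and `upper(trim(col(...)))` over a DataFrame rather than a Python loop.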
Posted 1 month ago
Experience: 6-11 years
Salary: 25-40 Lacs
Locations: Hyderabad, Pune, Bengaluru
Work Mode: Hybrid
- Design, build, and deploy cloud-native and hybrid solutions on AWS and GCP.
- Experience in Glue, Athena, PySpark, Step Functions, Lambda, SQL, ETL, DWH, Python, EC2, EBS/EFS, CloudFront, Cloud Functions, Cloud Run (GCP), GKE, GCE, ECS, S3, etc.
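A hedged sketch of what the Step Functions + Lambda orchestration named in this posting can look like, expressed as an Amazon States Language definition built in Python; the state names and the Lambda ARN are placeholders, not details from the role:

```python
import json

# Two-step workflow: invoke an ETL Lambda, retry on task failure, then succeed.
state_machine = {
    "Comment": "Minimal ETL orchestration sketch",
    "StartAt": "RunEtl",
    "States": {
        "RunEtl": {
            "Type": "Task",
            # Placeholder ARN - substitute a real function ARN when deploying
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:etl-job",
            "Retry": [{"ErrorEquals": ["States.TaskFailed"], "MaxAttempts": 2}],
            "Next": "Done",
        },
        "Done": {"Type": "Succeed"},
    },
}

print(json.dumps(state_machine, indent=2))
```

A definition like this is what you would pass to `create_state_machine`, whether via the console, Terraform, or boto3; Step Functions then handles the scheduling and retries declared above.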
Posted 1 month ago