
519 Lambda Jobs - Page 9

JobPe aggregates results for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

7 - 14 Lacs

Kolkata

Remote

Hiring for Full Stack .NET Developer; immediate to 20-day joiners only. Mandatory skill set: .NET Core/MVC, React JS, AWS (Lambda). Required candidate profile: this role is purely a full-stack developer position with the .NET full stack and AWS Lambda. It is a permanent remote/work-from-home opportunity. Immediate joiners; experience range 3 to 5 years.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

12 - 24 Lacs

Bengaluru

Work from Office

Experience writing test cases, building test frameworks, and working with AWS services (Lambda, S3, EC2, CloudWatch) and Datadog. Experience with software/systems testing concepts; automotive security, IT security, and Linux security concepts; and testing embedded systems and IoT devices.

Posted 3 weeks ago

Apply

1.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are seeking a Senior Engineer/Developer with 4-6 years of experience to take on the responsibility of designing and developing AWS IoT and cloud-based applications. The ideal candidate should possess strong proficiency in TypeScript, Node.js, and cloud technologies, and will collaborate closely with cross-functional teams to deliver high-quality solutions.

Key responsibilities:
- Design and develop AWS IoT/cloud-based applications using TypeScript and Node.js.
- Work effectively with onsite, offshore, and cross-functional teams, such as Product Management, frontend developers, and SQA, to ensure timely and high-quality delivery of projects.
- Proactively identify risks and potential issues early in the development lifecycle and develop proofs of concept (POCs) to mitigate them.
- Bring strong communication skills, self-direction, high motivation, and organizational ability; apply analytical thinking and problem-solving skills while managing multiple projects.
- Write high-quality code using Test-Driven Development (TDD) and code quality tools.

Requirements:
- 6+ years of total experience in the software domain, with at least 3 years on cloud-native applications.
- Solid working knowledge of TypeScript and Node.js.
- 1-3 years of experience with AWS Serverless technologies (Lambda, SQS, SNS, Kinesis Streams).
- 3-5 years of hands-on experience with Node.js, JavaScript, and TypeScript.
- 3-5 years of experience with RDBMS and NoSQL datastores.
- Experience building high-TPS, high-availability, low-latency distributed systems.
- 1-2 years of experience with Infrastructure as Code.
- Familiarity with microservices architecture, cloud security and security design controls, API development (REST, GraphQL), MQTT protocols, and Agile/Scrum methodologies is highly desirable.

Qualifications:
- Education: B.E/B.Tech/MCA/M.Tech or equivalent qualification.
- Certifications: AWS Certification preferred but not mandatory.

If you are passionate about developing cutting-edge cloud-based applications and possess the required experience and skills, we would love to hear from you.

Contact: Khushboo - 7987108409; Nikhil - 8770401020
Job Type: Full-time. Schedule: Fixed shift. Work Location: In person.
Education: Bachelor's (Preferred). Experience: Angular: 4 years (Preferred); total work: 6 years (Preferred); Java: 4 years (Preferred). Language: English (Preferred).

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As a Senior Manager specializing in Data Analytics & AI, you will be a pivotal member of the EY Data, Analytics & AI Ireland team. Your role as a Databricks Platform Architect will involve enabling clients to extract significant value from their information assets through innovative data analytics solutions. You will have the opportunity to work across various industries, collaborating with diverse teams and leading the design and implementation of data architecture strategies aligned with client goals.

Your key responsibilities will include leading teams with varying skill sets in utilizing different Data and Analytics technologies, adapting your leadership style to meet client needs, creating a positive learning culture, engaging with clients to understand their data requirements, and developing data artefacts based on industry best practices. Additionally, you will assess existing data architectures, develop data migration strategies, and ensure data integrity and minimal disruption during migration activities.

To qualify for this role, you must possess a strong academic background in computer science or related fields, along with at least 7 years of experience as a Data Architect or in a similar role in a consulting environment. Hands-on experience with cloud services, data modeling techniques, data management concepts, Python, Spark, Docker, Kubernetes, and cloud security controls is essential. Ideally, you will have the ability to communicate technical concepts effectively to non-technical stakeholders, lead the design and optimization of the Databricks platform, work closely with the data engineering team, maintain a comprehensive understanding of the data pipeline, and stay updated on new and emerging technologies in the field.

EY offers a competitive remuneration package, flexible working options, career development opportunities, and a comprehensive Total Rewards package. Additionally, you will benefit from support, coaching, opportunities for skill development, and a diverse and inclusive culture that values individual contributions.

If you are passionate about leveraging data to solve complex problems, drive business outcomes, and contribute to a better working world, consider joining EY as a Databricks Platform Architect. Apply now to be part of a dynamic team dedicated to innovation and excellence.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

Founded in 2006, Rite Software is a global IT consulting company headquartered in Houston, Texas. Rite Software delivers strategic IT solutions for clients facing complex challenges involving cloud applications, cloud infrastructure, analytics, and digital transformation. We are looking for an AWS Developer with 3-7 years of experience. The bare-minimum requirements are proficiency in Python/Java scripting and API development; experience with Django and Flask is good to have. Keywords for this role include serverless architecture, Lambda, EC2, S3, CloudFormation, and CloudWatch.
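
For context on the serverless-plus-monitoring combination this listing names, here is a minimal sketch of a Python Lambda handler that reads a newly uploaded S3 object and publishes a custom CloudWatch metric. The bucket contents, metric namespace, and event wiring are assumptions for illustration, not part of the listing.

```python
import json

import boto3

s3 = boto3.client("s3")
cloudwatch = boto3.client("cloudwatch")

def handler(event, context):
    """Triggered by an S3 put event; counts records and emits a metric."""
    # The S3 notification payload carries the bucket and key of the new object.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    rows = json.loads(body)  # assumes the object is a JSON array of records

    # Publish a custom metric so the ingest can be monitored in CloudWatch.
    cloudwatch.put_metric_data(
        Namespace="IngestPipeline",  # hypothetical namespace
        MetricData=[
            {"MetricName": "RecordsIngested", "Value": len(rows), "Unit": "Count"}
        ],
    )
    return {"statusCode": 200, "body": f"processed {len(rows)} records"}
```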

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The client is a global technology consulting and digital solutions company with a vast network of entrepreneurial professionals spread across more than 30 countries. They cater to over 700 clients, leveraging their domain and technology expertise to drive competitive differentiation, enhance customer experiences, and improve business outcomes.

As part of the agile team, you will be responsible for developing applications, leading design sprints, and ensuring timely deliveries. Your role will involve designing and implementing low-latency, high-availability, high-performance applications. You will also ensure code modularity using microservices architecture in both frontend and backend development, following best practices in backend API development. Throughout the software development lifecycle, you will write code that is maintainable, clear, and concise. Your technical leadership will be crucial in mentoring team members to help them achieve their goals. Additionally, you will manage application deployment with a focus on security, scalability, and reliability, and manage and evolve automated testing setups for backend and frontend applications to facilitate faster bug reporting and fixing.

A solid understanding of RESTful API design and database design and management, along with experience with version control systems, is essential for this role. Strong problem-solving and communication skills, proficiency in object-oriented programming, C# or VB.NET, and writing reusable libraries are required. Familiarity with design and architectural patterns such as Singleton and Factory, RDBMSs such as SQL Server, Postgres, and MySQL, and writing clean, readable, and maintainable code will be beneficial. Experience implementing automated testing platforms and unit tests, and identifying opportunities to optimize code and improve performance, will be valuable assets, as is an understanding of software engineering best practices.

Nice-to-have skills include proficiency in AWS services such as EC2, S3, RDS, EKS, Lambda, CloudWatch, CloudFront, and VPC; experience with Git and DevOps tools such as Jenkins, UCD, Kubernetes, ArgoCD, and Splunk; and .NET/ReactJS skills.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Cloud Data Integration Consultant, you will be responsible for leading a complex data integration project that involves API frameworks, a data lakehouse architecture, and middleware solutions. The project focuses on technologies such as AWS, Snowflake, Oracle ERP, and Salesforce, with a high-transaction-volume POS system. Your role will involve building reusable and scalable API frameworks, optimizing middleware, and ensuring security and compliance in a multi-cloud environment.

Your expertise in API development and integration will be crucial for this project. You should have deep experience in managing APIs across multiple systems, building reusable components, and ensuring bidirectional data flow for real-time data synchronization. Additionally, your skills in middleware solutions and custom API adapters will be essential for integrating various systems seamlessly.

In terms of cloud infrastructure and data processing, strong experience with AWS services such as S3, Lambda, Fargate, and Glue will be required for data processing, storage, and integration. You should also have hands-on experience optimizing Snowflake for querying and reporting, as well as knowledge of Terraform for automating the provisioning and management of AWS resources.

Security and compliance are critical aspects of the project, and your deep understanding of cloud security protocols, API security, and compliance enforcement will be invaluable. You should be able to set up audit logs, ensure traceability, and enforce compliance across cloud services. Handling high-volume transaction systems and real-time data processing requirements will be part of your responsibilities; you should be familiar with optimizing AWS Lambda and Fargate for efficient data processing and skilled in operational monitoring and error-handling mechanisms.

Collaboration and support are essential for the success of the project. You will need to provide post-go-live support, collaborate with internal teams and external stakeholders, and ensure seamless integration between systems.

To qualify for this role, you should have at least 10 years of experience in enterprise API integration, cloud architecture, and data management. Deep expertise in AWS services, Snowflake, Oracle ERP, and Salesforce integrations is required, along with a proven track record of delivering scalable API frameworks and handling complex middleware systems. Strong problem-solving skills, familiarity with containerization technologies, and experience in the retail or e-commerce industries are also desirable.

Your key responsibilities will include leading the design and implementation of reusable API frameworks, optimizing data flow through middleware systems, building robust security frameworks, and collaborating with the in-house team for seamless integration between systems. Ongoing support, monitoring, and optimization post-go-live will also be part of your role.
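
As one small illustration of the "reusable API framework" theme in this listing, here is a hedged sketch of a Python decorator that centralizes parsing and error handling for API Gateway Lambda handlers, so each endpoint implements only its business logic. The endpoint, payload shape, and status-code policy are hypothetical.

```python
import functools
import json

def api_handler(func):
    """Wrap a Lambda endpoint with shared request parsing and error handling."""
    @functools.wraps(func)
    def wrapper(event, context):
        try:
            payload = json.loads(event.get("body") or "{}")
            result = func(payload, context)
            return {"statusCode": 200, "body": json.dumps(result)}
        except (KeyError, ValueError) as exc:
            # Missing or malformed fields become a client error.
            return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}
        except Exception:
            # Never leak internals; return a generic server error.
            return {"statusCode": 500, "body": json.dumps({"error": "internal error"})}
    return wrapper

@api_handler
def create_order(payload, context):
    # Hypothetical endpoint: a missing "order_id" raises KeyError -> 400 above.
    return {"order_id": payload["order_id"], "status": "accepted"}
```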

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

We are looking for an experienced and visionary Staff Backend Engineer to join our health tech startup. As a Staff Engineer, you will be responsible for architecting, developing, and scaling our platform that supports acute and preventive healthcare. Your role will involve collaborating with senior leadership, mentoring engineers, and leading the development of robust, cloud-native, scalable services. Your expertise will be crucial in driving technical strategy, ensuring high-quality deliverables, and implementing innovative solutions to meet our platform's evolving needs.

Your key responsibilities will include spearheading the architecture of scalable backend services, ensuring high availability, fault tolerance, and performance across microservices built with Kotlin, Spring Boot, and Python. You will design and develop cloud-native services with AWS technologies such as Lambda, EC2, EKS, SQS, SNS, and other services, ensuring optimal utilization of cloud infrastructure. Additionally, you will oversee the development and maintenance of a microservices-based architecture, ensuring efficient service communication and smooth data flow across the platform.

As a technical leader, you will guide engineering teams by setting coding standards and system design best practices, and mentor senior and junior engineers on technical topics. Collaboration with product managers, front-end engineers, DevOps teams, and other stakeholders will be essential to define requirements, prioritize technical work, and ensure alignment across teams. You will also design, optimize, and maintain scalable data models for PostgreSQL, MongoDB, and Redis, ensuring system performance and low-latency data access.

Furthermore, you will architect distributed data processing solutions, implement caching mechanisms, and build high-throughput, low-latency systems to meet the demands of real-time health applications. You will own the design and improvement of CI/CD pipelines for automated testing and deployments, ensuring smooth, frequent, and reliable releases using tools such as Jenkins, GitLab, or GitHub Actions. Monitoring production environments, establishing incident response processes, and providing leadership during critical incidents will be part of your responsibilities. Your collaboration with the executive team will help shape the company's technology roadmap and future strategic decisions, and you will mentor senior engineers, guide their technical growth, and help elevate their engineering practices and problem-solving abilities.

Qualifications:
- 8+ years of professional experience in backend development with a focus on Kotlin, Spring Boot, and Python.
- Expertise in architecting and scaling microservices architectures, including distributed systems and service-oriented architectures.
- Strong experience with AWS services, relational and NoSQL databases, Redis for caching, and designing high-throughput systems.
- Leadership experience in CI/CD pipelines, DevOps practices, containerization, and orchestration.
- Excellent communication and leadership skills to guide technical discussions and influence teams.
- Proficiency in Agile methodologies and prior experience mentoring engineering talent.

Nice to have:
- Experience in healthcare technology.
- Exposure to event-driven architectures and message brokers.
- Familiarity with AI/ML technologies in healthcare solutions.
- Knowledge of front-end technologies for seamless integrations.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a talented and experienced professional, you are invited to explore exciting opportunities in various engineering roles focused on cutting-edge technologies and cloud-native solutions. Join our team in Pune/Nagpur, Maharashtra, for full-time positions in UK shift timings. We are looking for individuals with a minimum of 6 to 8 years of relevant experience who are passionate about innovation and possess deep technical expertise. If you thrive in an agile environment and are eager to contribute to projects involving infrastructure automation, the software development life cycle (SDLC), and cloud services, we encourage you to consider the roles below.

Automation Engineer:
- Develop and maintain orchestration designs, RESTful APIs, and microservices.
- Implement BDD frameworks and work on CI/CD pipelines using Jenkins.
- Contribute to infrastructure management and cloud services deployment (AWS/Azure).
- Collaborate within Scrum and Agile frameworks.

Infrastructure Automation Delivery Manager:
- Manage end-to-end infrastructure automation projects.
- Provide leadership in project management, requirement gathering, and cloud architecture.
- Ensure compliance with SDLC and GxP processes.

DevOps/Release Engineer:
- Streamline deployment pipelines and maintain release schedules.
- Work on configuration management and cloud deployment processes.

Chef/SCM Engineer & Developer:
- Develop and manage Chef-based configuration management solutions.
- Enhance SCM tools and processes for improved efficiency.

Kubernetes Engineer:
- Design, deploy, and manage Kubernetes clusters and orchestration tools.
- Optimize deployments with Helm charts, ArgoCD, and Docker.

AWS Engineer:
- Architect and maintain cloud-based applications using AWS services.
- Utilize AWS tools like Lambda, CloudFront, and CloudWatch for deployment and monitoring.

Lead Frontend Software DevOps Engineer:
- Lead frontend DevOps projects, ensuring best practices in serverless architecture.
- Collaborate on cloud-native designs with a focus on performance and scalability.

Frontend Software DevOps Engineer:
- Implement frontend development with a focus on automation and scalability.
- Leverage modern frameworks for cloud-native solutions.

Azure Engineer:
- Deploy and manage Azure-based infrastructure and applications.
- Collaborate with cross-functional teams on cloud-native solutions.

Scrum Master - Cloud Native:
- Facilitate Agile/Scrum ceremonies for cloud-native projects.
- Drive collaboration across teams for efficient delivery.

Cloud Agnostic Engineer:
- Design and implement cloud-agnostic solutions across AWS and Azure platforms.
- Enhance cloud infrastructure with Kubernetes and Terraform.

Additionally, we are seeking individuals with strong communication, writing, and presentation skills, experience in Agile/Scrum methodologies, and familiarity with modern tools such as Artifactory, Docker, CI/CD pipelines, and serverless architectures. Join us to work on innovative projects, collaborate with a highly skilled team, and explore opportunities for professional growth and certifications. If this opportunity excites you, please share your updated resume with the job title mentioned in the subject line to jagannath.gaddam@quantumintegrators.com. Let's build the future together!

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Kolkata, Hyderabad, Pune

Work from Office

Airflow Data Engineer on the AWS platform. Job title: Apache Airflow Data Engineer ("ROLE" as per TCS Role Master).
• 4-8 years of experience in AWS, Apache Airflow (on the Astronomer platform), Python, PySpark, and SQL.
• Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement.
• Experience creating data pipelines and orchestrating them using Apache Airflow.
• Significant experience with data migrations and the development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes, and Data Marts.
• Good to have: experience with cloud ETL and ELT in a tool such as DBT, Glue, EMR, or Matillion.
• Excellent communication skills to liaise with business and IT stakeholders.
• Expertise in planning project execution and effort estimation.
• Exposure to Agile ways of working.
Candidates for this position will be offered TAIC or TCSL as the entity.
Keywords: data warehousing, PySpark, GitHub, AWS data platform, Glue, EMR, Redshift, Databricks, Data Marts, DBT/Glue/EMR or Matillion, data engineering, data modelling, data consumption.
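
For orientation, a minimal sketch of what "creating data pipelines and orchestrating them using Apache Airflow" looks like in practice; the DAG id, task bodies, and schedule are hypothetical, and the Astronomer platform runs standard Airflow 2.x DAGs of this form.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Hypothetical: pull the day's records from a source system.
    print("extracting...")

def load():
    # Hypothetical: write transformed records to the warehouse.
    print("loading...")

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run order: extract, then load
```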

Posted 3 weeks ago

Apply

4.0 - 6.0 years

8 - 15 Lacs

Hyderabad, Chennai

Hybrid

Mid-level, from 4+ years, full stack with Angular & .NET Core. Angular (v15+) with TypeScript; API endpoint integration (RxJS & Observables); Angular state management techniques, Angular routing, reactive forms, etc. HTML5, JavaScript, CSS & Bootstrap knowledge. Browser developer tools usage and Postman. Version control (GitHub, MS Git, etc.). Analytical, debugging & troubleshooting skills. .NET Core, C#.NET, ASP.NET Core 6.0/8.0; RESTful Web API programming; experience with EF Core 6.0/8.0; lambda expressions/LINQ, OOPS & ADO.NET; task-based asynchronous programming with C#; Microsoft SQL Server (T-SQL).

Posted 3 weeks ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Hyderabad

Hybrid

Role & responsibilities: We are seeking dynamic individuals to join our team as individual contributors, collaborating closely with stakeholders to drive impactful results. Working hours: 5:30 pm to 1:30 am (hybrid model).

Must-have skills:
1. 15 years of experience in the design and delivery of distributed systems capable of handling petabytes of data in a distributed environment.
2. 10 years of experience in the development of data lakes with data ingestion from disparate data sources, including relational databases, flat files, APIs, and streaming data.
3. Experience in the design and development of data platforms and data ingestion from disparate data sources into the cloud.
4. Expertise in core AWS services, including IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, and CloudTrail.
5. Proficiency in programming languages such as Python and PySpark to ensure efficient data processing, preferably Python.
6. Ability to architect and implement robust ETL pipelines using AWS Glue, Lambda, and Step Functions, defining data extraction methods, transformation logic, and data loading procedures across different data sources.
7. Experience developing event-driven distributed systems in the cloud using serverless architecture.
8. Ability to work with the infrastructure team on AWS service provisioning for databases, services, network design, IAM roles, and AWS clusters.
9. 2-3 years of experience working with DocumentDB or MongoDB environments.

Nice-to-have skills:
1. 10 years of experience developing data audit, compliance, and retention standards for data governance, and automating the governance processes.
2. Experience in data modelling with NoSQL databases like DocumentDB.
3. Experience using column-oriented data file formats like Apache Parquet, and Apache Iceberg as the table format for analytical datasets.
4. Expertise in developing Retrieval-Augmented Generation (RAG) and agentic workflows for providing context to LLMs based on proprietary enterprise data.
5. Ability to develop re-ranking strategies using results from index and vector stores for LLMs to improve the quality of the output.
6. Knowledge of AWS AI services such as AWS Entity Resolution and AWS Comprehend.
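
As a small illustration of the event-driven, serverless ETL orchestration this listing describes, here is a hedged sketch of a Python Lambda function that starts an AWS Glue job when a file lands in S3. The Glue job name and argument key are hypothetical; `start_job_run` is the standard boto3 Glue call.

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """S3 put event -> kick off a Glue ETL job for the new object."""
    record = event["Records"][0]["s3"]
    source_path = f"s3://{record['bucket']['name']}/{record['object']['key']}"

    # Pass the landing path to the Glue job as a job argument.
    response = glue.start_job_run(
        JobName="curate-orders",               # hypothetical Glue job
        Arguments={"--source_path": source_path},
    )
    return {"job_run_id": response["JobRunId"]}
```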

Posted 3 weeks ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Zirakpur

Work from Office

AWS services (Lambda, Glue, S3, DynamoDB, EventBridge, AppSync, OpenSearch); Terraform; Python; React/Vite; unit testing (Jest, Pytest); software development lifecycle.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Job Title/Role: Tech Lead (Java & Python)
Location: Noida/Delhi NCR
Experience: 7-10 years

Roles & responsibilities:
- Understand the client's business use cases and technical requirements, and convert them into technical solutions that elegantly meet the requirements.
- Identify different solutions and narrow down the best option that meets the business requirements.
- Develop the solution design considering aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, ensuring all relevant best practices are followed.
- Design the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
- Understand and relate technology integration scenarios and apply these learnings in projects.
- Resolve issues raised during code review through exhaustive, systematic analysis of the root cause, and justify the decisions taken.
- Be confident and self-driven, with plenty of initiative and the energy to quickly ramp up on upcoming technologies.
- Create and contribute to an environment geared toward innovation, high productivity, high quality, and customer service.
- Communicate with end clients, business users, and other technical teams, and provide estimates.
- Excellent communication and teamwork skills.

Qualification:
- B.Tech or MCA in computer science.
- More than 7 years of experience as a Java full-stack technologist (software development, testing, and production support), plus the Python programming language and the FastAPI framework.
- Design/development experience in the Java technical stack: Java/J2EE, design patterns, Spring Framework (Core, Boot, MVC), Hibernate, JavaScript, CSS, HTML, multithreading, data structures, Kafka, and SQL.
- Experience with data analytics tools and libraries such as pandas, NumPy, and scikit-learn.
- Familiarity with content management tools, and experience integrating both relational databases (e.g., Oracle, PostgreSQL, MySQL) and non-relational databases (e.g., DynamoDB, MongoDB, Cassandra).
- An in-depth understanding of public/private/hybrid cloud solutions and experience securely integrating public cloud into traditional hosting/delivery models, with a specific focus on AWS (S3, Lambda, API Gateway, EC2, Cloudflare).
- Working knowledge of Docker, Kubernetes, UNIX-based operating systems, and microservices.
- A clear understanding of continuous integration, build, release, and code quality (GitHub/Jenkins).
- Experience managing teams and time-bound projects. Working in the F&B industry or aerospace could be an added advantage.
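
Since the role explicitly calls out Python with the FastAPI framework, here is a minimal sketch of a FastAPI service with Pydantic request validation; the resource model and in-memory store are hypothetical, for illustration only.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    id: int
    name: str
    price: float

_items: dict[int, Item] = {}  # in-memory store for illustration only

@app.post("/items", status_code=201)
def create_item(item: Item) -> Item:
    # Pydantic has already validated types before we get here.
    if item.id in _items:
        raise HTTPException(status_code=409, detail="item already exists")
    _items[item.id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="item not found")
    return _items[item_id]
```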

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have good knowledge of the JavaScript language and expertise in the Cypress automation framework through BDD. Being well-versed with Git repositories and having expertise in RESTful API automation and UI automation is essential, along with experience in performance testing using Gatling. Knowledge of AWS services such as S3, SES, EC2, IAM, and Lambda is a plus. Hands-on experience with DevOps tools such as Docker, Kubernetes, GitLab/Jenkins, and CI/CD implementation is necessary, as is basic networking knowledge. You should have good experience in testing and test management, along with experience working in the Unified Communication domain.

Your key responsibilities will include participating in Pre-PI and PI planning and all SAFe Agile ceremonies. You will be responsible for creating an automation test framework from scratch and working on automated scripts, back-end scripting, and the user interface. Implementing new ideas to enhance the automation framework and conducting performance tests using Gatling are also part of the role. You will be involved in peer/junior code reviews, supporting juniors in creating reusable keywords/utils, and training new joiners through knowledge sharing. Monitoring and managing changes in regression tests, understanding and implementing the CI/CD pipeline flow, and pushing code daily into the code repository are crucial tasks. Understanding the product through manual exploratory testing and participating in estimation, Agile Scrum ceremonies, and backlog grooming are part of your responsibilities. You will also collate and monitor the defect management process, upload test cases in JIRA, provide daily status updates on stories in JIRA, generate reports, and provide input to the Test Manager.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

17 - 27 Lacs

Hyderabad

Work from Office

Job Title: Data Quality Engineer
Mandatory skills: Data Engineering, Python, AWS, SQL, Glue, Lambda, S3, SNS, SQS, ML

Job summary: We are seeking a highly skilled Data Engineer (SDET) to join our team, responsible for ensuring the quality and reliability of complex data workflows, data migrations, and analytics solutions across both cloud and on-premises environments. The ideal candidate will have extensive experience in SQL, Python, AWS, and ETL testing, along with a strong background in data quality assurance, data science platforms, DevOps pipelines, and automation frameworks. This role involves close collaboration with business analysts, developers, and data architects to support end-to-end testing, data validation, and continuous integration for data products. Expertise in tools like Redshift, EMR, Athena, and Jenkins and various ETL platforms is essential, as is experience with NoSQL databases, big data technologies, and cloud-native testing strategies.

Role and responsibilities:
- Work with business stakeholders, business systems analysts, and developers to ensure quality delivery of software.
- Interact with key business functions to confirm data quality policies and governed attributes.
- Follow quality management best practices and processes to bring consistency and completeness to integration service testing.
- Design and manage AWS test environments for data workflows during the development and deployment of data products.
- Assist the team in test estimation and test planning, and design and develop reports and dashboards.
- Analyze and evaluate data sources, data volume, and business rules; interpret and analyze data from various source systems to support data integration and data reporting needs.
- Test database applications to validate source-to-destination data movement and transformation.
- Develop complex SQL scripts (primarily advanced SQL) for cloud and on-prem ETL, and develop and summarize data quality analysis and dashboards.
- Work with team leads to prioritize business and information needs; troubleshoot and determine the best resolution for data issues and anomalies.
- Execute testing of data analytics and data integration on time and within budget.
- Apply functional, regression, system, integration, and end-to-end testing.
- Apply a deep understanding of data architecture and data modeling best practices for different data and analytics platforms, including data warehousing concepts with an emphasis on cloud/on-prem ETL, and be well versed in data flow and test strategy for cloud/on-prem ETL testing.
- Bring proficiency with SQL; familiarity with Python, Scala, Athena, EMR, Redshift, and AWS; NoSQL and unstructured data experience; extensive experience with programming tools from MapReduce to HiveQL; and experience with data science platforms such as SageMaker, Machine Learning Studio, or H2O.

Required skills and qualifications:
- Extensive experience in data migration is a must (Teradata to Redshift preferred), covering both data migration and data transformation testing.
- Extensive testing experience with SQL/Unix/Linux scripting is a must.
- Extensive experience testing cloud/on-prem ETL (e.g., Ab Initio, Informatica, SSIS, DataStage, Alteryx, Glue).
- Extensive experience with DBMSs such as Oracle, Teradata, SQL Server, DB2, Redshift, Postgres, and Sybase.
- Extensive experience using Python scripting and AWS and cloud technologies, including Athena, EMR, and Redshift.
- Experience in large-scale application development testing across cloud/on-prem data warehouses, data lakes, and data science, with multi-year, large-scale projects.
- Expert technical skills with hands-on testing experience using SQL queries.
- API/REST Assured automation, building reusable frameworks, and good technical expertise/acumen.
- Java/JavaScript: core Java, integration, and API implementation.
- Functional/UI: Selenium with BDD/Cucumber and SpecFlow; data validation with Kafka and big data; automation experience using Cypress.
- AWS/cloud: Jenkins, GitLab, EC2, S3, building Jenkins and CI/CD pipelines, SauceLabs.

Preferred skills:
- REST APIs and microservices using JSON, SoapUI.
- Extensive experience in the DevOps/DataOps space, including working with DevOps and build pipelines.
- Strong experience with AWS data services, including Redshift, Glue, Kinesis, Kafka (MSK), EMR/Spark, SageMaker, etc.
- Experience with technologies like Kubeflow, EKS, and Docker.
- Extensive experience with NoSQL and unstructured data stores such as MongoDB, Cassandra, Redis, and ZooKeeper.
- Extensive experience in MapReduce using tools such as Hadoop, Hive, Pig, Kafka, S4, and MapR.
- Experience using Jenkins and GitLab, and both Waterfall and Agile methodologies.
- Experience testing storage tools such as S3 and HDFS.
- Experience with one or more industry-standard defect or test case management tools.
- Great communication skills (regularly interacts with cross-functional team members).
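
To make the source-to-target validation work in this listing concrete, here is a hedged sketch of a row-count and checksum comparison over two DB-API connections (e.g., a Teradata source and a Redshift target). The table name, the `id` checksum column, and the connection objects passed in are all hypothetical; table names are assumed trusted since they are interpolated into SQL.

```python
def fetch_scalar(conn, sql):
    """Run a query on a DB-API connection and return the single value."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

def validate_migration(src_conn, tgt_conn, table):
    """Compare row counts and a numeric-column checksum between source and target."""
    src_count = fetch_scalar(src_conn, f"SELECT COUNT(*) FROM {table}")
    tgt_count = fetch_scalar(tgt_conn, f"SELECT COUNT(*) FROM {table}")
    assert src_count == tgt_count, f"{table}: {src_count} vs {tgt_count} rows"

    # A cheap content check: sum of a numeric key column on both sides.
    src_sum = fetch_scalar(src_conn, f"SELECT COALESCE(SUM(id), 0) FROM {table}")
    tgt_sum = fetch_scalar(tgt_conn, f"SELECT COALESCE(SUM(id), 0) FROM {table}")
    assert src_sum == tgt_sum, f"{table}: id checksum mismatch"

    return {"table": table, "rows": src_count}
```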

Posted 3 weeks ago

Apply

10.0 - 17.0 years

20 - 27 Lacs

Hyderabad

Work from Office

Required skills and qualifications:
- Extensive experience in data migration is a must (Teradata to Redshift preferred), covering both data migration and data transformation testing.
- Extensive testing experience with SQL/Unix/Linux scripting is a must.
- Extensive experience testing cloud/on-prem ETL (e.g., Ab Initio, Informatica, SSIS, DataStage, Alteryx, Glue).
- Extensive experience with DBMSs such as Oracle, Teradata, SQL Server, DB2, Redshift, Postgres, and Sybase.
- Extensive experience using Python scripting and AWS and cloud technologies, including Athena, EMR, and Redshift.
- Experience in large-scale application development testing across cloud/on-prem data warehouses, data lakes, and data science, with multi-year, large-scale projects.
- Expert technical skills with hands-on testing experience using SQL queries.
- API/REST Assured automation, building reusable frameworks, and good technical expertise/acumen.
- Java/JavaScript: core Java, integration, and API implementation.
- Functional/UI: Selenium with BDD/Cucumber and SpecFlow; data validation with Kafka and big data; automation experience using Cypress.
- AWS/cloud: Jenkins, GitLab, EC2, S3, building Jenkins and CI/CD pipelines, SauceLabs.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

22 - 27 Lacs

Hyderabad, Bengaluru

Work from Office

Job description:
- 8+ years of experience in data engineering, specifically in cloud environments like AWS.
- Proficiency in Python and PySpark for data processing and transformation tasks.
- Solid experience with AWS Glue for ETL jobs and managing data workflows.
- Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration.
- Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.

Technical skills:
- Deep understanding of ETL concepts and best practices.
- Strong knowledge of SQL for querying and manipulating relational and semi-structured data.
- Experience with data warehousing and big data technologies, specifically within AWS.

Additional skills:
- Experience with AWS Lambda for serverless data processing and orchestration.
- Understanding of AWS Redshift for data warehousing and analytics.
- Familiarity with data lakes, Amazon EMR, and Kinesis for streaming data processing.
- Knowledge of data governance practices, including data lineage and auditing.
- Familiarity with CI/CD pipelines and Git for version control.
- Experience with Docker and containerization for building and deploying applications.

Responsibilities:
- Design and build data pipelines: design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
- ETL development: develop and maintain extract, transform, and load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
- Data workflow automation: build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
- Data integration: work with AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
- Optimization and scaling: optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services such as Lambda, S3, and EC2.
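
A minimal sketch of the kind of AWS Glue PySpark job this listing describes: read a table from the Glue Data Catalog, deduplicate, and write curated Parquet to S3. The catalog database, table, key column, and bucket are hypothetical; the surrounding boilerplate is the standard Glue job skeleton and only runs inside a Glue environment.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job initialization.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw table registered in the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw", table_name="orders"  # hypothetical catalog entries
)

# Transform with plain Spark, then write curated Parquet back to S3.
df = dyf.toDF().dropDuplicates(["order_id"])
df.write.mode("overwrite").parquet("s3://curated-bucket/orders/")

job.commit()
```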

Posted 3 weeks ago

Apply

8.0 - 11.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Notice period: Immediate to 15 days.

Required skills & qualifications:
- Strong experience in backend development using Java (Java 8 or later).
- Hands-on experience with front-end technologies such as Vue.js or Angular (with a strong preference to work with Vue.js).
- Solid understanding of PostgreSQL and the ability to write optimized SQL queries and stored procedures.
- AWS cloud experience with knowledge of services such as EC2, RDS, S3, Lambda, and API Gateway.
- Experience building and consuming RESTful APIs.
- Proficiency with version control systems (e.g., Git).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Kolkata, Pune, Chennai

Work from Office

Key responsibilities:
- Develop and maintain applications using Python.
- Work with AWS services such as Lambda, EC2, and S3.
- Collaborate using DevOps tools such as GitLab for CI/CD.
- Handle data processing tasks using NumPy and Pandas.
- Optimize and scale data pipelines and services in the cloud.

Essential skills:
- Strong experience in Python programming.
- Proficiency with AWS services: Lambda, EC2, S3.
- Familiarity with GitLab for version control and CI/CD.
- Hands-on experience with NumPy and Pandas (3+ years).
- Good understanding of cloud-based application architecture.
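
For the pandas-on-AWS combination this role emphasizes, a minimal sketch: pull a CSV from S3 with boto3, aggregate with pandas, and write the result back. The bucket, keys, and column names are hypothetical.

```python
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
BUCKET = "analytics-data"  # hypothetical bucket

# Read the raw CSV straight from S3 into a DataFrame.
raw = s3.get_object(Bucket=BUCKET, Key="raw/sales.csv")
df = pd.read_csv(io.BytesIO(raw["Body"].read()))

# Aggregate: total revenue per region.
summary = df.groupby("region", as_index=False)["revenue"].sum()

# Write the summary back to S3 as a new CSV object.
buffer = io.StringIO()
summary.to_csv(buffer, index=False)
s3.put_object(
    Bucket=BUCKET,
    Key="curated/revenue_by_region.csv",
    Body=buffer.getvalue(),
)
```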

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 - 3 Lacs

Kolkata, Pune, Chennai

Work from Office

Skills required: Python and Amazon Web Services (AWS) cloud computing.
Experience range for required skills: 4-6 years.
Job description and essential skills: Language: Python. Cloud platform: AWS (Lambda, EC2, S3). DevOps tools: GitLab. Data/ML: NumPy, Pandas. 3+ years of hands-on development experience.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Kochi, Bengaluru

Work from Office

Job summary: We are seeking a highly skilled and motivated Machine Learning Engineer with a strong foundation in programming and machine learning, hands-on experience with AWS machine learning services (especially SageMaker), and a solid understanding of data engineering and MLOps practices. You will be responsible for designing, developing, deploying, and maintaining scalable ML solutions in a cloud-native environment.

Key responsibilities:
• Design and implement machine learning models and pipelines using AWS SageMaker and related services.
• Develop and maintain robust data pipelines for training and inference workflows.
• Collaborate with data scientists, engineers, and product teams to translate business requirements into ML solutions.
• Implement MLOps best practices, including CI/CD for ML, model versioning, monitoring, and retraining strategies.
• Optimize model performance and ensure scalability and reliability in production environments.
• Monitor deployed models for drift, performance degradation, and anomalies.
• Document processes, architectures, and workflows for reproducibility and compliance.

Required skills & qualifications:
• Strong programming skills in Python and familiarity with ML libraries (e.g., scikit-learn, TensorFlow, PyTorch).
• Solid understanding of machine learning algorithms, model evaluation, and tuning.
• Hands-on experience with AWS ML services, especially SageMaker, S3, Lambda, Step Functions, and CloudWatch.
• Experience with data engineering tools (e.g., Apache Airflow, Spark, Glue) and workflow orchestration.
• Proficiency in MLOps tools and practices (e.g., MLflow, Kubeflow, CI/CD pipelines, Docker, Kubernetes).
• Familiarity with monitoring tools and logging frameworks for ML systems.
• Excellent problem-solving and communication skills.

Preferred qualifications:
• AWS certification (e.g., AWS Certified Machine Learning - Specialty).
• Experience with real-time inference and streaming data.
• Knowledge of data governance, security, and compliance in ML systems.
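
To ground the SageMaker portion of this role, a hedged sketch using the SageMaker Python SDK to train and deploy a scikit-learn model as a managed endpoint. The execution role ARN, entry-point script, S3 paths, and instance choices are hypothetical.

```python
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical role

estimator = SKLearn(
    entry_point="train.py",          # hypothetical training script
    role=role,
    instance_type="ml.m5.large",
    instance_count=1,
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Launch a managed training job against data staged in S3.
estimator.fit({"train": "s3://ml-bucket/train/"})

# Stand up a real-time inference endpoint from the trained model.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```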

Posted 3 weeks ago

Apply

6.0 - 7.0 years

27 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Provide technical leadership and mentorship to data engineering teams.
- Architect, design, and deploy scalable, secure, and high-performance data pipelines.
- Collaborate with stakeholders, clients, and cross-functional teams to deliver end-to-end data solutions.
- Drive technical strategy and implementation plans in alignment with business needs.
- Oversee project execution using tools like JIRA, ensuring timely delivery and adherence to best practices.
- Implement and maintain CI/CD pipelines and automation tools to streamline development workflows.
- Promote best practices in data engineering and AWS implementations across the team.

Preferred candidate profile:
- Strong hands-on expertise in Python, PySpark, and Spark architecture, including performance tuning and optimization.
- Advanced proficiency in SQL and experience writing optimized stored procedures.
- In-depth knowledge of the AWS data engineering stack, including AWS Glue, Lambda, API Gateway, EMR, S3, Redshift, and Athena.
- Experience with Infrastructure as Code (IaC) using CloudFormation and Terraform.
- Familiarity with Unix/Linux scripting and system administration is a plus.
- Proven ability to design and deploy robust, production-grade data solutions.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Mumbai, Aurangabad

Work from Office

Joining: Preference for immediate joiners.

Job summary: We are seeking a skilled and motivated Data Developer with 3 to 5 years of hands-on experience in designing, developing, and maintaining scalable data solutions. The ideal candidate will work closely with data architects, data analysts, and application developers to build efficient data pipelines, transform data, and support data integration across various platforms.

Key responsibilities:
- Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load data from various structured and unstructured sources.
- Develop and optimize SQL queries, stored procedures, views, and functions for data analysis and reporting.
- Work with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery, Azure Synapse) to support business intelligence solutions.
- Collaborate with data engineers and analysts to implement robust data models and schemas for analytics.
- Ensure data quality, consistency, and accuracy through data validation, testing, and monitoring.
- Implement data security, compliance, and governance protocols in alignment with organizational policies.
- Maintain documentation related to data sources, data flows, and business rules.
- Participate in code reviews, sprint planning, and agile development practices.

Technical skills required:
- Languages & tools: SQL (advanced proficiency required); Python or Scala for data processing; shell scripting (Bash, PowerShell).
- ETL tools / data integration: Apache NiFi, Talend, Informatica, Azure Data Factory, SSIS, or equivalent.
- Data warehousing & databases: Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse; SQL Server, PostgreSQL, Oracle, or MySQL.
- Cloud platforms (at least one): AWS (Glue, S3, Redshift, Lambda); Azure (ADF, Blob Storage, Synapse); GCP (Dataflow, BigQuery, Cloud Storage).
- Big data & streaming (nice to have): Apache Spark, Databricks, Kafka, Hadoop ecosystem.
- Version control & DevOps: Git, Bitbucket; CI/CD pipelines (Jenkins, GitHub Actions).

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of professional experience as a Data Developer or Data Engineer.
- Strong problem-solving skills and the ability to work both independently and in a team environment.
- Experience working in Agile/Scrum teams is a plus.
- Excellent communication and documentation skills.

Preferred certifications (optional):
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics - Specialty
- Google Cloud Professional Data Engineer

Posted 3 weeks ago

Apply

0.0 - 3.0 years

3 - 8 Lacs

Chennai

Hybrid

Key Responsibilities

AWS infrastructure management:
- Design, deploy, and manage AWS infrastructure using services such as EC2, ECS, EKS, Lambda, RDS, S3, VPC, and CloudFront.
- Implement and maintain Infrastructure as Code using AWS CloudFormation, AWS CDK, or Terraform.
- Optimize AWS resource utilization and costs through rightsizing, reserved instances, and automated scaling.
- Manage multi-account AWS environments using AWS Organizations and Control Tower.
- Implement disaster recovery and backup strategies using AWS services.

CI/CD pipeline development:
- Build and maintain CI/CD pipelines using AWS CodePipeline, CodeBuild, CodeDeploy, and CodeCommit.
- Integrate with third-party tools like Jenkins, GitLab CI, or GitHub Actions when needed.
- Implement automated testing and security scanning within deployment pipelines.
- Manage deployment strategies, including blue-green deployments, using AWS services.
- Automate application deployments to ECS, EKS, Lambda, and EC2 environments.

Container and serverless management:
- Deploy and manage containerized applications using Amazon ECS and Amazon EKS.
- Implement serverless architectures using AWS Lambda, API Gateway, and Step Functions.
- Manage container registries using Amazon ECR.
- Optimize container and serverless application performance and costs.
- Implement service mesh architectures using AWS App Mesh when applicable.

Monitoring and observability:
- Implement comprehensive monitoring using Amazon CloudWatch, AWS X-Ray, and AWS Systems Manager.
- Set up alerting and dashboards for proactive incident management.
- Configure log aggregation and analysis using CloudWatch Logs and AWS OpenSearch.
- Implement distributed tracing for microservices architectures.
- Create and maintain operational runbooks and documentation.

Security and compliance:
- Implement AWS security best practices using IAM, security groups, NACLs, and AWS Config.
- Manage secrets and credentials using AWS Secrets Manager and Systems Manager Parameter Store.
- Implement compliance frameworks and automated security scanning.
- Configure Amazon GuardDuty, Amazon Inspector, and AWS Security Hub for threat detection.
- Manage SSL/TLS certificates using AWS Certificate Manager.

Automation and scripting:
- Develop automation scripts using Python, Bash, and the AWS CLI/SDK.
- Create AWS Lambda functions for operational automation.
- Implement event-driven automation using CloudWatch Events and EventBridge.
- Automate backup, patching, and maintenance tasks using AWS Systems Manager.
- Build custom tools and utilities to improve operational efficiency.

Required Qualifications

AWS expertise:
- Strong experience with core AWS services: EC2, S3, RDS, VPC, IAM, CloudFormation.
- Experience with container services (ECS, EKS) and serverless technologies (Lambda, API Gateway).
- Proficiency with AWS networking concepts and security best practices.
- Experience with AWS monitoring and logging services (CloudWatch, X-Ray).

Technical skills:
- Expertise in Infrastructure as Code using CloudFormation, CDK, or Terraform; see the sketch after this list.
- Strong scripting skills in Python, Bash, or PowerShell.
- Experience with CI/CD tools, preferably AWS-native services and Bitbucket Pipelines.
- Knowledge of containerization with Docker and orchestration with Kubernetes.
- Understanding of microservices architecture and distributed systems.
- Experience with configuration management and automation tools.

DevOps practices:
- Strong understanding of CI/CD best practices and GitOps workflows.
- Experience with automated testing and deployment strategies.
- Knowledge of monitoring, alerting, and incident response procedures.
- Understanding of security scanning and compliance automation.

AWS services experience:
- Compute & containers: Amazon EC2, ECS, EKS, Fargate, Lambda, Batch.
- Storage & database: Amazon S3, EBS, EFS, RDS, DynamoDB, ElastiCache, Redshift.
- Networking & security: VPC, Route 53, CloudFront, ALB/NLB, IAM, Secrets Manager, Certificate Manager.
- Developer tools: CodePipeline, CodeBuild, CodeDeploy, CodeCommit, CodeArtifact.
- Monitoring & management: CloudWatch, X-Ray, Systems Manager, Config, CloudTrail, AWS OpenSearch.
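
As an Infrastructure-as-Code illustration for this role, a minimal AWS CDK (v2, Python) stack that provisions a versioned S3 bucket and a Lambda function with read access to it. The stack name, asset directory, and handler are hypothetical; `grant_read` is the standard CDK way to generate the least-privilege IAM policy.

```python
from aws_cdk import App, Stack
from aws_cdk import aws_lambda as lambda_
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned bucket for pipeline input data.
        bucket = s3.Bucket(self, "DataBucket", versioned=True)

        # Lambda packaged from a local directory (hypothetical ./lambda).
        fn = lambda_.Function(
            self,
            "Processor",
            runtime=lambda_.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=lambda_.Code.from_asset("lambda"),
        )

        # grant_read synthesizes the IAM policy for us.
        bucket.grant_read(fn)

app = App()
DataStack(app, "DataStack")
app.synth()
```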

Posted 3 weeks ago

Apply