
864 Lambda Expressions Jobs - Page 15

JobPe aggregates listings for easy browsing, but you apply directly on the original job portal.

8.0 - 13.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Expertise in development using Core Java, J2EE, Spring Boot, microservices, and web services; SOA experience with both SOAP and RESTful services using JSON formats, plus messaging with Kafka. Working proficiency in enterprise development toolsets such as Jenkins, Git/Bitbucket, Sonar, Black Duck, Splunk, and Apigee. Experience with AWS cloud and monitoring tools such as Datadog, CloudWatch, and Lambda is needed. Experience with XACML authorization policies. Experience with NoSQL and SQL databases such as Cassandra, Aurora, and Oracle. Good understanding of React JS, the Photon framework, design, and Kubernetes. Experience working with Git/Bitbucket, Maven, Gradle, and Jenkins to build and deploy code to production environments.

Posted 1 month ago

Apply

10.0 - 15.0 years

14 - 18 Lacs

Hyderabad

Work from Office

8 years of hands-on experience in AWS, PostgreSQL, Oracle, MySQL, MongoDB, performance tuning, backup, replication, REST APIs, Docker, Kubernetes, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Design and manage scalable CI/CD pipelines for cloud-native apps. Automate infrastructure using Terraform/CloudFormation. Implement container orchestration using Kubernetes and ECS. Ensure cloud security, compliance, and cost optimization. Monitor performance and implement high-availability setups. Collaborate with dev, QA, and security teams; drive architecture decisions. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Pune

Work from Office

Responsibilities / Qualifications: The candidate must have 5-6 years of IT working experience; at least 3 years of experience in an AWS Cloud environment is preferred. Ability to understand the existing system architecture and work towards the target architecture. Experience with data profiling activities, discovering data quality challenges, and documenting them. Experience with development and implementation of a large-scale Data Lake and data analytics platform on the AWS Cloud platform. Develop and unit test data pipeline architecture for data ingestion processes using AWS native services. Experience with development on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Glue Data Catalog, Lake Formation, Apache Airflow, Lambda, etc. Experience with development of a data governance framework, including the management of data, operating model, data policies, and standards. Experience with orchestration of workflows in an enterprise environment. Working experience with Agile methodology. Experience working with source code management tools such as AWS CodeCommit or GitHub. Experience working with Jenkins or any CI/CD pipelines using AWS services. Experience working in an onshore/offshore model and collaborating on deliverables. Good communication skills to interact with the onshore team.

Posted 1 month ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Gurugram

Work from Office

Project Role: Application Designer. Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements. Must-have skills: Drupal. Good-to-have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions with team members to ensure that the design aligns with business objectives and technical feasibility, while also participating in the iterative process of application development to refine and enhance the solutions provided. You should have knowledge of PHP, Laravel, Drupal; HTML, CSS; SQL; Auth0, Terraform; AWS basics, AWS DevOps, AWS SAM (Lambda); Cloudflare, Cloudflare Workers; REST APIs; GitHub; and web servers.
Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate workshops and meetings to gather requirements and feedback from stakeholders. - Mentor junior team members to enhance their skills and knowledge in application design.
Professional & Technical Skills: - Must-have skills: Proficiency in Drupal. - Strong understanding of application design principles and methodologies. - Experience with user interface design and user experience best practices. - Familiarity with web development technologies such as HTML, CSS, and JavaScript. - Ability to analyze and optimize application performance.
Additional Information: - The candidate should have a minimum of 12 years of experience in Drupal. - This position is based at our Gurugram office. - 15 years of full-time education is required.

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Bachelor's degree in Computer Science or a related field, or equivalent practical experience. 3 to 5 years of hands-on experience in Java development, especially with Spring Boot. Experience working with AWS cloud services in a development environment. Knowledge of RESTful APIs, microservices, and distributed systems. Familiarity with CI/CD pipelines and version control tools such as Git. Strong problem-solving skills and a collaborative mindset. Design, develop, and deploy backend services and APIs using Java and Spring Boot. Develop and integrate applications with AWS services such as Lambda, S3, RDS, API Gateway, DynamoDB, and SQS. Write clean, maintainable, and testable code following best practices. Collaborate with cross-functional teams to define, design, and deliver new features. Participate in code reviews, unit testing, and CI/CD pipeline processes. Troubleshoot issues and improve the performance, reliability, and scalability of existing systems. Ensure security, compliance, and performance standards are met across deployments.

Posted 1 month ago

Apply

14.0 - 19.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Strong proficiency in Java (8 or higher) and Spring Boot framework. Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Bengaluru

Work from Office

6+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements. Ensure performance tuning, fault tolerance, and reliability of distributed data processing systems.

Posted 1 month ago

Apply

12.0 - 17.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Expertise in development using Core Java, J2EE, Spring Boot, microservices, and web services; SOA experience with both SOAP and RESTful services using JSON formats, plus messaging with Kafka. Working proficiency in enterprise development toolsets such as Jenkins, Git/Bitbucket, Sonar, Black Duck, Splunk, and Apigee. Experience with AWS cloud and monitoring tools such as Datadog, CloudWatch, and Lambda is needed. Experience with XACML authorization policies. Experience with NoSQL and SQL databases such as Cassandra, Aurora, and Oracle. Good understanding of React JS, the Photon framework, design, and Kubernetes. Experience working with Git/Bitbucket, Maven, Gradle, and Jenkins to build and deploy code to production environments.

Posted 1 month ago

Apply

9.0 - 14.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Expertise in development using Core Java, J2EE, Spring Boot, microservices, and web services; SOA experience with both SOAP and RESTful services using JSON formats, plus messaging with Kafka. Working proficiency in enterprise development toolsets such as Jenkins, Git/Bitbucket, Sonar, Black Duck, Splunk, and Apigee. Experience with AWS cloud and monitoring tools such as Datadog, CloudWatch, and Lambda is needed. Experience with XACML authorization policies. Experience with NoSQL and SQL databases such as Cassandra, Aurora, and Oracle. Good understanding of React JS, the Photon framework, design, and Kubernetes. Experience working with Git/Bitbucket, Maven, Gradle, and Jenkins to build and deploy code to production environments.

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office

8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.

Posted 1 month ago

Apply

12.0 - 17.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. 12+ years of experience in technical program or project management. Proven experience with mainframe technologies (e.g., COBOL, JCL, DB2, IMS, CICS, VSAM). Strong hands-on understanding of AWS services such as EC2, S3, RDS, Lambda, CloudFormation, and networking/security features. Experience leading modernization/migration projects from mainframe to cloud. Familiarity with agile delivery models (Scrum, SAFe) and hybrid program management approaches. Strong communication, stakeholder management, and risk management skills. Proficiency in tools like Jira, Confluence, MS Project, or Smartsheet.

Posted 1 month ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Hyderabad

Work from Office

10+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements.

Posted 1 month ago

Apply

6.0 - 11.0 years

3 - 7 Lacs

Pune

Work from Office

Experience: 7-9 years. Experience with AWS services is a must, including S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc. Experience programming in Java and Python. Experience performing data analysis (not data science) on AWS platforms. Nice to have: Experience with big data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.). Experience with data management processes on AWS is a huge plus. Experience implementing complex ETL transformations on AWS using Glue. Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
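The "data analysis (not data science)" called out above is mostly group-and-aggregate work of the kind Athena runs as SQL. A minimal pure-Python sketch of the same operation; the column names are illustrative assumptions, not taken from the posting:

```python
from collections import defaultdict

def total_by_key(rows, key, value):
    # Group-and-sum aggregation, equivalent to
    # SELECT key, SUM(value) ... GROUP BY key in Athena/SQL.
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)
```

In practice the same grouping would be expressed as an Athena query or a Glue transform over S3 data rather than an in-memory loop.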

Posted 1 month ago

Apply

8.0 - 13.0 years

3 - 7 Lacs

Hyderabad

Work from Office

P1-C3-STS: Seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources. Can optimize data models for performance and efficiency. Able to write SQL queries to support data analysis and reporting. Design, implement, and maintain the data architecture for all AWS data services. Work with stakeholders to identify business needs and requirements for data-related projects. Design and implement ETL processes to load data into the data warehouse.
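For context on the Lambda work these serverless roles describe: an AWS Lambda function in Python is a module-level handler receiving an event dict and a context object. A minimal sketch; the API Gateway event shape and the `name` query parameter are illustrative assumptions:

```python
import json

def lambda_handler(event, context):
    # Read an optional 'name' query parameter from an API Gateway-style
    # event and return a JSON response in the proxy-integration format.
    params = (event or {}).get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Deployed behind API Gateway, this shape of return value is what the service expects; locally it is just a function you can call with a dict.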

Posted 1 month ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Strong proficiency in Java (8 or higher) and the Spring Boot framework. A basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).

Posted 1 month ago

Apply

14.0 - 19.0 years

13 - 18 Lacs

Hyderabad

Work from Office

10 years of hands-on experience in Java full stack development: Java 11, Spring Boot, Angular/React, REST APIs, Docker, Kubernetes, microservices. Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Should be a Java full stack developer. Bachelor's or Master's degree in Computer Science or a related field. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Key Responsibilities: Build and maintain backend services in Python, writing clean, maintainable, and well-tested code. Develop and scale public APIs, ensuring high performance and reliability. Work with GraphQL services, contributing to schema design and the implementation of queries, mutations, and resolvers. Collaborate cross-functionally with frontend, product, and DevOps teams to ship features end-to-end. Containerize services using Docker and support deployments within Kubernetes environments. Use GitHub Actions to manage CI/CD workflows, including test automation and deployment pipelines. Participate in code reviews, standups, and planning sessions as part of an agile development process. Take ownership of features and deliverables with guidance from senior engineers. Required Skills: Python expertise: strong grasp of idiomatic Python, async patterns, type annotations, unit testing, and modern libraries. API development: experience building and scaling RESTful and/or GraphQL APIs in production. GraphQL proficiency: familiarity with frameworks like Strawberry, Graphene, or similar. Containerization: hands-on experience with Docker and container-based development workflows. GitHub Actions CI/CD: working knowledge of GitHub Actions for automating tests and deployments. Team collaboration: effective communicator with a proactive, self-directed work style. Preferred Qualifications: Kubernetes: experience deploying or troubleshooting applications in Kubernetes environments. AWS: familiarity with AWS services such as ECS, EKS, S3, RDS, or Lambda. Healthcare: background in the healthcare industry or building patient-facing applications. Monitoring and security: familiarity with observability tools (e.g., Datadog, Prometheus) and secure coding practices.
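The "idiomatic Python, async patterns, type annotations" skill set named above can be illustrated in a few lines. The function names and the scoring rule are purely illustrative, not from the posting:

```python
import asyncio

async def fetch_score(user_id: int) -> int:
    # Stand-in for an async I/O call (a database or HTTP request);
    # the *10 scoring rule is purely illustrative.
    await asyncio.sleep(0)
    return user_id * 10

async def fetch_all(user_ids: list[int]) -> list[int]:
    # Fan the calls out concurrently; gather preserves input order.
    return list(await asyncio.gather(*(fetch_score(u) for u in user_ids)))

def total_score(user_ids: list[int]) -> int:
    # Synchronous entry point, convenient for scripts and unit tests.
    return sum(asyncio.run(fetch_all(user_ids)))
```

The same pattern (typed async coroutines fanned out with `gather`, wrapped by a small synchronous entry point) is what makes such services straightforward to unit test.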

Posted 1 month ago

Apply

7.0 - 12.0 years

6 - 10 Lacs

Hyderabad

Work from Office

AWS DevOps. Mandatory skills: VMware, AWS infrastructure (EC2), containerization, DevOps (Jenkins, Kubernetes, Terraform). Secondary skills: Python, Lambda, Step Functions. Design and implement cloud infrastructure solutions for cloud environments. Evaluate and recommend cloud infrastructure tools and services. Manage infrastructure performance, monitoring, reliability, and scalability. Technical skills: Overall experience of 8+ years, with 5+ years of infrastructure architecture experience. Cloud platforms: proficient in AWS along with other CSPs. Good understanding of cloud networking services (VPC, load balancing, DNS, etc.). Infrastructure as Code (IaC): proficient, with hands-on experience in Terraform or AWS CloudFormation for provisioning. Security: strong knowledge of cloud security fundamentals (IAM, security groups, firewall rules). Automation: proficient, with hands-on experience in CI/CD pipelines, containerization (Kubernetes, Docker), and configuration management tools (e.g., Chef, Puppet). Monitoring and performance: experience with cloud monitoring and logging tools (CloudWatch, Azure Monitor, Stackdriver). Disaster recovery: knowledge of backup, replication, and recovery strategies in cloud environments. Support cloud migration efforts and recommend strategies for optimization. Collaborate with DevOps and security teams to integrate best practices. Evaluate, implement, and streamline DevOps practices. Supervising, examining, and handling technical operations.

Posted 1 month ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Gurugram

Work from Office

Data Engineer. Location: PAN India. Work mode: Hybrid. Work timing: 2 PM to 11 PM. Primary skill: Data Engineer. Experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark. Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres). 4+ years of experience working with both relational and non-relational/NoSQL databases is required. Strong SQL experience is necessary, demonstrating the ability to write complex queries from scratch; experience with Redshift is required, along with other SQL database experience. Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture, and an understanding of building an end-to-end data pipeline. Strong understanding of Kinesis, Kafka, and CDK; experience with Kafka and ECS is also required. A strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling is required. Experience in Node.js and CDK.
Responsibilities: Lead the architectural design and development of a scalable, reliable, and flexible metadata-driven data ingestion and extraction framework on AWS using Python/PySpark. Design and implement a customizable data processing framework using Python/PySpark; this framework should be capable of handling diverse scenarios and evolving data processing requirements. Implement data pipelines for data ingestion, transformation, and extraction leveraging AWS cloud services. Seamlessly integrate a variety of AWS services, including S3, Glue, Kafka, Lambda, SQL, SNS, Athena, EC2, RDS (Oracle, Postgres, MySQL), and AWS Crawler, to construct a highly scalable and reliable data ingestion and extraction pipeline. Facilitate configuration and extensibility of the framework to adapt to evolving data needs and processing scenarios. Develop and maintain rigorous data quality checks and validation processes to safeguard the integrity of ingested data. Implement robust error handling, logging, monitoring, and alerting mechanisms to ensure the reliability of the entire data pipeline.
Qualifications. Must have: Over 6 years of hands-on experience in data engineering, with a proven focus on data ingestion and extraction using Python/PySpark. Extensive AWS experience is mandatory, with proficiency in Glue, Lambda, SQS, SNS, AWS IAM, AWS Step Functions, S3, and RDS (Oracle, Aurora Postgres). 4+ years of experience working with both relational and non-relational/NoSQL databases. Strong SQL experience, demonstrating the ability to write complex queries from scratch. Strong working experience in Redshift, along with other SQL database experience. Strong scripting experience with the ability to build intricate data pipelines using AWS serverless architecture. Complete understanding of building an end-to-end data pipeline. Nice to have: Strong understanding of Kinesis, Kafka, and CDK. A strong understanding of data concepts related to data warehousing, business intelligence (BI), data security, data quality, and data profiling. Experience in Node.js and CDK. Experience with Kafka and ECS.
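The "metadata-driven ingestion framework" this listing centers on usually means: a config object names the processing steps for a dataset, and the framework dispatches to registered step functions. A pure-Python sketch under that assumption; the step names and sample records are illustrative, not from the posting:

```python
# Registry mapping step names (as they would appear in metadata read
# from a config table or S3 object) to plain transform functions.
STEPS = {}

def step(name):
    def register(fn):
        STEPS[name] = fn
        return fn
    return register

@step("strip_nulls")
def strip_nulls(records):
    # Drop keys whose value is None from every record.
    return [{k: v for k, v in r.items() if v is not None} for r in records]

@step("uppercase_country")
def uppercase_country(records):
    return [{**r, "country": r["country"].upper()} for r in records]

def run_pipeline(records, metadata):
    # Apply the configured steps in the order the metadata lists them.
    for name in metadata["steps"]:
        records = STEPS[name](records)
    return records
```

In the real framework the records would be PySpark DataFrames and the steps Glue/Lambda stages, but the dispatch-by-metadata idea is the same.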

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Developer P3 C3 TSTS. Hybrid, US shift. Primary skills: DevOps and infrastructure engineering; CI/CD tools; AWS networking services, storage services, certificate management, secrets management, and database setup (RDS); Terraform/CloudFormation/AWS CDK; Python and Bash. Secondary skills: Expertise in AWS CDK and CDK Pipelines for IaC; understanding of logging and monitoring services like AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services; communication and collaboration skills to work effectively in a team-oriented environment.
JD: Design, implement, and maintain cloud infrastructure using the AWS Cloud Development Kit (CDK). Develop and evolve Infrastructure as Code (IaC) to ensure efficient provisioning and management of AWS resources. Develop and automate Continuous Integration/Continuous Deployment (CI/CD) pipelines for infrastructure provisioning and application deployment. Configure and manage various AWS services, including but not limited to EC2, VPC, Security Groups, NACLs, S3, CloudFormation, CloudWatch, AWS Cognito, IAM, Transit Gateway, ELB, CloudFront, Route 53, and more. Collaborate with development and operations teams, bridging the gap between infrastructure and application development. Monitor and troubleshoot infrastructure performance issues, ensuring high availability and reliability. Implement proactive measures to optimize resource utilization and identify potential bottlenecks. Implement security best practices, including data encryption and adherence to security protocols. Ensure compliance with industry standards and regulations.
Must have: 5+ years of hands-on experience in DevOps and infrastructure engineering. Solid understanding of AWS services and technologies, including EC2, VPC, S3, Lambda, Route 53, and CloudWatch. Experience with CI/CD tools, DevOps implementation, and HA/DR setup. In-depth experience with AWS networking services, storage services, certificate management, secrets management, and database setup (RDS). Proven expertise in Terraform/CloudFormation/AWS CDK. Strong scripting and programming skills, with proficiency in languages such as Python and Bash. Nice to have: Proven expertise in AWS CDK and CDK Pipelines for IaC. Familiarity with logging and monitoring services like AWS CloudTrail, CloudWatch, GuardDuty, and other AWS security services. Excellent communication and collaboration skills to work effectively in a team-oriented environment.

Posted 1 month ago

Apply

10.0 - 15.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Lead the design and development of web applications using React.js, ensuring high-quality and scalable solutions. Python scripting experience of 5 years; framework experience is good to have. Collaborate with cross-functional teams (product managers and back-end developers) to deliver features and enhancements. Develop reusable components and libraries for future use, optimizing for performance and maintainability. Work with RESTful APIs and third-party services to integrate data and functionality. Ensure responsive design and cross-browser compatibility for applications. Contribute to architectural discussions and decisions regarding the evolution of the front-end codebase. Proficiency in JavaScript (ES6+), HTML5, CSS, and modern front-end build tools (Webpack, Babel, etc.). Experience with state management in React (Redux, Context API, etc.). Experience with front-end testing frameworks (Jest, React Testing Library, etc.). Secondary skills: AWS services such as Lambda, S3, Step Functions, DocumentDB, RDS, and Glue jobs; Infrastructure as Code (IaC) using tools like Terraform or similar to automate infrastructure provisioning and management.

Posted 1 month ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Hyderabad

Work from Office

We are looking for a talented full stack Python developer with a minimum of 8 years of experience. The ideal candidate will be proficient in Python, with expertise in either the Flask or Django framework, and have a strong background in cloud technologies such as AWS or GCP. Additionally, experience with front-end technologies like Angular or React is essential. Key Responsibilities: Backend development: design, develop, and maintain backend systems using Python, with a focus on either Flask or Django. Cloud services: utilize cloud services such as AWS (Lambda, Cloud Function) or GCP (App Engine) to deploy and manage scalable applications. Frontend development: develop user interfaces using Angular or React, ensuring a seamless and responsive user experience. Integration: integrate frontend and backend components, ensuring smooth data flow and application functionality. Qualifications: Bachelor's or Master's degree in Computer Science or a related field. 5+ years of professional experience in full-stack Python development. Proficiency in Python and experience with either the Flask or Django framework. Strong knowledge of cloud platforms, particularly AWS (Lambda, Cloud Function) or GCP (App Engine). Experience with frontend technologies such as Angular or React. Understanding of RESTful API design and integration. Solid understanding of version control systems, such as Git. Desired Skills: Familiarity with database systems, both SQL and NoSQL. Knowledge of containerization tools such as Docker. Experience with serverless architecture and microservices. Understanding of CI/CD pipelines. Familiarity with Agile/Scrum methodologies.
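Flask and Django hide the routing plumbing, but the backend request/response cycle this role describes can be sketched with only the standard library's WSGI support. The `/health` route and payload are illustrative assumptions, not Flask's API:

```python
import json

def app(environ, start_response):
    # A single JSON endpoint, the kind of handler Flask's @app.route
    # or a Django view would register for you.
    if environ.get("PATH_INFO") == "/health":
        body = json.dumps({"status": "ok"}).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

Because WSGI apps are plain callables, frameworks and servers (gunicorn, Flask, Django) interoperate through exactly this interface.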

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have AWS: S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
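The "custom framework for generating rules (like a rules engine)" mentioned above typically boils down to data-driven predicate/action pairs applied to each record. A pure-Python sketch; the validity rule on `amount` is an illustrative assumption:

```python
def make_rule(predicate, action):
    # A rule is just a matching condition plus a transformation.
    return {"when": predicate, "then": action}

RULES = [
    make_rule(lambda r: r["amount"] < 0, lambda r: {**r, "valid": False}),
    make_rule(lambda r: r["amount"] >= 0, lambda r: {**r, "valid": True}),
]

def apply_rules(records, rules=RULES):
    # Apply every matching rule to each record, in declaration order.
    out = []
    for r in records:
        for rule in rules:
            if rule["when"](r):
                r = rule["then"](r)
        out.append(r)
    return out
```

On a Spark cluster the same rule list would be applied inside a DataFrame transformation or UDF rather than a Python loop, but the engine shape is identical.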

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have AWS: S3, Athena, DynamoDB, Lambda, Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (like a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Mumbai

Work from Office

Skill: Java + AWS. Experience: 6-9 years. Role: T2. Responsibilities: Strong proficiency in Java (8 or higher) and the Spring Boot framework. Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.). Develop and maintain robust backend services and RESTful APIs using Java and Spring Boot. Design and implement microservices that are scalable, maintainable, and deployable in AWS. Integrate backend systems with AWS services including but not limited to Lambda, S3, DynamoDB, RDS, SNS/SQS, and CloudFormation. Collaborate with product managers, architects, and other developers to deliver end-to-end features. Participate in code reviews, design discussions, and agile development processes.

Posted 1 month ago

Apply