
370 Lambda Jobs - Page 6

JobPe aggregates listings for easy application access, but you apply directly on the original job portal.

2 - 4 years

4 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Zeta Global is looking for an experienced Machine Learning Engineer with proven hands-on experience delivering machine learning models to production to solve business problems. To be a good fit for our AI/ML team, you should ideally:
- Be a thought leader who can work with cross-functional partners to foster a data-driven organisation.
- Be a strong team player with experience contributing to a large project as part of a collaborative team effort.
- Have extensive knowledge of and expertise with machine learning engineering best practices and industry standards.
- Empower the product and engineering teams to make data-driven decisions.

What you need to succeed:
- 2 to 4 years of proven experience as a Machine Learning Engineer in a professional setting.
- Proficiency in a programming language (Python preferred).
- Prior experience building and deploying machine learning systems.
- Experience with containerization: Docker and Kubernetes.
- Experience with AWS cloud services such as EKS, ECS, EMR, and Lambda.
- Fluency with workflow management tools such as Airflow or dbt.
- Familiarity with distributed batch compute technologies such as Spark.
- Experience with modern data warehouses such as Snowflake or BigQuery.
- Knowledge of MLflow, Feast, and Terraform is a plus.
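This listing centres on shipping models to production. As a rough, hypothetical sketch (not from the posting), here is the shape of a Python inference handler of the kind that might be deployed on AWS Lambda; the feature names, weights, and request format are invented for illustration:

```python
import json

# Stand-in "model": a fixed linear scorer over assumed feature names.
# A real deployment would load a trained artifact instead.
WEIGHTS = {"clicks": 0.4, "dwell_time": 0.02, "recency": -0.1}

def score(features: dict) -> float:
    """Dot product of the known weights with the incoming feature dict."""
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

def handler(event: dict, context=None) -> dict:
    """Lambda-compatible entry point: parse the request body, score, respond."""
    body = json.loads(event["body"])
    return {
        "statusCode": 200,
        "body": json.dumps({"score": round(score(body["features"]), 4)}),
    }

if __name__ == "__main__":
    event = {"body": json.dumps({"features": {"clicks": 10, "dwell_time": 30}})}
    print(handler(event)["body"])
```

In practice the stand-in scorer would be replaced by a model loaded once at cold start, and the handler packaged (e.g. as a container image) for Lambda or a service on EKS/ECS.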

Posted 2 months ago

Apply

3 - 7 years

12 - 17 Lacs

Bengaluru

Work from Office

We are seeking a skilled Fullstack Developer with expertise in ReactJS and Java to lead our backend team. You will develop and maintain scalable web applications using ReactJS for the frontend and Java (Spring Boot) for the backend.

Posted 2 months ago

Apply

8 - 11 years

15 - 27 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Role & responsibilities
1. Scripting and Automation: Develop and maintain scripts using Python, Bash, Perl, Ruby, and Groovy for automation tasks. Implement and manage infrastructure as code using tools like Terraform and AWS CloudFormation.
2. Cloud Infrastructure Management: Design, deploy, and manage AWS services such as EC2, S3, RDS, Lambda, and CloudFormation. Ensure the scalability, performance, and reliability of cloud infrastructure.
3. Configuration Management: Utilize server configuration management tools like Salt, Puppet, Ansible, and Chef to automate system configurations. Maintain and troubleshoot Linux-based environments.
4. CI/CD Pipeline Development: Build and manage service delivery pipelines using CI/CD tools such as Jenkins, GitLab, or CodePipeline. Ensure seamless integration and deployment of applications.
5. System Security and Best Practices: Implement best practices for system security and ensure compliance with security standards. Monitor and manage system performance using tools like AWS CloudWatch and Datadog.
6. Networking and Virtualization: Configure and manage virtualized environments and containerization technologies. Maintain solid networking knowledge, including OSI network layers and TCP/IP protocols.
7. Collaboration and Communication: Work closely with product engineers to develop tools and services that enhance productivity. Communicate effectively with team members and stakeholders to ensure alignment on project goals.
8. Continuous Improvement: Stay updated with the latest cloud technologies and best practices. Continuously improve infrastructure and automation processes to enhance efficiency and reliability.

Mandatory skills: AWS Infrastructure Engineer/Cloud Engineer; Python or Bash shell scripting, Perl, Ruby, Groovy.

Posted 2 months ago

Apply

3 - 8 years

15 - 25 Lacs

Pune, Delhi NCR, Bengaluru

Hybrid

Key Responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Required Qualifications:
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focused on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical Skills:
- AWS services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics
- ETL/ELT processes
- Data lake architectures
- Version control: Git
- Agile methodologies
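To illustrate the kind of pipeline step this role describes, here is a minimal, hypothetical extract-transform-load sketch in plain Python; sqlite3 stands in for a warehouse like Redshift or Aurora, and the record shape, table name, and data-quality rules are invented for the example:

```python
import sqlite3

# Raw "extracted" records; two are intentionally bad to show quality filtering.
RAW = [
    {"id": 1, "amt": "120.50", "status": "ok"},
    {"id": 2, "amt": "bad", "status": "ok"},      # malformed amount: dropped
    {"id": 3, "amt": "75.00", "status": "void"},  # voided order: dropped
]

def transform(rows):
    """Apply data-quality rules: parseable amounts and 'ok' status only."""
    out = []
    for r in rows:
        try:
            amount = float(r["amt"])
        except ValueError:
            continue  # skip unparseable amounts
        if r["status"] == "ok":
            out.append((r["id"], amount))
    return out

def load(rows, conn):
    """Idempotent load into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(RAW), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

A production version of the same shape would run the transform in Glue or EMR (PySpark) and load into Redshift, with the quality rules driven by a schema rather than hard-coded.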

Posted 2 months ago

Apply

5 - 10 years

20 - 25 Lacs

Kolkata

Work from Office

- Backend Development: Design, develop, and optimize backend services using Java Spring Boot and a microservices architecture.
- API Development & Integration: Build and maintain RESTful and GraphQL APIs for seamless communication between services.
- Frontend Understanding: Have a strong grasp of frontend technologies to collaborate effectively with frontend teams for both web and mobile applications.
- Database Management: Work with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) databases to optimize data storage and retrieval.
- DevOps & Cloud: Deploy, monitor, and manage applications in AWS (EC2, S3, Lambda, RDS, API Gateway, etc.).
- Scalability & Performance: Implement solutions to optimize the performance, scalability, and security of applications.
- Code Reviews & Best Practices: Conduct code reviews; write clean, reusable, and efficient code following best practices.
- Collaboration: Work closely with cross-functional teams, including frontend developers, DevOps engineers, and product managers, to develop end-to-end solutions.
- Automation & CI/CD: Implement CI/CD pipelines and automated testing strategies for efficient development workflows.
- Security & Compliance: Ensure data security, authentication, and authorization mechanisms are in place.
- Experience with serverless computing (AWS Lambda, Firebase Functions, etc.).
- Familiarity with event-driven architecture using Kafka, RabbitMQ, or similar tools.
- Exposure to machine learning and AI frameworks (TensorFlow, PyTorch) is a plus.
- Experience with mobile app development (React Native, Flutter, or native development).
- Contributions to open-source projects or active participation in developer communities.

Posted 2 months ago

Apply

10 - 13 years

27 - 32 Lacs

Bengaluru

Work from Office

Department: ISS
Reports To: Head of Data Platform - ISS
Grade: 7

We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our team and feel like you're part of something bigger.

Department Description
The ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters (Data Engineers, Data Platform and Data Visualisation) that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our Investment Process. These programmes span asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research, and Trading.

Purpose of your role
This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.

Key Responsibilities
- Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.

Essential Skills and Experience
Core Technical Skills
- Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services, such as Lambda, EMR, MSK, Glue and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience of Python is required.
- Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data Security & Performance Optimization: experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).

Bonus Technical Skills
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills
- Problem-Solving: leadership experience in problem-solving and technical decision-making.
- Communication: strong in strategic communication and stakeholder engagement.
- Project Management: experienced in overseeing project lifecycles, working with Project Managers to manage resources.
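One of the skills the listing names is CDC (change data capture) ingestion. As a hedged, toy sketch of the core idea (not any specific vendor's API), here is an idempotent apply step in plain Python with sqlite3 standing in for the target warehouse; the event shape and table are invented:

```python
import sqlite3

def apply_cdc(conn, events):
    """Apply a batch of CDC events; replaying the same batch is a no-op."""
    for ev in events:
        if ev["op"] == "delete":
            conn.execute("DELETE FROM accounts WHERE id = ?", (ev["id"],))
        else:  # inserts and updates collapse into an upsert keyed on id
            conn.execute(
                "INSERT INTO accounts (id, balance) VALUES (?, ?) "
                "ON CONFLICT(id) DO UPDATE SET balance = excluded.balance",
                (ev["id"], ev["balance"]),
            )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    batch = [
        {"op": "upsert", "id": 1, "balance": 100.0},
        {"op": "upsert", "id": 1, "balance": 150.0},  # later update wins
        {"op": "upsert", "id": 2, "balance": 50.0},
        {"op": "delete", "id": 2},
    ]
    apply_cdc(conn, batch)
    apply_cdc(conn, batch)  # replay: idempotent, same final state
    print(conn.execute("SELECT id, balance FROM accounts").fetchall())
```

The idempotent upsert/delete pattern is what makes replaying a Kafka or DMS change stream safe after a consumer restart.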

Posted 2 months ago

Apply

4 - 6 years

6 - 8 Lacs

Gurgaon

Work from Office

Department: GPS Technology
Reports To: Project Manager
Level: 3

About your role
As a Web Application Developer you will play a key role on a global programme, working with Product Owners and Digital and Technical teams within Fidelity International to deliver a new technology platform to support Fidelity International's Workplace Investments business. Working alongside multiple stakeholder groups, you will need to utilise your experience of working in agile delivery teams to assist with the design, definition, exploration and delivery of an end-to-end technology solution to service a scaling global Workplace Investments business.

The successful candidate will have detailed knowledge of working in an Agile environment using Agile tools and techniques, including behaviour-driven development (BDD). You will have experience working with full-scale Scrum or Kanban, and be a competent user of Jira and Confluence. You must have a passion for delivering high-quality and scalable solutions with a continued focus on the customer need. You should be willing both to challenge and to be challenged on where things can be improved, and be comfortable working alongside other engineers in a pair-programming environment. You will be required to effectively influence and assist key stakeholders, support the formation of a new team and commence delivery of a largely greenfield solution. A willingness to take on new challenges, collaborate and share knowledge freely with global teams is absolutely critical to success.

About you
This position requires a strong self-starter with a solid technical engineering background and influencing skills, who can lead the way and assist development teams with architecture, design decisions, coding, troubleshooting and any other technical issues related to the implementation of a customer-facing proposition. You will be responsible for delivering and providing technical expertise as part of the delivery team, from both design and day-to-day coding. Working with the product owners, you will identify new improvements and customer requirements and follow through to delivery, ensuring delivery in a timely, efficient and cost-effective manner. The role involves stakeholder management across various Technology and Business teams, and ensuring that technical solutions are fit for purpose (including functional, non-functional and support requirements) and aligned to Global Technology Strategies. Be the trusted advisor to the business, partnering closely with Architecture, the business and supporting central groups while working within a global team.

The ideal candidate will have 4-6 years' experience working as a web application developer.

Required:
- Strong JavaScript experience, with a minimum of 4 years' experience
- Hands-on experience in ReactJS
- Knowledge of Webpack, Babel and their plugins
- Knowledge of ES6, Redux, Jest, TypeScript
- Knowledge of JavaScript design patterns
- Knowledge of NodeJS and RESTful API design
- AWS working knowledge (Lambda, S3, ECS, etc.)
- Good to have: knowledge of CSS usage
- Experience with source control tools such as Git
- Experience of software delivery in agile methodologies
- TDD and pair-programming best practice with CI/CD pipelines (e.g. Jenkins)
- Strong communication skills and a customer-centric focus
- Passion for growing your skills and tackling challenging problems, and interest in a pair-programming environment
- Passion for working with the newest technologies and prototyping your ideas for others to see

Desired:
- Working knowledge of Angular (preferably v8 - v10)
- Knowledge of Redux-Saga and Redux-Thunk
- Knowledge of React Hooks and the Context API
- Knowledge of AWS (CloudFront, API Gateway, Lambda)
- Working knowledge of APIs, caching and messaging
- Understanding of DevOps automation: AWS CloudFormation, Terraform, continuous integration (AWS CodePipeline, Jenkins)

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.

Posted 2 months ago

Apply

3 - 6 years

10 - 15 Lacs

Pune

Work from Office

Role & responsibilities

Requirements:
- 3+ years of hands-on experience with AWS services, including EMR, Glue, Athena, Lambda, SQS, OpenSearch, CloudWatch, VPC, IAM, AWS Managed Airflow, security groups, S3, RDS, and DynamoDB.
- Proficiency in Linux and experience with management tools like Apache Airflow and Terraform.
- Familiarity with CI/CD tools, particularly GitLab.

Responsibilities:
- Design, deploy, and maintain scalable and secure cloud and on-premises infrastructure.
- Monitor and optimize the performance and reliability of systems and applications.
- Implement and manage continuous integration and continuous deployment (CI/CD) pipelines.
- Collaborate with development teams to integrate new applications and services into existing infrastructure.
- Conduct regular security assessments and audits to ensure compliance with industry standards.
- Provide support and troubleshooting assistance for infrastructure-related issues.
- Create and maintain detailed documentation for infrastructure configurations and processes.
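Since the role leans on Airflow, here is a small illustration of the core idea an orchestrator applies: tasks plus upstream dependencies form a DAG executed in topological order. The task names below are a made-up example pipeline, and the ordering is computed with Python's standard-library `graphlib` rather than Airflow itself:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it runs.
deps = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_s3": {"transform"},
    "load_rds": {"transform"},
    "notify": {"load_s3", "load_rds"},
}

# static_order() yields a valid execution order respecting every edge.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

An Airflow DAG file expresses the same graph with operators and `>>` edges; the scheduler then runs ready tasks (here, `load_s3` and `load_rds`) in parallel.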

Posted 2 months ago

Apply

6 - 10 years

5 - 15 Lacs

Delhi NCR, Bengaluru, Hyderabad

Hybrid

Here are the AWS services we need expertise in: CDK with Python, Amazon AppFlow, Step Functions, Lambda, S3, EventBridge, CloudWatch/CloudTrail/X-Ray, and GitHub.

Preferred candidate profile: immediate joiners only.
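Step Functions chains Lambda tasks into a state machine. As a hedged, toy sketch of that control-flow idea in plain Python (not the real Amazon States Language or CDK API), here is a tiny sequential state-machine runner; the state names and handlers are invented:

```python
def run_state_machine(definition, start, data):
    """Walk states from `start`, threading `data` through each task,
    until a state marked end=True returns the final result."""
    state = start
    while True:
        spec = definition[state]
        data = spec["task"](data)
        if spec.get("end"):
            return data
        state = spec["next"]

# A made-up three-state pipeline: ingest -> enrich -> publish.
machine = {
    "Ingest":  {"task": lambda d: {**d, "rows": 3}, "next": "Enrich"},
    "Enrich":  {"task": lambda d: {**d, "enriched": True}, "next": "Publish"},
    "Publish": {"task": lambda d: {**d, "published": True}, "end": True},
}

result = run_state_machine(machine, "Ingest", {"source": "appflow"})
print(result)
```

In a real deployment each `task` would be a Lambda ARN, the `next`/`end` fields would live in the ASL JSON, and a CDK `Stack` in Python would synthesize that definition.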

Posted 2 months ago

Apply

12 - 16 years

40 - 45 Lacs

Bengaluru

Work from Office

Your Impact:
We are seeking a skilled and experienced Lead Software Engineer with expertise in Large Language Models (LLMs), Natural Language Processing (NLP), Java, Python, Kubernetes, Helm and cloud technologies like AWS. The ideal candidate will contribute to designing, developing, and maintaining scalable software solutions using a microservices architecture and lead technical initiatives within the team. This role offers an exciting opportunity to work with cutting-edge technologies in a collaborative environment.

What the role offers:
- Lead the design, development, and maintenance of robust and scalable Java- and Python-based applications leveraging a microservices architecture.
- Integrate Large Language Models (LLMs) and NLP capabilities into business applications to enhance functionality and user experience.
- Design enhancements, updates, and programming changes for portions and subsystems of application software, utilities, databases, and Internet-related tools.
- Analyse designs and determine the coding, programming, and integration activities required, based on general objectives and knowledge of the overall architecture of the product or solution.
- Develop RESTful APIs and ensure seamless integration across services.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Implement best practices for cloud-native development using AWS services like EC2, VPC, Lambda, S3, SageMaker, etc.
- Deploy, manage, and scale containerized applications using Kubernetes (K8s) and Helm.
- Ensure system reliability, security, and performance through effective monitoring and troubleshooting.
- Drive technical strategy, mentor junior engineers, and foster knowledge-sharing within the team.
- Participate in code reviews, architectural discussions, and technical leadership initiatives.
- Write good clean code on a day-to-day basis and actively participate in code reviews.
- Lead a project team of other software systems engineers and internal and outsourced development partners to develop reliable, cost-effective and high-quality solutions for the assigned systems portion or subsystem.
- Collaborate and communicate with management and internal and outsourced development partners regarding software systems design status, project progress, and issue resolution.
- Represent the software systems engineering team for all phases of larger and more complex development projects.

What you need to succeed:
Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent; typically 12+ years of experience.

Mandatory Skills:
- Strong understanding of Large Language Models (LLMs) and NLP techniques, with experience applying them in real-world applications.
- Expertise in designing and developing RAG-based applications.
- Expertise with Python for LLM-related tasks.
- Expertise in database systems like PostgreSQL.
- Expertise in transformer models (GPT-3, BERT, T5, etc.) and NLP techniques.
- Experience working with NLP frameworks such as Hugging Face, OpenAI, or similar.
- Expertise in designing and implementing a microservices architecture.
- Solid experience with AWS services like EC2, VPC, Lambda, SageMaker, etc. for cloud deployment and management.
- Significant hands-on experience with Kubernetes for container orchestration and scaling, and Helm deployment in production environments.
- Experience with event-driven architectures and distributed messaging systems (e.g., Kafka, RabbitMQ).
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Proven leadership experience in mentoring and guiding development teams.
- Strong knowledge of DevOps processes and tools, continuous integration and delivery.
- Experience with CI/CD pipelines, DevOps practices, and infrastructure as code (e.g., Terraform, CloudFormation).
- Ability to effectively communicate product architectures and design proposals, and to negotiate options at management levels.
- Strong experience in Java (Java 17+) and Python (3.x) development, with a deep understanding of frameworks such as Spring Boot, Hibernate, or Java EE.
- Expertise in designing and building microservices-based architectures using REST API standards.
- Ability to demonstrate effective teamwork, both within the immediate team and across teams.
- Experience working with RDBMS databases such as PostgreSQL, Oracle or MSSQL Server, with good SQL knowledge.
- Experience working with version control and build tools like GitLab, Git, Maven, Jenkins, and GitLab CI.
- Expertise in implementing best practices for security, reliability, and performance in software systems.
- A track record of delivering solutions to complex technical challenges.

Desirable Skills:
- Knowledge of additional database systems like MongoDB or DynamoDB.
- Experience with observability tools like Prometheus, Grafana, or the ELK Stack.
- Familiarity with infrastructure as code (IaC) tools such as Terraform or Ansible.
- Familiarity with Agile methodologies (SAFe, Scrum, Kanban) and experience leading Agile teams.
- A background in software security best practices and knowledge of Kubernetes security considerations.
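The RAG expertise this listing asks for centres on a retrieval step: rank candidate documents against a query before handing the top hits to an LLM. As a deliberately toy sketch (bag-of-words cosine similarity instead of an embedding model and vector store; documents invented), that ranking looks like:

```python
import math
from collections import Counter

# Made-up knowledge base of short policy documents.
DOCS = {
    "kyc": "customer identity verification and onboarding policy",
    "limits": "daily transaction limits for savings accounts",
    "cards": "credit card reward points and redemption rules",
}

def vec(text):
    """Bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity of two sparse term-frequency vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if dot else 0.0

def retrieve(query, k=1):
    """Return the ids of the k documents most similar to the query."""
    q = vec(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, vec(DOCS[d])), reverse=True)
    return ranked[:k]

print(retrieve("what are the transaction limits"))
```

A production RAG system swaps `vec`/`cosine` for transformer embeddings and an approximate-nearest-neighbour index, then feeds the retrieved passages into the prompt; only the ranking skeleton is shown here.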

Posted 2 months ago

Apply

6 - 10 years

65 - 70 Lacs

Chennai, Pune, Kolkata

Work from Office

For a leading MNC in fintech, we are seeking a seasoned DevOps Engineer to own parts of our cloud infrastructure and DevOps operations. In this role, you will lead by example, and design, deploy, and optimize our AWS-based infrastructure, ensuring seamless orchestration of workloads across Kubernetes and serverless environments like AWS Lambda. You will play a pivotal role in automating processes, enhancing system reliability, and driving the adoption of DevOps best practices. Collaborating closely with our Engineering, Product, and Data teams, you'll contribute to scaling our infrastructure and supporting our rapid growth. This position offers a unique opportunity to refine your technical expertise in a dynamic and fast-paced environment.

Location: Remote (candidates from anywhere in India can apply) - Delhi/NCR, Bengaluru, Hyderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai

Responsibilities:
- Own and drive the architecture, design, and scaling of various parts of our cloud infrastructure on AWS, ensuring security, resilience, and cost efficiency.
- Optimize Kubernetes clusters, including advanced scheduling, networking, and security enhancements to support mission-critical workloads.
- Architect and improve CI/CD pipelines, incorporating automation, canary deployments, and rollback strategies for seamless releases.
- Design and implement monitoring, logging, and observability solutions to ensure proactive issue detection and system performance tuning at scale.
- Establish and enforce security best practices, including IAM governance, secret management, and compliance frameworks.
- Be the go-to expert for multiple infrastructure components, providing technical leadership and driving improvements across interconnected systems.
- Lead large-scale projects spanning multiple quarters, defining roadmaps, tracking progress, and ensuring timely execution with minimal supervision.
- Drive collaboration with cross-functional teams, including ML, Data, and Product, to align infrastructure solutions with business and engineering goals.
- Mentor and support junior and mid-level engineers, fostering a culture of continuous learning, technical excellence, and best practices.
- Set and refine DevOps standards, driving automation, scalability, and system reliability across the organization.

Qualifications:
- A minimum of 7 years of experience in DevOps, SRE, or a similar role, with expertise in designing and managing large-scale cloud infrastructure.
- Experience working on software product development, with proficiency in a mainstream stack.
- Deep hands-on experience with AWS services such as EC2, S3, RDS, Lambda, ECS, EKS, and VPC networking.
- Advanced proficiency in Terraform for infrastructure as code, including best practices for scaling and managing cloud resources.
- Strong expertise in Kubernetes, including cluster provisioning, networking, security hardening, and performance optimization.
- Proficiency in scripting and automation using Python, Bash, or Go, with experience integrating APIs and optimizing workflows.
- Experience designing and maintaining CI/CD pipelines using tools like CircleCI, Jenkins, GitLab CI/CD, or ArgoCD.
- Strong knowledge of monitoring, logging, and observability tools such as Datadog, Prometheus, Grafana, and AWS CloudWatch.
- Deep understanding of cloud security, IAM governance, role-based access control (RBAC), and compliance frameworks like SOC 2 or ISO 27001.
- Proven ability to lead and mentor junior engineers while fostering a collaborative and high-performance team culture.
- Excellent communication skills, with the ability to work effectively across cultures, functions, and time zones in a globally distributed team.
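The canary deployments and rollback strategies mentioned above boil down to an automated gate: compare the canary's error rate against the stable baseline and decide whether to promote, roll back, or keep waiting for traffic. A hedged, minimal sketch of such a gate (thresholds and metrics are made-up values, not any specific tool's defaults):

```python
def canary_decision(baseline_errors, baseline_total,
                    canary_errors, canary_total,
                    max_ratio=2.0, min_requests=100):
    """Return 'promote', 'rollback', or 'wait' for a canary release."""
    if canary_total < min_requests:
        return "wait"  # not enough canary traffic to judge yet
    base_rate = baseline_errors / baseline_total
    canary_rate = canary_errors / canary_total
    # Allow a small absolute floor so a near-zero baseline doesn't auto-fail.
    if canary_rate <= max(base_rate * max_ratio, 0.01):
        return "promote"
    return "rollback"

print(canary_decision(50, 10_000, 4, 500))   # 0.5% baseline vs 0.8% canary
print(canary_decision(50, 10_000, 40, 500))  # 0.5% baseline vs 8% canary
```

Tools like Argo Rollouts or Flagger implement this loop against Prometheus/Datadog metrics and shift traffic weights automatically; this sketch only shows the decision logic.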

Posted 2 months ago

Apply

4 - 7 years

10 - 15 Lacs

Thane

Work from Office

Job Description
- Design, deploy, and manage scalable cloud infrastructure on AWS.
- Implement CI/CD pipelines using AWS services (CodePipeline, CodeBuild, etc.).
- Automate infrastructure using Terraform/CloudFormation.
- Monitor system health and optimize performance using CloudWatch and X-Ray.
- Manage IAM roles, security groups, and compliance per Axis Bank policies.
- Handle S3, EC2, RDS, Lambda, VPCs, and networking configurations.
- Collaborate with development and DevOps teams on cloud-native solutions.
- Ensure high availability, disaster recovery, and data integrity.
- Maintain documentation and conduct periodic infrastructure reviews.
- AWS certification (SysOps/Administrator/Architect) preferred.

Posted 2 months ago

Apply

6 - 10 years

4 - 7 Lacs

Chennai

Work from Office

Minimum 5+ years of relevant experience. A good candidate for DAP production support is a good communicator with experience supporting some or all of our stack: Snowflake, AWS data services (EMR, Lambda, S3), and StreamSets (or Informatica or similar), and would ideally help us create dashboards and alerts. Should have excellent interpersonal skills.

Contact person: Hanna
Contact number: 9840082217
Email: hanna@gojobs.biz

Posted 2 months ago

Apply

9 - 14 years

35 - 45 Lacs

Hyderabad

Remote

Senior Data Engineer (SQL, Python & AWS)
Experience: 9-15 years
Salary: INR 35,00,000 - 45,00,000 / year
Preferred Notice Period: Within 30 days
Shift: 5:30 AM to 2:30 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: Airflow, ETL pipelines, PostgreSQL Aurora database, PowerBI, AWS, Python, REST APIs, SQL
Good-to-have skills: Athena, Data Lake Architecture, Glue, Lambda, JSON, Redshift, Tableau

A leading proptech company (one of Uplers' clients) is looking for a Data Engineer (WFH) who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview
We are seeking an experienced Data Engineer to join our team of passionate professionals working on cutting-edge technology. In this role, you will be responsible for the ELT of our company using Python, Airflow, and SQL within an AWS environment. Additionally, you will create and maintain data visualizations and dashboards using PowerBI, connecting to our SQL Server and PostgreSQL Aurora database through a gateway. This role requires strong critical thinking, the ability to assess data and outcomes, and proactive problem-solving skills.

Responsibilities:
- Design, develop, and maintain ELT pipelines using Python, Airflow, and SQL in an AWS environment.
- Create and manage data lake and data warehouse solutions on AWS.
- Develop and maintain data-driven dashboards and reporting solutions in PowerBI.
- Connect PowerBI to SQL Server and PostgreSQL Aurora databases using a gateway.
- Extract and integrate data from third-party APIs to populate the data lake.
- Perform data profiling and source system analysis to ensure data quality and integrity.
- Collaborate with business stakeholders to capture and understand data requirements.
- Implement industry best practices for data engineering and visualization.
- Participate in architectural decisions and contribute to the continuous improvement of data solutions.
- Follow agile practices and a Lean approach in project development.
- Critically assess the outcomes of your work to ensure they align with expectations before marking tasks as complete.
- Optimize SQL queries for performance and ensure efficient database operations.
- Perform database tuning and optimisation as needed.
- Proactively identify and present alternative solutions to achieve desired outcomes.
- Take ownership of end-to-end data-related demands: from data extraction (whether from internal databases or third-party apps) to understanding the data, engaging with relevant people when needed, and delivering meaningful solutions.

Required Skills and Experience:
- At least 9+ years of experience preferred.
- Strong critical thinking skills to assess outcomes, evaluate results, and suggest better alternatives where appropriate.
- Expert-level proficiency in SQL (T-SQL, MS SQL) with a strong focus on optimizing queries for performance.
- Extensive experience with Python (including data-specific libraries) and Airflow for ELT processes.
- Proven ability to extract and manage data from third-party APIs.
- Proven experience in designing and developing data warehousing solutions on the AWS cloud platform.
- Strong expertise in PowerBI for data visualization and dashboard creation.
- Familiarity with connecting PowerBI to SQL Server and PostgreSQL Aurora databases.
- Experience with REST APIs and JSON.
- Agile development experience with a focus on continuous delivery and improvement.
- Proactive mindset: able to suggest alternative approaches to achieve goals efficiently.
- Excellent problem-solving skills and a proactive, can-do attitude.
- Strong communication skills and the ability to work collaboratively in a team environment.
- Ability to independently assess data, outcomes, and potential gaps to ensure results align with business goals.
- Ability to perform database tuning and optimization to ensure efficient data operations.

Desired Skills:
- Exposure to AWS cloud data services such as Redshift, Athena, Lambda, Glue, etc.
- Experience with other reporting tools like Tableau.
- Knowledge of data lake architectures and best practices.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload an updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client:
We are a cloud-based residential sales platform designed to bridge the communication gap between clients, sales teams, and construction teams. Our goal is to ensure seamless collaboration, resulting in buildable and well-aligned residential projects. As builders with a strong tech foundation, we bring deep industry expertise to every solution we create.

About Uplers:
Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help our talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
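The query-optimisation skill this listing emphasises can be demonstrated in miniature: add an index and confirm with the query plan that a lookup stops scanning the whole table. This sketch uses stdlib sqlite3 with an invented table as a stand-in for the listing's SQL Server/Aurora databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE listings (id INTEGER PRIMARY KEY, city TEXT, price REAL)")
conn.executemany("INSERT INTO listings (city, price) VALUES (?, ?)",
                 [("Pune", 100 + i) for i in range(1000)])

def plan(sql):
    """Flatten EXPLAIN QUERY PLAN output into one string for inspection."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT price FROM listings WHERE city = 'Pune'"
before = plan(query)                                     # full table scan
conn.execute("CREATE INDEX idx_listings_city ON listings(city)")
after = plan(query)                                      # index search
print(before)
print(after)
```

The same before/after discipline applies on bigger engines (`EXPLAIN ANALYZE` on PostgreSQL, execution plans on SQL Server): measure the plan, add the index, and verify the access path actually changed.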

Posted 2 months ago

Apply

1 - 3 years

3 - 7 Lacs

Mumbai

Work from Office

Naukri logo

About the Role: We are looking for Cloud Engineers with experience in managing, planning, architecting, monitoring, and automating large-scale deployments to the public cloud. You will be part of a team of talented engineers solving some of the most complex and exciting challenges in IT automation and hybrid cloud deployments.

Key Responsibilities:
- Consistently strive to acquire new skills in Cloud, DevOps, Big Data, AI, and ML technologies
- Design, deploy, and maintain cloud infrastructure for clients, both domestic and international
- Develop tools and automation to make platform operations more efficient, reliable, and reproducible
- Create container orchestration (Kubernetes, Docker), strive for fully automated solutions, and ensure the uptime and security of all cloud platform systems and infrastructure
- Stay up to date on relevant technologies, plug into user groups, and ensure our clients are using the best techniques and tools
- Provide business, application, and technology consulting in feasibility discussions with technology team members, customers, and business partners
- Take the initiative to lead, drive, and solve during challenging scenarios

Preferred Qualifications:
- 1-3 years of experience in Cloud Infrastructure and Operations domains
- Experience with Linux systems and/or Windows servers
- Specialization in one or two cloud platforms: AWS, GCP, Azure
- Hands-on experience with AWS services (EKS, ECS, EC2, VPC, RDS, Lambda) and GCP services (GKE, Compute Engine)
- Experience with one or more programming languages (Python, JavaScript, Ruby, Java, .NET)
- Good understanding of Apache Web Server, Nginx, MySQL, MongoDB, Nagios
- Logging and monitoring tools (ELK, Stackdriver, CloudWatch)
- Knowledge of configuration management and DevOps tools such as Ansible, Terraform, Puppet, Chef
- Experience working with deployment and orchestration technologies (such as Docker, Kubernetes, Mesos)
- Deep experience in customer-facing roles with a proven track record of effective verbal and written communication
- Dependable and a good team player
- Desire to learn and work with new technologies; automation in your blood
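Automation that makes platform operations "reliable and reproducible", as the responsibilities above put it, usually starts with idempotent, retried API calls. A minimal retry helper with exponential backoff is sketched below; the function name and backoff parameters are illustrative, not taken from the posting.

```python
import time


def call_with_retries(fn, *, attempts=3, base_delay=0.1, sleep=time.sleep):
    """Call `fn` up to `attempts` times with exponential backoff between tries.

    `sleep` is injectable so tests need not wait through real delays.
    In production the except clause should be narrowed to retriable
    errors (throttling, timeouts), not every Exception.
    """
    last_exc = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
            if i < attempts - 1:
                sleep(base_delay * (2 ** i))
    raise last_exc
```

Wrapping cloud API calls (instance creation, DNS updates) in a helper like this keeps automation scripts safe to rerun when a transient failure interrupts them.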

Posted 2 months ago

Apply

11 - 13 years

13 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Position Overview: The job profile for this position is Software Engineering Advisor, which is a Band 4 Contributor Career Track role. Excited to grow your career? We value our talented employees, and whenever possible we strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think this open position is right for you, we encourage you to apply!

We are looking for a software engineer to join our team. In this position, the primary responsibility is delivering software with the help of research, solution analysis, and an understanding of the backend systems. This includes coding, unit testing, integration testing, and deploying applications. The software engineer will aim to achieve efficiency by aligning the frontend application and backend services with the delivery and business teams' objectives during a project's development and testing phases. This individual will build Java Spring Boot REST APIs by architecting, building, and managing cloud-native APIs. They are expected to work closely with subject matter experts, developers, technical project managers, principal engineers, and business stakeholders to ensure that application solutions meet business/customer requirements.

Responsibilities:
- Gather and analyze test software requirements for various applications within the Digital organization.
- Understand a program's architecture/design and its logical and physical data models.
- Design and develop Java Spring Boot REST APIs.
- Be hands-on in the design and development of robust solutions to hard problems, while considering scale, security, reliability, and cost.
- Support other product delivery partners in the successful build, test, and release of solutions.
- Work with distributed requirements and technical stakeholders to complete shared design and development.
- Support the full software lifecycle of design, development, testing, and support for technical delivery.
- Work with both onsite (Scrum Master, Product, QA, and Developers) and offshore QA team members to properly define testable scenarios based on requirements/acceptance criteria.
- Be part of a fast-moving team, working with the latest tools and open-source technologies.
- Work on a development team using agile methodologies.
- Understand the business and the application architecture end to end.
- Solve problems by crafting software solutions using maintainable and modular code.
- Participate in daily team standup meetings where you'll give and receive updates on the current backlog and challenges.
- Participate in code reviews; ensure code quality and deliverables.
- Provide impact analysis for new requirements or changes.
- Be responsible for low-level design with the team.

Qualifications

Mandatory Skills:
- Experience working with the SAM (Serverless Application Model) framework, with a strong command of Lambda functions using NodeJS/JavaScript.
- Java Spring Boot.
- Knowledge of API tools like Postman, SoapUI, etc.

Good to Have:
- Technology stack: Kafka, AWS Serverless, Lambda, Step Functions, DynamoDB, ElastiCache Redis, AWS Cloud & DevOps, Jenkins CI/CD pipelines, Terraform, NoSQL and relational databases, JUnit.
- Hands-on experience with the AWS SDK and NodeJS/JavaScript, demonstrating proficiency in leveraging AWS services.
- Knowledge of internal integration within the AWS ecosystem using Lambda functions, leveraging services such as EventBridge, S3, SQS, SNS, API Gateway, and others.
- Experience with internal integration within AWS using DynamoDB with Lambda functions, demonstrating the ability to architect and implement robust serverless applications.
- Experience with Kafka, NoSQL, and relational databases.
- CI/CD experience: must have GitHub experience.
- Experience developing backends using Golang or Spring Boot.
- Experience developing frontends with Angular or React.
- Experience with cloud-based platforms like AWS.
- Recognized internally as the go-to person for the most complex software engineering assignments.
- Good knowledge of Healthcare and Pharmacy Benefit Management.

Required Experience & Education:
- 11-13 years of experience.
- Proven experience with architecture, design, and development of large-scale enterprise application solutions.
- College degree (Bachelor's) in related technical/business areas or equivalent work experience.

Location & Hours of Work: Full-time position, working 40 hours per week, with expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, with flexibility to work remotely as required.
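The Lambda-to-DynamoDB integration pattern this posting emphasises can be illustrated with a short handler, shown here in Python rather than the NodeJS the role asks for. The table name is hypothetical, and the DynamoDB table object is injected (in a real deployment it would come from the boto3 SDK) so the handler stays unit-testable without AWS access.

```python
import json


def lambda_handler(event, context, table=None):
    """Consume SQS-style records and upsert each JSON body into a DynamoDB table.

    `table` is an injected DynamoDB Table resource; injecting it keeps the
    business logic testable with a fake in place of the real AWS client.
    """
    written = 0
    for record in event.get("Records", []):
        item = json.loads(record["body"])
        table.put_item(Item=item)
        written += 1
    return {"statusCode": 200, "written": written}
```

The same shape extends to the other event sources the posting lists (EventBridge, S3, SNS): only the record-parsing step changes, while the persistence call stays identical.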

Posted 2 months ago

Apply

4 - 6 years

6 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Position Summary: Cigna, a leading health services company, is looking for data engineers/developers for our Data & Analytics organization. The Full Stack Engineer is responsible for delivering a business need end-to-end, from understanding the requirements to deploying the software into production. This role requires you to be fluent in some of the critical technologies, proficient in others, and have a hunger to learn on the job and add value to the business. Critical attributes of a Full Stack Engineer, among others, are ownership and accountability. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset. He/she should drive the adoption of CI/CD tools and support the improvement of the toolsets and processes.

Behaviors of a Full Stack Engineer: Full Stack Engineers are able to articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They have ownership over their work tasks, embrace interacting with all levels of the team, and raise challenges when necessary. We aim to be cutting-edge engineers, not institutionalized developers.

Job Description & Responsibilities:
- Minimize "meetings" to get requirements and have direct business interactions
- Write referenceable and modular code
- Design and architect the solution independently
- Be fluent in particular areas and proficient in many areas
- Have a passion to learn
- Take ownership and accountability
- Understand when to automate and when not to
- Have a desire to simplify
- Be entrepreneurial/business-minded
- Have a quality mindset: not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact
- Take risks and champion new ideas

Experience Required:
- 4+ years being part of Agile teams
- 3+ years of scripting
- 3+ years of database experience (Teradata)
- 2+ years of AWS services
- 1+ years of experience with FHIR (good to have)

Experience Desired:
- Experience with GitHub
- DevOps experience: Jenkins, Terraform, and Docker
- Python/PySpark
- SQL experience (good to have)

Education and Training Required:
- Knowledge of and/or experience with healthcare information domains is a plus
- Computer science degree (good to have)

Primary Skills (Must Haves):
- InterSystems experience
- Python/PySpark experience
- Exposure to AWS services: Glue, S3, SNS, SQS, Lambda, Step Functions, OpenSearch, DynamoDB, API Gateway, etc.

Additional Skills:
- Excellent troubleshooting skills
- Strong communication skills
- Fluency in BDD and TDD development methodologies
- Ability to work in an agile CI/CD environment (Jenkins experience a plus)
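One concrete flavour of the S3/Lambda exposure listed above is an S3-triggered function. Parsing bucket and key pairs out of the standard S3 event shape needs only the standard library; the helper name below is an illustration, and note that keys arrive URL-encoded in the event payload.

```python
from urllib.parse import unquote_plus


def s3_objects(event):
    """Yield (bucket, key) pairs from a standard S3 put-event payload.

    Object keys are URL-encoded in the event (spaces become '+'),
    so they are decoded here before being handed to downstream code.
    """
    for record in event.get("Records", []):
        s3 = record["s3"]
        yield s3["bucket"]["name"], unquote_plus(s3["object"]["key"])
```

A Glue job or Lambda handler would iterate these pairs and fan out per-object processing (validation, conversion, loading), keeping event parsing separate from business logic.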

Posted 2 months ago

Apply

7 - 12 years

14 - 24 Lacs

Bengaluru

Work from Office

Naukri logo

Dear Candidate, We are hiring PTC FlexPLM Developers for MNC clients. For your reference, please find the job details below as per the client's requirements. Job Title: PTC FlexPLM Developer. Location: Bangalore. Experience: 7+ years. Must-Have Skills: Strong experience in PTC FlexPLM 12+ customization (SSPs, CSPs, JSP Override); system integration using web services (JSON); expertise in microservices, Java 8+, and Spring Boot; experience with SQL/PLSQL and JavaScript frameworks (Angular, React, NodeJS, jQuery). Nice-to-Have: Experience with AWS (S3, EC2, SQS, SNS, Lambda), Splunk, Jenkins, JIRA, and Confluence. Responsibilities: Customize and configure FlexPLM components; integrate with external systems via APIs; provide ongoing support and documentation. Apply now! If you are interested, please send us your updated résumé.

Posted 2 months ago

Apply

10 - 15 years

30 - 35 Lacs

Pune

Work from Office

Naukri logo

About the Role: We are seeking a skilled AWS Redshift ETL Developer to design, develop, and maintain scalable ETL pipelines to support our data warehousing and analytics initiatives. The ideal candidate will have hands-on experience with AWS data services, strong SQL expertise, and a deep understanding of data integration best practices.

Key Responsibilities:
- Design, develop, and manage robust ETL pipelines to move and transform data into AWS Redshift.
- Work with large datasets from various structured and semi-structured sources (S3, RDS, DynamoDB, external APIs, etc.).
- Write efficient SQL queries and perform performance tuning on Redshift clusters.
- Collaborate with data engineers, analysts, and stakeholders to ensure data availability and quality.
- Implement data quality checks, error handling, and monitoring for ETL workflows.
- Automate and schedule ETL jobs using tools such as AWS Glue, Lambda, or Airflow.
- Maintain documentation for ETL processes, data models, and data flows.
- Participate in code reviews and enforce data engineering standards.

Required Skills & Qualifications:
- 3+ years of experience in building ETL pipelines.
- Strong experience with AWS Redshift and AWS data services (S3, Glue, Lambda, Step Functions, etc.).
- Proficiency in SQL and data warehousing principles.
- Hands-on experience with Python, PySpark, or other scripting languages.
- Familiarity with data modeling techniques and schema design in Redshift.
- Experience with version control (e.g., Git) and CI/CD for ETL deployments.
- Strong problem-solving and communication skills.

Preferred Qualifications:
- Experience with Apache Airflow or other orchestration tools.
- Knowledge of data governance, security, and compliance standards.
- Exposure to BI tools like Tableau, Power BI, or QuickSight.
- AWS certification (e.g., AWS Certified Data Analytics Specialty) is a plus.
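Moving data from S3 into Redshift, the core of the pipeline work this role describes, typically centres on Redshift's COPY command. A small Python helper that assembles such a statement is sketched below; the table, bucket path, and IAM role ARN are placeholders, and the caller would execute the resulting SQL through a Redshift connection or the Data API.

```python
def build_copy_sql(table, s3_path, iam_role, fmt="JSON 'auto'"):
    """Assemble a Redshift COPY statement for loading data from S3.

    Only the SQL text is built here; executing it (and quoting/validating
    identifiers against your schema) is left to the calling pipeline.
    """
    return (
        f"COPY {table}\n"
        f"FROM '{s3_path}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        f"FORMAT AS {fmt};"
    )
```

Generating the statement in code rather than hard-coding it lets a scheduler (Glue, Lambda, or Airflow, as the posting lists) parameterise the load per partition or per run date.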

Posted 2 months ago

Apply

1 - 3 years

3 - 6 Lacs

Gurgaon

Work from Office

Naukri logo

About Your Role: As a Web Application Developer you will play a key role on a global programme, working with Product Owners and Digital and Technical teams within Fidelity International to deliver a new technology platform supporting Fidelity International's Workplace Investments business. Working alongside multiple stakeholder groups, you will need to utilise your experience of working in agile delivery teams to assist with the design, definition, exploration, and delivery of an end-to-end technology solution to service a scaling global Workplace Investments business. The successful candidate will have detailed knowledge of working in an Agile environment using Agile tools and techniques, including behaviour-driven development (BDD). You will have experience working with full-scale Scrum or Kanban and be a competent user of Jira and Confluence. You must have a passion for delivering high-quality and scalable solutions with a continued focus on the customer need. You should be willing both to challenge and to be challenged on where things can be improved, and be comfortable working alongside other engineers in a pair-programming environment. You will be required to effectively influence and assist key stakeholders, support the formation of a new team, and commence delivery of a largely greenfield solution. A willingness to take on new challenges, collaborate, and share knowledge freely with global teams is absolutely critical to success.

About You: This position requires a strong self-starter with a solid technical engineering background and influencing skills, who can lead the way and assist development teams with architecture, design decisions, coding, troubleshooting, and any other technical issues related to implementation of a customer-facing proposition. You will be responsible for delivering and providing technical expertise as part of the delivery team, from design through day-to-day coding. Working with the product owners, you will identify new improvements and customer requirements and follow them through to delivery, ensuring delivery in a timely, efficient, and cost-effective manner. The role also involves stakeholder management across various Technology and Business teams; ensuring that technical solutions are fit for purpose (covering functional, non-functional, and support requirements) and aligned to global technology strategies; being a trusted advisor to the business; and partnering closely with Architecture, the business, and supporting central groups while working within a global team.

The ideal candidate will have 1-3 years of experience working as a web application developer with:

Required:
- Strong JavaScript experience, with a minimum of 1 year of experience
- Hands-on experience in ReactJS
- Knowledge of Webpack, Babel, and their plugins
- Knowledge of ES6, Redux, Jest, TypeScript
- Knowledge of JavaScript design patterns
- Knowledge of NodeJS and RESTful API design
- AWS working knowledge (Lambda, S3, ECS, etc.)
- Knowledge of CSS usage (good to have)
- Experience with source control tools such as Git
- Experience in software delivery using agile methodologies
- TDD and pair-programming best practice with CI/CD pipelines (e.g., Jenkins)
- Strong communication skills, a customer-centric focus, and interest in a pair-programming environment
- Passion for growing your skills, tackling challenging problems, and working with the newest technologies, prototyping your ideas for others to see

Desired:
- Working knowledge of Angular (preferably v8-v10)
- Knowledge of Redux-Saga and Redux-Thunk
- Knowledge of React Hooks and the Context API
- Knowledge of AWS (CloudFront, API Gateway, Lambda)
- Working knowledge of APIs, caching, and messaging
- Understanding of DevOps automation: AWS CloudFormation, Terraform, and continuous integration (AWS CodePipeline, Jenkins)

Posted 2 months ago

Apply

3 - 8 years

6 - 16 Lacs

Mumbai

Work from Office

Naukri logo

Job Description: AWS Data Engineer. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Experience: 3 to 7 years. Location: Bangalore, Pune, Hyderabad, Coimbatore, Delhi NCR, Mumbai.

Key Responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics
- ETL/ELT processes
- Data lake architectures
- Version control: Git
- Agile methodologies
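The serverless-workflow responsibility above (AWS Step Functions) boils down to authoring an Amazon States Language document. A minimal two-state extract-then-load sketch, built with only the standard library, is shown below; the state names and Lambda ARNs are hypothetical.

```python
import json


def make_state_machine(extract_arn, load_arn):
    """Return a minimal Amazon States Language definition as JSON text.

    Two Task states run in sequence: Extract hands its output to Load,
    and Load ends the execution.
    """
    definition = {
        "Comment": "Hypothetical extract-then-load workflow",
        "StartAt": "Extract",
        "States": {
            "Extract": {"Type": "Task", "Resource": extract_arn, "Next": "Load"},
            "Load": {"Type": "Task", "Resource": load_arn, "End": True},
        },
    }
    return json.dumps(definition, indent=2)
```

The resulting JSON is what a deployment tool (CloudFormation, Terraform, or the Step Functions API) would submit as the state machine definition, with retry and error-handling clauses added per state in a real pipeline.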

Posted 2 months ago

Apply

3 - 8 years

6 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Job Description: AWS Data Engineer. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Experience: 3 to 7 years. Location: Bangalore, Pune, Hyderabad, Coimbatore, Delhi NCR, Mumbai.

Key Responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics
- ETL/ELT processes
- Data lake architectures
- Version control: Git
- Agile methodologies

Posted 2 months ago

Apply

4 - 9 years

15 - 27 Lacs

Delhi, Gurgaon, Noida

Hybrid

Naukri logo

Role: We are looking for an energetic and motivated Software Engineer to design, develop, and deliver leading-edge enterprise-level products and features with a focus on customer requests and business scenarios. You will work in a team of 4-5 developers and be required to apply your depth of knowledge and expertise to all aspects of the software development lifecycle, as well as partner continuously with your stakeholders daily to stay focused on common goals.

Key Responsibilities:
- Perform hands-on coding
- Follow best practices, methodologies, and coding standards in all stages: development, configuration, testing, deployment, integration, and maintenance
- Participate in all aspects of agile software development, including design, implementation, and deployment
- Develop rock-solid, scalable systems that are thoughtfully designed
- Collaborate with other team leads, engineers, product owners, scrum masters, and domain experts to build the right solutions that address business needs
- Implement modern delivery practices such as CI/CD pipelines
- Solve challenging design and technical problems in the system to improve reliability, resilience, and performance

Key Skills and Qualifications:
- A-player with high energy; an excellent team player with a strong work ethic
- 5+ years of experience developing desktop and web-based applications
- Hands-on expertise with modern .NET-based technologies, C#, JavaScript, databases (MS SQL, PostgreSQL), NoSQL (MongoDB), Python, and RESTful APIs
- Hands-on experience with containerization platforms, serverless computing, and event streaming, such as Docker, EKS, Lambda, SQS, SNS, and Apache Kafka
- Significant SDLC experience in Agile/Lean/Scrum methodologies
- High proficiency in DevOps methods and CI/CD automation practice

Posted 2 months ago

Apply

6 - 11 years

22 - 30 Lacs

Pune

Work from Office

Naukri logo

Hi All, We are currently hiring for the position of AWS DevOps Engineer.

Role: Sr. DevOps Engineer. Job Location: Pune. Experience: 5-10 years. Work Mode: Work from Office (Monday-Friday). Interview Mode: Virtual.

Job Description:
- Primary Skills: AWS Cloud, AWS Lambda, Jenkins, GitLab, Kubernetes, Terraform
- Cloud Platform: AWS (VPC, EC2, S3, CloudWatch, EKS, ECS, ECR, ELK, Lambda, etc.)
- Containerization: EKS (Kubernetes)
- Infrastructure Automation: CloudFormation and Terraform
- Monitoring Tools: ELK Stack, DataDog (licensed version)
- Security: AWS managed services such as AWS GuardDuty and WAF
- CI/CD Pipeline: Jenkins for CI/CD and GitLab as the source code repository
- Serverless Adoption: AWS Lambda (Python scripting is mandatory)

If you are interested in exploring this opportunity, please send your updated resume to shakambari.nayak@intelliswift.com with the details below: total experience; relevant experience in K8s; relevant experience in CI/CD; relevant experience in GitLab and Jenkins; current CTC; expected CTC; notice period.

Posted 2 months ago

Apply

8 - 12 years

11 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Responsibilities:
- Design and develop our next generation of RESTful APIs and event-driven services in a distributed environment.
- Be hands-on in the design and development of robust solutions to hard problems, while considering scale, security, reliability, and cost.
- Support other product delivery partners in the successful build, test, and release of solutions.
- Work with distributed requirements and technical stakeholders to complete shared design and development.
- Support the full software lifecycle of design, development, testing, and support for technical delivery.
- Work with both onsite (Scrum Master, Product, QA, and Developers) and offshore QA team members to properly define testable scenarios based on requirements/acceptance criteria.
- Be part of a fast-moving team, working with the latest tools and open-source technologies.
- Work on a development team using agile methodologies.
- Understand the business and the application architecture end to end.
- Solve problems by crafting software solutions using maintainable and modular code.
- Participate in daily team standup meetings where you'll give and receive updates on the current backlog and challenges.
- Participate in code reviews; ensure code quality and deliverables.
- Provide impact analysis for new requirements or changes.
- Be responsible for low-level design with the team.

Qualifications

Required Skills:
- Technology stack: Java Spring Boot, GitHub, OpenShift, Kafka, MongoDB, AWS, Serverless, Lambda, OpenSearch
- Hands-on experience with Java 1.8 or higher, Spring Boot, OpenShift, Docker, Jenkins
- Solid understanding of OOP, design patterns, and data structures
- Experience building REST APIs/microservices
- Strong experience with frontend frameworks like ReactJS/AngularJS
- Strong understanding of parallel processing, concurrency, and asynchronous concepts
- Experience with NoSQL databases like MongoDB and relational databases like PostgreSQL
- Proficiency with the SAM (Serverless Application Model) framework, with a strong command of Lambda functions using Java
- Proficiency in internal integration within the AWS ecosystem using Lambda functions, leveraging services such as EventBridge, S3, SQS, SNS, and others
- Must have experience with Apache Spark
- Experience with internal integration within AWS using DynamoDB with Lambda functions, demonstrating the ability to architect and implement robust serverless applications
- CI/CD experience: must have GitHub experience
- Recognized internally as the go-to person for the most complex software engineering assignments

Required Experience & Education:
- 11-13 years of experience
- Experience with vendor management in an onshore/offshore model
- Proven experience with architecture, design, and development of large-scale enterprise application solutions
- College degree (Bachelor's) in related technical/business areas or equivalent work experience
- Industry certifications such as PMP, Scrum Master, or Six Sigma Green Belt

Posted 2 months ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies