104 Lambda Jobs - Page 3

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

10 - 15 years

20 - 32 Lacs

Gurgaon

Work from Office

Key Responsibilities:
• Hands-on development using Node.js (Express.js, Nest.js, Serverless).
• Write and review unit test cases using Mocha, Jest, or Jasmine.
• Manage CI/CD pipelines using Jenkins, AWS CI/CD, Bitbucket CI/CD, etc.
• Lead and mentor a team of at least 5-7 engineers.
• Ensure application and infrastructure monitoring using CloudWatch, New Relic, Instana, etc.
• Optimize application performance through load testing and targeted improvements.

Technical Skills:
• Backend: Node.js (Express.js, Nest.js, Serverless)
• Databases: PostgreSQL, MySQL, MongoDB
• Caching & Search: Redis, Elasticsearch
• Cloud: AWS, Azure, GCP, Heroku
• CI/CD: Jenkins, AWS CI/CD, Bitbucket CI/CD, GitHub Hooks, CircleCI, Serverless
• Containerization: Docker, Kubernetes, ECS, EKS
• Monitoring: CloudWatch, New Relic, Instana, ELK
• Authentication: JWT, Cognito, SSO, LDAP, Firebase

Posted 2 months ago

7 - 12 years

20 - 35 Lacs

Chennai

Work from Office

Hiring AI Engineers for immediate joining.

Job Description:
8-11 years of experience in a software engineering role, with a focus on backend or full-stack development
Proven track record of AI/LLM application development or integration
Strong experience in Python-based AI application development with API engineering
Proficiency in RESTful APIs, microservices, and cloud-based AI deployments (AWS, Kubernetes, Lambda)
Familiarity with AI orchestration tools for AI workflow automation
Knowledge of SQL and NoSQL databases (PostgreSQL) for AI-powered search
Experience working in Agile teams and delivering AI-driven features in a cloud-first environment
Bachelor's degree in Computer Science or a related field
Understanding of healthcare data privacy regulations (HIPAA, GDPR) is a plus

Behaviors & Abilities Required:
Ability to learn and adapt rapidly while producing high-quality code
Capable of translating AI/LLM concepts into practical, scalable software solutions
Innovative thinker who finds creative ways to execute when historical context is limited
Strong analytical skills to assess potential designs and choose the best solution for the business
Committed to delivering results under challenging circumstances
Skilled at mentoring and coaching to elevate junior team members
Able to uphold best engineering practices for quality, security, and performance
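
The posting lists skills rather than code, but the core pattern it names (a Python Lambda behind a REST API that calls an LLM) can be sketched roughly as below. The `generate_answer` stub and the event shape are illustrative assumptions, not anything specified by the employer:

```python
# Hedged sketch: an AWS Lambda handler exposing an LLM-backed endpoint
# behind API Gateway. generate_answer is a hypothetical stand-in for
# whatever model client the team actually uses.
import json

def generate_answer(prompt: str) -> str:
    # Placeholder: call your LLM provider or internal model service here.
    return f"(model response for: {prompt})"

def lambda_handler(event, context):
    try:
        body = json.loads(event.get("body") or "{}")
        prompt = body["prompt"]
    except (json.JSONDecodeError, KeyError):
        return {"statusCode": 400, "body": json.dumps({"error": "missing 'prompt'"})}

    answer = generate_answer(prompt)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"answer": answer}),
    }
```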

Posted 2 months ago

5 - 9 years

9 - 19 Lacs

Bengaluru, Hyderabad

Hybrid

We are seeking a skilled Data Engineer with expertise in PySpark, Databricks, SQL, and experience with the AWS cloud platform. The ideal candidate will design, develop, and maintain scalable data pipelines and processing systems, ensuring data quality, integrity, and security. Responsibilities include implementing ETL processes, collaborating with stakeholders to meet data requirements, and utilizing AWS services such as S3, Lambda, and Glue. 5+ years of data engineering experience, proficiency in SQL, and strong problem-solving and communication skills are required.
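
As a hedged illustration of the work described (PySpark pipelines on AWS), a minimal job that reads raw JSON from S3, cleans it, and writes partitioned Parquet might look like this; bucket paths and column names are invented:

```python
# Illustrative PySpark ETL: read raw data from S3, transform, and write
# partitioned Parquet back out. All names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/orders/"))
```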

Posted 2 months ago

8 - 12 years

20 - 35 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer
Location: Bangalore
Experience: 8+ years

Job Description: We are looking for a highly skilled Data Engineer to join our team and contribute to our Data Ingestion and Lakehouse initiatives. The ideal candidate will have extensive experience with data streaming technologies and cloud infrastructure management.

Requirements:
- 8+ years of relevant experience (internships, prototypes, and personal projects do not count).
- Coding is required, ideally in Python or Java.
- Ownership of the end-to-end lifecycle, from development through deployment to production.
- Experience building or deploying solutions in the cloud, either cloud-native (serverless: S3, Lambda, AWS Batch, ECS) or cloud-agnostic (Kubernetes, Helm charts, ArgoCD, Prometheus, Grafana).
- CI/CD experience: GitHub Actions or Jenkins.
- Infrastructure as code: e.g., Terraform.

Experience in at least one of these focus areas:
- Big Data: building big data pipelines or platforms to process petabytes of data (PySpark, Hudi, data lineage, AWS Glue, AWS EMR, Kafka, Schema Registry).
- GraphDB: ingesting and consuming data in a graph database such as Neo4j, AWS Neptune, JanusGraph, or DGraph.

How to Apply: Interested candidates can send their resumes to ritu.singh@calsoftinc.com. We are looking for candidates who can join within 7 days, or at most within 15 days.
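
For the streaming side of this role, a minimal consumer sketch using the kafka-python client is shown below; the topic, broker address, and record fields are hypothetical:

```python
# Small streaming-ingestion consumer sketch (kafka-python client).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders.raw",                        # hypothetical topic
    bootstrap_servers=["broker-1:9092"], # hypothetical broker
    group_id="lakehouse-ingest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # In a real pipeline this would land in S3/Hudi rather than stdout.
    print(record.get("order_id"), record.get("amount"))
```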

Posted 2 months ago

5 - 10 years

0 - 1 Lacs

Pune

Hybrid

Role & Responsibilities

Full-Stack Development: Build and maintain the Next.js frontend and Laravel API backend. Optimize and scale the integration between QuickBooks, Stripe, and Acodei.
Cloud & Infrastructure: Enhance our AWS-based architecture (EC2, RDS, SQS, Lambda, CloudWatch, etc.). Implement best practices for high availability, security, and performance.
Data Processing & Webhooks: Manage and optimize webhook processing via AWS Lambda, SQS, and Laravel workers. Develop and maintain the Go microservice for intensive data processing (historical pulls, analytics).
Scalability & Performance Optimization: Ensure database scalability (AWS RDS, MySQL/Postgres, Multi-AZ replication). Optimize Laravel workers, queueing, and caching mechanisms using Redis/ElastiCache.
Security & Compliance: Implement and enforce best practices for IAM policies, encryption, VPC security, and audit monitoring. Leverage AWS CloudTrail & CloudWatch for proactive system monitoring and alerting.
Collaboration & Leadership: Work closely with cross-functional teams (Product, DevOps, and QA). Mentor junior engineers and contribute to technical decision-making.

Qualifications & Experience

Must-Have Skills:
- 5+ years of experience in full-stack development with Next.js (React), Laravel (PHP), and optionally Go.
- Strong experience with AWS (EC2, RDS, Lambda, SQS, S3, IAM, ElastiCache, SES, CloudWatch, CloudTrail, VPCs, ECR).
- Hands-on experience with webhook handling, queue processing, and event-driven architectures.
- Expertise in MySQL/PostgreSQL, query optimization, and multi-instance database setups.
- Deep understanding of scalability, performance tuning, and caching strategies.
- Proficiency in containerization & CI/CD (Docker, ECR, and automated deployments).
- Strong knowledge of OAuth authentication flows (QuickBooks, Stripe).
- Experience with security best practices for cloud-native applications.

Nice-to-Have Skills:
- Experience with serverless architectures and API Gateway.
- Background in microservice architecture.
- Background in AI/ML data processing for financial analytics.
- Exposure to SOC 2 and PCI compliance and secure handling of financial data.
- Previous experience in B2B SaaS and fintech-related integrations.

Location: Pune
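
The stack here is PHP/Go, but the webhook-buffering pattern the posting describes (a Lambda receives the webhook, SQS absorbs the burst, workers process later) is language-agnostic. A rough Python/boto3 sketch, with a hypothetical queue URL and source attribute:

```python
# Hedged sketch of webhook buffering: API Gateway -> Lambda -> SQS.
import json
import os
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = os.environ.get("WEBHOOK_QUEUE_URL", "https://sqs.example/queue")

def lambda_handler(event, context):
    payload = event.get("body") or "{}"
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=payload,
        MessageAttributes={
            "source": {"DataType": "String", "StringValue": "stripe"}
        },
    )
    # Acknowledge quickly; heavy processing happens in the queue consumers.
    return {"statusCode": 202, "body": json.dumps({"queued": True})}
```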

Posted 2 months ago

3 - 7 years

6 - 16 Lacs

Bengaluru

Work from Office

Job Description: AWS Data Engineer

We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Experience: 3 to 7 years
Location: Bangalore, Pune, Hyderabad, Coimbatore, Delhi NCR, Mumbai

Key Responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics; ETL/ELT processes; data lake architectures
- Version control: Git
- Agile methodologies
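
A skeletal AWS Glue job of the kind referenced in responsibility 2 might look like the following; the catalog database, table name, and output path are placeholders:

```python
# Standard awsglue job boilerplate with one catalog read and one S3 write.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, transform, and write to S3 as Parquet.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"
)
df = dyf.toDF().filter("event_type IS NOT NULL")

df.write.mode("overwrite").parquet("s3://example-curated/events/")

job.commit()
```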

Posted 2 months ago

3 - 8 years

4 - 9 Lacs

Bengaluru

Hybrid

Hi, we have an urgent opening for TechOps-DE-CloudOps-Senior at our Bangalore location.

The opportunity
As a Senior Data Engineer, you will play a pivotal role in managing and optimizing the large-scale data architectures that provide valuable insights to business users and downstream systems. We are looking for an innovative and experienced professional who is adept at overseeing data flow from diverse sources and ensuring the continuous operation of production systems. Your expertise will be instrumental in maintaining the data platforms that power front-end analytics, contributing to the effectiveness of Takeda's dashboards and reporting tools. As a key member of the Analytics Production Support team, you will ensure seamless end-to-end data flow and coordinate with stakeholders and team members across various regions, including India and Mexico. The ability to manage major incidents effectively, including handling Major Incident Management (MIM) bridges, is crucial, as is flexibility to work in a 24x7x365 support model.

Your key responsibilities
- Manage and maintain the data pipeline (ETL/ELT layer) to guarantee high availability and performance.
- Resolve data quality issues within Service Level Agreement (SLA) parameters by coordinating with cross-functional teams and stakeholders.
- Proactively monitor the system and take pre-emptive measures against alerts such as Databricks job failures and data quality issues.
- Monitor and maintain AWS data services, including S3, DMS, Step Functions, and Lambda, to ensure efficient and reliable data loading processes.
- Conduct thorough analyses of code repositories to understand Databricks job failures and determine appropriate corrective actions.
- Take ownership of support tickets, ensuring timely and effective resolution.
- Manage major incidents with meticulous attention to detail, ensuring compliance with regulatory requirements and effective data presentation.
- Perform root cause analysis for major and recurring incidents and propose solutions for permanent resolution.
- Identify and execute automation opportunities to enhance operational efficiency.
- Escalate complex issues to the next level of support to ensure swift resolution.
- Mentor junior team members, providing a structured training plan for skill enhancement and professional growth.

Skills and attributes for success
- 3 to 8 years of experience in data analytics, with a focus on maintaining and supporting ETL data pipelines using Databricks and AWS services.
- Proficiency in Databricks and PySpark for code debugging and root cause analysis.
- Proven experience in a production support environment and readiness to work in a 24x7 support model.
- Strong understanding of: relational SQL databases; data engineering programming languages (e.g., Python); distributed data technologies (e.g., PySpark); cloud platform deployment and tools (e.g., Kubernetes); AWS cloud services and technologies (e.g., Lambda, S3, DMS, Step Functions, EventBridge, CloudWatch, RDS); and Databricks/ETL processes.
- Familiarity with ITIL principles.
- Effective communication skills for collaborating with multifunctional teams and strategic partners.
- Strong problem-solving and troubleshooting abilities.
- Capacity to thrive in a dynamic environment and adapt to evolving business needs.
- Commitment to continuous integration and delivery principles to automate code deployment and improve code quality.
- Familiarity with Power BI, Informatica Intelligent Cloud Services (IICS), and Tidal is an added advantage.
- Must be a proactive learner, eager to cross-skill and advance within our innovative team environment.

To qualify for the role, you must have a Databricks Associate Certification along with the Databricks, AWS, and programming skills listed above. An ITIL 4 Foundation Level Certification is a plus.

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
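
Much of this role is proactive monitoring. As one small, hedged example of a support script in that spirit, listing recent failed Step Functions executions for triage could look like this (the state machine ARN is a placeholder):

```python
# Triage helper: list recent FAILED executions of a Step Functions state
# machine. The ARN below is hypothetical.
import boto3

sfn = boto3.client("stepfunctions")

STATE_MACHINE_ARN = (
    "arn:aws:states:us-east-1:123456789012:stateMachine:example-etl"
)

resp = sfn.list_executions(
    stateMachineArn=STATE_MACHINE_ARN,
    statusFilter="FAILED",
    maxResults=20,
)

for execution in resp["executions"]:
    print(execution["name"], execution["startDate"], execution["status"])
```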

Posted 2 months ago

9 - 14 years

15 - 22 Lacs

Pune, Bengaluru, Hyderabad

Hybrid

Role & responsibilities
1. AWS Cloud services with Python and its frameworks, such as Django, on the backend.
2. Cloud: AWS services such as Lambda, DynamoDB, RDS, AppSync.
3. Experience working with RESTful APIs and/or GraphQL.
4. Good understanding of development best practices such as pair programming and TDD.
5. Work in an agile environment.
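
A minimal sketch combining two of the listed services, a Lambda handler reading an item from DynamoDB, is shown below; the table name and key schema are assumptions:

```python
# Hedged sketch: Lambda fetching an item from a hypothetical DynamoDB table.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-items")

def lambda_handler(event, context):
    item_id = (event.get("pathParameters") or {}).get("id")
    if not item_id:
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}

    resp = table.get_item(Key={"id": item_id})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```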

Posted 2 months ago

5 - 8 years

14 - 24 Lacs

Pune, Navi Mumbai, Bengaluru

Work from Office

Job Description: We are seeking a highly skilled Data Manager with strong coding skills in Python. This role involves supporting data preparation, curation, and ingestion, as well as pre-processing and post-processing activities. Experience in image data processing, particularly DICOM, is essential.

Key Responsibilities:
- Support data preparation, curation, and ingestion processes.
- Perform data pre-processing and post-processing activities.
- Handle image data processing, specifically DICOM.

Mandatory Skills:
- Data Engineering: 7+ years of experience, high usage.
- Python (big data programming): 7+ years of experience, high usage.
- DICOM (medical imaging): 6+ years of experience, high usage.

Optional Skills:
- AWS (cloud providers): 4+ years of experience, high usage.
- Life Sciences: 5+ years of experience, high usage.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related technical discipline.
- Proven experience in data engineering and management.
- Strong coding skills in Python.
- Experience with image data processing, particularly DICOM.
- Familiarity with AWS and life sciences is a plus.

Skills and Attributes:
- Thrives in dynamic, cross-functional team environments.
- Possesses a team-first mindset, valuing diverse perspectives and contributing to a collaborative work culture.
- Approaches challenges with a positive, can-do attitude.
- Willing to challenge the status quo and take appropriate risks to drive performance.
- A passionate problem solver with high learning agility.

Experience: 5 to 8 years
Location: Bangalore/Mumbai/Pune
Mandatory Skills: Python, AWS, Data Engineering, Big Data & DICOM (Medical Imaging)
Notice: Looking for immediate to 15-day joiners only.

If you are interested in the above job profile, please share your resume with manojkumar.sampathkumar@citiustech.com along with the details below:
Total experience on papers:
Current CTC:
Expected CTC:
Notice period:
Preferred location:
Availability for virtual interview on weekday/weekend:
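
For the DICOM work this role centres on, a common Python choice is pydicom; a small, illustrative read of a file (the path is hypothetical, and pixel_array requires NumPy) looks like this:

```python
# Read a DICOM file, inspect common header fields, extract pixel data.
import pydicom

ds = pydicom.dcmread("study/IMG0001.dcm")  # hypothetical path

# Header fields often used when curating imaging datasets.
print(ds.PatientID, ds.Modality, ds.StudyDate)

# Pixel data as a NumPy array, ready for pre-processing steps.
pixels = ds.pixel_array
print(pixels.shape, pixels.dtype)
```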

Posted 2 months ago

8 - 10 years

18 - 30 Lacs

Navi Mumbai, Bengaluru

Hybrid

Job Description: We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in API integration, Python, and AWS. The ideal candidate will have a passion for data engineering and a proven track record of developing robust data pipelines and platforms.

Key Responsibilities:
- Develop and maintain ETL/ELT data pipelines and API integrations (FastAPI preferred).
- Design and implement data platforms/products and data warehouses.
- Develop data-intensive solutions on AWS, Azure, or GCP for analytics workloads.
- Design both ETL/ELT processes for batch processing and data streaming architectures for real-time or near-real-time data ingestion and processing.
- Work with various database technologies (e.g., MySQL, PostgreSQL, MongoDB) and data warehouses (e.g., Redshift, BigQuery, Snowflake).
- Utilize cloud-based data engineering technologies (e.g., Kafka, Pub/Sub, Apache Airflow, Glue).
- Develop conceptual, logical, and physical data models using ERDs.
- Create dashboards and data visualizations using tools such as Tableau and QuickSight.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related technical discipline.
- 7+ years of hands-on experience in data engineering.
- 4+ years of experience in developing data-intensive solutions on AWS, Azure, or GCP.
- 3+ years of experience in designing ETL/ELT processes and data streaming architectures.
- 3+ years of experience with database technologies and data warehouses.
- 5+ years of programming experience in Python.
- Proficiency in dashboard/BI and data visualization tools (e.g., Tableau, QuickSight).

Skills and Attributes:
- Thrives in dynamic, cross-functional team environments.
- Possesses a team-first mindset, valuing diverse perspectives and contributing to a collaborative work culture.
- Approaches challenges with a positive, can-do attitude.
- Willing to challenge the status quo and take appropriate risks to drive performance.
- A passionate problem solver with high learning agility.

Experience: 8 to 10 years
Location: Bangalore/Mumbai
Mandatory Skills: Python, data pipelines, AWS/GCP, Kafka/Airflow
Notice: Looking for immediate to 15-day joiners only.

If you are interested in the above job profile, please share your resume with manojkumar.sampathkumar@citiustech.com along with the details below:
Total experience on papers:
Current CTC:
Expected CTC:
Notice period:
Preferred location:
Availability for virtual interview on weekday/weekend:
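
Since the posting calls out FastAPI for API integrations, a minimal service sketch is shown below; the endpoint and model fields are invented for illustration:

```python
# Minimal FastAPI ingestion endpoint sketch.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="ingest-api")

class Event(BaseModel):
    source: str
    payload: dict

@app.post("/events")
def ingest(event: Event):
    # A real implementation would push this to Kafka/SQS or a staging table.
    return {"accepted": True, "source": event.source}

# Run locally with: uvicorn main:app --reload
```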

Posted 2 months ago

3 - 5 years

10 - 16 Lacs

Navi Mumbai, Bengaluru

Work from Office

Job Description: We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in API integration, Python, and AWS. The ideal candidate will have a passion for data engineering and a proven track record of developing robust data pipelines and platforms.

Key Responsibilities:
- Develop and maintain ETL/ELT data pipelines and API integrations (FastAPI preferred).
- Design and implement data platforms/products and data warehouses.
- Develop data-intensive solutions on AWS, Azure, or GCP for analytics workloads.
- Design both ETL/ELT processes for batch processing and data streaming architectures for real-time or near-real-time data ingestion and processing.
- Work with various database technologies (e.g., MySQL, PostgreSQL, MongoDB) and data warehouses (e.g., Redshift, BigQuery, Snowflake).
- Utilize cloud-based data engineering technologies (e.g., Kafka, Pub/Sub, Apache Airflow, Glue).
- Develop conceptual, logical, and physical data models using ERDs.
- Create dashboards and data visualizations using tools such as Tableau and QuickSight.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related technical discipline.
- 3+ years of hands-on experience in data engineering.
- 2+ years of experience in developing data-intensive solutions on AWS, Azure, or GCP.
- 3+ years of experience in designing ETL/ELT processes and data streaming architectures.
- 2+ years of experience with database technologies and data warehouses.
- 3+ years of programming experience in Python.
- Proficiency in dashboard/BI and data visualization tools (e.g., Tableau, QuickSight).

Skills and Attributes:
- Thrives in dynamic, cross-functional team environments.
- Possesses a team-first mindset, valuing diverse perspectives and contributing to a collaborative work culture.
- Approaches challenges with a positive, can-do attitude.
- Willing to challenge the status quo and take appropriate risks to drive performance.
- A passionate problem solver with high learning agility.

Experience: 3 to 5 years
Location: Bangalore/Mumbai
Mandatory Skills: Python, data pipelines, AWS/GCP, Kafka/Airflow
Notice: Looking for immediate to 15-day joiners only.

If you are interested in the above job profile, please share your resume with manojkumar.sampathkumar@citiustech.com along with the details below:
Total experience on papers:
Current CTC:
Expected CTC:
Notice period:
Preferred location:
Availability for virtual interview on weekday/weekend:

Posted 2 months ago

9 - 14 years

20 - 30 Lacs

Bengaluru

Work from Office

(PySpark/NoSQL is mandatory)
1. Should be strong in PySpark.
2. Should have hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework.
3. Hands-on, working knowledge of Python.
4. Knowledge of AWS services like EMR, S3, Lambda, Step Functions, and Aurora RDS.
5. Good knowledge of RDBMS and SQL.
6. Should work as an individual contributor.

Good to have:
1. Experience converting large data sets from RDBMS to NoSQL.
2. Experience building data lakes and configuring Delta tables.
3. Good experience with compute and cost optimization.
4. Ability to understand the environment and use case and build holistic frameworks.

Soft skills:
1. Good communication skills to interact with IT stakeholders and the business, and the ability to understand pain points through to delivery.
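
Given the MWAA (Airflow) requirement, a small DAG sketch is included below; the DAG id, schedule, and the Spark-submission stub are assumptions for illustration:

```python
# Minimal Airflow DAG: a daily task that would trigger a PySpark job.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_spark_job():
    # Placeholder: submit an EMR step or call spark-submit here.
    print("submitting PySpark job...")

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_spark_job", python_callable=run_spark_job)
```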

Posted 2 months ago

3 - 7 years

10 - 20 Lacs

Pune

Work from Office

Job Description
We are looking for data engineers who have the right attitude, aptitude, skills, empathy, compassion, and hunger for learning. You will build products in the data analytics space. We value a passion for shipping high-quality data products, interest in the data products space, and curiosity about the bigger picture of building a company, product development, and its people.

Roles and Responsibilities
- Develop and manage robust ETL pipelines using Apache Spark (Scala).
- Understand Spark concepts, performance optimization techniques, and governance tools.
- Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform, and load data from various systems into the Enterprise Data Warehouse/Data Lake/Data Mesh.
- Collaborate cross-functionally to design effective data solutions.
- Implement data workflows utilizing AWS Step Functions for efficient orchestration.
- Leverage AWS Glue and Crawler for seamless data cataloging and automation.
- Monitor, troubleshoot, and optimize pipeline performance and data quality.
- Maintain high coding standards and produce thorough documentation.
- Contribute to high-level design (HLD) and low-level design (LLD) discussions.

Technical Skills
- Minimum 3 years of progressive experience building solutions in big data environments.
- Strong ability to build robust and resilient data pipelines that are scalable, fault-tolerant, and reliable in terms of data movement.
- 3+ years of hands-on expertise in Python, Spark, and Kafka.
- Strong command of AWS services like EMR, Redshift, Step Functions, AWS Glue, and AWS Crawler.
- Strong hands-on capabilities with SQL and NoSQL technologies.
- Sound understanding of data warehousing, modeling, and ETL concepts.
- Familiarity with high-level design (HLD) and low-level design (LLD) principles.
- Excellent written and verbal communication skills.

Posted 2 months ago

4 - 5 years

15 - 25 Lacs

Bengaluru

Work from Office

Preferred candidate profile
- 4-6 years of professional experience designing, building, and maintaining highly available data and analytics platforms.
- 3+ years of experience in data engineering, with a focus on building large-scale data processing systems.
- Hands-on experience with AWS or a similar cloud platform, building data engineering solutions for analytics and science (2+ years).
- Must have experience building complex data pipelines, batch and/or real-time event-based processing (2+ years).
- Strong experience designing, building, and maintaining a data warehouse in Redshift or similar cloud-based solutions (2+ years).
- Experience in Matillion or a similar ETL/ELT tool for developing data ingestion and curation flows (2+ years).
- Must have strong hands-on experience in SQL (2+ years).
- Strong hands-on experience in modern scripting languages using Python (2+ years).
- Experience building complex ETL using Spark (Scala or Python) for event-based big data processing (1+ years).
- Strong hands-on experience with NoSQL DBs: MongoDB, Cassandra, or DynamoDB (1+ years).

Posted 2 months ago

5 - 10 years

10 - 20 Lacs

Chennai, Pune, Bengaluru

Work from Office

Roles and Responsibilities
- Design, deploy, and manage scalable cloud infrastructure on AWS using ECS, Lambda, and other services.
- Collaborate with cross-functional teams to identify areas for improvement in existing systems and implement changes using DevOps practices.
- Develop automation scripts using Python to streamline deployment processes and improve efficiency.
- Ensure high availability of applications by implementing load balancing strategies and monitoring system performance.
- Troubleshoot issues related to containerized applications deployed on ECS.

Desired Candidate Profile
- 5-10 years of experience in DevOps engineering with expertise in the AWS ecosystem (ECS, Lambda).
- Strong understanding of containerization concepts using Docker and orchestration tools like Kubernetes or ECS.
- Proficiency in scripting languages such as Python for automating tasks.
- Experience working with CI/CD pipelines using Jenkins or similar tools.
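
As an example of the Python automation this role describes, a short boto3 script that forces a new ECS deployment and waits for the service to stabilize might look like this (cluster and service names are hypothetical):

```python
# Force a fresh ECS deployment, then block until the service is stable.
import boto3

ecs = boto3.client("ecs")

ecs.update_service(
    cluster="example-cluster",
    service="web-api",
    forceNewDeployment=True,
)

waiter = ecs.get_waiter("services_stable")
waiter.wait(cluster="example-cluster", services=["web-api"])
print("deployment complete and service stable")
```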

Posted 2 months ago

3 - 5 years

9 - 16 Lacs

Coimbatore

Work from Office

We are looking for an experienced and skilled AWS developer who can integrate and build AWS-based compute, data, and systems. The candidate should be able to maintain, design, enhance, and develop the cloud infrastructure of our web applications, and must possess a proper understanding of AWS services and ensure the application of best practices regarding scalability and security. You will also be responsible for overseeing, implementing, and identifying maintenance needs of the AWS architecture, and should come up with quick, effective suggestions to improve or change the application infrastructure. The candidate must know how to document and define the best strategies and practices for application infrastructure and deployment maintenance.

Role & responsibilities
- Understand the current application infrastructure and suggest changes to it.
- Define and document best practices and strategies regarding application deployment and infrastructure maintenance.
- Migrate our infrastructure with zero downtime to a highly available, scalable one.
- Set up a monitoring stack.
- Define service capacity planning strategies.
- Implement the application's CI/CD pipeline using the AWS CI/CD stack.
- Write infrastructure as code using CloudFormation or similar.

Preferred candidate profile
- Strong expertise in AWS services like EKS, ECS, EC2, S3, Lambda, RDS, VPC, IAM, CloudFormation, and CloudWatch.
- Hands-on experience with Infrastructure as Code (IaC) using Terraform or AWS CloudFormation.
- Proficiency in programming languages like C# or Node.js.
- Experience with containerization (Docker, Kubernetes, or AWS Fargate).
- Knowledge of CI/CD tools like Jenkins, GitHub Actions, and AWS CodePipeline.
- Strong understanding of networking concepts and security best practices in AWS.
- Experience with monitoring tools like AWS CloudWatch, Prometheus, or Grafana.

Posted 2 months ago

5 - 10 years

8 - 15 Lacs

Pune

Work from Office

Job Title: Senior DevOps Engineer
Work Experience: 6+ years
Location: Pune

About Us: InCred Money was launched in May 2023 with the acquisition of Orowealth, one of the pioneers of the digital wealth ecosystem and a leading investment platform with an AuM of 1,150+ Cr. Under the InCred fold, InCred Money has significant competitive advantages: access to a deep network of issuers and industry-leading credit and risk assessment capabilities. In addition to its fintech capabilities, InCred Money also has the support of a large financial services organization. This will greatly help in the journey of making alternate assets easy, trustworthy, and lucrative for the end investor.

Job Description: We are seeking a skilled and experienced DevOps Engineer to join our dynamic team. With 6+ years of experience in the field, the ideal candidate will be responsible for automating and optimizing our operations and processes, and building and maintaining tools for deployment, monitoring, and operations. They will also troubleshoot and resolve issues in our dev, test, and production environments. The role demands a proactive individual who can work collaboratively with software development teams to enable continuous integration and continuous delivery and to ensure high availability and reliability of our applications.

Responsibilities:
- Design, develop, and implement software integrations based on user feedback.
- Implement automation tools and frameworks (CI/CD pipelines).
- Collaborate with team members to improve the company's engineering tools, systems, procedures, and data security.
- Conduct systems tests for security, performance, and availability.
- Develop and maintain design and troubleshooting documentation.
- Participate in strategic project-planning meetings.
- Provide guidance and expertise on system options, risk, impact, and costs vs. benefits.
- Create and maintain detailed documentation of configurations and processes.
- Engage in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement.
- Support services before they go live through activities such as system design consulting, developing software platforms and frameworks, capacity planning, and launch reviews.
- Maintain services once they are live by measuring and monitoring availability, latency, and overall system health.

Qualifications:
- Bachelor's degree in computer science, engineering, or a relevant field.
- 6+ years of experience as a DevOps Engineer or in an equivalent software-engineering role.
- Expertise in code deployment tools (Argo CD, Puppet, Ansible, and Chef).
- Experience in network, server, and application-status monitoring.
- Strong command of software-automation production systems (Jenkins and Selenium).
- Expertise in software development methodologies.
- Working knowledge of JavaScript and common DevOps tools like Git and GitHub.
- Working knowledge of databases and SQL.
- Problem-solving attitude and a collaborative team spirit.

Posted 3 months ago

8 - 12 years

19 - 30 Lacs

Bengaluru

Work from Office

Senior AWS Developer
Location: Bangalore, India

General Purpose
We are looking to hire a Senior AWS Developer with excellent technical and communication skills to effectively collaborate with Digital, IT, and business stakeholders to understand their needs, and to be responsible for the development and implementation of integration components across the Digital and Retail engineering team. This role is responsible for integrations between systems such as back-end and front-end web applications, warehouse management, order management, point of sale, ERP, and CRM. The focus of this role will be to develop a microservices architecture that implements a consistent and modern integration framework using cloud providers such as AWS. You will participate in all phases of the software development lifecycle, apply advanced knowledge of best practices to drive consistency and reliability of the systems, and help create an environment open to sharing knowledge and learning from one another across the digital engineering team. This is a hands-on technical role in a public-facing, enterprise-class digital ecosystem, working to complete complex projects and solve technical problems across systems. The role transcends organizational and geographical boundaries as it aims at supporting and enabling the various divisions of the Herman Miller business across the globe. The ideal candidate should understand the software development lifecycle and use agile methodology to design, develop, test, and implement solutions that deliver on end-user needs.

Essential functions
• Assists in defining best practices within the functional area.
• Assists in the definition of technical requirements and solutions to drive projects through all phases of the software development lifecycle.
• Builds and maintains good, positive relationships with both internal and external partners.
• Conceptualizes and executes solutions to complex problems.
• Creates and maintains technical support documentation.
• Develops and actively supplies implementations, service definitions, component interfaces, and experimental prototypes to the engineering teams.
• Develops architectures and integrations that are inherently secure, robust, scalable, modular, API-centric, and global.
• Ensures proper integration of system components and performs hands-on technical tasks such as systems integration, performance tuning, and troubleshooting.
• Identifies opportunities and improvements to technologies across the functional area.
• Maintains and supports existing applications, including functional enhancements and issue resolution.
• Is part of the team responsible for the design, development, and implementation of a modern integration platform for the global Retail and Digital Systems architecture.
• Shares a passion for creating an environment of shared knowledge to grow and mentor others and continuously improve technical capabilities.
• Writes unit and integration tests.
• Performs additional responsibilities as requested to achieve business objectives.

Qualifications
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Education / Experience
• Bachelor's degree in Computer Science, Information Systems, or a relevant academic discipline.
• Minimum 6 and up to 10+ years of professional technology experience in integration and/or application development.

Skills and Abilities
• Thorough understanding of AWS cloud services including Lambda, API Gateway, SQS, SNS, Cognito, DynamoDB, Step Functions, S3, EventBridge, AppSync, IAM, and AWS Transfer.
• Demonstrated ability to use Terraform to create infrastructure on AWS for components such as API Gateway, SNS, CloudWatch, SQS, and EventBridge.
• Understanding of Serverless Framework technology used to create Lambda functions, IAM roles, AppSync, and other AWS application resources.
• Thorough understanding of Node.js and/or Python.
• Thorough understanding of microservices architecture and best practices, and how technology solutions can support them.
• An active researcher with a comprehensive understanding of, and ability to strategically connect, the needs of multiple HMI functional areas.
• Active participant in the technical development of frameworks within the functional area; will be called on to lead architecture tasks and development.
• Experience in designing, developing, and integrating solutions in a digital and retail environment.
• Experience integrating with systems such as Commerce, CMS, CRM, POS, WMS, OMS, and ERP.
• Strong experience working with web services and APIs.
• Proficient in retail, commerce, and order processing systems, including experience with multiple Salesforce Clouds (Sales, Service, Commerce, Marketing).
• Experience with microservices, distributed systems, and Integration Platform as a Service (iPaaS).
• Experience developing integrations, API endpoint management, and programming against protocols/frameworks such as REST, gRPC, OpenAPI, and GraphQL.
• Familiarity with API management and tools/frameworks such as Boomi and AWS API Gateway.
• Experience with headless architectures.
• Experience with solution design and creating technical documentation such as blueprints and diagrams.
• Strong verbal and written communication skills, with an ability to communicate well virtually with fellow developers and business partners across physical office locations.
• Strong analysis skills and the ability to translate business needs into technical solutions.
• Experience working on an Agile development team, preferably using Scrum.
• Active participation through all phases of the development lifecycle.
• Excellent written and verbal communication and collaboration skills.
• Self-driven, motivated, and result-oriented, with the aptitude to independently learn new technologies.
• Strong organizational skills to deal with a varied workload and be responsive to the needs of the business.
• Understands the necessity of, and contributes to, efficient coding standards.
• Demonstrated ability to influence and consult (providing options with pros, cons, and risks) around all key technical decisions during project delivery.
• Ability to effectively use the office automation, communication, software, and tools currently used in the HMI office environment.
• Must be able to perform all essential functions of the position with or without reasonable accommodations, and with or without supervision.
• This role will work shift timings of 12:00 P.M. to 9:00 P.M. or 2:00 P.M. to 11:00 P.M. IST. Employees may be requested to work a different shift on rare occasions to support the business during a critical issue or for scheduled releases/migrations.

Reporting structure
• Locally reporting to the Working Team Lead in India.
• Matrix reporting to the WTL in the US.

Herman Miller is an equal opportunity employer.

Posted 3 months ago

5 - 10 years

15 - 25 Lacs

Bengaluru

Hybrid

Hiring for one of our Big4 clients.

Job Title: Senior AWS Data Engineer
Location: Bangalore
Job Type: Full-Time

Overview: We are looking for an experienced Senior AWS Data Engineer (P1) with 5+ years of hands-on experience in building scalable, reliable data pipelines and data engineering solutions using AWS services. The ideal candidate will have expertise in AWS Glue, Lambda, Redshift, Step Functions, and CloudWatch, plus advanced Python/PySpark skills. You will be responsible for designing, implementing, and optimizing ETL/ELT pipelines, managing data integration workflows, and ensuring high-performance data solutions on AWS. This role requires deep technical expertise in cloud-based data platforms, excellent problem-solving skills, and the ability to lead complex data projects while working collaboratively with business stakeholders.

Key Responsibilities:
- Lead ETL/ELT pipeline development: architect, design, and implement advanced ETL/ELT pipelines using AWS Glue, Lambda, PySpark, and SQL, ensuring efficient data integration and transformation.
- Data warehousing and optimization: optimize Amazon Redshift performance, design schema structures, and develop efficient data models to handle large-scale data workloads.
- Orchestration and workflow automation: use AWS Step Functions, Airflow, and CloudWatch for orchestrating data workflows, automating tasks, and ensuring smooth pipeline operations.
- Cloud services integration: leverage a broad set of AWS services, including API Gateway, S3, SQS, SNS, SES, DMS, CloudFormation, CDK, and IAM, to integrate various data sources, manage permissions, and automate data processes.
- Technical leadership: provide guidance and mentorship to junior engineers, help them grow their technical skills, and ensure best practices for coding, testing, and deployment.
- Solution design and development: work closely with business analysts and product owners to translate functional requirements into high-performance, scalable technical solutions on AWS.
- Data quality and monitoring: utilize CloudWatch for monitoring data pipelines, ensure optimal performance, and troubleshoot issues in production environments.
- Security and compliance: implement best practices for data security and access control using IAM, and ensure compliance with data governance and regulatory requirements.
- Documentation and process standardization: create comprehensive technical documentation, including system designs, data models, and pipeline configurations, and standardize best practices across the team.

Primary Skills Required:
- AWS services expertise: extensive experience with Glue, Lambda, Step Functions, Redshift, S3, API Gateway, SQS, SNS, SES, DMS, CloudFormation, CDK, IAM, and VPC.
- Programming: strong proficiency in Python, PySpark, and SQL, with hands-on experience developing data transformation scripts and automation for large-scale data processing.
- ETL/ELT pipelines: proven experience designing and implementing ETL/ELT pipelines using AWS Glue, Lambda, and other AWS services.
- Data warehousing: expertise in Amazon Redshift, including schema design, query optimization, performance tuning, and data integration.
- Orchestration and workflow management: advanced experience with AWS Step Functions, Airflow, and CloudWatch to manage and orchestrate data workflows, monitor pipeline health, and ensure process efficiency.
- Cloud infrastructure and automation: strong experience with CloudFormation and CDK for infrastructure automation and managing resources in AWS.
- Security and permissions management: deep knowledge of IAM for managing security and access control, ensuring secure data operations.
- Troubleshooting and debugging: expertise in monitoring data pipelines using CloudWatch, identifying bottlenecks, and resolving issues in data processes.

Additional Skills (Nice to Have):
- Experience with Database Migration Service (DMS) for database replication and migration.
- Familiarity with Airflow or other orchestration frameworks for data workflows.
- Strong understanding of data governance and compliance standards in cloud environments.
- Knowledge of Agile development methodologies and proficiency with Git for version control.
- AWS certifications in relevant areas (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics).

Education and Experience:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 5+ years of experience as a Data Engineer or Cloud Engineer with a focus on AWS services, data engineering, and building ETL/ELT pipelines.

Note: Immediate joiners, candidates currently serving notice, or candidates who can join within 30 days.

Posted 3 months ago

8 - 11 years

25 - 30 Lacs

Bengaluru

Work from Office

Role & responsibilities
- Experience: 6-9 years as a Data Engineer with a strong focus on PySpark and large-scale data processing.
- PySpark expertise: proficient in writing optimized PySpark code, including working with DataFrames and Spark SQL and performing complex transformations.
- AWS cloud proficiency: fair experience with core AWS services such as S3, Glue, EMR, Lambda, and Redshift, with the ability to manage and optimize data workflows on AWS.
- Performance optimization: proven ability to optimize PySpark jobs for performance, including experience with partitioning, caching, and handling skewed data.
- Problem-solving skills: strong analytical and problem-solving skills, with a focus on troubleshooting data issues and optimizing performance in distributed environments.
- Communication and collaboration: excellent communication skills to work effectively with cross-functional teams and clearly document technical processes.

Added advantage
- AWS Glue ETL: hands-on experience with AWS Glue ETL jobs, including creating and managing workflows, handling job bookmarks, and implementing transformations.
- Database: good working knowledge of a data warehouse like Redshift.
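
The tuning techniques named here (partitioning, caching, skew handling) can be sketched briefly; the paths and columns below are invented for illustration:

```python
# PySpark tuning sketch: repartition, cache a reused DataFrame, and use a
# broadcast hint to avoid a shuffle-heavy, skew-prone join.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

events = spark.read.parquet("s3://example/events/")        # large, skewed
countries = spark.read.parquet("s3://example/countries/")  # small dimension

# Cache a DataFrame that is reused by several downstream aggregations.
events = events.repartition(200, "event_date").cache()

# Broadcasting the small side keeps the join local to each executor.
joined = events.join(broadcast(countries), on="country_code", how="left")

joined.groupBy("country_name").count().show()
```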

Posted 3 months ago

5 - 10 years

15 - 30 Lacs

Hyderabad

Work from Office

Job Description

Key Responsibilities:
Full-Stack Development: Design, develop, and maintain web applications using .NET technologies, including .NET Core and ASP.NET Web API. Build and maintain front-end applications using Angular 10+. Implement responsive and user-friendly UI features, ensuring a seamless user experience across devices.
Cloud Development and Management: Utilize AWS services (S3, Lambda, EC2, CloudWatch) for hosting, deployment, and monitoring of applications. Work with AWS services for automation, infrastructure management, and scaling solutions.
Backend Development & API Design: Develop robust backend APIs using .NET 4.6.1 / .NET Core 3, ensuring high performance and security. Integrate third-party APIs and services into applications, ensuring scalability and reliability.
Code Quality & CI/CD: Implement best practices for code quality and standards. Use tools like SonarQube to keep the code free of errors and maintain high quality standards. Work with Jenkins for continuous integration and continuous deployment (CI/CD), ensuring smooth deployments and minimal downtime.
Collaboration and Agile Practices: Collaborate effectively with cross-functional teams, including designers, product managers, and other developers. Use Agile methodologies for efficient development, and actively participate in sprint planning, stand-ups, and retrospectives. Track and manage tasks using Jira, ensuring all tasks are completed on time and according to project requirements.
Version Control & Docker: Manage source code and collaborate with the team using Git for version control. Use Docker for containerization and deployment, ensuring consistent environments across development, staging, and production.

Required Skills & Qualifications:
- Experience in full-stack development: proven experience as a full-stack developer using .NET technologies (e.g., .NET 4.6.1, .NET Core 3, ASP.NET Web API 2).
- Frontend technologies: strong hands-on experience with Angular 10+ and other front-end technologies.
- Cloud technologies: hands-on experience working with AWS services such as S3, Lambda, CloudWatch, and EC2.
- CI/CD & code quality: experience with tools like Jenkins, SonarQube, and other DevOps practices.
- Version control & collaboration tools: experience using Git for version control and Jira for task tracking.
- Containerization: knowledge of Docker for creating and managing containerized applications.
- Strong problem-solving skills: ability to troubleshoot, debug, and optimize both front-end and back-end issues.
- Team player: strong communication skills and the ability to collaborate effectively in a team-oriented environment.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience with other cloud platforms or services is a plus.
- Familiarity with Agile methodologies and Scrum practices.
- Familiarity with additional tools such as Kubernetes, Terraform, or other infrastructure automation tools.

Why Join Us?
- Competitive salary and benefits.
- Opportunity to work with cutting-edge technologies in a dynamic and innovative team.
- Career growth opportunities with mentorship and training programs.
- A collaborative work environment where your input is valued and your contributions make a difference.

Job Title: .NET AWS Developer
Mandatory Skills: .NET Core, AWS (Lambda, Amazon EC2), lambda expressions, ASP.NET Core, Web API, Angular
Desirable Skills: .NET full stack, C#, ASP.NET, AWS Cloud

If you feel the JD is suitable, please share your resume to spandana.tudi@qentelli.com

Posted 3 months ago

6 - 7 years

16 - 20 Lacs

Bengaluru

Work from Office

Role & responsibilities:
- Maintaining the stability and security of our applications hosted in AWS.
- Developing and maintaining CDK scripts; an initial task is to safely migrate an existing legacy environment to CDK.
- Working with and mentoring the junior DevOps engineer who has experience of our systems.
- Working independently with a remote team.
- Project management.
- A high level of English and good communication skills are required.

Deep experience of the following in AWS is required:
- Security
- Scalable architecture
- Docker / containers
- Networking
- Monitoring tools
- Performance analysis

Experience with or familiarity with the following tools and technologies:
- CDK (Golang is highly desirable)
- ECS Fargate
- Lambda
- RDS Aurora / PostgreSQL
- EC2, S3, IAM, Route 53
- SMS/SNS/SES
- CloudWatch
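
The team prefers CDK, ideally in Go; purely for illustration, and in Python since that is the dominant language across these listings, a minimal CDK v2 stack might look like the following (all resource names are hypothetical):

```python
# Minimal AWS CDK v2 sketch: one stack with an S3 bucket and a tiny Lambda.
import aws_cdk as cdk
from aws_cdk import aws_lambda as lambda_, aws_s3 as s3
from constructs import Construct

class AppStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned bucket for application data.
        s3.Bucket(self, "DataBucket", versioned=True)

        # Small inline Lambda, just to show how resources compose.
        lambda_.Function(
            self, "Handler",
            runtime=lambda_.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=lambda_.Code.from_inline("def handler(event, ctx):\n    return 'ok'"),
        )

app = cdk.App()
AppStack(app, "AppStack")
app.synth()
```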

Posted 3 months ago

7 - 12 years

0 - 0 Lacs

Pune, Bengaluru, Hyderabad

Hybrid

Hiring for a top MNC (long-term contract).

The candidate must be able to create a variety of infrastructure as code (IaC) on AWS that will be leveraged by data engineering teams, using tools like CloudFormation (CFM) to create the IaC and to deploy and operate the platform based on requirements from the data engineers.

2 FTEs; 5+ years of relevant experience as a senior platform engineer.
- Ability to lead a small team (a "two-pizza" squad).
- Ability to create scalable and stable serverless architectures/designs.
- Expert in Python (object-oriented) development.
- Expert in writing Python unit tests.
- Extensive use of the following AWS services: S3, Lambda, Glue, SQS, IAM, DynamoDB, CloudWatch, EventBridge, Step Functions, EMR (incl. serverless), Redshift (incl. serverless), and API Gateway and/or AppSync; optionally AWS Lake Formation, DMS, DataSync, and AppFlow.
- Fluent with REST APIs.
- Experience with CFM and CDK.
- Knowledge of data engineering principles using the services above is important.
- Previous experience with Azure AD and OAuth2.
- Previous experience in BDD (behavior-driven development) testing or integration testing.
- Optionally, previous experience with Node.js/React development.
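
For the "expert in writing Python unit tests" requirement, a small pytest example against a hypothetical Lambda handler, runnable entirely offline, is sketched below:

```python
# pytest sketch: unit tests for a tiny, hypothetical Lambda handler.
import json

def lambda_handler(event, context):
    # Hypothetical handler under test: echoes back a name from the body.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"greeting": f"hello {name}"})}

def test_lambda_handler_greets_by_name():
    event = {"body": json.dumps({"name": "ada"})}
    resp = lambda_handler(event, context=None)
    assert resp["statusCode"] == 200
    assert json.loads(resp["body"]) == {"greeting": "hello ada"}

def test_lambda_handler_defaults_to_world():
    resp = lambda_handler({"body": None}, context=None)
    assert json.loads(resp["body"])["greeting"] == "hello world"
```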

Posted 3 months ago

7 - 9 years

0 Lacs

Pune

Hybrid

- AWS Services: In-depth knowledge of AWS services including EC2, S3, RDS, Lambda, ACM, SSM, and IAM. Experience with Kubernetes (EKS) and Elastic Container Service (ECS) for orchestration and deployment of microservices; engineers are expected to execute upgrades independently.
- Cloud Architecture: Proficient knowledge of AWS advanced networking services, including CloudFront and Transit Gateway.
- Monitoring & Logging: Knowledge of AWS CloudWatch, CloudTrail, OpenSearch, and Grafana monitoring tools.
- Security Best Practices: Understanding of AWS security features and compliance standards.
- API: Relevant REST API / OneAPI experience is mandatory.
- Infrastructure as Code (IaC): Proficient in AWS CloudFormation and Terraform for automated provisioning.
- Scripting Languages: Proficient in common languages (PowerShell, Python, and Bash) for automation tasks.
- CI/CD Pipelines: Familiar with tools like Azure DevOps Pipelines for automated testing and deployment.
- Relevant Experience: A minimum of 4-5 years' experience in a comparable cloud engineer role.

Nice to Have:
- Knowledge of, or hands-on experience with, Azure services.
- Agile Frameworks: Proficient knowledge of Agile ways of working (Scrum, SAFe).
- Certification: for AWS, at least Certified Cloud Practitioner + Certified Solutions Architect Associate + Certified Solutions Architect Professional; for Azure, at least Microsoft Certified: Azure Solutions Architect Expert.
- Mindset: Platform engineers must focus on automating activities where possible, to ensure stability, reliability, and predictability.

Posted 3 months ago

3 - 7 years

15 - 25 Lacs

Bengaluru

Work from Office

We are seeking a skilled AWS Cloud Developer to join our growing team. In this role, you will be responsible for designing, developing, and maintaining cloud-based applications and services using Amazon Web Services (AWS). You will work closely with cross-functional teams to define requirements and ensure seamless integration of cloud solutions into our existing architecture.

Responsibilities:
- Work with the AWS lead on the design of architectures for new capabilities hosted on our AWS platform.
- Design and develop microservices using Node.js; previous experience with Python following OOP is a plus.
- Evaluate product requirements for operational feasibility and create detailed specifications based on user stories.
- Write clean, efficient, high-quality, secure, testable, and maintainable code based on specifications.
- Coordinate with stakeholders (Product Owner, Scrum Master, Architect, Quality, and DevOps teams) to ensure successful execution of the project.
- Troubleshoot and resolve issues related to the infrastructure.
- Ensure best practices are followed in cloud services, focusing on scalability, maintainability, and security.
- Keep abreast of the latest advancements in AWS cloud technologies and trends to recommend process improvements and technology upgrades.
- Mentor and provide guidance to junior team members, fostering a culture of continuous learning and innovation.
- Participate in architecture review board meetings and make strategic recommendations on the choice of services.

Requirements:
- 3+ years' experience working with Node.js along with AWS cloud services, or in a similar role.
- Master's degree in computer science with a focus on cloud/data or equivalent (or a Bachelor's degree with more years of experience).
- Comprehensive knowledge and hands-on experience with Node.js and AWS cloud services, especially Lambda, Serverless, API Gateway, SQS, SNS, SES, DynamoDB, and CloudWatch.
- Knowledge of best practices in Python is a plus.
- Knowledge of branching and version control systems like Git (mandatory).
- Experience with IaC tools such as Terraform and/or CloudFormation is a plus.
- Excellent collaboration skills.
- A desire for continuous learning and staying updated with emerging technologies.
- Due to the nature of this position sitting on a global team, fluent English communication skills (written and spoken) are required.
- Strong interpersonal skills, with the ability to communicate and convince at various levels of the organization and in a multicultural environment.
- Ability to effectively multi-task and manage priorities.
- Strong analytical and synthesis skills.
- Initiative to uncover and solve problems proactively.
- Ability to understand complex software development environments.

Posted 3 months ago
