
56 Apache Kafka Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 15.0 years

12 - 18 Lacs

Maharashtra

Work from Office


Staff Software Engineers are the technology leaders of our highest-impact projects. Your high energy is contagious, you actively collaborate with others across the engineering organization, and you seek to learn as much as you like to teach. You personify the notion of constant improvement as you work with your team and the larger engineering group to build software that delivers on our mission. You use your extraordinary technical competence to ensure a high bar for excellence while you mentor other engineers on their own path towards craftsmanship. You are most likely T-shaped, with broad knowledge across many technologies plus strong skills in a specific area. Staff Software Engineers embrace the opportunity to represent HMH in industry groups and open-source communities.

Area of Responsibility: You will be working on the HMH Assessment Platform, part of the HMH Educational Online/Digital Learning Platform. The Assessment team builds a highly scalable and available platform. The platform is built using a microservices architecture: a Java microservices backend, React JavaScript UI frontend, REST APIs, a Postgres database, AWS cloud technologies, AWS Kafka, Kubernetes or Mesos orchestration, DataDog for logging/monitoring/alerting, Concourse CI or Jenkins, Maven, etc.

Responsibilities: Be the technical lead for feature development in a team of 5-10 engineers, influencing the technical direction of the overall engineering organization. Decompose business objectives into valuable, incrementally releasable user features, accurately estimating the effort to complete each. Contribute code to feature development efforts, demonstrating to others efficient design, delivery, and testing patterns and techniques. Strive for high-quality outcomes; continuously look for ways to improve team productivity and product reliability, performance, and security. Develop the talents and abilities of peers and colleagues. Create a memorable legacy as you progress toward your personal and professional objectives. Foster your personal and professional development, continually seeking assignments that challenge you.

Skills & Experience: Successful candidates must demonstrate an appropriate combination of: 10+ years of experience as a software engineer. 3+ years of experience as a Staff or lead software engineer. Bachelor's degree in computer science or a STEM field. A portfolio of thought leadership and individual technical accomplishments. Full understanding of Agile software development methodologies and practices. Strong communication skills, both verbal and written.
Extensive experience working with technologies and concepts such as: behavior-driven or test-driven development; JVM-based languages such as Java and Scala; development frameworks such as Spring Boot; asynchronous programming concepts, including event processing; database technologies such as SQL, Postgres/MySQL, AWS Aurora, Redshift, Liquibase or Flyway; NoSQL technologies such as Redis, MongoDB and Cassandra; streaming technologies such as Apache Kafka, Apache Spark or Amazon Kinesis; unit-testing frameworks such as JUnit; performance testing frameworks such as Gatling; architectural concepts such as microservices and separation of concerns; expert knowledge of class-based, object-oriented programming and design patterns; development tools such as GitHub, Jira, Jenkins, Concourse, and Maven; cloud technologies such as AWS and Azure; data center operating technologies such as Kubernetes, Apache Mesos, Apache Aurora, and Terraform, and container services such as Docker and Kubernetes; monitoring and operational data analysis practices and tools such as DataDog, Splunk and ELK.
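For context on the event-processing and streaming work this listing describes, the following is a minimal sketch of a Kafka consumer loop using the plain Apache Kafka Java client. It is an illustrative, assumption-based example, not code from the employer; the broker address, topic name, and group id are invented.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class AssessmentEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "assessment-service");      // hypothetical group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit manually after processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("assessment-events")); // hypothetical topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // A real service would deserialize the payload and update its own store.
                    System.out.printf("key=%s value=%s partition=%d%n",
                            record.key(), record.value(), record.partition());
                }
                consumer.commitSync(); // commit after the batch: at-least-once processing
            }
        }
    }
}
```

Disabling auto-commit and committing only after the batch is processed gives at-least-once semantics, a common starting point for the kind of event-driven services described above.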

Posted 1 week ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Noida

Work from Office


Java Technical Lead - Enterprise Solutions & AI Integration Specialist

About the Role: We are seeking an experienced Senior Java Developer to join our team in building modern enterprise applications with AI capabilities. You will work on mission-critical systems involving real-time data processing, automated workflows, and intelligent business solutions. This role offers the opportunity to work with cutting-edge technologies while developing scalable, cloud-native applications.

Required Experience: 5+ years of experience in Java development. 5+ years of experience with Spring Boot / Spring Cloud and microservices. Strong experience with: Java 11/17, Spring Boot / Spring Cloud, Apache Kafka, containerization (Docker, Kubernetes), AI/ML integration.

Key Responsibilities: Design and implement scalable microservices using Spring Boot. Build robust error-processing pipelines using Apache Kafka. Integrate with the Zendesk API for automated ticket management. Implement AI-powered error classification and resolution. Create and maintain CI/CD pipelines. Write clean, maintainable, and well-tested code. Mentor junior developers and conduct code reviews.

Technical Skills Required: Java 11/17, Spring Boot 3.x, Apache Kafka, Docker & Kubernetes, Maven/Gradle, JUnit/Mockito, Git, Jenkins. AI/ML: experience with AI/ML frameworks, integration with AI services, ML model deployment, natural language processing.

Required Qualifications: Bachelor's/Master's degree in Computer Science or a related field. Strong understanding of distributed systems. Experience with high-throughput message processing. Solid understanding of RESTful architecture. Experience with agile development methodologies.

Preferred Qualifications: Experience with Zendesk API integration. Knowledge of error management systems. Experience with AI/ML model integration. Understanding of ITIL practices. Experience with cloud platforms (AWS/Azure/GCP).
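As a hedged illustration of the "error processing pipeline" duty above, one way such a consumer might look with spring-kafka is sketched below. It assumes the spring-kafka dependency and broker settings are configured elsewhere; the topic names and the keyword-based "classification" placeholder are hypothetical, not the employer's design.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class ErrorEventListener {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public ErrorEventListener(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Consumes raw error events; topic and group id are invented for the example.
    @KafkaListener(topics = "application-errors", groupId = "error-classifier")
    public void onError(String payload) {
        // A real implementation might call an ML model or external AI service here;
        // this placeholder just routes by a keyword.
        String category = payload.contains("timeout") ? "INFRA" : "APPLICATION";
        kafkaTemplate.send("classified-errors", category, payload);
    }
}
```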

Posted 1 week ago

Apply

8.0 - 10.0 years

35 - 40 Lacs

Bengaluru

Work from Office


Job Responsibilities: Collaborates with Product and Engineering stakeholders to design and build platform services that meet key product and infrastructure requirements. Produces detailed designs for platform-level services. Must be able to evaluate software and products against business requirements and turn business requirements into robust technical solutions fitting into corporate standards and strategy. Designs and implements microservices with thoughtfully defined APIs. Should be conversant with frameworks and architectures - Spring Boot, Spring Cloud, Spring Batch, messaging frameworks (like Kafka), microservice architecture. Works with other areas of the technology team to realize the end-to-end solution and estimation for delivery proposals. Sound understanding of Java concepts and of the technologies in the various architecture tiers - presentation, middleware, data access and integration - to propose solutions using Java/open-source technologies. Designs modules that are scalable, reusable, modular, and secure. Clearly communicates design decisions, roadblocks and timelines to key stakeholders. Adheres to all industry best practices and standards for Agile/Scrum frameworks adopted by the organization, including but not limited to daily stand-ups, grooming, planning, retrospectives, sprint reviews, demos, and analytics via systems (JIRA) administration, to directly support initiatives set by Product Management and the organization at large. Actively participates in production stabilization and leads system software improvements along with team members.

Technical Skills: Candidates should have at least 8+ years of total experience in IT software development/design architecture. 3+ years of experience as an Architect building distributed, highly available and scalable, microservice-based cloud-native architecture. Experience in one or more open-source Java frameworks such as Spring Boot, Spring Batch, Quartz, Spring Cloud, Spring Security, BPM, etc. Experience in a single-page web application framework like Angular. Experience with at least one messaging system (Apache Kafka required; RabbitMQ). Experience with at least one RDBMS (MySQL, PostgreSQL, Oracle). Experience with at least one document-oriented DB (MongoDB, preferably Couchbase DB). Experience with a NoSQL DB like Elasticsearch. Proficient in creating design documents - LLD documents with UML. Good exposure to design patterns, microservices architecture design patterns and 12-factor applications. Experience working with observability/monitoring frameworks (Prometheus/Grafana, ELK) along with any APM tool. Ability to conceptualize end-to-end system components across a wide range of technologies and translate them into architectural design patterns for implementation. Knowledge of security systems like OAuth 2, Keycloak and SAML. Familiarity with source code version control systems like Git/SVN. Experience using, designing, and building REST/gRPC/GraphQL/web service APIs. Production experience with container orchestration (Docker, Kubernetes, CI/CD) and maintaining production environments. Good understanding of public clouds - GCP, AWS, etc. Good exposure to API gateways and config servers. Familiar with OWASP. Experience in Telecom BSS (Business Support System) for CRM components is an added advantage. Immediate joiner/30 days.
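As a neutral illustration of the messaging-platform work an architect in such a role might automate (not code from this posting), the sketch below provisions a topic with explicit partition, replication, and retention settings using the Kafka AdminClient; the broker address, topic name, and values are assumptions.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class TopicProvisioner {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker

        try (AdminClient admin = AdminClient.create(props)) {
            NewTopic orders = new NewTopic("order-events", 6, (short) 3)  // hypothetical topic: 6 partitions, RF 3
                    .configs(Map.of("retention.ms", "604800000"));        // 7-day retention
            admin.createTopics(List.of(orders)).all().get();              // block until the broker confirms
        }
    }
}
```

Pinning partition count and replication factor in code (or in Terraform, which the listing also mentions) keeps topic layout reviewable and repeatable across environments.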

Posted 1 week ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Udaipur

Work from Office


Required Skills: Expert in Python, with knowledge of at least one Python web framework (such as Django or Flask, depending on the technology stack). Familiarity with some ORM (Object Relational Mapper) libraries. Able to integrate multiple data sources and databases into one system. Understanding of the threading limitations of Python and of multi-process architecture. Good understanding of server-side templating languages (such as Jinja2 or Mako, depending on the technology stack). Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3. Understanding of accessibility and security compliance. Knowledge of user authentication and authorization between multiple systems, servers, and environments. Understanding of the fundamental design principles behind a scalable application. Familiarity with event-driven programming in Python. Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform. Able to create database schemas that represent and support business processes. Design, develop, and maintain microservices using Python to ensure high performance and scalability. Collaborate with cross-functional teams to define and implement microservices architecture best practices. Design, implement, and maintain systems that utilize queueing services for asynchronous communication. Integrate and configure queueing services like RabbitMQ or Apache Kafka within the application architecture. Strong unit test and debugging skills. Proficient understanding of code versioning tools. Work collaboratively with the design team to understand end-user requirements, provide technical solutions, and implement new software features. Knowledge of the application deployment process and server setup.

Responsibilities: Develop reusable, testable, and efficient code. Implement moderately complex applications and features following the underlying architectural decisions. Collaborate with team members to follow established development guidelines. Integrate and manage data storage solutions with a focus on execution. Design and implement low-latency, high-availability, and performant applications.

Posted 2 weeks ago

Apply

4.0 - 5.0 years

8 - 16 Lacs

Hyderabad

Hybrid


Role & responsibilities: We are seeking a Senior Java Developer with strong experience in Spring Boot, AWS, Apache Kafka, and React JS to join our fast-growing development team. The ideal candidate will have a solid background in designing scalable microservices, hands-on cloud deployment, and BPM integration using Groovy scripts and interceptors.

Key Responsibilities: Design, develop, and maintain scalable Java microservices using Spring Boot. Work with AWS services (EC2, Lambda, S3, Glue, EKS) to deploy and manage applications in a cloud environment. Develop and manage Kafka producers and consumers, and handle topic/partition configurations. Design and optimize PostgreSQL schemas and complex queries. Collaborate with frontend developers to integrate APIs with the React JS UI. Write JUnit test cases and ensure code coverage with tools like JaCoCo and SonarQube. Implement JWT-based API security standards. Build and maintain CI/CD pipelines and participate in DevOps processes. Integrate business workflows using BPMN, Groovy scripting, and event listeners. Monitor and troubleshoot using Prometheus, Grafana, and centralized logging tools. Mentor junior developers and collaborate in Agile/Scrum ceremonies.

Required Skills: Java (13+), Spring Boot, REST APIs. Apache Kafka (topics, partitions, producer/consumer APIs). AWS (EC2, Lambda, S3, Glue, EKS). Docker, Kubernetes. PostgreSQL (schema design, indexing, optimization). JUnit, JaCoCo, SonarQube. JWT, API security. React JS. Git, Agile/Scrum. BPM tools, Groovy scripting, event listeners, interceptors. Monitoring tools: Prometheus, Grafana.

Interested candidates can share their resume to sarvani.j@ifinglobalgroup.com
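Since the listing explicitly calls out Kafka producers and topic/partition handling, here is a minimal, assumption-laden sketch of a keyed producer using the plain Apache Kafka client; the broker address, topic name, and payload are illustrative only.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all");                // wait for full ISR acknowledgement
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true"); // avoid duplicates on retry

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Records with the same key hash to the same partition, preserving per-key ordering.
            producer.send(new ProducerRecord<>("order-events", "customer-42", "{\"status\":\"CREATED\"}"));
            producer.flush();
        }
    }
}
```

Keying records by a business identifier is the usual way to get per-entity ordering without giving up parallelism across partitions.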

Posted 2 weeks ago

Apply

8.0 - 10.0 years

40 - 45 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office


Roles & Responsibilities:
Data Engineering Leadership & Strategy: Lead and mentor a team of data engineers, fostering a culture of technical excellence and collaboration. Define and implement data engineering best practices, standards, and processes.
Data Pipeline Architecture & Development: Design, build, and maintain scalable, robust, and efficient data pipelines for ingestion, transformation, and loading of data from various sources. Optimize data pipelines for performance, reliability, and cost-effectiveness. Implement data quality checks and monitoring systems to ensure data integrity. Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
Cloud-Based Data Infrastructure: Design, implement, and manage cloud-based data infrastructure using platforms like AWS, Azure, or GCP. Leverage cloud services (e.g., data lakes, data warehouses, serverless computing) to build scalable and cost-effective data solutions. Leverage open-source tools such as Airbyte, Mage AI, and similar. Ensure data security, governance, and compliance within the cloud environment.
Data Modeling & Warehousing: Design and implement data models to support business intelligence, reporting, and analytics. Optimize data warehouse performance for efficient querying and reporting.
Collaboration & Communication: Collaborate effectively with cross-functional teams including product managers, software engineers, and business stakeholders.
Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 8+ years of proven experience in data engineering, with at least 3+ years in a lead role. Expertise in building and maintaining data pipelines using tools such as Apache Spark, Apache Kafka, Apache Beam, or similar. Proficiency in SQL and one or more programming languages like Python, Java, or Scala. Hands-on experience with cloud-based data platforms (AWS, Azure, GCP) and services.
Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote. Work Timings: 2.30 pm - 11.30 pm IST

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Roles and Responsibilities: Implement the design and architecture of complex web applications using the Angular framework, ensuring adherence to best practices and architectural principles. Collaborate closely with product managers, UX/UI designers, and development teams to translate business requirements into technical specifications and architectural designs. Define and implement scalable and maintainable front-end architecture, including component-based architecture, state management, and data flow patterns. Provide technical guidance and mentorship to development teams, promoting code quality, performance optimization, and maintainability. Conduct code reviews and architectural reviews to ensure compliance with established standards and design guidelines. Evaluate and recommend tools, libraries, and frameworks to enhance productivity and efficiency in Angular development. Stay current with industry trends and emerging technologies related to front-end development, and incorporate them into our architectural roadmap. Drive continuous improvement initiatives to streamline development processes, increase development velocity, and elevate overall product quality.

Preferred Skills: Knowledge of continuous integration. Excellent teamwork and communication abilities. Excellent organizational and time management abilities. Effective scrum master experience. Good to have knowledge of API design using Swagger Hub. Good to have knowledge of the SignalR API for web functionality implementation and data broadcasting.

Skill Requirements: Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 8+ years of experience. Proven experience as a Software Architect, Solution Architect, or Senior Full Stack Developer, or in web application development. Hands-on experience in C# and ASP.NET development. Expert-level proficiency in the Angular framework and its ecosystem (Angular CLI, RxJS, Angular Material and related technologies). Expert-level proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices. Architect-level cloud certification is recommended. Deep knowledge of front-end development technologies such as HTML5, CSS3, JavaScript/TypeScript, and RESTful APIs. Experience with state management libraries (e.g., NgRx, Redux) and reactive programming concepts. Strong understanding of software design principles, design patterns, and architectural styles, with a focus on building scalable and maintainable front-end architectures. Excellent communication and collaboration skills, with the ability to effectively convey technical concepts to non-technical stakeholders. Experience working in Agile/Scrum development environments and familiarity with DevOps practices is a plus. Experience working in multiple cloud environments - Azure, AWS, and GCP. Experience in developing and consuming web services (gRPC). Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML and microservices using serverless cloud technologies. Design, implementation and integration of data storage solutions like databases, key-value stores, and blob stores. User authentication and authorization between multiple systems, servers, and environments. Management of the hosting environment and deployment of update packages. Excellent analytical and problem-solving abilities. Strong understanding of object-oriented programming. Strong unit test and debugging skills. Proficient understanding of code versioning tools such as Git and SVN. Hands-on experience with the PostgreSQL database. Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes, and Docker is a plus. Good understanding of Agile-based software development and the software delivery process. Experience in requirements management tools like Polarion (preferred) or any other requirements management system. Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

5 - 8 Lacs

Chennai

Work from Office


Responsibilities - What you'll do: Engineer, test, document and manage GCP Dataproc, Dataflow and Vertex AI services used in high-performance data processing pipelines and machine learning. Help developers optimize data processing jobs using Spark, Python, and Java. Collaborate with development teams to integrate data processing pipelines with other cloud services and applications. Utilize Terraform and Tekton for infrastructure as code (IaC) and CI/CD pipelines, ensuring efficient deployment and management.
Good to have: Experience with Spark for large-scale data processing. Solid understanding and experience with GitHub for version control and collaboration. Experience with Terraform for infrastructure management and Tekton for continuous integration and deployment. Experience with Apache NiFi for data flow automation. Knowledge of Apache Kafka for real-time data streaming. Familiarity with Google Cloud Pub/Sub for event-driven systems and messaging. Familiarity with Google BigQuery.
Mandatory Key Skills: Python, Java, Google Cloud Pub/Sub, Apache Kafka, BigQuery, CI/CD, Machine Learning, Spark

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Chennai

Work from Office


Roles and Responsibilities: Implement the design and architecture of complex web applications using the Angular framework, ensuring adherence to best practices and architectural principles. Collaborate closely with product managers, UX/UI designers, and development teams to translate business requirements into technical specifications and architectural designs. Define and implement scalable and maintainable front-end architecture, including component-based architecture, state management, and data flow patterns. Provide technical guidance and mentorship to development teams, promoting code quality, performance optimization, and maintainability. Conduct code reviews and architectural reviews to ensure compliance with established standards and design guidelines. Evaluate and recommend tools, libraries, and frameworks to enhance productivity and efficiency in Angular development. Stay current with industry trends and emerging technologies related to front-end development, and incorporate them into our architectural roadmap. Drive continuous improvement initiatives to streamline development processes, increase development velocity, and elevate overall product quality.

Preferred Skills: Knowledge of continuous integration. Excellent teamwork and communication abilities. Excellent organizational and time management abilities. Effective scrum master experience. Good to have knowledge of API design using Swagger Hub. Good to have knowledge of the SignalR API for web functionality implementation and data broadcasting.

Skill Requirements: Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 8+ years of experience. Proven experience as a Software Architect, Solution Architect, or Senior Full Stack Developer, or in web application development. Hands-on experience in C# and ASP.NET development. Expert-level proficiency in the Angular framework and its ecosystem (Angular CLI, RxJS, Angular Material and related technologies). Expert-level proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices. Architect-level cloud certification is recommended. Deep knowledge of front-end development technologies such as HTML5, CSS3, JavaScript/TypeScript, and RESTful APIs. Experience with state management libraries (e.g., NgRx, Redux) and reactive programming concepts. Strong understanding of software design principles, design patterns, and architectural styles, with a focus on building scalable and maintainable front-end architectures. Excellent communication and collaboration skills, with the ability to effectively convey technical concepts to non-technical stakeholders. Experience working in Agile/Scrum development environments and familiarity with DevOps practices is a plus. Experience working in multiple cloud environments - Azure, AWS, and GCP. Experience in developing and consuming web services (gRPC). Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML and microservices using serverless cloud technologies. Design, implementation and integration of data storage solutions like databases, key-value stores, and blob stores. User authentication and authorization between multiple systems, servers, and environments. Management of the hosting environment and deployment of update packages. Excellent analytical and problem-solving abilities. Strong understanding of object-oriented programming. Strong unit test and debugging skills. Proficient understanding of code versioning tools such as Git and SVN. Hands-on experience with the PostgreSQL database. Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes, and Docker is a plus. Good understanding of Agile-based software development and the software delivery process. Experience in requirements management tools like Polarion (preferred) or any other requirements management system. Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.
Keywords: RxJS, Angular Material, microservices, DevOps, Git, SVN, PostgreSQL, Azure IoT, MQTT, Apache Kafka, Kubernetes, Docker, CI/CD, Angular CLI, Swagger Hub, SignalR API, C#, ASP.NET development

Posted 2 weeks ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Roles and Responsibilities: Responsible for programming and testing of cloud applications. Integration of user-facing elements developed by front-end developers with server-side logic. Optimization of the application for maximum speed and scalability. Design and implementation of data storage solutions. Writing reusable, testable, and efficient code. Design, code, test, debug, and document software according to the functional requirements. Participate as a team member in fully agile Scrum deliveries. Provide low-level design documents for the components. Work collaboratively in an Agile/Scrum team environment. Test-driven development based on unit tests.

Preferred Skills: Good to have knowledge of API design using Swagger Hub. Good to have knowledge of the SignalR API for web functionality implementation and data broadcasting. Good to have knowledge of cloud and CI/CD. Knowledge of continuous integration. Excellent teamwork and communication abilities. Excellent organizational and time management abilities. Effective scrum master experience.

Skill Requirements: Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 5-7 years of experience. Hands-on experience in web application development using Angular. Hands-on experience in C# and ASP.NET development. Developer-level cloud application certification is recommended. Proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices. Experience working in multiple cloud environments - Azure, AWS, and GCP. Experience in developing and consuming web services (gRPC). Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML and microservices using serverless cloud technologies. Integration of data storage solutions like databases, key-value stores, and blob stores. User authentication and authorization between multiple systems, servers, and environments. Management of the hosting environment and deployment of update packages. Excellent analytical and problem-solving abilities. Strong understanding of object-oriented programming. Basic understanding of front-end technologies, such as JavaScript, TypeScript, HTML5, and CSS. Strong unit test and debugging skills. Proficient understanding of code versioning tools such as Git and SVN. Hands-on experience with the PostgreSQL database. Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes, and Docker is a plus. Good understanding of Agile-based software development and the software delivery process. Experience in requirements management tools like Polarion (preferred) or any other requirements management system. Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.
Keywords: RESTful APIs, HTTP protocols, JSON, XML, blob stores, JavaScript, TypeScript, HTML5, CSS, Git, PostgreSQL database, Azure IoT, MQTT, Apache Kafka, Kubernetes, Docker, C#, ASP.NET development, Azure, AWS web services, GCP

Posted 2 weeks ago

Apply

10.0 - 15.0 years

15 - 20 Lacs

Hyderabad

Work from Office


Position: Senior Software Engineer II - Java Full Stack - Java, Spring Boot, Microservices, React.js and Azure

Primary Responsibilities: Analyze, design, code, review, integrate, implement, install, deploy and maintain computer programs and software components using specific tools and the latest technologies, including Java/J2EE, Camunda, Angular, React, microservices frameworks, Apache Kafka, Spring Boot, Kubernetes, Docker, JSON, Node.js, Cassandra, GraphQL, etc. Design, develop, and maintain robust and scalable RESTful APIs for data exchange between healthcare systems. Provide technical direction for development, design, and systems integration across multiple client engagements from the definition phase through implementation. Lead the implementation of best engineering practices for CI, CD, CT and automated code reviews. Lead development on Azure cloud technology. Identify opportunities to fine-tune and optimize applications of Java-developed projects. Communicate effectively with other engineers and QA. Fully encourage and facilitate high-quality code through support of QE team efforts, including but not limited to mentoring, assistance, and writing/executing automation tests as needed. Establish, refine, and integrate development and test environment tools and software as needed. Identify production and non-production application issues. Provide technical support and consultation for Java application and infrastructure questions. Serve as a mentor to less experienced developers. Be able to envision the overall solution for defined functional and non-functional requirements. Create, understand, and validate designs and the estimated effort for a given module/task, and be able to justify them.

Required Qualifications: Bachelor's degree or higher in Computer Science, Software Engineering, or equivalent. 10+ years of experience in Java API development. 10+ years of experience as a Full Stack Engineer with solid proficiency in Java/J2EE, Spring Boot or other backend programming languages. 3+ years of experience with frontend development frameworks (e.g., Node.js, Angular/React) and modern web technologies. 3+ years of experience with Microsoft Azure. 2+ years of experience with Node.js. Experience with Camunda BPM and DMN. Solid experience in full-stack application development. Solid process experience with waterfall and agile practices. Experience working in the healthcare industry or other regulated environments. Solid knowledge of SQL and Postgres DB profiling/performance tuning. Azure cloud engineering, CI/CD, microservice background, Infrastructure as Code (IaC) tools (e.g., Terraform). Test-driven development (TDD) and automated testing frameworks. Knowledge of full-stack, Java, Angular, Azure Cloud, Docker, container-based development. Knowledge of automated testing tools such as Cucumber, JUnit, Mockito, Selenium. Knowledge of cloud/distributed architecture design patterns. Proficient with Azure deployments and security best practices. Experience with cloud computing technologies. Proven solid analytical skills and experience with proof-of-concept development. Proven ability to provide guidance to the team and provide status to senior management.
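To make the "RESTful APIs plus Kafka" combination above concrete, here is a hypothetical sketch (not the employer's code) of a Spring Boot endpoint that publishes an event through KafkaTemplate. It assumes spring-web and spring-kafka are on the classpath and configured; the path, topic, and class names are invented.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ClaimEventController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public ClaimEventController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/claims")
    public ResponseEntity<Void> submitClaim(@RequestBody String claimJson) {
        // Accept the request, hand the payload to Kafka, and return immediately;
        // a real service would validate the payload and attach correlation metadata.
        kafkaTemplate.send("claim-submitted", claimJson); // hypothetical topic
        return ResponseEntity.accepted().build();
    }
}
```

Returning 202 Accepted reflects the asynchronous hand-off: the REST call only guarantees the event was queued, not that downstream processing finished.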

Posted 3 weeks ago

Apply

5.0 - 6.0 years

18 - 25 Lacs

Gurugram, Sector-20

Work from Office


5-7 years of experience in solution, design and development of cloud-based data models, ETL pipelines and infrastructure for reporting, analytics, and data science. Experience working with Spark, Hive, HDFS, MapReduce, Apache Kafka/AWS Kinesis. Experience with version control tools (Git, Subversion). Experience using automated build systems (CI/CD). Experience working in different programming languages (Java, Python, Scala). Experience working with both structured and unstructured data. Strong proficiency with SQL and its variations among popular databases. Ability to create a data model from scratch. Experience with some of the modern relational databases. Skilled at optimizing large, complicated SQL statements. Knowledge of best practices when dealing with relational databases. Capable of configuring popular database engines and orchestrating clusters as necessary. Ability to plan resource requirements from high-level specifications. Capable of troubleshooting common database issues. Experience with data structures and algorithms. Knowledge of different database technologies (relational, NoSQL, graph, document, key-value, time series, etc.), including building and managing scalable data models. Knowledge of ML model deployment. Knowledge of cloud-based platforms (AWS). Knowledge of TDD/BDD. Strong desire to improve upon their skills in software development, frameworks, and technologies.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

8 - 10 Lacs

Kolkata, Bhubaneswar

Work from Office


Data Science professional with a proven track record in training Engineering, IT, Diploma, Polytechnic and Technical candidates. With over 7 years of experience in Artificial Intelligence, Machine Learning, Big Data, and Cloud Computing, specialising in delivering industry-oriented, hands-on training that equips candidates with the technical proficiency required in today's data-driven world.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

18 - 32 Lacs

Bengaluru

Hybrid


Dear Candidate, EY is currently hiring for a Senior Business Intelligence role with 5+ years of relevant experience for EY India - Bengaluru location. ONLY immediate joiners or candidates on a 30-day notice period will be considered.

Key Skills: Experience: 8+ years in data engineering and data management, focusing on large-scale systems. Programming Proficiency: Proficient in Java (or other JVM languages) with a strong understanding of SOLID principles and design patterns; experienced in building real-time/batch data processing solutions with Apache Kafka, Apache Flink, Apache Spark, and Apache Iceberg. Data Architecture: Hands-on experience designing data architectures (data lakes, data warehouses, data marts) and building end-to-end ETL/ELT pipelines. Containerization and Orchestration: Strong understanding of containerization (Docker) and orchestration with Kubernetes for deploying scalable microservices. Database Experience: Experience with relational databases (Oracle, SQL Server, Postgres, MySQL) and knowledge of best practices in data modeling and optimization. Pipeline Orchestration: Familiarity with DAG schedulers (Airflow, Luigi, etc.) for pipeline orchestration and scheduling. Analytical Skills: Excellent analytical, quantitative, problem-solving, and critical thinking skills. Collaboration: Collaborative mindset with the ability to thrive in a fast-paced environment and manage multiple priorities effectively.

If interested, please share the below details to Krithika.L@in.ey.com along with your updated resume: Name; Skill; Notice Period (if serving, mention LWD); Contact Number; Email; Current Location; Preferred Location; Total Exp; Relevant Exp; Current Company; Education (mention year of completion); Current CTC (LPA); Expected CTC (LPA); Offer in Hand (mention date of joining).

Regards, Talent Team, Krithika

Posted 3 weeks ago

Apply

8.0 - 12.0 years

18 - 20 Lacs

Pune, Chennai, Coimbatore

Work from Office


Work with Node.js & NoSQL systems to design and maintain scalable APIs and real-time data pipelines. Focus on API development, data integration, and cloud infrastructure, leveraging Apache Kafka and GCP to build robust, event-driven backend systems.
Required Candidate profile: Experience building APIs using Node.js and TypeScript. API integration solutions (REST, GraphQL, webhooks). Build and manage applications using Docker, Kubernetes, and GCP. Strong Node.js, TypeScript, and JavaScript (ES2019+).

Posted 3 weeks ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Kolkata, Mumbai, Bengaluru

Work from Office


Key Responsibilities: Design, develop, and maintain robust applications using Java and Spring Boot. Implement microservices architecture to enhance scalability and performance. Collaborate with cross-functional teams to define, design, and ship new features. Ensure code quality through unit testing and code reviews. Participate in agile development processes and contribute to sprint planning and retrospectives. Troubleshoot and debug applications to optimize performance. Document technical specifications and user guides.

Mandatory Skills: Java: Strong proficiency in Java programming. Spring Boot: Hands-on experience (2+ years) with the Spring Boot framework. Microservices: Practical experience in designing and implementing microservices. GitHub: Proficient in using GitHub for version control and collaboration. REST Principles: Good understanding of RESTful services and API design. Jira: Familiarity with Jira for project management and issue tracking. Git: Strong understanding of Git for version control. IntelliJ: Experience using IntelliJ IDEA as a development environment. Kafka: Experience with Apache Kafka for real-time data streaming.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Keywords: Maven, Spring Boot, GitHub, REST principles, Jira, Git, IntelliJ, Apache Kafka, Java, Microservices, Kafka
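Because the listing pairs Kafka-based services with unit testing, the following is an illustrative JUnit 5 + Mockito sketch for a hypothetical service that publishes to Kafka; every class and topic name here is invented, and the tiny in-file OrderService only exists to keep the example self-contained.

```java
import org.junit.jupiter.api.Test;
import org.springframework.kafka.core.KafkaTemplate;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

class OrderServiceTest {

    @Test
    void publishesEventWhenOrderIsCreated() {
        @SuppressWarnings("unchecked")
        KafkaTemplate<String, String> template = mock(KafkaTemplate.class);
        OrderService service = new OrderService(template); // hypothetical class under test

        service.createOrder("order-1");

        // The service is expected to emit exactly one event per created order.
        verify(template).send("orders-created", "order-1");
    }

    // Minimal stand-in for the class under test so the example compiles on its own.
    static class OrderService {
        private final KafkaTemplate<String, String> template;
        OrderService(KafkaTemplate<String, String> template) { this.template = template; }
        void createOrder(String id) { template.send("orders-created", id); }
    }
}
```

Mocking the template keeps the test fast and broker-free; broker-level behaviour would be covered separately in integration tests.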

Posted 3 weeks ago

Apply

6.0 - 8.0 years

15 - 18 Lacs

Bhubaneswar, Hyderabad, Bengaluru

Work from Office


Client is looking for a strong Java candidate with the following skills. Spring WebFlux and streaming knowledge is a must and the key thing they are looking for. Here is the overall JD for the position.
Skills and experience required:
RDBMS - at least 1 year (required)
CI/CD - 2-5 years (required)
Cloud Computing - 2-5 years (required)
Core Java - 5-10 years (required)
Kubernetes - 2-5 years (required)
Microservices - 2-5 years (required)
MongoDB - at least 1 year (nice to have)
NoSQL - at least 1 year (nice to have)
Python - at least 1 year (required)
Spring Boot - 5-10 years (required)
Spring Data - 2-5 years (required)
Spring Security - 2-5 years (required)
Spring WebFlux - at least 1 year (required)
Stream processing - at least 1 year (required)
Java 17 - 2-5 years (required)
Apache Kafka - at least 1 year (required)
Apache SOLR - at least 1 year (required)
Expertise with solution design and enterprise large-scale application development. In-depth knowledge of integration patterns, integration technologies and integration platforms. Experience with queuing-related technologies like Kafka. Good hands-on experience designing and building cloud-ready applications. Good programming skills in Java, Python, etc. Proficiency with dev/build tools: Git, Maven, Gradle. Experience with modern NoSQL/Graph DB/data streaming technologies is a plus. Good understanding of Agile software development methodology.
Location: Hyderabad, Mangalore, Bhubaneswar, Trivandrum
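As a hedged illustration of the "stream processing" requirement above, the sketch below uses the Kafka Streams DSL to filter one topic into another. It is not code from this client; the application id, broker address, topic names, and filter rule are all assumptions.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class PaymentStreamApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-stream-demo"); // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> payments = builder.stream("payments"); // hypothetical input topic
        payments.filter((key, value) -> value != null && value.contains("\"status\":\"FAILED\""))
                .to("failed-payments");                                // hypothetical output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // clean shutdown
    }
}
```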

Posted 3 weeks ago

Apply

10 - 20 years

20 - 35 Lacs

Gurugram, Bengaluru

Hybrid


Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED. Contract-to-Hire (C2H) role. Location: Gurgaon/Bengaluru. Payroll: BCforward. Work Mode: Hybrid. JD Skills: Java; Apache Kafka; AWS; microservices; event-driven architecture. Experienced Java engineer with over 10 years of experience and expertise in microservices and event-driven architecture. Please share your updated resume, PAN card soft copy, passport-size photo & UAN history. Interested applicants can share an updated resume to g.sreekanth@bcforward.com. Note: Looking for immediate to 15-day joiners at most. All the best.

Posted 4 weeks ago

Apply

4 - 9 years

3 - 8 Lacs

Thane, Navi Mumbai, Mumbai (All Areas)

Work from Office


J2EE using MVC frameworks (JSP/Servlet/Web services/JSF/Struts/Spring); knowledge of Java, JSP, Servlet, Oracle, MySQL, Apache Solr, Apache Kafka, Bootstrap, Hibernate/JDBC, JBoss, Apache, JUnit, jQuery, JavaScript, Java web services, SOAP, REST API, JSON.
Required Candidate profile: Multithreading, Linux, JBoss, microservices, Agile methodology, GitHub & SVN, MySQL, Oracle, MVC frameworks, application & web servers, data structures, basic networking, performing troubleshooting.

Posted 4 weeks ago

Apply

7 - 12 years

11 - 14 Lacs

Chennai

Remote


Job Summary: We are seeking a highly skilled and experienced Microservices Developer with strong expertise in Java Spring Boot, AWS ROSA (Red Hat OpenShift Service on AWS), and event-driven architecture. The ideal candidate will be responsible for developing scalable and secure microservices, integrating enterprise systems including Microsoft Dynamics CRM, and ensuring high code quality and security standards. Key Responsibilities: Design, develop, and deploy Java Spring Boot-based microservices on AWS ROSA platform Leverage Kong API Gateway and KPI Management Platform for API governance, observability, and traffic control Create and manage API proxies, plugins, and integrations in Kong for secure and monitored API access Implement event-driven architectures using Apache Kafka Write unit tests using JUnit and ensure code quality with SonarQube Perform secure coding and address vulnerabilities using Veracode (SAST & DAST) Integrate with Microsoft Dynamics CRM and other internal enterprise systems Work with AWS CI/CD pipelines, automate build and deployment workflows Apply integration design patterns for robust and maintainable system interfaces Participate in Agile/Scrum ceremonies, collaborate in a cross-functional team Maintain technical documentation including API specs, architecture diagrams, and integration flows Troubleshoot and resolve technical issues across environments Participate in code reviews and support mentoring of junior developers Required Skills and Qualifications: 7+ years of experience in Java Spring Boot development Proven experience deploying microservices on AWS ROSA (Red Hat OpenShift on AWS) Hands-on experience with Kong API Gateway, including API proxy setup, plugins, routing, and KPI monitoring Strong understanding of Kafka, event-driven design, and asynchronous communication patterns Hands-on experience with JUnit, SonarQube, and Veracode (static & dynamic analysis) Experience working with AWS services: EC2, EKS, S3, Lambda, RDS, CloudWatch, etc. Proficient in creating and maintaining CI/CD pipelines using AWS CodePipeline or similar tools Demonstrated ability to integrate with Microsoft Dynamics CRM and enterprise systems Knowledge of integration patterns, API design (REST/JSON), and OAuth2/SAML authentication Strong background in Agile/Scrum delivery models Excellent verbal and written communication skills Ability to work independently in remote environments while aligning with Singapore time zone Preferred Qualifications: Experience with Kong Enterprise features such as Dev Portal, Analytics, or Service Hub Experience with Red Hat OpenShift CLI and deployment automation Familiarity with Docker, Helm, and Terraform Exposure to DevSecOps pipelines Certification in AWS (Developer Associate or Architect) or Red Hat OpenShift is a plus Work Conditions: Remote work setup Must be available during Singapore working hours (GMT+8) Laptop and VPN access will be provided Collaborative team culture with frequent standups and technical reviews
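To illustrate the "asynchronous communication patterns" this role emphasizes, here is a minimal sketch of non-blocking publishing with a send callback using the plain Apache Kafka client; it is an assumption-based example rather than the employer's implementation, and the broker address, topic, and payload are invented.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class AsyncTicketEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("ticket-events", "ticket-101", "{\"action\":\"CREATED\"}");
            // The callback runs when the broker acknowledges (or the send fails); the caller is
            // never blocked, which is the essence of asynchronous, event-driven integration.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    System.err.println("Publish failed: " + exception.getMessage());
                } else {
                    System.out.printf("Stored at %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```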

Posted 1 month ago

Apply

8 - 10 years

27 - 32 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office


The Team: As a Senior Lead Machine Learning Engineer of the Document Platforms and AI Team, you will play a critical role in building the next generation of data extraction tools, working on cutting-edge ML-powered products and capabilities that power natural language understanding, information retrieval, and data sourcing solutions for the Enterprise Data Organization and our clients. This is an exciting opportunity to shape the future of data transformation and see your work make a real difference, all while having fun in a collaborative and engaging environment. You'll spearhead the development and deployment of production-ready AI products and pipelines, leading by example and mentoring a talented team. This role demands a deep understanding of machine learning principles, hands-on experience with relevant technologies, and the ability to inspire and guide others. You'll be at the forefront of a rapidly evolving field, learning and growing alongside some of the brightest minds in the industry. If you're passionate about AI, driven to make an impact, and thrive in a dynamic and supportive workplace, we encourage you to join us!

The Impact: The Document Platforms and AI team has already delivered breakthrough products and significant business value over the last 3 years. In this role you will be developing our next generation of new products while enhancing existing ones, aimed at solving high-impact business problems.

What's in it for you: Be a part of a global company and build solutions at enterprise scale. Collaborate with a highly skilled and technically strong team. Contribute to solving high-complexity, high-impact problems.

Responsibilities: Build production-ready data acquisition and transformation pipelines from ideation to deployment. Be a hands-on problem solver and developer, helping to extend and manage the data platforms. Apply best practices in data modeling and building ETL pipelines (streaming and batch) using cloud-native solutions. Technical leadership: Drive the technical vision and architecture for the extraction project, making key decisions about model selection, infrastructure, and deployment strategies. Model development: Design, develop, and evaluate state-of-the-art machine learning models for information extraction, leveraging techniques from NLP, computer vision (if applicable), and other relevant domains. Data preprocessing and feature engineering: Develop robust pipelines for data cleaning, preprocessing, and feature engineering to prepare data for model training. Model training and evaluation: Train, tune, and evaluate machine learning models, ensuring high accuracy, efficiency, and scalability. Deployment and monitoring: Deploy and maintain machine learning models in a production environment, monitoring their performance and ensuring their reliability. Research and innovation: Stay up-to-date with the latest advancements in machine learning and NLP, and explore new techniques and technologies to improve the extraction process. Collaboration: Work closely with product managers, data scientists, and other engineers to understand project requirements and deliver effective solutions. Code quality and best practices: Ensure high code quality and adherence to best practices for software development. Communication: Effectively communicate technical concepts and project updates to both technical and non-technical audiences.
What We're Looking For: 8-10 years of professional software work experience, with a strong focus on Machine Learning, Natural Language Processing (NLP) for information extraction, and MLOps. Expertise in Python and related NLP libraries (e.g., spaCy, NLTK, Transformers, Hugging Face). Experience with Apache Spark or other distributed computing frameworks for large-scale data processing. AWS/GCP cloud expertise, particularly in deploying and scaling ML pipelines for NLP tasks. Solid understanding of the machine learning model lifecycle, including data preprocessing, feature engineering, model training, evaluation, deployment, and monitoring, specifically for information extraction models. Experience with CI/CD pipelines for ML models, including automated testing and deployment. Docker & Kubernetes experience for containerization and orchestration. OOP design patterns, test-driven development and enterprise system design. SQL (any variant; bonus if this is a big data variant). Linux OS (e.g. bash toolset and other utilities). Version control system experience with Git, GitHub, or Azure DevOps. Excellent problem-solving, code review and debugging skills. Software craftsmanship, adherence to Agile principles and taking pride in writing good code. Techniques to communicate change to non-technical people.
Nice to have: Core Java 17+, preferably Java 21+, and the associated toolchain. Apache Avro. Apache Kafka. Other JVM-based languages - e.g. Kotlin, Scala. C# - in particular .NET Core.

Posted 1 month ago

Apply

2 - 3 years

4 - 5 Lacs

Bengaluru

Work from Office


We're looking for engineers who love to create elegant, easy-to-use interfaces and enjoy new JavaScript technologies as they show up every day, particularly ReactJS. You will help drive our technology selection and will coach your team on how to use these new technologies effectively in a production platform development environment. We need our engineers to be versatile, display leadership qualities and be enthusiastic to tackle new problems across the full stack as we continue to push our technology forward.

Responsibilities: Design, develop, test, deploy, maintain and improve software. Manage individual project priorities, deadlines and deliverables. Keep software components loosely coupled as we grow. Contribute improvements to our continuous delivery infrastructure. Participate in recruiting and mentoring of top engineering talent. Drive roadmap execution and channel customer feedback into the product. Develop, collaborate on, and execute Agile development and product scenarios in order to release high-quality software on a regular cadence. Proactively assist your team to find and solve development and production software issues through effective collaboration. Work with company stakeholders including PM, PO, customer-facing teams, DevOps, and Support to communicate and collaborate on execution.

Desirable: Contribute to framework selection, microservice extraction, and deployment in on-premise and SaaS scenarios. Experience with troubleshooting, profiling and debugging applications. Familiarity with web debugging tools (Chrome development tools, Fiddler, etc.) is a plus. Experience with different databases (Elasticsearch, Impala, HDFS, Mongo, etc.) is a plus. Basic Git command knowledge is a plus. Messaging systems (e.g. RabbitMQ, Apache Kafka, ActiveMQ, AWS SQS, Azure Service Bus, Google Pub/Sub). Cloud solutions (e.g. AWS, Google Cloud Platform, Microsoft Azure).

Personal Skills: Strong written and verbal communication skills to collaborate with developers, testers, product owners, scrum masters, directors, and executives. Experience taking part in the decision-making process in application code design, solution development, and code review. Strong work ethic and emotional intelligence, including being on time for meetings. Ability to work in a fast-changing environment and embrace change while still following a greater plan.

Qualifications/Requirements: BS or MS degree in Computer Science or a related field, or equivalent job experience. 2-3 years of experience in web application development; any experience building web IDEs and ETL-driven web apps is a plus. Strong knowledge of and experience in C# (2+ years). Experience with ReactJS and microservices (2+ years). Experience with CI/CD pipelines. Experience with relational databases and hands-on experience with SQL queries. Strong experience with several JavaScript frameworks and tools, such as React and Node. Strong knowledge of REST APIs. Experience with Atlassian suite products such as JIRA, Bitbucket, Confluence. Strong knowledge of Computer Science and computing theory: paradigms & principles (OOP, SOLID), database theory (RDBMS), code testing practices, algorithms, data structures, design patterns. Understanding of network interactions: protocol conventions (e.g. REST, RPC), authentication and authorization flows, standards and practices (e.g. OAuth, JWT).

Posted 1 month ago

Apply

6 - 11 years

8 - 13 Lacs

Bengaluru

Work from Office


Skills: Apache Kafka or Confluent Kafka; optimization and tuning; Schema Registry, KSQL, Connectors; replication across organizations and Cluster Linking. Notice Period: 0-30 days.
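As a hedged illustration of the "optimization and tuning" theme in this listing, the sketch below inspects and adjusts a topic configuration with the Kafka AdminClient; it is not tied to this employer, and the broker address, topic name, and retention value are assumptions.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.Config;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class TopicTuning {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker

        try (AdminClient admin = AdminClient.create(props)) {
            ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "clickstream"); // hypothetical topic

            // Read the current setting before changing anything.
            Config current = admin.describeConfigs(List.of(topic)).all().get().get(topic);
            System.out.println("retention.ms = " + current.get("retention.ms").value());

            // Tighten retention to 24 hours as an example tuning change.
            AlterConfigOp setRetention =
                    new AlterConfigOp(new ConfigEntry("retention.ms", "86400000"), AlterConfigOp.OpType.SET);
            Map<ConfigResource, Collection<AlterConfigOp>> updates = Map.of(topic, List.of(setRetention));
            admin.incrementalAlterConfigs(updates).all().get();
        }
    }
}
```

Config-level changes like this also surface naturally through Confluent tooling; the AdminClient version is shown only because it stays within the plain Apache Kafka API.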

Posted 2 months ago

Apply

7 - 12 years

10 - 20 Lacs

Bengaluru

Work from Office


8+ years of experience in database technologies: AWS Aurora PostgreSQL, NoSQL, DynamoDB, MongoDB, Erwin data modeling. Experience with pg_stat_statements and query execution plans. Experience with Apache Kafka, AWS Kinesis, Airflow, Talend, AWS. Experience with CloudWatch, Prometheus, Grafana.
Required Candidate profile: Experience with GDPR, SOC 2, Role-Based Access Control (RBAC), and encryption standards. Experience with AWS Multi-AZ, read replicas, failover strategies, and backup automation. Experience with Erwin, Lucidchart, Confluence, JIRA.

Posted 2 months ago

Apply


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
