
97 Apache Zookeeper Jobs - Page 4

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5 - 8 years

9 - 10 Lacs

Bengaluru

Work from Office

Experienced in Kafka cluster maintenance, HA/DR setup, SSL/SASL/LDAP authentication, ACLs, Kafka components (ZooKeeper, Connect, Schema Registry, etc.), upgrades, monitoring, capacity planning, and database optimization. Mail: kowsalya.k@srsinfoway.com

Posted 4 months ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Bengaluru

Work from Office

SENIOR ENGINEER - KAFKA STREAMING PLATFORM - Here's a smattering of approaches important to us and the technologies we use. Everything we do is as-code in version control; we don't like clicking buttons or doing things manually. All development or infra config changes go through a pull-request process, so you'll always have a say to thumbs up or down things you catch. Everything should have test cases, and they go through a continuous integration process. We understand the importance of logs and metrics, so having visibility into the things you need to see to do your job isn't an issue, and if you need to add more metrics or see more logs, it's within our control to improve that. We try to own as much of the platform as we reasonably can; you don't need to rely on teams outside our own to improve the stack or change the way we do things. Kafka/Streaming Stack - Code: Spring Boot (Java/Kotlin), RESTful APIs, Golang. Platform: Apache Kafka 2.x, TAP, GCP, Ansible, Terraform, Docker, Vela. Alerting/Monitoring: Grafana, Kibana, ELK stack. As a Senior Engineer on Target's Streaming Platform Team, you'll: Help build out the Kafka/Streaming capability in India. Write and deploy code that enhances the Kafka platform. Design infrastructure solutions that support automation, self-provisioning, product health, security/compliance, resiliency, and zero-call aspiration, and that are Guest/Team Member experience focused. Troubleshoot and resolve platform operational issues. Requirements: 4+ years of experience developing in JVM-based languages (e.g. Java/Kotlin). Ability to apply skills to solve problems, and aptitude to learn additional technologies or go deeper in an area. Good basic programming/infrastructure skills, with the ability to quickly gather the skills necessary to accomplish the task at hand. Intermediate knowledge and skills associated with infrastructure-based technologies. Works across the team to recommend solutions that are in accordance with accepted testing frameworks.
Experience with modern platforms and CI/CD stacks (e.g. GitHub, Vela, Docker). Highly productive, self-starter and self-motivated. Passionate about staying current with new and evolving technologies. Desired: 4+ years of experience developing high-quality applications and/or supporting critical enterprise platforms. Experience with Kafka, containers (k8s), ZooKeeper, and any one of the major public cloud providers (GCP/AWS/Azure). Familiarity with Golang and microservices architecture is a big plus. Participate in day-to-day support requests by performing admin tasks. Install and maintain standard Kafka components: Control Center, ZooKeeper, and Brokers. Strong understanding of infrastructure/software and how these systems are secured, analyzed, and investigated. Is a contact point for their team and is able to help answer questions for other groups and/or management. Partner with teams to prioritize and improve services throughout the software development lifecycle. Personal or professional experience contributing to open-source projects. Innovative mindset - willingness to push new ideas into the company. Useful Links: Life at Target - https://india.target.com/ Benefits - https://india.target.com/life-at-target/workplace/benefits Culture - https://india.target.com/life-at-target/diversity-and-inclusion

Posted Date not available

Apply

5.0 - 10.0 years

10 - 14 Lacs

Mumbai

Work from Office

Role Purpose - Required Skills: 5+ years of experience in system administration, application development, infrastructure development or related areas. 5+ years of experience programming in languages like JavaScript, Python, PHP, Go, Java or Ruby, with 3+ years reading, understanding and writing code in the same. 3+ years of mastery of infrastructure automation technologies (like Terraform, CodeDeploy, Puppet, Ansible, Chef). 3+ years of expertise in container/container-fleet-orchestration technologies (like Kubernetes, OpenShift, AKS, EKS, Docker, Vagrant, etcd, ZooKeeper). 5+ years of cloud and container-native Linux administration/build/management skills. Key Responsibilities: Hands-on design, analysis, development and troubleshooting of highly distributed, large-scale production systems and event-driven, cloud-based services. Primarily Linux administration, managing a fleet of Linux and Windows VMs as part of the application solutions. Involved in pull requests for site reliability goals. Advocate IaC (Infrastructure as Code) and CaC (Configuration as Code) practices within Honeywell HCE. Ownership of reliability, uptime, system security, cost, operations, capacity and performance analysis. Monitor and report on service level objectives for given application services. Work with the business, technology teams and product owners to establish key service level indicators. Ensure the repeatability, traceability, and transparency of our infrastructure automation. Support on-call rotations for operational duties that have not been addressed with automation. Support healthy software development practices, including complying with the chosen software development methodology (Agile, or alternatives) and building standards for code reviews, work packaging, etc. Create and maintain monitoring technologies and processes that improve visibility into our applications' performance and business metrics and keep operational workload in check.
Partner with security engineers to develop plans and automation to aggressively and safely respond to new risks and vulnerabilities. Develop, communicate, collaborate on, and monitor standard processes to promote the long-term health and sustainability of operational development tasks. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
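The SLO monitoring responsibility above usually boils down to tracking an error budget: the small fraction of requests the SLO allows to fail. As an illustrative sketch only (the function name and numbers are hypothetical, not from this posting), a minimal error-budget calculation:

```python
def error_budget(total_requests: int, failed_requests: int, slo_target: float) -> dict:
    """Compute the remaining error budget for an availability SLO.

    slo_target is the fraction of requests that must succeed, e.g. 0.999.
    """
    # Budget expressed in requests: how many failures the SLO tolerates.
    allowed_failures = total_requests * (1 - slo_target)
    remaining = allowed_failures - failed_requests
    return {
        "allowed_failures": allowed_failures,
        "remaining": remaining,
        "budget_consumed_pct": (
            100 * failed_requests / allowed_failures if allowed_failures else float("inf")
        ),
    }

# Example: 1,000,000 requests against a 99.9% SLO, with 300 observed failures.
report = error_budget(1_000_000, 300, 0.999)
# Roughly 1000 allowed failures, ~700 remaining, ~30% of the budget consumed.
```

When the consumed percentage approaches 100, teams typically freeze risky releases until the budget recovers.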

Posted Date not available

Apply

7.0 - 12.0 years

8 - 18 Lacs

Bengaluru, Delhi/NCR, Mumbai (all areas)

Work from Office

7+ years’ experience (3+ in Kafka – Apache, Confluent, MSK – & RabbitMQ) with strong skills in monitoring, optimization, and incident resolution. Proficient in brokers, connectors, Zookeeper/KRaft, schema registry, and middleware performance metrics.

Posted Date not available

Apply

7.0 - 10.0 years

18 - 21 Lacs

Bengaluru

Work from Office

Must Have Skills: Kafka and RabbitMQ in production environments. Deep understanding of RabbitMQ and Kafka ecosystems: brokers, connectors, zookeeper/KRaft, schema registry. Proficiency with monitoring tools and middleware performance metrics.

Posted Date not available

Apply

7.0 - 11.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Hiring for Middleware Admin in Bangalore with 7+ years of experience in the skills below. Must have: RabbitMQ/Kafka clusters; monitor health, troubleshoot latency/downtime, and optimize performance; automate operations with Shell/Python, Ansible, Terraform, and CI/CD. Required candidate profile: - Implemented security (SASL/Kerberos, SSL/TLS, RBAC) and compliance controls. - Immediate joiner. - Strong communication skills. - Ready to work from the client office 5 days every week.
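Automating middleware operations with Python, as this role asks, often starts with small monitoring helpers such as a consumer-lag alerter. A minimal sketch, with made-up topic names and thresholds for illustration only:

```python
def lag_alerts(partition_lag: dict, threshold: int) -> list:
    """Return the partitions whose consumer lag exceeds the threshold, worst first."""
    over = {p: lag for p, lag in partition_lag.items() if lag > threshold}
    return sorted(over, key=over.get, reverse=True)

# Hypothetical lag figures as might be scraped from broker/exporter metrics.
lags = {"orders-0": 120, "orders-1": 15_000, "orders-2": 800, "payments-0": 52_000}
print(lag_alerts(lags, threshold=10_000))  # → ['payments-0', 'orders-1']
```

In practice the lag figures would come from a metrics endpoint or admin API rather than a hard-coded dict, and the output would feed an alerting channel.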

Posted Date not available

Apply

5.0 - 7.0 years

22 - 30 Lacs

Hyderabad

Work from Office

Senior Engineer Software Product Development Hyderabad | Work from Office (WFO) | Preferred Joiners: Immediate – 15 Days Salary: Up to 30 LPA Experience: 4–7 Years About the Role We are looking for a Senior Engineer – Software Product Development to join our core engineering team. This role involves working on high-performance, distributed systems using modern programming languages and event-driven architectures. You will be responsible for building, deploying, and maintaining scalable solutions while following best practices for product development. Key Responsibilities Understand and work according to detailed design specifications. Follow best practices for core product development. Program for distributed systems with parallel processing in Golang, C++, Java. Develop features such as backup/restore and resizing. Manage distributed deployment processes. Handle installation, configuration, and process management. Implement leader election, monitoring, and alerting mechanisms. Requirements Strong experience in building performant and distributed systems with parallel processing. Hands-on with Kafka, Zookeeper, Spark, ETCD (any one or combination). Proficiency in event-driven architectures. Experience with Agile development methodologies. Familiarity with CI/CD pipelines. Who Should Apply Professionals from IT companies with strong expertise in distributed systems. Immediate joiners or those with a notice period of 15 days or less. Based in or willing to relocate to Hyderabad (Work from Office).
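The posting above calls for implementing leader election. With ZooKeeper (or etcd), a common recipe is for each participant to create an ephemeral sequential node and treat the live node with the lowest sequence number as the leader. A minimal in-memory simulation of that recipe (no real ZooKeeper client; all names are illustrative):

```python
import itertools

class ElectionSim:
    """Simulates ZooKeeper-style leader election: each candidate gets a
    monotonically increasing sequence number, and the live candidate with
    the lowest number is the leader."""

    def __init__(self):
        self._seq = itertools.count()
        self._members = {}  # candidate name -> sequence number

    def join(self, name):
        self._members[name] = next(self._seq)
        return self._members[name]

    def leave(self, name):
        # Models the ephemeral node vanishing when its session dies.
        self._members.pop(name, None)

    def leader(self):
        if not self._members:
            return None
        return min(self._members, key=self._members.get)

e = ElectionSim()
e.join("broker-a"); e.join("broker-b"); e.join("broker-c")
print(e.leader())    # broker-a (lowest sequence number)
e.leave("broker-a")  # the leader's session dies...
print(e.leader())    # broker-b takes over automatically
```

In a real deployment each candidate also watches only its immediate predecessor's node to avoid a "herd effect" of all followers waking on every change.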

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Surat

Work from Office

Job Title : Kafka Integration Specialist Job Description : We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.
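Two of the concepts this description leans on, key-based partitioning and consumer-group assignment, can be sketched compactly. Note the hedge: Kafka's default partitioner hashes the record key with murmur2 modulo the partition count; the CRC32 stand-in below illustrates the idea but will not match real broker assignments. The range assignor follows Kafka's per-topic rule of sorted consumers taking contiguous chunks.

```python
import zlib

NUM_PARTITIONS = 6

def partition_for(key: bytes, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a record key to a partition. CRC32 is a stand-in for Kafka's
    murmur2, so results will NOT match a real broker."""
    return zlib.crc32(key) % num_partitions

def range_assign(partitions: list, consumers: list) -> dict:
    """Simplified range assignor: sorted consumers take contiguous chunks,
    with earlier consumers absorbing the remainder."""
    consumers = sorted(consumers)
    n, extra = divmod(len(partitions), len(consumers))
    out, start = {}, 0
    for i, c in enumerate(consumers):
        size = n + (1 if i < extra else 0)
        out[c] = partitions[start:start + size]
        start += size
    return out

# Records with the same key always land in the same partition (per-key ordering).
assert partition_for(b"order-42") == partition_for(b"order-42")
print(range_assign(list(range(NUM_PARTITIONS)), ["c1", "c2", "c3", "c4"]))
# c1 and c2 get two partitions each; c3 and c4 get one each.
```

This is why keeping a key stable preserves ordering for that key, and why adding consumers beyond the partition count leaves some consumers idle.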

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Pune

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Ahmedabad

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted Date not available

Apply

24.0 - 29.0 years

8 - 13 Lacs

Bengaluru

Work from Office

We are looking for a skilled Software Engineer with experience in developing carrier-grade network management solutions. In this role, you will design and develop scalable, high-performance applications, particularly for Nokia's Network Services Platform (NSP), with a focus on intuitive, modern UIs using technologies like React and Redux. You will work in a collaborative Agile environment alongside cross-functional teams to refine requirements, ensure product quality, and drive innovation in network automation and operations. You have: Education & Experience: Bachelor's degree in Engineering, Computer Science, or a related field with 2-4 years of relevant work experience. Programming Skills: Proficient in Java and Golang. Frameworks & Tools: Hands-on experience with J2EE and Spring Boot. Microservices & Infrastructure: Experience with microservices architecture, Docker/Kubernetes, Kafka, and ZooKeeper. Telecom Domain: Background in developing EMS/NMS products for managing telecommunication networks. Network Protocols: Knowledge of network management protocols and modeling such as SNMP, NETCONF/YANG, RESTCONF, and gNMI. Scalable Systems: Experience building highly distributed, scalable, carrier-grade software systems. Soft Skills & Passion: Strong communication skills, fast learner, and passionate about software development, automation, and delivering user-centric solutions. You will: Design and develop scalable, reliable network management applications with a focus on performance and carrier-grade standards. Contribute to standardization efforts by participating in defining and implementing best practices for network management solutions. Collaborate in Agile teams, working closely with developers, testers, and product managers to ensure timely and high-quality project delivery. Refine requirements by actively engaging with stakeholders to translate business needs into technical solutions.
Ensure quality and security by developing and testing solutions that meet high standards of scalability, reliability, and cybersecurity. Drive innovation by exploring and applying new technologies, tools, and frameworks that enhance product capabilities. Support CI/CD processes and automation to streamline development, testing, and deployment workflows. Maintain technical documentation and contribute to knowledge sharing within the team to support ongoing improvement and onboarding.

Posted Date not available

Apply

8.0 - 12.0 years

11 - 15 Lacs

Bengaluru

Work from Office

As a Senior Technical Specialist, you will design and develop UIs for Network Management applications in Optical Transport, WDM, and SDH/SONET networks. With 8-12 years of experience, you'll use ReactJS, JavaScript, HTML, CSS, and Angular to build high-performing UIs. Experience with Core Java, Spring, Kafka, Python, and databases is a plus. You will work in an agile, distributed environment, enhancing functionality, user experience, and customer satisfaction. You have: Bachelor's degree or equivalent with 8 to 12 years of experience in user interface design and development. Expertise in UI development technologies such as ReactJS, JavaScript, HTML, CSS, Angular, and jQuery. Working experience with Core Java, Spring, Kafka, ZooKeeper, Hibernate, and Python is desirable. Good understanding of continuous integration and agile practices. It would be nice if you also had: Working knowledge of RDBMS, PL/SQL, Linux, Docker, and database concepts. Domain knowledge in OTN and Photonic network management. Hands-on experience in development activities like requirement analysis, design, coding, and unit testing. A solid, well-balanced personality, able to manage high-paced development activities with self-organizing capabilities. You will: Develop software for Network Management of Optics Division products, including Photonic/WDM, Optical Transport, and SDH/SONET. Provide users with control over network configuration through Optics Network Management applications. Interface Optics Network Management applications with various Network Elements. Deploy Optics Network Management applications globally for installations. Contribute to new developments and maintain applications as part of the development team. Enhance functionality and customer satisfaction through ongoing application improvements.

Posted Date not available

Apply

8.0 - 12.0 years

11 - 15 Lacs

Bengaluru

Work from Office

As a Senior Technical Specialist, you will design and develop software for Optical Network Management, working with Core Java, Spring, Kafka, and RDBMS. You will contribute to new developments and maintenance, collaborate with cross-functional teams, and ensure high-quality solutions. Your role includes requirement analysis, coding, testing, and integration, with exposure to UI technologies like React a plus. Passionate about innovation, you will drive agile development for global deployments. You have: Bachelor's degree or equivalent with 8 to 12 years of experience. Motivation for high-quality software development in complex, challenging, and geographically distributed environments. Hands-on experience in development activities such as requirement analysis, design, coding, and unit testing. Domain knowledge in OTN (Optical Transport Network) and Photonic network management would be a value add. It would be nice if you also had: A motivated, never-give-up attitude for reaching assigned goals in a dynamic and competitive international environment. A solid, well-balanced personality, able to handle high-paced development activity with self-organizing capabilities. Good knowledge of continuous integration and agile practices. You will: Develop software for Network Management of Optics Division products, including Photonic/WDM, Optical Transport, and SDH/SONET. Enable user control over network configuration through Optics Network Management applications. Interface Optics Network Management applications with various Network Elements, providing a user-friendly graphical interface and implementing algorithms. Deploy Optics Network Management applications globally, supporting hundreds of installations across different customer sizes. Contribute to new developments as part of the Optics Network Management development team. Required: Hands-on experience with Core Java, Spring, Kafka, ZooKeeper, Hibernate, and Python.
Working knowledge of RDBMS, PL/SQL, Linux, Docker, and database concepts. Exposure to UI technologies like React is desirable.

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Kolkata

Hybrid

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted Date not available

Apply

5.0 - 8.0 years

4 - 8 Lacs

Chennai

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Responsibilities: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends and prevent future problems. Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to
record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Mandatory Skills: Hadoop Admin. Experience: 5-8 Years.

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Mumbai

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Gurugram

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted Date not available

Apply

5.0 - 10.0 years

2 - 6 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
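The topics/partitions/consumer-groups skill above rests on one idea: records with the same key are routed to the same partition, which is what preserves per-key ordering. A simplified Python sketch of key-based routing follows; Kafka's default partitioner actually murmur2-hashes the record key, and CRC32 stands in here purely for illustration.

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    # Simplified stand-in for Kafka's default partitioner, which
    # murmur2-hashes the record key; CRC32 is used here for illustration.
    return zlib.crc32(key) % num_partitions

# Records with the same key always land in the same partition,
# so per-key ordering is preserved across producer sends.
p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2
```

Within a consumer group, each partition is assigned to exactly one consumer, so the partition count also caps a group's useful parallelism.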

Posted Date not available

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

We are looking for Software Engineers with back-end web application and systems-level experience to join our Fabric Development Team. Our passion for innovation and winning in the cloud marketplace is infectious, and we hope you will feel it with us.

The Fabric Development team is dedicated to ensuring that the IBM Cloud is at the forefront of cloud technology, from API design to application architecture to flexible infrastructure services. We are running IBM's current-generation cloud platform to deliver performance and predictability for our customers' most demanding workloads, at global scale and with leadership efficiency, resiliency and security. It is an exciting time, and as a team we are driven by this incredible opportunity to thrill our clients.

The Fabric Development Team sits at the center of our larger development effort. Developers work in areas that are used by the larger development organization and are required to work with developers and stakeholders in other teams to help solve problems.

Responsibilities:
- Design and develop innovative, company-impacting products and services to support infrastructure operations
- Design, develop and implement object-oriented PHP applications from prototype through implementation
- Integrate open source and commercial enterprise applications into an exposed API and web-based portal
- Create highly scalable and performant REST/SOAP web services
- Keep focus on end users and goals throughout the development process
- Work closely with product management and stakeholders to ensure applications meet needs and expectations
- Adhere to the highest levels of technical discipline and excellence to set a standard for the larger development organization

Required education: Bachelor's Degree

Required technical and professional expertise:
- Bachelor's Degree in Computer Science, a related field, or comparable additional work experience
- 6+ years of experience with back-end object-oriented PHP development
- Solid experience with SQL and relational databases
- Solid experience with object-oriented design and development
- Experience developing web API interfaces
- Experience with version control systems, preferably Git
- Experience writing and debugging C code

Preferred technical and professional experience:
- Experience with shell scripting
- Experience with Java and/or Python
- Solid experience developing backend code using PHP
- Experience with non-relational data stores such as ZooKeeper or Memcache
- Experience with Docker and container orchestration technologies such as Kubernetes

Posted Date not available

Apply

10.0 - 15.0 years

3 - 5 Lacs

Hyderabad, India

Hybrid

Job Purpose: Designs, develops, and implements Java applications to support business requirements. Follows approved life-cycle methodologies, creates design documents, writes code and performs unit and functional testing of software. Contributes to the overall architecture and standards of the group, acts as an SME and plays a software governance role.

Key Activities / Outputs
• Work closely with business analysts to analyse and understand the business requirements and business case, in order to produce simple, cost-effective and innovative solution designs
• Implement the designed solutions in the required development language (typically Java) in accordance with the Vitality Group standards, processes, tools and frameworks
• Test the quality of produced software thoroughly through participation in code reviews, the use of static code analysis tools, creation and execution of unit tests, functional regression tests, load tests and stress tests, and evaluation of the performance metrics collected on the software
• Participate in feasibility studies, proofs of concept, JAD sessions, and estimation and costing sessions; evaluate and review programming methods, tools and standards
• Maintain the system in production and provide support in the form of query resolution and defect fixes
• Prepare the necessary technical documentation, including payload definitions, class diagrams, activity diagrams, ERDs, and operational and support documentation
• Drive the skills development of team members: coaching for performance and career development, recruitment, staff training, performance management, etc.

Technical Skills or Knowledge: Extensive experience working with Java; solid understanding of object-oriented programming fundamentals; a high-level understanding of the common frameworks in the Java technology stack; extensive knowledge of design patterns and the ability to recognize and apply them. Spring, Hibernate, JUnit, SOA, microservices, Docker, data modelling, UML, SQL, SoapUI (SOAP) / REST client (JSON), architectural styles, Kafka, ZooKeeper, Zuul, Eureka, Obsidian, Elasticsearch, Kibana, FluentD.

This position is a hybrid role based in Hyderabad, which requires you to be in the office on Tuesday, Wednesday and Thursday.

Posted Date not available

Apply

5.0 - 7.0 years

22 - 27 Lacs

Bengaluru

Work from Office

The Opportunity
Nutanix pioneered the SDDC revolution by bringing the simplicity and elasticity of the cloud to the on-prem data center. Our converged virtualization product can scale to manage petabytes of data while running thousands of virtual machines. And while cloud adoption is surely accelerating, the laws of economics and compliance are ensuring that the future belongs to hybrid clouds. Nutanix is heavily investing in features that make hybrid cloud deployments seamless and frictionless. We're looking for solid engineers who are constantly looking to push the envelope in terms of not only usability and functionality, but also performance and scale. This role will provide you with an opportunity to deeply validate the functionality of features of the core HCI software infrastructure. We have a strong bias towards automation in everything we do, so be prepared to code extensively as part of your role.

About the Team
At Nutanix, you'll be joining the CoreInfra team, which consists of 17 dedicated professionals based in our Bangalore office. We foster a culture of collaboration and unity, emphasizing the importance of "one team, one goal." Our environment is built on mutual respect and support, encouraging each member to contribute their unique skills while working together seamlessly to achieve common objectives. You will report to a Senior Engineering Manager who is committed to guiding the team through innovative projects while promoting professional growth. Additionally, there are no travel requirements for this role, enabling you to focus on your projects and responsibilities without the added demand of traveling for work.

Your Role
- Writing and executing detailed functional test plans to thoroughly validate features
- Executing customer deployment scenario tests
- Automating all aspects of test workflows, including test libraries and frameworks
- Collaborating with a team of developers and testers to plan and execute test activities
- Working closely with development engineers to analyze and find the root cause of failures
- Engaging with support engineers and customers to debug and solve production issues

What You Will Bring
- 5 to 7 years of software testing experience at a distributed systems product company
- BE or MS in Computer Science or a related field
- Experience in software development or test automation in Python
- Good understanding of distributed systems in one or more domains: server virtualization, storage, networking, security, containerization, or related products
- Experience with hypervisor virtualization such as vSphere, KVM, or Hyper-V
- Good understanding of Linux debugging and resource management with respect to functional or system testing
- Experience testing REST APIs and with UI automation
- Exposure to clustering technologies such as ZooKeeper, Cassandra, HA routing, etc. will be a plus
- Prior system-testing experience with large-scale deployments will be a plus
- Understanding of CI/CD tools: Jenkins/CircleCI, Git
- Hands-on exposure to a public cloud environment (AWS, GCP, Azure)

Work Arrangement
Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 3 days per week; however, certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.
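The REST-API test-automation expectation above can be sketched in Python with the standard library's unittest.mock, stubbing the HTTP session so the test runs without a live cluster. The endpoint path, client helper, and response shape are hypothetical illustrations, not Nutanix APIs.

```python
from unittest import mock

def get_cluster_health(session, base_url):
    """Return the health status reported by a (hypothetical) REST endpoint."""
    resp = session.get(f"{base_url}/api/v1/cluster/health")
    resp.raise_for_status()  # surface HTTP errors as exceptions
    return resp.json()["status"]

# Stub the session: no network, no cluster, fully deterministic.
fake_session = mock.Mock()
fake_session.get.return_value.json.return_value = {"status": "HEALTHY"}

assert get_cluster_health(fake_session, "https://mgmt.local") == "HEALTHY"
# The stub also lets the test assert the exact URL the client called.
fake_session.get.assert_called_once_with("https://mgmt.local/api/v1/cluster/health")
```

The same helper can be exercised against a real session object in system tests, keeping one code path for both mocked and live runs.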

Posted Date not available

Apply
