5.0 - 10.0 years
15 - 20 Lacs
Noida, Gurugram
Work from Office
Manage end-to-end activities including installation, configuration, and maintenance of Kafka clusters; manage the topics configured in Kafka; and ensure maximum uptime of Kafka. Monitor the performance of producer and consumer threads interacting with Kafka.
Required Candidate profile: Kafka certification; hands-on experience managing large Kafka clusters and installations; ability to monitor the performance of producer and consumer threads interacting with Kafka.
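For illustration only: a minimal sketch of the kind of topic-management task described above, using the Java kafka-clients AdminClient. The bootstrap address, topic name, and partition/replication values are placeholders, not details from the posting.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class TopicSetup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder bootstrap address; point this at the real cluster.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 6 partitions and replication factor 3 (illustrative values).
            NewTopic orders = new NewTopic("orders", 6, (short) 3);
            admin.createTopics(Collections.singletonList(orders)).all().get();

            // List existing topics as a quick sanity check on cluster availability.
            admin.listTopics().names().get().forEach(System.out::println);
        }
    }
}
```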
Posted 1 week ago
10.0 - 15.0 years
12 - 16 Lacs
Pune, Bengaluru
Work from Office
We are seeking a talented and experienced Kafka Architect with migration experience to Google Cloud Platform (GCP) to join our team. As a Kafka Architect, you will be responsible for designing, implementing, and managing our Kafka infrastructure to support our data processing and messaging needs, while also leading the migration of our Kafka ecosystem to GCP. You will work closely with our engineering and data teams to ensure seamless integration and optimal performance of Kafka on GCP.
Responsibilities:
Discovery, analysis, planning, design, and implementation of Kafka deployments on GKE, with a specific focus on migrating Kafka from AWS to GCP.
Design, architect, and implement scalable, high-performance Kafka architectures and clusters to meet our data processing and messaging requirements.
Lead the migration of our Kafka infrastructure from on-premises or other cloud platforms to Google Cloud Platform (GCP).
Conduct thorough discovery and analysis of existing Kafka deployments on AWS.
Develop and implement best practices for Kafka deployment, configuration, and monitoring on GCP.
Develop a comprehensive migration strategy for moving Kafka from AWS to GCP.
Collaborate with engineering and data teams to integrate Kafka into our existing systems and applications on GCP.
Optimize Kafka performance and scalability on GCP to handle large volumes of data and high throughput.
Plan and execute the migration, ensuring minimal downtime and data integrity.
Test and validate the migrated Kafka environment to ensure it meets performance and reliability standards.
Ensure Kafka security on GCP by implementing authentication, authorization, and encryption mechanisms.
Troubleshoot and resolve issues related to Kafka infrastructure and applications on GCP.
Ensure seamless data flow between Kafka and other data sources/sinks.
Implement monitoring and alerting mechanisms to ensure the health and performance of Kafka clusters.
Stay up to date with Kafka developments and GCP services to recommend and implement new features and improvements.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
Proven experience as a Kafka Architect or similar role, with a minimum of [5] years of experience.
Deep knowledge of Kafka internals and ecosystem, including Kafka Connect, Kafka Streams, and KSQL.
In-depth knowledge of Apache Kafka architecture, internals, and ecosystem components.
Proficiency in scripting and automation for Kafka management and migration.
Hands-on experience with Kafka administration, including cluster setup, configuration, and tuning.
Proficiency in Kafka APIs, including Producer, Consumer, Streams, and Connect.
Strong programming skills in Java, Scala, or Python.
Experience with Kafka monitoring and management tools such as Confluent Control Center, Kafka Manager, or similar.
Solid understanding of distributed systems, data pipelines, and stream processing.
Experience leading migration projects to Google Cloud Platform (GCP), including migrating Kafka workloads.
Familiarity with GCP services such as Google Kubernetes Engine (GKE), Google Cloud Storage, Google Cloud Pub/Sub, and BigQuery.
Excellent communication and collaboration skills.
Ability to work independently and manage multiple tasks in a fast-paced environment.
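As a hedged illustration of the migration-validation work described above, the sketch below compares topic lists between a source and a target cluster with the Java AdminClient. Both bootstrap endpoints are hypothetical, and a real migration would also need to reconcile topic configs, ACLs, and consumer offsets (for example via MirrorMaker 2).

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

import java.util.HashSet;
import java.util.Properties;
import java.util.Set;

public class MigrationParityCheck {
    private static AdminClient client(String bootstrap) {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        return AdminClient.create(props);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoints for the source (AWS) and target (GCP/GKE) clusters.
        try (AdminClient source = client("source-kafka.aws.example.com:9092");
             AdminClient target = client("target-kafka.gcp.example.com:9092")) {

            Set<String> sourceTopics = source.listTopics().names().get();
            Set<String> targetTopics = target.listTopics().names().get();

            // Topics that exist on the source but are missing on the target.
            Set<String> missing = new HashSet<>(sourceTopics);
            missing.removeAll(targetTopics);
            System.out.println("Topics not yet migrated: " + missing);
        }
    }
}
```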
Posted 2 weeks ago
5.0 - 10.0 years
10 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & responsibilities:
Administer and maintain Apache Kafka clusters, including installation, upgrades, configuration, and performance tuning.
Design and implement Kafka topics, partitions, replication, and consumer groups.
Ensure high availability and scalability of Kafka infrastructure in production environments.
Monitor Kafka health and performance using tools like Prometheus, Grafana, Confluent Control Center, etc.
Implement and manage security configurations such as SSL/TLS, authentication (Kerberos/SASL), and access control.
Collaborate with development teams to design and configure Kafka-based integrations and data pipelines.
Perform root cause analysis of production issues and ensure timely resolution.
Create and maintain documentation for Kafka infrastructure and configurations.
Required Skills:
Strong expertise in Kafka administration, including hands-on experience with open-source and/or Confluent Kafka.
Experience with Kafka ecosystem tools (Kafka Connect, Kafka Streams, Schema Registry).
Proficiency in Linux-based environments and scripting (Bash, Python).
Experience with monitoring/logging tools and Kafka performance optimization.
Ability to work independently and proactively manage Kafka environments.
Familiarity with DevOps tools and CI/CD pipelines (e.g., Jenkins, Git, Ansible).
Preferred Skills:
Experience with managed Kafka services on cloud platforms (AWS, GCP, or Azure).
Knowledge of messaging alternatives like RabbitMQ, Pulsar, or ActiveMQ.
Working knowledge of Docker and Kubernetes for Kafka deployment.
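To make the security items above concrete, here is a minimal sketch of Java client properties for a SASL_SSL-secured cluster. The broker address, SASL mechanism, credentials, and truststore path are assumed placeholders, not values from the posting.

```java
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.config.SslConfigs;

import java.util.Properties;

public class SecureClientConfig {
    // Builds client properties for a SASL_SSL-secured cluster; all hosts,
    // paths, and credentials below are illustrative placeholders.
    public static Properties secureProps() {
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"app-user\" password=\"changeit\";");
        props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/ssl/truststore.jks");
        props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "changeit");
        return props;
    }
}
```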
Posted 2 weeks ago
6.0 - 11.0 years
4 - 9 Lacs
Bengaluru
Work from Office
SUMMARY
Job Role: Apache Kafka Admin
Experience: 6+ years
Location: Pune (Preferred), Bangalore, Mumbai
Must-Have: The candidate should have 6 years of relevant experience in Apache Kafka.
Job Description: We are seeking a highly skilled and experienced Senior Kafka Administrator to join our team. The ideal candidate will have 6-9 years of hands-on experience in managing and optimizing Apache Kafka environments. As a Senior Kafka Administrator, you will play a critical role in designing, implementing, and maintaining Kafka clusters to support our organization's real-time data streaming and event-driven architecture initiatives.
Responsibilities:
Design, deploy, and manage Apache Kafka clusters, including installation, configuration, and optimization of Kafka brokers, topics, and partitions.
Monitor Kafka cluster health, performance, and throughput metrics and implement proactive measures to ensure optimal performance and reliability.
Troubleshoot and resolve issues related to Kafka message delivery, replication, and data consistency.
Implement and manage Kafka security mechanisms, including SSL/TLS encryption, authentication, authorization, and ACLs.
Configure and manage Kafka Connect connectors for integrating Kafka with various data sources and sinks.
Collaborate with development teams to design and implement Kafka producers and consumers for building real-time data pipelines and streaming applications.
Develop and maintain automation scripts and tools for Kafka cluster provisioning, deployment, and management.
Implement backup, recovery, and disaster recovery strategies for Kafka clusters to ensure data durability and availability.
Stay up-to-date with the latest Kafka features, best practices, and industry trends and provide recommendations for optimizing our Kafka infrastructure.
Requirements:
6-9 years of experience as a Kafka Administrator or similar role, with a proven track record of managing Apache Kafka clusters in production environments.
In-depth knowledge of Kafka architecture, components, and concepts, including brokers, topics, partitions, replication, and consumer groups.
Hands-on experience with Kafka administration tasks, such as cluster setup, configuration, performance tuning, and monitoring.
Experience with Kafka ecosystem tools and technologies, such as Kafka Connect, Kafka Streams, and Confluent Platform.
Proficiency in scripting languages such as Python, Bash, or Java.
Strong understanding of distributed systems, networking, and Linux operating systems.
Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues.
Strong communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
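As an illustration of the cluster-health monitoring described above, a rough sketch that computes consumer-group lag with the Java AdminClient. The group name and bootstrap address are hypothetical, and partitions without committed offsets are not handled.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

public class ConsumerLagCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder

        try (AdminClient admin = AdminClient.create(props)) {
            // Committed offsets for a hypothetical consumer group.
            Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets("payments-consumer")
                         .partitionsToOffsetAndMetadata().get();

            // Latest (end) offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> latestSpec = committed.keySet().stream()
                    .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> latest =
                    admin.listOffsets(latestSpec).all().get();

            // Lag per partition = end offset - committed offset.
            committed.forEach((tp, meta) -> System.out.printf("%s lag=%d%n",
                    tp, latest.get(tp).offset() - meta.offset()));
        }
    }
}
```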
Posted 3 weeks ago
3 - 8 years
10 - 20 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Job Description:
Stand up and administer on-premises Kafka clusters.
Ability to architect and create reference architectures and Kafka implementation standards.
Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center.
Ensure optimum performance, high availability, and stability of solutions.
Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices.
Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
Provide administration and operations of the Kafka platform, such as provisioning, access lists, Kerberos, and SSL configurations.
Use automation tools for provisioning, such as Docker, Jenkins, and GitLab.
Ability to perform data-related benchmarking, performance analysis, and tuning.
Strong skills in in-memory applications, database design, and data integration.
Participate in design and capacity review meetings to provide suggestions on Kafka usage.
Solid knowledge of monitoring tools and fine-tuning alerts on Splunk, Prometheus, and Grafana.
Set up security on Kafka.
Provide naming conventions, backup & recovery, and problem-determination strategies for projects.
Monitor, prevent, and troubleshoot security-related issues.
Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure.
QUALIFICATIONS
Demonstrated proficiency and experience in design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure.
Hands-on experience with recovery in Kafka.
2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities.
Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
Considerable experience in implementing Kerberos security.
Support a 24x7 model and be available for rotational on-call work.
Competent working in one or more environments highly integrated with an operating system.
Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
High critical-thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy.
Ability to manage tasks independently and take ownership of responsibilities.
Ability to learn from mistakes and apply constructive feedback to improve performance.
Ability to adapt to a rapidly changing environment.
Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and displaying appropriate levels of assertiveness.
Ability to communicate highly complex technical information clearly and articulately to all levels and audiences.
Willingness to learn new technologies/tools and train your peers.
Proven track record of automation.
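For the producer/consumer onboarding stubs mentioned above, a minimal Java producer stub might look like the following; the broker address and topic name are placeholders.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class ProducerStub {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for full ISR acknowledgement

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send a test record and log the partition/offset it landed on.
            producer.send(new ProducerRecord<>("onboarding-test", "key-1", "hello"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("partition=%d offset=%d%n",
                                    metadata.partition(), metadata.offset());
                        }
                    });
        } // closing the producer flushes any pending sends
    }
}
```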
Posted 2 months ago
7 - 10 years
20 - 22 Lacs
Chennai, Pune, Noida
Work from Office
Experience in Java, Apache Kafka, Kafka Streams, cluster application development, topic management, data pipeline development, producer and consumer implementation, integration and connectivity, cluster administration, security and compliance, and Apache ZooKeeper.
Required Candidate profile: 7-10 years of experience covering Kafka expertise, programming skills, big data and streaming technologies, database knowledge, cloud and DevOps, event-driven architecture, security and scalability, and problem solving and teamwork.
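As a small illustration of the Kafka Streams work referenced above, a minimal Java topology sketch; the topic names and application id are hypothetical.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");  // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read from an input topic, transform each value, write to an output topic.
        KStream<String, String> input = builder.stream("raw-events");
        input.mapValues(value -> value.toUpperCase()).to("processed-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```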
Posted 2 months ago
7 - 11 years
13 - 19 Lacs
Chennai, Pune, Delhi NCR
Work from Office
Role & responsibilities:
Urgent hiring for one of the reputed MNCs for a Kafka Developer.
Experience: 7-11 years. Immediate joiners only.
Location: Pune, Chennai, Noida.
Educational Background: A degree in Computer Science, IT, or a related field.
Kafka Expertise: Strong knowledge of Kafka architecture, brokers, producers, consumers, and stream processing.
Programming Skills: Proficiency in Java, Scala, or Python for developing Kafka-based applications.
Big Data & Streaming Technologies: Experience with Spark, Flink, or Apache Storm is a plus.
Database Knowledge: Familiarity with SQL and NoSQL databases like Cassandra, MongoDB, or PostgreSQL.
Cloud & DevOps: Experience with cloud platforms (AWS, Azure, GCP) and Kubernetes/Docker.
Event-Driven Architecture: Understanding of event-driven and microservices architectures.
Monitoring & Debugging: Experience with Kafka monitoring tools like Confluent Control Center, Kafka Manager, or the ELK stack.
Security & Scalability: Knowledge of Kafka security, access control, and scaling strategies.
Problem-Solving & Communication: Strong analytical skills and the ability to work in cross-functional teams.
Preferred candidate profile: Kafka application development, data pipeline development, producer and consumer implementation, integration and connectivity, performance optimization, security and compliance, cluster administration, monitoring and logging, documentation.
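To illustrate the producer/consumer knowledge listed above, a minimal Java consumer sketch, assuming plaintext access to a local broker and an illustrative topic and group id.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "event-processor");       // hypothetical group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("orders"));
            while (true) {
                // Poll in a loop; each record carries topic, partition, offset, key, and value.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s-%d@%d %s=%s%n", record.topic(),
                            record.partition(), record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```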
Posted 2 months ago
7 - 9 years
3 - 7 Lacs
Pune
Work from Office
Job Description: Senior Kafka Developer, having around 7+ years of experience across integration technologies, data streaming technologies, and Kafka.
Solid understanding of Kafka architecture and experience in using Kafka.
Ensure optimum performance, high availability, and stability of solutions.
Strong hands-on experience designing integration solutions using Kafka and awareness of integration best practices for data streaming solutions.
Strong awareness of the Kafka ecosystem: ZooKeeper, Kafka clusters, Kafka brokers, producers, consumers, connectors, the different APIs, Kafka topics, etc.
Strong hands-on experience with the SQL connector, HTTP connector, etc.
Strong knowledge of the different APIs exposed by Kafka to handle integration, data streaming, and data management.
Strong awareness of designing deployment architectures for Kafka solutions.
Solid hands-on experience with Kafka deployment and configuration.
Experience with production deployment, invoking Kafka components as background processes, configuration, troubleshooting, and environment maintenance.
Location: Pune
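For the connector work mentioned above, a hedged sketch that registers a JDBC source connector through the Kafka Connect REST API using the JDK HttpClient. The worker URL, database details, and the Confluent JdbcSourceConnector class are assumptions, not details from the posting; the connector plugin must already be installed on the Connect workers.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterJdbcConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical connector definition posted to the Connect REST API.
        String body = """
                {
                  "name": "orders-jdbc-source",
                  "config": {
                    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                    "connection.url": "jdbc:postgresql://db-host:5432/orders",
                    "connection.user": "connect",
                    "connection.password": "changeit",
                    "mode": "incrementing",
                    "incrementing.column.name": "id",
                    "topic.prefix": "db-"
                  }
                }""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://connect-host:8083/connectors")) // placeholder worker URL
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```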
Posted 2 months ago
5 - 10 years
15 - 30 Lacs
Bengaluru
Work from Office
5+ years of experience with Confluent Kafka, Kafka Connect, Kafka clusters, and ZooKeeper.
Posted 3 months ago
3 - 7 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Apache Kafka
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: Minimum 15 years of full-time education.
Summary: As an Application Designer, you will be responsible for assisting in defining requirements and designing applications to meet business process and application requirements using Apache Kafka. Your typical day will involve working with cross-functional teams, analyzing requirements, and designing scalable and reliable applications.
Roles & Responsibilities:
Collaborate with cross-functional teams to analyze business requirements and design scalable and reliable applications using Apache Kafka.
Design and develop Kafka-based solutions for real-time data processing and streaming.
Ensure the performance, scalability, and reliability of Kafka clusters and applications.
Implement security and access control measures for Kafka clusters and applications.
Stay updated with the latest advancements in Kafka and related technologies, integrating innovative approaches for sustained competitive advantage.
Professional & Technical Skills:
Must-have skills: Strong experience in Apache Kafka.
Good-to-have skills: Experience with Apache Spark, Apache Flink, and other big data technologies.
Experience in designing and developing Kafka-based solutions for real-time data processing and streaming.
Strong understanding of Kafka architecture, configuration, and performance tuning.
Experience in implementing security and access control measures for Kafka clusters and applications.
Solid grasp of distributed systems and microservices architecture.
Additional Information:
The candidate should have a minimum of 3 years of experience in Apache Kafka.
The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful Kafka-based solutions.
This position is based at our Mumbai office.
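As one concrete example of the access-control measures mentioned above, a minimal Java AdminClient sketch that grants a hypothetical principal read access to a topic; it assumes the cluster has an authorizer enabled, and the broker address, topic, and principal are placeholders.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

import java.util.Collections;
import java.util.Properties;

public class GrantTopicRead {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder

        try (AdminClient admin = AdminClient.create(props)) {
            // Allow a hypothetical principal to read the "orders" topic from any host.
            AclBinding binding = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
                    new AccessControlEntry("User:orders-reader", "*",
                            AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(Collections.singletonList(binding)).all().get();
        }
    }
}
```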
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Pune
Work from Office
Job Title: Solutions IT Developer - Kafka Specialist
Location: Toronto / Offshore - Pune
About The Role:
We are seeking a seasoned Solutions IT Developer with a strong background in Apache Kafka to join the developer advocacy function in our event streaming team. The ideal candidate will be responsible for Kafka code reviews with clients, troubleshooting client connection issues with Kafka, and supporting client onboarding to Confluent Cloud. This role requires a mix of software development expertise along with a deep understanding of Kafka architecture, components, and tuning.
Responsibilities:
1. Support for Line of Business (LOB) Users:
- Assist LOB users with onboarding to Apache Kafka (Confluent Cloud/Confluent Platform), ensuring a smooth integration process and understanding of the platform's capabilities.
2. Troubleshooting and Technical Support:
- Resolve connectivity issues, including client and library problems, to ensure seamless use of our Software Development Kit (SDK), accelerators, and Kafka client libraries.
- Address network connectivity and access issues.
- Provide a deep level of support for the Kafka library, offering advanced troubleshooting and guidance.
- Java 11/17 and Spring Boot (Spring Kafka, Spring Cloud Stream with the Kafka binder) experience.
3. Code Reviews and Standards Compliance:
- Perform thorough code reviews to validate client code against our established coding standards and best practices.
- Support the development of Async specifications tailored to client use cases, promoting effective and efficient data handling.
4. Developer Advocacy:
- Act as a developer advocate for all Kafka development at TD, fostering a supportive community and promoting best practices among developers.
5. Automation and APIs:
- Manage and run automation pipelines for clients using REST APIs as we build out the GitHub Actions flow.
6. Documentation and Knowledge Sharing:
- Update and maintain documentation standards, including troubleshooting guides, to ensure clear and accessible information is available.
- Create and disseminate knowledge materials, such as how-tos and FAQs, to answer common client questions related to Kafka development.
Role Requirements:
Qualifications:
- Bachelor's degree in Computer Science.
- Proven work experience as a Solutions Developer or similar role with a focus on Kafka design and development.
Skills:
- In-depth knowledge of Java 11/17 and Spring Boot (Spring Kafka, Spring Cloud Stream with the Kafka binder).
- Deep knowledge of Apache Kafka, including Kafka Streams and Kafka Connect experience.
- Strong development skills in one or more high-level programming languages (Java, Python).
- Familiarity with Kafka API development and integration.
- Understanding of distributed systems principles and data streaming concepts.
- Experience with source control tools such as Git, and CI/CD pipelines.
- Excellent problem-solving and critical-thinking skills.
Preferred:
- Kafka certification (e.g., Confluent Certified Developer for Apache Kafka).
- Experience with streaming data platforms and ETL processes.
- Prior work with NoSQL databases and data warehousing solutions.
Experience:
- Minimum of 4 years of hands-on experience with Apache Kafka.
- Experience with large-scale data processing and event-driven system design.
Other Requirements:
- Good communication skills, both written and verbal.
- Ability to work independently as well as collaboratively.
- Strong analytical skills and attention to detail.
- Willingness to keep abreast of industry developments and new technologies.
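To illustrate the Spring Kafka stack named in this posting, a minimal Spring Boot listener sketch; the topic, group id, and any Confluent Cloud connection settings (normally supplied via spring.kafka.* properties) are placeholders.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class ListenerApplication {
    public static void main(String[] args) {
        SpringApplication.run(ListenerApplication.class, args);
    }
}

@Component
class ClientEventListener {
    // Topic and group id are illustrative; bootstrap servers and security settings
    // come from spring.kafka.* properties in application.yml (e.g. Confluent Cloud credentials).
    @KafkaListener(topics = "client-events", groupId = "onboarding-demo")
    public void onMessage(String message) {
        System.out.println("Received: " + message);
    }
}
```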
Posted 3 months ago