
33 Kafka Cluster Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description: Stand up and administer on-premise Kafka clusters.
• Architect and create reference architectures and Kafka implementation standards.
• Provide expertise in Kafka brokers, Zookeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
• Ensure optimum performance, high availability, and stability of solutions.
• Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply best practices.
• Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
• Administer and operate the Kafka platform: provisioning, access lists, Kerberos, and SSL configuration.
• Use automation tools such as Docker, Jenkins, and GitLab for provisioning.
• Perform data-related benchmarking, performance analysis, and tuning.
• Strong skills in in-memory applications, database design, and data integration.
• Participate in design and capacity review meetings to advise on Kafka usage.
• Solid knowledge of monitoring tools and fine-tuning alerts in Splunk, Prometheus, and Grafana.
• Set up security on Kafka.
• Provide naming conventions, backup and recovery, and problem-determination strategies for projects.
• Monitor, prevent, and troubleshoot security-related issues.
• Provide strategic vision for engineering solutions that touch the messaging-queue aspect of the infrastructure.

QUALIFICATIONS
• Demonstrated proficiency and experience in designing, implementing, monitoring, and troubleshooting Kafka messaging infrastructure.
• Hands-on experience with recovery in Kafka.
• 2 or more years of experience developing/customizing messaging-related monitoring tools and utilities.
• Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
• Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
• Considerable experience implementing Kerberos security.
• Support a 24x7 model and be available for rotational on-call work.
• Competent working in one or more environments highly integrated with an operating system.
• Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
• Strong critical-thinking skills to evaluate alternatives and present solutions consistent with business objectives and strategy.
• Ability to manage tasks independently and take ownership of responsibilities.
• Ability to learn from mistakes and apply constructive feedback to improve performance.
• Ability to adapt to a rapidly changing environment.
• Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and appropriate assertiveness.
• Ability to communicate highly complex technical information clearly and articulately for all levels and audiences.
• Willingness to learn new technologies/tools and train your peers.
• Proven track record of automation.
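The benchmarking and capacity-planning duties above often start with a back-of-the-envelope storage estimate per broker. A minimal sketch — all throughput, retention, and headroom numbers below are hypothetical:

```python
# Rough Kafka storage sizing sketch; illustrative numbers only.
def broker_storage_gb(msgs_per_sec, avg_msg_bytes, retention_hours,
                      replication_factor, num_brokers, headroom=1.3):
    """Estimate per-broker disk need for one topic, with safety headroom."""
    bytes_retained = msgs_per_sec * avg_msg_bytes * retention_hours * 3600
    replicated = bytes_retained * replication_factor   # every replica stores a copy
    per_broker = replicated / num_brokers              # assumes even partition spread
    return per_broker * headroom / (1024 ** 3)

# 10k msgs/s of ~1 KiB, 72 h retention, RF=3 spread across 6 brokers
size = broker_storage_gb(10_000, 1024, 72, 3, 6)   # ~1607 GB per broker
```

The even-spread assumption is optimistic; skewed partition assignment or compacted topics change the math, which is why listings like this also ask for hands-on benchmarking.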

Posted 1 week ago

Apply

4.0 - 8.0 years

7 - 17 Lacs

Hyderabad, Bengaluru

Hybrid

Role & responsibilities — Kafka Admin (CTH, 3-6 months, conversion based on performance)

Job Description: Stand up and administer on-premise Kafka clusters.
• Architect and create reference architectures and Kafka implementation standards.
• Provide expertise in Kafka brokers, Zookeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
• Ensure optimum performance, high availability, and stability of solutions.
• Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply best practices.
• Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
• Administer and operate the Kafka platform: provisioning, access lists, Kerberos, and SSL configuration.
• Use automation tools such as Docker, Jenkins, and GitLab for provisioning.
• Perform data-related benchmarking, performance analysis, and tuning.
• Strong skills in in-memory applications, database design, and data integration.
• Participate in design and capacity review meetings to advise on Kafka usage.
• Solid knowledge of monitoring tools and fine-tuning alerts in Splunk, Prometheus, and Grafana.
• Set up security on Kafka.
• Provide naming conventions, backup and recovery, and problem-determination strategies for projects.
• Monitor, prevent, and troubleshoot security-related issues.
• Provide strategic vision for engineering solutions that touch the messaging-queue aspect of the infrastructure.

QUALIFICATIONS
• 4-6 years of demonstrated proficiency and experience in designing, implementing, monitoring, and troubleshooting Kafka messaging infrastructure.
• Hands-on experience with recovery in Kafka.
• 2 or more years of experience developing/customizing messaging-related monitoring tools and utilities.
• Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
• Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
• Considerable experience implementing Kerberos security.
• Support a 24x7 model and be available for rotational on-call work (including Saturday/Sunday).
• Competent working in one or more environments highly integrated with an operating system.
• Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
• Strong critical-thinking skills to evaluate alternatives and present solutions consistent with business objectives and strategy.
• Ability to manage tasks independently and take ownership of responsibilities.
• Ability to learn from mistakes and apply constructive feedback to improve performance.
• Ability to adapt to a rapidly changing environment.
• Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and appropriate assertiveness.
• Ability to communicate highly complex technical information clearly and articulately for all levels and audiences.
• Willingness to learn new technologies/tools and train your peers.
• Proven track record of automation.

Preferred candidate profile: If interested, please share your CV to harshini.d@ifinglobalgroup.com

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Chennai

Work from Office

Kafka Admin
• Consult with inquiring teams on how to leverage Kafka within their pipelines.
• Architect, build, and support existing and new Kafka clusters via IaC.
• Partner with Splunk teams to route traffic through Kafka using open-source agents and collectors deployed via Chef.
• Remediate any health issues within Kafka.
• Automate (where possible) any operational processes on the team.
• Create new and/or update monitoring dashboards and alerts as needed.
• Manage a continuous integration / continuous delivery (CI/CD) pipeline.
• Perform PoCs on new components to expand/enhance the team's Kafka offerings.

Preferred Qualifications:
• Knowledge and experience with Splunk, Elastic, Kibana, and Grafana.
• Knowledge and experience with log collection agents such as OpenTelemetry, Fluent Bit, FluentD, Beats, and Logstash.
• Knowledge and experience with Kubernetes / Docker.
• Knowledge and experience with Kafka Connect.
• Knowledge and experience with AWS or Azure.
• Knowledge and experience with streaming analytics.

Mandatory Skills: API Microservice Integration. Experience: 5-8 years.
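Routing traffic to Splunk as described above typically means deploying a sink connector through the Kafka Connect REST API. A hedged sketch of building such a connector config — the connector class and `splunk.hec.*` keys follow the commonly published Splunk Connect for Kafka documentation, so verify them against your connector version before use:

```python
import json

# Sketch: build the JSON body for registering a Splunk HEC sink connector.
# Key names are assumptions from the Splunk Connect for Kafka docs.
def splunk_sink_config(name, topics, hec_uri, hec_token, index):
    return {
        "name": name,
        "config": {
            "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
            "tasks.max": "2",
            "topics": ",".join(topics),          # Connect expects a comma list
            "splunk.hec.uri": hec_uri,
            "splunk.hec.token": hec_token,
            "splunk.indexes": index,
        },
    }

payload = json.dumps(splunk_sink_config(
    "logs-to-splunk", ["app-logs", "infra-logs"],
    "https://splunk.example.com:8088", "REDACTED", "kafka"))
```

In practice this body is POSTed to the Connect worker's `/connectors` endpoint; the hostnames and token here are placeholders.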

Posted 2 weeks ago

Apply

4.0 - 8.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Must have hands-on experience in a Confluent/Apache Kafka environment (Kafka cluster, Apache Zookeeper). Transform data into topics and streaming processors using Kafka and Confluent tools. Knowledge of one of the cloud platforms, such as AWS or GCP.
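To illustrate the "transform data into topics" responsibility, here is a broker-free toy in which plain dicts stand in for topics and a stateless transform reshapes records on the way through; all field and topic names are invented for the example:

```python
# Toy stand-in for a stateless Kafka streaming transform: records from an
# input topic are reshaped and appended to an output topic. Dicts simulate
# topics so the sketch runs without a broker.
def transform(record):
    # project fields and normalize the amount to integer cents
    return {"id": record["id"],
            "amount_cents": int(record["amount"] * 100),
            "source": "core-system"}

def process(input_topic, output_topic, topics):
    for rec in topics[input_topic]:
        topics[output_topic].append(transform(rec))

topics = {"payments.raw": [{"id": 1, "amount": 12.5}], "payments.clean": []}
process("payments.raw", "payments.clean", topics)
```

A real implementation would use Kafka Streams or a consumer/producer pair, but the shape of the work — consume, map, produce — is the same.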

Posted 2 weeks ago

Apply

3.0 - 8.0 years

18 - 22 Lacs

Navi Mumbai, Mumbai (All Areas)

Work from Office

1. Education: B.E./B.Tech/MCA in Computer Science.
2. Experience: Must have 7+ years of relevant experience in database administration.
3. Mandatory Skills/Knowledge:
• Technically sound in multiple distributions: Cloudera, Confluent, and open-source Kafka.
• Technically sound in Kafka and Zookeeper.
• Well versed in capacity planning and performance tuning.
• Expertise in implementing security in the ecosystem: Hadoop security (Ranger), Kerberos, SSL.
• Expertise in DevOps tools such as Ansible, Nagios, shell scripting, Python, Jenkins, Git, and Maven to implement automation.
• Able to monitor, debug, and perform RCA for any service failure.
• Knowledge of network infrastructure, e.g., TCP/IP, DNS, firewalls, routers, load balancers.
• Creative analytical and problem-solving skills.
• Provide RCAs for critical and recurring incidents.
• Provide on-call service coverage within a larger group.
• Good aptitude in multi-threading and concurrency concepts.
4. Preferred Skills/Knowledge:
• Expert knowledge of database administration and architecture.
• Hands-on with operating-system commands.
Kindly share CVs at snehal.sankade@outworx.com
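The "monitor, debug, and RCA" duty often begins with spotting under-replicated partitions. A sketch that parses `kafka-topics.sh --describe` style output — the tab-separated `Topic/Partition/Leader/Replicas/Isr` format is assumed from common Kafka tooling, so check it against your version:

```python
# RCA helper sketch: flag partitions whose ISR has shrunk below the
# replica set, from `kafka-topics.sh --describe` style output.
def under_replicated(describe_output):
    flagged = []
    for line in describe_output.splitlines():
        if "Partition:" not in line:
            continue
        fields = dict(p.split(": ", 1)
                      for p in line.strip().split("\t") if ": " in p)
        replicas = fields["Replicas"].split(",")
        isr = fields["Isr"].split(",")
        if len(isr) < len(replicas):          # a follower fell out of sync
            flagged.append((fields["Topic"], fields["Partition"]))
    return flagged

sample = ("Topic: orders\tPartition: 0\tLeader: 1\tReplicas: 1,2,3\tIsr: 1,2\n"
          "Topic: orders\tPartition: 1\tLeader: 2\tReplicas: 2,3,1\tIsr: 2,3,1")
```

`under_replicated(sample)` flags only partition 0, whose ISR is missing broker 3.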

Posted 2 weeks ago

Apply

5.0 - 9.0 years

10 - 17 Lacs

Bengaluru

Work from Office

Role name: Kafka Platform Engineer
Experience: 5+ years of relevant experience

Detailed JD: We are seeking a highly skilled and motivated Kafka Platform Engineer to join our team. As a Kafka Platform Engineer, you will be responsible for operating and managing our Kafka cluster, ensuring its scalability, reliability, and security. You will collaborate with cross-functional teams to design, implement, and optimize Kafka solutions that meet the needs of our business. This is a key role in modernizing our application infrastructure and adopting industry best practices.

Primary Skills:
• Strong expertise in operating and administering Kafka clusters.
• Experience in performance tuning and troubleshooting of middleware technologies, applying them to infrastructure.
• Proficiency in shell scripting and/or Python, with specific experience in administering Kafka.
• Experience with Java application servers on cloud platforms is a significant advantage.
• Provide operational support for the Kafka cluster, ensuring high availability and stability 24/7 (on-call support).
• Utilize infrastructure-as-code (IaC) principles to provision and manage Kafka infrastructure.

Work Location: Bangalore (no remote access; must operate from the base location)
Client Interview / F2F Applicable: Yes

Posted 3 weeks ago

Apply

10.0 - 20.0 years

30 - 40 Lacs

Hyderabad

Work from Office

Job Description — Kafka/Integration Architect

Position brief: The Kafka/Integration Architect is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions. This role involves configuring, deploying, and monitoring Kafka clusters to ensure the high availability and scalability of data streaming services. The architect collaborates with cross-functional teams to integrate Kafka into various applications and ensures optimal performance and reliability of the data infrastructure. Kafka/Integration Architects play a critical role in driving data-driven decision-making and enabling real-time analytics, contributing directly to the company's agility, operational efficiency, and ability to respond quickly to market changes. Their work supports key business initiatives by ensuring that data flows seamlessly across the organization, empowering teams with timely insights and enhancing the customer experience.

Location: Hyderabad

Primary Role & Responsibilities:
• Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
• Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
• Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
• Collaborate with development teams to integrate Kafka into applications and services.
• Develop and maintain Kafka connectors such as JDBC, MongoDB, and S3 connectors, along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights.
• Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance.
• Optimize Kafka configurations for performance, reliability, and scalability.
• Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
• Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
• Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
• Stay updated with the latest Kafka features, updates, and industry best practices to continuously improve data infrastructure and stay ahead of industry trends.

Required Soft Skills:
• Strong analytical and problem-solving skills.
• Excellent communication and collaboration skills.
• Ability to translate business requirements into technical solutions.

Working Experience and Qualification:
Education: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Experience: Proven experience of 8-10 years as a Kafka Architect or in a similar role.
Skills:
• Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
• Experience with Kafka security, including SSL, SASL, and ACLs.
• Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
• Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink.
• Solid understanding of distributed systems, data streaming, and messaging patterns.
• Proficiency in Java, Scala, or Python for Kafka-related development tasks.
• Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
• Experience with tools like Zookeeper, Schema Registry, and Kafka Connect.
• Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
• Experience with cloud platforms like AWS, Azure, or GCP.
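The broker/topic/partition/replica knowledge listed above comes down to how keyed records are routed. A simplified illustration of key-based partitioning — Kafka's real default partitioner hashes the key bytes with murmur2, so this CRC32 stand-in is not byte-compatible with actual broker assignments:

```python
import zlib

# Illustrative partitioner: a stable hash of the key modulo the partition
# count. Kafka's default uses murmur2, not CRC32; routing idea only.
def partition_for(key: bytes, num_partitions: int) -> int:
    return (zlib.crc32(key) & 0x7FFFFFFF) % num_partitions

p = partition_for(b"customer-42", 12)
# the same key always lands on the same partition -> per-key ordering holds
repeat = partition_for(b"customer-42", 12)
```

This is also why changing a topic's partition count breaks key-to-partition stability: the modulus changes, so existing keys may map elsewhere.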
Preferred Skills (Optional):
• Kafka certification or related credentials, such as Confluent Certified Administrator for Apache Kafka (CCAAK), Cloudera Certified Administrator for Apache Kafka (CCA-131), or AWS Certified Data Analytics – Specialty (with a focus on streaming data solutions).
• Knowledge of containerization technologies like Docker and Kubernetes.
• Familiarity with other messaging systems like RabbitMQ or Apache Pulsar.
• Experience with data serialization formats like Avro, Protobuf, or JSON.

Company Profile: WAISL is an ISO 9001:2015, ISO 20000-1:2018, ISO 22301:2019 certified, and CMMI Level 3 appraised digital transformation partner for businesses across industries, with a core focus on aviation and related adjacencies. We transform airports and related ecosystems through digital interventions with a strong service-excellence culture. As a leader in our chosen space, we deliver world-class services focused on airports and their related domains, enabled through outcome-focused, next-gen digital/technology solutions. At present, WAISL is the primary technology solutions partner for Indira Gandhi International Airport, Delhi; Rajiv Gandhi International Airport, Hyderabad; Manohar International Airport, Goa; Kannur International Airport, Kerala; and Kuwait International Airport, and we expect to soon provide similar services for other airports in India and globally. WAISL, as a digital transformation partner, brings proven credibility in managing and servicing 135+ Mn passengers and 80+ airlines, with core integration, deployment, and real-time management experience of 2000+ applications, vendor-agnostically, in highly complex technology-converging ecosystems. This excellence in managed services has enabled its customer airports to be rated amongst the best-in-class service providers by Skytrax and ACI awards, and to win many innovation and excellence awards.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 16 Lacs

Mumbai, Navi Mumbai

Work from Office

Kafka Administrator: Strong experience with Apache Kafka and its ecosystem (Kafka Connect, Schema Registry, Kafka Streams), plus Kafka cluster monitoring and performance tuning.
Location: CBD Belapur, Navi Mumbai
Education: B.E / B.Tech / BCA / B.Sc-IT / MCA / M.Sc-IT / M.Tech

Posted 4 weeks ago

Apply

4.0 - 6.0 years

15 - 20 Lacs

Bengaluru

Work from Office

IBM Event Stream (Kafka): platform consulting, platform architecture, and implementation. Covers cluster, security, disaster recovery, data pipelines, data replication, and performance optimization.

Posted 4 weeks ago

Apply

8.0 - 12.0 years

8 - 18 Lacs

Hyderabad, Bengaluru

Work from Office

**Job Title:** Confluent Kafka Engineer (Azure & GCP Focus)
**Location:** [Bangalore or Hyderabad]

**Role Overview**
We are seeking an experienced **Confluent Kafka Engineer** with hands-on expertise in deploying, administering, and securing Kafka clusters in **Microsoft Azure** and **Google Cloud Platform (GCP)** environments. The ideal candidate will be skilled in cluster administration, RBAC, cluster linking and setup, and monitoring using Prometheus and Grafana, with a strong understanding of cloud-native best practices.

**Key Responsibilities**
- **Kafka Cluster Administration (Azure & GCP):**
  - Deploy, configure, and manage Confluent Kafka clusters on Azure and GCP virtual machines or managed infrastructure.
  - Plan and execute cluster upgrades, scaling, and disaster recovery strategies in cloud environments.
  - Set up and manage cluster linking for cross-region and cross-cloud data replication.
  - Monitor and maintain the health and performance of Kafka clusters, proactively identifying and resolving issues.
- **Security & RBAC:**
  - Implement and maintain security protocols, including SSL/TLS encryption and role-based access control (RBAC).
  - Configure authentication and authorization (Kafka ACLs) across Azure and GCP environments.
  - Set up and manage **Active Directory (AD) plain authentication** and **OAuth** for secure user and application access.
  - Ensure compliance with enterprise security standards and cloud provider best practices.
- **Monitoring & Observability:**
  - Set up and maintain monitoring and alerting using Prometheus and Grafana, integrating with Azure Monitor and GCP-native monitoring as needed.
  - Develop and maintain dashboards and alerts for Kafka performance and reliability metrics.
  - Troubleshoot and resolve performance and reliability issues using cloud-native and open-source monitoring tools.
- **Integration & Automation:**
  - Develop and maintain automation scripts (Bash, Python, Terraform, Ansible) for cluster deployment, scaling, and monitoring.
  - Build and maintain infrastructure as code for Kafka environments in Azure and GCP.
  - Configure and manage **Kafka connectors** for integration with external systems, including **BigQuery Sync connectors** and connectors for Azure and GCP data services (such as Azure Data Lake, Cosmos DB, BigQuery).
- **Documentation & Knowledge Sharing:**
  - Document standard operating procedures, architecture, and security configurations for cloud-based Kafka deployments.
  - Provide technical guidance and conduct knowledge transfer sessions for internal teams.

**Required Qualifications**
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience with Confluent Platform and Kafka in enterprise environments.
- Demonstrated experience deploying and managing Kafka clusters on **Azure** and **GCP** (not just using pre-existing clusters).
- Strong expertise in cloud networking, security, and RBAC in Azure and GCP.
- Experience configuring **AD plain authentication** and **OAuth** for Kafka.
- Proficiency with monitoring tools (Prometheus, Grafana, Azure Monitor, GCP Monitoring).
- Hands-on experience with Kafka connectors, including BQ Sync connectors, Schema Registry, KSQL, and Kafka Streams.
- Scripting and automation skills (Bash, Python, Terraform, Ansible).
- Familiarity with infrastructure-as-code practices.
- Excellent troubleshooting and communication skills.

**Preferred Qualifications**
- Confluent Certified Developer/Admin certification.
- Experience with cross-cloud Kafka streaming and integration scenarios.
- Familiarity with Azure and GCP data services (Azure Data Lake, Cosmos DB, BigQuery).
- Experience with other streaming technologies (e.g., Spark Streaming, Flink).
- Experience with data visualization and analytics tools.
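Dashboards and alerts for Kafka reliability metrics usually center on consumer lag: the log-end offset minus the group's committed offset, per topic-partition. A stubbed sketch; in practice the two offset maps would come from the admin or consumer API rather than literals:

```python
# Consumer-lag sketch for dashboarding. Keys are (topic, partition)
# tuples; values are offsets. Offsets are stubbed here.
def lag_by_partition(end_offsets, committed):
    """lag = log-end offset - committed offset (0 if nothing committed yet)."""
    return {tp: end_offsets[tp] - committed.get(tp, 0) for tp in end_offsets}

end = {("clicks", 0): 1_000, ("clicks", 1): 1_500}
got = {("clicks", 0): 900, ("clicks", 1): 1_450}
lag = lag_by_partition(end, got)
```

A monitoring exporter would emit these numbers as gauges for Prometheus; Grafana then graphs and alerts on them.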

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 17 Lacs

Noida, Gurugram, Greater Noida

Work from Office

5+ years in Python and Django
• Microservices architecture and API development
• Deployment via Kubernetes
• Redis for performance tuning
• Celery for distributed tasks
• DBs: PostgreSQL, time-series; Redis, Celery, RabbitMQ/Kafka
• Microservices architecture experience

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Dubai, Pune, Chennai

Hybrid

Job Title: Confluent CDC System Analyst

Role Overview: A leading bank in the UAE is seeking an experienced Confluent Change Data Capture (CDC) System Analyst / Tech Lead to implement real-time data streaming solutions. The role involves implementing robust CDC frameworks using Confluent Kafka, ensuring seamless data integration between core banking systems and analytics platforms. The ideal candidate will have deep expertise in event-driven architectures, CDC technologies, and cloud-based data solutions.

Key Responsibilities:
• Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
• Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
• Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
• Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
• Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
• Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate.
• Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
• Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
• Stay updated with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications:
• Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
• Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
• Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
• Hands-on experience with IBM Analytics.
• Solid understanding of core banking systems, transactional databases, and financial data flows.
• Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
• Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
• Strong experience in event-driven architectures, microservices, and API integrations.
• Familiarity with security protocols, compliance, and data governance in banking environments.
• Excellent problem-solving, leadership, and stakeholder communication skills.
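CDC work with Debezium-style tools revolves around change-event envelopes: an `op` code (`c`reate, `u`pdate, `d`elete, `r`ead/snapshot) plus `before`/`after` row images. A minimal sketch that folds such events into a dict acting as the replica table — the envelope fields follow Debezium's documented format, while the row fields (`id`, `bal`) are invented for the example:

```python
# CDC apply sketch: replay Debezium-style change events into a dict
# keyed by primary key, standing in for a downstream replica table.
def apply_event(table, event):
    payload = event["payload"]
    op, before, after = payload["op"], payload["before"], payload["after"]
    if op in ("c", "u", "r"):            # create / update / snapshot read
        table[after["id"]] = after
    elif op == "d":                      # delete uses the before-image key
        table.pop(before["id"], None)

replica = {}
events = [
    {"payload": {"op": "c", "before": None,
                 "after": {"id": 1, "bal": 50}}},
    {"payload": {"op": "u", "before": {"id": 1, "bal": 50},
                 "after": {"id": 1, "bal": 75}}},
]
for e in events:
    apply_event(replica, e)
```

Real pipelines must also handle ordering per key and tombstones, which is exactly the consistency concern the listing calls out.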

Posted 1 month ago

Apply

10.0 - 15.0 years

25 - 32 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Skill Set: Confluent + Kafka Architect
Hiring Location: Bangalore/Chennai/Pune/Mumbai/Hyderabad
Notice Period: Immediate joiners — max 30 days
Experience: 10-12 years
Interview Levels: 2 internal / 1 CI
Shift Timings: 2 PM - 11 PM (Sweden timing)

Job description: Confluent Kafka Architect (P3; for China IEB support). The team member will be part of a Kafka platform development team responsible for architecture, administration, and operations of Kafka clusters across all non-production and production environments. The role involves designing the best solutions, ensuring system reliability, optimizing performance, automating regular tasks, guiding the team on onboarding applications, helping the team upskill/cross-train, and troubleshooting issues to maintain seamless Kafka operations.

Key Responsibilities:
• Architect, manage, and maintain Kafka clusters in non-prod and prod environments.
• Lead high-level discussions with customers.
• Propose the best solutions per industry standards.
• Conduct POCs and document the exercise.
• Work with team members to solve their technical problems.
• Handle high-priority discussions and drive meetings with vendors if required.

Skills and Abilities:
• Strong knowledge of Kafka architecture.
• Hands-on experience with Kafka cluster setup using Zookeeper/KRaft.
• Proficiency with Kafka connectors and ksqlDB.
• Experience with Kubernetes for Kafka deployment and management.
• Ability to create Docker images and work with containerized environments.
• Proficiency in writing YAML configurations.
• Strong troubleshooting and debugging skills.
• Experience with monitoring tools like Datadog.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities: Looking for 5+ years of experience as a Kafka Administrator.

Required Skills & Experience:
• Hands-on experience in Kafka cluster management
• Proficiency with Kafka Connect
• Knowledge of Cluster Linking and MirrorMaker
• Experience setting up Kafka clusters from scratch
• Experience with Terraform/Ansible scripts
• Ability to install and configure the Confluent Platform
• Understanding of rebalancing, Schema Registry, and REST Proxies
• Familiarity with RBAC (role-based access control) and ACLs (access control lists)

Interested candidates, share your updated resume at recruiter.wtr26@walkingtree.in

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

• Develop and maintain Kafka-based data pipelines for real-time processing.
• Implement Kafka producer and consumer applications for efficient data flow.
• Optimize Kafka clusters for performance, scalability, and reliability.
• Design and manage Grafana dashboards for monitoring Kafka metrics.
• Integrate Grafana with Elasticsearch or other data sources.
• Set up alerting mechanisms in Grafana for Kafka system health monitoring.
• Collaborate with DevOps, data engineers, and software teams.
• Ensure security and compliance in Kafka and Grafana implementations.

Requirements:
• 8+ years of experience configuring Kafka, Elasticsearch, and Grafana.
• Strong understanding of Apache Kafka architecture and Grafana visualization.
• Proficiency in .NET or Python for Kafka development.
• Experience with distributed systems and message-oriented middleware.
• Knowledge of time-series databases and monitoring tools.
• Familiarity with data serialization formats like JSON.
• Expertise in Azure platforms and Kafka monitoring tools.
• Good problem-solving and communication skills.

Mandate: Create the Kafka dashboards; Python/.NET.
Note: Candidate must be an immediate joiner.
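Grafana-style alerting like the above reduces to threshold rules over metric samples. A toy evaluator that fires only when a metric stays above its limit for N consecutive points, mirroring Grafana's "for" duration so transient spikes don't page anyone:

```python
# Threshold-alert sketch of the kind a Grafana rule encodes: fire when
# a metric (e.g. consumer lag) exceeds a limit for N consecutive samples.
def should_alert(samples, limit, for_points):
    run = 0
    for value in samples:
        run = run + 1 if value > limit else 0   # reset on any healthy sample
        if run >= for_points:
            return True
    return False

firing = should_alert([10, 200, 250, 300], limit=100, for_points=3)
quiet = should_alert([10, 200, 50, 300], limit=100, for_points=3)
```

The first series breaches for three consecutive points and fires; the second recovers mid-way, so no alert.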

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 13 Lacs

Thane, Goregaon, Mumbai (All Areas)

Work from Office

Opening for a leading insurance company. **Looking for immediate joiners or 30 days' notice**

Key Responsibilities:
Kafka Infrastructure Management:
• Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security.
• Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, Zookeeper, and related components.
• Perform capacity planning and scaling of Kafka clusters based on application needs and growth.
Data Pipeline Development:
• Develop and optimize Kafka data pipelines to support real-time data streaming and processing.
• Collaborate with internal application development and data engineers to integrate Kafka with various HDFC Life data sources.
• Implement and maintain schema registry and serialization/deserialization protocols (e.g., Avro, Protobuf).
Security and Compliance:
• Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL).
Documentation and Support:
• Create and maintain documentation for Kafka setup, configurations, and operational procedures.
Collaboration:
• Provide technical support and guidance to application development teams regarding Kafka usage and best practices.
• Collaborate with stakeholders to ensure alignment with business objectives.

Interested candidates, share your resume at snehal@topgearconsultants.com

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities: Looking for 8+ years of experience as a Kafka Administrator.

Mandatory Skills:
• ksqlDB developers with hands-on experience writing KSQL queries
• Kafka Connect development experience
• Kafka client stream-application development
• Confluent Terraform Provider

Skill:
• 8+ years of experience across development and support projects
• 3+ years of hands-on experience in Kafka
• Understanding of event-streaming patterns and when to apply them
• Designing, building, and operating in-production Big Data, stream-processing, and/or enterprise data-integration solutions using Apache Kafka
• Working with different database solutions for data extraction, updates, and insertions
• Identity and Access Management, including relevant protocols and standards such as OAuth, OIDC, SAML, LDAP, etc.
• Knowledge of networking protocols such as TCP, HTTP/2, WebSockets, etc.

Candidates must work Australia (AWST) hours; the interview will be face to face. Interested candidates, share your updated resume at recruiter.wtr26@walkingtree.in
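KSQL statements like those this role asks for are usually submitted to the ksqlDB server's `/ksql` REST endpoint as a small JSON body. A sketch of building that body — the `ksql`/`streamsProperties` payload shape and the `ksql.streams.auto.offset.reset` property follow the ksqlDB REST API docs, so treat the names as assumptions to verify against your version:

```python
import json

# Sketch: request body for POSTing a KSQL statement to ksqlDB's /ksql
# endpoint. Property names assumed from the ksqlDB REST API docs.
def ksql_request(statement, props=None):
    return {"ksql": statement, "streamsProperties": props or {}}

stmt = ("CREATE STREAM payments (id INT, amount DOUBLE) "
        "WITH (KAFKA_TOPIC='payments', VALUE_FORMAT='JSON');")
body = json.dumps(ksql_request(
    stmt, {"ksql.streams.auto.offset.reset": "earliest"}))
```

The topic and column names are placeholders; an HTTP client would POST `body` with `Content-Type: application/vnd.ksql.v1+json`.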

Posted 1 month ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Gurugram

Remote

Kafka/MSK, Linux
• In-depth understanding of Kafka broker configurations, Zookeeper, and connectors
• Understanding of Kafka topic design and creation
• Good knowledge of replication and high availability for Kafka systems
• ElasticSearch/OpenSearch
Perks and benefits: PF, annual bonus, health insurance

Posted 1 month ago

Apply

3.0 - 7.0 years

10 - 15 Lacs

Pune

Work from Office

Responsibilities:
* Manage Kafka clusters, brokers & messaging architecture
* Collaborate with development teams on data pipelines
* Monitor Kafka performance & troubleshoot issues
Benefits: Health insurance, provident fund, annual bonus

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Pune, Gurugram

Work from Office

GSPANN is looking for an experienced Kafka Developer with strong Java skills to join our growing team. If you have hands-on experience with Kafka components and are ready to work in a dynamic, client-facing environment, we'd love to hear from you!

Key Responsibilities:
• Develop and maintain Kafka Producers, Consumers, Connectors, KStream, and KTable applications.
• Collaborate with stakeholders to gather requirements and deliver customized solutions.
• Troubleshoot production issues and participate in Agile ceremonies.
• Optimize system performance and support deployments.
• Mentor junior team members and ensure coding best practices.

Required Skills:
• 4+ years of experience as a Kafka Developer
• Proficiency in Java
• Strong debugging skills (Splunk experience is a plus)
• Experience in client-facing projects
• Familiarity with Agile and DevOps practices

Good to Have:
• Knowledge of Google Cloud Platform (Dataflow, BigQuery, Kubernetes)
• Experience with production support and monitoring tools

Ready to join a collaborative and innovative team? Send your CV to heena.ruchwani@gspann.com

Posted 1 month ago

Apply

5.0 - 9.0 years

11 - 12 Lacs

Hyderabad, Mysuru, Chennai

Work from Office

Experienced Kafka Developer skilled in schema registry, cluster optimization, pipeline design, troubleshooting, and ensuring high availability, security, and compliance in data processing. Mail: kowsalya.k@srsinfoway.com

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 14 Lacs

Mumbai, Goregaon, Mumbai (All Areas)

Work from Office

Opening with an insurance company. **Looking for someone with a 30-day notice period** Location: Mumbai (Lower Parel). Key Responsibilities: Kafka Infrastructure Management: Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security. Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, ZooKeeper, and related components. Perform capacity planning and scaling of Kafka clusters based on application needs and growth. Data Pipeline Development: Develop and optimize Kafka data pipelines to support real-time data streaming and processing. Collaborate with internal application development and data engineers to integrate Kafka with various HDFC Life data sources. Implement and maintain schema registry and serialization/deserialization protocols (e.g., Avro, Protobuf). Security and Compliance: Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL). Documentation and Support: Create and maintain documentation for Kafka setup, configurations, and operational procedures. Collaboration: Provide technical support and guidance to application development teams regarding Kafka usage and best practices. Collaborate with stakeholders to ensure alignment with business objectives. Interested candidates, share resumes at snehal@topgearconsultants.com
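The capacity planning responsibility above is, to first order, throughput arithmetic; a sketch of the usual rule of thumb (the per-partition throughput figures are illustrative assumptions that must be measured on your own hardware):

```python
import math

def required_partitions(target_mb_per_s: float,
                        produce_mb_per_s_per_partition: float,
                        consume_mb_per_s_per_partition: float) -> int:
    """Partitions needed so neither the produce nor the consume side bottlenecks.

    Rule of thumb: take the larger of target/p and target/c and round up,
    where p and c are measured per-partition produce and consume rates.
    Headroom for growth and rebalances should be added on top.
    """
    return math.ceil(max(target_mb_per_s / produce_mb_per_s_per_partition,
                         target_mb_per_s / consume_mb_per_s_per_partition))

# e.g. a 100 MB/s target with 10 MB/s produce and 20 MB/s consume per partition:
print(required_partitions(100, 10, 20))  # → 10
```

Since partitions are also the unit of consumer parallelism, the final count is usually the maximum of this figure and the number of consumers you want to run in a group.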

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Chennai

Hybrid

1. Objective: We are seeking a highly experienced and visionary Expert Platform Lead with 10+ years of expertise in Confluent Kafka administration, cloud-native infrastructure, and enterprise-scale streaming architecture. This role involves overseeing Kafka platform strategy, optimizing infrastructure through automation, ensuring cost-effective scalability, and working closely with cross-functional teams to enable high-performance data streaming solutions. The ideal candidate will drive innovation, establish best practices, and mentor teams to enhance platform reliability and efficiency.
2. Main tasks: Kafka Platform Management: Define and execute platform strategy for Confluent Kafka, ensuring security, high availability, and scalability. Lead architecture design reviews, influencing decisions related to Kafka infrastructure and cloud deployment models. Oversee and maintain the Kafka platform in a 24/7 operational setting, ensuring high availability and fault tolerance. Establish monitoring frameworks, proactively identifying and addressing platform inefficiencies. Leadership, Collaboration and Support: Act as the primary technical authority on Kafka for enterprise-wide streaming architecture. Collaborate closely with application teams, architects, and vendors to align platform capabilities with business needs. Provide technical mentorship to engineers and architects, guiding best practices in Kafka integration and platform usage. Infrastructure Automation and Optimization: Spearhead Infrastructure as Code (IaC) initiatives using Terraform for Kafka, AWS, and cloud resources. Drive automation across provisioning, deployment workflows, and maintenance operations, ensuring efficiency and resilience. Implement advanced observability measures to optimize costs and resource allocation while maintaining peak performance. Governance, Documentation and Compliance: Maintain detailed platform documentation, including configurations, security policies, and compliance standards. Track and analyze usage trends, ensuring cost-efficient resource utilization across streaming ecosystems. Establish governance frameworks, ensuring compliance with enterprise security policies and industry standards.
3. Technical expertise: Education: a four-year Bachelor's or a Master's degree in Computer Science, Engineering, or a related field. Required expertise: 10+ years of experience in platform engineering, cloud infrastructure, and data streaming architectures. Extensive expertise in Kafka administration (preferably Confluent Kafka), leading enterprise-wide streaming initiatives. Proven track record in leading critical incident response and ensuring system uptime in a 24/7 environment. Languages: English (mandatory). Technical skills: Expert knowledge of Kafka (Confluent), event-driven architectures, and high-scale distributed systems. Mastery of Terraform for infrastructure automation across AWS, Kubernetes, and cloud-native ecosystems. Strong proficiency in AWS services, networking principles, and security best practices. Advanced experience with CI/CD pipelines, version control (Git), and scripting (Bash, Python). Soft skills: Strategic problem-solving mindset, capable of leading large-scale technical decisions. Strong leadership and mentorship skills, able to guide teams toward technical excellence. Excellent communication, stakeholder management, and cross-functional collaboration abilities. Preferred skills: Kafka or Confluent certification, demonstrating deep platform expertise. AWS Solutions Architect certification or equivalent cloud specialization. Experience with monitoring tools (Prometheus, Grafana) and proactive alert management for 24/7 operations.
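The Terraform-based IaC work described above often means managing Kafka topics declaratively. A hedged sketch in the shape of the confluentinc/confluent provider's topic resource (resource and attribute names follow that provider and may differ by version; every variable and identifier below is a placeholder):

```hcl
resource "confluent_kafka_topic" "orders" {
  kafka_cluster {
    id = var.kafka_cluster_id          # placeholder cluster ID
  }
  topic_name       = "orders"
  partitions_count = 6
  rest_endpoint    = var.kafka_rest_endpoint  # placeholder endpoint

  config = {
    "retention.ms"        = "604800000"  # 7 days
    "min.insync.replicas" = "2"
  }

  credentials {
    key    = var.kafka_api_key           # placeholder API key
    secret = var.kafka_api_secret        # placeholder API secret
  }
}
```

Keeping topic definitions in version control like this gives the review, audit, and drift-detection properties the governance section of this role calls for.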

Posted 1 month ago

Apply

3.0 - 6.0 years

10 - 17 Lacs

Pune

Remote

Kafka/MSK, Linux. In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors. Understanding of Kafka topic design and creation. Good knowledge of replication and high availability for Kafka systems. ElasticSearch/OpenSearch.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Noida, Gurugram

Work from Office

Manage end-to-end activities including installation, configuration & maintenance of Kafka clusters, managing topics configured in Kafka, and ensuring maximum uptime of Kafka. Monitor performance of producer & consumer threads interacting with Kafka. Required candidate profile: Kafka certification. Must have hands-on experience managing large Kafka clusters and installations. Ability to monitor performance of producer and consumer threads interacting with Kafka.
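The topic management and uptime monitoring duties above map onto the standard Kafka CLI tools; an illustrative session (the broker address, topic name, and group name are placeholders):

```shell
# Create a topic with redundancy (3 replicas) across the cluster
kafka-topics.sh --bootstrap-server broker1:9092 \
  --create --topic orders --partitions 6 --replication-factor 3

# Verify per-partition leadership and in-sync replicas
kafka-topics.sh --bootstrap-server broker1:9092 --describe --topic orders

# Watch consumer group lag for the consumers reading the topic
kafka-consumer-groups.sh --bootstrap-server broker1:9092 \
  --describe --group orders-consumers
```

The --describe outputs are the quickest manual check that replication is healthy and that consumer threads are keeping up, before any dashboard is involved.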

Posted 1 month ago

Apply