
6 Kafka Security Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for designing, implementing, and maintaining scalable event-streaming architectures that support real-time data. Your duties will include designing, building, and managing Kafka clusters using Confluent Platform and Kafka cloud services (AWS MSK, Confluent Cloud); developing and maintaining Kafka topics, schemas (Avro/Protobuf), and connectors for data ingestion and processing pipelines; and monitoring and ensuring the reliability, scalability, and security of the Kafka infrastructure. Collaboration with application and data engineering teams to integrate Kafka with other AWS-based services (e.g., Lambda, S3, EC2, Redshift) is essential. Additionally, you will implement and manage Kafka Connect, Kafka Streams, and ksqlDB where applicable, and optimize Kafka performance, troubleshoot issues, and manage incidents.

To be successful in this role, you should have at least 3-5 years of experience working with Apache Kafka and Confluent Kafka. Strong knowledge of Kafka internals (brokers, ZooKeeper, partitions, replication, offsets) is required, as is experience with Kafka Connect, Schema Registry, REST Proxy, and Kafka security. Hands-on experience with AWS services such as EC2, IAM, CloudWatch, S3, Lambda, VPC, and load balancers is necessary. Proficiency in scripting and automation using tools like Terraform or Ansible is preferred, and familiarity with DevOps practices and tools (CI/CD pipelines; monitoring tools such as Prometheus/Grafana, Splunk, or Datadog) is beneficial. Experience with containerization using Docker and Kubernetes is an advantage. A Confluent Certified Developer or Administrator certification, AWS certification, and experience with CI/CD tools such as AWS CodePipeline or Harness will be considered additional assets for this role.
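The topic and partition design work described above rests on Kafka's keyed-partitioning guarantee: records with the same key always land in the same partition, preserving per-key ordering. A minimal illustrative sketch (real Kafka clients use murmur2 hashing; `zlib.crc32` here is only a stand-in, and the key/partition values are hypothetical):

```python
# Sketch of Kafka-style key partitioning: a record's key deterministically
# selects its partition, so all events for one key stay in order.
# Real clients use murmur2; crc32 is an illustrative substitute.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    # Hash the key and map it onto the available partitions.
    return zlib.crc32(key) % num_partitions

# The same key always maps to the same partition.
p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2
```

This is why choosing a good record key (e.g., an account or order ID) matters as much as the partition count when designing topics.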

Posted 3 days ago


3.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description: Stand up and administer an on-premise Kafka cluster. Architect and create reference architectures and Kafka implementation standards. Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center. Ensure optimum performance, high availability, and stability of solutions. Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices. Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages and platforms. Administer and operate the Kafka platform, including provisioning, access lists, and Kerberos and SSL configurations. Use automation tooling such as Docker, Jenkins, and GitLab. Perform data-related benchmarking, performance analysis, and tuning. Strong skills in in-memory applications, database design, and data integration. Participate in design and capacity review meetings to advise on Kafka usage. Solid knowledge of monitoring tools and fine-tuning alerts in Splunk, Prometheus, and Grafana. Set up security on Kafka, and provide naming conventions, backup and recovery, and problem-determination strategies for projects. Monitor, prevent, and troubleshoot security-related issues. Provide strategic vision in engineering solutions that touch the messaging-queue aspects of the infrastructure.

Qualifications: Demonstrated proficiency and experience in designing, implementing, monitoring, and troubleshooting Kafka messaging infrastructure, including hands-on recovery experience. 2+ years of experience developing or customizing messaging-related monitoring tools and utilities. Good scripting knowledge with one or more tools (e.g., Chef, Ansible, Terraform) and good programming knowledge in one or more languages (e.g., Java, Node.js, Python). Considerable experience implementing Kerberos security. Support a 24x7 model and be available for rotational on-call work. Competent working in one or more environments highly integrated with an operating system, with experience implementing and administering technical solutions in major, large-scale system implementations. Strong critical thinking to evaluate alternatives and present solutions consistent with business objectives and strategy. Ability to manage tasks independently and take ownership of responsibilities, learn from mistakes and apply constructive feedback, and adapt to a rapidly changing environment. Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussion, fairness, and appropriate assertiveness. Ability to communicate highly complex technical information clearly and articulately for all levels and audiences. Willingness to learn new technologies and tools and to train peers, with a proven track record of automation.
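The Kerberos and SSL administration above shows up on the client side as a small set of security properties. A hedged sketch in librdkafka/confluent-kafka config-key style; the broker address, keytab path, CA location, and principal are placeholders, not values from the posting:

```python
# Sketch of a client config for a Kerberos (GSSAPI) + TLS protected cluster,
# using librdkafka-style configuration keys. All hosts, paths, and the
# principal below are illustrative placeholders.
def secure_client_config(bootstrap: str, keytab: str, principal: str) -> dict:
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",          # TLS transport + SASL auth
        "sasl.mechanisms": "GSSAPI",              # Kerberos
        "sasl.kerberos.service.name": "kafka",
        "sasl.kerberos.keytab": keytab,
        "sasl.kerberos.principal": principal,
        "ssl.ca.location": "/etc/kafka/ca.pem",   # CA that signed broker certs
    }

cfg = secure_client_config("broker1:9093", "/etc/security/app.keytab",
                           "app@EXAMPLE.COM")
assert cfg["security.protocol"] == "SASL_SSL"
```

A dict like this would be passed to a producer or consumer constructor; the matching broker-side listener, keystore, and ACL setup is the administration work the posting describes.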

Posted 2 weeks ago


7.0 - 12.0 years

20 - 35 Lacs

Dubai, Pune, Chennai

Hybrid

Job Title: Confluent CDC System Analyst

Role Overview: A leading bank in the UAE is seeking an experienced Confluent Change Data Capture (CDC) system analyst / tech lead to implement real-time data streaming solutions. The role involves implementing robust CDC frameworks using Confluent Kafka and ensuring seamless data integration between core banking systems and analytics platforms. The ideal candidate will have deep expertise in event-driven architectures, CDC technologies, and cloud-based data solutions.

Key Responsibilities: Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems. Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance. Integrate CDC pipelines with core banking applications, databases, and enterprise systems. Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS). Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing. Optimize data replication, transformation, and enrichment using CDC tools such as Debezium, GoldenGate, or Qlik Replicate. Collaborate with infrastructure, data engineering, and DevOps teams and with business stakeholders to align data-streaming capabilities with business objectives. Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures. Stay current with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications: Extensive experience with Confluent Kafka and Change Data Capture (CDC) solutions. Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry. Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate, and hands-on experience with IBM analytics. Solid understanding of core banking systems, transactional databases, and financial data flows. Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud). Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations. Strong experience in event-driven architectures, microservices, and API integrations. Familiarity with security protocols, compliance, and data governance in banking environments. Excellent problem-solving, leadership, and stakeholder-communication skills.
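CDC pipelines like those described above typically consume Debezium-style change events, whose envelope carries an `op` code (c=create, u=update, d=delete, r=snapshot read) plus `before`/`after` row images. A minimal sketch of applying such events to an in-memory replica (the envelope fields follow Debezium's documented shape; the table and row contents are illustrative):

```python
# Sketch: replaying Debezium-style CDC events into a replica table keyed
# by row id. op codes: c=create, u=update, d=delete, r=snapshot read.
def apply_cdc_event(table: dict, event: dict) -> None:
    op = event["op"]
    if op in ("c", "r", "u"):
        # Creates, snapshot reads, and updates all upsert the "after" image.
        row = event["after"]
        table[row["id"]] = row
    elif op == "d":
        # Deletes carry only the "before" image of the removed row.
        table.pop(event["before"]["id"], None)

replica = {}
apply_cdc_event(replica, {"op": "c", "after": {"id": 1, "balance": 100}})
apply_cdc_event(replica, {"op": "u", "before": {"id": 1, "balance": 100},
                          "after": {"id": 1, "balance": 80}})
apply_cdc_event(replica, {"op": "d", "before": {"id": 1, "balance": 80}})
assert replica == {}
```

In a real deployment the same logic runs inside a sink connector or stream processor, with the envelope arriving via Kafka Connect and Schema Registry rather than plain dicts.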

Posted 1 month ago


8.0 - 13.0 years

25 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Develop and maintain Kafka-based data pipelines for real-time processing. Implement Kafka producer and consumer applications for efficient data flow. Optimize Kafka clusters for performance, scalability, and reliability. Design and manage Grafana dashboards for monitoring Kafka metrics, integrate Grafana with Elasticsearch or other data sources, and set up alerting mechanisms in Grafana for Kafka system-health monitoring. Collaborate with DevOps, data engineering, and software teams. Ensure security and compliance in Kafka and Grafana implementations.

Requirements: 8+ years of experience configuring Kafka, Elasticsearch, and Grafana. Strong understanding of Apache Kafka architecture and Grafana visualization. Proficiency in .NET or Python for Kafka development. Experience with distributed systems and message-oriented middleware. Knowledge of time-series databases and monitoring tools. Familiarity with data serialization formats such as JSON. Expertise in Azure platforms and Kafka monitoring tools. Good problem-solving and communication skills.

Mandate: creating Kafka dashboards; Python/.NET. Note: the candidate must be an immediate joiner.
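A common metric behind the Kafka dashboards and alerts mentioned above is consumer lag: the log-end offset minus the committed offset, summed across a topic's partitions. A small sketch of that computation with made-up warning/critical thresholds (the offsets and thresholds are illustrative, not from the posting):

```python
# Sketch: consumer-lag computation and threshold-based alert levels,
# the kind of rule a Grafana alert on Kafka metrics encodes.
def total_lag(end_offsets: dict, committed: dict) -> int:
    # Per-partition lag = log-end offset - committed offset; sum them.
    return sum(end_offsets[p] - committed.get(p, 0) for p in end_offsets)

def lag_alert(lag: int, warn: int = 1_000, crit: int = 10_000) -> str:
    if lag >= crit:
        return "critical"
    if lag >= warn:
        return "warning"
    return "ok"

lag = total_lag({0: 5_200, 1: 4_900}, {0: 5_000, 1: 4_600})
assert lag == 500
assert lag_alert(lag) == "ok"
```

In production the offsets would come from a metrics exporter or the consumer-group API, and Grafana would evaluate the thresholds against the resulting time series.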

Posted 1 month ago


5.0 - 10.0 years

15 - 20 Lacs

Noida, Gurugram

Work from Office

Manage end-to-end activities including installation, configuration, and maintenance of Kafka clusters; manage the topics configured in Kafka; and ensure maximum uptime of Kafka. Monitor the performance of producer and consumer threads interacting with Kafka.

Required candidate profile: Kafka certification. Must have hands-on experience managing large Kafka clusters and installations, and the ability to monitor the performance of producer and consumer threads interacting with Kafka.
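One routine uptime check implied by cluster administration like this is spotting under-replicated partitions, i.e., partitions whose in-sync replica set is smaller than the assigned replica set. An illustrative sketch over a hypothetical cluster-state snapshot (the topic names and broker IDs are invented for the example):

```python
# Sketch: flag under-replicated partitions from a cluster-state snapshot.
# A partition is healthy when every assigned replica is in the ISR.
def under_replicated(partitions: list) -> list:
    return [(p["topic"], p["partition"])
            for p in partitions
            if len(p["isr"]) < len(p["replicas"])]

state = [
    {"topic": "orders", "partition": 0, "replicas": [1, 2, 3], "isr": [1, 2, 3]},
    {"topic": "orders", "partition": 1, "replicas": [1, 2, 3], "isr": [1, 3]},
]
# Partition 1 has lost broker 2 from its ISR and should raise an alert.
assert under_replicated(state) == [("orders", 1)]
```

In practice this snapshot would come from an admin client or the broker's UnderReplicatedPartitions metric; a sustained non-empty result is a standard paging condition.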

Posted 2 months ago


6.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Roles & Responsibilities: Design, build, and manage Kafka clusters using Confluent Platform and Kafka cloud services (AWS MSK, Confluent Cloud). Develop and maintain Kafka topics, schemas (Avro/Protobuf), and connectors for data ingestion and processing pipelines. Monitor and ensure the reliability, scalability, and security of the Kafka infrastructure. Collaborate with application and data engineering teams to integrate Kafka with other AWS-based services (e.g., Lambda, S3, EC2, Redshift). Implement and manage Kafka Connect, Kafka Streams, and ksqlDB where applicable. Optimize Kafka performance, troubleshoot issues, and manage incident response.

Preferred candidate profile: 4-6 years of experience working with Apache Kafka and Confluent Kafka. Strong knowledge of Kafka internals (brokers, ZooKeeper, partitions, replication, offsets). Experience with Kafka Connect, Schema Registry, REST Proxy, and Kafka security. Hands-on experience with AWS (EC2, IAM, CloudWatch, S3, Lambda, VPC, load balancers). Proficiency in scripting and automation using Terraform, Ansible, or similar tools. Familiarity with DevOps practices and tools (CI/CD pipelines; monitoring tools such as Prometheus/Grafana, Splunk, or Datadog). Experience with containerization (Docker, Kubernetes) is a plus.
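For the Avro schema work mentioned above, a record schema is just a JSON structure of named, typed fields. Below is an illustrative payment schema plus a naive field-presence check; real pipelines validate through Schema Registry and a proper Avro serializer, and all names here are hypothetical:

```python
# Sketch: an Avro record schema for an example payments topic, with a
# naive check that a record supplies every field lacking a default.
PAYMENT_SCHEMA = {
    "type": "record",
    "name": "Payment",
    "namespace": "com.example.streams",   # illustrative namespace
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "currency", "type": "string", "default": "USD"},
    ],
}

def conforms(record: dict, schema: dict) -> bool:
    # A field is satisfied if the record supplies it or the schema
    # declares a default (the basis of Avro's compatibility rules).
    return all(f["name"] in record or "default" in f
               for f in schema["fields"])

assert conforms({"id": "p-1", "amount": 9.99}, PAYMENT_SCHEMA)
assert not conforms({"amount": 9.99}, PAYMENT_SCHEMA)
```

Defaults are what make adding a field a backward-compatible schema change, which is why Schema Registry compatibility modes hinge on them.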

Posted 2 months ago
