Senior Developer Kafka

8 - 15 years

4 - 7 Lacs

Bengaluru

Posted: 1 week ago

Skills Required

Kafka architecture, Linux/Unix, security, scripting, networking, network programming, Java, JVM tuning, automation, Python, management, monitoring, logging, Log4j, troubleshooting, encryption, authentication, integration, data backup strategies, Apache Kafka

Work Mode

On-site

Job Type

Part Time

Job Description

We are seeking an experienced professional with 8-15 years of IT experience to join our team. The ideal candidate should possess expertise in Kafka architecture and operations, along with a strong understanding of Confluent-specific tools, Linux/Unix systems, networking, and security practices.

Key Responsibilities & Requirements:

Kafka Architecture & Operations: Deep understanding of core Kafka components, including brokers, topics, partitions, producers, and consumers; ability to create, configure, and manage Kafka topics and partitions (see the sketch after this description).

Confluent Ecosystem: Proficiency in Confluent-specific tools such as Control Center, Schema Registry, ksqlDB, and Kafka Connect.

Linux/Unix Expertise: Strong command of Linux/Unix systems, including shell scripting and system monitoring.

Networking Knowledge: Understanding of network configurations, protocols, and security best practices to ensure efficient and secure Kafka operations.

Programming Skills: Knowledge of Java programming and JVM tuning, as Kafka is built on Java.

Automation & Scripting: Proficiency in scripting languages such as Python or Bash for automation and management tasks.

Monitoring & Logging: Experience with monitoring tools such as Prometheus, Grafana, and Confluent Control Center to track Kafka performance; familiarity with logging frameworks such as Log4j for troubleshooting and maintaining Kafka logs.

Security Practices: Implementation of security measures, including SSL/TLS encryption, Kerberos authentication, and access control lists (ACLs), to safeguard Kafka data.

Integration Expertise: Experience integrating Kafka with various systems and data sources, including databases, data lakes, and cloud services.

Capacity Planning: Ability to plan and scale Kafka clusters to handle dynamic workloads while ensuring high availability.

Backup & Recovery: Knowledge of backup and recovery strategies to protect data and ensure business continuity in case of failures (a frequent task in T&S).

Preferred Qualification:

Confluent Certification: Preference for candidates holding the Confluent Certified Administrator for Apache Kafka (CCAAK) certification.

Note: This role is a 6-month contractual position with the possibility of extension based on performance. The work location is Bangalore (Eco World, Bellandur) and requires on-site presence at the office.
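The topic-creation task mentioned above can be exercised, for example, with Kafka's Java AdminClient. Below is a minimal sketch, assuming a broker reachable at localhost:9092; the topic name, partition count, replication factor, and the commented SSL paths are illustrative placeholders, not values from this posting.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumed broker address; replace with the cluster's actual bootstrap servers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        // Illustrative settings for an SSL-enabled cluster (placeholder values):
        // props.put("security.protocol", "SSL");
        // props.put("ssl.truststore.location", "/path/to/truststore.jks");
        // props.put("ssl.truststore.password", "changeit");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 6 partitions and replication factor 3 (example values only).
            NewTopic topic = new NewTopic("orders-events", 6, (short) 3);
            admin.createTopics(Collections.singleton(topic)).all().get();
            System.out.println("Topic created: " + topic.name());
        }
    }
}

The same operation can also be performed with the kafka-topics command-line tool shipped with Apache Kafka; the AdminClient is typically preferred when topic management needs to be automated from application code.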
