9 - 13 years

9 - 15 Lacs

Posted: 10 hours ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Kafka Integration Engineer / Senior Data Streaming Engineer

Experience Required: 7 to 10 Years

Location: Bangalore / Hyderabad / Pune / Chennai / Gurgaon / Remote

Kafka Integration Engineer

Key Responsibilities:

  • Design and implement Kafka-based integration and real-time data streaming solutions.
  • Manage and maintain Kafka clusters, brokers, topics, partitions, offsets, and replication settings.
  • Configure and operate Kafka Connect, including the SQL Connector, HTTP Connector, and other custom connectors.
  • Ensure performance tuning, high availability, and stability of Kafka environments.
  • Work with the Kafka APIs (Producer, Consumer, Streams) for integration and data processing.
  • Perform production deployments, environment maintenance, and troubleshooting.
  • Design and document Kafka deployment architectures, covering scalability and security best practices.
  • Collaborate with cross-functional teams on integration and data engineering requirements.

Required Skills:

  • 7+ years of experience in integration technologies and real-time streaming platforms.
  • Strong hands-on knowledge of Apache Kafka, ZooKeeper, Kafka brokers, connectors, producers, and consumers.
  • Experience setting up Kafka clusters and managing distributed environments.
  • Proficiency in Kafka Connect, especially the SQL Connector and HTTP Connector.
  • Excellent understanding of the Kafka ecosystem, data streaming best practices, and messaging patterns.
  • Strong troubleshooting, configuration, and performance optimization skills.
  • Experience with production deployments and background process management.
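The producer/consumer and messaging-pattern knowledge listed above rests on one core idea: keyed messages are routed deterministically to partitions, which is what preserves per-key ordering. The sketch below is a simplified, dependency-free illustration of that routing — real Kafka clients use murmur2 hashing, not MD5, and all names here are illustrative.

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Route a message key to a partition deterministically.

    Simplified stand-in: Kafka's default partitioner uses murmur2 on the
    key bytes, but the guarantee is the same -- equal keys always land on
    the same partition, which preserves per-key ordering.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All events for one order key go to one partition (ordering preserved).
p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2
assert 0 <= p1 < 6
```

This is also why increasing the partition count of an existing topic breaks key-to-partition stability: the modulus changes, so keys can remap to different partitions.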

Good to Have:

  • Experience with Schema Registry, Kafka Streams, KSQL, or the Confluent Platform.
  • Knowledge of Azure Event Hubs, AWS MSK, or GCP Pub/Sub.
  • Understanding of DevOps tools (Docker, Kubernetes, CI/CD).
  • Exposure to monitoring tools such as Prometheus, Grafana, or Splunk.
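For the Docker exposure mentioned above, a single-broker Kafka for local development is often spun up with Docker Compose. The following is a minimal sketch assuming the Confluent community images; image versions, hostnames, and ports are assumptions, not details from this posting.

```yaml
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper]
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # Single broker, so internal topics cannot be replicated 3x.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

Note that newer Kafka versions can also run in KRaft mode without ZooKeeper; this sketch uses the ZooKeeper-based layout the job description references.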



How to Apply:

Please share your updated CV, including details of your Kafka projects, current CTC, expected CTC, and notice period, to: Janhavi@silverlinktechnologies.com

Silverlink Technologies

Information Technology
