
6 KStreams Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

10.0 - 14.0 years

Not disclosed

Chennai, Tamil Nadu

On-site

You should have at least 10+ years of experience in the field, with a strong background in Kafka Streams / KSQL architecture and the associated clustering model. Your expertise should include solid programming skills in Java, along with best practices in development, automation testing, and streaming APIs. Practical experience in scaling Kafka, KStreams, and Connector infrastructures is required, as is the ability to optimize the Kafka ecosystem for specific use-cases and workloads.

As a developer, you should have hands-on experience building producer and consumer applications using the Kafka API, and proficiency in implementing KStreams components. You should also have developed KStreams pipelines and deployed KStreams clusters. Experience in developing KSQL queries and understanding when to use KSQL versus KStreams is essential. Strong knowledge of the Kafka Connect framework is necessary, including experience with various connector types such as HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, and Salesforce, and the ability to support wire-format translations. Familiarity with connectors available from Confluent and the community, as well as hands-on experience designing, writing, and operationalizing new Kafka Connectors using the framework, is a plus. Knowledge of Schema Registry is also beneficial.

Nice-to-have qualities include providing thought leadership for the team, excellent verbal and written communication skills, being a good team player, and willingness to go the extra mile to support the team.

A four-year college degree in Science, Engineering, Technology, Business, or Humanities is required; candidates with a Master's degree and/or certifications in the relevant technologies are preferred. The working mode is hybrid, full-time (3 days per week in the office), and the maximum notice period is 30 days.
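To make the KStreams pipeline requirement concrete, here is a minimal Kafka Streams topology sketch in Java. It is an illustration only, not the employer's codebase: the broker address and the orders / large-orders topic names are assumptions.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class OrderFilterApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-filter-app"); // names the app and its consumer group
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read raw order events, keep only flagged large orders, route them to a second topic.
            KStream<String, String> orders = builder.stream("orders"); // hypothetical input topic
            orders.filter((key, value) -> value != null && value.contains("\"large\":true"))
                  .to("large-orders"); // hypothetical output topic

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // clean shutdown
        }
    }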

Posted 3 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Key Skills: Confluent Kafka, Kafka Connect, Schema Registry, Kafka Brokers, KSQL, KStreams, Java/J2EE, Troubleshooting, RCA, Production Support.

Roles & Responsibilities:
- Design and develop Kafka pipelines.
- Perform unit testing of the code and prepare test plans as required.
- Analyze, design, and develop programs in a development environment.
- Support applications and jobs in the production environment for issues or failures.
- Develop operational documents for applications, including DFD, ICD, HLD, etc.
- Troubleshoot production issues and provide solutions within the defined SLA.
- Prepare RCA (Root Cause Analysis) documents for production issues.
- Provide permanent fixes to production issues.

Experience Requirement: 5-10 years of experience working with Confluent Kafka. Hands-on experience with Kafka Connect using Schema Registry. Strong knowledge of Kafka brokers and KSQL. Familiarity with Kafka Control Center, Zookeeper, and KStreams is good to have. Experience with Java/J2EE is a plus.

Education: B.E., B.Tech.
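As a sketch of the production-support side of this role, the short Java consumer below can be used to spot-check messages on a topic while triaging an incident. The broker address and the payments topic are placeholder assumptions.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class TopicSpotCheck {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "spot-check"); // throwaway group for debugging
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"); // replay from the beginning
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("payments")); // hypothetical topic under investigation
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> r : records) {
                    System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                            r.partition(), r.offset(), r.key(), r.value());
                }
            }
        }
    }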

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Dubai, Pune, Chennai

Hybrid

Job Title: Confluent CDC System Analyst

Role Overview: A leading bank in the UAE is seeking an experienced Confluent Change Data Capture (CDC) System Analyst / Tech Lead to implement real-time data streaming solutions. The role involves implementing robust CDC frameworks using Confluent Kafka, ensuring seamless data integration between core banking systems and analytics platforms. The ideal candidate will have deep expertise in event-driven architectures, CDC technologies, and cloud-based data solutions.

Key Responsibilities:
- Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
- Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
- Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
- Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
- Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
- Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate. (A sketch of connector registration follows this listing.)
- Collaborate with the infrastructure team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
- Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
- Stay current with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications:
- Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
- Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
- Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
- Hands-on experience with IBM Analytics.
- Solid understanding of core banking systems, transactional databases, and financial data flows.
- Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
- Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
- Strong experience in event-driven architectures, microservices, and API integrations.
- Familiarity with security protocols, compliance, and data governance in banking environments.
- Excellent problem-solving, leadership, and stakeholder communication skills.
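To illustrate the CDC tooling named above, this sketch registers a hypothetical Debezium MySQL source connector through the Kafka Connect REST API using Java's built-in HTTP client. Every host name, credential, and table name is an assumption, and a production connector would need additional settings (e.g., schema-history configuration).

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RegisterCdcConnector {
        public static void main(String[] args) throws Exception {
            // Hypothetical Debezium MySQL source connector; schema-history settings omitted for brevity.
            String config = """
                {
                  "name": "core-banking-cdc",
                  "config": {
                    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
                    "database.hostname": "corebank-db.example.internal",
                    "database.port": "3306",
                    "database.user": "cdc_user",
                    "database.password": "change-me",
                    "database.server.id": "5701",
                    "topic.prefix": "corebank",
                    "table.include.list": "ledger.transactions"
                  }
                }
                """;

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://connect.example.internal:8083/connectors")) // assumed Connect REST endpoint
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(config))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body()); // 201 Created on success
        }
    }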

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 13 Lacs

Thane, Goregaon, Mumbai (All Areas)

Work from Office

Opening for a leading insurance company. **Looking for immediate joiners (maximum 30-day notice period).**

Key Responsibilities:

Kafka Infrastructure Management: Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security. Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, Zookeeper, and related components. Perform capacity planning and scaling of Kafka clusters based on application needs and growth.

Data Pipeline Development: Develop and optimize Kafka data pipelines to support real-time data streaming and processing. Collaborate with internal application development and data engineers to integrate Kafka with various HDFC Life data sources. Implement and maintain Schema Registry and serialization/deserialization protocols (e.g., Avro, Protobuf).

Security and Compliance: Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL).

Documentation and Support: Create and maintain documentation for Kafka setup, configurations, and operational procedures.

Collaboration: Provide technical support and guidance to application development teams regarding Kafka usage and best practices. Collaborate with stakeholders to ensure alignment with business objectives.

Interested candidates may share their resume at snehal@topgearconsultants.com.
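To make the encryption requirement concrete, here is a minimal sketch of an SSL-encrypted Kafka producer in Java; the TLS listener address, truststore path, password, and topic are placeholder assumptions.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.config.SslConfigs;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SecureProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1.example.internal:9093"); // assumed TLS listener
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Encrypt traffic between client and brokers; paths and passwords are placeholders.
            props.put("security.protocol", "SSL");
            props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.truststore.jks");
            props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "change-me");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("policy-events", "policy-123", "issued")); // hypothetical topic
                producer.flush();
            }
        }
    }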

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 14 Lacs

Mumbai, Goregaon, Mumbai (All Areas)

Work from Office

Opening for an insurance company. **Looking for candidates with a maximum 30-day notice period.** Location: Mumbai (Lower Parel).

Key Responsibilities:

Kafka Infrastructure Management: Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security. Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, Zookeeper, and related components. Perform capacity planning and scaling of Kafka clusters based on application needs and growth.

Data Pipeline Development: Develop and optimize Kafka data pipelines to support real-time data streaming and processing. Collaborate with internal application development and data engineers to integrate Kafka with various HDFC Life data sources. Implement and maintain Schema Registry and serialization/deserialization protocols (e.g., Avro, Protobuf).

Security and Compliance: Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL).

Documentation and Support: Create and maintain documentation for Kafka setup, configurations, and operational procedures.

Collaboration: Provide technical support and guidance to application development teams regarding Kafka usage and best practices. Collaborate with stakeholders to ensure alignment with business objectives.

Interested candidates may share their resume at snehal@topgearconsultants.com.
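As a sketch of the Avro serialization mentioned above, the producer below uses Confluent's KafkaAvroSerializer with Schema Registry (requires the kafka-avro-serializer dependency); the schema, topic, and endpoints are illustrative assumptions.

    import java.util.Properties;
    import io.confluent.kafka.serializers.KafkaAvroSerializer;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class AvroClaimProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
            props.put("schema.registry.url", "http://localhost:8081"); // assumed Schema Registry endpoint

            // Hypothetical Avro schema for a claim event; the registry enforces compatibility on updates.
            Schema schema = new Schema.Parser().parse("""
                {"type":"record","name":"Claim","fields":[
                  {"name":"claimId","type":"string"},
                  {"name":"amount","type":"double"}]}
                """);
            GenericRecord claim = new GenericData.Record(schema);
            claim.put("claimId", "CLM-001");
            claim.put("amount", 25000.0);

            try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("claims", "CLM-001", claim)); // hypothetical topic
                producer.flush();
            }
        }
    }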

Posted 1 month ago

Apply

5.0 - 10.0 years

16 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Detailed job description / Skill Set:
- Proven experience as a Kafka Developer.
- Knowledge of Kafka schemas and use of the Schema Registry.
- Strong knowledge of Kafka and other big data technologies.
- Best practices to optimize the Kafka ecosystem based on use-case and workload.
- Knowledge of Kafka clustering and its fault-tolerance model supporting high availability.
- Strong fundamentals in Kafka client configuration and troubleshooting.
- Designing and implementing data pipelines using Apache Kafka.
- Develop and maintain Kafka-based data pipelines.
- Monitor and optimize Kafka clusters (see the sketch after this list).
- Troubleshoot and resolve issues related to Kafka and data processing.
- Ensure data security and compliance with industry standards.
- Create and maintain documentation for Kafka configurations and processes.
- Implement best practices for Kafka architecture and operations.

Mandatory Skills: Kafka Developer.
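As a sketch of the cluster-monitoring duties in this list, the Java snippet below uses Kafka's AdminClient to verify a topic's partition count, leaders, and in-sync replicas; the broker address and topic name are assumptions.

    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.TopicDescription;

    public class TopicHealthCheck {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker

            try (Admin admin = Admin.create(props)) {
                // Describe a topic to verify partition count and replica placement.
                TopicDescription desc = admin.describeTopics(List.of("orders")) // hypothetical topic
                        .allTopicNames().get().get("orders");
                desc.partitions().forEach(p ->
                        System.out.printf("partition=%d leader=%s replicas=%d isr=%d%n",
                                p.partition(), p.leader(), p.replicas().size(), p.isr().size()));
            }
        }
    }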

Posted 2 months ago

Apply
