5.0 - 9.0 years
0 Lacs
Haryana
On-site
You are an experienced Confluent Kafka Administrator responsible for managing and maintaining the Kafka infrastructure. With a minimum of 5 years of experience in Confluent Kafka and expertise in data streaming and real-time data processing, you will handle a range of responsibilities to ensure the smooth operation of Kafka clusters.

Your key responsibilities include installing, configuring, and managing Kafka clusters; monitoring performance for optimization; implementing security measures; integrating Kafka with data sources; planning and executing upgrades and patches; developing disaster recovery plans; documenting configurations and procedures; providing support for Kafka-related issues; and managing Kafka topics effectively.

To excel in this role, you must have a strong understanding of Kafka architecture, KRaft mode (Kafka Raft), Kafka Connect, Kafka Streams, and KSQL. Proficiency in scripting languages and automation tools, experience with monitoring tools such as Prometheus and Grafana, knowledge of Kafka security best practices, excellent problem-solving skills, and effective communication and collaboration abilities are essential.

Preferred qualifications include certifications such as Confluent Certified Administrator or Developer, experience with cloud platforms like AWS, familiarity with DevOps practices and tools, and knowledge of databases. Join us in Gurgaon for this work-from-office position and become an immediate contributor to our team.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have hands-on experience with Confluent Kafka and strong knowledge of Kafka architecture, KRaft mode (Kafka Raft), Kafka Connect, Kafka Streams, and KSQL. Proficiency in scripting languages such as Python and Bash and in automation tools like Ansible and Terraform is essential, as is experience with monitoring tools such as Prometheus, Grafana, Dynatrace, and Splunk ITSI. An understanding of Kafka security best practices, including SSL/TLS, Kerberos, and RBAC, is crucial. Strong analytical and problem-solving skills and excellent communication and collaboration skills are also expected.
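The "strong knowledge of Kafka architecture" these postings ask for usually includes understanding how keyed records map to partitions. As a rough illustration only, here is a pure-Python sketch of the murmur2-based keyed partitioning scheme that Kafka's default Java partitioner is documented to use; this is not code from any Kafka client, and real clients implement it natively:

```python
def murmur2(data: bytes) -> int:
    """32-bit murmur2 hash, as described for Kafka's default partitioner."""
    length = len(data)
    seed = 0x9747B28C
    m = 0x5BD1E995
    r = 24
    mask = 0xFFFFFFFF
    h = (seed ^ length) & mask
    i = 0
    # Process the input four bytes at a time.
    while length - i >= 4:
        k = data[i] | (data[i + 1] << 8) | (data[i + 2] << 16) | (data[i + 3] << 24)
        k = (k * m) & mask
        k ^= k >> r
        k = (k * m) & mask
        h = (h * m) & mask
        h ^= k
        i += 4
    # Handle the remaining 1-3 bytes (mirrors the Java switch fall-through).
    left = length - i
    if left >= 3:
        h ^= data[i + 2] << 16
    if left >= 2:
        h ^= data[i + 1] << 8
    if left >= 1:
        h ^= data[i]
        h = (h * m) & mask
    h ^= h >> 13
    h = (h * m) & mask
    h ^= h >> 15
    return h


def partition_for_key(key: bytes, num_partitions: int) -> int:
    # Keyed records go to hash(key) mod partition count, so the same key
    # always lands on the same partition while the partition count is unchanged.
    return (murmur2(key) & 0x7FFFFFFF) % num_partitions
```

This is also why adding partitions to an existing topic reshuffles key-to-partition mapping, a point that often comes up in interviews for these roles.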
Posted 2 weeks ago
6.0 - 8.0 years
7 - 12 Lacs
Pune, Mumbai (All Areas)
Work from Office
*Kafka DBA Engineer SME*
Exp: 6+ years
Location: Mumbai
*Kafka and Zookeeper*
*DevOps tools*: Ansible, Nagios, shell scripting, Python, Jenkins, Git, Maven, etc., to implement automation
*Multiple distributions*: Cloudera, Confluent, open-source Kafka, etc.
*Network infrastructure*: TCP/IP, DNS, firewall, router, load balancer, etc.
*Security*: implementation of security in the Hadoop ecosystem (Ranger, Kerberos, SSL, etc.)
Posted 2 weeks ago
6.0 - 11.0 years
5 - 13 Lacs
Navi Mumbai, Pune, Mumbai (All Areas)
Work from Office
Role & responsibilities
Kafka DBA Engineer SME
Exp: 6+ years
Location: Mumbai
Kafka and Zookeeper
DevOps tools: Ansible, Nagios, shell scripting, Python, Jenkins, Git, Maven, etc., to implement automation
Multiple distributions: Cloudera, Confluent, open-source Kafka, etc.
Network infrastructure: TCP/IP, DNS, firewall, router, load balancer, etc.
Security: implementation of security in the Hadoop ecosystem (Ranger, Kerberos, SSL, etc.)
If interested, kindly share your CV at aditya.s@genxhire.in
Posted 2 weeks ago
6.0 - 11.0 years
5 - 13 Lacs
Navi Mumbai, Pune, Mumbai (All Areas)
Work from Office
Role & responsibilities
Kafka DBA Engineer SME
Exp: 6+ years
Location: Mumbai
Kafka and Zookeeper
DevOps tools: Ansible, Nagios, shell scripting, Python, Jenkins, Git, Maven, etc., to implement automation
Multiple distributions: Cloudera, Confluent, open-source Kafka, etc.
Network infrastructure: TCP/IP, DNS, firewall, router, load balancer, etc.
Security: implementation of security in the Hadoop ecosystem (Ranger, Kerberos, SSL, etc.)
If interested, kindly share your CV at 7021112973 or rashi.s@gmail.in
Posted 2 weeks ago
7.0 - 12.0 years
14 - 24 Lacs
Pune, Bengaluru
Hybrid
Role & responsibilities
Kafka Engineer
Work mode: Hybrid
Location: Bangalore/Pune
7+ years of experience with Apache Kafka: designing and owning solutions, troubleshooting.
Please share your updated profile to puneet@mounttalent.com
Posted 2 weeks ago
10.0 - 18.0 years
25 - 40 Lacs
Gurugram, Bengaluru
Hybrid
Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED.
Contract To Hire (C2H) role
Location: Bengaluru/Gurgaon
Payroll: BCforward
Work Mode: Hybrid
JD: Java; Apache Kafka; AWS; microservices
Experienced Java engineer with over 10 years of experience and expertise in microservices and event-driven architecture; Kafka expertise is a must.
Please share your updated resume, PAN card soft copy, passport-size photo, and UAN history. Interested applicants can share an updated resume to g.sreekanth@bcforward.com
Note: Looking for immediate to 30-day joiners at most. All the best!
Posted 3 weeks ago
8.0 - 13.0 years
20 - 35 Lacs
Gurugram
Remote
Kafka Developer (7+ Years Experience)

Position Overview
We are seeking a highly skilled Kafka Developer with 7+ years of experience in designing, developing, and deploying real-time data streaming solutions. The ideal candidate will have strong expertise in Apache Kafka, distributed systems, and event-driven architecture, along with proficiency in Java/Scala/Python.

Key Responsibilities
Design, develop, and optimize Kafka-based real-time data pipelines and event-driven solutions.
Implement and maintain Kafka producers, consumers, and stream processing applications.
Build and configure Kafka Connectors, Schema Registry, and KSQL/Kafka Streams for data integration.
Manage Kafka clusters (on-premises and cloud: AWS MSK, Confluent Cloud, Azure Event Hubs).
Ensure high availability, scalability, and reliability of Kafka infrastructure.
Troubleshoot and resolve issues related to Kafka performance, lag, replication, offsets, and throughput.
Implement security best practices (SSL/TLS, SASL, Kerberos, RBAC).
Collaborate with cross-functional teams (data engineers, architects, DevOps, business stakeholders).
Write unit/integration tests and ensure code quality and performance tuning.
Document solutions and best practices, and share knowledge within the team.

Required Skills & Experience
7+ years of software development experience, with at least 4+ years in Kafka development.
Strong hands-on experience with Apache Kafka APIs (Producer, Consumer, Streams, Connect).
Proficiency in Java or Scala (Python/Go is a plus).
Solid understanding of event-driven and microservices architecture.
Experience with serialization formats (Avro, Protobuf, JSON).
Strong knowledge of distributed systems concepts (partitioning, replication, consensus).
Experience with Confluent Platform and ecosystem tools (Schema Registry, REST Proxy, Control Center).
Exposure to cloud-based Kafka (AWS MSK, Confluent Cloud, Azure Event Hubs, GCP Pub/Sub).
Hands-on with CI/CD, Docker, Kubernetes, and monitoring tools (Prometheus, Grafana, Splunk).
Strong problem-solving skills with the ability to troubleshoot latency, lag, and cluster performance issues.
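Troubleshooting "lag, replication, offsets, and throughput", as the responsibilities above put it, usually starts from one piece of arithmetic: consumer lag is the gap between a partition's log-end offset and the group's committed offset. A minimal sketch of that calculation (the data shapes here are hypothetical; real numbers would come from the Admin API or `kafka-consumer-groups.sh`):

```python
def partition_lag(log_end_offset: int, committed_offset: int) -> int:
    # Lag = messages produced to a partition but not yet consumed by the group.
    # Clamp at zero: a committed offset can momentarily read ahead of a stale
    # log-end snapshot when the two are fetched at different times.
    return max(log_end_offset - committed_offset, 0)


def group_lag(offsets: dict) -> int:
    # offsets maps (topic, partition) -> (log_end_offset, committed_offset).
    # Total group lag is the sum of per-partition lags.
    return sum(partition_lag(end, committed) for end, committed in offsets.values())
```

Watching this total over time distinguishes a consumer that is merely behind (lag flat or shrinking) from one that cannot keep up (lag growing).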
Posted 4 weeks ago
9.0 - 14.0 years
15 - 18 Lacs
Navi Mumbai
Work from Office
We are hiring an experienced Messaging Systems Specialist (Kafka & RabbitMQ) with 8+ years of expertise in administration, architecture, and performance tuning. The role involves deploying, managing, and securing messaging systems on Kubernetes and VMs.

Required candidate profile: 8+ years of experience in Kafka & RabbitMQ administration with strong Kubernetes/Docker skills, Kafka (Confluent) certified, and proven ability in performance tuning and scaling.
Posted 4 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description:
Stand up and administer on-premise Kafka clusters.
Architect and create reference architecture and implementation standards for Kafka.
Provide expertise in Kafka brokers, Zookeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
Ensure optimum performance, high availability, and stability of solutions.
Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices.
Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
Provide administration and operations of the Kafka platform, including provisioning, access lists, and Kerberos and SSL configurations.
Use automation tools for provisioning, such as Docker, Jenkins, and GitLab.
Perform data-related benchmarking, performance analysis, and tuning.
Strong skills in in-memory applications, database design, and data integration.
Participate in design and capacity review meetings to provide suggestions on Kafka usage.
Solid knowledge of monitoring tools and fine-tuning alerts in Splunk, Prometheus, and Grafana.
Set up security on Kafka.
Provide naming conventions, backup & recovery, and problem-determination strategies for projects.
Monitor, prevent, and troubleshoot security-related issues.
Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure.

QUALIFICATIONS
Demonstrated proficiency and experience in the design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure.
Hands-on experience with recovery in Kafka.
2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities.
Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
Considerable experience in implementing Kerberos security.
Support a 24x7 model and be available for rotational on-call work.
Competent working in one or more environments highly integrated with an operating system.
Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
High critical-thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy.
Ability to manage tasks independently and take ownership of responsibilities.
Ability to learn from mistakes and apply constructive feedback to improve performance.
Ability to adapt to a rapidly changing environment.
Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and appropriate levels of assertiveness.
Ability to communicate highly complex technical information clearly and articulately to all levels and audiences.
Willingness to learn new technologies/tools and train your peers.
Proven track record of automation.
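For the Kerberos and SSL configuration duties this kind of role involves, the client side often comes down to a handful of properties. A hedged sketch using librdkafka-style property names (the style the confluent-kafka clients accept); the broker address, principal, and file paths below are placeholders, and server-side keytab/JAAS setup is a separate exercise:

```python
def kerberos_ssl_client_config(bootstrap: str, principal: str,
                               keytab: str, ca_cert: str) -> dict:
    """Build a SASL_SSL (Kerberos/GSSAPI) client configuration dict."""
    return {
        # Brokers must expose a SASL_SSL listener for this to connect.
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "GSSAPI",
        # Kerberos service name the brokers run under (conventionally "kafka").
        "sasl.kerberos.service.name": "kafka",
        "sasl.kerberos.principal": principal,
        "sasl.kerberos.keytab": keytab,
        # CA certificate used to verify the brokers' TLS certificates.
        "ssl.ca.location": ca_cert,
    }


cfg = kerberos_ssl_client_config(
    "broker1.internal:9093",            # placeholder broker
    "svc_app@EXAMPLE.COM",              # placeholder principal
    "/etc/security/keytabs/app.keytab", # placeholder keytab path
    "/etc/ssl/certs/cluster-ca.pem",    # placeholder CA bundle
)
```

Such a dict would be passed to a confluent-kafka Producer or Consumer constructor; authorization on top of authentication (the "access lists" above) is then handled broker-side with ACLs.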
Posted 1 month ago
4.0 - 8.0 years
7 - 17 Lacs
Hyderabad, Bengaluru
Hybrid
Role & responsibilities
Kafka Admin (CTH, 3-6 months, conversion based on performance)

Job Description:
• Stand up and administer on-premise Kafka clusters.
• Architect and create reference architecture and implementation standards for Kafka.
• Provide expertise in Kafka brokers, Zookeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
• Ensure optimum performance, high availability, and stability of solutions.
• Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices.
• Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
• Provide administration and operations of the Kafka platform, including provisioning, access lists, and Kerberos and SSL configurations.
• Use automation tools for provisioning, such as Docker, Jenkins, and GitLab.
• Perform data-related benchmarking, performance analysis, and tuning.
• Strong skills in in-memory applications, database design, and data integration.
• Participate in design and capacity review meetings to provide suggestions on Kafka usage.
• Solid knowledge of monitoring tools and fine-tuning alerts in Splunk, Prometheus, and Grafana.
• Set up security on Kafka.
• Provide naming conventions, backup & recovery, and problem-determination strategies for projects.
• Monitor, prevent, and troubleshoot security-related issues.
• Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure.

QUALIFICATIONS
• 4-6 years of demonstrated proficiency and experience in the design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure.
• Hands-on experience with recovery in Kafka.
• 2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities.
• Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
• Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
• Considerable experience in implementing Kerberos security.
• Support a 24x7 model and be available for rotational on-call work (including Saturday/Sunday).
• Competent working in one or more environments highly integrated with an operating system.
• Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
• High critical-thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy.
• Ability to manage tasks independently and take ownership of responsibilities.
• Ability to learn from mistakes and apply constructive feedback to improve performance.
• Ability to adapt to a rapidly changing environment.
• Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and appropriate levels of assertiveness.
• Ability to communicate highly complex technical information clearly and articulately to all levels and audiences.
• Willingness to learn new technologies/tools and train your peers.
• Proven track record of automation.

Preferred candidate profile
If interested, please share your CV to harshini.d@ifinglobalgroup.com
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Roles and Responsibility Design, develop, and implement scalable Kafka infrastructure solutions. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain technical documentation for Kafka infrastructure projects. Troubleshoot and resolve complex issues related to Kafka infrastructure. Ensure compliance with industry standards and best practices for Kafka infrastructure. Participate in code reviews and contribute to the improvement of the overall code quality. Job Requirements Strong understanding of Kafka architecture and design principles. Experience with Kafka tools such as Streams, KSQL, and SCADA. Proficient in programming languages such as Java, Python, or Scala. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.
Posted 1 month ago
3.0 - 8.0 years
18 - 22 Lacs
Navi Mumbai, Mumbai (All Areas)
Work from Office
1. Education: B.E./B.Tech/MCA in Computer Science
2. Experience: Must have 7+ years of relevant experience in database administration.
3. Mandatory Skills/Knowledge:
The candidate should be technically sound in multiple distributions, such as Cloudera, Confluent, and open-source Kafka.
The candidate should be technically sound in Kafka and Zookeeper.
The candidate should be well versed in capacity planning and performance tuning.
The candidate should have expertise in implementing security in the Hadoop ecosystem (Ranger, Kerberos, SSL).
The candidate should have expertise in DevOps tools such as Ansible, Nagios, shell scripting, Python, Jenkins, Git, and Maven to implement automation.
The candidate should be able to monitor, debug, and provide RCA for any service failure.
Knowledge of network infrastructure, e.g., TCP/IP, DNS, firewall, router, load balancer.
Creative analytical and problem-solving skills.
Provide RCAs for critical & recurring incidents.
Provide on-call service coverage within a larger group.
Good aptitude in multi-threading and concurrency concepts.
4. Preferred Skills/Knowledge:
Expert knowledge of database administration and architecture.
Hands-on with operating system commands.
Kindly share CVs to snehal.sankade@outworx.com
Posted 2 months ago
6.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Overview
PepsiCo Data BI & Integration Platforms is seeking a Confluent Kafka Platform technology leader responsible for overseeing the deployment and maintenance of on-premises and cloud infrastructure (AWS/Azure) for its North America PepsiCo Foods/Beverages business. The ideal candidate will have hands-on experience in managing and maintaining Confluent Kafka platforms, ensuring system stability, security, and optimal performance, along with Azure/AWS services, Infrastructure as Code (IaC), platform provisioning & administration, cloud network design, cloud security principles, and automation.

Responsibilities

Kafka platform administration
In-depth knowledge of Apache Kafka and the Confluent Platform.
Experience with Kafka Streams, Kafka Connect, and Schema Registry.
Familiarity with Confluent Control Center and other Confluent tools.
Experience with CI/CD pipelines and automation tools (e.g., Jenkins, Azure DevOps).
Monitor system health and performance, identify and resolve issues, and ensure smooth operation of business processes.
Tune system parameters, optimize resource utilization, and ensure the efficient operation of applications.
Implement and maintain security protocols for Kafka, including SSL/TLS encryption, Kerberos, and role-based access control (RBAC).
Manage Kafka Connect for integrating data from various sources and optimize Kafka Streams applications for real-time data processing.
Collaborate with development, QA, and other teams to resolve technical issues and ensure seamless integration of applications.
Develop scripts and automate tasks for administration and maintenance purposes.

Cloud Infrastructure & Automation
Implement cloud infrastructure policies, standards, and best practices, ensuring the cloud environment adheres to security and regulatory requirements.
Design, deploy, and optimize cloud-based infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services.
Drive troubleshooting of cloud infrastructure issues, ensuring timely resolution and root-cause analysis by partnering with the global cloud center of excellence, enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS, Confluent).
Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors.
Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources.
Write and maintain scripts for automation and deployment using PowerShell, Python, or the Azure/AWS CLI.
Work with stakeholders to document architectures, configurations, and best practices.
Knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulatory requirements, threat detection and prevention, disaster recovery, and business continuity.

Qualifications
Bachelor's degree in computer science.
At least 6 to 8 years of experience in IT cloud infrastructure, architecture, and operations, including security, with at least 4 years in a technical leadership role.
Deep understanding of Kafka architecture, including brokers, topics, partitions, replicas, and security.
Experience with installing, configuring, and maintaining Kafka clusters.
Proficiency in data streaming concepts and tools.
Knowledge of monitoring tools like Prometheus, Grafana, or similar.
Familiarity with Kafka security best practices (SSL, SASL, ACLs).
Proficiency in languages like Java, Scala, or Python.
Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
Strong expertise in Azure/AWS messaging technologies, real-time data ingestion, data warehouses, serverless ETL, DevOps, Kubernetes, virtual machines, and monitoring and security tools.
Strong expertise in Azure/AWS networking and security fundamentals, including network endpoints & network security groups, firewalls, external/internal DNS, load balancers, virtual networks, and subnets.
Proficiency in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible.
Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
Certifications in Azure/AWS platform administration, networking, and security are preferred.
Strong self-organization, time management, and prioritization skills.
A high level of attention to detail, excellent follow-through, and reliability.
Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization.
Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
Strategic thinker focused on business-value results that utilize technical solutions.
Strong communication skills in writing, speaking, and presenting.
Capable of working effectively in a multi-tasking environment.
Fluent in English.
Posted 2 months ago
10.0 - 20.0 years
30 - 40 Lacs
Hyderabad
Work from Office
Job Description: Kafka/Integration Architect

Position brief:
The Kafka/Integration Architect is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions. This role involves configuring, deploying, and monitoring Kafka clusters to ensure the high availability and scalability of data streaming services. The Kafka Architect collaborates with cross-functional teams to integrate Kafka into various applications and ensures optimal performance and reliability of the data infrastructure. Kafka/Integration Architects play a critical role in driving data-driven decision-making and enabling real-time analytics, contributing directly to the company's agility, operational efficiency, and ability to respond quickly to market changes. Their work supports key business initiatives by ensuring that data flows seamlessly across the organization, empowering teams with timely insights and enhancing the customer experience.

Location: Hyderabad

Primary Role & Responsibilities:
Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
Collaborate with development teams to integrate Kafka into applications and services.
Develop and maintain Kafka connectors, such as JDBC, MongoDB, and S3 connectors, along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights.
Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance.
Optimize Kafka configurations for performance, reliability, and scalability.
Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
Stay updated with the latest Kafka features, updates, and industry best practices to continuously improve the data infrastructure and stay ahead of industry trends.

Required Soft Skills:
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.
Ability to translate business requirements into technical solutions.

Working Experience and Qualification:
Education: Bachelor's or master's degree in computer science, information technology, or a related field.
Experience: Proven experience of 8-10 years as a Kafka Architect or in a similar role.
Skills:
Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
Experience with Kafka security, including SSL, SASL, and ACLs.
Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink.
Solid understanding of distributed systems, data streaming, and messaging patterns.
Proficiency in Java, Scala, or Python for Kafka-related development tasks.
Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
Experience with tools like Zookeeper, Schema Registry, and Kafka Connect.
Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
Experience with cloud platforms like AWS, Azure, or GCP.
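To make the stream-processing requirement above concrete: tools like Kafka Streams, KSQL, and Apache Flink ultimately perform windowed aggregations over keyed event streams. A pure-Python sketch of a tumbling-window count, the simplest such aggregation (illustrative only; this is not Kafka Streams code, and the event shapes are hypothetical):

```python
from collections import defaultdict


def tumbling_window_counts(events, window_ms):
    """Count events per (window, key) over fixed, non-overlapping windows.

    events: iterable of (timestamp_ms, key) pairs.
    Returns {(window_start_ms, key): count}, the same shape of result a
    windowed count in Kafka Streams or KSQL produces per key and window.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # A tumbling window of size W assigns each event to the window
        # starting at the largest multiple of W not exceeding its timestamp.
        window_start = ts - (ts % window_ms)
        counts[(window_start, key)] += 1
    return dict(counts)


clicks = [(1000, "user-a"), (1500, "user-a"), (2500, "user-a"), (2600, "user-b")]
per_window = tumbling_window_counts(clicks, window_ms=1000)
```

The real engines add what this sketch omits: incremental state stores, fault tolerance via changelog topics, and handling of late-arriving events.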
Preferred Skills (Optional):
Kafka certification or related credentials, such as:
Confluent Certified Administrator for Apache Kafka (CCAAK)
Cloudera Certified Administrator for Apache Kafka (CCA-131)
AWS Certified Data Analytics – Specialty (with a focus on streaming data solutions)
Knowledge of containerization technologies like Docker and Kubernetes.
Familiarity with other messaging systems like RabbitMQ or Apache Pulsar.
Experience with data serialization formats like Avro, Protobuf, or JSON.

Company Profile:
WAISL is an ISO 9001:2015, ISO 20000-1:2018, ISO 22301:2019 certified and CMMI Level 3 appraised digital transformation partner for businesses across industries, with a core focus on aviation and related adjacencies. We transform airports and related ecosystems through digital interventions with a strong service-excellence culture. As a leader in our chosen space, we deliver world-class services focused on airports and their related domains, enabled through outcome-focused, next-gen digital/technology solutions.

At present, WAISL is the primary technology solutions partner for Indira Gandhi International Airport, Delhi; Rajiv Gandhi International Airport, Hyderabad; Manohar International Airport, Goa; Kannur International Airport, Kerala; and Kuwait International Airport, and we expect to soon provide similar services for other airports in India and globally. As a digital transformation partner, WAISL brings proven credibility in managing and servicing 135+ Mn passengers and 80+ airlines, with core integration, deployment, and real-time management experience of 2000+ applications, vendor-agnostically, in highly complex technology-converging ecosystems. This excellence in managed services has enabled WAISL's customer airports to be rated among the best-in-class service providers by Skytrax and ACI awards, and to win many innovation and excellence awards.
Posted 2 months ago
5.0 - 8.0 years
10 - 17 Lacs
Noida, Gurugram, Greater Noida
Work from Office
5+ years in Python and Django
Microservices architecture and API development
Deployment via Kubernetes
Redis for performance tuning
Celery for distributed tasks
DBs: PostgreSQL, time-series
Redis, Celery, RabbitMQ/Kafka
Microservices architecture experience
Posted 2 months ago
5.0 - 10.0 years
18 - 33 Lacs
Bengaluru
Work from Office
Design, develop, and implement custom solutions in Java with strong backend skills: REST APIs, integrations, technical solutions, workflows, and configurations. Object-oriented programming, design patterns, multithreading, and event-driven application development with MQs. Call Milan: 7021504388
Posted 2 months ago
7.0 - 12.0 years
20 - 35 Lacs
Dubai, Pune, Chennai
Hybrid
Job Title: Confluent CDC System Analyst

Role Overview:
A leading bank in the UAE is seeking an experienced Confluent Change Data Capture (CDC) System Analyst/Tech Lead to implement real-time data streaming solutions. The role involves implementing robust CDC frameworks using Confluent Kafka, ensuring seamless data integration between core banking systems and analytics platforms. The ideal candidate will have deep expertise in event-driven architectures, CDC technologies, and cloud-based data solutions.

Key Responsibilities:
Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate.
Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
Stay updated with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications:
Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
Hands-on experience with IBM Analytics.
Solid understanding of core banking systems, transactional databases, and financial data flows.
Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
Strong experience in event-driven architectures, microservices, and API integrations.
Familiarity with security protocols, compliance, and data governance in banking environments.
Excellent problem-solving, leadership, and stakeholder communication skills.
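For reference, registering a CDC source with Kafka Connect is typically a single JSON payload POSTed to the Connect REST API (`POST /connectors`). A hedged sketch of a Debezium MySQL source connector payload: the property names follow Debezium's documented MySQL connector options, while the host names, credentials, and table names are invented placeholders, not anything from this posting.

```python
import json


def debezium_mysql_connector(name, host, user, password, tables, prefix):
    """Build the registration payload for a Debezium MySQL source connector."""
    return {
        "name": name,
        "config": {
            "connector.class": "io.debezium.connector.mysql.MySqlConnector",
            "database.hostname": host,
            "database.port": "3306",
            "database.user": user,
            "database.password": password,
            # Unique numeric id the connector uses when acting as a
            # MySQL replication client reading the binlog.
            "database.server.id": "184054",
            # Logical name prefixed to every change-event topic.
            "topic.prefix": prefix,
            "table.include.list": ",".join(tables),
            # Where Debezium persists the captured schema history.
            "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
            "schema.history.internal.kafka.topic": f"schema-changes.{prefix}",
        },
    }


payload = debezium_mysql_connector(
    "corebank-cdc", "coredb.internal", "cdc_user", "********",
    ["bank.accounts", "bank.transactions"], "corebank",
)
print(json.dumps(payload, indent=2))
```

Debezium then emits one change-event topic per captured table under the prefix (e.g. `corebank.bank.accounts`); GoldenGate and Qlik Replicate are configured differently but can target the same Kafka topics.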
Posted 2 months ago
10.0 - 15.0 years
25 - 32 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Skill Set: Confluent + Kafka Architect
Hiring Location: Bangalore/Chennai/Pune/Mumbai/Hyderabad
Notice Period: Immediate joiners, max 30 days
Experience: 10-12 Yrs
Interview Levels: 2 Internal / 1 CI
Shift Timings: 2 PM-11 PM (Sweden timing)

Job description
Confluent Kafka Architect (P3), for China IEB support. The team member will be part of a Kafka platform development team responsible for architecture, administration, and operations for Kafka clusters across all non-production and production environments. The role involves designing the best solutions, ensuring system reliability, optimizing performance, automating regular tasks, providing guidance to the team for onboarding applications, helping the team with upskilling/cross-training, and troubleshooting issues to maintain seamless Kafka operations.

Key Responsibilities
Architect, manage, and maintain Kafka clusters in non-prod and prod environments.
Lead high-level discussions with customers.
Propose the best solutions as per industry standards.
Conduct POCs and document the exercise.
Work with team members to solve their technical problems.
Handle high-priority discussions and drive meetings with vendors if required.

Skills and Abilities
Strong knowledge of Kafka architecture.
Hands-on experience with Kafka cluster setup using Zookeeper/KRaft.
Proficiency in working with Kafka connectors and ksqlDB.
Experience with Kubernetes for Kafka deployment and management.
Ability to create Docker images and work with containerized environments.
Proficiency in writing YAML configurations.
Strong troubleshooting and debugging skills.
Experience with monitoring tools like Datadog.
Posted 2 months ago
8.0 - 13.0 years
20 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Kafka Admin - Architecture
Posted 2 months ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Develop and maintain Kafka-based data pipelines for real-time processing. Implement Kafka producer and consumer applications for efficient data flow. Optimize Kafka clusters for performance, scalability, and reliability. Design and manage Grafana dashboards for monitoring Kafka metrics. Integrate Grafana with Elasticsearch or other data sources. Set up alerting mechanisms in Grafana for Kafka system health monitoring. Collaborate with DevOps, data engineers, and software teams. Ensure security and compliance in Kafka and Grafana implementations. Requirements: 8+ years of experience configuring Kafka, Elasticsearch, and Grafana. Strong understanding of Apache Kafka architecture and Grafana visualization. Proficiency in .NET or Python for Kafka development. Experience with distributed systems and message-oriented middleware. Knowledge of time-series databases and monitoring tools. Familiarity with data serialization formats like JSON. Expertise in Azure platforms and Kafka monitoring tools. Good problem-solving and communication skills. Mandate: create the Kafka dashboards, Python/.NET. Note: candidate must be an immediate joiner.
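Grafana alerting on Kafka health, as described above, typically keys off consumer lag. A small Python sketch of the arithmetic (the topic name and offsets are made-up sample values; a real deployment would pull them from the broker or a metrics exporter):

```python
def consumer_lag(end_offsets: dict, committed: dict) -> dict:
    """Per-partition lag = log-end offset minus last committed offset.
    Partitions with no committed offset count from offset 0."""
    return {tp: end - committed.get(tp, 0) for tp, end in end_offsets.items()}

def should_alert(lag: dict, threshold: int) -> bool:
    """A Grafana-style alert rule: fire when any partition's lag
    exceeds the threshold."""
    return any(v > threshold for v in lag.values())

# Usage: two partitions of a hypothetical "orders" topic.
lag = consumer_lag({("orders", 0): 1500, ("orders", 1): 900},
                   {("orders", 0): 1450, ("orders", 1): 900})
# partition 0 is 50 messages behind; partition 1 is caught up
```

Alerting on the max per-partition lag, rather than the sum, avoids one busy partition being masked by idle ones.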
Posted 2 months ago
3.0 - 7.0 years
9 - 14 Lacs
Gurugram
Remote
Kafka/MSK, Linux. In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors. Understanding of Kafka topic design and creation. Good knowledge of replication and high availability for Kafka systems. Elasticsearch/OpenSearch. Perks and benefits: PF, annual bonus, health insurance.
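Topic design starts with a partition count. One common back-of-envelope rule, sketched here with assumed throughput numbers, divides target throughput by measured per-partition throughput and adds headroom for growth:

```python
import math

def suggest_partitions(target_mb_s: float, per_partition_mb_s: float,
                       headroom: float = 1.5) -> int:
    """Back-of-envelope partition count: target throughput divided by the
    throughput a single partition sustained in a benchmark, padded with
    headroom. Always returns at least 1."""
    return max(1, math.ceil(target_mb_s / per_partition_mb_s * headroom))

# Usage: 100 MB/s target, 10 MB/s measured per partition.
suggest_partitions(100, 10)
```

Partition counts are easy to raise later but repartitioning keyed topics reorders keys, so erring slightly high up front is the usual trade-off.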
Posted 3 months ago
3.0 - 7.0 years
10 - 15 Lacs
Pune
Work from Office
Responsibilities: Manage Kafka clusters, brokers, and messaging architecture. Collaborate with development teams on data pipelines. Monitor Kafka performance and troubleshoot issues. Perks and benefits: health insurance, provident fund, annual bonus.
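One routine check behind "monitor Kafka performance" is spotting under-replicated partitions. A minimal sketch over hypothetical metadata dicts (a real check would read cluster metadata via the admin API or JMX metrics):

```python
def under_replicated(partitions: list) -> list:
    """Flag partitions whose in-sync replica set is smaller than the
    assigned replica set -- the classic 'a broker is falling behind' signal."""
    return [p["id"] for p in partitions if len(p["isr"]) < len(p["replicas"])]

# Usage: partition 1 has lost broker 3 from its ISR.
parts = [
    {"id": 0, "replicas": [1, 2, 3], "isr": [1, 2, 3]},
    {"id": 1, "replicas": [1, 2, 3], "isr": [1, 2]},
]
```

A non-empty result is usually the trigger for checking broker disk, network, or GC pauses on the missing replica.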
Posted 3 months ago
5.0 - 9.0 years
11 - 12 Lacs
Hyderabad, Mysuru, Chennai
Work from Office
Experienced Kafka Developer skilled in schema registry, cluster optimization, pipeline design, troubleshooting, and ensuring high availability, security, and compliance in data processing. Mail: kowsalya.k@srsinfoway.com
Posted 3 months ago
5.0 - 10.0 years
15 - 20 Lacs
Noida, Gurugram
Work from Office
Manage end-to-end activities including installation, configuration, and maintenance of Kafka clusters, managing topics configured in Kafka, and ensuring maximum uptime of Kafka. Monitor performance of producer and consumer threads interacting with Kafka. Required candidate profile: Kafka certification. Must have hands-on experience in managing large Kafka clusters and installations. Ability to monitor the performance of producer/consumer threads interacting with Kafka.
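Monitoring producer threads usually goes hand in hand with durability-oriented producer settings. An illustrative config dict using librdkafka/confluent-kafka property names (the helper function itself is hypothetical; the bootstrap address is a placeholder):

```python
def reliable_producer_config(bootstrap: str) -> dict:
    """Producer settings biased toward durability: idempotence plus
    acks=all avoids duplicates and lost acknowledgements when a broker
    fails over mid-send."""
    return {
        "bootstrap.servers": bootstrap,
        "enable.idempotence": True,   # implies acks=all and retries > 0
        "acks": "all",
        "linger.ms": 5,               # small batching window for throughput
        "compression.type": "lz4",
    }

# Usage: pass the dict to confluent_kafka.Producer(...) in a real client.
cfg = reliable_producer_config("kafka-0:9092")
```

The trade-off is latency: `acks=all` waits for the full in-sync replica set, which is exactly what a producer-thread performance monitor would surface.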
Posted 3 months ago