3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description: Stand up and administer on-premise Kafka clusters. Architect and create reference architectures and Kafka implementation standards. Provide expertise in Kafka brokers, Zookeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center. Ensure optimal performance, high availability, and stability of solutions. Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply best practices. Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms. Administer and operate the Kafka platform, including provisioning, access lists, and Kerberos and SSL configurations. Use automation and provisioning tools such as Docker, Jenkins, and GitLab. Perform data-related benchmarking, performance analysis, and tuning. Strong skills in in-memory applications, database design, and data integration. Participate in design and capacity review meetings to provide suggestions on Kafka usage. Solid knowledge of monitoring tools and fine-tuning alerts in Splunk, Prometheus, and Grafana. Set up security on Kafka. Provide naming conventions, backup and recovery, and problem-determination strategies for projects. Monitor, prevent, and troubleshoot security-related issues. Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure. QUALIFICATIONS: Demonstrated proficiency and experience in designing, implementing, monitoring, and troubleshooting Kafka messaging infrastructure. Hands-on experience with recovery in Kafka. 2 or more years of experience developing/customizing messaging-related monitoring tools/utilities. Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform). Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python). Considerable experience implementing Kerberos security.
Support a 24x7 model and be available for rotational on-call work. Competent working in one or more environments highly integrated with an operating system. Experience implementing and administering/managing technical solutions in major, large-scale system implementations. Strong critical-thinking skills to evaluate alternatives and present solutions consistent with business objectives and strategy. Ability to manage tasks independently and take ownership of responsibilities. Ability to learn from mistakes and apply constructive feedback to improve performance. Ability to adapt to a rapidly changing environment. Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and appropriate levels of assertiveness. Ability to communicate highly complex technical information clearly for all levels and audiences. Willingness to learn new technologies/tools and train peers. Proven track record of automation.
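The capacity-planning and benchmarking duties above usually start with a back-of-the-envelope sizing pass before any cluster is provisioned. A minimal sketch of that arithmetic, assuming a hypothetical workload (the message rate, size, retention window, and replication factor below are illustrative figures, not details from this posting):

```python
def broker_disk_estimate_gb(msgs_per_sec, avg_msg_bytes, retention_hours,
                            replication_factor, overhead=1.2):
    """Rough disk footprint of a Kafka cluster's retained log segments.

    overhead covers index files, inactive segments, and headroom
    (a 20% allowance is assumed here, not a Kafka-prescribed value).
    """
    bytes_total = (msgs_per_sec * avg_msg_bytes * retention_hours * 3600
                   * replication_factor * overhead)
    return bytes_total / 1e9

# Hypothetical workload: 10k msg/s of 1 KB messages, 72 h retention, RF=3
estimate = broker_disk_estimate_gb(10_000, 1_000, 72, 3)  # ~9331 GB total
```

Dividing the total by the broker count (and re-checking against per-disk throughput) then gives a first-cut per-broker disk target for the capacity review meetings mentioned above.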
Posted 1 week ago
3.0 - 10.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a skilled Integration Specialist, you will be responsible for interacting and collaborating with customers and partners to define the integration landscape. Your role will involve defining the logical sequence of integration activities for SaaS onboarding projects and coordinating with the product development team to implement recommended integration strategies. By improving the overall project delivery experience and go-live time through process and documentation enhancements, you will contribute to the success of integration projects. You will be supporting cloud infrastructure and system components required for integration while also taking the lead in the identification, isolation, resolution, and communication of issues within a client environment. To succeed in this role, you must have worked on at least one end-to-end SaaS implementation project and possess 3-10 years of application and data integration experience. Your experience with clustering and high availability configurations, along with Agile methodologies, will be beneficial. Designing an end-to-end scalable microservices-based integration solution is a must, in addition to having broad exposure to different technology stacks involved in a SaaS delivery model. Your knowledge and experience should encompass various aspects, including microservices design patterns, service orchestration, inter-service communication, data integration concepts and tools, network protocol stacks, security postures in integration technology stacks, API design, and API-based integration. Familiarity with Azure, AWS, and GCP public cloud platforms and their integration approaches is essential. Additionally, hands-on experience with the Kafka Connect Framework and skilled technical documentation abilities are required. As a solution designer at heart, you should be able to use modeling tools to create effective architecture views and possess strong organizational, analytical, critical thinking, and debugging skills. 
Excellent communication skills are vital for effectively articulating complex technical and functional requirements to project stakeholders. Being a self-starter who is willing to engage in all aspects of solution delivery, including implementation and process improvement, is key. A big-picture mindset is necessary to visualize the end-to-end solution of a project. Nice-to-have qualifications include domain knowledge of banking and financial institutions, experience with geographically distributed delivery and client teams, and hands-on experience with setting up and configuring Kafka brokers.
Posted 1 week ago
3.0 - 8.0 years
18 - 22 Lacs
Navi Mumbai, Mumbai (All Areas)
Work from Office
1. Education: B.E./B.Tech/MCA in Computer Science
2. Experience: Must have 7+ years of relevant experience in database administration.
3. Mandatory Skills/Knowledge: The candidate should be technically sound in multiple distributions (Cloudera, Confluent, open-source Kafka) and in Kafka and Zookeeper; well versed in capacity planning and performance tuning; experienced in implementing security across the ecosystem (Hadoop security: Ranger, Kerberos, SSL); experienced with DevOps tools such as Nagios, shell scripting, Python, Jenkins, Ansible, Git, and Maven to implement automation; and able to monitor, debug, and perform RCA for any service failure. Knowledge of network infrastructure (e.g., TCP/IP, DNS, firewalls, routers, load balancers). Creative analytical and problem-solving skills. Provide RCAs for critical and recurring incidents. Provide on-call service coverage within a larger group. Good aptitude in multi-threading and concurrency concepts.
4. Preferred Skills/Knowledge: Expert knowledge of database administration and architecture. Hands-on with operating system commands.
Kindly share CVs at snehal.sankade@outworx.com
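The monitoring, debugging, and RCA duties listed above frequently come down to watching consumer lag: the gap between a partition's log-end offset on the broker and the consumer group's committed offset. A minimal sketch of that arithmetic (all offsets below are hypothetical):

```python
def consumer_lag(log_end_offsets, committed_offsets):
    """Per-partition lag = broker log-end offset minus the group's
    committed offset (0 assumed if the group has never committed)."""
    return {p: log_end_offsets[p] - committed_offsets.get(p, 0)
            for p in log_end_offsets}

# Hypothetical offsets for three partitions of one topic
lag = consumer_lag({0: 1500, 1: 980, 2: 2100},
                   {0: 1500, 1: 950, 2: 1800})
total_lag = sum(lag.values())  # alert when this grows without bound
```

In practice these offsets come from tools like `kafka-consumer-groups.sh` or a metrics exporter feeding Nagios/Prometheus; a steadily growing total is the usual trigger for the RCA work described above.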
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Key Skills: Confluent Kafka, Kafka Connect, Schema Registry, Kafka Brokers, KSQL, KStreams, Java/J2EE, Troubleshooting, RCA, Production Support. Roles & Responsibilities: Design and develop Kafka pipelines. Perform unit testing of the code and prepare test plans as required. Analyze, design, and develop programs in a development environment. Support applications and jobs in the production environment for issues or failures. Develop operational documents for applications, including DFD, ICD, HLD, etc. Troubleshoot production issues and provide solutions within the defined SLA. Prepare RCA (Root Cause Analysis) documents for production issues. Provide permanent fixes for production issues. Experience Requirement: 5-10 years of experience working with Confluent Kafka. Hands-on experience with Kafka Connect using Schema Registry. Strong knowledge of Kafka brokers and KSQL. Familiarity with Kafka Control Center, Zookeeper, and KStreams is good to have. Experience with Java/J2EE is a plus. Education: B.E., B.Tech.
Posted 2 weeks ago
5.0 - 10.0 years
12 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Design and develop Kafka pipelines. Perform unit testing of the code and prepare test plans as required. Analyze, design, and develop programs in a development environment. Support applications and jobs in the production environment for abends or issues.
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Dubai, Pune, Chennai
Hybrid
Job Title: Confluent CDC System Analyst. Role Overview: A leading bank in the UAE is seeking an experienced Confluent Change Data Capture (CDC) System Analyst/Tech Lead to implement real-time data streaming solutions. The role involves implementing robust CDC frameworks using Confluent Kafka, ensuring seamless data integration between core banking systems and analytics platforms. The ideal candidate will have deep expertise in event-driven architectures, CDC technologies, and cloud-based data solutions. Key Responsibilities: Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems. Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance. Integrate CDC pipelines with core banking applications, databases, and enterprise systems. Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS). Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing. Optimize data replication, transformation, and enrichment using CDC tools such as Debezium, GoldenGate, or Qlik Replicate. Collaborate with the infrastructure team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives. Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures. Stay current with emerging technologies and drive innovation in real-time banking data solutions. Required Skills & Qualifications: Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions. Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry. Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate. Hands-on experience with IBM Analytics. Solid understanding of core banking systems, transactional databases, and financial data flows.
Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud). Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations. Strong experience in event-driven architectures, microservices, and API integrations. Familiarity with security protocols, compliance, and data governance in banking environments. Excellent problem-solving, leadership, and stakeholder communication skills.
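For the Debezium-based CDC pipelines this role centers on, a source connector is registered with Kafka Connect as a JSON document posted to its REST API. A hedged sketch assuming a MySQL source; the connector name, hostnames, table list, and secrets path below are placeholders for illustration, not details from this posting:

```python
import json

# Hypothetical Debezium MySQL source connector registration payload.
# Property keys follow Debezium 2.x conventions (topic.prefix,
# schema.history.internal.*); values are placeholders.
debezium_config = {
    "name": "corebanking-accounts-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "corebanking-db.internal",
        "database.port": "3306",
        "database.user": "cdc_user",
        "database.password": "${file:/secrets/db.properties:password}",
        "database.server.id": "5401",
        "topic.prefix": "corebanking",
        "table.include.list": "bank.accounts,bank.transactions",
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-history.corebanking",
    },
}

# Serialized body for POST /connectors on the Kafka Connect REST API
payload = json.dumps(debezium_config, indent=2)
```

Each committed row change in the included tables then lands on a `corebanking.bank.*` topic as a change event, which downstream sink connectors or stream processors consume.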
Posted 1 month ago
4.0 - 9.0 years
5 - 13 Lacs
Thane, Goregaon, Mumbai (All Areas)
Work from Office
Opening with a leading insurance company. **Looking for immediate joiners or candidates on up to 30 days' notice** Key Responsibilities: Kafka Infrastructure Management: Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security. Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, Zookeeper, and related components. Perform capacity planning and scaling of Kafka clusters based on application needs and growth. Data Pipeline Development: Develop and optimize Kafka data pipelines to support real-time data streaming and processing. Collaborate with internal application development and data engineering teams to integrate Kafka with various HDFC Life data sources. Implement and maintain schema registry and serialization/deserialization protocols (e.g., Avro, Protobuf). Security and Compliance: Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL). Documentation and Support: Create and maintain documentation for Kafka setup, configurations, and operational procedures. Collaboration: Provide technical support and guidance to application development teams regarding Kafka usage and best practices. Collaborate with stakeholders to ensure alignment with business objectives. Interested candidates may share their resume at snehal@topgearconsultants.com
Posted 1 month ago
3.0 - 7.0 years
9 - 14 Lacs
Gurugram
Remote
Kafka/MSK, Linux. In-depth understanding of Kafka broker configurations, Zookeeper, and connectors. Understanding of Kafka topic design and creation. Good knowledge of replication and high availability for Kafka systems. ElasticSearch/OpenSearch. Perks and benefits: PF, annual bonus, health insurance.
Posted 1 month ago
3.0 - 7.0 years
10 - 15 Lacs
Pune
Work from Office
Responsibilities: * Manage Kafka clusters, brokers & messaging architecture * Collaborate with development teams on data pipelines * Monitor Kafka performance & troubleshoot issues
Perks: Health insurance, Provident fund, Annual bonus
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad, Pune, Gurugram
Work from Office
GSPANN is looking for an experienced Kafka Developer with strong Java skills to join our growing team. If you have hands-on experience with Kafka components and are ready to work in a dynamic, client-facing environment, we'd love to hear from you! Key Responsibilities: Develop and maintain Kafka Producers, Consumers, Connectors, KStream, and KTable. Collaborate with stakeholders to gather requirements and deliver customized solutions. Troubleshoot production issues and participate in Agile ceremonies. Optimize system performance and support deployments. Mentor junior team members and ensure coding best practices. Required Skills: 4+ years of experience as a Kafka Developer. Proficiency in Java. Strong debugging skills (Splunk experience is a plus). Experience in client-facing projects. Familiarity with Agile and DevOps practices. Good to Have: Knowledge of Google Cloud Platform (Dataflow, BigQuery, Kubernetes). Experience with production support and monitoring tools. Ready to join a collaborative and innovative team? Send your CV to heena.ruchwani@gspann.com
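The producer/consumer development above rests on Kafka's at-least-once delivery contract: a consumer commits its offset only after processing succeeds, so a crash replays the failed record rather than silently losing it. A minimal sketch of that commit discipline using an in-memory stand-in for a partition (no broker required; names are illustrative, not a client-library API):

```python
def consume_at_least_once(records, process, committed_offset=0):
    """Process records from committed_offset onward; return the new
    committed offset.

    If process() raises midway, the offset is never advanced past the
    failure, so the failed record is re-delivered on restart: possible
    duplicates, but no silent loss (at-least-once semantics).
    """
    for offset in range(committed_offset, len(records)):
        process(records[offset])
        committed_offset = offset + 1  # commit only AFTER success
    return committed_offset

out = []
# Resume from a previously committed offset of 1: record "a" is skipped
new_offset = consume_at_least_once(["a", "b", "c"], out.append,
                                   committed_offset=1)
```

Downstream handlers therefore need to be idempotent (or deduplicate by key), since a crash between processing and commit replays the last record.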
Posted 1 month ago
5.0 - 10.0 years
7 - 14 Lacs
Mumbai, Goregaon, Mumbai (All Areas)
Work from Office
Opening with an insurance company. **Looking for someone with a 30-day notice period** Location: Mumbai (Lower Parel). Key Responsibilities: Kafka Infrastructure Management: Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security. Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, Zookeeper, and related components. Perform capacity planning and scaling of Kafka clusters based on application needs and growth. Data Pipeline Development: Develop and optimize Kafka data pipelines to support real-time data streaming and processing. Collaborate with internal application development and data engineering teams to integrate Kafka with various HDFC Life data sources. Implement and maintain schema registry and serialization/deserialization protocols (e.g., Avro, Protobuf). Security and Compliance: Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL). Documentation and Support: Create and maintain documentation for Kafka setup, configurations, and operational procedures. Collaboration: Provide technical support and guidance to application development teams regarding Kafka usage and best practices. Collaborate with stakeholders to ensure alignment with business objectives. Interested candidates may share their resume at snehal@topgearconsultants.com
Posted 1 month ago
7.0 - 12.0 years
5 - 15 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Role & responsibilities: • Work in an on-prem environment to manage and maintain the data architecture. • Configure and manage KRaft/Zookeeper. • Perform Kafka topic configuration, create and test topics, and validate data flow from source to Kafka using sample tables. • Ingest data from different databases. • Deploy sink connectors and map Kafka topics to ODS tables with upsert logic. • Set up Kafka high availability. • Manage Kafka through Provectus. • Demonstrate knowledge of Kubernetes and virtual machines. Preferred Skills: • Strong analytical and problem-solving skills. • Excellent communication and teamwork abilities. • Ability to work independently and manage multiple tasks.
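The topic-to-ODS upsert mapping above can be sketched with plain dictionaries: each change event inserts or updates a row by primary key, while a tombstone (null payload) deletes it, which is how compacted Kafka topics conventionally signal deletion. All table and field names below are hypothetical:

```python
def upsert_into_ods(ods_table, change_events, key="id"):
    """Apply Kafka change events to an in-memory stand-in for an ODS
    table keyed on a primary key.

    A None payload (tombstone) deletes the row; any other payload is
    insert-or-update ("upsert") on that key.
    """
    for event in change_events:
        k = event[key]
        if event.get("payload") is None:
            ods_table.pop(k, None)   # tombstone -> delete
        else:
            ods_table[k] = event["payload"]  # insert or update
    return ods_table

ods = {1: {"balance": 100}}
events = [
    {"id": 1, "payload": {"balance": 250}},  # update existing row
    {"id": 2, "payload": {"balance": 75}},   # insert new row
    {"id": 1, "payload": None},              # tombstone -> delete row 1
]
result = upsert_into_ods(ods, events)
```

A real sink connector expresses the same semantics declaratively (e.g., a JDBC sink with upsert mode and delete-on-null enabled) rather than in hand-written code.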
Posted 1 month ago
5.0 - 10.0 years
15 - 20 Lacs
Noida, Gurugram
Work from Office
Manage end-to-end activities including installation, configuration & maintenance of Kafka clusters, managing topics configured in Kafka, and ensuring maximum uptime of Kafka. Monitor the performance of producer & consumer threads interacting with Kafka. Required Candidate profile: Kafka certification. Must have hands-on experience in managing large Kafka clusters and installations. Ability to monitor the performance of producer and consumer threads interacting with Kafka.
Posted 1 month ago
6.0 - 11.0 years
12 - 30 Lacs
Hyderabad
Work from Office
Proficient in Java 8 and Kafka. Must have experience with JUnit test cases. Strong in Spring Boot, Microservices, SQL, ActiveMQ & RESTful APIs.
Posted 2 months ago
4.0 - 7.0 years
8 - 13 Lacs
Pune
Work from Office
Responsibilities: * Monitor Kafka clusters for performance & availability * Manage Kafka broker instances & replication strategies * Collaborate with dev teams on data pipeline design & implementation
Perks: Food allowance, Health insurance, Provident fund, Annual bonus
Posted 2 months ago
8.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Pune, Chennai
Work from Office
Design & develop data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink. Build and configure Kafka connectors to ingest data from various sources (databases, APIs, message queues,
Posted 2 months ago
5.0 - 10.0 years
10 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & responsibilities: Administer and maintain Apache Kafka clusters, including installation, upgrades, configuration, and performance tuning. Design and implement Kafka topics, partitions, replication, and consumer groups. Ensure high availability and scalability of Kafka infrastructure in production environments. Monitor Kafka health and performance using tools like Prometheus, Grafana, Confluent Control Center, etc. Implement and manage security configurations such as SSL/TLS, authentication (Kerberos/SASL), and access control. Collaborate with development teams to design and configure Kafka-based integrations and data pipelines. Perform root cause analysis of production issues and ensure timely resolution. Create and maintain documentation for Kafka infrastructure and configurations. Required Skills: Strong expertise in Kafka administration, including hands-on experience with open-source and/or Confluent Kafka. Experience with Kafka ecosystem tools (Kafka Connect, Kafka Streams, Schema Registry). Proficiency in Linux-based environments and scripting (Bash, Python). Experience with monitoring/logging tools and Kafka performance optimization. Ability to work independently and proactively manage Kafka environments. Familiarity with DevOps tools and CI/CD pipelines (e.g., Jenkins, Git, Ansible). Preferred Skills: Experience with the Kafka services of cloud platforms (AWS, GCP, or Azure). Knowledge of messaging alternatives like RabbitMQ, Pulsar, or ActiveMQ. Working knowledge of Docker and Kubernetes for Kafka deployment.
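The SSL/TLS and SASL hardening described above ultimately surfaces as client properties. A hedged sketch using librdkafka-style property names (the convention followed by confluent-kafka clients); the broker addresses, username, and file paths are placeholders, not values from this posting:

```python
# Hypothetical client properties for SASL_SSL with SCRAM authentication.
secure_client_config = {
    "bootstrap.servers": "broker1:9093,broker2:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-512",
    "sasl.username": "app-user",
    "sasl.password": "REDACTED",
    "ssl.ca.location": "/etc/kafka/ca.pem",
}

def validate(config):
    """Fail fast if SASL credentials are configured without TLS:
    with a plaintext protocol the password would cross the wire
    unencrypted."""
    if config.get("sasl.mechanism") and \
            "SSL" not in config.get("security.protocol", ""):
        raise ValueError("SASL credentials would travel in cleartext")
    return True

ok = validate(secure_client_config)
```

A small pre-flight check like `validate` is a cheap guard in deployment tooling; the same properties map one-to-one onto the `sasl.*`/`ssl.*` settings in broker and client configuration files.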
Posted 2 months ago
3 - 8 years
8 - 18 Lacs
Gurugram
Remote
Kafka/MSK, Linux. In-depth understanding of Kafka broker configurations, Zookeeper, and connectors. Understanding of Kafka topic design and creation. Good knowledge of replication and high availability for Kafka systems. ElasticSearch/OpenSearch.
Posted 2 months ago
4 - 8 years
10 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Warm greetings from SP Staffing Services Pvt Ltd! Experience: 4-7 years. Work Location: Chennai/Hyderabad/Bangalore. Interested candidates, kindly share your updated resume with gokul.priya@spstaffing.in or contact us (WhatsApp: 9360311230) to proceed further. Job Description: Confluent Kafka platform setup, maintenance, and upgrades. Hands-on experience with Kafka brokers. Hands-on experience with Schema Registry. Hands-on experience with ksqlDB and understanding of its underlying implementation and functions. Hands-on experience with Kafka connectors and understanding of the underlying implementation. Proficient understanding of Kafka producer and consumer client functioning. Experience with Kafka deployment in Azure Kubernetes Service. Experience working in the Azure cloud.
Posted 2 months ago