5.0 - 10.0 years
7 - 14 Lacs
Mumbai, Goregaon, Mumbai (All Areas)
Work from Office
Opening for an insurance company. **Looking for someone with a 30-day notice period.** Location: Mumbai (Lower Parel)
Key Responsibilities:
Kafka Infrastructure Management: Design, implement, and manage Kafka clusters to ensure high availability, scalability, and security. Monitor and maintain Kafka infrastructure, including topics, partitions, brokers, ZooKeeper, and related components. Perform capacity planning and scaling of Kafka clusters based on application needs and growth.
Data Pipeline Development: Develop and optimize Kafka data pipelines to support real-time data streaming and processing. Collaborate with internal application development and data engineers to integrate Kafka with various HDFC Life data sources. Implement and maintain schema registry and serialization/deserialization protocols (e.g., Avro, Protobuf).
Security and Compliance: Implement security best practices for Kafka clusters, including encryption, access control, and authentication mechanisms (e.g., Kerberos, SSL).
Documentation and Support: Create and maintain documentation for Kafka setup, configurations, and operational procedures.
Collaboration: Provide technical support and guidance to application development teams regarding Kafka usage and best practices. Collaborate with stakeholders to ensure alignment with business objectives.
Interested candidates may share their resume at snehal@topgearconsultants.com.
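For illustration, a minimal sketch of the kind of topic-administration task this posting describes, using the Kafka Java AdminClient. The broker address, topic name, partition count, and retention settings are placeholder assumptions for the example, not this employer's actual configuration.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Map;
import java.util.Properties;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical broker address; replace with the cluster's real bootstrap servers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions and replication factor 3 are illustrative sizing choices.
            NewTopic topic = new NewTopic("policy-events", 6, (short) 3)
                    .configs(Map.of("min.insync.replicas", "2",
                                    "retention.ms", "604800000")); // 7 days
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}
```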
Posted 6 days ago
8.0 - 13.0 years
25 - 35 Lacs
Chennai
Hybrid
1. Objective
We are seeking a highly experienced and visionary Expert Platform Lead with 10+ years of expertise in Confluent Kafka administration, cloud-native infrastructure, and enterprise-scale streaming architecture. This role involves overseeing Kafka platform strategy, optimizing infrastructure through automation, ensuring cost-effective scalability, and working closely with cross-functional teams to enable high-performance data streaming solutions. The ideal candidate will drive innovation, establish best practices, and mentor teams to enhance platform reliability and efficiency.
2. Main tasks
Kafka Platform Management: Define and execute platform strategy for Confluent Kafka, ensuring security, high availability, and scalability. Lead architecture design reviews, influencing decisions related to Kafka infrastructure and cloud deployment models. Oversee and maintain the Kafka platform in a 24/7 operational setting, ensuring high availability and fault tolerance. Establish monitoring frameworks, proactively identifying and addressing platform inefficiencies.
Leadership, Collaboration and Support: Act as the primary technical authority on Kafka for enterprise-wide streaming architecture. Collaborate closely with application teams, architects, and vendors to align platform capabilities with business needs. Provide technical mentorship to engineers and architects, guiding best practices in Kafka integration and platform usage.
Infrastructure Automation and Optimization: Spearhead Infrastructure as Code (IaC) initiatives using Terraform for Kafka, AWS, and cloud resources. Drive automation across provisioning, deployment workflows, and maintenance operations, ensuring efficiency and resilience. Implement advanced observability measures to optimize costs and resource allocation while maintaining peak performance.
Governance, Documentation and Compliance: Maintain detailed platform documentation, including configuration, security policies, and compliance standards. Track and analyze usage trends, ensuring cost-efficient resource utilization across streaming ecosystems. Establish governance frameworks, ensuring compliance with enterprise security policies and industry standards.
3. Technical expertise
Education level: Minimum a four-year Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Required expertise for the function: 10+ years of experience in platform engineering, cloud infrastructure, and data streaming architectures. Extensive expertise in Kafka administration (preferably Confluent Kafka), leading enterprise-wide streaming initiatives. Proven track record in leading critical incident response and ensuring system uptime in a 24/7 environment.
Knowledge of languages (depending on the office): English (mandatory).
Technical knowledge required to perform the function:
Technical Skills: Expert knowledge of Kafka (Confluent), event-driven architectures, and high-scale distributed systems. Mastery of Terraform for infrastructure automation across AWS, Kubernetes, and cloud-native ecosystems. Strong proficiency in AWS services, networking principles, and security best practices. Advanced experience with CI/CD pipelines, version control (Git), and scripting (Bash, Python).
Soft Skills: Strategic problem-solving mindset, capable of leading large-scale technical decisions. Strong leadership and mentorship skills, able to guide teams toward technical excellence. Excellent communication, stakeholder management, and cross-functional collaboration abilities.
Preferred Skills: Kafka or Confluent certification, demonstrating deep platform expertise. AWS Solutions Architect certification or equivalent cloud specialization. Experience with monitoring tools (Prometheus, Grafana) and proactive alert management for 24/7 operations.
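As an illustration of the "proactive monitoring" and alerting work this role describes, a small sketch that computes consumer-group lag with the Java AdminClient. The broker address and group id are hypothetical; in practice an exporter would feed numbers like these into Prometheus/Grafana, but the underlying calculation is the same.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

public class ConsumerLagSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // assumed address
        String groupId = "billing-consumers";                                  // hypothetical group

        try (AdminClient admin = AdminClient.create(props)) {
            // Committed offsets per partition for the consumer group.
            Map<TopicPartition, OffsetAndMetadata> committed =
                    admin.listConsumerGroupOffsets(groupId)
                         .partitionsToOffsetAndMetadata().get();

            // Latest (end) offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> request = committed.keySet().stream()
                    .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> ends =
                    admin.listOffsets(request).all().get();

            // Lag = end offset minus committed offset, per partition.
            committed.forEach((tp, meta) -> {
                long lag = ends.get(tp).offset() - meta.offset();
                System.out.printf("%s lag=%d%n", tp, lag);
            });
        }
    }
}
```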
Posted 1 week ago
3.0 - 6.0 years
10 - 17 Lacs
Pune
Remote
Kafka/MSK, Linux
In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors.
Understanding of Kafka topic design and creation.
Good knowledge of replication and high availability for Kafka systems.
ElasticSearch/OpenSearch
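As a concrete example of the replication and high-availability knowledge asked for here, a small AdminClient sketch that inspects a topic's replicas and in-sync replicas (an ISR smaller than the replica set indicates an under-replicated partition). Broker address and topic name are placeholders; the call assumes Kafka clients 3.1+.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.TopicDescription;

import java.util.Collections;
import java.util.Properties;

public class ReplicationCheckSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "msk-broker:9092"); // assumed

        try (AdminClient admin = AdminClient.create(props)) {
            // allTopicNames() is available in Kafka clients 3.1+; older clients use all().
            TopicDescription desc = admin.describeTopics(Collections.singleton("orders"))
                                         .allTopicNames().get().get("orders");
            desc.partitions().forEach(p ->
                    System.out.printf("partition=%d leader=%s replicas=%d isr=%d%n",
                            p.partition(), p.leader(), p.replicas().size(), p.isr().size()));
        }
    }
}
```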
Posted 1 week ago
5.0 - 10.0 years
15 - 20 Lacs
Noida, Gurugram
Work from Office
Manage end-to-end activities including installation, configuration & maintenance of Kafka clusters, managing topics configured in Kafka, and ensuring maximum uptime of Kafka. Monitor the performance of producer & consumer threads interacting with Kafka.
Required Candidate Profile: Kafka certification. Must have hands-on experience in managing large Kafka clusters and installations. Ability to monitor the performance of producer and consumer threads interacting with Kafka.
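To make the producer/consumer performance monitoring mentioned above concrete, a minimal sketch that reads a producer's built-in client metrics in-process. The broker address is a placeholder; the same metric names ("record-send-rate", "request-latency-avg") are also exposed over JMX for external monitoring tools.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class ProducerMetricsSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // assumed address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Every Kafka client exposes its metrics map; print two standard producer metrics.
            producer.metrics().forEach((name, metric) -> {
                if (name.name().equals("record-send-rate") || name.name().equals("request-latency-avg")) {
                    System.out.printf("%s = %s%n", name.name(), metric.metricValue());
                }
            });
        }
    }
}
```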
Posted 1 week ago
5.0 - 10.0 years
5 - 10 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities: Looking for 5+ years of experience as a Confluent Kafka Administrator / Technology Lead.
Kafka Administrator Required Skills & Experience:
Hands-on experience in Kafka cluster management
Proficiency with Kafka Connect
Knowledge of Cluster Linking and MirrorMaker
Experience setting up Kafka clusters from scratch
Experience with Terraform/Ansible scripts
Ability to install and configure the Confluent Platform
Understanding of rebalancing, Schema Registry, and REST Proxies
Familiarity with RBAC (Role-Based Access Control) and ACLs (Access Control Lists)
Interested candidates may share their updated resume at recruiter.wtr26@walkingtree.in.
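For the ACL familiarity listed above, a minimal sketch of granting a topic-level ACL with the standard Java AdminClient. Confluent RBAC role bindings are managed through Confluent's own tooling instead, so this only illustrates the ACL side; the principal, topic, and broker address are made-up placeholders.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

import java.util.Collections;
import java.util.Properties;

public class AclSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // assumed address

        try (AdminClient admin = AdminClient.create(props)) {
            // Allow a hypothetical service principal to read the "payments" topic from any host.
            AclBinding readPayments = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "payments", PatternType.LITERAL),
                    new AccessControlEntry("User:reporting-svc", "*",
                            AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(Collections.singleton(readPayments)).all().get();
        }
    }
}
```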
Posted 1 week ago
6.0 - 11.0 years
12 - 30 Lacs
Hyderabad
Work from Office
Proficient in Java 8 and Kafka. Must have experience with JUnit test cases. Good knowledge of Spring Boot, Microservices, SQL, ActiveMQ & RESTful APIs.
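A minimal Java 8 producer sketch of the Kafka skill this posting asks for; in a Spring Boot service this would usually go through KafkaTemplate, but a plain-client example keeps it self-contained. Broker address, topic, and payload are illustrative assumptions.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for the full in-sync replica set

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-42", "{\"status\":\"CREATED\"}"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("written to %s-%d@%d%n",
                                    metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
        }
    }
}
```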
Posted 1 week ago
4.0 - 8.0 years
27 - 42 Lacs
Hyderabad
Work from Office
Job Summary
We are looking for an experienced Infra Dev Specialist with 4 to 8 years of experience to join our team. The ideal candidate will have expertise in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka. This role involves working in a hybrid model with day shifts and does not require travel. The candidate will play a crucial role in developing and maintaining our infrastructure to ensure seamless data flow and integration.
Responsibilities
Develop and maintain infrastructure solutions using KSQL, Kafka Schema Registry, Kafka Connect, and Kafka.
Oversee the implementation of data streaming and integration solutions to ensure high availability and performance.
Provide technical support and troubleshooting for Kafka-related issues to minimize downtime and ensure data integrity.
Collaborate with cross-functional teams to design and implement scalable and reliable data pipelines.
Monitor and optimize the performance of Kafka clusters to meet the demands of the business.
Ensure compliance with security and data governance policies while managing Kafka infrastructure.
Implement best practices for data streaming and integration to enhance system efficiency.
Conduct regular reviews and updates of the infrastructure to align with evolving business needs.
Provide training and support to team members on Kafka-related technologies and best practices.
Develop and maintain documentation for infrastructure processes and configurations.
Participate in code reviews and contribute to the continuous improvement of the development process.
Stay updated with the latest trends and advancements in Kafka and related technologies.
Contribute to the overall success of the team by delivering high-quality infrastructure solutions.
Qualifications
Possess strong experience in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka.
Demonstrate a solid understanding of data streaming and integration concepts.
Have a proven track record of troubleshooting and resolving Kafka-related issues.
Show expertise in designing and implementing scalable data pipelines.
Exhibit knowledge of security and data governance practices in managing Kafka infrastructure.
Display proficiency in monitoring and optimizing Kafka cluster performance.
Have experience in providing technical support and training to team members.
Be skilled in developing and maintaining infrastructure documentation.
Stay informed about the latest trends in Kafka and related technologies.
Possess excellent communication and collaboration skills.
Have a proactive approach to problem-solving and continuous improvement.
Demonstrate the ability to work effectively in a hybrid work model.
Show commitment to delivering high-quality infrastructure solutions.
Certifications Required: Certified Apache Kafka Developer
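Since the posting pairs KSQL with the broader Kafka stack, here is a hedged sketch of the same idea expressed programmatically with the Kafka Streams Java API (the comment shows the roughly equivalent ksqlDB statement). Topic names, the application id, and the string-based filter are assumptions; a real pipeline would typically use a JSON or Avro serde instead of a raw string check.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class HighValuePaymentsStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "high-value-payments"); // assumed app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed brokers

        StreamsBuilder builder = new StreamsBuilder();
        // Roughly the Streams-API counterpart of a ksqlDB statement such as:
        //   CREATE STREAM payments_high_value AS SELECT * FROM payments WHERE high_value = true;
        KStream<String, String> payments =
                builder.stream("payments", Consumed.with(Serdes.String(), Serdes.String()));
        payments.filter((key, value) -> value.contains("\"high_value\":true")) // stand-in for a real serde
                .to("payments-high-value", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```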
Posted 2 weeks ago
4.0 - 7.0 years
8 - 13 Lacs
Pune
Work from Office
Responsibilities:
* Monitor Kafka clusters for performance & availability
* Manage Kafka broker instances & replication strategies
* Collaborate with dev teams on data pipeline design & implementation
Benefits: Food allowance, health insurance, provident fund, annual bonus
Posted 2 weeks ago
5.0 - 10.0 years
6 - 11 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities: Kafka Administrator
Required Skills & Experience: Looking for around 5+ years of experience.
Hands-on experience in Kafka cluster management
Proficiency with Kafka Connect
Knowledge of Cluster Linking and MirrorMaker
Experience setting up Kafka clusters from scratch
Experience with Terraform/Ansible scripts
Ability to install and configure the Confluent Platform
Understanding of rebalancing, Schema Registry, and REST Proxies
Familiarity with RBAC (Role-Based Access Control) and ACLs (Access Control Lists)
Please share your updated resume at recruiter.wtr26@walkingtree.in.
Posted 2 weeks ago
5.0 - 10.0 years
10 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & responsibilities:
Administer and maintain Apache Kafka clusters, including installation, upgrades, configuration, and performance tuning.
Design and implement Kafka topics, partitions, replication, and consumer groups.
Ensure high availability and scalability of Kafka infrastructure in production environments.
Monitor Kafka health and performance using tools like Prometheus, Grafana, Confluent Control Center, etc.
Implement and manage security configurations such as SSL/TLS, authentication (Kerberos/SASL), and access control.
Collaborate with development teams to design and configure Kafka-based integrations and data pipelines.
Perform root cause analysis of production issues and ensure timely resolution.
Create and maintain documentation for Kafka infrastructure and configurations.
Required Skills:
Strong expertise in Kafka administration, including hands-on experience with open-source and/or Confluent Kafka.
Experience with Kafka ecosystem tools (Kafka Connect, Kafka Streams, Schema Registry).
Proficiency in Linux-based environments and scripting (Bash, Python).
Experience with monitoring/logging tools and Kafka performance optimization.
Ability to work independently and proactively manage Kafka environments.
Familiarity with DevOps tools and CI/CD pipelines (e.g., Jenkins, Git, Ansible).
Preferred Skills:
Experience with managed Kafka services on cloud platforms (AWS, GCP, or Azure).
Knowledge of messaging alternatives like RabbitMQ, Pulsar, or ActiveMQ.
Working knowledge of Docker and Kubernetes for Kafka deployment.
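To illustrate the SSL/TLS and SASL (Kerberos) client-side configuration this posting mentions, a hedged sketch that assembles secure client properties in Java. Every path, principal, password, and broker address is a placeholder, and a working setup additionally needs the matching keytab, truststore, and broker-side listener configuration.

```java
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.config.SslConfigs;

import java.util.Properties;

public class SecureClientConfigSketch {
    public static Properties secureClientProps() {
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093"); // assumed TLS listener
        // Encrypt in transit and authenticate over SASL. GSSAPI (Kerberos) is shown;
        // SCRAM or OAUTHBEARER use the same keys with a different mechanism and JAAS module.
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "GSSAPI");
        props.put(SaslConfigs.SASL_KERBEROS_SERVICE_NAME, "kafka");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "com.sun.security.auth.module.Krb5LoginModule required "
                + "useKeyTab=true keyTab=\"/etc/security/keytabs/app.keytab\" " // placeholder path
                + "principal=\"app@EXAMPLE.COM\";");                            // placeholder principal
        // Truststore holding the cluster CA; location and password are placeholders.
        props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/kafka/secrets/client.truststore.jks");
        props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "changeit");
        return props;
    }

    public static void main(String[] args) {
        secureClientProps().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```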
Posted 2 weeks ago
5.0 - 10.0 years
16 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Detailed job description - Skill Set:
Proven experience as a Kafka Developer
Knowledge of Kafka schemas and use of the Schema Registry
Strong knowledge of Kafka and other big data technologies
Best practices to optimize the Kafka ecosystem based on use case and workload
Knowledge of Kafka clustering and its fault-tolerance model supporting high availability
Strong fundamentals in Kafka client configuration and troubleshooting
Designing and implementing data pipelines using Apache Kafka
Develop and maintain Kafka-based data pipelines
Monitor and optimize Kafka clusters
Troubleshoot and resolve issues related to Kafka and data processing
Ensure data security and compliance with industry standards
Create and maintain documentation for Kafka configurations and processes
Implement best practices for Kafka architecture and operations
Mandatory Skills (only 2 or 3): Kafka Developer
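For the schema and Schema Registry knowledge listed above, a hedged sketch of an Avro producer that relies on Confluent's KafkaAvroSerializer (requires the io.confluent:kafka-avro-serializer dependency in addition to the Kafka client). The schema, topic, Schema Registry URL, and broker address are illustrative assumptions.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class AvroProducerSketch {
    private static final String USER_SCHEMA =
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"string\"},"
            + "{\"name\":\"age\",\"type\":\"int\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");     // assumed brokers
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Confluent's Avro serializer registers/validates the schema against Schema Registry.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");                // assumed registry

        Schema schema = new Schema.Parser().parse(USER_SCHEMA);
        GenericRecord user = new GenericData.Record(schema);
        user.put("id", "u-1001");
        user.put("age", 34);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("users", "u-1001", user));
            producer.flush();
        }
    }
}
```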
Posted 3 weeks ago
3 - 8 years
10 - 20 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Job Description:
Stand up and administer on-premise Kafka clusters.
Ability to architect and create reference architectures and Kafka implementation standards.
Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center.
Ensure optimum performance, high availability, and stability of solutions.
Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices.
Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms (a sample consumer stub follows this posting).
Provide administration and operations of the Kafka platform, such as provisioning, access lists, Kerberos, and SSL configurations.
Use automation tools for provisioning, such as Docker, Jenkins, and GitLab.
Ability to perform data-related benchmarking, performance analysis, and tuning.
Strong skills in in-memory applications, database design, and data integration.
Be involved in design and capacity review meetings to provide suggestions on Kafka usage.
Solid knowledge of monitoring tools and fine-tuning alerts on Splunk, Prometheus, and Grafana.
Setting up security on Kafka.
Providing naming conventions, backup & recovery, and problem determination strategies for the projects.
Monitor, prevent, and troubleshoot security-related issues.
Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure.
QUALIFICATIONS
Demonstrated proficiency and experience in the design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure.
Hands-on experience with recovery in Kafka.
2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities.
Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
Considerable experience in implementing Kerberos security.
Support a 24x7 model and be available for rotational on-call work.
Competent working in one or more environments highly integrated with an operating system.
Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
High critical-thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy.
Ability to manage tasks independently and take ownership of responsibilities.
Ability to learn from mistakes and apply constructive feedback to improve performance.
Ability to adapt to a rapidly changing environment.
Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and displaying appropriate levels of assertiveness.
Ability to communicate highly complex technical information clearly and articulately for all levels and audiences.
Willingness to learn new technologies/tools and train your peers.
Proven track record of automation.
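As referenced above, a minimal Java consumer stub of the kind this role would hand to application teams during onboarding. Broker address, group id, and topic are hypothetical, and the poll loop is bounded only to keep the stub finite.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ConsumerStub {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed brokers
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "onboarding-demo-group");   // hypothetical group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));      // hypothetical topic
            for (int i = 0; i < 10; i++) {                                    // bounded loop for the stub
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s-%d@%d key=%s value=%s%n",
                            record.topic(), record.partition(), record.offset(),
                            record.key(), record.value());
                }
            }
        }
    }
}
```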
Posted 2 months ago
5 - 7 years
15 - 20 Lacs
Bengaluru
Work from Office
IBM Event Streams (Kafka): platform consulting, platform architecture, and implementation. IBM Event Streams (Kafka) clusters, security, disaster recovery, data pipelines, data replication, and performance optimization.
Posted 2 months ago
7 - 10 years
20 - 22 Lacs
Chennai, Pune, Noida
Work from Office
Experience in Java, Apache Kafka, Streams, clusters, application development, topic management, data pipeline development, producer & consumer implementation, integration & connectivity, cluster administration, security & compliance, and Apache Zookeeper.
Required Candidate Profile: 7-10 years of experience covering Kafka expertise, programming skills, big data & streaming technologies, database knowledge, cloud & DevOps, event-driven architecture, security & scalability, and problem solving & teamwork.
Posted 2 months ago
3 - 6 years
4 - 9 Lacs
Trivandrum
Work from Office
Job Title: Database Administrator
Role Description: We are looking for an experienced Database Administrator (DBA) to manage, optimize, and secure AI-driven, cloud-hosted relational and NoSQL databases. The ideal candidate should be proficient in database architecture, performance tuning, security, and high availability solutions, with expertise in cloud-based databases and AI-powered data processing. The DBA will work closely with development, DevOps, and infrastructure teams to ensure efficient data management.
Key Responsibilities: Design, implement, and maintain highly available and scalable databases. Optimize SQL queries, indexing strategies, and database schemas for performance. Perform database tuning, monitoring, and troubleshooting. Implement replication, partitioning, and sharding to improve efficiency. Work with cloud-hosted database solutions (AWS RDS, Azure SQL, Google Cloud Spanner). Implement database security policies, user roles, permissions, and access control. Set up automated backup and disaster recovery plans to ensure data integrity. Ensure compliance with GDPR, ISO, HIPAA, and other data regulations. Handle installation, upgrades, and patching of database systems. Work with CI/CD pipelines to automate database deployments.
Required Skills & Knowledge: 4+ years of experience as a Database Administrator. Proficiency in MySQL, PostgreSQL, MongoDB, SQL Server, and Oracle. Experience with query optimization, execution plans, and indexing. Knowledge of high availability, replication, and clustering. Familiarity with AI-powered predictive analytics for data processing. Hands-on experience with data pipelines and streaming technologies (Kafka, Snowflake, BigQuery). Strong security and compliance awareness. Scripting and automation skills in Shell, Python, or PowerShell.
Preferred Qualifications: Bachelor's degree in Computer Science, IT, or a related field. Certifications such as Oracle Certified DBA, AWS Database Specialty, or Microsoft SQL Server Certification. Experience with NoSQL databases like Redis, Cassandra, or DynamoDB. Familiarity with big data tools (Hadoop, Spark, Elasticsearch).
Posted 2 months ago
7 - 11 years
13 - 19 Lacs
Chennai, Pune, Delhi NCR
Work from Office
Role & responsibilities: Urgent hiring with one of our reputed MNC clients for a Kafka Developer.
Experience: 7-11 years. Only immediate joiners. Location: Pune, Chennai, Noida.
Educational Background: A degree in Computer Science, IT, or a related field.
Kafka Expertise: Strong knowledge of Kafka architecture, brokers, producers, consumers, and stream processing.
Programming Skills: Proficiency in Java, Scala, or Python for developing Kafka-based applications.
Big Data & Streaming Technologies: Experience with Spark, Flink, or Apache Storm is a plus.
Database Knowledge: Familiarity with SQL and NoSQL databases like Cassandra, MongoDB, or PostgreSQL.
Cloud & DevOps: Experience with cloud platforms (AWS, Azure, GCP) and Kubernetes/Docker.
Event-Driven Architecture: Understanding of event-driven and microservices architectures.
Monitoring & Debugging: Experience with Kafka monitoring tools like Confluent Control Center, Kafka Manager, or the ELK stack.
Security & Scalability: Knowledge of Kafka security, access control, and scaling strategies.
Problem-Solving & Communication: Strong analytical skills and ability to work in cross-functional teams.
Preferred candidate profile: Kafka application development, data pipeline development, producer & consumer implementation, integration & connectivity, performance optimization, security & compliance, cluster administration, monitoring & logging, documentation.
Perks and benefits
Posted 2 months ago
5 - 8 years
6 - 13 Lacs
Noida
Work from Office
POSITION: SENIOR ENGINEER - DEVOPS
Company: FCI-CCM
Experience: 5+ years of relevant experience
Work Location: Noida, Sector 126
Work Timings: General day shift
Compensation: As per market standards
Company Website: www.fci-ccm.com
About us: FCI-CCM, founded in 1959, is a global provider of Customer Communication Management solutions, enabling businesses to create personalized and engaging communications for their customers. With over six decades of experience in handling physical and digital communications, FCI-CCM is well-equipped to provide communication solutions to consumer-facing businesses globally. Our outstanding social media ratings on platforms like Glassdoor, AmbitionBox, and Google, averaging 4.5, are a testament to our employee-centric philosophy. We welcome young and talented candidates to join us and become a part of our success story.
Broad Function: We are seeking a Senior DevOps Engineer with strong expertise in Kubernetes environments, as well as knowledge of AWS EKS and Docker deployment and configuration. The ideal candidate will also have extensive hands-on experience in DevOps practices and technologies.
Roles and Responsibilities:
DevOps Strategy: Collaborate with cross-functional teams to define DevOps strategies and roadmaps aligned with organizational goals.
Infrastructure Management: Oversee the design, deployment, and maintenance of cloud-based infrastructure and on-premises systems.
CI/CD Pipeline: Design, implement, and maintain robust continuous integration and continuous delivery pipelines.
Monitoring and Troubleshooting: Implement monitoring and logging solutions to proactively identify and address system issues. Respond to incidents promptly, perform root cause analysis, and take corrective actions.
Security: Collaborate with security teams to ensure compliance with industry standards and best practices. Implement security measures to safeguard applications and infrastructure from potential threats.
Performance Optimization: Analyze system performance and identify opportunities for optimization. Work with development teams to improve application performance and efficiency.
Collaboration: Foster effective communication and collaboration between development, operations, and quality assurance teams.
Key Skills & Desired Experience:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Knowledge and experience in configuring Kafka clusters.
Scripting and automation skills (e.g., Python, Bash, PowerShell).
Experience with cloud platforms (e.g., AWS, Azure) and infrastructure-as-code tools (e.g., Terraform, CloudFormation) covering networking, storage, VMs, containers, RDS, and managed Kubernetes services/OpenShift.
Strong knowledge of containerization technologies such as Docker and Kubernetes in OpenShift/AWS/Azure.
Proficiency in CI/CD tools (e.g., Jenkins, GitLab) and version control systems.
Experience with monitoring and logging tools (ELK stack, Prometheus/Grafana).
Understanding of setting up highly available, scalable infrastructure and services, covering networking, application deployments & monitoring, and maintaining databases, including high availability.
Employee Benefits: The company offers a range of employee benefits including:
Cashless medical insurance for employees, spouses, and children
Accidental insurance coverage
Life insurance coverage
Five-day work week
Complimentary lunch coupon
Company-paid transportation
Access to online learning platforms such as Udemy
Retirement benefits including Provident Fund (PF) and Gratuity
Sodexo benefits for income tax savings
Paternity & Maternity Leave Benefit
National Pension Saving
Why Join FCI:
Rapid Growth: Join a bootstrapped, rapidly growing SaaS company where you can make a significant impact and shape your career trajectory.
Learning Opportunities: Experience unparalleled learning opportunities that go beyond the traditional corporate mold, acquiring skills and knowledge that will benefit you in the long term.
Straightforward Culture: Be part of a straightforward, no-nonsense culture that values transparency, authenticity, and open communication.
Innovation and Collaboration: We believe great ideas can come from anyone, regardless of their role or level in the company. We actively encourage innovation and collaboration among our team members.
Fast-Paced Environment: Thrive in a fast-paced and dynamic environment where we move quickly, adapt swiftly to market changes, and embrace the challenges and rewards of a startup lifestyle.
Competitive Compensation: Enjoy a competitive salary package along with a range of benefits designed to support your lifestyle and help you perform at your best, both in and out of work.
High-Growth, Innovative Environment: If you're a driven individual who thrives in a high-growth, innovative environment, we're excited to hear from you. Join us for a unique opportunity to accelerate your career and be part of our exciting journey. Apply now to become a valued member of our ambitious and innovative team!
Posted 2 months ago
5 - 10 years
5 - 8 Lacs
Bengaluru
Work from Office
Description:
Primary Skills: Kafka Cluster Management, Kafka admin, Kubernetes, Helm, DevOps, Jenkins
Secondary Skills: Grafana, Prometheus, Dynatrace
Remote work possibility: No
Languages required: English
Posted 2 months ago
7 - 9 years
3 - 7 Lacs
Pune
Work from Office
Job Description: Senior Kafka Developer with around 7+ years of experience in different integration technologies, data streaming technologies, and Kafka.
Solid understanding of Kafka architecture and experience in using Kafka.
Ensure optimum performance, high availability, and stability of solutions.
Strong hands-on experience designing integration solutions using Kafka and awareness of integration best practices for data streaming solutions.
Strong awareness of the Kafka ecosystem: ZooKeeper, Kafka clusters, Kafka brokers, producers, consumers, connectors, the different APIs, Kafka topics, etc.
Strong hands-on experience with the SQL connector, HTTP connector, etc.
Strong knowledge of the different APIs exposed by Kafka to handle integration, data streaming, and data management.
Strong awareness of designing deployment architecture for Kafka solutions.
Solid hands-on experience with Kafka deployment and configuration.
Experience with production deployment, invoking Kafka components as background processes, configuration, troubleshooting, and environment maintenance.
Location: Pune
Posted 2 months ago
5 - 10 years
15 - 30 Lacs
Bengaluru
Work from Office
5+ years of experience: Confluent Kafka, Kafka Connect, Kafka cluster, ZooKeeper
Posted 3 months ago
5 - 7 years
11 - 20 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Job Title: Kafka Developer
Location: Chennai/Hyderabad/Bangalore
Job Type: Full-Time
Experience: 5-9 Yrs
Introduction: We are seeking an experienced Kafka Developer with Java to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining real-time data streaming systems using Apache Kafka. You will work closely with cross-functional teams to build scalable, high-performance data pipelines and enable the efficient flow of data across various applications.
Responsibilities:
Design, implement, and maintain scalable, high-performance data streaming systems using Apache Kafka.
Build and deploy Kafka topics, producers, and consumers for real-time data processing.
Collaborate with backend engineers, data engineers, and other team members to integrate Kafka into various systems and platforms.
Optimize Kafka clusters for performance, scalability, and high availability.
Develop Kafka Streams applications for real-time data processing and transformation.
Troubleshoot and resolve Kafka-related issues, including cluster performance, message processing, and data consistency problems.
Implement security best practices within the Kafka ecosystem, including access control, encryption, and authentication.
Monitor Kafka clusters and pipelines to ensure uptime and performance metrics are met.
Ensure proper data governance and compliance measures are implemented across the Kafka pipeline.
Develop and maintain documentation, including setup guides, technical specifications, and architecture diagrams.
Stay up to date with the latest Kafka features, improvements, and industry best practices.
Requirements:
Proven experience as a Kafka Developer, Data Engineer, or similar role with hands-on expertise in Apache Kafka.
Strong knowledge of Kafka's core concepts: topics, partitions, producers, consumers, brokers, and Kafka Streams.
Experience with Kafka ecosystem tools like Kafka Connect, Kafka Streams, and KSQL.
Expertise in Java for developing Kafka-based solutions.
Experience in deploying and managing Kafka clusters in cloud environments (AWS, Azure, GCP).
Strong understanding of distributed systems, message brokers, and data streaming architectures.
Familiarity with stream processing and real-time data analytics.
Experience in building, optimizing, and monitoring Kafka-based systems.
Knowledge of containerization technologies (e.g., Docker, Kubernetes) for managing Kafka deployments.
Excellent problem-solving skills and the ability to troubleshoot complex Kafka-related issues.
Strong communication and collaboration skills for working in a team environment.
Preferred Qualifications:
Experience with other messaging systems like Apache Pulsar or RabbitMQ.
Familiarity with data storage technologies like HDFS, NoSQL, or relational databases.
Experience in DevOps practices and CI/CD pipelines.
Knowledge of cloud-native architectures and microservices.
Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
Why Join Us: Photon Interactive Systems offers a dynamic and inclusive work environment with opportunities for personal and professional growth. Competitive salary and benefits package. Work with the latest technologies in the field of data streaming and big data analytics.
Posted 3 months ago
5 - 10 years
7 - 13 Lacs
Pune
Work from Office
Immediate Opening!!
Skills: Kafka, Kubernetes, clusters, connectors, messaging setup, Terraform, Jenkins, GitHub, AWS (EKS), Ansible
Experience: 5+ years
Location: Mysuru, Pune
Notice period: Immediate to 30 days
Mode: Contract
Posted 3 months ago
3 - 8 years
8 - 18 Lacs
Gurugram
Remote
Kafka/MSK, Linux
In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors.
Understanding of Kafka topic design and creation.
Good knowledge of replication and high availability for Kafka systems.
ElasticSearch/OpenSearch
Posted 1 month ago