
4 Kafka Administration Jobs

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

7.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an APIGEE Administrator to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Role: APIGEE Administrator

Responsibilities:
1. Designing and developing API proxies, implementing security policies (e.g., OAuth, JWT), and creating API product bundles.
2. Supporting users and administering Apigee OPDK; integrating APIs with various systems and backend services.
3. Participating in and contributing to the migration to Apigee X; planning and executing API migrations between different Apigee environments.
4. Automating platform processes.
5. Implementing security measures such as authentication, authorization, and mitigation, as well as managing traffic and performance optimization.
6. On-call support: identifying and resolving API-related issues, providing support to developers and consumers, and ensuring high availability.
7. Implementing architecture, including tests, CI/CD, monitoring, alerting, resilience, SLAs, and documentation.
8. Collaborating with development teams, product owners, and other stakeholders to ensure seamless API integration and adoption.

Requirements:
1. Bachelor's degree (Computer Science / Information Technology / Electronics & Communication / Information Science / Telecommunications).
2. 7+ years of work experience in the IT industry and strong knowledge of implementing and designing solutions using software application technologies.
3. Good knowledge and experience of the Apigee OPDK platform and its troubleshooting.
4. Experience in AWS administration (EC2, Route 53, CloudTrail, AWS WAF, CloudWatch, EKS, AWS Systems Manager).
5. Good hands-on experience in Red Hat Linux administration and shell scripting.
6. Strong understanding of API design principles and best practices.
7. Kubernetes administration, GitHub, Cassandra administration, Google Cloud.
8. Familiarity with managing Dynatrace.

Desirable: Jenkins; API proxy development; Kafka administration based on SaaS (Confluent); knowledge of Azure; ELK.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .
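The first responsibility above covers API proxies with OAuth/JWT security policies. In Apigee these are declarative XML policies rather than hand-written code, but as a rough illustration of what JWT validation involves under the hood, here is a minimal HS256 signature-check sketch in Python using only the standard library (the secret and claims are hypothetical, invented for this example):

```python
import base64
import hashlib
import hmac
import json


def b64url_decode(data: str) -> bytes:
    # JWT segments use unpadded base64url; restore padding before decoding.
    return base64.urlsafe_b64decode(data + "=" * (-len(data) % 4))


def b64url_encode(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def verify_hs256(token: str, secret: bytes) -> dict:
    """Check an HS256 JWT's signature and return its claims, else raise."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(
        secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256
    ).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("signature mismatch")
    return json.loads(b64url_decode(payload_b64))


# Build a demo token to verify (hypothetical secret and subject claim).
secret = b"demo-secret"
header = b64url_encode(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url_encode(json.dumps({"sub": "client-42"}).encode())
sig = b64url_encode(
    hmac.new(secret, f"{header}.{payload}".encode(), hashlib.sha256).digest()
)
token = f"{header}.{payload}.{sig}"

claims = verify_hs256(token, secret)
print(claims["sub"])  # → client-42
```

In a real Apigee proxy this check would be expressed as a VerifyJWT policy attached to the proxy flow; the sketch only shows the cryptographic step that policy performs.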

Posted 1 day ago

Apply

8.0 - 10.0 years

25 - 27 Lacs

Bengaluru

Work from Office


Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Office (Bengaluru)
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - SoHo Dragon)

What do you need for this opportunity?
Must-have skills: Kafka, Kafka Administration, Kafka Confluent, Linux

SoHo Dragon is looking for:

Job Description
We are seeking an experienced professional with 8-15 years of IT experience to join our team. The ideal candidate should possess expertise in Kafka architecture and operations, along with a strong understanding of Confluent-specific tools, Linux/Unix systems, networking, and security practices.

Key Responsibilities & Requirements:
- Kafka Architecture & Operations: Deep understanding of core Kafka components, including brokers, topics, partitions, producers, and consumers. Ability to create, configure, and manage Kafka topics and partitions.
- Confluent Ecosystem: Proficiency in Confluent-specific tools such as Control Center, Schema Registry, ksqlDB, and Kafka Connect.
- Linux/Unix Expertise: Strong command of Linux/Unix systems, including shell scripting and system monitoring.
- Networking Knowledge: Understanding of network configurations, protocols, and security best practices to ensure efficient and secure Kafka operations.
- Programming Skills: Knowledge of Java programming and JVM tuning, as Kafka is built on Java.
- Automation & Scripting: Proficiency in scripting languages like Python or Bash for automation and management tasks.
- Monitoring & Logging: Experience with monitoring tools such as Prometheus, Grafana, and Confluent Control Center to track Kafka performance. Familiarity with logging frameworks like Log4j for troubleshooting and maintaining Kafka logs.
- Security Practices: Implementation of security measures, including SSL/TLS encryption, Kerberos authentication, and access control lists (ACLs), to safeguard Kafka data.
- Integration Expertise: Experience in integrating Kafka with various systems and data sources, including databases, data lakes, and cloud services.
- Capacity Planning: Ability to plan and scale Kafka clusters to handle dynamic workloads while ensuring high availability.
- Backup & Recovery: Knowledge of backup and recovery strategies to protect data and ensure business continuity in case of failures (a frequent task in T&S).

Preferred Qualification:
- Confluent Certification: Preference for candidates holding the Confluent Certified Administrator for Apache Kafka (CCAAK) certification.

Note: This role is a 6-month contractual position with the possibility of extension based on performance. The work location is Bangalore - Eco World, Bellandur, and it requires on-site presence at the office.
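The "brokers, topics, partitions, producers, and consumers" requirement above hinges on one behavior worth internalizing for any Kafka administrator: a producer routes each keyed record to a partition deterministically, so all records sharing a key land on the same partition and stay ordered relative to each other. A simplified Python sketch of that routing (Kafka's actual default partitioner hashes key bytes with murmur2; the MD5-based hash here is a stand-in for illustration only):

```python
import hashlib


def partition_for(key: str, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Illustrative stand-in: real Kafka clients use murmur2 over the
    key bytes, but any stable hash shows the same property.
    """
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions


topic_partitions = 6  # hypothetical topic with 6 partitions
order_keys = ["customer-1", "customer-2", "customer-1", "customer-3", "customer-1"]
placements = [partition_for(key, topic_partitions) for key in order_keys]

# Same key -> same partition, so per-customer ordering is preserved.
print(placements)
```

This is also why repartitioning an existing topic is an operational event, not a config tweak: changing the partition count changes where keys hash to, breaking the per-key ordering guarantee for in-flight data.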

Posted 2 days ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Noida, Gurugram

Work from Office


Manage end-to-end activities including installation, configuration, and maintenance of Kafka clusters; manage topics configured in Kafka; and ensure maximum uptime of Kafka. Monitor the performance of producer and consumer threads interacting with Kafka.

Required Candidate Profile:
- Kafka certification
- Must have hands-on experience in managing large Kafka clusters and installations
- Ability to monitor the performance of producer and consumer threads interacting with Kafka
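Monitoring producer and consumer threads, as this posting asks, usually starts with consumer lag: the gap between a partition's log-end offset and the consumer group's committed offset. A minimal sketch of that arithmetic (the offset numbers are made up; in practice you would pull them from `kafka-consumer-groups.sh` or the Admin API rather than hard-code them):

```python
# Hypothetical per-partition offsets for one consumer group on one topic.
log_end_offsets = {0: 1500, 1: 980, 2: 2210}    # latest offset written per partition
committed_offsets = {0: 1500, 1: 950, 2: 2100}  # last offset the group committed


def consumer_lag(end_offsets: dict, committed: dict) -> dict:
    """Per-partition lag: how many records the group still has to process."""
    return {p: end_offsets[p] - committed[p] for p in end_offsets}


lag = consumer_lag(log_end_offsets, committed_offsets)
total_lag = sum(lag.values())
print(lag, total_lag)  # → {0: 0, 1: 30, 2: 110} 140
```

A steadily growing total lag means consumers cannot keep up with producers, which is typically the first alert an administrator wires into Prometheus or Control Center.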

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 27 Lacs

Bengaluru

Work from Office


Senior Developer - Kafka
Experience: 8-20 years
Salary: INR 25-28 Lacs per annum
Preferred Notice Period: Within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Onsite (Bengaluru)
Placement Type: Permanent
(*Note: This is a requirement for one of Uplers' clients)

Must-have skills: Kafka, Kafka Administration, Kafka Confluent, Linux

SoHo Dragon (one of Uplers' clients) is looking for a Senior Developer - Kafka who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview / Job Description
We are seeking an experienced professional with 8-15 years of IT experience to join our team. The ideal candidate should possess expertise in Kafka architecture and operations, along with a strong understanding of Confluent-specific tools, Linux/Unix systems, networking, and security practices.

Key Responsibilities & Requirements:
- Kafka Architecture & Operations: Deep understanding of core Kafka components, including brokers, topics, partitions, producers, and consumers. Ability to create, configure, and manage Kafka topics and partitions.
- Confluent Ecosystem: Proficiency in Confluent-specific tools such as Control Center, Schema Registry, ksqlDB, and Kafka Connect.
- Linux/Unix Expertise: Strong command of Linux/Unix systems, including shell scripting and system monitoring.
- Networking Knowledge: Understanding of network configurations, protocols, and security best practices to ensure efficient and secure Kafka operations.
- Programming Skills: Knowledge of Java programming and JVM tuning, as Kafka is built on Java.
- Automation & Scripting: Proficiency in scripting languages like Python or Bash for automation and management tasks.
- Monitoring & Logging: Experience with monitoring tools such as Prometheus, Grafana, and Confluent Control Center to track Kafka performance. Familiarity with logging frameworks like Log4j for troubleshooting and maintaining Kafka logs.
- Security Practices: Implementation of security measures, including SSL/TLS encryption, Kerberos authentication, and access control lists (ACLs), to safeguard Kafka data.
- Integration Expertise: Experience in integrating Kafka with various systems and data sources, including databases, data lakes, and cloud services.
- Capacity Planning: Ability to plan and scale Kafka clusters to handle dynamic workloads while ensuring high availability.
- Backup & Recovery: Knowledge of backup and recovery strategies to protect data and ensure business continuity in case of failures (a frequent task in T&S).

Preferred Qualification:
- Confluent Certification: Preference for candidates holding the Confluent Certified Administrator for Apache Kafka (CCAAK) certification.

Note: This role is a 6-month contractual position with the possibility of extension based on performance. The work location is Bangalore - Eco World, Bellandur, and it requires on-site presence at the office.

How to apply for this opportunity - easy 3-step process:
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client: We are a full-service software application development company that focuses on portals, document management, collaboration, business intelligence, CRM tools, cloud technology, and data. Much of the work done for our clients is based on the Microsoft application stack of business tools.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this one on the portal.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
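The capacity-planning requirement in the listing above often reduces to a common rule of thumb for sizing a topic: with a target throughput T and measured per-partition producer and consumer throughputs p and c, you need at least max(T/p, T/c) partitions, rounded up. A small sketch with hypothetical numbers (real sizing would be based on benchmarked throughput for your hardware and message profile):

```python
import math


def min_partitions(target_mb_s: float, producer_mb_s: float,
                   consumer_mb_s: float) -> int:
    """Rule-of-thumb partition count for a topic.

    The slower side per partition (produce or consume) dictates how
    many partitions are needed to sustain the target throughput.
    """
    return math.ceil(max(target_mb_s / producer_mb_s,
                         target_mb_s / consumer_mb_s))


# Hypothetical sizing: 200 MB/s target, 20 MB/s per-partition produce,
# 25 MB/s per-partition consume.
needed = min_partitions(200, 20, 25)
print(needed)  # → 10, dictated by the producer side
```

Administrators usually add headroom on top of this floor, since partitions are cheap to have but disruptive to add later (see the keyed-partitioning caveat: adding partitions remaps keys).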

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies