6 - 9 years
10 - 16 Lacs
Hyderabad
Hybrid
Senior Software Engineer - Java Developer with Kafka Streaming, Spark & OpenShift

Position Description
At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.
Job Title: Senior Software Engineer - Java Developer with Kafka Streaming, Spark & OpenShift
Experience: 6 to 9 Years
Category: Software Development/Engineering
Main location: Hyderabad
Shift Timings: General Shift
Employment Type: Full Time, Permanent

Your future duties and responsibilities

Job Summary:
• CGI is looking for a skilled and proactive Java Developer with hands-on experience in Kafka streaming, Apache Spark, and Red Hat OpenShift.
• The ideal candidate will play a key role in designing, developing, and deploying scalable backend systems and real-time data pipelines.
• This position is ideal for someone passionate about building high-performance systems and working with cutting-edge technologies in cloud-native environments.

Key Responsibilities:
• Design, develop, and maintain robust Java-based microservices and backend applications.
• Develop real-time data streaming applications using Apache Kafka and Kafka Streams.
• Build and optimize large-scale batch and stream processing pipelines with Apache Spark.
• Containerize applications and manage deployments using OpenShift and Kubernetes.
• Collaborate with DevOps teams to ensure CI/CD pipelines are robust and scalable.
• Write unit tests and conduct code reviews to maintain code quality and reliability.
• Work closely with Product and Data Engineering teams to understand requirements and translate them into technical solutions.
• Troubleshoot and debug production issues across multiple environments.

Required qualifications to be successful in this role
• Strong programming skills in Java (Java 8 or higher).
• Hands-on experience with Apache Kafka, Kafka Streams, and event-driven architecture.
• Solid knowledge of Apache Spark (batch and streaming).
• Experience with OpenShift, Kubernetes, and container orchestration.
• Familiarity with microservices architecture, RESTful APIs, and distributed systems.
• Experience with build tools such as Maven or Gradle.
• Familiarity with Git, Jenkins, CI/CD pipelines, and Agile development practices.
• Excellent problem-solving skills and the ability to work in a fast-paced environment.

Education & Experience:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Minimum 6 years of experience in backend development with Java and related technologies.

Preferred Skills (Nice to Have):
• Knowledge of cloud platforms like AWS, Azure, or GCP.
• Understanding of security best practices in cloud-native environments.
• Familiarity with SQL/NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
• Experience with Scala or Python for Spark jobs is a plus.
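The Kafka Streams work this role centres on is stateful stream processing: consuming keyed records and maintaining a running aggregate per key. As a rough, dependency-free illustration of that model (not CGI's actual stack, and simplified to a single in-memory "state store"), here is what a Kafka Streams `groupByKey().count()` topology does conceptually:

```python
from collections import defaultdict

def process_stream(records):
    """Consume (key, value) records in order and maintain a running
    count per key, emitting one changelog entry per input record --
    roughly what a Kafka Streams groupByKey().count() topology does.
    The dict stands in for the RocksDB state store Kafka Streams uses."""
    state = defaultdict(int)
    changelog = []
    for key, _value in records:
        state[key] += 1
        changelog.append((key, state[key]))
    return dict(state), changelog

counts, changelog = process_stream(
    [("user-1", "click"), ("user-2", "click"), ("user-1", "view")]
)
# counts == {"user-1": 2, "user-2": 1}; the changelog is what Kafka
# Streams would publish back to a compacted topic for fault tolerance.
```

In the real library the state store is persistent and backed by a changelog topic, which is what makes the aggregation survive restarts and rebalances.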
Posted 1 month ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role

Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
• Develop software solutions by studying information needs, systems flow, data usage and work processes
• Investigate problem areas, following the software development life cycle
• Facilitate root-cause analysis of system issues and problem statements
• Identify ideas to improve system performance and availability
• Analyze client requirements and convert them into feasible designs
• Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
• Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
• Determine operational feasibility by evaluating the analysis, problem definition, requirements, and proposed software
• Develop and automate processes for software validation by designing and executing test cases/scenarios/usage cases
• Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
• Analyze information to recommend and plan the installation of new systems or modification of existing systems
• Ensure that code is error-free, with no bugs or test failures
• Prepare reports on programming project specifications, activities and status
• Ensure all issues are raised per the norms defined for the project/program/account, with clear descriptions and replication patterns
• Compile timely, comprehensive and accurate documentation and reports as requested
• Coordinate with the team on daily project status and progress, and document it
• Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
• Capture all requirements and clarifications from the client for better-quality work
• Take feedback regularly to ensure smooth and on-time delivery
• Participate in continuing education and training to stay current on best practices, learn new programming languages, and better assist other team members
• Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
• Document and demonstrate solutions through documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
• Document necessary details and reports formally so the software is properly understood from client proposal to implementation
• Ensure good-quality interaction with the customer with respect to e-mail content, fault report tracking, voice calls, business etiquette, etc.
• Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
No. / Performance Parameter / Measure
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. Quality & CSAT: on-time delivery, software management, troubleshooting of queries, customer experience, completion of assigned certifications for skill upgradation
3. MIS & reporting: 100% on-time MIS & report generation

Mandatory Skills: Kafka Integration.
Experience: 3-5 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
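The mandatory skill here, Kafka integration, boils down to producing to an append-only log and consuming from it with per-group committed offsets. As a minimal, purely illustrative model (a toy single-partition topic, not a real Kafka client), the core integration concepts look like this:

```python
class InMemoryTopic:
    """Toy single-partition topic: an append-only log plus per-group
    committed offsets. A simplified model of the Kafka integration
    concepts (offsets, independent consumer groups, replayability)
    this role calls for -- illustrative only, not a Kafka client."""

    def __init__(self):
        self.log = []
        self.committed = {}  # consumer group -> next offset to read

    def produce(self, message):
        self.log.append(message)
        return len(self.log) - 1  # offset of the appended record

    def poll(self, group, max_records=10):
        start = self.committed.get(group, 0)
        batch = self.log[start:start + max_records]
        self.committed[group] = start + len(batch)  # auto-commit
        return batch

topic = InMemoryTopic()
for m in ("a", "b", "c"):
    topic.produce(m)

# Two groups read the same log independently; "replay" is just
# resetting a group's committed offset.
first = topic.poll("billing", max_records=2)   # ["a", "b"]
second = topic.poll("billing")                 # ["c"]
audit = topic.poll("audit")                    # ["a", "b", "c"]
```

The key property, mirrored from Kafka, is that consuming does not remove records: each group tracks its own position in the log.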
Posted 1 month ago
5 - 8 years
9 - 14 Lacs
Hyderabad
Work from Office
About The Role

Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
1. Oversee and support the process by reviewing daily transactions on performance parameters:
• Review the performance dashboard and the scores for the team
• Support the team in improving performance parameters by providing technical support and process guidance
• Record, track, and document all queries received, the problem-solving steps taken, and the total successful and unsuccessful resolutions
• Ensure standard processes and procedures are followed to resolve all client queries
• Resolve client queries as per the SLAs defined in the contract
• Develop an understanding of the process/product to facilitate better client interaction and troubleshooting by team members
• Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
• Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
• Ensure all product information and disclosures are given to clients before and after the call/email requests
• Avoid legal challenges by monitoring compliance with service agreements

2. Handle technical escalations through effective diagnosis and troubleshooting of client queries:
• Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
• If unable to resolve an issue, escalate it to TA & SES in a timely manner
• Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
• Troubleshoot all client queries in a user-friendly, courteous and professional manner
• Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
• Organize ideas and effectively communicate oral messages appropriate to listeners and situations
• Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

3. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client:
• Mentor and guide Production Specialists on improving technical knowledge
• Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
• Develop and conduct trainings (triages) within products for Production Specialists as per targets
• Inform the client about the triages being conducted
• Undertake product trainings to stay current with product features, changes and updates
• Enroll in product-specific and any other trainings per client requirements/recommendations
• Identify and document the most common problems and recommend appropriate resolutions to the team
• Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver
No. / Performance Parameter / Measure
1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team management: productivity, efficiency, absenteeism
3. Capability development: triages completed, Technical Test performance

Mandatory Skills: Kafka Integration.
Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
8 - 13 years
20 - 35 Lacs
Kolkata, Pune, Bengaluru
Hybrid
Role & responsibilities
• 8+ years of experience in the relevant fields below (internships, prototypes, and personal projects won't be counted)
• Coding is required (ideally Python or Java)
• Own the end-to-end lifecycle (from development to deployment to the production environment)
• Experience building or deploying solutions in the cloud, either:
  - Cloud-native (serverless): S3, Lambda, AWS Batch, ECS
  - Cloud-agnostic: Kubernetes, Helm charts, ArgoCD, Prometheus, Grafana
• CI/CD experience: GitHub Actions or Jenkins
• Infrastructure as code: e.g., Terraform
• And experience in at least one of these focus areas:
  - Big Data: building big data pipelines or platforms to process petabytes of data (PySpark, Hudi, data lineage, AWS Glue, AWS EMR, Kafka, Schema Registry)
  - GraphDB: ingesting and consuming data in a graph database such as Neo4j, AWS Neptune, JanusGraph or DGraph

Preferred candidate profile
Specifically highlight Kafka expertise - include details like:
• Experience with Kafka cluster management and configuration
• Stream processing with Kafka Streams or KSQL
• Schema Registry implementation and management
• Kafka Connect for data integration
Put significant focus on PySpark skills:
• Experience building and optimizing PySpark jobs for batch processing
• Stream processing with Spark Structured Streaming
• Familiarity with Delta Lake, Hudi, or Iceberg for lakehouse implementations
Highlight data engineering skills that complement these technologies:
• Data pipeline design and implementation
• Experience with data quality, validation, and lineage tracking
• Performance optimization for large-scale data processing
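The "data quality, validation, and lineage tracking" item above is a concrete, testable pattern: every batch step validates rows, quarantines failures, and records counts for lineage. A minimal sketch of that shape (validator names and the lineage fields are illustrative, not any specific framework's API):

```python
def run_pipeline(rows, validators):
    """Tiny batch step: apply row-level validators, split rows into
    accepted and rejected, and record lineage/quality metadata.
    `validators` maps a rule name to a predicate over a row dict."""
    good, rejected = [], []
    for row in rows:
        errors = [name for name, check in validators.items() if not check(row)]
        (rejected if errors else good).append((row, errors))
    lineage = {  # illustrative lineage record: counts only
        "input_rows": len(rows),
        "output_rows": len(good),
        "rejected_rows": len(rejected),
    }
    return [r for r, _ in good], rejected, lineage

validators = {
    "has_id": lambda r: r.get("id") is not None,
    "amount_positive": lambda r: r.get("amount", 0) > 0,
}
good, bad, lineage = run_pipeline(
    [{"id": 1, "amount": 5}, {"id": None, "amount": 5}, {"id": 2, "amount": -1}],
    validators,
)
# good keeps only the fully valid row; bad carries each failed rule name.
```

In a real PySpark pipeline the same split is usually expressed as two filtered DataFrames plus metrics emitted to a lineage store; the counts-in, counts-out discipline is what makes pipelines auditable.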
Posted 1 month ago
3 - 8 years
8 - 18 Lacs
Gurugram
Remote
Kafka/MSK
• Linux
• In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors
• Understanding of Kafka topic design and creation
• Good knowledge of replication and high availability for Kafka systems
ElasticSearch/OpenSearch
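The replication and high-availability knowledge asked for here mostly comes down to how `acks`, replication factor, and `min.insync.replicas` interact: with `acks=all`, the leader refuses writes once the in-sync replica set shrinks below the configured minimum. A simplified decision sketch (real brokers track the ISR per partition; this is illustrative only):

```python
def can_ack(acks, in_sync_replicas, min_insync_replicas=1):
    """Decide whether a broker may acknowledge a produce request --
    a simplified model of Kafka's acks / min.insync.replicas interplay.
    acks=0 waits for nothing, acks=1 waits for the leader only, and
    acks='all' additionally requires a large-enough in-sync replica set."""
    if acks in (0, 1):
        return True
    if acks == "all":
        return in_sync_replicas >= min_insync_replicas
    raise ValueError(f"unknown acks setting: {acks!r}")

# Typical HA setup: replication.factor=3, min.insync.replicas=2 --
# the partition tolerates one dead replica before writes are refused.
assert can_ack("all", in_sync_replicas=3, min_insync_replicas=2)
assert can_ack("all", in_sync_replicas=2, min_insync_replicas=2)
assert not can_ack("all", in_sync_replicas=1, min_insync_replicas=2)
```

This is why a replication factor of 3 with `min.insync.replicas=2` is the common durability-versus-availability compromise: it survives one broker failure without blocking producers.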
Posted 1 month ago
4 - 6 years
15 - 22 Lacs
Gurugram
Hybrid
The Job
We are looking for a Senior Data Engineer responsible for designing, developing and supporting real-time core data products for TechOps applications. You will work with various teams to understand business requirements, reverse engineer existing data products and build state-of-the-art, performant data pipelines. AWS is the cloud of choice for these pipelines, so a solid understanding of and experience in architecting, developing and maintaining real-time data pipelines in AWS is highly desired.
• Design, architect and develop data products that provide real-time core data for applications.
• Provide production support and operational optimisation of data products, including but not limited to incident and on-call support, performance optimisation, high availability and disaster recovery.
• Understand business requirements by interacting with business users and/or reverse engineering existing legacy data products.
• Mentor and train junior team members, sharing architecture, design and development knowledge of data products and standards.
• Maintain a good understanding and working knowledge of distributed databases and pipelines.

Your Profile
• 4+ years of experience in real-time streaming, along with hands-on experience of Spark, Kafka, Apache Flink, Java, big data technologies, AWS and MSK (Managed Streaming for Apache Kafka).
• AWS distributed database technologies, including managed services: MSK, Managed Apache Flink, DynamoDB, S3, Lambda.
• Experience designing and developing real-time data products with Apache Flink (Scala experience can be considered).
• Experience with Python and PySpark.
• SQL code development.
• AWS solutions architecture experience for data products is required.
• Experience managing and troubleshooting real-time data pipelines in the AWS cloud.
• Experience with high availability and disaster recovery solutions for real-time data streaming.
• Excellent analytical, problem-solving and communication skills.
• Must be self-motivated, with the ability to work independently.
• Ability to understand existing SQL and code as well as user requirements, and translate them into modernized data products.
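Real-time products built on Flink typically aggregate events into time windows. The simplest case, a tumbling window, assigns each event to a fixed-size bucket aligned to the epoch, identified by its start timestamp. A dependency-free sketch of that assignment (no watermarks or late-data handling, which real Flink jobs must add):

```python
def tumbling_window(event_time_ms, window_size_ms):
    """Assign an event to a tumbling window, Flink-style: windows are
    epoch-aligned, half-open [start, end), identified by their start."""
    start = event_time_ms - (event_time_ms % window_size_ms)
    return (start, start + window_size_ms)

def window_counts(events, window_size_ms):
    """Count events per tumbling window from (timestamp_ms, payload) pairs --
    the core of a streaming 'events per 10 seconds' metric."""
    counts = {}
    for ts, _payload in events:
        window = tumbling_window(ts, window_size_ms)
        counts[window] = counts.get(window, 0) + 1
    return counts

# Three events, 10-second windows: two land in [0, 10000), one in [10000, 20000)
counts = window_counts([(1_000, "a"), (9_999, "b"), (10_000, "c")], 10_000)
```

In Flink this corresponds to `TumblingEventTimeWindows.of(...)` followed by an aggregate; the hard production problems (out-of-order events, when to emit) live in the watermarking layer deliberately omitted here.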
Posted 1 month ago
3 - 7 years
14 - 15 Lacs
Hyderabad
Work from Office
Hi, greetings for the day!

We found your profile suitable for the opening below; kindly go through the JD and reach out to us if you are interested.

About Us
Incorporated in 2006, we are an 18-year-old recruitment and staffing company, providing manpower to some of the Fortune 500 companies for junior/middle/executive talent.

About Client
Hiring for one of the most prestigious multinational corporations!

Job Description
Job Title: Kafka Testing
Qualification: Any Graduate or above
Relevant Experience: 3+ years
Location: Hyderabad - initial 8 weeks work from office, followed by work from home
CTC Range: 15 LPA (Lakhs Per Annum)
Notice Period: Immediate
Shift Timing: 1 PM to 10 PM
Mode of Interview: Virtual

Joel
IT Staff
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA
8067432416 | joel.manivasan@blackwhite.in | www.blackwhite.in
Posted 1 month ago
4 - 8 years
10 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Warm greetings from SP Staffing Services Pvt Ltd!

Experience: 4-7 yrs
Work Location: Chennai/Hyderabad/Bangalore

Interested candidates, kindly share your updated resume with gokul.priya@spstaffing.in or contact us (WhatsApp: 9360311230) to proceed further.

Job Description:
• Confluent Kafka platform setup, maintenance and upgrades
• Hands-on experience with Kafka brokers
• Hands-on experience with Schema Registry
• Hands-on experience with ksqlDB and an understanding of its underlying implementation and functions
• Hands-on experience with Kafka connectors and an understanding of their underlying implementation
• Proficient understanding of how Kafka producer and consumer clients function
• Experience with Kafka deployment in Azure Kubernetes Service
• Experience working in the Azure cloud
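Schema Registry work like the above revolves around compatibility checks: before a new schema version is registered, the registry verifies (under BACKWARD compatibility, the default) that consumers on the new schema can still read data written with the old one. A heavily simplified sketch of that rule, where a "schema" is just a `{field_name: has_default}` dict rather than full Avro (the real registry applies the complete Avro resolution rules):

```python
def is_backward_compatible(new_schema, old_schema):
    """Simplified BACKWARD compatibility check in the spirit of
    Confluent Schema Registry: a field added in the new schema must
    carry a default, or old records (which lack the field) become
    unreadable by new-schema consumers. Schemas here are just
    {field_name: has_default} dicts -- illustrative only."""
    for field, has_default in new_schema.items():
        if field not in old_schema and not has_default:
            return False
    return True

old = {"id": False, "name": False}
ok_new = {"id": False, "name": False, "email": True}    # added with a default
bad_new = {"id": False, "name": False, "email": False}  # added without one
```

Removing a field is backward-compatible under this rule (new readers simply ignore it), which matches the asymmetry between BACKWARD and FORWARD modes in the real registry.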
Posted 1 month ago
6 - 10 years
3 - 8 Lacs
Chennai
Work from Office
Must-have (hands-on):
------------------
Primary: Java 13, Spring Boot microservices, reactive REST API development, TDD - JUnit & Mockito, WebFlux
DB: PostgreSQL, Couchbase
Containerization: Docker, Kubernetes
Build: Maven/Gradle

Good-to-have (knowledge level is OK):
------------------------------------
Cloud: VMware private cloud
OS: Linux experience, shell scripting
CI/CD: Azure Pipelines
Other skills: Splunk/Kafka integration, Ansible, New Relic

Total experience expected: 8-10 years
Posted 2 months ago
5 - 10 years
8 - 16 Lacs
Hyderabad
Remote
Job Description:
We are looking for a highly skilled Senior Java Developer with expertise in Apache Kafka and Flink to join our team. The ideal candidate should have a strong background in Java-based application development, microservices, and event-driven architecture.

Key Responsibilities:
• Design, develop, and maintain Java and Spring Boot applications with a focus on microservices architecture.
• Develop scalable and high-performance streaming applications using Apache Kafka and Flink.
• Implement containerized solutions using Docker and Kubernetes for efficient deployment and scalability.
• Collaborate with cross-functional teams to gather and analyze requirements, design solutions, and implement robust backend services.
• Optimize application performance and ensure high availability and reliability of systems.
• Develop and maintain unit and integration tests for code quality assurance.
• Troubleshoot, debug, and resolve technical issues in a timely manner.
• Stay updated with the latest trends and best practices in Java development, event streaming, and real-time data processing.

Required Skills & Experience:
• 6-10 years of hands-on experience in Java and Spring Boot application development.
• 1-2 years of experience working with Apache Kafka (producers, consumers, streams, topics, partitions, etc.).
• Strong experience with Apache Flink for real-time data processing.
• Experience with containerization technologies like Docker and Kubernetes.
• Expertise in microservices architecture and RESTful API development.
• Knowledge of cloud platforms (AWS, Azure, GCP) is a plus.
• Strong problem-solving skills and the ability to work in an Agile environment.
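The topics-and-partitions experience mentioned above hinges on one guarantee: for keyed records, Kafka's default partitioner always maps the same key to the same partition, which is what preserves per-key ordering. A sketch of that behaviour (Kafka itself hashes the serialized key with murmur2; CRC32 is used here only to keep the example dependency-free and deterministic):

```python
import zlib

def choose_partition(key, num_partitions):
    """Keyed partitioning sketch: same key -> same partition, as
    Kafka's default partitioner guarantees for keyed records.
    Kafka uses murmur2 on the serialized key; CRC32 stands in here.
    Unkeyed records use a separate sticky/round-robin strategy."""
    if key is None:
        raise ValueError("unkeyed records are assigned by the sticky partitioner")
    return zlib.crc32(key.encode("utf-8")) % num_partitions

p = choose_partition("order-42", 6)
assert p == choose_partition("order-42", 6)  # stable for a given key
assert 0 <= p < 6
```

The practical consequence is that changing a topic's partition count breaks the key-to-partition mapping, which is why partition counts are chosen generously up front.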
Posted 2 months ago
3 - 8 years
10 - 20 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Job Description:
• Stand up and administer on-premise Kafka clusters.
• Architect and create reference architectures for Kafka implementation standards.
• Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy and Confluent Control Center.
• Ensure optimum performance, high availability and stability of solutions.
• Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices.
• Create stubs for producers, consumers and consumer groups to help onboard applications from different languages/platforms.
• Provide administration and operations of the Kafka platform, such as provisioning, access lists, and Kerberos and SSL configurations.
• Use automation tools for provisioning, such as Docker, Jenkins and GitLab.
• Perform data-related benchmarking, performance analysis and tuning.
• Strong skills in in-memory applications, database design, and data integration.
• Participate in design and capacity review meetings to provide suggestions on Kafka usage.
• Solid knowledge of monitoring tools and fine-tuning alerts on Splunk, Prometheus and Grafana.
• Set up security on Kafka.
• Provide naming conventions, backup & recovery, and problem-determination strategies for projects.
• Monitor, prevent and troubleshoot security-related issues.
• Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure.

QUALIFICATIONS
• Demonstrated proficiency and experience in the design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure.
• Hands-on experience with recovery in Kafka.
• 2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities.
• Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform).
• Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python).
• Considerable experience implementing Kerberos security.
• Support a 24x7 model and be available for rotational on-call work.
• Competent working in one or more environments highly integrated with an operating system.
• Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
• Strong critical-thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy.
• Ability to manage tasks independently and take ownership of responsibilities.
• Ability to learn from mistakes and apply constructive feedback to improve performance.
• Ability to adapt to a rapidly changing environment.
• Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness and appropriate levels of assertiveness.
• Ability to communicate highly complex technical information clearly and articulately to all levels and audiences.
• Willingness to learn new technologies/tools and train your peers.
• Proven track record of automation.
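The consumer-group administration this role covers includes understanding how partitions get distributed across a group's members on rebalance. Kafka's range assignor, for one topic, sorts the consumers and hands each a contiguous chunk, with the first consumers absorbing the remainder. A minimal sketch of that algorithm (illustrative; the real assignor works per-topic across subscriptions):

```python
def range_assign(partitions, consumers):
    """Sketch of Kafka's range assignor for a single topic: sort the
    consumers, give each a contiguous slice of the partition list,
    with the earliest consumers taking one extra when the count
    doesn't divide evenly."""
    consumers = sorted(consumers)
    per, extra = divmod(len(partitions), len(consumers))
    assignment, start = {}, 0
    for i, consumer in enumerate(consumers):
        take = per + (1 if i < extra else 0)
        assignment[consumer] = partitions[start:start + take]
        start += take
    return assignment

# 5 partitions across 2 consumers: c1 gets 3, c2 gets 2.
assignment = range_assign([0, 1, 2, 3, 4], ["c2", "c1"])
```

This uneven remainder handling is why, across many topics with the same consumers, the range assignor can skew load toward the alphabetically first members; the round-robin and sticky assignors exist to address that.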
Posted 2 months ago
7 - 10 years
20 - 22 Lacs
Chennai, Pune, Noida
Work from Office
Experience in Java, Apache Kafka, Kafka Streams and clusters; application development, topic management, data pipeline development, producer & consumer implementation, integration & connectivity, cluster administration, security & compliance, and Apache ZooKeeper.

Required candidate profile: 7-10 years' experience covering Kafka expertise, programming skills, big data & streaming technologies, database knowledge, cloud & DevOps, event-driven architecture, security & scalability, and problem solving & teamwork.
Posted 2 months ago
7 - 11 years
13 - 19 Lacs
Chennai, Pune, Delhi NCR
Work from Office
Role & responsibilities
Urgent hiring with one of the reputed MNCs for a Kafka Developer.
Experience: 7-11 years (only immediate joiners)
Location: Pune, Chennai, Noida

• Educational background: a degree in Computer Science, IT, or a related field.
• Kafka expertise: strong knowledge of Kafka architecture, brokers, producers, consumers, and stream processing.
• Programming skills: proficiency in Java, Scala, or Python for developing Kafka-based applications.
• Big data & streaming technologies: experience with Spark, Flink, or Apache Storm is a plus.
• Database knowledge: familiarity with SQL and NoSQL databases like Cassandra, MongoDB, or PostgreSQL.
• Cloud & DevOps: experience with cloud platforms (AWS, Azure, GCP) and Kubernetes/Docker.
• Event-driven architecture: understanding of event-driven and microservices architectures.
• Monitoring & debugging: experience with Kafka monitoring tools like Confluent Control Center, Kafka Manager, or the ELK stack.
• Security & scalability: knowledge of Kafka security, access control, and scaling strategies.
• Problem solving & communication: strong analytical skills and the ability to work in cross-functional teams.

Preferred candidate profile
Kafka application development, data pipeline development, producer & consumer implementation, integration & connectivity, performance optimization, security & compliance, cluster administration, monitoring & logging, documentation.

Perks and benefits
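The monitoring tools named above (Confluent Control Center, Kafka Manager, the ELK stack) all surface the same core health metric for a consumer group: lag, the distance between a partition's log-end offset and the group's committed offset. The computation itself is trivial, and worth internalizing:

```python
def consumer_lag(log_end_offsets, committed_offsets):
    """Per-partition consumer lag: log-end offset minus the group's
    committed offset. This is the number Kafka monitoring tools alert
    on; a partition missing from committed_offsets is treated as a
    group that has consumed nothing from it yet. Sketch only."""
    lag = {}
    for partition, end in log_end_offsets.items():
        committed = committed_offsets.get(partition, 0)
        lag[partition] = max(end - committed, 0)
    return lag

lag = consumer_lag({0: 120, 1: 80}, {0: 100, 1: 80})
# partition 0 is 20 records behind; partition 1 is fully caught up
```

Steadily growing lag means consumers cannot keep up with producers; flat non-zero lag usually means a stalled consumer, two failure modes that call for different fixes (scaling out versus restarting/debugging).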
Posted 2 months ago
6 - 10 years
1000 Lacs
Bengaluru
Work from Office
Overview
At Zebra, we are a community of innovators who come together to create new ways of working to make everyday life better. United by curiosity and care, we develop dynamic solutions that anticipate our customers' and partners' needs and solve their challenges. Being a part of Zebra Nation means being seen, heard, valued, and respected. Drawing from our diverse perspectives, we collaborate to deliver on our purpose. Here you are a part of a team pushing boundaries to redefine the work of tomorrow for organizations, their employees, and those they serve. You have opportunities to learn and lead at a forward-thinking company, defining your path to a fulfilling career while channeling your skills toward causes that you care about - locally and globally. We've only begun reimagining the future - for our people, our customers, and the world. Let's create tomorrow together.

Analyzes, develops, designs, and maintains software for the organization's products and systems. Performs system integration of software and hardware to maintain throughput and program consistency. Develops, validates, and tests structures and user documentation. Work is evaluated upon completion to ensure objectives have been met. Determines and develops approaches to solutions.

Responsibilities
• Experience with Scala, Kafka Streams and Akka Actors; GCP data storage (GCP buckets, MongoDB, MySQL, etc.)
• Experience with Google Cloud Platform (GCP) design and implementation
• Experience with security in GCP
• Experience with networking in GCP
• Lifecycle management of GKE and overall compute in GCP, including GCE
• Hands-on experience with microservices and distributed application architecture utilizing containers, Kubernetes, and/or serverless technology
• Experience with seamless/automated build scripts used for release management across all environments
• Experience with the full software development lifecycle and delivery using Agile practices
• In-depth understanding of IP networking, VPNs, DNS, load balancing and firewalls
• Experience with multi-cloud architecture and deployment
• Experience developing cloud-native CI/CD workflows and tools, such as Jenkins, Bamboo, Cloud Build (Google), etc.
• Establishes requirements for moderately complex software design projects; prioritizes features to ensure the most important get implemented
• Participates in code reviews, identifies bad sections early in the process, and then recodes them
• Completes all phases of moderately complex software design projects; carries out all in-process and final inspection activities
• Develops and tests documentation for the software projects
• Considers the latest technologies and new approaches to the design and implementation of new designs
• Reviews changes or upgrades to existing software and/or firmware designs
• Develops new technology to solve unique problems
• Provides recommendations and solutions to problems using experience in multiple technical areas
• Applies existing technology in new ways to improve performance and productivity
• May develop new tools to aid in the analysis and solving of problems
• Exercises judgment in selecting methods and techniques for obtaining solutions
• Receives little instruction on day-to-day work and general instructions on new assignments
• May influence the activities of junior-level personnel (exempt professional and non-exempt)
• Networks with senior internal and external personnel in own area of expertise; frequent inter-organizational and outside customer contacts

Qualifications
Minimum Education: Bachelor's degree or technical diploma in Computer Science, Electronic Engineering, Computer Engineering, or a related field
• 6+ years' experience working on an operations-style team (NOC, SOC, MOC, etc.), troubleshooting networking, service desk, operations center and/or supporting cloud-based infrastructure
• Proficient with Scala
• Experience with Kafka

Preferred Experience:
• Experience with Windows, Linux, web services, networking, databases, and cloud platforms (AWS, Azure, and GCP)
• Understanding of the following monitoring concepts: infrastructure, systems, and application health; system availability; latency; performance; and end-to-end monitoring
Entry level cloud, network or security certificate from Cisco, Microsoft, AWS, CompTIA, or other well-known vendors. Knowledge and experience of SIEM, ELK, PLG, and container orchestration platforms like Kubernetes are preferred Cyber Security related Certifications like Security+, SSCP, CCSP, and CEH. Knowledge of Markup, query, and scripting languages, including Python, HTML, PromQL, SQL and familiarity with REST API calls, and PowerShell. Experience (1+ years) with ITIL processes including Incident, Problem, Change, Knowledge and Event Management.
Posted 2 months ago
6 - 7 years
11 - 14 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Location: Remote / Pan India (Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune). Notice Period: Immediate. iSource Services is hiring for one of their clients for the position of Java Kafka Developer. We are seeking a highly skilled and motivated Confluent Certified Developer for Apache Kafka to join our growing team. The ideal candidate will possess a deep understanding of Kafka architecture, development best practices, and the Confluent platform. You will be responsible for designing, developing, and maintaining scalable and reliable Kafka-based data pipelines and applications. Your expertise will be crucial in ensuring the efficient and robust flow of data across our organization. Responsibilities: Develop Kafka producers, consumers, and stream processing applications. Implement Kafka Connect connectors and configure Kafka clusters. Optimize Kafka performance and troubleshoot related issues. Utilize Confluent tools like Schema Registry, Control Center, and ksqlDB. Collaborate with cross-functional teams and ensure compliance with data policies. Qualifications: Bachelor's degree in Computer Science or a related field. Confluent Certified Developer for Apache Kafka certification. Strong programming skills in Java/Python. In-depth Kafka architecture and Confluent platform experience. Experience with cloud platforms and containerization (Docker, Kubernetes) is a plus. Experience with data warehousing and data lake technologies. Experience with CI/CD pipelines and DevOps practices. Experience with Infrastructure as Code tools such as Terraform or CloudFormation.
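Several of the duties in the posting above (developing producers, configuring clusters) start from a producer configuration. The fragment below is a hedged sketch using standard kafka-clients property names; the broker address, client id, and the `orders` topic mentioned in the note afterward are illustrative assumptions, and in real code the returned `Properties` would be passed to `new KafkaProducer<>(props)` from the `org.apache.kafka:kafka-clients` dependency:

```java
import java.util.Properties;

public class OrderProducerConfig {
    // Producer settings using standard kafka-clients property names.
    // bootstrap.servers and client.id are illustrative assumptions.
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed local broker
        props.put("client.id", "orders-producer");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all");                // wait for all in-sync replicas
        props.put("enable.idempotence", "true"); // avoid duplicates on retry
        return props;
    }

    public static void main(String[] args) {
        // In a real application:
        // try (KafkaProducer<String, String> p = new KafkaProducer<>(producerProps())) { ... }
        System.out.println(producerProps());
    }
}
```

With a configuration like this, `producer.send(new ProducerRecord<>("orders", key, value))` routes all records sharing a key to the same partition, which is what preserves per-key ordering; `acks=all` plus idempotence trades some latency for durability.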
Posted 2 months ago
10 - 15 years
25 - 40 Lacs
Bengaluru
Hybrid
Welcome to HCL Software! We are looking for a Senior Software Engineer to help us build, optimize, and maintain inter-system information flows. As a Senior Software Engineer at HCL Software, you are an integral part of an agile team that works to enhance, build, and deliver trusted technology solutions. A strong academic background in combination with solid coding skills, understanding of software development methodologies, and good communication skills are key to success. You will design and develop solutions to ensure company information is stored and transferred across system boundaries in an effective, reliable, and secure fashion. YOUR AREA OF RESPONSIBILITY As a core technical contributor, you are responsible for designing and implementing critical technology solutions within various business functions. Design and implement creative software solutions, and conduct technical troubleshooting Develop secure high-quality production code, review and debug code written by your peers Drive outcomes-oriented workshops and work closely with the various functions to discover and take us through projects that improve our efficiency Analyze data storage, data transfers, and associated processes. Identify waste, and suggest solutions based on conceptual and logical data models and flowcharts. YOUR PROFILE MSc in Computer Science, or relevant adjacent field + formal training and certification on software engineering concepts 5+ years applied experience delivering system design, application development, testing, and operational stability Demonstrable experience with one or more of the following programming languages: TypeScript, Java, Scala, Rust. 
Expertise in Avro design, Kafka topics, exception handling, KStreams, and KSQL will help you deliver results quickly. Proficient in all aspects of the Software Development Life Cycle. Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security. Knowledge of systems such as Dynamics 365, NetSuite, and SQL, and of one or more ETL tools, is advantageous but not required. Practical cloud-native experience is an advantage.
Posted 2 months ago
7 - 10 years
9 - 12 Lacs
Bengaluru
Work from Office
Skills: Java 17, Spring Boot, Microservices Architecture, Docker, Kubernetes, AWS, RESTful APIs, Apache Kafka. Hiring: Java Architect (Onsite in Bangalore). Chervic Advisory Services brings you an exclusive contract opportunity to join a prestigious project and make a significant impact as a Java Architect. Position: Java Architect. Location: Bangalore (Onsite). Work Type: Contract. Duration: Minimum 3 Months (Extendable). Experience Required: 15+ Years. Relevant Experience: 10+ Years. Job Description. Technical Skills: Proficiency in Java and J2EE technologies. Strong understanding of core Java design patterns, EAI patterns, J2EE patterns, and best practices. Strong design documentation experience using UML. Experience with frameworks such as Spring (and its ecosystem), Hibernate, and others. Strong knowledge of OpenAPI, SOAP, REST, and GraphQL. Experience with Kafka, Kafka Streams, and Kafka Connect. Hands-on experience with containerization technologies like Docker and orchestration tools like Kubernetes. Knowledge of CI/CD pipelines and tools such as GitHub Actions, Azure DevOps, etc. Familiarity with database design (RDBMS as well as NoSQL DBs). Familiarity with workflow and rule engines (e.g., Drools, Activiti, Camunda). Familiarity with data serialization solutions like Google Protobuf would be a plus. Soft Skills: Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to communicate complex technical concepts to non-technical stakeholders. Experience with Agile development methodologies. Ready to apply? Send your CV to koyel@chervic.in. Contact us at 9635247380.
Posted 2 months ago
8 - 13 years
20 - 35 Lacs
Chennai
Hybrid
We are looking for someone with: Strong and demonstrable problem-solving ability. Comfortable with self-management and on-the-job learning. Ability to share knowledge across teams. Demonstrable initiative and logical thinking. Passion for emerging technologies and self-development. Strong computer science fundamentals. Collaborative work ethic. Strong problem-solving and analytical skills. Excellent communication skills. Knowledge of applying object-oriented and functional programming styles to real-world problems. Ideally (but not restrictively) you should have: Hands-on experience (5+ years) using Java and/or Scala. Knowledge of continuous integration and continuous delivery. Knowledge of microservice architecture. Working experience with TDD & BDD. Experience building REST APIs. Experience working with Docker. General knowledge of agile software development concepts and processes. Proficient understanding of code versioning tools, such as Git. Working experience with Jira and Confluence. Nice to haves: Special interest in functional programming. Knowledge of the Reactive Manifesto. Knowledge of streaming data. Experience with Akka, Play Framework, or Lagom. Experience working with Kafka. Knowledge of NoSQL. Cloud-based development with AWS, Microsoft Azure, Google Cloud, etc. Commercial exposure to the ELK stack.
Posted 3 months ago
5 - 10 years
35 - 50 Lacs
Chennai
Hybrid
We are looking for someone with: Strong and demonstrable problem-solving ability. Comfortable with self-management and on-the-job learning. Ability to share knowledge across teams. Demonstrable initiative and logical thinking. Passion for emerging technologies and self-development. Strong computer science fundamentals. Collaborative work ethic. Strong problem-solving and analytical skills. Excellent communication skills. Knowledge of applying object-oriented and functional programming styles to real-world problems. Ideally (but not restrictively) you should have: Hands-on experience (5+ years) using Scala and/or Java. Knowledge of continuous integration and continuous delivery. Knowledge of microservice architecture. Working experience with TDD & BDD. Experience building REST APIs. Experience working with Docker. General knowledge of agile software development concepts and processes. Proficient understanding of code versioning tools, such as Git. Working experience with Jira and Confluence. Nice to haves: Special interest in functional programming. Knowledge of the Reactive Manifesto. Knowledge of streaming data. Experience with Akka, Play Framework, or Spring. Experience working with Kafka. Knowledge of NoSQL. Cloud-based development with AWS, Microsoft Azure, Google Cloud, etc. Commercial exposure to the ELK stack.
Posted 3 months ago
5 - 7 years
11 - 20 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
Job Title: Kafka Developer. Location: Chennai / Hyderabad / Bangalore. Job Type: Full-Time. Experience: 5-9 Yrs. Introduction: We are seeking an experienced Kafka Developer with Java to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining real-time data streaming systems using Apache Kafka. You will work closely with cross-functional teams to build scalable, high-performance data pipelines and enable the efficient flow of data across various applications. Responsibilities: Design, implement, and maintain scalable, high-performance data streaming systems using Apache Kafka. Build and deploy Kafka topics, producers, and consumers for real-time data processing. Collaborate with backend engineers, data engineers, and other team members to integrate Kafka into various systems and platforms. Optimize Kafka clusters for performance, scalability, and high availability. Develop Kafka Streams applications for real-time data processing and transformation. Troubleshoot and resolve Kafka-related issues, including cluster performance, message processing, and data consistency problems. Implement security best practices within the Kafka ecosystem, including access control, encryption, and authentication. Monitor Kafka clusters and pipelines to ensure uptime and performance metrics are met. Ensure proper data governance and compliance measures are implemented across the Kafka pipeline. Develop and maintain documentation, including setup guides, technical specifications, and architecture diagrams. Stay up to date with the latest Kafka features, improvements, and industry best practices. Requirements: Proven experience as a Kafka Developer, Data Engineer, or similar role with hands-on expertise in Apache Kafka. Strong knowledge of Kafka's core concepts: topics, partitions, producers, consumers, brokers, and Kafka Streams. Experience with Kafka ecosystem tools like Kafka Connect, Kafka Streams, and KSQL. Expertise in Java for developing Kafka-based solutions. Experience in deploying and managing Kafka clusters in cloud environments (AWS, Azure, GCP). Strong understanding of distributed systems, message brokers, and data streaming architectures. Familiarity with stream processing and real-time data analytics. Experience in building, optimizing, and monitoring Kafka-based systems. Knowledge of containerization technologies (e.g., Docker, Kubernetes) for managing Kafka deployments. Excellent problem-solving skills and the ability to troubleshoot complex Kafka-related issues. Strong communication and collaboration skills for working in a team environment. Preferred Qualifications: Experience with other messaging systems like Apache Pulsar or RabbitMQ. Familiarity with data storage technologies like HDFS, NoSQL, or relational databases. Experience in DevOps practices and CI/CD pipelines. Knowledge of cloud-native architectures and microservices. Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). Why Join Us: Photon Interactive systems offers a dynamic and inclusive work environment with opportunities for personal and professional growth. Competitive salary and benefits package. Work with the latest technologies in the field of data streaming and big data analytics.
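The "topics, partitions, producers, consumers" concepts the posting above lists hinge on one mechanic: a keyed record is hashed to a fixed partition, which is what gives Kafka its per-key ordering guarantee. A simplified, dependency-free sketch (real Kafka uses a murmur2 hash over the serialized key bytes rather than `String.hashCode`, and the class and method names here are illustrative):

```java
public class PartitionSketch {
    // Simplified stand-in for Kafka's default partitioner: hash(key) mod partitionCount.
    // Kafka actually applies murmur2 to the serialized key; String.hashCode is used
    // here only so the example stays dependency-free.
    static int partitionFor(String key, int partitionCount) {
        return (key.hashCode() & 0x7fffffff) % partitionCount;
    }

    public static void main(String[] args) {
        int p = partitionFor("order-42", 6);
        // The same key always maps to the same partition, so every event for
        // "order-42" is consumed in order by whichever consumer owns partition p.
        System.out.println("order-42 -> partition " + p);
    }
}
```

Because assignment depends on the partition count, adding partitions to an existing topic remaps keys, which is why postings like this one ask for experience "optimizing" cluster and topic layout up front.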
Posted 3 months ago
2 - 7 years
8 - 18 Lacs
Bengaluru
Work from Office
DESCRIPTION AWS Sales, Marketing, and Global Services (SMGS) is responsible for driving revenue, adoption, and growth from the largest and fastest growing small- and mid-market accounts to enterprise-level customers including public sector. Amazon Web Services is the global market leader and technology forerunner in the Cloud business. As a member of the AWS Support team in Amazon Web Services, you will be at the forefront of this transformational technology, assisting a global list of companies and developers that are taking advantage of a growing set of services and features to run their mission-critical applications. As a Cloud Support Engineer, you will act as the 'Cloud Ambassador' across all the cloud products, arming our customers with required tools & tactics to get the most out of their Product and Support investment. Would you like to use the latest cloud computing technologies? Do you have an interest in helping customers understand application architectures and integration approaches? Are you familiar with best practices for applications, servers and networks? Do you want to be part of a customer-facing technology team in India helping to ensure the success of Amazon Web Services (AWS) as a leading technology organization? If you fit the description, you might be the person we are looking for! We are a team passionate about cloud computing, and believe that world-class support is critical to customer success. Key job responsibilities - Diagnose and resolve issues related to Kafka performance, connectivity, and configuration. - Monitor Kafka clusters and perform regular health checks to ensure optimal performance. - Collaborate with development teams to identify root causes of problems and implement effective solutions. - Provide timely and effective support to customers via email, chat, and phone. - Create and maintain documentation for troubleshooting procedures and best practices. 
- Assist in the deployment and configuration of Kafka environments, including brokers, producers, and consumers. - Conduct training sessions and provide knowledge transfer to team members. - You will be continuously learning groundbreaking technologies, and developing new technical skills and other professional competencies. - You will act as interviewer in hiring processes, and coach/mentor new team members. A day in the life • First and foremost this is a customer support role – in The Cloud. • On a typical day, a Support Engineer will be primarily responsible for solving customers' cases through a variety of customer contact channels which include telephone, email, and web/live chat. You will apply advanced troubleshooting techniques to provide tailored solutions for our customers and drive customer interactions by thoughtfully working with customers to dive deep into the root cause of an issue. • Apart from working on a broad spectrum of technical issues, an AWS Support Engineer may also coach/mentor new hires, develop & present training, partner with development teams on complex issues or contact deflection initiatives, participate in new hiring, write tools/scripts to help the team, or work with leadership on process improvement and strategic initiatives to ensure better CX and compliance with global AWS standards, practices and policies. • Career development: We promote advancement opportunities across the organization to help you meet your career goals. • Training: We have training programs to help you develop the skills required to be successful in your role. • We hire smart people who are keen to build a career with AWS, so we are more interested in the areas that you do know instead of those you haven't been exposed to yet. • Support engineers interested in travel have presented training or participated in focused summits across our sites or at specific AWS events. AWS Support is a 24/7/365 operation and shift work will be required in the afternoon, i.e. 
1 PM to 10 PM IST About the team Diverse Experiences AWS values diverse experiences. Even if you do not meet all of the preferred qualifications and skills listed in the job description, we encourage candidates to apply. If your career is just starting, hasn’t followed a traditional path, or includes alternative experiences, don’t let it stop you from applying. Why AWS? Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform. We pioneered cloud computing and never stopped innovating — that’s why customers from the most successful startups to Global 500 companies trust our robust suite of products and services to power their businesses. Inclusive Team Culture Here at AWS, it’s in our nature to learn and be curious. Our employee-led affinity groups foster a culture of inclusion that empower us to be proud of our differences. Ongoing events and learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences, inspire us to never stop embracing our uniqueness. Mentorship & Career Growth We’re continuously raising our performance bar as we strive to become Earth’s Best Employer. That’s why you’ll find endless knowledge-sharing, mentorship and other career-advancing resources here to help you develop into a better-rounded professional. Work/Life Balance We value work-life harmony. Achieving success at work should never come at the expense of sacrifices at home, which is why we strive for flexibility as part of our working culture. When we feel supported in the workplace and at home, there’s nothing we can’t achieve in the cloud. 
BASIC QUALIFICATIONS - Bachelor's degree OR equivalent experience in a technical position; requires a minimum of 2+ years' experience in a relevant technical position - Exposure to database fundamentals and general troubleshooting (tuning and optimization, deadlocks, keys, normalization) in any relational database engine (MySQL, PostgreSQL, Oracle, SQL Server) OR exposure to search services fundamentals and troubleshooting (indices, JVM memory analysis, and CPU utilization) for key open-source products like Elasticsearch and Solr OR exposure to streaming services like Kafka / Kinesis. - Experience in Business Analytics application, support, and troubleshooting concepts; experience with system administration and troubleshooting with Linux (Ubuntu, CentOS, RedHat) and/or Microsoft Windows Server and associated technologies (Active Directory); experience with networking and troubleshooting (TCP/IP, DNS, OSI model, routing, switching, firewalls, LAN/WAN, traceroute, iperf, dig, URL or related) PREFERRED QUALIFICATIONS - Experience in a customer support environment and experience in analyzing, troubleshooting, and providing solutions to technical issues - Knowledge of data warehousing and ETL processes - Understanding of cloud computing concepts; experience in scripting or developing in at least one of the following languages: Python, R, Ruby, Go, Java, .NET (C#), JavaScript - Expertise in any one data warehouse technology (for example Redshift, Teradata, Exadata, or Snowflake) OR expertise in search services products like Elasticsearch / Solr; expertise in streaming services like Kafka / Kinesis. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. 
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 3 months ago
6 - 8 years
8 - 12 Lacs
Chennai
Work from Office
Around 5+ years of hands-on experience in Java-based application development with integration into Kafka messaging systems. Mandatory: development experience implementing Spring, Spring Boot, and microservices for at least one year. Preferred: candidates with hands-on experience with Apache Kafka (producers, consumers, and stream processors). Familiarity with Kafka internals such as brokers, ZooKeeper, topics, and partitions. Familiarity with tools like Kafka Connect, Kafka Streams, and Schema Registry. Very strong hands-on experience with Java 8 features such as generics, exception handling, the Collections API, functional interfaces, multithreading, lambda expressions, the Stream API, etc. Mandatory knowledge of deploying microservices in an ECS environment (Kubernetes, Docker, Light speed, etc.). Knowledge of and experience with JUnit are a must. Experience in writing Oracle PL/SQL queries. Good to have: Angular, CSS, banking domain, capital markets.
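The Java 8 features this posting names (lambdas, functional interfaces, the Stream API, collections) compose naturally in a short, dependency-free sketch; the trade data, symbols, and the 100-unit threshold below are illustrative assumptions, not anything from the posting:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class Java8Sketch {
    // Predicate is a built-in functional interface; the caller supplies the
    // filter as a lambda. Stream API + Collectors do the grouping and summing.
    static Map<String, Integer> totalBySymbol(List<String[]> trades, Predicate<Integer> keep) {
        return trades.stream()
                .filter(t -> keep.test(Integer.parseInt(t[1])))   // lambda + Predicate
                .collect(Collectors.groupingBy(t -> t[0],         // group by symbol
                        Collectors.summingInt(t -> Integer.parseInt(t[1]))));
    }

    public static void main(String[] args) {
        List<String[]> trades = Arrays.asList(
                new String[]{"INFY", "100"},
                new String[]{"INFY", "250"},
                new String[]{"TCS", "50"});
        // Keep only trades of 100+ units, then total per symbol.
        System.out.println(totalBySymbol(trades, qty -> qty >= 100)); // {INFY=350}
    }
}
```

The same pipeline style (filter, map, group, aggregate) is what interviewers for roles like this typically probe, since it mirrors how records are transformed inside a Kafka consumer loop.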
Posted 3 months ago
7 - 12 years
25 - 32 Lacs
Gurgaon
Work from Office
Role & responsibilities Uses algorithms, data structures, programming languages, programming paradigms to create, test and operate sustainable client-side or server-side software applications and services. Builds and extends software applications in varying (cloud, hybrid cloud, and on-premise) environments. Delivers software that meet architectural and operational requirements and perform to expectations. Ensures applications are designed to be highly available, observable, and durable via software engineering best practices. Works with business and systems analysts to understand end-user requirements and translate those into pragmatic and effective technical solutions. Works closely with onsite and remote frontend, backend and operations engineers to ensure deliverables are well-documented, secure and resilient. Contribute to engineering automation, management or development of production level systems Partner with project management to deliver status and performance updates Collaborates with the engineering teams globally to deliver on integration projects. Demonstrates ability to develop and manage a strategy in a cross-functional environment Mentor engineers and ensure that all team projects are delivered following standards and best practices Engages in ongoing quality and performance improvement to ensure reliability and security. Preferred candidate profile 8+ years of engineering experience. Strong working experience in design and development using Java, Spring Boot and Microservices Experience with Confluent Kafka is a must. 2+ years of experience in Front End like ReactJS, Angular or equivalent is Required. Fluency with NoSQL databases (Mongo, Cassandra, or equivalent), RDBMS (Oracle, MySQL or equivalent). 1+ year(s) experience in Java based Content Management System (Contentful, Content stack, Liferay, Drupal, or equivalent) is preferred. Retail Industry background or e-commerce experience for candidates from other industries. 
Preferred: GraphQL experience or knowledge, and Azure Cloud experience is a plus. Hands-on code mindset with a deep understanding of technologies/skill sets and an ability to see the larger picture. Sound knowledge of architectural patterns, best practices, and non-functional requirements. Understanding of DevOps as well as experience with CI/CD pipelines. Exposure to Agile methodology and project tools: Jira, Confluence, SharePoint. Excellent team player, with the ability to work independently and as part of a team. Experience in mentoring junior developers and providing technical leadership. Ability to learn, understand, and work quickly with new and emerging technologies, methodologies, and solutions in the Cloud/IT technology space.
Posted 3 months ago
5 - 9 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Apigee. Good-to-have skills: Spring Boot. Minimum 5 years of experience is required. Educational Qualification: Minimum 15 years of full-time education. Key Responsibilities: 1. Designing, operationalizing, maintaining, and scaling Kafka clusters. 2. Strong experience with the Confluent/Apache Kafka framework, Kafka SQL (KSQL), and the Kafka Streams APIs. 3. Experience with messaging and stream-processing architectures built on Kafka. 4. Installation and setup of Kafka clusters. 5. Experience in developing with Kafka. Technical Experience: 1. Work experience: 4-6 years. 2. DevOps experience: CI/CD, monitoring and troubleshooting of systems; knowledge of cloud environments. 3. Experience with API concepts and technologies such as REST, JSON, XML, SOAP, YAML, GraphQL, and Swagger. 4. Experience with Kafka producer/consumer and microservices concepts and Kafka's distributed architecture. Professional Attributes: 1. Good communication skills. 2. Good analytical skills. 3. Should extend and work whenever required. Additional Info: Must-have: strong experience with Kafka. Good to have: telecom domain, experience with Agile.
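The KSQL and Kafka Streams experience this posting asks for centers on continuous aggregations over an event stream. Below is a dependency-free Java sketch of the state a Kafka Streams `groupByKey().count()` step materializes; the event names are illustrative, and real Kafka Streams would keep this table in a fault-tolerant state store backed by a changelog topic rather than a plain HashMap:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StreamCountSketch {
    // Mimics the table behind a groupByKey().count(): each incoming event key
    // bumps a running count, and the table always reflects the latest totals.
    static Map<String, Long> countByKey(List<String> eventKeys) {
        Map<String, Long> table = new HashMap<>();
        for (String key : eventKeys) {
            table.merge(key, 1L, Long::sum); // upsert semantics, like a changelog
        }
        return table;
    }

    public static void main(String[] args) {
        Map<String, Long> counts = countByKey(Arrays.asList("login", "login", "purchase"));
        System.out.println(counts.get("login"));    // 2
        System.out.println(counts.get("purchase")); // 1
    }
}
```

In KSQL the equivalent would be a `SELECT ... COUNT(*) ... GROUP BY` push query over an event stream; the point of the sketch is that the "table" is just continuously updated per-key state.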
Posted 3 months ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Apache Kafka. Good-to-have skills: Test Management, Selenium. Minimum 3 years of experience is required. Educational Qualification: any. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to understand the project requirements, designing and developing applications using Apache Kafka, and ensuring the applications meet the desired functionality and performance standards. You will also be responsible for troubleshooting and resolving any application-related issues that may arise. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation/contribution in team discussions is required. Contribute to providing solutions to work-related problems. Collaborate with the team to understand project requirements. Design and develop applications using Apache Kafka. Ensure applications meet the desired functionality and performance standards. Troubleshoot and resolve application-related issues. Continuously enhance and optimize application performance. Professional & Technical Skills: Must have: proficiency in Apache Kafka. Good to have: experience with Test Management and Selenium. Strong understanding of distributed messaging systems. Experience with real-time data streaming and processing. Hands-on experience with Kafka Connect and Kafka Streams. Knowledge of Kafka security and authentication mechanisms. Additional Information: The candidate should have a minimum of 3 years of experience in Apache Kafka. This position is based at our Bengaluru office.
Posted 3 months ago