
151 Apache Kafka Jobs - Page 5

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

0 - 0 Lacs

Hyderabad

Remote

Data Engineering / Big Data part-time work from home (anywhere in the world). Warm greetings from Excel Online Classes. We are a team of industry professionals running an institute that provides comprehensive online IT training, technical support, and development services. We are currently seeking Data Engineering / Big Data experts who are passionate about technology and can collaborate with us in their free time. If you're enthusiastic, committed, and ready to share your expertise, we would love to work with you! We're hiring for the following services: online training, online development, online technical support, conducting online interviews, corporate training, proof-of-concept (POC) projects, and research & development (R&D). We are looking for immediate joiners who can contribute in any of the above areas. If you're interested, please fill out the form using the link below: https://docs.google.com/forms/d/e/1FAIpQLSdvut0tujgMbBIQSc6M7qldtcjv8oL1ob5lBc2AlJNRAgD3Cw/viewform We also welcome referrals! If you know someone (friends, colleagues, or connections) who might be interested in teaching, developing, or providing tech support online; sharing domain knowledge (e.g., banking, insurance); teaching foreign languages (e.g., Spanish, German); learning or brushing up on technologies to clear interviews quickly; or upskilling in new tools or frameworks for career growth, please feel free to forward this opportunity to them. For any queries, contact us at excel.onlineclasses@gmail.com. Thank you & best regards, Team Excel Online Classes

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Navi Mumbai

Work from Office

Apache Kafka (Kafka Connect, Schema Registry, Kafka Streams); Kafka cluster monitoring with Prometheus, Grafana, and the ELK stack; Linux/Unix with Bash and Python scripting; automation and CI/CD tooling (Ansible, Jenkins, Git). Experience with managed Kafka offerings (Confluent Cloud, AWS MSK) is a plus.
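Roles like this one lean heavily on Kafka Connect, where integrations are defined declaratively rather than coded. As a minimal illustrative sketch (the connector name, file path, and topic below are made up), a source connector using the FileStreamSource class that ships with Apache Kafka might look like:

```json
{
  "name": "demo-file-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/demo-input.txt",
    "topic": "demo-topic"
  }
}
```

A payload of this shape is typically submitted to the Connect REST API (`POST /connectors`); production deployments would more often use a purpose-built connector plus Schema Registry rather than the file connector shown here.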

Posted 1 month ago

Apply

5.0 - 9.0 years

25 - 30 Lacs

Chennai

Work from Office

Key Responsibilities:
- Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink.
- Build scalable and automated MLOps pipelines for model training, validation, and deployment using AWS SageMaker and related services.
- Implement and manage Infrastructure as Code (IaC) using Terraform for AWS provisioning and maintenance.
- Collaborate with ML, Data Science, and DevOps teams to ensure reliable and efficient model deployment workflows.
- Optimize data storage and retrieval strategies for both structured and unstructured large-scale datasets.
- Integrate and transform data from multiple sources into data lakes and data warehouses.
- Monitor, troubleshoot, and improve performance of cloud-native data systems in a fast-paced production setup.
- Ensure compliance with data governance, privacy, and security standards across all data operations.
- Document data engineering workflows and architectural decisions for transparency and maintainability.

Required Skills & Qualifications:
- 5+ years of experience as a Data Engineer or in a similar role.
- Proven experience in building data pipelines and streaming applications using Apache Kafka and Apache Flink.
- Strong ETL development skills, with a deep understanding of data modeling and data architecture in large-scale environments.
- Hands-on experience with AWS services, including SageMaker, S3, Glue, Lambda, and CloudFormation or Terraform.
- Proficiency in Python and SQL; knowledge of Java is a plus, especially for streaming use cases.
- Strong grasp of MLOps best practices, including model versioning, monitoring, and CI/CD for ML pipelines.
- Deep knowledge of IaC tools, particularly Terraform, for automating cloud infrastructure.
- Excellent analytical and problem-solving abilities, especially with regard to data processing and deployment issues.
- Agile mindset with experience working in fast-paced, iterative development environments.
- Strong communication and team collaboration skills.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Navi Mumbai

Work from Office

Apache Kafka (Kafka Connect, Schema Registry, Kafka Streams); Kafka cluster monitoring with Prometheus, Grafana, and the ELK stack; Linux/Unix with Bash and Python scripting; automation and CI/CD tooling (Ansible, Jenkins, Git). Experience with managed Kafka offerings (Confluent Cloud, AWS MSK) is a plus. Required candidate profile: CBD Belapur, Navi Mumbai.

Posted 1 month ago

Apply

2.0 - 7.0 years

10 - 20 Lacs

Bengaluru

Hybrid

Software Engineer – Billing Automation Integration: 2–5 years' experience; Golang/Java (must), Kafka, MySQL, Kubernetes, API development, billing systems; DevOps mindset. Location: Bengaluru (hybrid). 9-month contract-to-hire (C2H) at TE Infotech (Exotel), convertible to permanent. Apply: ssankala@toppersedge.com

Posted 1 month ago

Apply

1.0 years

3 - 11 Lacs

IN

Remote

About the job: Hi, we are Sentics! We are a technology start-up that develops AI-based camera systems for industrial applications. Areas of application for our customers include safety/accident prevention, infrastructure automation, and human-machine collaboration. Our customers range from SMEs to OEMs from various industries. We are now looking for support for the next step - that means you! Key responsibilities: A. Network configuration and VPN management: 1. Configuration and monitoring of network settings on the router. 2. Management and optimization of VPN services for secure communication. 3. Allocation and monitoring of ports for hosted services. B. Remote lifecycle management: 1. Development of remote lifecycle management tools for our products. C. Git management and administration: 1. Active management of Git repositories, including user administration and permissions. 2. Implementation and maintenance of Continuous Integration/Continuous Deployment (CI/CD) pipelines. 3. Administration of Git runners and assistance in integrating new development tools. D. Backup strategies and data security: 1. Implementation and monitoring of backup solutions for our server infrastructure. 2. Ensuring data security and integrity. E. Management of server services: 1. Monitoring and administration of all Docker containers and services for development processes and cloud-based products. 2. Development and scaling of applications in containerized environments. Our benefits: 1. A young, dynamic team with flat hierarchies. 2. The opportunity to work on exciting projects and contribute your ideas. 3. Flexible working hours and the option to work from home. 4. Fair remuneration and the chance for long-term cooperation. 
Who can apply: Only candidates who have a minimum of 1 year of experience and are Computer Science Engineering students. Salary: ₹ 3,50,000 - 11,00,000 /year. Experience: 1 year(s). Deadline: 2025-07-30 23:59:59. Other perks: 5 days a week. Skills required: OpenCV, Computer Vision, C++ Programming and Apache Kafka. Other Requirements: 1. C++ coding [at least 4 years]. 2. Data engineering / handling big data / ADAS. 3. Proficiency in sensor fusion and Kalman filters. 4. Leading engineer in autonomous driving / autonomous vacuum / flying. 5. Leading a team of 5 – 6 people over the longer term. 6. Experience in remote lifecycle management of products is advantageous. 7. Should know how to set up software and test cases. 8. Ability to design a very efficient data structure to handle thousands of data points per second. 9. High motivation to learn new topics, ability to work independently, and willingness to adapt to flexible tasks. 10. Knowledge of network configurations, VPN management, and port allocations. 11. Solid experience in Git repository management and CI/CD pipelines. 12. Experience in managing Docker containers and containerized environments. 13. Competence in server backup and data security. 14. Team player with quick perception, reliability, and accuracy. 15. Contributions to open-source projects (you're welcome to share links to your contributions/GitHub account). About Company: Sentics GmbH helps you increase both your production and storage potential and make your environment safe with only one system.

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 15 Lacs

Hyderabad, Bengaluru

Work from Office

We are hiring a Microservices Lead with strong expertise in designing and developing scalable applications using Java, Spring Boot, and Python. The ideal candidate should have deep experience with microservices architecture, container orchestration using Kubernetes, cloud deployment on Azure, and event streaming with Apache Kafka. Familiarity with monitoring tools such as OpenSearch, AppInsights, and Grafana is highly desirable. This role requires technical leadership, hands-on coding skills, and experience in guiding teams in an Agile/DevOps environment. Location - Hyderabad / Bangalore / Remote.

Posted 1 month ago

Apply

10.0 - 15.0 years

96 - 108 Lacs

Bengaluru

Work from Office

Responsibilities: * Design data solutions using Java, Python & Apache Spark. * Collaborate with cross-functional teams on Azure cloud projects. * Ensure data security through Redis caching and HDFS storage.

Posted 1 month ago

Apply

6.0 - 8.0 years

30 - 40 Lacs

Bengaluru

Work from Office

Lynx is a back-office (BO) application facilitating Clearing & Settlement within PB. It also handles other streams related to Margin & Cash Management, Client Billing, Firm Clearing Cost Allocation, Asset Servicing, etc. We are looking for a result-oriented Senior Developer with 5 to 7 years of experience who can work independently, liaise with onshore stakeholders, and complete tasks within the timelines. The candidate should be technically inclined, have proven skills in the latest development stack, and be able to help junior team members.

Responsibilities

Direct Responsibilities:
1. Application development / support / enhancements / bug-fixing.
2. Demonstrate a good understanding of the functional aspects of the application.
3. Report progress to the Team Lead.
4. Escalate problems to the local Team Lead / management.
5. Ensure that project and organization standards are followed during the various phases of the software development life cycle and day-to-day development work.
6. Maintain administration tasks, i.e. Jira to record progress against tasks, Wiki for documentation.
7. Meet deliverable deadlines.
8. Keep the Team Lead apprised of issues that affect the project and its deliverables.

Contributing Responsibilities: Sprint & Release Planning.

Technical & Behavioral Competencies:
1) Java SDK 1.8 (data structures, data encapsulation and inheritance, exception handling; exposure to newer features, i.e. lambdas and streams, is preferable)
2) Advanced Java SDK 1.8 (multi-threading, thread executor framework, transaction management, etc.)
3) Hands-on experience with Hibernate
4) Web services (REST/WSDL)
5) Spring 4 (IoC / DI and core Spring features) and exposure to Spring Boot, Spring MVC, Spring JPA and Spring JDBC
6) UI framework exposure: hands-on experience with any UI framework (AngularJS, Ext JS, jQuery, GWT/GXT or similar); willingness to work on GWT/GXT
7) SQL / PL-SQL: the database we use is Sybase 16; hands-on experience with SQL queries and PL/SQL code, plus mandatory experience with Oracle, SQL Server, MySQL or an equivalent database
8) Hands-on experience writing JUnit test cases using JUnit, Mockito, SpringJUnit
9) Hands-on experience with UNIX
10) Hands-on experience with Maven, Jenkins and build/deployment tools
11) Hands-on experience with SVN / Git / other version-control tools
12) Hands-on experience with and exposure to the full SDLC lifecycle
13) Hands-on experience with Spring microservices
14) Hands-on experience with Apache Kafka
15) Hands-on experience with any middleware technologies (IBM MQ, TIBCO, Solace or equivalent)
16) Nice to have: knowledge of IBM MQ or exposure to any other middleware technologies
17) Knowledge of Perl, Python or Unix shell scripting is mandatory
18) Nice to have: knowledge of Autosys or other scheduling tools such as cron jobs
19) Able to work locally with peer teams and remotely with the BA and project lead
20) Business / technical spoken and written English

Behavioural Skills: Ability to collaborate / teamwork; ability to deliver / results-driven; client-focused; communication skills, oral & written.

Education Level: Bachelor's degree or equivalent. Experience Level: At least 5 years. Mandatory Skills: Java, Spring, Hibernate, PL/SQL, any one front-end UI framework, CIB experience / organization. Good to have: Capital Markets experience.

Posted 1 month ago

Apply

6.0 - 8.0 years

27 - 30 Lacs

Pune, Ahmedabad, Chennai

Work from Office

Technical Skills - Must Have:
- 8+ years overall IT industry experience, with 5+ years in a solution or technical architect role using service and hosting solutions such as private/public cloud IaaS, PaaS and SaaS platforms.
- 5+ years of hands-on development experience with event-driven-architecture-based implementations.
- One or more of the typical solution and technical architecture certifications, e.g. Microsoft / MS Azure Certification, TOGAF, AWS Cloud Certified, SAFe, PMI, SAP, etc.
- Hands-on experience with:
  - Claims-based authentication (SAML/OAuth/OIDC), MFA, JIT, and/or RBAC / Ping etc.
  - Architecting mission-critical technology components with DR capabilities.
  - Multi-geography, multi-tier service design and management.
  - Project financial management, solution plan development and product cost estimation.
  - Supporting peer teams and their responsibilities, such as infrastructure, operations, engineering, info-security.
  - Configuration management and automation tools such as Azure DevOps, Ansible, Puppet, Octopus, Chef, Salt, etc.
  - Software development full-lifecycle methodologies, patterns, frameworks, libraries and tools.
  - Relational, graph and/or unstructured data technologies such as SQL Server, Azure SQL, Cosmos, Azure Data Lake, HDInsight, Hadoop, Neo4j etc.
  - Data management and data governance technologies.
  - Data movement and transformation technologies.
  - AI and machine learning tools such as Azure ML etc.
  - Architecting mobile applications that are either independent applications or supplementary add-ons (to an intranet or extranet).
  - Cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
  - Apache Kafka, Confluent Kafka, Kafka Streams, and Kafka Connect.
- Proficient in NodeJS, Java, Scala, or Python.
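The claims-based authentication requirement above (SAML/OAuth/OIDC) ultimately comes down to verifying a signed set of claims. As a library-free sketch of one small piece of that picture (HS256 JWT signing and verification in plain Python; the key, claim names, and function names are illustrative, and real systems should use a vetted library and also validate exp/aud/iss):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with the "=" padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def _unb64url(s: str) -> bytes:
    # Restore the stripped padding before decoding
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def make_hs256_jwt(claims: dict, secret: bytes) -> str:
    """Build an HS256-signed JWT. Illustrative only."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    sig = hmac.new(secret, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    return f"{header}.{payload}.{_b64url(sig)}"

def verify_hs256_jwt(token: str, secret: bytes) -> dict:
    """Check the signature and return the claim set, or raise ValueError."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(
        secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256
    ).digest()
    # Constant-time comparison to avoid timing side channels
    if not hmac.compare_digest(expected, _unb64url(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(_unb64url(payload_b64))

token = make_hs256_jwt({"sub": "user-1", "role": "engineer"}, b"demo-secret")
print(verify_hs256_jwt(token, b"demo-secret")["role"])  # engineer
```

In the OIDC flows the listing names, tokens are usually RS256-signed by the identity provider and verified against its published public keys, so only the verification half runs in the application.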

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Nagpur

Work from Office

Your Role : We're looking for a Senior Java Engineer who is passionate about technology and eager to solve complex problems. You'll join a diverse team of product managers, engineers, and designers, working together to build scalable, robust backend solutions in the accounting and finance space. What You'll Do : - Innovate and Build : Design, build, and maintain a high-performance platform for accounting and finance with an uncompromising focus on data integrity. - Develop and Deploy : Create solutions in Java/Spring Boot that provide seamless experiences for our users and deploy them into production. - Collaborate and Scale : Work closely with Product, Frontend, and DevOps teams to ensure our solutions are scalable and extensible. - Quality Focus : Ensure high product quality and user experience by addressing performance bottlenecks and debugging issues quickly. - Contribute and Grow : Participate in solution design and code reviews while evangelizing best practices and engineering hygiene. What We're Looking For : - Educational Background : Bachelor's degree in Computer Science, Information Technology, or a related field. - Technical Expertise : Proven experience in building highly scalable, high-performance applications. - Java Proficiency : Extensive hands-on experience with Java and Spring Boot, particularly developing API-first solutions using GraphQL and REST. - Solution Design and Architecture : Strong understanding of software design patterns, microservices architecture, and designing scalable solutions. Experience with API design and best practices. - Database Knowledge : Proficient in database design and management for both SQL (PostgreSQL/ MySQL) and NoSQL (Redis/MongoDB/Cassandra) databases. - Cloud Experience : Hands-on experience with cloud platforms such as AWS, including services like AWS Lambda, EC2, ECS, S3, and RDS. Experience with serverless architectures is a plus. 
- Messaging Systems : Familiarity with message streaming/queuing systems such as Apache Kafka, RabbitMQ, AWS SQS/SNS/Kinesis. - DevOps Skills : Experience with CI/CD pipelines, containerization (Docker), and orchestration tools (Kubernetes). - Security Best Practices : Knowledge of security principles and best practices for building secure applications, including authentication, authorization, and encryption. - Problem-Solving Skills : Strong analytical skills with the ability to troubleshoot and resolve complex issues efficiently. - Performance Optimization : Experience in identifying and addressing performance bottlenecks within applications and infrastructure. - Collaboration and Communication : Excellent interpersonal skills, with the ability to work effectively in a team environment and communicate technical concepts clearly to non-technical stakeholders. - Agile Methodologies : Familiarity with Agile methodologies and experience working in an Agile/Scrum environment. - Continuous Learning : A proactive mindset for continuous learning and staying updated with the latest industry trends and technologies. Why You'll Love Working Here : - Impactful Work : Be a part of building the next-gen AI-first accounting and finance platform. - Collaborative Culture : Work in a team-oriented environment where your contributions are valued. - Professional Growth : Opportunities for career development and learning new technologies. - Competitive Benefits : Enjoy a competitive salary and benefits package.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Kolkata

Work from Office

Your Role : We're looking for a Senior Java Engineer who is passionate about technology and eager to solve complex problems. You'll join a diverse team of product managers, engineers, and designers, working together to build scalable, robust backend solutions in the accounting and finance space. What You'll Do : - Innovate and Build : Design, build, and maintain a high-performance platform for accounting and finance with an uncompromising focus on data integrity. - Develop and Deploy : Create solutions in Java/Spring Boot that provide seamless experiences for our users and deploy them into production. - Collaborate and Scale : Work closely with Product, Frontend, and DevOps teams to ensure our solutions are scalable and extensible. - Quality Focus : Ensure high product quality and user experience by addressing performance bottlenecks and debugging issues quickly. - Contribute and Grow : Participate in solution design and code reviews while evangelizing best practices and engineering hygiene. What We're Looking For : - Educational Background : Bachelor's degree in Computer Science, Information Technology, or a related field. - Technical Expertise : Proven experience in building highly scalable, high-performance applications. - Java Proficiency : Extensive hands-on experience with Java and Spring Boot, particularly developing API-first solutions using GraphQL and REST. - Solution Design and Architecture : Strong understanding of software design patterns, microservices architecture, and designing scalable solutions. Experience with API design and best practices. - Database Knowledge : Proficient in database design and management for both SQL (PostgreSQL/ MySQL) and NoSQL (Redis/MongoDB/Cassandra) databases. - Cloud Experience : Hands-on experience with cloud platforms such as AWS, including services like AWS Lambda, EC2, ECS, S3, and RDS. Experience with serverless architectures is a plus. 
- Messaging Systems : Familiarity with message streaming/queuing systems such as Apache Kafka, RabbitMQ, AWS SQS/SNS/Kinesis. - DevOps Skills : Experience with CI/CD pipelines, containerization (Docker), and orchestration tools (Kubernetes). - Security Best Practices : Knowledge of security principles and best practices for building secure applications, including authentication, authorization, and encryption. - Problem-Solving Skills : Strong analytical skills with the ability to troubleshoot and resolve complex issues efficiently. - Performance Optimization : Experience in identifying and addressing performance bottlenecks within applications and infrastructure. - Collaboration and Communication : Excellent interpersonal skills, with the ability to work effectively in a team environment and communicate technical concepts clearly to non-technical stakeholders. - Agile Methodologies : Familiarity with Agile methodologies and experience working in an Agile/Scrum environment. - Continuous Learning : A proactive mindset for continuous learning and staying updated with the latest industry trends and technologies. Why You'll Love Working Here : - Impactful Work : Be a part of building the next-gen AI-first accounting and finance platform. - Collaborative Culture : Work in a team-oriented environment where your contributions are valued. - Professional Growth : Opportunities for career development and learning new technologies. - Competitive Benefits : Enjoy a competitive salary and benefits package.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Mumbai

Work from Office

Your Role : We're looking for a Senior Java Engineer who is passionate about technology and eager to solve complex problems. You'll join a diverse team of product managers, engineers, and designers, working together to build scalable, robust backend solutions in the accounting and finance space. What You'll Do : - Innovate and Build : Design, build, and maintain a high-performance platform for accounting and finance with an uncompromising focus on data integrity. - Develop and Deploy : Create solutions in Java/Spring Boot that provide seamless experiences for our users and deploy them into production. - Collaborate and Scale : Work closely with Product, Frontend, and DevOps teams to ensure our solutions are scalable and extensible. - Quality Focus : Ensure high product quality and user experience by addressing performance bottlenecks and debugging issues quickly. - Contribute and Grow : Participate in solution design and code reviews while evangelizing best practices and engineering hygiene. What We're Looking For : - Educational Background : Bachelor's degree in Computer Science, Information Technology, or a related field. - Technical Expertise : Proven experience in building highly scalable, high-performance applications. - Java Proficiency : Extensive hands-on experience with Java and Spring Boot, particularly developing API-first solutions using GraphQL and REST. - Solution Design and Architecture : Strong understanding of software design patterns, microservices architecture, and designing scalable solutions. Experience with API design and best practices. - Database Knowledge : Proficient in database design and management for both SQL (PostgreSQL/ MySQL) and NoSQL (Redis/MongoDB/Cassandra) databases. - Cloud Experience : Hands-on experience with cloud platforms such as AWS, including services like AWS Lambda, EC2, ECS, S3, and RDS. Experience with serverless architectures is a plus. 
- Messaging Systems : Familiarity with message streaming/queuing systems such as Apache Kafka, RabbitMQ, AWS SQS/SNS/Kinesis. - DevOps Skills : Experience with CI/CD pipelines, containerization (Docker), and orchestration tools (Kubernetes). - Security Best Practices : Knowledge of security principles and best practices for building secure applications, including authentication, authorization, and encryption. - Problem-Solving Skills : Strong analytical skills with the ability to troubleshoot and resolve complex issues efficiently. - Performance Optimization : Experience in identifying and addressing performance bottlenecks within applications and infrastructure. - Collaboration and Communication : Excellent interpersonal skills, with the ability to work effectively in a team environment and communicate technical concepts clearly to non-technical stakeholders. - Agile Methodologies : Familiarity with Agile methodologies and experience working in an Agile/Scrum environment. - Continuous Learning : A proactive mindset for continuous learning and staying updated with the latest industry trends and technologies. Why You'll Love Working Here : - Impactful Work : Be a part of building the next-gen AI-first accounting and finance platform. - Collaborative Culture : Work in a team-oriented environment where your contributions are valued. - Professional Growth : Opportunities for career development and learning new technologies. - Competitive Benefits : Enjoy a competitive salary and benefits package.

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 55 Lacs

Gurugram

Work from Office

Job Summary: We are seeking an experienced Engineering Manager to lead our software development team. The ideal candidate will have a strong background in Java, Spring Boot, databases, Kafka, and system design. As an Engineering Manager, you will be responsible for managing and mentoring a team of engineers, driving technical projects, and ensuring the successful delivery of high-quality software solutions. Key Responsibilities: Team Leadership: Lead and manage a team of software engineers, fostering a collaborative and high-performance culture. Mentor and provide guidance to team members, supporting their professional growth and development. Conduct regular performance reviews and provide constructive feedback. Technical Leadership: Drive the architectural design and implementation of complex software systems using Java, Spring Boot, and related technologies. Oversee the integration of Kafka for real-time data processing and messaging. Ensure best practices in database design, development, and optimization. Collaborate with product managers, designers, and other stakeholders to translate business requirements into technical solutions. Conduct code reviews, ensure code quality, and enforce coding standards. Project Management: Plan, prioritize, and manage multiple projects simultaneously, ensuring on-time delivery. Coordinate with cross-functional teams to align on project goals, timelines, and deliverables. Identify potential risks and implement mitigation strategies. System Design: Lead the design and development of scalable, reliable, and maintainable software systems. Evaluate and recommend tools, technologies, and processes to improve the development workflow. Ensure the security, performance, and scalability of systems. Communication: Communicate technical concepts and project status effectively to both technical and non-technical stakeholders. Foster a culture of transparency and open communication within the team. 
Required Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of software development experience, focusing on Java and Spring Boot. 3+ years of experience in a leadership or management role. Strong expertise in system design, with a track record of building scalable and robust software systems. Proficiency in working with relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB). Experience with Apache Kafka or similar messaging systems. Strong problem-solving skills and a hands-on approach to troubleshooting complex issues. Excellent communication and interpersonal skills. Preferred Qualifications: Experience with cloud platforms (AWS, Azure, Google Cloud). Familiarity with microservices architecture and containerization (Docker, Kubernetes). Understanding of DevOps practices and CI/CD pipelines. Experience with Agile development methodologies.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

The Platform Data Engineer will be responsible for designing and implementing robust data platform architectures, integrating diverse data technologies, and ensuring scalability, reliability, performance, and security across the platform. The role involves setting up and managing infrastructure for data pipelines, storage, and processing, developing internal tools to enhance platform usability, implementing monitoring and observability, collaborating with software engineering teams for seamless integration, and driving capacity planning and cost optimization initiatives.

Posted 1 month ago

Apply

6.0 - 7.0 years

11 - 14 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Location: Remote / Pan India (Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune). Notice Period: Immediate. iSource Services is hiring for one of their clients for the position of Java Kafka Developer. We are seeking a highly skilled and motivated Confluent Certified Developer for Apache Kafka to join our growing team. The ideal candidate will possess a deep understanding of Kafka architecture, development best practices, and the Confluent platform. You will be responsible for designing, developing, and maintaining scalable and reliable Kafka-based data pipelines and applications. Your expertise will be crucial in ensuring the efficient and robust flow of data across our organization. Develop Kafka producers, consumers, and stream processing applications. Implement Kafka Connect connectors and configure Kafka clusters. Optimize Kafka performance and troubleshoot related issues. Utilize Confluent tools like Schema Registry, Control Center, and ksqlDB. Collaborate with cross-functional teams and ensure compliance with data policies. Qualifications: Bachelor's degree in Computer Science or related field. Confluent Certified Developer for Apache Kafka certification. Strong programming skills in Java/Python. In-depth Kafka architecture and Confluent platform experience. Experience with cloud platforms and containerization (Docker, Kubernetes) is a plus. Experience with data warehousing and data lake technologies. Experience with CI/CD pipelines and DevOps practices. Experience with Infrastructure as Code tools such as Terraform or CloudFormation.
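The stream-processing work this listing describes (producers, consumers, Kafka Streams applications) centers on stateful aggregation over a stream of records. As a broker-free illustration in plain Python (no Kafka client involved; the function and variable names are our own), the stateful core of the canonical Kafka Streams word-count example can be sketched as:

```python
from collections import defaultdict

def word_count(records):
    """Simulate the aggregation at the heart of the classic Kafka Streams
    word-count topology: each input record is a line of text, the plain
    dict stands in for the state store backing a KTable, and the result
    is the running count per word after all records are processed."""
    store = defaultdict(int)
    for line in records:
        for word in line.lower().split():
            store[word] += 1
    return dict(store)

counts = word_count(["hello kafka", "hello streams"])
print(counts)  # {'hello': 2, 'kafka': 1, 'streams': 1}
```

In a real Kafka Streams application the same logic is expressed with `flatMapValues`, `groupBy`, and `count()` over a changelog-backed state store, which is what makes the aggregation fault-tolerant across restarts.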

Posted 1 month ago

Apply

7.0 - 9.0 years

11 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Location: Remote / Pan India (Hyderabad, Ahmedabad, Pune, Chennai, Kolkata). Notice Period: Immediate. iSource Services is hiring for one of their clients for the position of RoR Engineer. About the Role - An RoR Engineer is responsible for maintaining all the applications, i.e. the primary back-end application API, the order admin tool, the eCommerce application based on Solidus, and various supporting services used by our fulfilment partners and web and mobile customer-facing applications. Roles & Responsibilities: Primary technology: Ruby on Rails. Monitoring the #escalated-support and #consumer-eng Slack channels and addressing any issues that require technical assistance. Monitoring logs (via Rollbar / Datadog) and resolving any errors. Monitoring Sidekiq's job morgue and addressing any dead jobs. Maintaining libraries in all applications with security updates. Understanding security requirements and scope. Must have knowledge of databases like MySQL, PostgreSQL, SQLite. Good knowledge of deploying applications on a server. 7 years in RoR and 3 years in AngularJS.

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai, Pune, Chennai

Work from Office

We are seeking a proficient and result-oriented Java/Spring Boot Developer with hands-on experience in building scalable and high-performance systems. Job Description: Expert in Core Java (Java 11 or 17), J2EE, Spring Boot, JUnit. Should have knowledge of NodeJs, Maven, GitHub/Bitbucket. Experience with RESTful services, RabbitMQ, ActiveMQ, JSON, GraphQL, Apache Kafka & PostgreSQL is a plus. Excellent problem-solving/troubleshooting skills in Java/J2EE technologies. Roles & Responsibilities: Participate in system design discussions, planning and performance tuning. Write clean, testable and scalable code following industry best practices. Resolve technical issues through debugging, research, and investigation. Work in an Agile/Scrum development lifecycle and participate in daily stand-ups and sprint planning. Complete assigned tasks within the given timelines. Continuously learn new technologies.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Thane

Work from Office

Position Purpose: In the context of application development for the Compliance domain of BNPP, the developer will be part of a team of developers, align with the local team lead, take ownership, and deliver quality for all the user stories worked upon. We are looking for a highly skilled backend developer with strong experience in Java 8+, Spring Boot, and microservices. The candidate should be comfortable designing and developing scalable backend solutions with NoSQL databases such as MongoDB. Direct Responsibilities: Design and develop backend services using Java 8+, Spring Boot, and JUnit. Build and maintain robust RESTful APIs. Integrate with MongoDB and ensure performance and security. Ensure coding standards are followed. Ensure collaboration, good rapport, and teamwork with ISPL and Paris team members. Contributing Responsibilities: Take ownership of and commit to quality deliverables within estimated timelines, avoiding global schedule shift. Participate in code reviews and the documentation process. Contribute to continuous improvement in development practices, processes, and code quality. Participate in project meetings: fine-tuning, daily stand-up, retrospective. Collaborate with team members: the ability to collect, analyze, synthesize, and present information in a clear, concise, and precise way. Technical & Behavioural Competencies: Expert in Java 8+ and Spring Boot. RESTful API and microservices architecture. Hands-on experience with MongoDB. Apache Kafka for messaging. JUnit and Spring Boot testing frameworks, and code-quality tools like Sonar. API gateways like Apigee and authentication strategies. Clean coding practices. Maven and Swagger tools.
Good to have: Familiarity with payment systems or related compliance-driven systems. Knowledge of Docker, Kubernetes, and CI/CD pipelines using GitLab. Angular 2+ and TypeScript, including knowledge of PrimeNG and/or Material UI. Experience with integrated AI tools and knowledge of efficient prompting. Knowledge of web security principles (OWASP, two-factor authentication, encryption, etc.). Knowledge of hexagonal architecture, event-oriented architecture, and DDD. Specific Qualifications: Experience in Linux, DevOps, IntelliJ, GitLab (CI/CD pipelines), Cloud Object Storage, Kafka. Behavioural Skills: Ability to collaborate / teamwork; attention to detail / rigor; communication skills, oral and written; ability to deliver / results-driven. Transversal Skills: Analytical ability; ability to develop and adapt a process. Education Level: Bachelor's degree or equivalent. Experience Level: At least 3 years.
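The BNPP posting above centers on RESTful backend services over a document store. The posting's stack is Spring Boot with MongoDB; purely as a language-neutral sketch of the request/response contract such a service exposes, here is a minimal stdlib-Python JSON endpoint with an in-memory dict standing in for the database (the `/accounts` route and the account fields are hypothetical, not taken from the posting):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory store standing in for MongoDB.
ACCOUNTS = {"42": {"id": "42", "status": "compliant"}}

class AccountHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route shape: GET /accounts/<id>
        _, resource, account_id = self.path.split("/", 2)
        account = ACCOUNTS.get(account_id) if resource == "accounts" else None
        body = json.dumps(account if account else {"error": "not found"}).encode()
        self.send_response(200 if account else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve from a background thread.
server = HTTPServer(("127.0.0.1", 0), AccountHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/accounts/42"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
server.shutdown()
print(payload)  # {'id': '42', 'status': 'compliant'}
```

In a Spring Boot service the same contract would be a `@GetMapping` handler returning a serialized entity; only the wire shape (resource path in, JSON body and status code out) carries over from this sketch.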

Posted 1 month ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Gurugram, Bengaluru

Hybrid

Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED. Contract-to-Hire (C2H) role. Location: Gurgaon / Bengaluru. Payroll: BCforward. Work Mode: Hybrid. JD Skills: Java; Apache Kafka; AWS; Spring; microservices; event-driven architecture. Deep knowledge of Java, Spring, and Kafka, with strong hands-on coding and analytical skills. Please share your updated resume, PAN card soft copy, passport-size photo, and UAN history. Interested applicants can share an updated resume with g.sreekanth@bcforward.com. Note: Looking for immediate to 30-day joiners at most. All the best!

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

Expertise in the following areas: Java, Spring MVC, Spring Boot, Docker, MariaDB, MongoDB, NoSQL, Maven, JUnit, Mockito, SAML, XML, object-oriented design and development, Apache Ant, relational databases (MySQL), Hibernate, HTML, J2EE 1.6+, PostgreSQL

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Monday to Friday (WFO); Timings: 9 am to 6 pm. Desired Skills & Expertise: Strong experience and mathematical understanding in one or more of Natural Language Understanding, Computer Vision, Machine Learning, and Optimization. Proven track record in effectively building and deploying ML systems using frameworks such as PyTorch, TensorFlow, Keras, scikit-learn, etc. Expertise in modular, typed, and object-oriented Python programming. Proficiency with core data science languages (such as Python, R, Scala), and familiarity & flexibility with data systems (e.g., SQL, NoSQL, knowledge graphs). Experience with financial data analysis, time series forecasting, and risk modeling. Knowledge of financial regulations and compliance requirements in the fintech industry. Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization technologies (Docker, Kubernetes). Understanding of blockchain technology and its applications in fintech. Experience with real-time data processing and streaming analytics (e.g., Apache Kafka, Apache Flink). Excellent communication skills with a desire to work in multidisciplinary teams. Ability to explain complex technical concepts to non-technical stakeholders.

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Design, develop, and maintain scalable microservices using Java, Kotlin, and Spring Boot. Build and optimize data models and queries in MongoDB. Integrate and manage Apache Kafka for real-time data streaming and messaging. Implement CI/CD pipelines using Jenkins. Required Candidate Profile: 6+ years of experience in backend development with Java/Spring Boot. Experience with Jenkins for CI/CD automation. Familiarity with AWS services (EC2, S3, Lambda, RDS, etc.) or OpenShift for container orchestration.

Posted 1 month ago

Apply

6.0 - 9.0 years

16 - 22 Lacs

Hyderabad, Pune, Chennai

Work from Office

Full Stack Developer - Java / Angular / Spring Boot / Kotlin / Kafka. We are seeking a talented Full Stack Developer experienced in Java, Kotlin, Spring Boot, Angular, and Apache Kafka to join our dynamic engineering team. The ideal candidate will design, develop, and maintain end-to-end web applications and real-time data processing solutions, leveraging modern frameworks and event-driven architectures. Location: Offshore. Timings: until US EST noon hours. Experience: 4-6 years. Key Responsibilities: Design, develop, and maintain scalable web applications using Java, Kotlin, Spring Boot, and Angular. Build and integrate RESTful APIs and microservices to connect frontend and backend components. Develop and maintain real-time data pipelines and event-driven features using Apache Kafka. Collaborate with cross-functional teams (UI/UX, QA, DevOps, Product) to define, design, and deliver new features. Write clean, efficient, and well-documented code following industry best practices and coding standards. Participate in code reviews, provide constructive feedback, and ensure code quality and consistency. Troubleshoot and resolve application issues, bugs, and performance bottlenecks in a timely manner. Optimize applications for maximum speed, scalability, and security. Stay updated with the latest industry trends, tools, and technologies, and proactively suggest improvements. Participate in Agile/Scrum ceremonies and contribute to continuous integration and delivery pipelines. Required Qualifications: Experience with cloud-based technologies and deployment (Azure, GCP). Familiarity with containerization (Docker, Kubernetes) and microservices architecture. Proven experience as a Full Stack Developer with hands-on expertise in Java, Kotlin, Spring Boot, and Angular (Angular 2+). Strong understanding of object-oriented and functional programming principles. Experience designing and implementing RESTful APIs and integrating them with frontend applications.
Proficiency in building event-driven and streaming applications using Apache Kafka. Experience with database systems (SQL/NoSQL) and ORM frameworks (e.g., Hibernate, JPA). Familiarity with version control systems (Git) and CI/CD pipelines. Good understanding of HTML5, CSS3, JavaScript, and TypeScript. Experience with Agile development methodologies and working collaboratively in a team environment. Excellent problem-solving, analytical, and communication skills.
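Several of the postings above describe event-driven services built around a Kafka consume-process loop. As a minimal single-process sketch of that pattern, with a stdlib `queue.Queue` standing in for a Kafka topic (in a real service, a client library such as confluent-kafka or kafka-python would poll a broker instead, and the order events here are hypothetical):

```python
import json
import queue

# Stand-in for a Kafka topic; a real consumer would poll a broker instead.
topic = queue.Queue()

# Produce a few hypothetical order events, JSON-serialized the way a
# producer with a JSON value serializer would emit them.
for order_id, amount in [(1, 250.0), (2, 99.5), (3, 410.0)]:
    topic.put(json.dumps({"order_id": order_id, "amount": amount}))
topic.put(None)  # sentinel in place of a consumer shutdown signal

totals = 0.0
processed = []
while True:
    msg = topic.get()
    if msg is None:  # graceful shutdown
        break
    event = json.loads(msg)       # deserialize the record value
    totals += event["amount"]     # business logic: aggregate order amounts
    processed.append(event["order_id"])

print(processed)  # [1, 2, 3]
print(totals)     # 759.5
```

The production version adds what the in-memory queue cannot model: partitioned topics, consumer groups, offset commits, and retry/dead-letter handling.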

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Pune, Hinjewadi

Work from Office

Job Summary Synechron is seeking an experienced and technically proficient Senior PySpark Data Engineer to join our data engineering team. In this role, you will be responsible for developing, optimizing, and maintaining large-scale data processing solutions using PySpark. Your expertise will support our organization's efforts to leverage big data for actionable insights, enabling data-driven decision-making and strategic initiatives. Software Requirements Required Skills: Proficiency in PySpark Familiarity with Hadoop ecosystem components (e.g., HDFS, Hive, Spark SQL) Experience with Linux/Unix operating systems Data processing tools like Apache Kafka or similar streaming platforms Preferred Skills: Experience with cloud-based big data platforms (e.g., AWS EMR, Azure HDInsight) Knowledge of Python (beyond PySpark), Java, or Scala relevant to big data applications Familiarity with data orchestration tools (e.g., Apache Airflow, Luigi) Overall Responsibilities Design, develop, and optimize scalable data processing pipelines using PySpark. Collaborate with data engineers, data scientists, and business analysts to understand data requirements and deliver solutions. Implement data transformations, aggregations, and extraction processes to support analytics and reporting. Manage large datasets in distributed storage systems, ensuring data integrity, security, and performance. Troubleshoot and resolve performance issues within big data workflows. Document data processes, architectures, and best practices to promote consistency and knowledge sharing. Support data migration and integration efforts across varied platforms. Strategic Objectives: Enable efficient and reliable data processing to meet organizational analytics and reporting needs. Maintain high standards of data security, compliance, and operational durability. Drive continuous improvement in data workflows and infrastructure. 
Performance Outcomes & Expectations: Efficient processing of large-scale data workloads with minimum downtime. Clear, maintainable, and well-documented code. Active participation in team reviews, knowledge transfer, and innovation initiatives. Technical Skills (By Category) Programming Languages: Required: PySpark (essential); Python (needed for scripting and automation) Preferred: Java, Scala Databases/Data Management: Required: Experience with distributed data storage (HDFS, S3, or similar) and data warehousing solutions (Hive, Snowflake) Preferred: Experience with NoSQL databases (Cassandra, HBase) Cloud Technologies: Required: Familiarity with deploying and managing big data solutions on cloud platforms such as AWS (EMR), Azure, or GCP Preferred: Cloud certifications Frameworks and Libraries: Required: Spark SQL, Spark MLlib (basic familiarity) Preferred: Integration with streaming platforms (e.g., Kafka), data validation tools Development Tools and Methodologies: Required: Version control systems (e.g., Git), Agile/Scrum methodologies Preferred: CI/CD pipelines, containerization (Docker, Kubernetes) Security Protocols: Optional: Basic understanding of data security practices and compliance standards relevant to big data management Experience Requirements Minimum of 7+ years of experience in big data environments with hands-on PySpark development. Proven ability to design and implement large-scale data pipelines. Experience working with cloud and on-premises big data architectures. Preference for candidates with domain-specific experience in finance, banking, or related sectors. Candidates with substantial related experience and strong technical skills in big data, even from different domains, are encouraged to apply. Day-to-Day Activities Develop, test, and deploy PySpark data processing jobs to meet project specifications. Collaborate in multi-disciplinary teams during sprint planning, stand-ups, and code reviews. 
Optimize existing data pipelines for performance and scalability. Monitor data workflows, troubleshoot issues, and implement fixes. Engage with stakeholders to gather new data requirements, ensuring solutions are aligned with business needs. Contribute to documentation, standards, and best practices for data engineering processes. Support the onboarding of new data sources, including integration and validation. Decision-Making Authority & Responsibilities: Identify performance bottlenecks and propose effective solutions. Decide on appropriate data processing approaches based on project requirements. Escalate issues that impact project timelines or data integrity. Qualifications Bachelors degree in Computer Science, Information Technology, or related field. Equivalent experience considered. Relevant certifications are preferred: Cloudera, Databricks, AWS Certified Data Analytics, or similar. Commitment to ongoing professional development in data engineering and big data technologies. Demonstrated ability to adapt to evolving data tools and frameworks. Professional Competencies Strong analytical and problem-solving skills, with the ability to model complex data workflows. Excellent communication skills to articulate technical solutions to non-technical stakeholders. Effective teamwork and collaboration in a multidisciplinary environment. Adaptability to new technologies and emerging trends in big data. Ability to prioritize tasks effectively and manage time in fast-paced projects. Innovation mindset, actively seeking ways to improve data infrastructure and processes.
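The transformation-and-aggregation work this posting describes reduces to per-key aggregation over records, which PySpark expresses as `df.groupBy(...).agg(...)` and distributes across a cluster. A pure-Python sketch of the same per-key roll-up on a few hypothetical rows (stdlib only, so it runs without a Spark session; the `region`/`sales` columns are illustrative):

```python
from collections import defaultdict

# Hypothetical input rows, as a PySpark DataFrame might hold them
# after reading and filtering a source dataset.
rows = [
    {"region": "south", "sales": 120.0},
    {"region": "north", "sales": 300.0},
    {"region": "south", "sales": 80.0},
    {"region": "north", "sales": 150.0},
]

# Single-machine equivalent of df.groupBy("region").agg(F.sum("sales")):
totals = defaultdict(float)
for row in rows:
    totals[row["region"]] += row["sales"]

result = dict(sorted(totals.items()))
print(result)  # {'north': 450.0, 'south': 200.0}
```

The point of PySpark is that this same logical operation runs as a partitioned shuffle-and-reduce over data too large for one machine, which is where the partitioning and performance-tuning duties in the posting come in.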

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies