Home
Jobs

7132 Kafka Jobs - Page 27

JobPe aggregates results for easy access, but you apply directly on the original job portal.

12.0 - 20.0 years

35 - 40 Lacs

Navi Mumbai

Work from Office

Naukri logo

Position Overview:
We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality, scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- Leadership role in shaping and mentoring the next generation of engineers.
- Supportive and collaborative team culture.
- Flexible working environment.
- Competitive compensation and professional growth opportunities.

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Summary:
We are seeking an experienced and innovative Data Scientist to join our team. The ideal candidate will leverage data-driven insights to solve complex problems, optimize business processes, and contribute to strategic decision-making. This role requires expertise in statistical analysis, machine learning, and data visualization to extract valuable insights from large datasets.

Key Responsibilities:
- Collect, clean, and preprocess structured and unstructured data from various sources.
- Apply statistical methods and machine learning algorithms to analyze data and identify patterns.
- Develop predictive and prescriptive models to support business goals.
- Collaborate with stakeholders to define data-driven solutions for business challenges.
- Visualize data insights using tools like Power BI, Tableau, or Matplotlib.
- Perform A/B testing and evaluate model accuracy using appropriate metrics.
- Optimize machine learning models for scalability and performance.
- Document processes and communicate findings to non-technical stakeholders.
- Stay updated with advancements in data science techniques and tools.

Required Skills and Qualifications:
- Proficiency in programming languages like Python, R, or Scala.
- Strong knowledge of machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Experience with SQL and NoSQL databases for data querying and manipulation.
- Understanding of big data technologies like Hadoop, Spark, or Kafka.
- Ability to perform statistical analysis and interpret results.
- Experience with data visualization libraries like Seaborn, Plotly, or D3.js.
- Excellent problem-solving and analytical skills.
- Strong communication skills to present findings to technical and non-technical audiences.

Preferred Qualifications:
- Master's or PhD in Data Science, Statistics, Computer Science, or a related field.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) for data processing and model deployment.
- Knowledge of NLP (Natural Language Processing) and computer vision.
- Familiarity with DevOps practices and containerization tools like Docker and Kubernetes.
- Exposure to time-series analysis and forecasting techniques.
- Certification in data science or machine learning tools is a plus.

About Us:
Bristlecone is the leading provider of AI-powered application transformation services for the connected supply chain. We empower our customers with speed, visibility, automation, and resiliency – to thrive on change. Our transformative solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement and Digitalization are positioned around key industry pillars and delivered through a comprehensive portfolio of services spanning digital strategy, design and build, and implementation across a range of technology platforms. Bristlecone is ranked among the top ten leaders in supply chain services by Gartner. We are headquartered in San Jose, California, with locations across North America, Europe and Asia, and over 2,500 consultants. Bristlecone is part of the $19.4 billion Mahindra Group.

Equal Opportunity Employer:
Bristlecone is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

Information Security Responsibilities:
- Understand and adhere to Information Security policies, guidelines and procedures, and practice them for the protection of organizational data and information systems.
- Take part in information security training and act accordingly while handling information.
- Report all suspected security and policy breaches to the InfoSec team or appropriate authority (CISO).
- Understand and adhere to the additional information security responsibilities that are part of the assigned job role.
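The "evaluate model accuracy using appropriate metrics" responsibility above comes down to a few standard formulas. As an illustrative sketch (not part of the posting), here is a minimal Python version of the precision/recall/F1 computation such a role involves:

```python
def confusion_counts(y_true, y_pred, positive=1):
    """Count TP, FP, FN, TN for a binary classification run."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = len(y_true) - tp - fp - fn
    return tp, fp, fn, tn

def precision_recall_f1(y_true, y_pred):
    """Derive precision, recall and F1 from the confusion counts."""
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

For example, with `y_true = [1, 1, 0, 0, 1]` and `y_pred = [1, 0, 0, 1, 1]` there are 2 true positives, 1 false positive and 1 false negative, so precision and recall are both 2/3. In practice a library such as Scikit-learn provides these metrics directly; the sketch only shows what they compute.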

Posted 3 days ago

Apply

10.0 - 20.0 years

35 - 40 Lacs

Navi Mumbai

Work from Office

Naukri logo

Position Overview:
We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
- Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
- Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
- Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
- Review code and provide feedback to junior engineers to maintain high-quality, scalable solutions.
- Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
- Lead by example in object-oriented development, particularly using Scala and Java.
- Translate complex requirements into clear, actionable technical tasks for the team.
- Contribute to the development of ETL processes for integrating data from various sources.
- Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
- 8+ years of professional experience in Big Data development and engineering.
- Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
- Solid object-oriented development experience with Scala and Java.
- Strong SQL skills with experience working with large data sets.
- Practical experience designing, installing, configuring, and supporting Big Data clusters.
- Deep understanding of ETL processes and data integration strategies.
- Proven experience mentoring or supporting junior engineers in a team setting.
- Strong problem-solving, troubleshooting, and analytical skills.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
- Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
- Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
- Opportunity to work on challenging, high-impact Big Data projects.
- Leadership role in shaping and mentoring the next generation of engineers.
- Supportive and collaborative team culture.
- Flexible working environment.
- Competitive compensation and professional growth opportunities.

Posted 3 days ago

Apply

5.0 - 10.0 years

13 - 22 Lacs

Hyderabad

Work from Office

Naukri logo

In-Person Interview Announcement - Hyderabad
Date: 21st June 2025

Cognizant is conducting in-person interviews in Hyderabad for experienced professionals across the following skill combinations. We are looking for talented individuals with experience levels ranging from 5-8 years and 9-12 years.

1. Java MSB (Microservices Backend)
Typical Requirements:
- Strong experience in Java, Spring Boot, and Microservices architecture.
- Proficiency in RESTful APIs, SOAP, and web services.
- Familiarity with Java application servers like Tomcat, JBoss, or Jetty.
- Experience with relational databases (Oracle, MySQL).
- Exposure to CI/CD pipelines and unit testing frameworks (JUnit, TestNG).
- Cloud experience (AWS or Azure) is often preferred.

2. Java MSB + React
- All Java MSB skills as above.
- Strong hands-on experience with React.js, JavaScript, HTML5, and CSS3.
- Ability to build responsive UIs and integrate them with backend services.
- Familiarity with state management libraries like Redux is a plus.
- Understanding of RESTful integration between frontend and backend.

3. Java MSB + Kafka (Exposure)
- All Java MSB skills as above.
- Basic to intermediate exposure to Apache Kafka.
- Understanding of event-driven architecture and message brokers.
- Ability to integrate Kafka producers/consumers in microservices.
- Familiarity with Kafka Streams or Kafka Connect is a bonus.

4. Java MSB + Angular
- All Java MSB skills as above.
- Proficiency in Angular (v8+), TypeScript, and RxJS.
- Experience in building modular and scalable frontend applications.
- Integration of Angular apps with RESTful APIs.
- Knowledge of component-based architecture and Angular CLI.
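The Kafka "exposure" asked for in skill combination 3 amounts to familiarity with the producer/consumer pattern behind event-driven architecture. Real Kafka clients need a running broker, so as a broker-free sketch of just the pattern (the class and topic names below are illustrative, not a real Kafka API):

```python
from collections import defaultdict, deque

class InMemoryBroker:
    """Toy stand-in for a message broker: each topic is an append-only FIFO queue,
    loosely analogous to a single Kafka partition."""

    def __init__(self):
        self.topics = defaultdict(deque)

    def produce(self, topic, message):
        # A producer appends an event to the topic's log.
        self.topics[topic].append(message)

    def consume(self, topic):
        # A consumer reads events in publication order; None when nothing is pending.
        return self.topics[topic].popleft() if self.topics[topic] else None

broker = InMemoryBroker()
broker.produce("orders", {"id": 42, "status": "CREATED"})
event = broker.consume("orders")  # the consumer sees the event the producer emitted
```

The point of the pattern is that producers and consumers never call each other directly; they only agree on a topic name and a message shape, which is what decouples services in an event-driven microservice architecture.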

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Lead Software Engineer - Full Stack Developer
As a Lead Software Engineer in the Loyalty Rewards and Segments organization, you will be responsible for designing, developing, testing, and delivering software frameworks in the areas of event-driven architecture and zero trust for use in large-scale distributed systems. Loyalty Rewards and Segments is an organization within Mastercard that provides an end-to-end loyalty management solution for banks, merchants and fintechs. The ideal candidate for this role will have a strong background in software design, development, and testing, with a passion for technology and software development. They will be highly motivated, intellectually curious, and analytical, with a desire to continuously learn and improve. As a member of the Loyalty Rewards and Segments team, you will have the opportunity to work on cutting-edge technologies and collaborate with cross-functional teams to deliver software frameworks that meet the needs of Mastercard's customers.

Key Responsibilities:
- Lead the technical direction, architecture, design, and engineering practices.
- Prototype and prove concepts for new technologies, application frameworks, and design patterns to improve software development practices.
- Design and develop software frameworks using industry-standard best practices and methodologies.
- Write efficient and maintainable code that meets feature specifications.
- Debug and troubleshoot code to resolve issues and improve performance.
- Validate software functionality, including performance, reliability, and security.
- Collaborate with cross-functional teams to architect and deliver new services.
- Participate in code reviews to ensure code quality and consistency.
- Document software design, development, and testing processes.
- Balance trade-offs between competing interests with judgment and experience.
- Identify synergies and reuse opportunities across teams and programs.

Key Expectations:
- Focus on individual and team objectives as an active participant in the Agile/Scrum development process, completing assignments on time, with the necessary quality, and in accordance with the project timeline.
- Continuously learn and keep up to date with the latest software development technologies and methodologies.
- Communicate effectively and professionally with team members and stakeholders.
- Proactively identify opportunities for process improvements and efficiency gains.
- Demonstrate a commitment to quality, best practices, and continuous improvement.

All About You:
- You have an exceptional foundation in Computer Science fundamentals, web applications & services, and microservices-based software architecture.
- You have demonstrated experience architecting solutions based on platform-as-a-service (PaaS) and containers, including PCF, Kubernetes, and cloud-native technologies.
- You have architected and designed high-transaction-volume financial (banking, payment) systems that operate at global scale with extreme uptime requirements.
- You have experience with web technologies including HTML5, CSS, JavaScript, and front-end frameworks such as Angular.
- You have extensive experience designing and building global-scale back-end microservices using Java, Spring, Spring Boot, Pivotal Cloud Foundry, Kafka, and RabbitMQ.
- You have a deep understanding of storage technologies such as PostgreSQL or SQL Server, and how to leverage them effectively at massive scale.
- You have deep experience with cloud-native technologies and best practices, including Azure and AWS.
- You have experience with automated testing and successfully releasing software in a continuous delivery model using Git.
- You enjoy working in an Agile environment focused on continuous improvement.
- You have a strong desire to collaborate with and mentor technology teams.
- You enjoy working with product leaders to inform and support options for delivering highly capable solutions that meet market demands.
- You want to be hands-on, building prototypes to solve complex business problems.
- You have excellent communication skills with both technical and non-technical people.
- You are a relentless self-starter who works quickly and efficiently to support product and technical objectives.
- You advocate for what’s technically important and for doing the right thing.

Corporate Security Responsibility:
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-250513

Posted 3 days ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Naukri logo

5+ years as a Data Engineer, with hands-on experience in:
1. Spark
2. Kafka
3. Java
4. AWS

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Software Engineer I

Who is Mastercard?
Mastercard is a global technology company in the payments sector. We power payments and provide products and services for individuals and industries all around the world. Our people, technology, data and brand provide the capabilities that drive our success. We believe in connecting people to priceless possibilities. As a company, we know that our success is driven by the skills, experience, integrity and mindset of the talent we hire. By building an inclusive, world-class culture, our employees have once-in-a-career opportunities to be a part of teams that have a greater impact on our community and our world. We invite you to join our team to find out how you too can start something priceless.

Overview:
Mastercard’s Corporate Solutions team is a rapidly growing organization responsible for delivering innovative solutions that help our customers grow and expand their business. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. We are seeking a Software Development Engineer to lead a cross-functional development team within the Corporate Solutions organization.

Role:
- Be hands-on building performant, scalable and secure web applications.
- Provide leadership, guidance and direction on systems and web application architecture and system design.
- Guide and coach less experienced engineers.
- Focus on user-centered design.
- Spearhead design, technical and code reviews.
- Implement coding standards and approaches.
- Promote an Agile approach to software development.
- Insist on a culture of continuous integration and delivery, automation and minimizing blast radius.
- Collaborate with teams across the organization to mitigate risk and resolve dependencies.

All About You:
- Successfully designed and developed externally facing web applications utilizing modern single-page application frameworks; Angular and React JS preferred.
- Good understanding of JavaScript and its benefits and quirks.
- Proven experience building ADA- and WCAG-compliant web applications.
- Successfully designed and developed cloud-ready REST APIs utilizing industry best practices.
- Good understanding of cloud architecture and its benefits and quirks.
- Good understanding of messaging frameworks and event-based architecture and their benefits and quirks.
- Understanding of multi-tier web development, including knowledge of server-side technologies and databases.
- Understanding of web application optimizations such as progressive web apps, web workers, browser repaints and reflows, performance and memory optimizations, debugging memory leaks, caching, and flame graphs.
- Experience and knowledge of authentication and authorization workflows using concepts like JWT.
- Proficiency in quality engineering and experience working with quality leads to define processes and technologies.
- Passion for improving code quality using approaches such as unit and end-to-end testing.
- Deep knowledge of Continuous Integration and Delivery and the toolsets that provide this ability.
- Understanding of agile principles and methodologies and experience implementing and adapting them to fit the team’s needs.
- Use and understand Git-based source control systems.

Tech Stack: Java 11+, Spring Boot, Apache Kafka, SQL and NoSQL databases, REST APIs, Angular 11+

Corporate Security Responsibility:
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-248580

Posted 3 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

What you’ll do:
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing their sources and their impact on network or service operations and quality.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Independently manage project priorities, deadlines, and deliverables.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need:
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart:
- Knowledge of or experience with Apache Beam for stream and batch data processing.
- Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark.
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Exposure to data visualization tools or platforms.

Posted 3 days ago

Apply

5.0 - 10.0 years

13 - 23 Lacs

Hyderabad

Work from Office

Naukri logo

Design, develop, and maintain high-performance applications using Java, Kafka, SQL, MongoDB, and Spring Boot. Apply DevOps practices and tools: Docker, Kubernetes, and CI/CD pipelines.

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Gurugram

Hybrid

Naukri logo

IntraEdge is looking for Big Data engineers/developers to work on collecting, storing, processing, and analyzing huge sets of data, and on integrating those systems with the architecture used across the company.

Responsibilities:
- Select and integrate any Big Data tools and frameworks required to provide requested capabilities.
- Partner with architects and other senior leads to address data needs.
- Partner with Data Scientists and product teams to build and deploy machine learning models that unlock growth.
- Build custom integrations and data pipelines between cloud-based systems using APIs.
- Write complex and efficient code to transform raw data sources into easily accessible models, using languages such as Python, Scala or SQL.
- Design, develop and test large-scale, custom-distributed software systems using the latest Java, Scala and Big Data technologies.
- Actively contribute to defining the technology strategy (design, architecture and interfaces) in order to respond effectively to our clients' business needs.
- Monitor emerging technologies and help define standards to ensure that our systems and data warehouses are efficient, resilient and durable.
- Use Informatica or similar products, with an understanding of heterogeneous data replication techniques.
- Build data expertise and own data quality for the pipelines you create.

Skills and Qualifications:
- Bachelor's/Master's degree in Computer Science, Management of Information Systems or equivalent.
- 4 or more years of relevant software engineering experience (Big Data: Hive, Spark, Kafka, Cassandra, Scala, Python, SQL) in a data-focused role.
- Experience in GCP.
- Experience building batch/streaming ETL pipelines with frameworks like Spark, Spark Streaming and Apache Beam, and working with messaging systems like Pub/Sub and Kafka.
- Working experience with Java tools or Apache Camel.
- Experience designing and building highly scalable and reliable data pipelines using Big Data tools (Airflow, Python, Redshift/Snowflake).
- Software development experience with proficiency in Python, Java, Scala or another language.
- Good knowledge of Big Data querying tools such as Hive, and experience with Spark/PySpark.
- Good knowledge of SQL and Python.
- Ability to analyse and obtain insights from complex/large data sets.
- Ability to design and develop high-performing SQL Server database objects.

Experience: 5-10 years
Notice period: Serving notice period / immediate joiners / max 30 days
Location: Gurugram/Bangalore/Pune/Remote
Salary: Decent hike on current CTC
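The "transform raw data sources into easily accessible models" responsibility above is the T in ETL. As an illustrative Python sketch only (the field names are invented for the example, not taken from the posting), a batch transform step typically parses, filters and normalises records like this:

```python
def transform(raw_rows):
    """Minimal batch ETL transform: drop malformed rows, normalise the rest."""
    cleaned = []
    for row in raw_rows:
        try:
            amount = float(row["amount"])  # reject rows with a missing or bad amount
        except (KeyError, ValueError):
            continue
        cleaned.append({
            "user": row.get("user", "unknown").strip().lower(),  # normalise casing/whitespace
            "amount": round(amount, 2),
        })
    return cleaned
```

The same shape scales up directly: in Spark the loop becomes a `map`/`filter` over an RDD or DataFrame, and in a streaming pipeline the function is applied per message rather than per batch.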

Posted 3 days ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Bengaluru

Hybrid

Naukri logo

- Bachelor's degree in computer science, computer science engineering, or related experience required; advanced degree preferred.
- Minimum 3 years of relevant experience in Golang development.
- Writing scalable, robust, testable, efficient, and easily maintainable code.
- Translating software requirements into stable, working, high-performance software.
- Strong knowledge of the Go programming language, its paradigms, constructs, and idioms.
- Hands-on experience with React and PostgreSQL.
- Ability to write clean and effective Godoc comments.
- Familiarity with code versioning tools.
- Excellent communication and analytical skills.

Experience: 3-10 years
Notice period: Serving notice period / immediate joiners / max 30 days
Location: Bangalore
Salary: Decent hike on current CTC

Posted 3 days ago

Apply

5.0 - 10.0 years

0 - 2 Lacs

Chennai, Coimbatore, Bengaluru

Work from Office

Naukri logo

About the Role:
We are looking for Senior .NET Engineers who can go beyond coding: professionals who can architect solutions, evaluate trade-offs, and collaborate cross-functionally to deliver scalable systems. You will take ownership of backend development while playing an active role in system design and continuous improvement initiatives. This role demands strong communication, decision-making ability, and a solution-oriented mindset.

Key Responsibilities:
- Design and build scalable, high-performance backend services using .NET / .NET Core
- Drive technical design discussions, evaluate options, and recommend the best approaches aligned with business goals
- Collaborate with architects, product managers, and other engineers to define clean API contracts and system boundaries
- Ensure code quality, performance, and maintainability through reviews, mentoring, and hands-on contributions
- Implement asynchronous messaging patterns using Kafka (preferred)
- Work with NoSQL databases like DynamoDB (preferred) and relational databases as needed
- Lead by example in Agile/Scrum ceremonies and champion engineering excellence across the team
- Continuously assess and improve processes, performance, and scalability

Required Skills and Experience:
- 5+ years of backend software engineering experience with .NET / .NET Core
- Strong expertise in C#, object-oriented design, and API development
- Solid understanding of software architecture, design principles, and distributed-system concepts
- Experience integrating with messaging systems (Kafka is a plus)
- Familiarity with NoSQL (DynamoDB preferred) and SQL databases
- Comfortable working in CI/CD environments and with version control systems like Git
- Proven ability to communicate technical ideas clearly and effectively to both technical and non-technical stakeholders

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Position Overview
Job Title: Associate Engineer
Corporate Title: Associate
Location: Pune, India

Role Description:
The Associate Engineer is responsible for performing development work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. The role may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What We’ll Offer You:
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities:
This role is for an engineer responsible for the design, development and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable and high-performing software applications are delivered to users in an Agile development environment. You should come from a strong technological background, have experience working in Google Cloud Platform, and be hands-on and able to work independently with minimal technical/tool guidance.

Your Skills and Experience:
- Java solution design and development experience
- Java Spring Boot development experience
- Practical and applied knowledge of design patterns (and anti-patterns) in Java in general and Java Spring Boot specifically
- Hands-on experience working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST and JSON
- Hands-on experience in Google Cloud Platform
- Experience with cloud development platforms: Spring Cloud; OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g. Git, TeamCity, Maven, SONAR
- Experience with software design patterns and UML design
- Experience in integration design patterns with Kafka
- Experience with an Agile/Scrum environment; familiarity with Agile team management tools (JIRA, Confluence)
- Understanding and promotion of Agile values: FROCC (Focus, Respect, Openness, Commitment, Courage)
- Good communication skills; pro-active team player comfortable working in multi-disciplinary, self-organized teams
- Professional knowledge of English
- Differentiators: knowledge/experience about

How We’ll Support You:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 days ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Hyderabad

Work from Office


About Us

We are a fast-growing and highly exciting start-up revolutionising group communications and mass broadcasting. Our mobile app allows people to build their own mobile apps for free in just a few hours by white-labeling our app, and to create groups of unlimited people with unlimited sub-groups. We are about to launch our product, and several colleges are already using the beta version. The company is currently small, run by IIT-IIM graduates, and has an excellent working atmosphere. We are constantly looking for people who want to challenge our existing thinking, experiment with different thinking, contribute new ideas to the product, and in general work in a start-up environment. Apply to us to be a part of a disruptive company, feel like an entrepreneur, and have great fun while at it!

Job Description

We are looking for people with good skills in Java programming (you will be tested in Java), and preferably with a B.E./B.Tech. in Computer Science Engineering. You will be trained to be a full-stack mobile and web application developer (we may deploy you in testing as well). You will be working on programming and testing on the following stack:
1. Angular, HTML, CSS, JavaScript/TypeScript
2. Java, PHP (Laravel), MySQL, Redis, ElasticSearch, Azure Table Storage
3. Swift, Selenium
4. Cloud infrastructure (IaaS, PaaS) - AWS, DigitalOcean, Azure

You will learn to write code that not only works, but works very fast and uses very little hardware resource to execute. You need extremely good teachers and challenging assignments to write such code, and you will find both in our company. These are also some of the hottest technologies around, and you will be an extremely sought-after software engineer if you work for 2-3 years in these areas in the kind of environment, and with the kind of challenging assignments, that we provide. You will be working Mon-Sat (6 days a week), 8-9 hours a day.

This is not a work-from-home job, and you will need to attend office on all working days.

Desired Candidate Profile

We are looking for people with good skills in Java programming (you will be tested in Java), and preferably with a B.E./B.Tech. in Computer Science Engineering. We are looking for people with an excellent academic record and good ranks in JEE Mains and EAMCET. Please make sure to mention all your percentages and ranks when you apply. You will be taking some quantitative aptitude tests as part of the interviewing process.

Perks and Benefits

You will be on probation for 4 months, during which you will get a stipend of Rs. 10,000 per month. After that (if selected), you will be paid Rs. 20-25,000 per month depending on your performance during the probation period. In addition, you will be asked to complete certain courses/certifications, and if you complete all of them, you can get up to 100% more as salary in as soon as 6-18 months. We will also offer stock/ESOPs in the company if you work for a certain minimum period of time. If you think you deserve a higher salary, please let us know why in the questionnaire - we are open. We are also a small company with a family-like atmosphere and very friendly people, and you will feel quite respected here. So go ahead and apply to us - we look forward to speaking with you!

Posted 3 days ago

Apply

8.0 - 12.0 years

12 - 16 Lacs

Pune

Work from Office


Roles & Responsibilities: Design and develop end-to-end data pipelines using PySpark, Python, SQL and Kafka, leveraging Microsoft Fabric's capabilities.

Requirements:
- Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
- Strong expertise in PySpark and Python for large-scale data processing and transformation.
- Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
- Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
- Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
- Healthcare domain knowledge is a plus but not mandatory.

Posted 3 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description:
- Bachelor's degree in Computer Science, Information Systems or a related field.
- 2+ years of experience in Java, RESTful APIs, Spring, Spring MVC, Spring Kafka, microservices, and database technologies.
- 1 year of experience building Java-based APIs.
- 1 year of experience with an API documentation tool; Swagger preferred.
- 1 year of experience in API monitoring and dashboards using ELK and Dynatrace.
- 1 year of experience in unit and functional testing using JUnit, Mockito/JMock, Selenium, Cucumber.
- 1 year of experience in event-driven microservice architecture using Kafka.
- 1 year of experience with testing tools/methodologies.
- 2+ years of experience with advanced Git skills and the corresponding branching strategies.
- Relational database knowledge including SQL, Oracle, MS SQL, PostgreSQL.
- Understanding of JSON, XML, SoapUI, or Postman (API testing tools).
- Analyzing requirements in user stories and developing software from acceptance criteria.
- Experience working with an Agile/Scrum/Kanban development team and software such as Itrack (Jira) & ADO is preferred.
- Work with leads, engineers, architects, product managers, and business stakeholders to identify technical and functional needs of systems based on priority.
- Writing great-quality code with a relentless passion for automated testing and validation.
- Excellent communication skills and experience in collaborative environments.

Weekly Hours: 40
Time Type: Regular
Location: Hyderabad, Andhra Pradesh, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 3 days ago

Apply

10.0 - 15.0 years

12 - 18 Lacs

Pune

Work from Office


Responsibilities:
- Design and deliver corporate training programs using Python
- Ensure proficiency in Python, PySpark, data structures, NumPy, Pandas, AWS, Azure, GCP Cloud, data visualization, and Big Data tools
- Experience in core Python skills

Benefits: Food allowance, travel allowance, house rent allowance

Posted 3 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Kochi

Work from Office


Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning and estimation. Contribute to reusable component/asset/accelerator development to support capability development. Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and act as a design authority.

Required education: Bachelor's Degree
Preferred education: Non-Degree Program

Required technical and professional expertise:
- Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred technical and professional experience:
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts, and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.

Posted 3 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune

Work from Office


The Developer leads cloud application development/deployment. The Developer's responsibility is to lead the execution of a project by working with assigned resources on development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes; experience with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
- Familiarity with Ant, Maven or other build automation frameworks; good knowledge of basic UNIX commands

Preferred technical and professional experience: Experience in concurrent design and multi-threading

Primary skills:
- Core Java, Spring Boot, Java2/EE, microservices
- Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.)
- Spark

Good to have: Python

Posted 3 days ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid


About the Role: Launched in 2017, Oracle Banking Payments continues to evolve with an ambitious roadmap covering both functional enhancements and modern technology stacks. This is a unique opportunity to join a high-impact development team working on a globally recognized, mission-critical banking product.

Role: Java Development Lead - Oracle Banking Payments

Responsibilities: As a Senior Software Architect, you will:
- Translate business requirements into scalable, maintainable technical designs and code.
- Develop and maintain components using Java, Spring, and microservices frameworks.
- Diagnose and resolve technical issues across environments.
- Lead initiatives to identify and fix application security vulnerabilities.
- Deliver high-quality code with minimal production issues.
- Guide and mentor junior developers, fostering a culture of technical excellence.
- Navigate ambiguity and drive clarity in fast-paced Agile environments.
- Communicate clearly and proactively with cross-functional teams.

Mandatory Skills:
- Expertise in Java, Java microservices, Spring Framework, EclipseLink, JMS, JSON/XML, RESTful APIs.
- Experience developing cloud-native applications.
- Familiarity with Docker, Kubernetes, or similar containerization tools.
- Practical knowledge of at least one major cloud platform (AWS, Azure, Google Cloud).
- Understanding of monitoring tools (e.g., Prometheus, Grafana).
- Experience with Kafka or other message brokers in event-driven architectures.
- Proficiency in CI/CD pipelines using Jenkins, GitLab CI, etc.
- Strong SQL skills with Oracle databases.
- Hands-on debugging and performance tuning experience.

Nice to Have:
- Experience with Oracle Cloud Infrastructure (OCI).
- Domain knowledge of the payments industry and processing flows.

What We're Looking For: The ideal candidate is:
- A passionate coder with a deep understanding of Java and modern application design.
- Curious, resourceful, and persistent in solving problems using various approaches, from research and experimentation to creative thinking.
- A proactive mentor and team contributor with a strong sense of accountability.
- Adaptable to evolving technology landscapes and fast-paced environments.

Self-Test Questions - Ask Yourself Before Applying:
1. Have I built or maintained enterprise-grade applications using Java and Spring microservices?
2. Can I explain how I've implemented cloud-native solutions using AWS, Azure, or Google Cloud?
3. Have I worked in Agile teams for at least three years, contributing actively in sprints?
4. Am I comfortable troubleshooting production issues using tools like Prometheus, Grafana, and log aggregators?
5. Have I designed or debugged RESTful APIs and worked with JSON/XML extensively?
6. Do I have experience integrating applications with message brokers such as Kafka?
7. Have I mentored junior developers or acted as a tech lead on projects?
8. Do I genuinely enjoy solving complex problems and exploring multiple approaches to arrive at the best solution?

Posted 3 days ago

Apply

5.0 - 7.0 years

8 - 10 Lacs

Mumbai

Work from Office


As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, and provide regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: detection and prevention tools for company products and platform and customer-facing

Posted 3 days ago

Apply

5.0 - 7.0 years

8 - 10 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, and provide regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to maximize value and build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: detection and prevention tools for company products and platform and customer-facing

Posted 3 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office


System analysis, design, development and implementation of enterprise e-business solutions and n-tier architectures. Customer interactions, requirement gathering, project execution/delivery, process fitment to Maximo, and business process re-engineering.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- MAS experience: Maximo application configuration, JMS/Kafka integration setup, BIRT reports, Maximo Mobile
- Generating custom reports using Actuate & BIRT
- Design and development of external system integrations using MEA

Preferred technical and professional experience:
- Installation of Maximo applications, WebSphere configurations
- Upgrading Maximo from legacy versions to the latest
- Customizations and configurations in Maximo

Posted 3 days ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Bengaluru

Work from Office


Develop, test, and maintain applications using Java and Spring Boot. Design and implement microservices architecture. Work with databases to ensure data integrity and performance. Collaborate with cross-functional teams on definition and design.

Required Candidate Profile: Proficiency in Java programming. Experience with the Spring Boot framework. Knowledge of microservices architecture. Familiarity with databases (SQL/NoSQL). Basic understanding of Kafka and S3.

Posted 3 days ago

Apply

4.0 - 7.0 years

5 - 15 Lacs

Noida

Work from Office


Mandatory Skills: Python, Django/Flask, databases (PostgreSQL/MySQL/MongoDB), database management, REST API development, multi-threading, CLI systems (e.g., AMOS, CORBA), Kafka.

Project Overview:
1. Strong understanding of the Python programming language, its syntax, and libraries.
2. Experience with web frameworks such as Flask and FastAPI.
3. Experience with relational databases, PostgreSQL in particular.
4. Python (Django or Flask), REST API development, database management (PostgreSQL/MySQL), multi-threading, and familiarity with CLI systems (e.g., AMOS, CORBA).
5. Experience in mobile app development.
6. Developing back-end components.
7. Integrating user-facing elements using server-side logic.

Interested candidates can share their resume at neha.sharma@innovationm.com

Posted 3 days ago

Apply

Exploring Kafka Jobs in India

Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
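Before diving into the job market, it helps to recall what Kafka actually is: an append-only, partitioned commit log that producers write to and consumers read from at their own pace. The toy model below is a plain-Python sketch of that idea only; the class and method names are illustrative inventions, not the real Kafka client API, and the real default partitioner uses a murmur2 hash rather than Python's `hash`.

```python
class ToyTopic:
    """A toy in-memory model of a Kafka topic: a fixed set of
    append-only partitions (illustration only, not real Kafka)."""

    def __init__(self, num_partitions=3):
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Messages with the same key land in the same partition,
        # which is how Kafka preserves per-key ordering.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append(value)
        return p

    def consume(self, partition, offset):
        # Reading does not delete messages: consumers track their own
        # offset, so different consumers can replay the same data.
        return self.partitions[partition][offset:]

topic = ToyTopic()
p = topic.produce("order-42", "created")
topic.produce("order-42", "paid")
print(topic.consume(p, 0))  # both events, in produce order
```

Because reads are non-destructive and keyed messages share a partition, the same topic can feed a real-time dashboard and a batch job simultaneously - the property most of the roles below are hiring for.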

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Gurgaon

These cities are known for their thriving tech industries and have a high demand for Kafka professionals.

Average Salary Range

The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.

Career Path

Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.

Related Skills

In addition to Kafka expertise, employers often look for professionals with skills in:

  • Apache Spark
  • Apache Flink
  • Hadoop
  • Java/Scala programming
  • Data engineering and data architecture

Interview Questions

  • What is Apache Kafka and how does it differ from other messaging systems? (basic)
  • Explain the role of Zookeeper in Apache Kafka. (medium)
  • How does Kafka guarantee fault tolerance? (medium)
  • What are the key components of a Kafka cluster? (basic)
  • Describe the process of message publishing and consuming in Kafka. (medium)
  • How can you achieve exactly-once message processing in Kafka? (advanced)
  • What is the role of Kafka Connect in Kafka ecosystem? (medium)
  • Explain the concept of partitions in Kafka. (basic)
  • How does Kafka handle consumer offsets? (medium)
  • What is the role of a Kafka Producer API? (basic)
  • How does Kafka ensure high availability and durability of data? (medium)
  • Explain the concept of consumer groups in Kafka. (basic)
  • How can you monitor Kafka performance and throughput? (medium)
  • What is the purpose of Kafka Streams API? (medium)
  • Describe the use cases where Kafka is not a suitable solution. (advanced)
  • How does Kafka handle data retention and cleanup policies? (medium)
  • Explain the Kafka message delivery semantics. (medium)
  • What are the different security features available in Kafka? (medium)
  • How can you optimize Kafka for high throughput and low latency? (advanced)
  • Describe the role of a Kafka Broker in a Kafka cluster. (basic)
  • How does Kafka handle data replication across brokers? (medium)
  • Explain the significance of serialization and deserialization in Kafka. (basic)
  • What are the common challenges faced while working with Kafka? (medium)
  • How can you scale Kafka to handle increased data loads? (advanced)
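Several of the questions above (partitions, consumer offsets, delivery semantics) come down to the same mechanics. The sketch below simulates key-based partition assignment and consumer-group offset commits in plain Python. It is a simplified illustration, not real client code: Kafka's default partitioner uses a murmur2 hash rather than CRC32, and the class names here are invented for the example.

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    # Same key -> same hash -> same partition, preserving per-key order.
    # (CRC32 stands in for Kafka's actual murmur2 hash.)
    return zlib.crc32(key) % num_partitions

class ConsumerGroup:
    """Toy model of per-partition committed offsets, the state Kafka
    keeps in its internal __consumer_offsets topic."""

    def __init__(self):
        self.committed = {}  # partition -> next offset to read

    def poll(self, log, partition, max_records=10):
        start = self.committed.get(partition, 0)
        return log[partition][start:start + max_records]

    def commit(self, partition, offset):
        self.committed[partition] = offset

# At-least-once delivery: if a consumer crashes after processing but
# before committing, the same records are re-delivered on the next poll.
log = {0: ["a", "b", "c"]}
group = ConsumerGroup()
batch = group.poll(log, 0)
assert group.poll(log, 0) == batch  # no commit yet -> same batch again
group.commit(0, len(batch))         # commit only after processing
assert group.poll(log, 0) == []     # nothing new to read
```

Committing before processing flips the trade-off to at-most-once (a crash loses the batch instead of repeating it); exactly-once requires the transactional producer and idempotent processing on top of these basics - a useful distinction to articulate in interviews.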

Closing Remark

As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!
