
6920 Kafka Jobs - Page 19

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

0 - 2 Lacs

Chennai, Coimbatore, Bengaluru

Work from Office

Naukri logo

About the Role: We are looking for Senior .NET Engineers who can go beyond coding: professionals who can architect solutions, evaluate trade-offs, and collaborate cross-functionally to deliver scalable systems. You will take ownership of backend development while playing an active role in system design and continuous improvement initiatives. This role demands strong communication, decision-making ability, and a solution-oriented mindset.

Key Responsibilities:
• Design and build scalable, high-performance backend services using .NET / .NET Core
• Drive technical design discussions, evaluate options, and recommend approaches aligned with business goals
• Collaborate with architects, product managers, and other engineers to define clean API contracts and system boundaries
• Ensure code quality, performance, and maintainability through reviews, mentoring, and hands-on contributions
• Implement asynchronous messaging patterns using Kafka (preferred)
• Work with NoSQL databases such as DynamoDB (preferred) and relational databases as needed
• Lead by example in Agile/Scrum ceremonies and champion engineering excellence across the team
• Continuously assess and improve processes, performance, and scalability

Required Skills and Experience:
• 5+ years of backend software engineering experience with .NET / .NET Core
• Strong expertise in C#, object-oriented design, and API development
• Solid understanding of software architecture, design principles, and distributed system concepts
• Experience integrating with messaging systems (Kafka is a plus)
• Familiarity with NoSQL (DynamoDB preferred) and SQL databases
• Comfortable working in CI/CD environments and with version control systems like Git
• Proven ability to communicate technical ideas clearly and effectively to both technical and non-technical stakeholders

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Position Overview
Job Title: Associate Engineer
Corporate Title: Associate
Location: Pune, India

Role Description
The Associate Engineer is responsible for performing development work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
• Planning and developing entire engineering solutions to accomplish business goals
• Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
• Ensuring maintainability and reusability of engineering solutions
• Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
• Reviewing engineering plans and quality to drive re-use and improve engineering capability
• Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
• Best in class leave policy
• Gender neutral parental leaves
• 100% reimbursement under childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Employee Assistance Program for you and your family members
• Comprehensive hospitalization insurance for you and your dependents
• Accident and term life insurance
• Complimentary health screening for 35 yrs. and above

Your Key Responsibilities
This role is for an Engineer responsible for the design, development and unit testing of software applications. The candidate is expected to ensure that good quality, maintainable, scalable and high-performing software applications are delivered to users in an Agile development environment. You should come from a strong technological background, have experience working in Google Cloud Platform, be hands-on, and be able to work independently with minimal technical/tool guidance.

Your Skills And Experience
• Java solution design and development experience
• Java Spring Boot development experience
• Practical and applied knowledge of design patterns (and anti-patterns) in Java in general and Java Spring Boot specifically
• Hands-on experience working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, JSON
• Hands-on experience in Google Cloud Platform
• Experience with cloud development platforms: Spring Cloud; OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g. Git, TeamCity, Maven, SONAR
• Experience with software design patterns and UML design
• Experience in integration design patterns with Kafka
• Experience with Agile/Scrum environments
• Familiar with Agile team management tools (JIRA, Confluence)
• Understand and promote Agile values: FROCC (Focus, Respect, Openness, Commitment, Courage)
• Good communication skills
• Pro-active team player
• Comfortable working in multi-disciplinary, self-organized teams
• Professional knowledge of English
• Differentiators: knowledge/experience about

How We’ll Support You
• Training and development to help you excel in your career
• Coaching and support from experts in your team
• A culture of continuous learning to aid progression
• A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 days ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Hyderabad

Work from Office

Naukri logo

About us
We are a fast-growing and highly exciting start-up revolutionising group communications and mass broadcasting. Our mobile app allows people to build their own mobile apps for free in just a few hours by white-labeling our app, and to create groups of unlimited people with unlimited sub-groups. We are about to launch our product, and several colleges are already using the beta version. The company is small currently, run by IIT-IIM graduates, and has an excellent working atmosphere. We are constantly looking for people who want to challenge our existing thinking, experiment with different approaches, contribute new ideas to the product, and in general work in a start-up environment. Apply to us to be a part of a disruptive company, feel like an entrepreneur, and have great fun while at it!

Job Description
We are looking for people with good skills in Java programming (you will be tested in Java), and preferably with a B.E./B.Tech. in Computer Science Engineering. You will be trained to be a full-stack mobile and web application developer (we may deploy you in testing as well). You will be working on programming and testing on the following stack:
1. Angular, HTML, CSS, Javascript/TypeScript
2. Java, PHP (Laravel), MySQL, Redis, ElasticSearch, Azure Table Storage
3. Swift, Selenium
4. Cloud infrastructure (IAAS, PAAS) - AWS, DigitalOcean, Azure
You will learn to write code that not just works, but works very fast and uses very little hardware resources to execute. You need extremely good teachers and challenging assignments to write such code, and you will find both in our company. These are also some of the hottest technologies around, and you will be an extremely sought-after software engineer if you work for 2-3 years in these areas in the kind of environment and with the kind of challenging assignments that we provide. You will be working Mon-Sat (6 days a week), 8-9 hours a day. This is not a work-from-home job, and you will need to attend office on all working days.

Desired Candidate Profile
We are looking for people with good skills in Java programming (you will be tested in Java), and preferably with a B.E./B.Tech. in Computer Science Engineering. We are looking for people with an excellent academic record, and good ranks in JEE Mains and EAMCET. Please make sure to mention all your percentages and ranks when you apply. You will be taking some quantitative aptitude tests as part of the interviewing process.

Perks and Benefits
You will be on probation for 4 months, during which you will get a stipend of Rs. 10,000 per month. After that (if selected), you will get paid Rs. 20-25,000 per month depending on your performance in the probation period. In addition, you will be asked to complete certain courses/certifications, and if you complete all of them, you can get up to 100% more as salary in as soon as 6-18 months. We will also offer stock/ESOPs in the company if you work for a certain minimum period of time. If you think you deserve a higher salary, please let us know why in the questionnaire - we are open. We are also a small company with a family-like atmosphere and very friendly people, and you will feel quite respected here. So go ahead and apply to us - we look forward to speaking with you!

Posted 2 days ago

Apply

8.0 - 12.0 years

12 - 16 Lacs

Pune

Work from Office

Naukri logo

Roles & Responsibilities: Design and develop end-to-end data pipelines using PySpark, Python, SQL and Kafka, leveraging Microsoft Fabric's capabilities.

Requirements:
• Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
• Strong expertise in PySpark and Python for large-scale data processing and transformation.
• Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
• Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
• Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
• Healthcare domain knowledge is a plus but not mandatory.

Posted 2 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description:
• Bachelor's degree in Computer Science, Information Systems or a related field
• 2+ years of experience in Java, RESTful APIs, Spring, Spring MVC, Spring Kafka, microservices, and database technologies
• 1 year of experience building Java-based APIs
• 1 year of experience with an API documentation tool, Swagger preferred
• 1 year of experience in API monitoring and dashboards using ELK and Dynatrace
• 1 year of experience in unit and functional testing using JUnit, Mockito/JMock, Selenium, Cucumber
• 1 year of experience in event-driven microservice architecture using Kafka
• 1 year of experience with testing tools/methodologies
• 2+ years of experience with advanced Git skills and respective branching strategies
• Relational database knowledge including SQL, Oracle, MS SQL, PostgreSQL
• Understanding of JSON, XML, SoapUI, or Postman (API testing tool)
• Analyzing requirements in user stories and developing software from acceptance criteria
• Experience working with an Agile/Scrum/Kanban development team and software such as Itrack (Jira) & ADO is preferred
• Work with Leads, Engineers, Architects, Product Managers, and Business stakeholders to identify technical and functional needs of systems based on priority
• Writing great quality code with a relentless passion for automated testing and validation
• Excellent communication skills and experience in collaborative environments

Weekly Hours: 40
Time Type: Regular
Location: Hyderabad, Andhra Pradesh, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 2 days ago

Apply

10.0 - 15.0 years

12 - 18 Lacs

Pune

Work from Office

Naukri logo

Responsibilities:
• Design and deliver corporate training programs using Python
• Ensure proficiency in Python, PySpark, data structures, NumPy, Pandas, AWS, Azure, GCP cloud, data visualization, and Big Data tools
• Experience in core Python skills

Food allowance, travel allowance, house rent allowance

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Kochi

Work from Office

Naukri logo

Create Solution Outline and Macro Design to describe end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RfP responses, solution architecture, planning and estimation. Contribute to reusable component / asset / accelerator development to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and work as design authority.

Required education: Bachelor's Degree
Preferred education: Non-Degree Program

Required technical and professional expertise:
• Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems
• Experience in data engineering and architecting data platforms; experience in architecting and implementing data platforms on the Azure Cloud Platform
• Experience on Azure cloud is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow
• Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks

Preferred technical and professional experience:
• Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem
• Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend or Tibco Data Fabric
• Exposure to data cataloging and governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune

Work from Office

Naukri logo

The Developer leads cloud application development/deployment. A developer's responsibility is to lead the execution of a project by working with a level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Strong proficiency in Java, Spring Framework, Spring Boot, RESTful APIs; excellent understanding of OOP and design patterns
• Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices
• Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL)
• Experience in container platforms such as Docker and Kubernetes; experience in messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development
• Familiar with Ant, Maven or other build automation frameworks; good knowledge of basic UNIX commands

Preferred technical and professional experience:
• Experience in concurrent design and multi-threading
• Primary skills: Core Java, Spring Boot, Java2/EE, Microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc.); Spark
• Good to have: Python

Posted 2 days ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid

Naukri logo

About the Role: Launched in 2017, Oracle Banking Payments continues to evolve with an ambitious roadmap covering both functional enhancements and modern technology stacks. This is a unique opportunity to join a high-impact development team working on a globally recognized, mission-critical banking product.

Role: Java Development Lead, Oracle Banking Payments

Responsibilities: As a Senior Software Architect, you will:
• Translate business requirements into scalable, maintainable technical designs and code.
• Develop and maintain components using Java, Spring, and microservices frameworks.
• Diagnose and resolve technical issues across environments.
• Lead initiatives to identify and fix application security vulnerabilities.
• Deliver high-quality code with minimal production issues.
• Guide and mentor junior developers, fostering a culture of technical excellence.
• Navigate ambiguity and drive clarity in fast-paced Agile environments.
• Communicate clearly and proactively with cross-functional teams.

Mandatory Skills:
• Expertise in Java, Java microservices, Spring Framework, EclipseLink, JMS, JSON/XML, RESTful APIs.
• Experience developing cloud-native applications.
• Familiarity with Docker, Kubernetes, or similar containerization tools.
• Practical knowledge of at least one major cloud platform (AWS, Azure, Google Cloud).
• Understanding of monitoring tools (e.g., Prometheus, Grafana).
• Experience with Kafka or other message brokers in event-driven architectures.
• Proficient in CI/CD pipelines using Jenkins, GitLab CI, etc.
• Strong SQL skills with Oracle databases.
• Hands-on debugging and performance tuning experience.

Nice to Have:
• Experience with Oracle Cloud Infrastructure (OCI).
• Domain knowledge of the payments industry and processing flows.

What We're Looking For: The ideal candidate is:
• A passionate coder with a deep understanding of Java and modern application design.
• Curious, resourceful, and persistent in solving problems using various approaches, from research and experimentation to creative thinking.
• A proactive mentor and team contributor with a strong sense of accountability.
• Adaptable to evolving technology landscapes and fast-paced environments.

Self-Test Questions – Ask Yourself Before Applying:
1. Have I built or maintained enterprise-grade applications using Java and Spring microservices?
2. Can I explain how I've implemented cloud-native solutions using AWS, Azure, or Google Cloud?
3. Have I worked in Agile teams for at least three years, contributing actively in sprints?
4. Am I comfortable troubleshooting production issues, using tools like Prometheus, Grafana, and log aggregators?
5. Have I designed or debugged RESTful APIs and worked with JSON/XML extensively?
6. Do I have experience integrating applications with message brokers such as Kafka?
7. Have I mentored junior developers or acted as a tech lead on projects?
8. Do I genuinely enjoy solving complex problems and exploring multiple approaches to arrive at the

Posted 2 days ago

Apply

5.0 - 7.0 years

8 - 10 Lacs

Mumbai

Work from Office

Naukri logo

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include:
• Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
• Strive for continuous improvements by testing the built solution and working under an agile framework.
• Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
• Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
• Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
• Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
• Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
• SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
• Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred technical and professional experience:
• Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
• Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
• Good to have: detection and prevention tools for Company products and Platform and customer-facing

Posted 2 days ago

Apply

5.0 - 7.0 years

8 - 10 Lacs

Pune

Work from Office

Naukri logo

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include:
• Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
• Strive for continuous improvements by testing the built solution and working under an agile framework.
• Discover and implement the latest technology trends to maximize and build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
• Big Data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
• Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts.
• Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
• Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
• SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
• Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems.

Preferred technical and professional experience:
• Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
• Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
• Good to have: detection and prevention tools for Company products and Platform and customer-facing

Posted 2 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office

Naukri logo

System Analysis, Design, Development and implementation of Enterprise e-business solutions and n-tier architectures. Customer Interactions, Requirement Gathering, Project Execution/Delivery Process fitment to Maximo, Business process Re-Engineering Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise MAS Experience. Maximo application configuration, JMS/KAFKA Integration setup, BIRT reports, Maximo Mobile Generating Custom Reports using Actuate & BIRT Design and Development of External System Integrations using MEA Preferred technical and professional experience Installation of Maximo applications, Websphere Configurations. Upgrading Maximo from the legacy versions to the latest. Customizations and configurations in Maximo.

Posted 2 days ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Develop, test, and maintain applications using Java and Spring Boot. Design and implement microservices architecture. Work with databases to ensure data integrity and performance. Collaborate with cross-functional teams to define, design. Required Candidate profile Proficiency in Java programming. Experience with Spring Boot framework. Knowledge of microservices architecture. Familiarity with databases (SQL/NoSQL). Basic understanding of Kafka and S3.

Posted 2 days ago

Apply

4.0 - 7.0 years

5 - 15 Lacs

Noida

Work from Office

Naukri logo

Mandatory Skills: Python, Django/Flask, databases (PostgreSQL/MySQL/MongoDB), database management, REST API development, multi-threading, CLI systems (e.g., AMOS, CORBA), Kafka.

Project Overview:
1. Strong understanding of the Python programming language, its syntax, and libraries.
2. Experience with web frameworks such as Flask and FastAPI.
3. Experience with relational databases, PostgreSQL.
4. Python (Django or Flask), REST API development, database management (PostgreSQL/MySQL), multi-threading, and familiarity with CLI systems (e.g., AMOS, CORBA).
5. Should have experience in mobile app development.
6. Developing back-end components.
7. Integrating user-facing elements using server-side logic.

Interested candidates can share their resume at neha.sharma@innovationm.com

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
• AWS Data Vault 2.0 development mechanism for agile data ingestion, storage and scaling
• Databricks for complex queries on transformation, aggregation and business logic implementation
• AWS Redshift and Redshift Spectrum for complex queries on transformation, aggregation and business logic implementation
• DWH concepts: star schema and materialized views
• Strong SQL and data manipulation/transformation skills

Preferred technical and professional experience:
• Robust and scalable cloud infrastructure
• End-to-end data engineering pipelines
• Versatile programming capabilities

Posted 2 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Streaming data - technical skills requirements:
Mandatory skills: Spark, Scala, AWS, Hadoop (Big Data)
Experience: 5+ years
- Solid hands-on and solution architecting experience in Big Data technologies (AWS preferred)
- Hands-on experience in AWS DynamoDB, EKS, Kafka, Kinesis, Glue, EMR
- Hands-on experience with a programming language like Scala with Spark
- Good command and working experience of Hadoop MapReduce, HDFS, Hive, HBase, and/or NoSQL databases
- Hands-on working experience with any of the data engineering/analytics platforms (Hortonworks, Cloudera, MapR, AWS), AWS preferred
- Hands-on experience in data ingestion with Apache NiFi, Apache Airflow, Sqoop, and Oozie
- Hands-on working experience of data processing at scale with event-driven systems and message queues (Kafka, Flink, Spark Streaming)
- Hands-on working experience with AWS services like EMR, Kinesis, S3, CloudFormation, Glue, API Gateway, Lake Formation
- Hands-on working experience with AWS Athena
- Data warehouse exposure to Apache NiFi, Apache Airflow, Kylo
- Operationalization of ML models on AWS (e.g. deployment, scheduling, model monitoring etc.)
- Feature engineering and data processing to be used for model development
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
- Experience building data pipelines for structured/unstructured, real-time/batch, and event-based synchronous/asynchronous data using MQ, Kafka, and stream processing
- Hands-on working experience in analysing source system data and data flows, working with structured and unstructured data
- Must be very strong in writing SQL queries
- Strengthen the data engineering team with Big Data solutions
- Strong technical, analytical, and problem-solving skills

Posted 2 days ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Role: Technical Architect
Experience: 8-15 years
Location: Bangalore, Chennai, Gurgaon, Pune, and Kolkata
Mandatory Skills: Python, PySpark, SQL, ETL, pipelines, Azure Databricks, Azure Data Factory, and architecture design.

Primary Roles and Responsibilities:
• Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
• Ability to provide forward-thinking solutions in the data engineering and analytics space
• Collaborate with DW/BI leads to understand new ETL pipeline development requirements
• Triage issues to find gaps in existing pipelines and fix the issues
• Work with the business to understand reporting-layer needs and develop data models to fulfill reporting needs
• Help junior team members resolve issues and technical challenges
• Drive technical discussions with client architects and team members
• Orchestrate the data pipelines in a scheduler via Airflow

Skills and Qualifications:
• Bachelor's and/or master's degree in computer science or equivalent experience
• Must have 8+ yrs. of total IT experience and 5+ years' experience in Data Warehouse/ETL projects
• Deep understanding of Star and Snowflake dimensional modelling
• Strong knowledge of Data Management principles
• Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
• Should have hands-on experience in SQL, Python and Spark (PySpark)
• Candidate must have experience in the AWS/Azure stack
• Desirable to have ETL with batch and streaming (Kinesis)
• Experience in building ETL / data warehouse transformation processes
• Experience with Apache Kafka for use with streaming data / event-based data
• Experience with other open-source big data products, Hadoop (incl. Hive, Pig, Impala)
• Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J)
• Experience working with structured and unstructured data, including imaging & geospatial data
• Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, Git
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
• Databricks Certified Data Engineer Associate/Professional Certification (desirable)
• Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
• Should have experience working in Agile methodology
• Strong verbal and written communication skills
• Strong analytical and problem-solving skills with a high attention to detail

Posted 2 days ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Naukri logo

Purpose: You join the stream team, an experienced, informal and enthusiastic scrum team of 5 developers, working on stream-processing components to improve our data publication platform. This team is responsible for combining different sources of sports data from all over the world into a single unified product, all in real time. Part of the job is that you work together with international teams of developers located in Gracenote offices around the world.

Job Requirements:
• Has experience with Scala, or other JVM languages with the capability to learn Scala
• Understands stream processing (preferably with Kafka Streams and/or Akka Streams)
• Is comfortable in a DevOps culture, and knows how to get their work into production
• Has relevant work experience with both NoSQL (MongoDB) and SQL databases (Postgres, SQL Server)
• Has affinity with data and data streams
• Has experience working in an Agile environment
• Has good communication skills and is able to share their knowledge with the team
• Has good knowledge of the English language, both spoken and written

Good to have skills:
• Have an affinity with sports, active or passive
• Understand schemas and like data modelling
• Are used to working with the Scrum framework
• Have experience with other programming languages (some other languages we use are Python, Typescript, JS and Java)

Qualifications:
• B.E / B.Tech / BCA / MCA in Computer Science, Engineering or a related subject
• Strong Computer Science fundamentals
• Comfortable with version control systems such as git
• A thirst for learning new tech and keeping up with industry advances
• Excellent communication and knowledge-sharing skills
• Comfortable working with technical and non-technical teams
• Strong debugging skills
• Comfortable providing and receiving code review feedback
• A positive attitude, adaptability, enthusiasm, and a growth mindset

Posted 2 days ago

Apply

175.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you’ll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? Responsible for contacting clients with overdue accounts to secure the settlement of the account. Also, they do preventive work to avoid future overdue with accounts that have a high exposure. Key Responsibilities Contributes to design, development, troubleshooting, debugging, evaluating, modifying, deploying, and documenting software and systems that meet the needs of customer-facing applications, business applications, and/or internal end user applications. Perform technical aspects of software development for assigned applications including design, developing prototypes, and coding assignments Familiar with Agile or other rapid application development methods Experience with design and coding across one or more platforms and languages as appropriate Hands-on expertise with application design, software development and automated testing Lead code reviews and automated testing Debug software components and identify code defects for remediation Leads the deployment, support, and monitoring of software across test, integration, and production environments. Explore and innovate new solution to modernize platforms Collaborates with leadership across multiple teams to define solution requirements and technical implementation Engineering & Architecture’ Demonstrate technical expertise to help team members overcome technical problems Solves technical problems outside of day-to-day responsibilities Leadership Takes accountability for the success of the team achieving their goals Drives the team’s strategy and prioritizes initiatives Influence team members by challenging status quo, demonstrating risk taking, and implementing innovative ideas Be a productivity multiplier for your team by analysing your workflow and contributing to enable the team to be more effective, productive, and demonstrating faster and stronger results. Minimum Qualifications/ Must Have 3+ years of software development experience in a professional environment and/or comparable experience Hands-on experience with Java 8 & above JavaScript, React JS, typescript, HTML, CSS. Strong experience in developing UI mockups, experience in J2EE, RESTful, SOAP API development. Experience in Event driven programming paradigm using Kafka. Knowledge of Source control (Git, Bitbucket etc). CI/CD (Jenkins, Maven/Gradle, Mockito, JMeter) Knowledge of VSS, IaaS, PaaS. Container Concepts (LXD, Docker). Knowledge of Serverless architecture (Lambda) will be an additional advantage. Demonstrated experience in Agile development, application design, software development, and testing Bachelor’s degree in computer science, computer science engineering, or related experience required, advanced degree. We back you with benefits that support your holistic well-being so you can be and deliver your best. 
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
• Competitive base salaries
• Bonus incentives
• Support for financial well-being and retirement
• Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
• Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
• Generous paid parental leave policies (depending on your location)
• Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
• Free and confidential counseling support through our Healthy Minds program
• Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 2 days ago

Apply

18.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Role: Enterprise Architect
Grade: VP
Location: Pune / Mumbai / Chennai
Experience: 18+ years
Organization: Intellect Design Arena Ltd. (www.intellectdesign.com)

About the Role: We are looking for a senior Enterprise Architect with strong leadership and deep technical expertise to define and evolve the architecture strategy for iGTB, our award-winning transaction banking platform. The ideal candidate will have extensive experience architecting large-scale, cloud-native enterprise applications within the BFSI domain, and will be responsible for driving innovation, ensuring engineering excellence, and aligning architecture with evolving business needs.

Mandatory Skills:
• Cloud-native architecture
• Microservices-based systems
• PostgreSQL, Apache Kafka, ActiveMQ
• Spring Boot / Spring Cloud, Angular
• Strong exposure to the BFSI domain

Key Responsibilities:
• Architectural Strategy & Governance: Define and maintain enterprise architecture standards and principles across iGTB product suites. Set up governance structures to ensure compliance across product lines.
• Technology Leadership: Stay updated on emerging technologies; assess and recommend adoption to improve scalability, security, and performance.
• Tooling & Automation: Evaluate and implement tools to improve developer productivity, code quality, and application reliability, including automation across testing, deployment, and monitoring.
• Architecture Evangelism: Drive adoption of architecture guidelines and tools across engineering teams through mentorship, training, and collaboration.
• Solution Oversight: Participate in the design of individual modules to ensure technical robustness and adherence to enterprise standards.
• Performance & Security: Oversee performance benchmarking and security assessments. Engage with third-party labs for certification as needed.
• Customer Engagement: Represent architecture in pre-sales, CXO-level interactions, and post-production engagements to demonstrate the product's technical superiority.
• Troubleshooting & Continuous Improvement: Support teams in resolving complex technical issues. Capture learnings and feed them back into architectural best practices.
• Automation Vision: Lead the end-to-end automation charter for iGTB, across code quality, CI/CD, testing, monitoring, and release management.

Profile Requirements:
• 18+ years of experience in enterprise and solution architecture roles, preferably within BFSI or fintech
• Proven experience with mission-critical, scalable, and secure systems
• Strong communication and stakeholder management skills, including CXO interactions
• Demonstrated leadership in architecting complex enterprise products and managing teams of architects
• Ability to blend technical depth with business context to drive decisions
• Passion for innovation, engineering excellence, and architectural rigor

Posted 2 days ago

Apply

4.0 - 9.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Naukri logo

Lead Java Software Engineer Experience: 4 - 12 Years Exp Salary : Competitive Preferred Notice Period : Within 30 Days Opportunity Type: Onsite (Bengaluru) Placement Type: Permanent (*Note: This is a requirement for one of Uplers' Clients) Must have skills required : Java, Springboot, Microservices, Kafka, AWS Practo (One of Uplers' Clients) is Looking for: Lead Java Software Engineer who is passionate about their work, eager to learn and grow, and who is committed to delivering exceptional results. If you are a team player, with a positive attitude and a desire to make a difference, then we want to hear from you. Job Description Key Responsibilities: Design, develop, and maintain high-quality software solutions using Java 8 and Spring Boot. Build and optimize backend services and APIs, ensuring scalability, performance, and security. Design and manage data models using PostgreSQL and Redis, ensuring data integrity and efficiency. Optimize database queries and implement best practices for database management and scaling. Contribute to frontend development using ReactJS, ensuring seamless integration between backend and frontend components. Qualifications: Bachelors degree in Computer Science, Engineering, or a related field. Equivalent experience will also be considered. Strong proficiency in Java 8/11 and Spring Boot. Experience with relational databases like MySQL/PostgreSQL and in-memory data stores like Redis/Memcache. Professional Experience: 4 to 12 years of experience in software development, with a focus on backend technologies. Proven experience in building and scaling distributed systems. Strong problem-solving skills and ability to work in a fast-paced environment. Excellent communication and collaboration skills. Ability to work independently as well as in a team setting. Leadership qualities with a passion for mentoring and developing junior engineers. How to apply for this opportunity: Easy 3-Step Process: 1. Click On Apply! And Register or log in on our portal 2. Upload updated Resume & Complete the Screening Form 3. Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 days ago

Apply

6.0 - 11.0 years

15 - 22 Lacs

Thiruvananthapuram

Work from Office

Naukri logo

Role & Responsibilities:
1. Extensive hands-on experience with C#, .NET Core, Web API, MVC, SQL, and Entity Framework.
2. Analyzing and designing software products to meet client specifications.
3. Leading teams and managing projects throughout the development lifecycle.
4. Proven track record of leading client/team discussions, resolving team issues, and handling escalations.
5. Implementing, testing, and fixing functionality.
6. Participating in agile Scrum deliveries as a team member.
7. Defining high-level requirements through interaction with stakeholders.
8. Improving code performance by providing critical suggestions for fixes.
9. Troubleshooting technical issues such as hangs, slow performance, memory leaks, and crashes.
10. Analyzing and documenting requirements for development and software maintenance.
11. Keeping up to date with current and emerging technologies.
12. Managing multiple tasks and prioritizing effectively to meet deadlines.

Preferred candidate profile:
1. 6+ years of IT industry experience
2. Extensive knowledge of C# and .NET Core technologies
3. Proficient experience in Azure cloud technologies
4. Familiarity with Docker, Kubernetes, Kafka, NoSQL stores like Cassandra, MongoDB, etc., and cloud platforms
5. Good understanding of Agile development practices is a plus
6. Good to have knowledge of Hazelcast, Redis
7. Excellent communication and client-facing skills
8. Knowledge of NodeJS, React JS/Angular JS will be an added benefit
9. Experience in software engineering and design architecture

Experience with the following tools is an added advantage:
1. Visual Studio 2022
2. ReSharper
3. SQL Server Management Studio
4. TFS/VSTS or a similar task tracking tool
5. Git version control
6. Debug diagnostic tools

Perks and benefits: Based on experience, skills and interview performance, the salary is negotiable for deserving candidates.

Posted 2 days ago

Apply

5.0 - 8.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Naukri logo

Java, Spring, Kafka and Azure (optional).

We are looking for a skilled Senior Consultant with expertise in Java backend and Kafka to join our team at Best Hawk Infosystems Pvt. Ltd. The ideal candidate will have a strong background in IT services & consulting, particularly in Java backend and Kafka development.

Roles and Responsibilities:
• Design, develop, and implement scalable Java-based backend systems using Spring Boot or similar frameworks.
• Collaborate with cross-functional teams to identify business requirements and design solutions that meet customer needs.
• Develop high-quality code that is efficient, modular, and easy to maintain.
• Troubleshoot issues and optimize system performance for improved efficiency.
• Participate in code reviews to ensure adherence to coding standards and best practices.
• Stay updated with industry trends and emerging technologies to continuously improve skills and knowledge.

Job Requirements:
• Strong proficiency in the Java programming language with experience in developing large-scale applications.
• Experience with Kafka messaging system integration is required.
• Excellent problem-solving skills and the ability to analyze complex issues and provide effective solutions.
• Strong communication and collaboration skills are essential for working effectively with teams.
• Ability to work in an agile environment and adapt to changing priorities and deadlines.
• Strong attention to detail and a focus on delivering high-quality results.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII At Target, we have a timeless purpose and a proven strategy. And that hasn’t happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target’s global team and has more than 4,000 team members supporting the company’s global strategy. Pyramid overview Roundel is Target’s entry into the media business with an impact of $1B+; an advertising sell-side business built on the principles of first party (people based) data, brand safe content environments and proof that our marketing programs drive business results for our clients. We are here to drive business growth for our clients and redefine “value” in the industry by solving core industry challenges vs. copy current industry methods of operation. Roundel is a key growth initiative for Target and lead the industry to a better way of operating within the media marketplace. Target Tech is on a mission to offer the systems, tools and support that our clients, guests and team members need and deserve. We drive industry-leading technologies in support of every angle of the business, and help ensure that Target operates smoothly, securely, and reliably from the inside out. Role Overview As a Senior Engineer, you serve as a specialist in the engineering team that supports the product. You help develop and gain insight in the application architecture. You can distill an abstract architecture into concrete design and influence the implementation. You show expertize in applying the appropriate software engineering patterns to build robust and scalable systems. You are an expert in programming and apply your skills in developing the product. You have the skills to design and implement the architecture on your own, but choose to influence your fellow engineers by proposing software designs, providing feedback on software designs and/or implementation. You show good problem solving skills and can help the team in triaging operational issues. You leverage your expertise in eliminating repeat occurrences. We are looking for a highly skilled and motivated Senior Backend Developer with deep expertise in Java or Kotlin and modern backend technologies. You will be responsible for designing, building, and maintaining scalable backend systems that power our platform. If you're passionate about building high-performance APIs, optimizing data flow, and working with large-scale systems, we'd love to meet you. Key Responsibilities Design, develop, and maintain robust backend services using Java or Kotlin and Spring Framework (Spring Boot, Spring Data, etc.). Develop RESTful APIs and backend components that are secure, scalable, and performant. Work with both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra) databases. Work with Kafka and Kafka Streams. 
Integrate and optimize Elasticsearch for advanced search functionality. Write clean, maintainable, and testable code with proper documentation. Participate in system design, architecture discussions, and code reviews. Collaborate with product managers, frontend developers, and QA engineers to deliver seamless features. Ensure system reliability and performance tuning, and monitor services in production. Follow DevOps and CI/CD best practices.

Required Qualifications: 5+ years of backend development experience. Strong programming skills in Java or Kotlin. Deep understanding of the Spring ecosystem (Spring Boot, Spring Security, etc.). Solid experience working with both relational and non-relational databases. Experience implementing Elasticsearch in production systems. Proficiency in designing and consuming RESTful APIs. Experience with microservices architecture and distributed systems. Strong problem-solving and debugging skills. Familiarity with version control tools like Git and CI/CD tools (Jenkins, GitHub Actions, etc.).

Good to Have: Experience with containerization and orchestration (Docker, Kubernetes). Exposure to cloud platforms (GCP, AWS, Azure).

Useful Links: Life at Target - https://india.target.com/ | Benefits - https://india.target.com/life-at-target/workplace/benefits | Culture - https://india.target.com/life-at-target/belonging

Posted 2 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune, Gurugram

Work from Office

Naukri logo

In one sentence We are seeking an experienced Kafka Administrator to manage and maintain our Apache Kafka infrastructure, with a strong focus on deployments within OpenShift and Cloudera environments. The ideal candidate will have hands-on experience with Kafka clusters, container orchestration, and big data platforms, ensuring high availability, performance, and security. What will your job look like? Install, configure, and manage Kafka clusters in production and non-production environments. Deploy and manage Kafka on OpenShift using Confluent for Kubernetes (CFK) or similar tools. Integrate Kafka with Cloudera Data Platform (CDP), including services like NiFi, HBase, and Solr. Monitor Kafka performance and implement tuning strategies for optimal throughput and latency. Implement and manage Kafka security using SASL_SSL, Kerberos, and RBAC. Perform upgrades, patching, and backup/recovery of Kafka environments. Collaborate with DevOps and development teams to support CI/CD pipelines and application integration. Troubleshoot and resolve Kafka-related issues in a timely manner. Maintain documentation and provide knowledge transfer to team members. All you need is... 5+ years of experience as a Kafka Administrator. 2+ years of experience deploying Kafka on OpenShift or Kubernetes. Strong experience with Cloudera ecosystem and integration with Kafka. Proficiency in Kafka security protocols (SASL_SSL, Kerberos). Experience with monitoring tools like Prometheus, Grafana, or Confluent Control Center. Solid understanding of Linux systems and shell scripting. Familiarity with CI/CD tools (Jenkins, GitLab CI, etc.). Excellent problem-solving and communication skills.
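For context on the availability checks this kind of administration role involves, a routine cluster health check against a SASL_SSL-secured cluster can be scripted with Kafka's Java AdminClient. The sketch below is illustrative only: the bootstrap address, SCRAM mechanism, and credentials are placeholder assumptions, not details from the posting.

```java
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.DescribeClusterResult;
import org.apache.kafka.common.config.SaslConfigs;

public class ClusterHealthCheck {

    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        // Hypothetical bootstrap address and credentials; real values come from the environment.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.internal:9093");
        props.put(AdminClientConfig.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"admin\" password=\"change-me\";");

        try (AdminClient admin = AdminClient.create(props)) {
            // Basic availability check: confirm the brokers respond and list the topics they host.
            DescribeClusterResult cluster = admin.describeCluster();
            System.out.println("Cluster id: " + cluster.clusterId().get());
            cluster.nodes().get().forEach(node ->
                    System.out.println("Broker " + node.id() + " at " + node.host() + ":" + node.port()));
            System.out.println("Topics: " + admin.listTopics().names().get());
        }
    }
}
```

In practice an administrator would wire a check like this (or the equivalent kafka-topics.sh / kafka-broker-api-versions.sh commands) into the monitoring stack mentioned above, such as Prometheus or Confluent Control Center.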

Posted 2 days ago

Apply

Exploring Kafka Jobs in India

Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Gurgaon

These cities are known for their thriving tech industries and have a high demand for Kafka professionals.

Average Salary Range

The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.

Career Path

Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.

Related Skills

In addition to Kafka expertise, employers often look for professionals with skills in:
  • Apache Spark
  • Apache Flink
  • Hadoop
  • Java/Scala programming
  • Data engineering and data architecture

Interview Questions

  • What is Apache Kafka and how does it differ from other messaging systems? (basic)
  • Explain the role of Zookeeper in Apache Kafka. (medium)
  • How does Kafka guarantee fault tolerance? (medium)
  • What are the key components of a Kafka cluster? (basic)
  • Describe the process of message publishing and consuming in Kafka. (medium)
  • How can you achieve exactly-once message processing in Kafka? (advanced)
  • What is the role of Kafka Connect in Kafka ecosystem? (medium)
  • Explain the concept of partitions in Kafka. (basic)
  • How does Kafka handle consumer offsets? (medium)
  • What is the role of a Kafka Producer API? (basic)
  • How does Kafka ensure high availability and durability of data? (medium)
  • Explain the concept of consumer groups in Kafka. (basic)
  • How can you monitor Kafka performance and throughput? (medium)
  • What is the purpose of Kafka Streams API? (medium)
  • Describe the use cases where Kafka is not a suitable solution. (advanced)
  • How does Kafka handle data retention and cleanup policies? (medium)
  • Explain the Kafka message delivery semantics. (medium)
  • What are the different security features available in Kafka? (medium)
  • How can you optimize Kafka for high throughput and low latency? (advanced)
  • Describe the role of a Kafka Broker in a Kafka cluster. (basic)
  • How does Kafka handle data replication across brokers? (medium)
  • Explain the significance of serialization and deserialization in Kafka. (basic)
  • What are the common challenges faced while working with Kafka? (medium)
  • How can you scale Kafka to handle increased data loads? (advanced)
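Many of the basic and medium questions above (producer API, partitions, consumer groups, offsets) map directly onto the core Kafka client workflow. The following minimal Java sketch is a study aid rather than production code; it assumes a local broker at localhost:9092 and a hypothetical "orders" topic, and shows a producer publishing a keyed record and a consumer in a consumer group polling and committing offsets.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaQuickTour {

    public static void main(String[] args) {
        // Producer: publishes a single record to the "orders" topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for all in-sync replicas (durability)

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            // The key ("order-42") determines which partition the record lands on.
            producer.send(new ProducerRecord<>("orders", "order-42", "{\"amount\": 1200}"));
        }

        // Consumer: member of the "billing-service" consumer group; Kafka balances
        // the topic's partitions across all members of this group.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "billing-service");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit offsets explicitly
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("orders"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                        record.partition(), record.offset(), record.key(), record.value());
            }
            // Committing marks these offsets as processed for this consumer group.
            consumer.commitSync();
        }
    }
}
```

Being able to explain each configuration line here, such as why acks=all affects durability or what happens to offsets when auto-commit is disabled, covers a good share of the interview questions listed above.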

Closing Remark

As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!
