
8532 Kafka Jobs - Page 45

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Experience: 5+ Years

Role Overview: Responsible for designing, building, and maintaining scalable data pipelines and architectures. This role requires expertise in SQL, ETL frameworks, big data technologies, cloud services, and programming languages to ensure efficient data processing, storage, and integration across systems.

Requirements:
• Minimum 5+ years of experience as a Data Engineer or in a similar data-related role.
• Strong proficiency in SQL for querying databases and performing data transformations.
• Experience with data pipeline frameworks (e.g., Apache Airflow, Luigi, or custom-built solutions).
• Proficiency in at least one programming language such as Python, Java, or Scala for data processing tasks.
• Experience with cloud-based data services and data lakes (e.g., Snowflake, Databricks, AWS S3, GCP BigQuery, or Azure Data Lake).
• Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka).
• Experience with ETL tools (e.g., Talend, Apache NiFi, SSIS) and data integration techniques.
• Knowledge of data warehousing concepts and database design principles.
• Good understanding of NoSQL and big data technologies such as MongoDB, Cassandra, Spark, Hadoop, and Hive.
• Experience with data modeling and schema design for OLAP and OLTP systems.
• Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

Educational Qualification: Bachelor’s/Master’s degree in Computer Science, Information Technology, or a related field.

Posted 5 days ago


6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Job Title: Senior Java Backend Developer
Location: Trivandrum
Experience: 6+ Years

Job Summary
We are looking for a Senior Java Backend Developer to design and develop robust, high-performance backend solutions for our client's e-commerce platforms. The role involves working in an Agile environment with cross-functional teams to build scalable and secure applications using modern backend technologies.

Key Responsibilities
• Design, develop, and maintain backend services using Java, Kotlin, and Spring Boot
• Build scalable solutions using Kafka, message queues, and AWS Lambda
• Manage and optimize data using PostgreSQL, Oracle, and DynamoDB
• Integrate various systems using ATG 11.3, IBM App Connect, and IBM Integration Bus (IIB)
• Implement and maintain CI/CD pipelines using Jenkins and GitLab CI/CD
• Ensure high standards for performance, reliability, and scalability
• Participate in code reviews, testing, and technical documentation

Mandatory Skills
• Java (8+) – Advanced level
• Kotlin – At least 4 years of hands-on experience
• Spring Boot – Strong proficiency with microservices architecture
• Kafka and MQ – Solid experience with messaging systems
• PostgreSQL & DynamoDB – Expertise in database technologies
• AWS Lambda – Experience with serverless architecture
• CI/CD tools – Jenkins and GitLab CI/CD
• ATG 11.3 – Prior implementation experience is essential
• E-commerce platforms – Hands-on exposure
• Excellent communication skills in English (both written and verbal)

Good To Have Skills
• Experience with Oracle database
• Familiarity with IBM App Connect and IBM Integration Bus (IIB)
• Background in Agile/Scrum methodologies
• Experience working in distributed or multinational teams
• Strong analytical and problem-solving skills

Skills: Java, ATG, Microservices

Posted 5 days ago


6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Job Title: Senior Java Backend Developer
Location: Trivandrum
Experience: 6+ Years

Job Summary
We are seeking a Senior Java Backend Developer to join our team in Trivandrum. The ideal candidate will bring strong backend development expertise, particularly in Java and Kotlin, to build scalable, high-performance backend systems for our client’s e-commerce platform. You will work in a dynamic Agile environment, collaborating with cross-functional teams to deliver secure, efficient, and reliable solutions.

Key Responsibilities
• Design, develop, and maintain backend services using Java, Kotlin, and Spring Boot
• Build and optimize event-driven systems using Kafka, MQ, and AWS Lambda
• Manage and maintain databases including PostgreSQL, Oracle, and DynamoDB
• Integrate systems and services using ATG 11.3, IBM App Connect, and IBM Integration Bus (IIB)
• Develop and maintain CI/CD pipelines using Jenkins and GitLab CI/CD
• Ensure backend services meet standards for performance, scalability, and reliability
• Participate in code reviews and automated testing, and create relevant technical documentation

Mandatory Skills
• Java (8+) – Advanced development experience
• Kotlin – Minimum 4 years of hands-on experience
• Spring Boot – Proficiency with microservices architecture
• Kafka and MQ – Strong knowledge of messaging systems
• PostgreSQL and DynamoDB – In-depth experience with relational and NoSQL databases
• AWS Lambda – Understanding of serverless architecture and implementation
• CI/CD tools – Experience with Jenkins and GitLab CI/CD
• ATG 11.3 – Hands-on experience is essential
• E-commerce platforms – Proven background working with e-commerce systems
• Excellent communication skills in English (written and verbal)

Good To Have Skills
• Experience with Oracle database
• Familiarity with IBM App Connect and IBM Integration Bus (IIB)
• Understanding of Agile/Scrum methodology
• Experience working in distributed or multinational teams
• Strong analytical and problem-solving abilities

Skills: Java, ATG, Microservices

Posted 5 days ago


6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Job Title: Senior Java Backend Developer – ATG / Fluent Commerce
Location: Trivandrum
Experience: 6+ Years

Job Summary
We are looking for a Senior Java Backend Developer with hands-on experience in ATG or Fluent Commerce platforms to join our team. The successful candidate will be responsible for developing high-performance backend systems that support our client’s e-commerce operations. You will work in an Agile environment and collaborate with cross-functional teams to deliver scalable, secure, and efficient solutions.

Key Responsibilities
• Design, develop, and maintain backend services using Java, Spring Boot, and ATG or Fluent Commerce
• Develop and optimize messaging-based solutions using Kafka, MQ, and AWS Lambda
• Manage relational and NoSQL databases such as PostgreSQL, Oracle, and DynamoDB
• Integrate systems and services using App Connect, IBM Integration Bus (IIB), and e-commerce middleware
• Implement CI/CD pipelines using Jenkins and GitLab CI/CD
• Ensure backend systems meet high standards for performance, scalability, and reliability
• Participate in code reviews and testing, and maintain technical documentation

Mandatory Skills
• Java (Java 8+) – Strong backend development expertise
• ATG or Fluent Commerce – Proven experience working with either platform
• Spring Boot – Proficiency with microservices-based architecture
• Kafka and MQ – Experience in developing event-driven architectures
• PostgreSQL and DynamoDB – Hands-on experience with relational and NoSQL databases
• AWS Lambda – Knowledge of cloud-native/serverless services
• CI/CD – Experience using tools such as Jenkins and GitLab
• Strong problem-solving and collaboration skills
• Excellent verbal and written communication in English

Good To Have Skills
• Experience with Oracle database
• Familiarity with IBM App Connect and IBM Integration Bus (IIB)
• Understanding of Agile/Scrum development methodologies
• Prior experience working in global or distributed teams

Skills: Java, ATG, Microservices

Posted 5 days ago


5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes
• Interpret the application/feature/component design and develop it in accordance with specifications.
• Code, debug, test, document, and communicate product/component/feature development stages.
• Validate results with user representatives; integrate and commission the overall solution.
• Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
• Optimize efficiency, cost, and quality.
• Influence and improve customer satisfaction.
• Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures Of Outcomes
• Adherence to engineering process and standards (coding standards)
• Adherence to project schedule/timelines
• Number of technical issues uncovered during the execution of the project
• Number of defects in the code
• Number of defects post delivery
• Number of non-compliance issues
• On-time completion of mandatory compliance trainings

Outputs Expected
• Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
• Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, and requirements test cases/results.
• Configure: Define and govern the configuration management plan; ensure compliance from the team.
• Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
• Domain Relevance: Advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.
• Manage Project: Manage delivery of modules and/or manage user stories.
• Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
• Estimate: Create and provide input for effort estimation for projects.
• Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
• Release: Execute and monitor the release process.
• Design: Contribute to creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
• Interface With Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
• Manage Team: Set FAST goals and provide feedback; understand aspirations of team members and provide guidance, opportunities, etc.; ensure the team is engaged in the project.
• Certifications: Take relevant domain/technology certifications.

Skill Examples
• Explain and communicate the design/development to the customer.
• Perform and evaluate test results against product specifications.
• Break down complex problems into logical components.
• Develop user interfaces and business software components.
• Use data models.
• Estimate time and effort required for developing/debugging features/components.
• Perform and evaluate tests in the customer or target environment.
• Make quick decisions on technical/project-related challenges.
• Manage a team, mentor, and handle people-related issues in the team.
• Maintain high motivation levels and positive dynamics in the team.
• Interface with other teams, designers, and other parallel practices.
• Set goals for self and team; provide feedback to team members.
• Create and articulate impactful technical presentations.
• Follow a high level of business etiquette in emails and other business communication.
• Drive conference calls with customers, addressing customer questions.
• Proactively ask for and offer help.
• Ability to work under pressure; determine dependencies and risks; facilitate planning; handle multiple tasks.
• Build confidence with customers by meeting deliverables on time with quality.
• Estimate time, effort, and resources required for developing/debugging features/components.
• Make appropriate utilization of software/hardware.
• Strong analytical and problem-solving abilities.

Knowledge Examples
• Appropriate software programs/modules
• Functional and technical designing
• Programming languages – proficient in multiple skill clusters
• DBMS
• Operating systems and software platforms
• Software Development Life Cycle
• Agile – Scrum or Kanban methods
• Integrated development environments (IDE)
• Rapid application development (RAD)
• Modelling technology and languages
• Interface definition languages (IDL)
• Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments
We are seeking a senior-level Kotlin Backend Developer to join our team, focused on developing high-performance backend solutions for our client’s e-commerce platforms. You will work in an Agile environment, collaborating closely with cross-functional teams to design, implement, and maintain scalable and secure applications.

Activities:
• Develop and maintain backend services using Kotlin and Spring Boot
• Work with technologies like Kafka and MQ
• Work on database management for PostgreSQL, Oracle, or DynamoDB
• Implement CI/CD pipelines with Jenkins and GitLab CI/CD
• Apply a good understanding of SOLID principles
• Participate in code reviews, testing, and documentation

Technical Skills Required:
• Strong experience in Kotlin (min. 5 years)
• Proficiency in the Spring Boot framework
• Experience with RESTful APIs and microservices architecture
• Hands-on experience with messaging systems like Kafka and MQ integration
• Knowledge of databases like PostgreSQL, Oracle, or DynamoDB
• Knowledge of cloud services like AWS/GCP/Azure
• Experience with testing frameworks like Jest, Playwright, or similar
• Good knowledge of SOLID principles
• Strong problem-solving skills and the ability to work collaboratively in a team environment
• Excellent command of English (written and spoken)

Skills: Kotlin, Android, GraphQL

Posted 5 days ago


6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description

Role: Senior Java Backend Developer
Location: Trivandrum
Experience: 6+ years

Job Description
We are looking for a Senior Java Backend Developer to join our team and contribute to building high-performance backend solutions for our client's e-commerce platforms. You will work in a fast-paced Agile environment and collaborate with cross-functional teams to design, implement, and maintain scalable and secure applications.

Responsibilities
• Design, develop, and maintain backend services using Java, Kotlin, and Spring Boot.
• Build and optimize event-driven and messaging solutions with Kafka, MQ, and AWS Lambda.
• Manage databases such as PostgreSQL, Oracle, and DynamoDB.
• Integrate various systems using ATG 11.3, IBM App Connect, and IBM Integration Bus (IIB).
• Implement and maintain CI/CD pipelines using Jenkins and GitLab CI/CD.
• Ensure backend services meet performance, reliability, and scalability standards.
• Participate in code reviews, testing, documentation, and team collaboration.

Mandatory Skills
• Java (8+) – Strong development experience.
• Kotlin – Minimum 4 years of hands-on experience.
• Spring Boot – Proficiency in microservices architecture.
• Kafka, MQ – Expertise in messaging systems.
• PostgreSQL and DynamoDB – Deep database knowledge.
• AWS Lambda – Experience with serverless cloud functions.
• CI/CD tools – Jenkins, GitLab CI/CD.
• ATG 11.3 – Prior experience required.
• E-commerce platforms – Working knowledge essential.
• Excellent communication skills in English (written and spoken).

Good To Have Skills
• Experience with Oracle databases.
• Familiarity with IBM App Connect and IBM Integration Bus (IIB).
• Exposure to Agile/Scrum methodologies.
• Strong problem-solving and analytical skills.
• Prior experience in a multinational or distributed team environment.

Skills: Java, ATG, Microservices

Posted 5 days ago


2.0 - 4.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site


Location: Trivandrum

About us
At Arbor, we're on a mission to transform the way schools work for the better. We believe in a future of work in schools where being challenged doesn't mean being burnt out and overworked. Where data guides progress without overwhelming staff. And where everyone working in a school is reminded why they got into education every day. Our MIS and school management tools are already making a difference in over 7,000 schools and trusts: giving time and power back to staff, turning data into clear, actionable insights, and supporting happier working days. At the heart of our brand is a recognition that the challenges schools face today aren't just about efficiency, outputs and productivity - but about creating happier working lives for the people who drive education every day: the staff. We want to make schools more joyful places to work, as well as learn.

About the role
We're seeking a PHP Backend Developer (Platform) with 2-4 years of hands-on experience in developing and maintaining scalable backend systems. The ideal candidate is well-versed in PHP and modern frameworks such as Symfony/Laravel, with a solid understanding of OOP, writing unit test cases, RESTful APIs, MySQL database management, and performance optimization techniques. You'll work closely with product managers and engineering teams to deliver reliable, high-quality features. Familiarity with cloud platforms like AWS is a strong advantage. A strong emphasis on clean, maintainable code and the ability to troubleshoot production issues is essential. We value a passion for continuous learning and a collaborative approach to cross-functional teamwork.

Core responsibilities
• Develop core platform components to aid reusability and stability of the system
• Work with the Head of Platform Engineering/SRE to identify and progress platform improvements related to stability, scalability, and performance
• Work with the QA automation framework to ensure functionality is delivered to a high quality
• Work with DevOps Engineers to understand application impacts and system performance and stability, and work with engineering teams to rectify issues
• Assist in incident response and resolution, and subsequent post-mortems and retrospectives
• Contribute to the platform code base and framework used by Product Engineers across Engineering

About you
• Experience of PHP at scale through frameworks such as Symfony/Laravel
• Experience of distributed cloud systems, specifically Amazon Web Services
• Enterprise software design patterns and their implementation in real-world enterprise systems
• Experience of message queuing and/or streaming systems such as SQS, ActiveMQ, Apache Kafka, AWS Kinesis, AWS Firehose
• Understanding of relational database technologies and their cloud versions (e.g. AWS MySQL Aurora)
• Experience with DataDog, Prometheus or similar observability tools
• A positive and proactive attitude to problem solving
• A team player, willing to muck in and help others when needed; a driven personality who asks questions and actively participates in discussions

Bonus skills
• Past experience with enterprise solutions running at scale
• Familiarity with Scrum methodology or other agile development processes
• Experience with Docker and containerization
• Experience with AWS or other cloud infrastructure
• Familiarity with software best practices such as Refactoring, Clean Code, Domain-Driven Design, Test-Driven Development, etc.

Interview process
• Phone screen
• 1st stage
• 2nd stage
We are committed to a fair and comfortable recruitment process, so if you require any reasonable adjustments during your application or interview process, please reach out to a member of the team at careers@arbor-education.com.

What we offer
The chance to work alongside a team of hard-working, passionate people in a role where you'll see the impact of your work every day. We also offer:
• Flexible work environment (3 days work from office)
• Group Term Life Insurance paid out at 3x Annual CTC (Arbor India)
• 32 days holiday (plus Arbor Holidays): 25 days annual leave plus 7 extra companywide days given over Easter, Summer & Christmas
• Work time: 9.30 am to 6 pm (8.5 hours only)
• Compensation: 100% fixed salary disbursement and no variable component

Arbor Education is an equal opportunities organisation. Our goal is for Arbor to be a workplace which represents, celebrates and supports people from all backgrounds, and which gives them the tools they need to thrive - whatever their ambitions may be - so we support and promote diversity and equality, and actively encourage applications from people of all backgrounds.

Refer a friend
Know someone else who would be good for this role? You can refer a friend, family member or colleague; if they are offered a role with Arbor, we will say thank you with a voucher valued up to £200! Simply email: careers@arbor-education.com

Posted 5 days ago


4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Java Developer - Software Engineer
Experience: 4-9 Years
Location: Chennai (Hybrid)
Interview: F2F
Mandatory: Java, Spring Boot, Microservices, React JS, AWS Cloud, DevOps, Node (added advantage)

Job Description:
• Overall 4+ years of experience in Java development projects
• 3+ years of development experience with React
• 2+ years of experience in AWS Cloud and DevOps
• Microservices development using Spring Boot
• Technical Stack: Core Java, Java, J2EE, Spring, MongoDB, GKE, Terraform, GitHub, GCP Developer, Kubernetes, Scala, Kafka
• Technical Tools: Confluence/Jira/Bitbucket or Git, CI/CD (Maven, Git, Jenkins), Eclipse or IntelliJ IDEA
• Experience in event-driven architectures (CQRS and SAGA patterns)
• Experience in design patterns
• Build tools (Gulp, Webpack), Jenkins, Docker, automation, Bash, Redis, Elasticsearch, Kibana
• Technical Stack (UI): JavaScript, React JS, CSS/SCSS, HTML5, Git+

Posted 5 days ago


0 years

0 Lacs

Pune, Maharashtra, India

Remote


At NICE, we don’t limit our challenges. We challenge our limits. Always. We’re ambitious. We’re game changers. And we play to win. We set the highest standards and execute beyond them. And if you’re like us, we can offer you the ultimate career opportunity that will light a fire within you.

So, what’s the role all about?
We are seeking a skilled and experienced Java developer to join our dynamic team. You will be responsible for developing and maintaining contact center applications, with a specific focus on routing functionality. Your role will involve designing and implementing robust and scalable routing solutions, ensuring efficient call flows and optimal customer experiences. You will collaborate closely with cross-functional teams, including software developers, system architects, and contact center managers, to deliver cutting-edge solutions that enhance our contact center operations.

How will you make an impact?
• Proven experience in Java programming, with a deep understanding of data structures, threading, object-oriented programming (OOP), design patterns, functional programming, and memory optimization.
• Strong expertise in developing web applications and web services using Java, Spring, and Spring Boot frameworks.
• Good hands-on experience with microservice architecture and RESTful API development.
• Experience with message brokers like Kafka and API gateway/reverse proxy systems (good to have).
• Proficient in working with relational and NoSQL databases such as Postgres, Redis, and Amazon Aurora.
• Good understanding of cloud infrastructure, particularly Amazon Web Services (AWS).
• Hands-on experience developing and maintaining infrastructure as code using Terraform and best practices.
• Experience working with Continuous Integration and Delivery (CI/CD) pipelines using tools like Jenkins, Docker, Kubernetes, Artifactory, and CloudFormation (Terraform experience is a plus).
• Comfortable working in an Agile environment, utilizing tools like JIRA for work item management.
• Proficiency in version control systems like Git and TFS.
• Strong analytical skills and a problem-solving mindset.
• Excellent communication and collaboration abilities; able to work effectively in a team setting.
• Familiarity with Microsoft .NET and C# (good to have).

Have you got what it takes?
• Design, implement, and optimize routing algorithms to ensure efficient and effective call flows.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Perform system analysis, troubleshooting, and debugging to identify and resolve routing-related issues.
• Conduct regular performance monitoring and optimization of routing strategies to ensure optimal customer experiences.
• Maintain documentation, including technical specifications, system designs, and user manuals.
• Stay up-to-date with industry trends and emerging technologies in contact center routing and Java development, and apply them to enhance our systems.
• Participate in code reviews and provide constructive feedback to ensure high-quality code standards.
• Deliver high-quality, sustainable, maintainable code.
• Participate in reviewing design and code (pull requests) for other team members - again with a secure-code focus.
• Work as a member of an agile team responsible for product development and delivery.
• Adhere to agile development principles while following and improving all aspects of the scrum process.
• Follow established department procedures, policies, and processes.
• Adhere to the company Code of Ethics and CXone policies and procedures.
• Excellent English and experience working in international teams are required.

What’s in it for you?
Join an ever-growing, market-disrupting, global company where the teams - comprised of the best of the best - work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX!
At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor

About NICE
NICE Ltd. (NASDAQ: NICE) software products are used by 25,000+ global businesses, including 85 of the Fortune 100 corporations, to deliver extraordinary customer experiences, fight financial crime and ensure public safety. Every day, NICE software manages more than 120 million customer interactions and monitors 3+ billion financial transactions. Known as an innovation powerhouse that excels in AI, cloud and digital, NICE is consistently recognized as the market leader in its domains, with over 8,500 employees across 30+ countries. NICE is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, age, sex, marital status, ancestry, neurotype, physical or mental disability, veteran status, gender identity, sexual orientation or any other category protected by law.

Posted 5 days ago


12.0 - 15.0 years

0 Lacs

Greater Kolkata Area

On-site


About NCR VOYIX NCR VOYIX Corporation (NYSE: VYX) is a leading global provider of digital commerce solutions for the retail, restaurant and banking industries. NCR VOYIX is headquartered in Atlanta, Georgia, with approximately 16,000 employees in 35 countries across the globe. For nearly 140 years, we have been the global leader in consumer transaction technologies, turning everyday consumer interactions into meaningful moments. Today, NCR VOYIX transforms the stores, restaurants and digital banking experiences with cloud-based, platform-led SaaS and services capabilities. Not only are we the leader in the market segments we serve and the technology we deliver, but we create exceptional consumer experiences in partnership with the world’s leading retailers, restaurants and financial institutions. We leverage our expertise, R&D capabilities and unique platform to help navigate, simplify and run our customers’ technology systems. Our customers are at the center of everything we do. Our mission is to enable stores, restaurants and financial institutions to exceed their goals – from customer satisfaction to revenue growth, to operational excellence, to reduced costs and profit growth. Our solutions empower our customers to succeed in today’s competitive landscape. Our unique perspective brings innovative, industry-leading tech to all the moving parts of business across industries. NCR VOYIX has earned the trust of businesses large and small — from the best-known brands around the world to your local favorite around the corner. Title :- Senior Software Engineering Manager – Data Engineering & Full stack Experience :- 12 Years – 15 Years Location :- Hyderabad/Gurgaon/Virtual YOU ARE… Passionate about technology and see the world a little differently than your peers. Everywhere you look, there’s possibility. Opportunity. Boundaries to push and challenges to solve. You believe software engineering changes how people live. At NCR Voyix, we believe that, too. 
We’re one of the world’s first tech companies, and still going strong. Like us, you know the online and mobile worlds better than any other—and see patterns that no one else sees. Our leadership team drives the delivery of products that provide optimal performance and stability with unsurpassed longevity with over 25 years in the Restaraunts, Retail, Payments & Services industry. We are looking for talented people to join our expanding our NCR Voyix Data and Analytics platform team. Our product as a cloud based SaaS solution is responsible for providing the foundation for NCR Voyix cloud-based Data and Analytics platform. Our primary customers are merchants you see and visit every day in the Retail, Grocery, and Hospitality industry. We experience the impact our work is having, and we take pride in providing services with great availability and ease of use. IN THIS ROLE, YOU CAN EXPECT TO… . The NCR Voyix Software Engineer will be responsible for front-end and back-end solution design, software development, code quality, data security, production readiness and performance tuning. The ideal candidate is an experienced software engineer who enjoys optimizing data systems and building them from the ground up. The Software Engineer will support database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives. The NCR Voyix Software Engineer contributes in the following: KEY AREAS OF RESPONSIBILITY: Lead team of talented developers and leads working on full stack frameworks and data engineering. 
Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions. Mine and analyze data from different NCR data sources to drive optimization of operations, and improve customer experience. Assess the effectiveness and accuracy of new data sources and data gathering techniques. Develop custom data models and algorithms to apply to data sets. Use predictive modeling to increase and optimize customer experiences, cost savings, actionable insights and other business outcomes. Develop company A/B testing framework and test model quality. Collaborate with different functional teams to implement models and monitor outcomes. Develop processes and tools to monitor and analyze model performance and data accuracy. Be part of an Agile team, participate in all Agile ceremonies & activities and be accountable for the sprint deliverable Create and maintain optimal data delivery architecture Assemble large, complex data sets that meet functional / non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure and GCP ‘big data’ technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data delivery needs. Keep our data separated and secure across national boundaries through multiple data centers and cloud regions. 
• Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
• Work with data and analytics experts to strive for greater functionality in our data systems.

YOU HAVE…
• 15+ years of experience in software testing or software engineering
• 10+ years in non-functional automation & performance testing
• 10+ years in Public Cloud based engineering
• React.js understanding: experience with React components, hooks, and state management
• JavaScript/TypeScript knowledge
• Node.js: expertise in server-side development using Node.js
• RESTful APIs & GraphQL: ability to design and consume APIs
• Agile methodologies: experience in Agile, Scrum, or Kanban environments
• UI/UX principles: basic understanding for effective collaboration with designers
• Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Strong analytic skills related to working with structured and unstructured datasets
• Experience building processes supporting data transformation, data structures, metadata, dependency and workload management
• A successful history of manipulating, processing and extracting value from large disconnected datasets
• Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
• Strong project management and organizational skills
• Experience supporting and working with cross-functional teams in a dynamic environment
• Experience with ETL and big data integration services: Confluent Kafka, BigQuery, Databricks, Data Factory, etc.
• Experience with relational SQL and NoSQL databases, including Databricks, BigQuery, Azure Data Warehouse, etc.
• Experience with stream-processing systems: kSQL, Flink SQL, dbt Labs, Databricks, Spark Streaming, etc.
• Experience with object-oriented, functional and scripting languages: Python, Java, C#, Scala, etc.
• Experience with DevOps and CI tools: GitHub, GitHub Actions, Jenkins, JIRA, Chef, Sonar
• Experience with unit testing, integration testing, performance testing and user acceptance testing

BASIC QUALIFICATIONS:
• Strong inferential skills with an ability to succinctly communicate complex topics to business stakeholders
• Experience with UI and full-stack frameworks like ReactJS, NodeJS, TypeScript, Material UI, SASS, etc.
• Experience using cloud platforms like Azure or GCP
• Experience working with complex on-premise and cloud data architectures

GENERAL KNOWLEDGE, SKILLS AND ABILITIES:
• Exhibits leadership skills
• Azure or GCP Public Cloud technologies
• In-depth knowledge of end-to-end systems development life cycles (including agile, iterative, and other modern approaches to software development)
• Outstanding verbal and written communication skills to technical and non-technical audiences of various levels in the organization (e.g., executive, management, individual contributors)
• Ability to estimate work effort for project sub-plans or small projects and ensure projects are successfully completed
• Quality assurance mindset
• Positive outlook, strong work ethic, and responsiveness to internal and external customers and contacts
• Willingly and successfully fulfils the role of teacher, mentor and coach
• Requires in-depth knowledge of networking, computing platforms, storage, databases, security, middleware, network and systems management, and related infrastructure technologies and practices

Offers of employment are conditional upon passage of screening criteria applicable to the job.

EEO Statement
Integrated into our shared values is NCR Voyix’s commitment to diversity and equal employment opportunity.
All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. NCR Voyix is committed to being a globally inclusive company where all people are treated fairly, recognized for their individuality, promoted based on performance and encouraged to strive to reach their full potential. We believe in understanding and respecting differences among all people. Every individual at NCR Voyix has an ongoing responsibility to respect and support a globally diverse environment.

Statement to Third Party Agencies
To ALL recruitment agencies: NCR Voyix only accepts resumes from agencies on the preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Voyix employees, or any NCR Voyix facility. NCR Voyix is not responsible for any fees or charges associated with unsolicited resumes. “When applying for a job, please make sure to only open emails that you will receive during your application process that come from a @ncrvoyix.com email domain.”
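The NCR Voyix posting above asks for working knowledge of stream processing (kSQL, Flink SQL, Spark Streaming). The core operation those systems share is windowed aggregation over an unbounded event stream. As a rough illustration only — not NCR's implementation, and with entirely hypothetical event names — a tumbling-window count can be sketched in plain Python:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences of each key per window -- the operation behind
    windowed aggregations in kSQL or Spark Structured Streaming."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Floor the timestamp to the start of its window
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical clickstream: (epoch_seconds, event_type)
events = [(100, "view"), (105, "click"), (110, "view"),
          (130, "view"), (185, "view")]
print(tumbling_window_counts(events, 60))
# {60: {'view': 2, 'click': 1}, 120: {'view': 1}, 180: {'view': 1}}
```

Real stream processors add watermarking, late-event handling and fault tolerance on top of this idea, but the windowing arithmetic is the same.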

Posted 5 days ago

Apply

3.0 - 13.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Hi Connections, we are hiring: Java Developer
Location: Anywhere in India
Experience: 3 to 13 years
Requirements:
• Java Full Stack Developer / Java Backend Developer
• Strong knowledge of Java, Spring Boot, Hibernate
• Hands-on experience with REST APIs and microservices
• Database: MongoDB
• Cloud: AWS / Azure / GCP
• Version control experience using Git
• CI/CD pipelines and containerization (Docker / Kubernetes)
• Problem-solving skills
Bonus points for: experience with Apache Kafka, Oracle, and front-end skills for full stack.
Apply now: if you are ready to take the next step in your Java career, send your resume to divya.rghav@nagarro.com. Let's build something great together.

Posted 5 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all. About The Team And The Role TCGplayer connects hobbyists and hobby businesses to communities. We relentlessly improve the exchange of things and thoughts that fuel passions, providing the most compelling destination and tools for collectible card game enthusiasts and professional sellers! TCGplayer, now a part of eBay, promotes and drives growth of our products and services by connecting a global community of millions of buyers with tens of thousands of retailers in a $25B global collectible hobby market. As a Senior Quality Engineer, you will be responsible for ensuring the quality and reliability of TCGplayer’s Point-of-Sale product through rigorous testing and quality assurance practices. You will be uplifting legacy applications into a microservice based architecture using the latest and greatest technologies. You will work closely with other development teams to bring a cohesive and intuitive selling experience to our customers across TCGplayer’s suite of tools. Due to the criticality of point-of-sale software to a seller’s business, your role on this team is especially impactful. If you’re an experienced and disciplined quality-focused engineer, with a desire to make a difference in the way we build new tools, this might be the perfect opportunity! 
What You Will Accomplish
• Build a best-in-class Point of Sale system for Local Game Stores, used by thousands of customers domestically and internationally, allowing them to profitably grow their business in an omni-channel environment.
• Write automated tests to ensure the functionality, performance, and security of the platform.
• Collaborate with developers, product managers, and other stakeholders to understand requirements and provide feedback on design and implementation.
• Identify, document, and track defects, working with the development team to ensure timely resolution.
• Mentor and guide junior quality engineers, promoting best practices in testing and quality assurance.
• Continuously improve testing processes and tools to enhance efficiency and effectiveness.
• Stay current with industry trends and emerging technologies to drive innovation in quality assurance practices.

What You Will Bring
• Relevant Bachelor's degree and 7 years of relevant industry experience, or relevant Master's degree plus 4 years of relevant industry experience, or 11 years of practical experience
• Hands-on development experience in one or more back-end object-oriented programming languages (preferably C# and TypeScript)
• Experience automating testing of REST API and Kafka contracts
• Experience with testing frameworks like Playwright
• Experience with contract testing (using Pact)
• Experience with event processing mechanisms such as Kafka
• Experience automating tests (load, API, integration, functional) within CI/CD pipelines
• Experience with load testing, preferably with JMeter and/or BlazeMeter
• Excellent verbal and written communication, leadership and collaboration skills

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the TCGplayer Careers website or apply for a job with TCGplayer. TCGplayer, a subsidiary company of eBay, is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you are unable to submit an application because of incompatible assistive technology or a disability, please contact us at careers@tcgplayer.com. We will make every effort to respond to your request for disability assistance as soon as possible. View our accessibility info to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.

Posted 5 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
Experian is looking for an experienced Senior Staff Engineer to develop and deliver innovative direct-to-customer products on a cloud-native platform built with Java and the Spring Framework. You will be involved in projects using cutting-edge technologies as part of a senior software engineering team. You will be a key player in designing and implementing product features. This is a highly technical role requiring excellent coding skills. You will be responsible for developing core functionality and processing for a new, powerful, enterprise-level data platform built with Java and leveraging leading mainstream open-source technologies. The role requires hands-on, active collaboration as a core member of a software engineering team focused on building event-driven services that deliver highly secure, efficient and robust solutions on schedule.
You will report to the Director.
• Deliver highly available and scalable data streaming application functionality on an AWS cloud-based platform
• Diligently observe and maintain standards for regulatory compliance and information security
• Deliver and maintain accurate, complete and current documentation
• Participate in full Agile cycle engagements, including meetings, iterative development, estimations, code reviews and design sessions
• Actively contribute to team architecture, engineering, and product discussions, ensuring the team delivers best-of-breed software
• Work closely with the service quality engineering team to ensure that only thoroughly tested code makes it to production
• Own deliverables from design through production operationalization

Qualifications
• 10+ years of software development experience building and testing applications following secure coding practices
• Currently collaborating as a hands-on team member developing and supporting a significant commercial software project in Java with the Spring Framework
• Proven proficiency in developing server-side Java applications using mainstream frameworks, libraries, and tools, including the Spring Framework and AWS SDK
• Experience developing web applications using Spring reactive libraries such as WebFlux and Project Reactor, as well as standard Spring Web
• Experience with event-driven architectures using pub/sub message brokers such as Kafka, Kinesis, and NATS.io
• Knowledgeable and experienced with software and system patterns and their application in prior work
• Current cloud technology experience with AWS (Fargate, EC2, S3, RDS PostgreSQL, Lambda, API Gateway, Airflow)
• Strong proven proficiency in SQL and NoSQL based data access and management on PostgreSQL and MongoDB or AWS DocumentDB
• Recent hands-on experience building and supporting commercial systems managing data and transactions, including server-side development of data flow processes
• Extensive experience gathering and assessing specifications and requirements
• Extensive experience building systems for financial services or tightly regulated businesses
• Security and privacy compliance (GDPR, CCPA, ISO 27001, PCI, HIPAA, etc.) experience
• Experience with Continuous Integration/Continuous Delivery (CI/CD) processes and practices (CodeCommit, CodeDeploy, CodePipeline/Harness/Jenkins/GitHub Actions, CLI, BitBucket/Git)
• Experience with monitoring technologies including Splunk, Datadog, and CloudWatch
• Familiarity creating and using Docker/Kubernetes applications

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is an important part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer the best family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian by clicking here.

Posted 5 days ago

Apply

15.0 years

0 Lacs

India

On-site


Flexera saves customers billions of dollars in wasted technology spend. A pioneer in Hybrid ITAM and FinOps, Flexera provides award-winning, data-oriented SaaS solutions for technology value optimization (TVO), enabling IT, finance, procurement and cloud teams to gain deep insights into cost optimization, compliance and risks for each business service. Flexera One solutions are built on a set of definitive customer, supplier and industry data, powered by our Technology Intelligence Platform, that enables organizations to visualize their Enterprise Technology Blueprint™ in hybrid environments—from on-premises to SaaS to containers to cloud. We’re transforming the software industry. We’re Flexera. With more than 50,000 customers across the world, we’re achieving that goal. But we know we can’t do any of that without our team. Ready to help us re-imagine the industry during a time of substantial growth and ambitious plans? Come and see why we’re consistently recognized by Gartner, Forrester and IDC as a category leader in the marketplace. Learn more at flexera.com. At Flexera, we're on a mission to empower global enterprises by transforming IT insights into decisive actions. We are looking for an architect/principal engineer with deep expertise in analytics, including data modeling and semantic modeling, to take our world-class reference data in Technopedia to the next level through deeper insights, richer features and more data sets. The ideal candidate will have a strong track record of providing technical leadership, deep technical expertise and delivering big data solutions through data lakehouse and large-scale processing technologies. As an architect/principal engineer, you will be a key player in architecting, designing, developing, and maintaining our common ontology and data models, transformations, and analytics capabilities for our industry-leading reference data.
You will collaborate closely with other architects and guide, influence and help engineering teams to ensure seamless integration with other components of our system.

What you'll do:
• Architect, design and develop our big data platform, processing pipelines and content reference data
• Drive innovation, enabling capabilities that accelerate innovation for our analytics, AI/ML, and product development teams
• Define data models and semantic models for our wide range of data sets to provide insights and optimized user experiences
• Provide guidance, influence and help engineering teams to deliver deep insights from our large and broad data sets
• Continuously enhance your technical skills and mentor other engineers
• Promote a high-standard engineering culture and operational excellence within the team
• Collaborate with product and design teams to ensure the platform meets user needs
• Define project requirements, technical artifacts, and designs, driving consensus across Product Management, architects, and engineering teams
• Balance priorities between new feature development, architectural enhancements, and technical debt reduction for a sustainable and scalable platform
• Champion continuous improvement in product quality, security, and performance standards within the development team

You'll be expected to have:
• Bachelor's or higher degree in Computer Science, Software Engineering, or a related field
• Minimum 15 years of relevant experience in software development, including 8+ years of expertise in data modeling, semantic modeling, and data visualization
• Strong expertise in other big data technologies, such as metadata catalogs, data lineage, and orchestration
• Strong technical expertise in distributed systems, big data, and cloud computing
• Strong problem-solving skills and ability to troubleshoot complex issues
• Good understanding of streaming technologies, including Kafka, with experience in defining message schemas and data models
• Outstanding communication skills and emotional intelligence to collaborate effectively with teams across the organization
• Excellent written and verbal communication skills
• Ability to work effectively both independently and in a collaborative team environment
• Prior experience mentoring and providing technical guidance to junior engineers

At Flexera, we foster a fun and engaged hybrid working environment where collaboration and innovation thrive. We value diversity and encourage applicants from underrepresented groups in technology to apply. Join our team to not only contribute to a world-class global product but also to grow in your career. At Flexera, we encourage continuous learning and provide opportunities for professional development. Flexera is proud to be an equal opportunity employer. Qualified applicants will be considered for open roles regardless of age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by local/national laws, policies and/or regulations. Flexera understands the value that results from employing a diverse, equitable, and inclusive workforce. We recognize that equity necessitates acknowledging past exclusion and that inclusion requires intentional effort. Our DEI (Diversity, Equity, and Inclusion) council is the driving force behind our commitment to championing policies and practices that foster a welcoming environment for all. We encourage candidates requiring accommodations to please let us know by emailing careers@flexera.com.

Posted 5 days ago

Apply

7.0 - 10.0 years

35 Lacs

Chennai, Bengaluru

Work from Office


Role & responsibilities

Experience
• Minimum 5 years of coding experience in NodeJS, JavaScript & TypeScript and NoSQL databases
• Minimum 5 years of coding experience in ReactJS (TypeScript), HTML, CSS pre-processors, or CSS-in-JS, creating high-performance enterprise applications for responsive web applications
• Developing and implementing highly responsive user interface components using React concepts (self-contained, reusable, and testable modules and components)
• Architecting and automating the build process for production, using task runners or scripts
• Knowledge of data structures for TypeScript
• Monitoring and improving front-end performance
• Hands-on experience in performance tuning, debugging, and monitoring

Technical Skills
• Excellent knowledge of developing scalable and highly available RESTful APIs using NodeJS technologies
• Well versed in CI/CD principles, and actively involved in solving and troubleshooting issues in a distributed services ecosystem
• Understanding of containerization; experienced in Docker and Kubernetes
• Exposure to API gateway integrations like 3Scale
• Understanding of single sign-on and token-based authentication (REST, JWT, OAuth)
• Expert knowledge of task/message queues, including but not limited to AWS and Microsoft Azure
• Writing tested, idiomatic, and documented JavaScript, HTML and CSS
• Experience in developing responsive web-based UIs
• Experience with Styled Components, Tailwind CSS, Material UI and other CSS-in-JS techniques
• Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, and other web services used in the system
• Writing non-blocking code, and resorting to advanced techniques such as multi-threading when needed
• Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model

Posted 5 days ago

Apply

5.0 - 10.0 years

27 Lacs

Chennai, Bengaluru

Work from Office


Role & responsibilities

Experience
• Minimum 5 years of coding experience in NodeJS, JavaScript & TypeScript and NoSQL databases
• Minimum 5 years of coding experience in ReactJS (TypeScript), HTML, CSS pre-processors, or CSS-in-JS, creating high-performance enterprise applications for responsive web applications
• Developing and implementing highly responsive user interface components using React concepts (self-contained, reusable, and testable modules and components)
• Architecting and automating the build process for production, using task runners or scripts
• Knowledge of data structures for TypeScript
• Monitoring and improving front-end performance
• Hands-on experience in performance tuning, debugging, and monitoring

Technical Skills
• Excellent knowledge of developing scalable and highly available RESTful APIs using NodeJS technologies
• Well versed in CI/CD principles, and actively involved in solving and troubleshooting issues in a distributed services ecosystem
• Understanding of containerization; experienced in Docker and Kubernetes
• Exposure to API gateway integrations like 3Scale
• Understanding of single sign-on and token-based authentication (REST, JWT, OAuth)
• Expert knowledge of task/message queues, including but not limited to AWS and Microsoft Azure
• Writing tested, idiomatic, and documented JavaScript, HTML and CSS
• Experience in developing responsive web-based UIs
• Experience with Styled Components, Tailwind CSS, Material UI and other CSS-in-JS techniques
• Thorough understanding of the responsibilities of the platform, database, API, caching layer, proxies, and other web services used in the system
• Writing non-blocking code, and resorting to advanced techniques such as multi-threading when needed
• Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model

Posted 5 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

Remote


About the Team
Come help us build the world's most reliable on-demand logistics engine for delivery! We're bringing on experienced engineers to help us further our 24x7 global infrastructure system that powers DoorDash's three-sided marketplace of consumers, merchants, and dashers.

About the Role
The Data Tools mission is to build robust data platforms and establish policies that guarantee the analytics data is of high quality, easily accessible/cataloged, and compliant with financial and privacy regulations, fostering trust and confidence in our data-driven decision-making process. We are building the Data Tools team in India, and you will have an opportunity to be part of a founding team with a greater opportunity for impact, where you can help grow the team and shape the roadmap for the data platform at DoorDash. You will report directly to the Data Tools Engineering Manager.

You're excited about this opportunity because you will…
• Work on building a data discovery platform, privacy frameworks, unified access control frameworks, and a data quality platform to enable data builders at DoorDash to deliver high-quality and trustable data sets and metrics
• Help accelerate the adoption of the data discovery platform by building integrations across online and analytics platforms and promoting self-serve
• Come up with solutions for scaling data systems for various business needs
• Collaborate in a dynamic startup environment

We're excited about you because…
• B.E./B.Tech., M.E./M.Tech, or Ph.D. in Computer Science or equivalent
• 6+ years of experience with CS fundamental concepts and experience with at least one of the programming languages Scala, Java, and Python
• Prior technical experience in Big Data infrastructure & governance - you've built meaningful pieces of data infrastructure.
Bonus if those were open-source technologies like DataHub, Spark, Airflow, Kafka or Flink. Experience improving efficiency, scalability, and stability of data platforms.

Notice to Applicants for Jobs Located in NYC or Remote Jobs Associated With Office in NYC Only
We use Covey as part of our hiring and/or promotional process for jobs in NYC and certain features may qualify it as an AEDT in NYC. As part of the hiring and/or promotion process, we provide Covey with job requirements and candidate submitted applications. We began using Covey Scout for Inbound from August 21, 2023, through December 21, 2023, and resumed using Covey Scout for Inbound again on June 29, 2024. The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: Covey

About DoorDash
At DoorDash, our mission to empower local economies shapes how our team members move quickly, learn, and iterate in order to make impactful decisions that display empathy for our range of users—from Dashers to merchant partners to consumers. We are a technology and logistics company that started with door-to-door delivery, and we are looking for team members who can help us go from a company that is known for delivering food to a company that people turn to for any and all goods. DoorDash is growing rapidly and changing constantly, which gives our team members the opportunity to share their unique perspectives, solve new challenges, and own their careers. We're committed to supporting employees' happiness, healthiness, and overall well-being by providing comprehensive benefits and perks.

Our Commitment to Diversity and Inclusion
We're committed to growing and empowering a more inclusive community within our company, industry, and cities. That's why we hire and cultivate diverse teams of people from all backgrounds, experiences, and perspectives. We believe that true innovation happens when everyone has room at the table and the tools, resources, and opportunity to excel.
If you need any accommodations, please inform your recruiting contact upon initial connection. We use Covey as part of our hiring and/or promotional process for jobs in certain locations. The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: https://getcovey.com/nyc-local-law-144 To request a reasonable accommodation under applicable law or alternate selection process, please inform your recruiting contact upon initial connection.
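The Data Tools posting above centers on a data quality platform. In practice such platforms largely reduce to named, rule-based checks over records (nulls, ranges, uniqueness). A library-free Python sketch of that pattern — the rule and field names are invented for illustration, not DoorDash's actual schema:

```python
def run_quality_checks(rows, checks):
    """Apply named predicate checks to every row and collect failures
    as (row_index, rule_name) pairs -- a toy stand-in for the rule
    engines behind data-quality platforms."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in checks.items():
            if not predicate(row):
                failures.append((i, name))
    return failures

# Hypothetical order records and rules (names are illustrative only)
rows = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": None, "amount": 10.0},
    {"order_id": 3, "amount": -5.0},
]
checks = {
    "order_id_not_null": lambda r: r["order_id"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}
print(run_quality_checks(rows, checks))
# [(1, 'order_id_not_null'), (2, 'amount_non_negative')]
```

Production systems express the same idea declaratively and at scale, alerting on failure rates rather than individual rows, but the check-as-named-predicate structure carries over.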

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote


About the Team
Data is at the foundation of DoorDash's success. The Data Engineering team builds database solutions for various use cases including reporting, product analytics, marketing optimization and financial reporting. By implementing pipelines, data structures, and data warehouse architectures, this team serves as the foundation for decision-making at DoorDash.

About the Role
DoorDash is looking for a Senior Data Engineer to be a technical powerhouse to help us scale our data infrastructure, automation and tools to meet growing business needs.

You're excited about this opportunity because you will…
• Work with business partners and stakeholders to understand data requirements
• Work with engineering, product teams and 3rd parties to collect required data
• Design, develop and implement large-scale, high-volume, high-performance data models and pipelines for Data Lake and Data Warehouse
• Develop and implement data quality checks, conduct QA and implement monitoring routines
• Improve the reliability and scalability of our ETL processes
• Manage a portfolio of data products that deliver high-quality, trustworthy data
• Help onboard and support other engineers as they join the team

We're excited about you because…
• 5+ years of professional experience
• 3+ years of experience working in data engineering, business intelligence, or a similar role
• Proficiency in programming languages such as Python/Java
• 3+ years of experience in ETL orchestration and workflow management tools like Airflow, Flink, Oozie and Azkaban using AWS/GCP
• Expertise in database fundamentals, SQL and distributed computing
• 3+ years of experience with the distributed data ecosystem (Spark, Hive, Druid, Presto) and streaming technologies such as Kafka/Flink
Experience working with Snowflake, Redshift, PostgreSQL and/or other DBMS platforms Excellent communication skills and experience working with technical and non-technical teams Knowledge of reporting tools such as Tableau, Superset and Looker Comfortable working in a fast-paced environment; a self-starter and self-organizing Ability to think strategically, analyze and interpret market and consumer information You must be located near one of our engineering hubs indicated above Notice to Applicants for Jobs Located in NYC or Remote Jobs Associated With Office in NYC Only We use Covey as part of our hiring and/or promotional process for jobs in NYC and certain features may qualify it as an AEDT in NYC. As part of the hiring and/or promotion process, we provide Covey with job requirements and candidate-submitted applications. We began using Covey Scout for Inbound from August 21, 2023, through December 21, 2023, and resumed using Covey Scout for Inbound again on June 29, 2024. The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: Covey

Posted 5 days ago

Apply

7.0 - 10.0 years

35 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Job Summary: We are seeking a skilled Senior NodeJS Developer with experience in software development to join our team in Bangalore. The ideal candidate will have a strong background in developing and maintaining scalable and secure microservices using TypeScript (Node.js) and supporting cloud-native services on AWS. This role is vital in delivering high-quality solutions that meet compliance, performance, and security requirements in the financial services industry. Software Requirements: Required Proficiency: TypeScript (Node.js) for developing RESTful APIs and microservices. AWS services including Lambda, Aurora PostgreSQL, and Serverless Framework. CI/CD processes using GitHub Actions. Docker for containerization. Preferred Proficiency: Experience with Kubernetes for container orchestration. Familiarity with Kafka (MSK) for event-driven architectures. Overall Responsibilities: Develop and maintain scalable and secure microservices using TypeScript (Node.js). Support the implementation of cloud-native services on AWS. Translate technical and business requirements into well-structured, maintainable code adhering to best practices. Contribute to CI/CD workflows, ensuring clean code and comprehensive testing. Collaborate with cross-functional teams to deliver high-quality software solutions. Ensure that all work aligns with compliance, performance, and security requirements specific to the financial services sector.

Posted 5 days ago

Apply

5.0 - 7.0 years

30 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Job Summary: We are seeking a skilled Senior NodeJS Developer with experience in software development to join our team in Bangalore. The ideal candidate will have a strong background in developing and maintaining scalable and secure microservices using TypeScript (Node.js) and supporting cloud-native services on AWS. This role is vital in delivering high-quality solutions that meet compliance, performance, and security requirements in the financial services industry. Software Requirements: Required Proficiency: TypeScript (Node.js) for developing RESTful APIs and microservices. AWS services including Lambda, Aurora PostgreSQL, and Serverless Framework. CI/CD processes using GitHub Actions. Docker for containerization. Preferred Proficiency: Experience with Kubernetes for container orchestration. Familiarity with Kafka (MSK) for event-driven architectures. Overall Responsibilities: Develop and maintain scalable and secure microservices using TypeScript (Node.js). Support the implementation of cloud-native services on AWS. Translate technical and business requirements into well-structured, maintainable code adhering to best practices. Contribute to CI/CD workflows, ensuring clean code and comprehensive testing. Collaborate with cross-functional teams to deliver high-quality software solutions. Ensure that all work aligns with compliance, performance, and security requirements specific to the financial services sector.

Posted 5 days ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

Roles and Responsibilities: Data Pipeline Development: Design, develop, and maintain scalable data pipelines to support ETL (Extract, Transform, Load) processes using tools like Apache Airflow, AWS Glue, or similar. Database Management: Design, optimize, and manage relational and NoSQL databases (such as MySQL, PostgreSQL, MongoDB, or Cassandra) to ensure high performance and scalability. SQL Development: Write advanced SQL queries, stored procedures, and functions to extract, transform, and analyze large datasets efficiently. Cloud Integration: Implement and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud, utilizing services like Redshift, BigQuery, or Snowflake. Data Warehousing: Contribute to the design and maintenance of data warehouses and data lakes to support analytics and BI requirements. Programming and Automation: Develop scripts and applications in Python or other programming languages to automate data processing tasks. Data Governance: Implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security. Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions. Performance Optimization: Identify and resolve performance bottlenecks in data systems and optimize data storage and retrieval. Documentation: Maintain comprehensive documentation for data processes, pipelines, and infrastructure. Stay Current: Keep up-to-date with the latest trends and advancements in data engineering, big data technologies, and cloud services. Required Skills and Qualifications: Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field. Technical Skills: Proficiency in SQL and relational databases (PostgreSQL, MySQL, etc.). Experience with NoSQL databases (MongoDB, Cassandra, etc.). 
Strong programming skills in Python; familiarity with Java or Scala is a plus. Experience with data pipeline tools (Apache Airflow, Luigi, or similar). Expertise in cloud platforms (AWS, Azure, or Google Cloud) and data services (Redshift, BigQuery, Snowflake). Knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus. Data Modeling: Experience in designing and maintaining data models for relational and non-relational databases. Analytical Skills: Strong analytical and problem-solving abilities with a focus on performance optimization and scalability. Soft Skills: Excellent verbal and written communication skills to convey technical concepts to non-technical stakeholders. Ability to work collaboratively in cross-functional teams. Certifications (Preferred): AWS Certified Data Analytics, Google Professional Data Engineer, or similar. Mindset: Eagerness to learn new technologies and adapt quickly in a fast-paced environment.

Posted 5 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description Experience 6+ Years Immediate/Serving-Notice profiles preferred Face-to-Face Interview Process Responsibilities Leads multidimensional projects that involve multiple teams. Leads and works with other software engineers on design best practices and conducts code reviews. Resolves complex engineering problems, collaborating with others. Facilitates cross-functional troubleshooting, root cause analysis, and engages others when needed. Responsible for creating, evaluating, and contributing to feature detailed designs. Design, develop, and implement software utilizing an agile project cycle. Mentor team members and raise the bar for technical knowledge across a wide spectrum. Demonstrates thorough knowledge of information technology concepts, issues, trends, and best practices related to Cloud technologies and system integrations. Apply and share knowledge of security coding practices and secure system fundamentals. Skills Strong proficiency in Java (v11+) with deep expertise in object-oriented programming, concurrency, and performance optimization. Hands-on experience with Spring Boot and the Spring ecosystem, including Spring MVC, Spring Data, and Spring Security. Proficiency in containerization technologies such as Docker and orchestration tools like Kubernetes. Experience with RESTful architecture and microservices development, including API design, security, and scalability-related best practices. Strong knowledge of relational databases (PostgreSQL, MySQL, Oracle, etc.) and proficiency in writing efficient SQL queries and stored procedures. Experience working with cloud-based services such as AWS, GCP, or Azure, including serverless computing and managed database services. Familiarity with CI/CD methodologies and tools such as Jenkins, GitHub Actions, or GitLab CI/CD to automate build, test, and deployment pipelines. Experience with messaging and event-driven architectures; knowledge of Kafka or RabbitMQ is a plus.
Experience integrating with financial systems (e.g., Anaplan, Oracle Financials) is a plus. Strong problem-solving skills with a focus on writing clean, maintainable, and well-tested code. Excellent communication skills (verbal and written) and the ability to collaborate effectively with cross-functional teams. 5+ years of professional experience in backend development, preferably in enterprise or high-scale environments. Bachelor’s or Master’s in Computer Science, Engineering, or equivalent practical experience.

Posted 5 days ago

Apply

1.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Data Engineer – AWS Full Stack Location: India (Remote or Hybrid) Contract Type: Full-time, 1-Year Contract Experience Required: Minimum 5 years Start Date: Immediate Compensation: Competitive (Based on experience) About the Role We are seeking a highly skilled Data Engineer with deep expertise in the AWS ecosystem and full-stack data engineering. The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines and analytics platforms that support critical business insights and decision-making. This is a 1-year contract role ideal for professionals who have experience across data ingestion, transformation, cloud infrastructure, and data operations. Key Responsibilities Design and build end-to-end data pipelines using AWS services (Glue, Lambda, S3, Athena, Redshift, EMR, etc.). Develop and manage ETL/ELT processes, ensuring data quality, scalability, and maintainability. Collaborate with product, analytics, and engineering teams to deliver data models, APIs, and real-time data solutions. Implement best practices for data governance, lineage, monitoring, and access control. Automate data workflows using tools like Airflow, Step Functions, or custom scripts. Create and maintain infrastructure as code (IaC) using CloudFormation or Terraform for AWS data components. Optimize data warehouse and lakehouse architectures for performance and cost. Required Skills & Qualifications 5+ years of experience in data engineering, including cloud-native data development. Strong expertise in AWS data services: Glue, S3, Lambda, Redshift, Athena, Kinesis, EMR, etc. Proficiency in SQL, Python, and Spark for data manipulation and transformation. Experience with DevOps tools (CI/CD, Git, Docker) and infrastructure automation. Knowledge of data modeling, schema design, and performance tuning for large-scale datasets. Ability to work independently in a contract environment, managing priorities and deadlines.
Preferred Qualifications Familiarity with streaming data architectures using Kafka/Kinesis. Experience working in regulated or large-scale enterprise environments. Exposure to BI tools (e.g., QuickSight, Tableau, Power BI) and API integration for downstream consumption.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Role: Java + NodeJS Developer Location: PAN India Experience Preferred: 4+ Years Job Description: Design, develop, and maintain scalable backend services and RESTful APIs using Java and Node.js. Write clean, modular, and reusable code following best practices. Work with microservices architecture and cloud platforms (AWS/Azure/GCP). Integrate with frontend frameworks and third-party APIs/services. Optimize applications for speed, scalability, and reliability. Collaborate with DevOps for CI/CD pipeline, deployment, and monitoring. Participate in code reviews and technical discussions. Troubleshoot, debug, and resolve production issues. Ensure security and data protection in application design. Document technical specifications and processes. Required Skills and Qualifications: Strong programming skills in Java (Spring Boot) and Node.js (Express/NestJS). Proficiency in relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB). Experience with RESTful APIs and API documentation tools (Swagger/OpenAPI). Familiarity with version control systems (Git). Understanding of asynchronous programming and event-driven architecture. Good knowledge of unit testing and integration testing (e.g., JUnit, Mocha, Jest). Experience with message brokers (e.g., Kafka, RabbitMQ) is a plus. Exposure to containerization (Docker, Kubernetes) is an advantage. Strong analytical, problem-solving, and communication skills.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Role Summary Pfizer’s purpose is to deliver breakthroughs that change patients’ lives. Research and Development is at the heart of fulfilling Pfizer’s purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy or supporting clinical trials, you will apply cutting-edge design and process development capabilities to accelerate and bring best-in-class medicines to patients around the world. Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes. Role Responsibilities Lead data modeling and engineering efforts within advanced data platforms teams to achieve digital outcomes. Provides guidance and may lead/co-lead moderately complex projects. Oversee the development and execution of test plans, creation of test scripts, and thorough data validation processes. Lead the architecture, design, and implementation of Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs. Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise. Collaborate effectively with contractors to deliver technical enhancements. Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment. Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency. Conduct root cause analysis and address production data issues.
Lead the design, development, and implementation of AI models and algorithms that support sophisticated data analytics and supply chain initiatives. Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects. Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives. Document and present findings, methodologies, and project outcomes to various stakeholders. Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery. Ability to work with large and complex datasets, including data cleaning, preprocessing, and feature selection. Basic Qualifications A bachelor’s or master’s degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline. Over 4 years of experience as a Data Engineer, Data Architect, or in Data Warehousing, Data Modeling, and Data Transformations. Over 2 years of experience in AI, machine learning, and large language models (LLMs) development and deployment. Proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred. Strong understanding of data structures, algorithms, and software design principles. Programming Languages: Proficiency in Python, SQL, and familiarity with Java or Scala. AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect. Ability to use GenAI or Agents to augment data engineering practices. Preferred Qualifications Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake. ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica. Big Data Technologies: Familiarity with Hadoop, Spark, and Kafka for big data processing. Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Containerization: Understanding of Docker and Kubernetes for containerization and orchestration. Data Integration: Skills in integrating data from various sources, including APIs, databases, and external files. Data Modeling: Understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune. Structured Data: Proficiency in handling structured data from relational databases, data warehouses, and spreadsheets. Unstructured Data: Experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch. Data Excellence: Familiarity with data excellence concepts, including data governance, data quality management, and data stewardship. Non-standard Work Schedule, Travel Or Environment Requirements Occasionally travel required Work Location Assignment: Hybrid The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer’s Global Performance Plan with a bonus target of 12.5% of the base salary and eligibility to participate in our share based long term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life’s moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution, paid vacation, holiday and personal days, paid caregiver/parental and medical leave, and health benefits to include medical, prescription drug, dental and vision coverage. Learn more at Pfizer Candidate Site – U.S. Benefits | (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility. 
Sunshine Act Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider’s name, address and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address and the amount of payments made currently will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative. EEO & Employment Eligibility Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status. Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States. Information & Business Tech

Posted 5 days ago

Apply

Exploring Kafka Jobs in India

Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Gurgaon

These cities are known for their thriving tech industries and have a high demand for Kafka professionals.

Average Salary Range

The average salary range for Kafka professionals in India varies based on experience level. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn INR 12-20 lakhs per annum.

Career Path

Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.

Related Skills

In addition to Kafka expertise, employers often look for professionals with skills in:

  • Apache Spark
  • Apache Flink
  • Hadoop
  • Java/Scala programming
  • Data engineering and data architecture

Interview Questions

  • What is Apache Kafka and how does it differ from other messaging systems? (basic)
  • Explain the role of Zookeeper in Apache Kafka. (medium)
  • How does Kafka guarantee fault tolerance? (medium)
  • What are the key components of a Kafka cluster? (basic)
  • Describe the process of message publishing and consuming in Kafka. (medium)
  • How can you achieve exactly-once message processing in Kafka? (advanced)
  • What is the role of Kafka Connect in the Kafka ecosystem? (medium)
  • Explain the concept of partitions in Kafka. (basic)
  • How does Kafka handle consumer offsets? (medium)
  • What is the role of a Kafka Producer API? (basic)
  • How does Kafka ensure high availability and durability of data? (medium)
  • Explain the concept of consumer groups in Kafka. (basic)
  • How can you monitor Kafka performance and throughput? (medium)
  • What is the purpose of Kafka Streams API? (medium)
  • Describe the use cases where Kafka is not a suitable solution. (advanced)
  • How does Kafka handle data retention and cleanup policies? (medium)
  • Explain the Kafka message delivery semantics. (medium)
  • What are the different security features available in Kafka? (medium)
  • How can you optimize Kafka for high throughput and low latency? (advanced)
  • Describe the role of a Kafka Broker in a Kafka cluster. (basic)
  • How does Kafka handle data replication across brokers? (medium)
  • Explain the significance of serialization and deserialization in Kafka. (basic)
  • What are the common challenges faced while working with Kafka? (medium)
  • How can you scale Kafka to handle increased data loads? (advanced)
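Several of the questions above (partitions, consumer groups, per-key ordering) come down to two small mechanics: a keyed record is hashed to pick its partition, and the partitions of a topic are divided among the consumers of one group. The sketch below illustrates both in plain Python. Note the assumptions: real Kafka's default partitioner uses a murmur2 hash, for which `zlib.crc32` is only a dependency-free stand-in, and real group assignment is done by a pluggable assignor, which the simple round-robin split here only approximates.

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    # Kafka's default partitioner hashes the record key (murmur2 in the
    # real client; crc32 here as a stand-in). The property illustrated is
    # the same: equal keys always land in the same partition, which is
    # what preserves per-key ordering.
    return zlib.crc32(key) % num_partitions

def assign_consumers(partitions, consumers):
    # Round-robin split of a topic's partitions across one consumer group:
    # every partition gets exactly one owner, so the group reads the whole
    # topic while sharing the load. (Real Kafka uses a configurable
    # assignor, e.g. range or cooperative-sticky.)
    assignment = {c: [] for c in consumers}
    for i, partition in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(partition)
    return assignment

if __name__ == "__main__":
    # Two events for the same hypothetical key land in the same partition.
    spots = [assign_partition(k, 6) for k in (b"user-42", b"user-7", b"user-42")]
    print(spots[0] == spots[2])  # True
    # Six partitions shared by a two-consumer group.
    print(assign_consumers(list(range(6)), ["c1", "c2"]))
```

This also explains why adding partitions to an existing topic can break ordering guarantees: `hash(key) % num_partitions` changes for existing keys, so new events for a key may go to a different partition than its old ones.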

Closing Remark

As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies