Jobs
Interviews

47 Confluent Kafka Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a member of the deals team at PwC, you will provide strategic advice and support to clients in areas such as mergers, acquisitions, divestitures, and restructuring. Your role will involve helping clients navigate complex transactions and maximize the value of their business deals. In deal integration and valuation realization, your focus will be on assisting clients in successfully integrating acquisitions and maximizing the value of their investments. You will be responsible for conducting valuations, financial analysis, and developing strategies for post-merger integration.

Key Responsibilities:
- Conduct valuations and financial analysis, and develop strategies for post-merger integration.
- Assist clients in successfully integrating acquisitions and maximizing the value of their investments.
- Provide strategic advice and support to clients in areas such as mergers, acquisitions, divestitures, and restructuring.

Qualifications Required:
- 3+ years of relevant experience.
- Must-have skills: Java, Spring/Spring Boot, REST API, Microservices (Java or Node.js), JBoss, SQL, MS Azure (Azure Event Hub, Confluent Kafka, ASP) or AWS equivalent.
- Working knowledge: Bitbucket, Git, Confluence, Jira; strong experience with DevOps pipelines, CI/CD, and related tools.
- Nice to have: OAuth and event-driven messaging, Postman, operating systems (Windows, Linux), JBoss scripting/CLI, prior FI experience.
- Expert knowledge of the business, the broader organization, the technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.
- Responsibility for overall development activities and progress, in alignment with the development standards and guidelines set by the practice.

At PwC, you will be expected to develop a deeper understanding of the business context and how it is changing. Additionally, you will interpret data to inform insights and recommendations, uphold professional and technical standards, and ensure adherence to the Firm's code of conduct and independence requirements. Your role will involve providing technical guidelines and support, and aligning the development practice with the bank's strategic vision and objectives. You will also be responsible for coaching, educating, and monitoring the work of others, and serving as a point of escalation for the development team.
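Roles like this one pair REST microservices with event streams (Azure Event Hub, Confluent Kafka). A recurring concern in such event-driven systems is at-least-once delivery: the broker may redeliver a message, so handlers are made idempotent. The sketch below is a minimal, hedged illustration using only the standard library; the class and event names are hypothetical, not from any specific framework, and a production version would track processed IDs in a durable store rather than an in-memory set.

```python
# Minimal sketch of an idempotent event handler for at-least-once delivery.
# Names are illustrative; a real system persists the seen-ID set durably.

class IdempotentHandler:
    def __init__(self, process):
        self._process = process   # business-logic callback
        self._seen = set()        # IDs of events already processed

    def handle(self, event_id, payload):
        """Process each event exactly once, even if the broker redelivers it."""
        if event_id in self._seen:
            return False          # duplicate delivery: skip silently
        self._process(payload)
        self._seen.add(event_id)
        return True

results = []
handler = IdempotentHandler(results.append)
handler.handle("evt-1", {"amount": 10})
handler.handle("evt-1", {"amount": 10})   # redelivered duplicate, ignored
handler.handle("evt-2", {"amount": 25})
```

The same dedup-before-process shape applies whether the transport is Kafka, Event Hub, or MQ; only the durable store for seen IDs changes.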

Posted 5 days ago

Apply

6.0 - 9.0 years

13 - 18 Lacs

Hyderabad

Hybrid

Preferred candidate profile: Java full-stack developer (FSD) with Angular, Level 3 (Senior Software Engineer). Primary key skills: Advanced Java, Spring Boot, Confluent Kafka, MongoDB, PostgreSQL, OpenShift development.

Posted 1 week ago

Apply

15.0 - 17.0 years

0 Lacs

India

Remote

About The Company
BP Energy is a global leader in the energy sector, committed to delivering reliable and sustainable energy solutions to meet the world's evolving needs. With over a century of experience, BP focuses on discovering, developing, and producing oil and gas in various regions worldwide. The company is dedicated to integrating innovative technologies and sustainable practices to reduce its carbon footprint and achieve its goal of becoming a net-zero company by 2050 or sooner. BP Energy values diversity, inclusion, and equal opportunity, fostering a workplace where all employees can thrive and contribute to the company's mission of delivering energy to the world, today and tomorrow.

About The Role
As a Staff Software Engineer at BP Energy, you will serve as a technical leader within the Enterprise Integration solutions team. Your primary responsibility will be to design, develop, and oversee scalable, reliable integration solutions that support BP's digital transformation initiatives. Operating in a dynamic environment, you will collaborate with cross-functional teams, including business stakeholders, project managers, and fellow engineers, to deliver innovative solutions aligned with BP's strategic objectives. Your expertise will help ensure the seamless integration of diverse systems, enabling real-time data exchange, automation, and enhanced operational efficiency. This role offers an opportunity to lead multiple engineering squads, promote best practices in DevOps, and contribute to the continuous improvement of BP's digital infrastructure.

Qualifications
The ideal candidate will possess over 15 years of professional experience, with a minimum of 10 years in enterprise integration and software engineering. You should have advanced expertise in Java, integration frameworks, and designing highly scalable integrations involving APIs, messaging systems, files, databases, and cloud services. Experience with integration tools such as TIBCO, MuleSoft, Apache Camel, Spring Integration, and Confluent Kafka is essential. A deep understanding of Enterprise Integration Patterns (EIPs) and iBlocks for secure and reliable integrations is required. Candidates should demonstrate a willingness to learn and adapt to cloud-native integration solutions on AWS and Azure. Strong knowledge of the entire interface development lifecycle, including design, security, testing, CI/CD practices, and telemetry, is crucial. Experience with open-source technologies, AI-assisted development, and enterprise architectures such as EDA and microservices, along with stakeholder management skills, is highly valued. Leadership qualities, including inclusive management and fostering a culture of continuous improvement, are essential for success in this role.

Responsibilities
- Design, develop, and implement stable, efficient, and scalable enterprise integration solutions, including managing technical debt and remediating existing platforms.
- Ensure integration services evolve in response to changing business needs, technological advancements, BP's standards, and emerging trends.
- Collaborate with functional stakeholders, project managers, and business analysts to gather and understand requirements, translating them into effective technical solutions.
- Lead and mentor a team of integration engineers, fostering a culture of agility, innovation, and automation.
- Maximize value from current applications and emerging technologies by demonstrating technical thought leadership across various platforms.
- Work closely with users and business analysts to define and refine integration requirements, ensuring alignment with business objectives.
- Coordinate with peers across IT&S teams to promote best practices, share knowledge, and drive continuous improvement initiatives.
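One of the Enterprise Integration Patterns (EIPs) the qualifications above reference is the content-based router: inspect a message and choose an output channel from its content. Tools like Apache Camel and MuleSoft ship this as a built-in construct; the sketch below only illustrates the idea in plain Python, with entirely hypothetical message fields and channel names.

```python
# Content-based router sketch (an Enterprise Integration Pattern).
# Message fields and channel names are made up for illustration.

def route(message: dict) -> str:
    """Pick an output channel based on the message content."""
    if message.get("type") == "invoice":
        return "finance-queue"        # all invoices go to finance
    if message.get("amount", 0) > 10_000:
        return "review-queue"         # large non-invoice amounts need review
    return "default-queue"            # everything else

channel = route({"type": "order", "amount": 25_000})
```

In a real integration platform the routing rules would live in configuration (a Camel route, a Mule flow) rather than code, but the decision logic is the same.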
Benefits
BP Energy offers a comprehensive benefits package designed to support employees' well-being and professional growth, including competitive salary packages, health insurance, retirement plans, and opportunities for career advancement. The company promotes work-life balance through flexible working arrangements, including hybrid models combining office and remote work. BP also invests in employee development through training programs, mentorship, and access to cutting-edge technologies. Additionally, employees benefit from a collaborative and inclusive work environment that values diversity and encourages innovation.

Equal Opportunity
BP Energy is an equal opportunity employer committed to creating an inclusive workplace. We value diversity and do not discriminate based on race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability. We ensure that individuals with disabilities receive reasonable accommodations throughout the application and employment process. Our commitment is to foster an environment where all employees can contribute their best and thrive professionally.

Posted 1 week ago

Apply

10.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Lead Infrastructure Engineer - Integration
Function/Group: Digital and Technology
Location: Mumbai
Shift Timing: Regular
Role Reports to: D&T Manager - Integration
Remote/Hybrid/In-Office: Hybrid

ABOUT GENERAL MILLS
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team, who uphold a vision of relentless innovation while being a force for good.

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence, and growth while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS), and Human Resources Shared Services (HRSS). We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

JOB OVERVIEW
Function Overview: The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S/4HANA, and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement, with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the Work with Heart philosophy, which emphasizes results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore the function in more detail.

Purpose of the role: We have an exciting opportunity for Lead Infrastructure Engineers to work with General Mills' various Advanced Digital Transformation teams and partner to achieve required business outcomes by transforming, simplifying, integrating, and managing our services in the cloud. We are seeking a highly skilled MuleSoft Platform & Automation Engineer with expertise in managing and automating the MuleSoft ecosystem. The ideal candidate should have strong experience in MuleSoft platform administration, automation, and CI/CD, along with exposure to Mule development. Additional knowledge of Cloud Composer (Airflow), GCP, Terraform, Confluent Kafka, and scripting is a plus. Primary responsibilities include administration and automation excellence, consultation, optimization, and implementation.

KEY ACCOUNTABILITIES
- Manage and automate MuleSoft Anypoint Platform configurations, including API management, security, deployments, monitoring, and governance.
- Troubleshoot production issues and provide root-cause analysis.
- Set up and support highly available integration platform infrastructure (IaaS/PaaS).
- Automate LastMile Security certificate management and renewal processes.
- Implement CI/CD pipelines to streamline MuleSoft application deployments.
- Develop self-healing and proactive monitoring solutions for MuleSoft applications and APIs.
- Work with developers to triage production bugs.
- Manage a queue of cases and work with users and other support teams to troubleshoot production issues.
- Provide 24/7 on-call production support once every month, on a rotational basis, for the enterprise platform.
- Integrate GitHub with MuleSoft Design Center and automate code deployment and rollback mechanisms.
- Implement infrastructure automation using Terraform for MuleSoft environments.
- Ensure high availability and fault tolerance of MuleSoft applications through proper configurations and failover strategies.

MINIMUM QUALIFICATIONS
- Education: full-time graduation from an accredited university (mandatory; this is the minimum education criterion and cannot be altered).
- 10+ years of experience in the IT industry, with a minimum of 8 years of administration or operations experience in the Mule area.
- Strong experience with Runtime Fabric and the hybrid deployment model.
- Expertise in deployment strategies, Mule clustering, Mule Gateway, MUnit, and MuleSoft MMC.
- Experience troubleshooting and managing runtime servers, and experience with Mule connectors (standard and custom).
- Ability to provide technical consultation on MuleSoft platform best practices, security, and governance.
- Considerable knowledge of API development on the Mule platform.
- Strong experience with Anypoint Platform, CloudHub, RTF, and API management.
- Strong understanding of and experience with security implementations (e.g., SSL/mutual SSL, SAML, OAuth).
- Hands-on experience with MuleSoft API Gateway, RTF, Anypoint Monitoring, and security.
- Experience in monitoring and alerting for MuleSoft APIs and applications.
- Strong knowledge of CI/CD tools such as GitHub and GitHub Actions.
- Experience in Infrastructure as Code (IaC) using Terraform.
- Excellent troubleshooting skills in dynamic environments.
- Familiarity with Agile methodologies and modern software engineering principles.
- Strong problem-solving skills and the ability to work in a fast-paced environment.

PREFERRED QUALIFICATIONS
- Strong aptitude for learning and a passion for problem solving.
- Excellent communication skills for coordinating with different stakeholders.
- Good to have: knowledge of Kafka and of Google Cloud Platform (GCP) and its services, such as IAM, Apigee, Composer, Compute, and Storage.
- Nice to have: working knowledge of Service-Oriented Architecture (SOA) and associated concepts such as XML schemas, WS specifications, RESTful APIs, SOAP, service mediation/ESB, digital certificates, and messaging.
- Good to have: knowledge of Python, shell scripting, or other automation languages.
- Nice to have: exposure to Kubernetes and container orchestration.
- Strong knowledge of infrastructure components, including networking, storage, and compute in cloud environments.
- Workflow management: Google Cloud Composer and Airflow jobs.
- Cloud: Google Cloud Platform (GCP); experience managing Google Cloud Composer and Tidal Scheduler from a platform/infrastructure perspective.
- Terraform knowledge.
- Hands-on experience with monitoring and logging tools for performance and issue resolution.
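The accountability to "automate certificate management and renewal processes" ultimately reduces to one recurring decision: is this certificate inside its renewal window? Below is a stdlib-only sketch of that decision; the 30-day window is an illustrative policy choice, not a MuleSoft or LastMile Security default, and a real automation would read the expiry from the certificate itself and then trigger the renewal workflow.

```python
from datetime import date, timedelta

# Illustrative renewal-window check behind automated certificate rotation.
# The 30-day window is an assumed policy, not a product default.
RENEWAL_WINDOW_DAYS = 30

def needs_renewal(not_after: date, today: date) -> bool:
    """True when the certificate expires within the renewal window."""
    return not_after - today <= timedelta(days=RENEWAL_WINDOW_DAYS)

# Certificate expiring in 16 days is inside the window; one expiring
# next year is not.
soon = needs_renewal(date(2025, 7, 1), date(2025, 6, 15))
later = needs_renewal(date(2026, 1, 1), date(2025, 6, 15))
```

A scheduled job (cron, Cloud Composer) would run this check daily across the certificate inventory and enqueue renewals, which is what makes the process "automated" rather than calendar-driven.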

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Sr Infrastructure Engineer - Integration
Function/Group: Digital and Technology
Location: Mumbai
Shift Timing: Regular
Role Reports to: D&T Manager - Integration
Remote/Hybrid/In-Office: Hybrid

ABOUT GENERAL MILLS
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team, who uphold a vision of relentless innovation while being a force for good.

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence, and growth while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS), and Human Resources Shared Services (HRSS). We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

JOB OVERVIEW
Function Overview: The Digital and Technology team at General Mills stands as the largest and foremost unit, dedicated to exploring the latest trends and innovations in technology while leading the adoption of cutting-edge technologies across the organization. Collaborating closely with global business teams, the focus is on understanding business models and identifying opportunities to leverage technology for increased efficiency and disruption. The team's expertise spans a wide range of areas, including AI/ML, Data Science, IoT, NLP, Cloud, Infrastructure, RPA and Automation, Digital Transformation, Cyber Security, Blockchain, SAP S/4HANA, and Enterprise Architecture. The MillsWorks initiative embodies an agile@scale delivery model, where business and technology teams operate cohesively in pods with a unified mission to deliver value for the company. Employees working on significant technology projects are recognized as Digital Transformation change agents. The team places a strong emphasis on service partnerships and employee engagement, with a commitment to advancing equity and supporting communities. In fostering an inclusive culture, the team values individuals passionate about learning and growing with technology, exemplified by the Work with Heart philosophy, which emphasizes results over facetime. Those intrigued by the prospect of contributing to the digital transformation journey of a Fortune 500 company are encouraged to explore the function in more detail.

Purpose of the role: We have an exciting opportunity for Sr. Infrastructure Engineers to work with General Mills' various Advanced Digital Transformation teams and partner to achieve required business outcomes by transforming, simplifying, integrating, and managing our services in the cloud. We are seeking a highly skilled MuleSoft Platform & Automation Engineer with expertise in managing and automating the MuleSoft ecosystem. The ideal candidate should have strong experience in MuleSoft platform administration, automation, and CI/CD, along with exposure to Mule development. Additional knowledge of Cloud Composer (Airflow), GCP, Terraform, Confluent Kafka, and scripting is a plus. Primary responsibilities include administration and automation excellence, consultation, optimization, and implementation.

KEY ACCOUNTABILITIES
- Manage and automate MuleSoft Anypoint Platform configurations, including API management, security, deployments, monitoring, and governance.
- Troubleshoot production issues and provide root-cause analysis.
- Set up and support highly available integration platform infrastructure (IaaS/PaaS).
- Automate LastMile Security certificate management and renewal processes.
- Implement CI/CD pipelines to streamline MuleSoft application deployments.
- Develop self-healing and proactive monitoring solutions for MuleSoft applications and APIs.
- Work with developers to triage production bugs.
- Manage a queue of cases and work with users and other support teams to troubleshoot production issues.
- Provide 24/7 on-call production support once every month, on a rotational basis, for the enterprise platform.
- Integrate GitHub with MuleSoft Design Center and automate code deployment and rollback mechanisms.
- Implement infrastructure automation using Terraform for MuleSoft environments.
- Ensure high availability and fault tolerance of MuleSoft applications through proper configurations and failover strategies.

MINIMUM QUALIFICATIONS
- Education: full-time graduation from an accredited university (mandatory; this is the minimum education criterion and cannot be altered).
- 10+ years of experience in the IT industry, with a minimum of 8 years of administration or operations experience in the Mule area.
- Strong experience with Runtime Fabric and the hybrid deployment model.
- Expertise in deployment strategies, Mule clustering, Mule Gateway, MUnit, and MuleSoft MMC.
- Experience troubleshooting and managing runtime servers, and experience with Mule connectors (standard and custom).
- Ability to provide technical consultation on MuleSoft platform best practices, security, and governance.
- Considerable knowledge of API development on the Mule platform.
- Strong experience with Anypoint Platform, CloudHub, RTF, and API management.
- Strong understanding of and experience with security implementations (e.g., SSL/mutual SSL, SAML, OAuth).
- Hands-on experience with MuleSoft API Gateway, RTF, Anypoint Monitoring, and security.
- Experience in monitoring and alerting for MuleSoft APIs and applications.
- Strong knowledge of CI/CD tools such as GitHub and GitHub Actions.
- Experience in Infrastructure as Code (IaC) using Terraform.
- Excellent troubleshooting skills in dynamic environments.
- Familiarity with Agile methodologies and modern software engineering principles.
- Strong problem-solving skills and the ability to work in a fast-paced environment.

PREFERRED QUALIFICATIONS
- Strong aptitude for learning and a passion for problem solving.
- Excellent communication skills for coordinating with different stakeholders.
- Good to have: knowledge of Kafka and of Google Cloud Platform (GCP) and its services, such as IAM, Apigee, Composer, Compute, and Storage.
- Nice to have: working knowledge of Service-Oriented Architecture (SOA) and associated concepts such as XML schemas, WS specifications, RESTful APIs, SOAP, service mediation/ESB, digital certificates, and messaging.
- Good to have: knowledge of Python, shell scripting, or other automation languages.
- Nice to have: exposure to Kubernetes and container orchestration.
- Strong knowledge of infrastructure components, including networking, storage, and compute in cloud environments.
- Workflow management: Google Cloud Composer and Airflow jobs.
- Cloud: Google Cloud Platform (GCP); experience managing Google Cloud Composer and Tidal Scheduler from a platform/infrastructure perspective.
- Terraform knowledge.
- Hands-on experience with monitoring and logging tools for performance and issue resolution.
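The "self-healing and proactive monitoring" accountability usually means: probe an application's health endpoint, tolerate transient blips, and trigger a restart only after several consecutive failures. The sketch below illustrates that debouncing logic only; the threshold is a hypothetical policy, and in practice the restart action would call the platform's own API (Anypoint, Kubernetes) rather than a stub.

```python
# Sketch of the restart-decision half of a self-healing monitor:
# require N consecutive failed probes before firing a restart, so a
# single transient failure does not bounce the application.

class HealthMonitor:
    def __init__(self, failure_threshold: int = 3):   # threshold is illustrative
        self.failure_threshold = failure_threshold
        self.consecutive_failures = 0

    def record(self, healthy: bool) -> bool:
        """Record one probe result; return True when a restart should fire."""
        if healthy:
            self.consecutive_failures = 0
            return False
        self.consecutive_failures += 1
        if self.consecutive_failures >= self.failure_threshold:
            self.consecutive_failures = 0   # reset after triggering the restart
            return True
        return False

monitor = HealthMonitor(failure_threshold=3)
# Two failures, a recovery, then three failures in a row.
events = [monitor.record(h) for h in (False, False, True, False, False, False)]
```

Only the final probe in the sequence trips the restart, because the healthy probe in the middle reset the counter; that is the property that distinguishes self-healing from naive restart-on-any-failure.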

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have experience working with Confluent Kafka, with strong knowledge of Kafka architecture, KRaft mode (Kafka Raft), Kafka Connect, Kafka Streams, and KSQL. Proficiency in scripting languages such as Python and Bash, and in automation tools like Ansible and Terraform, is essential. Experience with monitoring tools like Prometheus, Grafana, Dynatrace, and Splunk ITSI is required, as is an understanding of Kafka security best practices, including SSL/TLS, Kerberos, and RBAC. Strong analytical and problem-solving skills, along with excellent communication and collaboration skills, are also expected.
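The monitoring tools listed above are most often pointed at one Kafka health metric: consumer lag, the gap between a partition's log-end offset and the consumer group's committed offset, summed across partitions. The sketch below shows that arithmetic only, with made-up offsets; real deployments read the offsets from the broker (e.g., via an exporter feeding Prometheus) rather than hard-coding them.

```python
# Consumer-lag arithmetic: log-end offset minus committed offset per
# partition, summed. The offset values below are invented for illustration.

def total_lag(log_end_offsets: dict, committed_offsets: dict) -> int:
    """Sum per-partition lag; a partition with no commit counts from offset 0."""
    return sum(
        end - committed_offsets.get(partition, 0)
        for partition, end in log_end_offsets.items()
    )

log_end = {0: 1_500, 1: 1_480, 2: 900}       # latest offset per partition
committed = {0: 1_500, 1: 1_200, 2: 850}     # consumer group's commits
lag = total_lag(log_end, committed)
```

An alert would typically fire when this sum stays above a threshold for some sustained window, since momentary lag during a traffic spike is normal.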

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Genpact is a global professional services and solutions firm with a workforce of 125,000+ employees in over 30 countries. We are characterized by our innate curiosity, entrepreneurial agility, and commitment to creating lasting value for our clients, including Fortune Global 500 companies. Our purpose, the relentless pursuit of a world that works better for people, drives us to serve and transform leading enterprises worldwide. We leverage our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI to deliver outcomes that shape the future.

We are currently seeking applications for the position of Lead Consultant, Senior Java Developer with experience in microservices. As a Senior Java Developer, you will be responsible for the following:

**Responsibilities:**
- Must-have skills: Java, Spring/Spring Boot, REST API, Microservices (Java or Node.js), JBoss, SQL, MS Azure (Azure Event Hub, Confluent Kafka, Azure App Service Environment, ASP) or AWS equivalent.
- Working knowledge: Bitbucket, Git, Confluence, Jira; strong experience with DevOps pipelines, CI/CD, and related tools.
- Nice to have: Kubernetes, MQ, OAuth and event-driven messaging, Postman, operating systems (Windows, Linux), JBoss scripting/CLI, prior FI experience.
- Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities, coaching, educating, and monitoring the work of others.
- Primary subject matter expertise in multiple areas, counselling clients and project teams on research, analysis, design, and support of technical business solutions, as well as the development and testing of technical solutions.
- Involvement in coaching and advising clients, partners, and project teams, serving as an internal expert resource in technical information exchange.
- Commitment to and belief in the quality of deliverables.

**Qualifications:**
Minimum Qualifications:
- BE/BTech/MCA or equivalent
- Excellent communication skills

Preferred Qualifications/Skills:
- Experience in software development, including architecting, designing, and coding.
- Strong expertise in API and microservices development and integration using Java/Spring Boot and Node.js.
- Expert knowledge of the business, the broader organization, the technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.

**Location:** India-Hyderabad
**Schedule:** Full-time
**Education Level:** Bachelor's / Graduation / Equivalent
**Job Posting:** Jun 10, 2025, 6:07:05 AM
**Unposting Date:** Ongoing
**Master Skills List:** Consulting
**Job Category:** Full Time
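The "OAuth and event-driven messaging" item above typically shows up in services as client-side token caching: fetch an access token (e.g., via the client-credentials grant), reuse it until shortly before expiry, then refresh. The sketch below shows only that caching logic with a fake fetcher and an injected clock so it runs standalone; the 60-second skew and all names are illustrative, not from any OAuth library.

```python
# Client-side OAuth access-token cache: reuse a token until shortly
# before it expires. Skew and names are illustrative assumptions.

EXPIRY_SKEW_SECONDS = 60   # refresh this long before actual expiry

class TokenCache:
    def __init__(self, fetch_token, clock):
        self._fetch = fetch_token   # callable returning (token, lifetime_seconds)
        self._clock = clock         # injected clock, for testability
        self._token = None
        self._expires_at = 0.0

    def get(self) -> str:
        """Return a valid token, fetching a new one only when needed."""
        if self._token is None or self._clock() >= self._expires_at - EXPIRY_SKEW_SECONDS:
            self._token, lifetime = self._fetch()
            self._expires_at = self._clock() + lifetime
        return self._token

# Fake token endpoint and controllable clock for the demo.
now = [0.0]
fetched = []
def fake_fetch():
    fetched.append(now[0])
    return f"token-{len(fetched)}", 3600   # 1-hour lifetime

cache = TokenCache(fake_fetch, lambda: now[0])
first = cache.get()        # cold cache: fetches token-1
now[0] = 1800.0
second = cache.get()       # still fresh: reuses token-1
now[0] = 3590.0
third = cache.get()        # inside the skew window: fetches token-2
```

The skew matters in messaging contexts: a token that expires mid-publish causes retries, so refreshing slightly early is the common pattern.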

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Inviting applications for the role of Lead Consultant, Senior Java Developer. For this role, we are looking for a Senior Java Developer with work experience in microservices.

Responsibilities
- Must-have skills: Java, Spring/Spring Boot, REST API, Microservices (Java or Node.js), JBoss, SQL, MS Azure (Azure Event Hub, Confluent Kafka, Azure App Service Environment, ASP) or AWS equivalent.
- Working knowledge: Bitbucket, Git, Confluence, Jira; strong experience with DevOps pipelines, CI/CD, and related tools.
- Nice to have: Kubernetes, MQ, OAuth and event-driven messaging, Postman, operating systems (Windows, Linux), JBoss scripting/CLI, prior FI experience.
- Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g., design and support of technical business solutions), with the ability to be relied on to coach, educate, and monitor the work of others.
- Primary subject matter expertise in multiple areas; you're seasoned in counselling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions, and testing.
- Involvement in coaching and advising clients, partners, and project teams; capable of serving as an internal expert resource in technical information exchange.
- Commitment to and belief in the quality of your deliverables.

Qualifications we seek in you
Minimum Qualifications
- BE/BTech/MCA or equivalent
- Excellent communication skills

Preferred Qualifications/Skills
- Experience in software development, including architecting, designing, and coding.
- Strong expertise in API and microservices development and integration using Java/Spring Boot and Node.js.
- Expert knowledge of the business, the broader organization, the technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! With competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere. ACV is a technology company that has revolutionized how dealers buy and sell cars online, and we are transforming the automotive industry. ACV Auctions Inc. (ACV) applies innovation through user-designed, data-driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling, and managing used vehicles, with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. ACV's network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital, and ACV Capital within its Marketplace Products, as well as True360 and Data Services. ACV Auctions is opening its new India Development Center in Chennai, India, and we're looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles. At ACV, we put people first and believe in the principles of trust and transparency. If you are looking for an opportunity to work with the best minds in the industry and solve unique business and technology problems, look no further! Join us in shaping the future of the automotive marketplace! At ACV, we focus on the Health, Physical, Financial, Social, and Emotional Wellness of our teammates, and to support this we offer industry-leading benefits and wellness programs.
We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing for performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments of containerized applications, design and development, database management and performance, SOX compliance requirements, and infrastructure implemented through Terraform automation, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in developing solutions to ACV's most complex data and software problems. You will operate in a high-performing team as a technical liaison who balances high-quality delivery with customer focus, has excellent communication skills, the desire and ability to mentor and guide engineers, and a record of delivering results in a fast-paced environment.

In this role, you will collaborate with cross-functional teams, including data scientists, software engineers, data engineers, and data analysts, to understand data requirements and translate them into technical specifications. You will influence company-wide engineering standards for databases, tooling, languages, and build systems. You will design, implement, and maintain scalable, high-performance data infrastructure solutions, with a primary focus on data, as well as tools and best practices for access control, data versioning, database management, and migration strategies.
Additionally, you will contribute to, influence, and set standards for all technical aspects of a product or service, including coding, testing, debugging, performance, languages, database selection, management, and deployment. Your responsibilities will include:
- Identifying and troubleshooting database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Writing clean, maintainable, well-commented code and automation to support our data infrastructure layer.
- Performing code reviews, developing high-quality documentation, and building robust test suites for your products.
- Providing technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborating with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborating with development and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participating in SOX audits, including the creation of standards and reproducible audit evidence through automation.
- Creating and maintaining documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintaining and extending existing database operations solutions for backups, index defragmentation, data retention, etc.
- Responding to and troubleshooting highly complex problems quickly, efficiently, and effectively.
- Being accountable for the overall performance of products and/or services within a defined area of focus.
- Taking part in the on-call rotation.
- Handling multiple competing priorities in an agile, fast-paced environment.
- Performing additional duties as assigned.
To be eligible for this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- The ability to read, write, speak, and understand English.
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment.
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer.
- 1+ years of work with continuous integration and build tools.
- 1+ years of experience programming in Python.
- 1+ years of experience with cloud platforms, preferably GCP/AWS.
- Knowledge of day-to-day tools and how they work, including deployments, Kubernetes (k8s), monitoring systems, and testing tools.
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing.
- Hands-on skills and the ability to drill deep into complex system design and implementation.
- Experience with DevOps practices and tools for database automation and infrastructure provisioning.
- Experience with Python, SQL, GitHub, Jenkins, infrastructure-as-code tooling such as Terraform (preferred), big data technologies, and distributed databases.
Nice-to-have qualifications include:
- Experience with NoSQL data stores, Airflow, Docker, containers, Kubernetes, DataDog, and Fivetran.
- Experience with database monitoring and diagnostic tools, preferably DataDog.
- Database management/administration experience with PostgreSQL, MySQL, DynamoDB, MongoDB, GCP/BigQuery, and Confluent Kafka.
- Experience using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP.
- Service-Oriented Architecture/microservices, and event sourcing on a platform like Kafka (preferred).
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning.
- Hands-on experience with SOX compliance requirements.
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks.
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques.

Our Values: Trust & Transparency - People First - Positive Experiences - Calm Persistence - Never Settling
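The database-operations duties described above (backups, data retention, and the like) can be illustrated with a minimal, hypothetical sketch. The function name and the 30-day policy are assumptions for illustration, not ACV's actual tooling:

```python
from datetime import datetime, timedelta

def backups_to_prune(backup_dates, keep_days=30, now=None):
    """Return backup timestamps older than the retention window.

    backup_dates: iterable of datetimes, one per backup snapshot.
    keep_days: retention window in days (the policy value is hypothetical).
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=keep_days)
    return sorted(d for d in backup_dates if d < cutoff)

# With a 30-day window, only the 45-day-old snapshot falls past the cutoff.
now = datetime(2024, 1, 31)
backups = [now - timedelta(days=45), now - timedelta(days=10)]
stale = backups_to_prune(backups, keep_days=30, now=now)
```

A real retention job would wrap logic like this in scheduled automation (e.g. via Airflow or cron) and delete the returned snapshots, with audit logging to support SOX evidence.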

Posted 2 weeks ago

Apply

6.0 - 8.0 years

27 - 30 Lacs

pune, ahmedabad, chennai

Work from Office

Technical Skills Must Have:
- 8+ years overall IT industry experience, with 5+ years in a solution or technical architect role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms.
- 5+ years of hands-on development experience with event-driven architecture-based implementations.
- One or more of the typical solution and technical architecture certifications, e.g. Microsoft/MS Azure certification, TOGAF, AWS Cloud certification, SAFe, PMI, SAP, etc.
- Hands-on experience with:
o Claims-based authentication (SAML/OAuth/OIDC), MFA, JIT, and/or RBAC / Ping, etc.
o Architecting mission-critical technology components with DR capabilities.
o Multi-geography, multi-tier service design and management.
o Project financial management, solution plan development, and product cost estimation.
o Supporting peer teams and their responsibilities, such as infrastructure, operations, engineering, and info-security.
o Configuration management and automation tools such as Azure DevOps, Ansible, Puppet, Octopus, Chef, Salt, etc.
o Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools.
o Relational, graph, and/or unstructured data technologies such as SQL Server, Azure SQL, Cosmos, Azure Data Lake, HD Insights, Hadoop, Neo4j, etc.
o Data management and data governance technologies.
o Data movement and transformation technologies.
o AI and machine learning tools such as Azure ML.
o Architecting mobile applications that are either independent applications or supplementary add-ons (to an intranet or extranet).
o Cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
o Apache Kafka, Confluent Kafka, Kafka Streams, and Kafka Connect.
- Proficient in NodeJS, Java, Scala, or Python.

Posted 2 weeks ago

Apply

15.0 - 17.0 years

0 Lacs

pune, maharashtra, india

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Sr Associate Director Software Engineering.

In this role, you will be part of the Innovation and Gen AI tech team, responsible for supporting use-case delivery on Innovation and GenAI, including new use case onboarding design, development/coding, testing, and deployment into production. You will:
- Understand the Group and Compliance/Enterprise Technology Innovation strategy, and help define and execute a technical strategy that fulfils business needs and aspirations for growth, management, and control.
- Work with the global Risk and Compliance IT architecture team to ensure that technical solutions, development, and integrations adhere to group standards.
- Establish and execute a vision to plan, deliver, and support solutions in a complex, distributed technology environment. The ideal candidate must be able to communicate clearly and effectively with both technical and non-technical individuals.
- Plan people and project management; coach and guide, directly or indirectly, teams of developers, testers, analysts, and architects by giving clear direction, feedback, and timely suggestions to ensure a high quality standard of deliverables according to HSBC standards and best practices.
- Address existing technical debt and drive technical evolution, innovation, and digital transformation for the teams by working closely with various parties, including business, Transformation, and Solution Architects globally.
- Establish and maintain trustworthy relationships with the business and relevant stakeholders. Manage the expectations of key stakeholders and work jointly to maximize the interests of the business and customers.
- Manage the supply and demand pipeline, and give guidance and direction for decision-making to achieve goals and deliver products that align with the business.

Requirements

To be successful in this role, you should meet the following requirements. You will be tenacious about doing things the right way and building efficient and brilliantly simple business solutions. You will also be adept at working in customer-facing roles within enterprise environments, helping clients to capitalise on technology for their commercial benefit. High-level and holistic capabilities sought:
- Bachelor's degree in Computer Science, Engineering, or equivalent; an advanced degree is preferred.
- GCP or AWS certified (GCP preferred).
- Prompt engineering experience preferred.
- Expert-level core Java / Python skills.
- Minimum 15 years of software development experience with both waterfall and Agile methodologies.
- Experience leading and managing agile, cross-functional delivery teams encompassing 30+ staff (direct and indirect reports).
- Strong technical capabilities (BigData, AI/ML, API, Microservices), plus knowledge and experience of DevOps, Disciplined Agile Delivery (DAD), and Agile control frameworks.
- Passionate about technology; looks for opportunities to learn and bring new ideas to the team.
- Sound understanding of Azure / Google Cloud platforms.
- Experience working with Kubernetes, Docker, Storage, and Cloud Functions.
- Experience performing data analysis on databases (SQL, NoSQL, DBT).
- CI/CD pipelines.
- Data Flow, Data Proc, and BigQuery skills an advantage.
- Security (IAM, AD, ADLDS, roles, service accounts, entitlements).
- Events and data streaming: Data Proc, Pub/Sub, Confluent/Kafka.
- Ability to communicate and explain complex ideas in both oral and written English.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSDI

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Senior Java Developer.

Responsibilities
- Expert knowledge of the business, broader organization, technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.
- Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g.
design, support of technical business solutions) and can be relied on to coach, educate, and monitor the work of others.
- Primary subject matter expertise in multiple areas; you're seasoned in counselling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions, and testing.
- Involvement in coaching and advising clients, partners, and project teams; capable of being an internal expert resource in technical information exchange.
- Commitment to and belief in the quality of your deliverables.

Qualifications we seek in you!
Minimum Qualifications
- B.E/B.Tech or any equivalent graduation.
Preferred Qualifications/Skills
- Must have skills: JAVA, Spring/Spring Boot, REST API, Microservices - JAVA or NODE.JS, JBOSS, SQL, MS Azure (Azure EventHub, Confluent Kafka, Azure App Service Environment, ASP) or AWS equivalent.
- Working knowledge: Bitbucket, GIT, Confluence, JIRA; strong experience in DevOps pipelines, CI/CD, and related tools.
- Nice to have: Kubernetes, MQ, OAuth and event-driven messaging, Postman, O/S (Windows, Linux), JBoss scripting/CLI, prior FI experience.

Why join Genpact?
- Lead AI-first transformation - Build and scale AI solutions that redefine industries.
- Make an impact - Drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

pune, maharashtra, india

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Java, Kafka, AWS. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and should be willing to learn new technologies.

Responsibilities
- Excellent development knowledge in Java, microservices, and AWS.
- Design and develop Kafka-based data pipelines.
- Strong hands-on experience building applications using Kafka event streaming (preferably Confluent Kafka).
- Experience in deploying solutions (CI/CD, DevOps).
- Understand the use cases and develop applications from detailed design specifications.
- Write user stories based on an understanding of the use cases.
- Work with other developers to ensure deadlines are met.
- Develop the full range of tests to support a fully automated deployment pipeline (unit, integration, system, smoke, performance, etc.).
- Bug and defect remediation.
- Interact with customers (on specific items).
- Adhere to client coding standards.
- Write test cases as needed to ensure a fully automated deployment pipeline.
- Participate in quality improvement initiatives as required.
- Mentor and coach graduate staff.
- Can write effective APIs (REST API a plus).
- Experience in technical documentation and the creation of administration/support docs.
- Ability to work both independently and collaboratively within the team.

Qualifications we seek in you!
Minimum Qualifications/Skills
- BE/BTech/MCA
Preferred Qualifications/Skills
- Experience in Java and Kafka, preferably Confluent Kafka.
- Experience in server-side technologies (REST API a plus).
- Experience in creating, maintaining, and optimizing Kafka clusters and data pipelines.
- Fluent written and verbal communication in English.
- Knowledge of and experience in Agile/Scrum projects.
- Strong customer focus.
- Willingness and ability to multi-task.
- Willingness to learn the product/platform and associated technologies.
- Ability to learn and apply new technologies and tools.

Why join Genpact?
- Lead AI-first transformation - Build and scale AI solutions that redefine industries.
- Make an impact - Drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
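The Kafka event-streaming responsibilities in this posting revolve around consumer offsets and delivery guarantees. The following toy, in-memory sketch (plain Python, not the Confluent client API; all names are hypothetical) shows why committing an offset only after a message is successfully processed yields at-least-once delivery:

```python
def consume(messages, handler, committed_offset=0):
    """Process messages from `committed_offset` onward, committing after
    each successful handle. If the handler raises, the offset stays at the
    last commit, so those messages are re-delivered on the next run
    (at-least-once semantics). Purely illustrative, not the Kafka API.
    """
    for offset in range(committed_offset, len(messages)):
        try:
            handler(messages[offset])
        except Exception:
            return offset  # resume here next time; duplicates are possible
        committed_offset = offset + 1
    return committed_offset

attempts = {"bad": 0}
processed = []

def handler(msg):
    if msg == "bad":
        attempts["bad"] += 1
        if attempts["bad"] == 1:
            raise RuntimeError("transient failure")  # first attempt fails
    processed.append(msg)

log = ["a", "bad", "c"]
offset = consume(log, handler)          # handles "a", fails at "bad"
offset = consume(log, handler, offset)  # re-delivers "bad", then handles "c"
```

With a real Confluent Kafka consumer the same idea applies: disable auto-commit and commit offsets explicitly after processing, accepting that a crash between processing and commit means reprocessing, so handlers should be idempotent.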

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

hyderabad, telangana, india

On-site

Inviting applications for the role of Lead Consultant, Senior Java Developer. In this role, we are seeking a Senior Java Developer with working experience in Microservices.

Responsibilities
- Must have skills: JAVA, Spring/Spring Boot, REST API, Microservices - JAVA or NODE.JS, JBOSS, SQL, MS Azure (Azure EventHub, Confluent Kafka, Azure App Service Environment, ASP) or AWS equivalent.
- Working knowledge: Bitbucket, GIT, Confluence, JIRA; strong experience in DevOps pipelines, CI/CD, and related tools.
- Nice to have: Kubernetes, MQ, OAuth and event-driven messaging, Postman, O/S (Windows, Linux), JBoss scripting/CLI, prior FI experience.
- Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g. design, support of technical business solutions) and can be relied on to coach, educate, and monitor the work of others.
- Primary subject matter expertise in multiple areas; you're seasoned in counselling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions, and testing.
- Involvement in coaching and advising clients, partners, and project teams; capable of being an internal expert resource in technical information exchange.
- Commitment to and belief in the quality of your deliverables.

Qualifications we seek in you
Minimum Qualifications
- BE/B Tech/MCA or equivalent
- Excellent communication skills
Preferred Qualifications/Skills
- Experience of software development, including architecting, designing, and coding.
- Strong expertise in API and microservices development and integration using Java/Spring Boot and Node.js.
- Expert knowledge of the business, broader organization, technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

At PwC, our team in deals focuses on providing strategic advice and support to clients in areas such as mergers and acquisitions, divestitures, and restructuring. We help clients navigate complex transactions and maximize value in their business deals. Those in deal integration and valuation realization at PwC will focus on assisting clients in successfully integrating acquisitions and maximizing the value of their investments. You will be responsible for conducting valuations, financial analysis, and developing strategies for post-merger integration. Focused on relationships, you are building meaningful client connections and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise, and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow. Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: - Respond effectively to the diverse perspectives, needs, and feelings of others. - Use a broad range of tools, methodologies, and techniques to generate new ideas and solve problems. - Use critical thinking to break down complex concepts. - Understand the broader objectives of your project or role and how your work fits into the overall strategy. - Develop a deeper understanding of the business context and how it is changing. - Use reflection to develop self-awareness, enhance strengths, and address development areas. - Interpret data to inform insights and recommendations. - Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements. 
Essential Skills and Experience:
- 3+ years of relevant experience.
- Must have skills: JAVA, Spring/Spring Boot, REST API, Microservices (JAVA or NODE.JS), JBOSS, SQL, MS Azure (Azure EventHub, Confluent Kafka, ASP) or AWS equivalent.
- Working knowledge: Bitbucket, GIT, Confluence, JIRA; strong experience in DevOps pipelines, CI/CD, and related tools.
- Nice to have: OAuth and event-driven messaging, Postman, O/S (Windows, Linux), JBoss scripting/CLI, prior FI experience.
- Expert knowledge of the business, broader organization, technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.
- Responsible for overall development activities and progress, in alignment with the development standards and guidelines set by the practice.
- Provide technical guidance and support, and align the development practice with the bank's strategic vision and objectives.
- Provide technical input and support to Architecture & Design, delivery, and other team leads/partners as required.
- Act as the point of escalation for the development team.
- Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g. design, support of technical business solutions) and can be relied on to coach, educate, and monitor the work of others.
- Primary subject matter expertise in multiple areas; you're seasoned in counseling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions, and testing.
- Involvement in coaching and advising clients, partners, and project teams; capable of being an internal expert resource in "technical information exchange".
- Commitment to and belief in the quality of your deliverables.

Posted 1 month ago

Apply

10.0 - 12.0 years

10 - 12 Lacs

Hyderabad, Telangana, India

On-site

About Chubb

JOB DESCRIPTION

Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India

At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Position Details
Job Title: Confluent Kafka Platform Engineer / Technical Lead
Function/Department: Technology
Location: Hyderabad
Employment Type: Full Time
Reports To: Vijai Sai Kosireddy

Role Overview and Key Responsibilities
Design, implement, and manage a robust Confluent Kafka platform on Azure AKS or Azure VMs.
Work on Kafka clusters configuration, optimization, and scaling to ensure high availability and performance. Provide support and guidance to development teams on integrating applications with Confluent Kafka. Troubleshoot and resolve issues related to Kafka integrations, ensuring minimal service disruption. Lead the migration of existing Kafka implementations to the latest Confluent Kafka version, ensuring a smooth transition with minimal downtime. Collaborate with teams to define migration strategies, testing protocols, and rollback plans. Oversee the operational health of Kafka clusters, ensuring uptime, performance, and reliability. Implement monitoring, logging, and alerting solutions to track system performance and potential issues. Actively monitor the environment for security vulnerabilities and implement necessary patches and updates. Ensure compliance with industry security standards and best practices. Create and maintain documentation for Kafka architecture, configurations, and best practices. Conduct training sessions and workshops to upskill team members on Confluent Kafka usage and management. Communicate regularly with stakeholders to provide updates on platform health, upcoming changes, and migration progress. Collaborate with cross-functional teams to align on project goals and timelines. Skills And Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree preferred. Minimum of 10 years of experience in building and supporting data streaming solutions, specifically with Confluent Kafka. Demonstrated experience with Azure AKS and Azure VM environments. Solid understanding of Core Java fundamentals and experience in developing Java-based applications. Strong knowledge of Kafka architecture, including brokers, topics, consumers, and producers. Familiarity with migration processes for Kafka, including version upgrades and data migrations. 
Experience with monitoring tools (e.g., Prometheus, Grafana) for Kafka performance and health checks. Excellent troubleshooting skills with a passion for problem-solving.

Good To Have: Confluent Kafka certification or other relevant cloud certifications (e.g., Azure). Experience with microservices architecture, RESTful APIs, and containerization. Knowledge of security best practices related to data streaming and cloud environments.

Why Chubb? Join Chubb to be part of a leading global insurance company! Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence. A Great Place to Work: Chubb India has been recognized as a Great Place to Work for the years 2023-2024, 2024-2025, and 2025-2026. Laser focus on excellence: At Chubb, we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results. Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter. Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits: Our company offers a comprehensive benefits package designed to support our employees' health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision-related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment.
Our benefits include:
- Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits, and car lease that help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like education reimbursement programs, certification programs, and access to global learning programs.
- Health and welfare benefits: We care about our employees' well-being in and out of work, with benefits like a hybrid work environment, an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.
Apply Now: Chubb External Careers

Posted 1 month ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Hyderabad, Telangana, India

On-site

About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Position Details
Job Title: Senior Software Engineer | Function/Department: Technology | Location: Hyderabad - Work From Office | Employment Type: Full-time | Reports To: Kuldeep Kumar

Role Overview
As a Full Stack Engineer, you will work across front-end and back-end programming languages, contributing to the development of frameworks and integrating third-party libraries.

Key Responsibilities
- Full stack development: Work across front-end and back-end technologies to build robust and scalable applications.
- Data streaming solutions: Build and support data streaming solutions, particularly using Confluent Kafka.
- Framework and API development: Develop frameworks and integrate third-party libraries using technologies like the .NET Framework, C#, ASP.NET, and Web APIs.
- Cloud application design: Design, build, test, and maintain cloud applications and services on Microsoft Azure, with proficiency in various Azure SDKs, data storage, and app authentication.
- UI/UX and databases: Work with databases such as MSSQL and MySQL, perform performance tuning of relational databases, and contribute to UI/UX design.
- DevOps and CI/CD: Implement and manage CI/CD pipelines using tools like Bitbucket, Bamboo, PowerShell, and SonarQube.

Skills and Qualifications
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- Experience: 4-8 years of experience building and supporting data streaming solutions, specifically with Confluent Kafka.
- Technical skills: In-depth knowledge of the .NET Framework, C#, ASP.NET, and Web APIs; hands-on experience with Angular 13+ and JavaScript frameworks; familiarity with Agile methodologies and Scrum; expertise in designing, building, testing, and maintaining cloud applications on Microsoft Azure; familiarity with MSSQL, MySQL, IIS, UI/UX design, and performance tuning of relational databases.
- Proficient in DevOps practices and implementing CI/CD pipelines using Bitbucket, Bamboo, PowerShell, and SonarQube.

Why Chubb
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A great place to work: Chubb India has been recognized as a Great Place to Work for 2023-2024, 2024-2025, and 2025-2026.
- Focus on excellence: Chubb takes pride in a culture of excellence, constantly seeking innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, Chubb focuses on speed and agility, enabling quick responses to market needs.
- Growth and success: Chubb remains committed to providing employees with the best work experience, supporting career advancement and continuous learning.

Employee Benefits
Chubb offers a comprehensive benefits package that supports employees' health, well-being, and professional growth. The benefits include:
- Savings and investment plans: Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits, and car lease.
- Upskilling and career growth: Customized programs supporting education reimbursement, certification programs, and access to global learning programs.
- Health and welfare benefits: Hybrid work environment, Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with the recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments or technical/functional interviews.
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.
Apply Now: Chubb External Careers

Posted 1 month ago

Apply

6.0 - 8.0 years

6 - 8 Lacs

Hyderabad, Telangana, India

On-site

About the Job

About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Position Details
Job Title: Senior Software Engineer | Function/Department: Technology | Location: Hyderabad - Work From Office | Employment Type: Full-time | Reports To: Kuldeep Kumar

Role Overview
As a Full Stack Engineer, you will work across front-end and back-end programming languages, contributing to the development of frameworks and integrating third-party libraries.

Skills and Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- 6-8 years of experience building and supporting data streaming solutions, specifically with Confluent Kafka.
- In-depth knowledge of the .NET Framework, C#, ASP.NET, and Web APIs.
- Hands-on experience with Angular 13+ and JavaScript frameworks.
- Familiarity with Agile methodologies and Scrum experience.
- Expertise in designing, building, testing, and maintaining cloud applications and services on Microsoft Azure, including the ability to program in a language supported by Azure and proficiency with Azure SDKs, data storage options, data connections, APIs, app authentication and authorization, compute and container deployment, debugging, performance tuning, and monitoring.
- Familiarity with databases (e.g., MSSQL and MySQL), web servers (IIS), UI/UX design, and performance tuning of relational databases.
- Skilled in DevOps, with experience implementing CI/CD pipelines using Bitbucket, Bamboo, PowerShell, and SonarQube.

Why Chubb
Join Chubb to be part of a leading global insurance company!
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A great place to work: Chubb India has been recognized as a Great Place to Work for 2023-2024, 2024-2025, and 2025-2026.
- Laser focus on excellence: At Chubb, we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being.
We constantly seek new and innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
- Growth and success: As we continue to grow, we are steadfast in our commitment to providing our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees' health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision-related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment.

Our benefits include:
- Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits, and car lease that help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like education reimbursement programs, certification programs, and access to global learning programs.
- Health and welfare benefits: We care about our employees' well-being in and out of work, with benefits like a hybrid work environment, an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.
Apply Now: Chubb External Careers

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for designing, implementing, and maintaining scalable event-streaming architectures that support real-time data.

Responsibilities:
- Design, build, and manage Kafka clusters using Confluent Platform and managed Kafka cloud services (AWS MSK, Confluent Cloud).
- Develop and maintain Kafka topics, schemas (Avro/Protobuf), and connectors for data ingestion and processing pipelines.
- Monitor and ensure the reliability, scalability, and security of the Kafka infrastructure.
- Collaborate with application and data engineering teams to integrate Kafka with other AWS-based services (e.g., Lambda, S3, EC2, Redshift).
- Implement and manage Kafka Connect, Kafka Streams, and ksqlDB where applicable.
- Optimize Kafka performance, troubleshoot issues, and manage incidents.

Requirements:
- At least 3-5 years of experience working with Apache Kafka and Confluent Kafka.
- Strong knowledge of Kafka internals, including brokers, ZooKeeper, partitions, replication, and offsets.
- Experience with Kafka Connect, Schema Registry, REST Proxy, and Kafka security.
- Hands-on experience with AWS services such as EC2, IAM, CloudWatch, S3, Lambda, VPC, and load balancers.
- Proficiency in scripting and automation using tools like Terraform, Ansible, or similar (preferred).
- Familiarity with DevOps practices and tools, such as CI/CD pipelines and monitoring tools like Prometheus/Grafana, Splunk, or Datadog (beneficial).
- Experience with containerization using Docker and Kubernetes is an advantage.

Additional assets: Confluent Certified Developer or Administrator certification, AWS certifications, and experience with CI/CD tools such as AWS CodePipeline or Harness.
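The connector work described in this role is typically driven by JSON configurations submitted to the Kafka Connect REST API. The following is a minimal, hypothetical example of a Confluent S3 sink connector config; the connector name, topic, bucket, and region are placeholders, not details from the posting:

```json
{
  "name": "orders-s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "2",
    "topics": "orders",
    "s3.bucket.name": "example-orders-archive",
    "s3.region": "us-east-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000"
  }
}
```

A config like this would typically be registered with a `POST /connectors` call against the Connect worker's REST endpoint, after which Connect schedules the sink tasks across the cluster.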

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description
bp is transforming, and at our Digital Hub in Pune we are growing the digital expertise and solutions needed to advance the global energy transition. Digital Engineering is a team of technology and software specialists providing innovative, custom-built or customized software and technical platforms to bp colleagues and external users.

Let Me Tell You About The Role
As an Integration Senior Enterprise Tech Engineer, you are a senior member of a team creating application integration solutions for bp colleagues and external users. Your team's mission is to be the digital provider of choice for your area of bp, delivering innovation at speed where it's wanted and day-in-day-out reliability where it's needed. You will operate in a dynamic and commercially focused environment, with the resources of one of the world's largest digital organisations and leading digital and IT vendors working with you. You will be part of growing and strengthening our technical talent base: experts coming together to solve bp's and the world's problems.

What You Will Deliver
- Lead enterprise technology architecture, security frameworks, and platform engineering across enterprise landscapes.
- Oversee the end-to-end security of enterprise platforms, ensuring compliance with industry standards and regulatory requirements.
- Drive enterprise operations excellence, optimising system performance, availability, and scalability.
- Provide leadership in enterprise modernization and transformation, ensuring seamless integration with enterprise IT.
- Establish governance, security standards, and risk management strategies aligned with global security policies.
- Design and implement automated security monitoring, vulnerability assessments, and identity management solutions for enterprise environments.
- Drive CI/CD, DevOps, and Infrastructure-as-Code adoption for enterprise deployments.
- Ensure disaster recovery, high availability, and resilience planning for enterprise platforms.
- Engage with business leaders, technology teams, and external vendors to ensure enterprise solutions align with enterprise goals.
- Mentor and lead enterprise security and operations teams, fostering a culture of excellence, innovation, and continuous improvement.
- Provide executive-level insights and technical recommendations on enterprise investments, cybersecurity threats, and operational risks.

What You Will Need to Be Successful (Experience and Qualifications)
- Bachelor's (or higher) degree, ideally in Computer Science, MIS/IT, Mathematics, or a hard science.
- Years of experience: 8-12 years, with a minimum of 5-7 years of relevant experience.

Essential Skills
- Subject-matter expert in the enterprise integration domain, able to design highly scalable integrations involving APIs, messaging, files, databases, and cloud services.
- Experienced with integration tools such as TIBCO/MuleSoft, Apache Camel/Spring Integration, Confluent Kafka, etc.
- Expert in Enterprise Integration Patterns (EIPs) and iBlocks for building secure integrations.
- Willingness and ability to learn, becoming skilled in at least one more cloud-native (AWS or Azure) integration solution on top of your existing skillset.
- Deep understanding of the interface development lifecycle, including design, security, design patterns for extensible and reliable code, automated unit and functional testing, CI/CD, and telemetry.
- Demonstrated understanding of modern technologies such as cloud-native, containers, and serverless.
- Emerging technology monitoring; application support.
- Strong inclusive leadership and people management; stakeholder management.
- Embraces a culture of continuous improvement.

Skills That Set You Apart
- Agile methodologies
- ServiceNow
- Risk management
- Systems development management
- Monitoring and telemetry tools
- User experience analysis
- Cybersecurity and compliance

Key Behaviors
- Empathetic: cares about our people, our community, and our planet.
- Curious: seeks to explore and excel.
- Creative: imagines the extraordinary.
- Inclusive: brings out the best in each other.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.
Skills: Automation, Integration

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company.
We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status, or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
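The Enterprise Integration Patterns (EIPs) called out in this role's essential skills can be made concrete with a small sketch. The following is a hypothetical, framework-free Python illustration of one classic EIP, the content-based router; the channel names and message shape are invented for the example and are not part of any real integration stack:

```python
# Minimal content-based router (a classic Enterprise Integration Pattern).
# Messages are plain dicts; channels are lists standing in for queues/topics.

def route(message, channels):
    """Deliver a message to the channel selected by its 'type' field,
    falling back to a dead-letter channel for unrecognized types."""
    destination = channels.get(message.get("type"), channels["dead-letter"])
    destination.append(message)
    return destination

channels = {"order": [], "invoice": [], "dead-letter": []}

route({"type": "order", "id": 1}, channels)
route({"type": "invoice", "id": 2}, channels)
route({"type": "unknown", "id": 3}, channels)  # falls through to dead-letter

print(len(channels["order"]), len(channels["invoice"]), len(channels["dead-letter"]))
# → 1 1 1
```

In tools like Apache Camel or Spring Integration this same routing decision is expressed declaratively, but the underlying pattern is exactly this dispatch-by-content step.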

Posted 1 month ago

Apply

5.0 - 9.0 years

5 - 10 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms (AWS, Azure, or GCP) for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.
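Topics, partitions, and consumer groups, called out in the skills above, fit together in a simple way: a group's partitions are divided among its consumers. The sketch below is a plain-Python illustration of range-style assignment for a single topic (the real assignors live in the Kafka client libraries; this is only a teaching model):

```python
# Illustrative range-style partition assignment for one topic:
# partitions are split into contiguous chunks, and the first
# (num_partitions % num_consumers) consumers get one extra partition.

def range_assign(num_partitions, consumers):
    consumers = sorted(consumers)  # assignment depends on a stable order
    base, extra = divmod(num_partitions, len(consumers))
    assignment, start = {}, 0
    for i, consumer in enumerate(consumers):
        count = base + (1 if i < extra else 0)
        assignment[consumer] = list(range(start, start + count))
        start += count
    return assignment

# 7 partitions across 3 consumers -> chunk sizes 3, 2, 2
print(range_assign(7, ["c1", "c2", "c3"]))
# → {'c1': [0, 1, 2], 'c2': [3, 4], 'c3': [5, 6]}
```

This also shows why a partition count that is not a multiple of the consumer count leaves some consumers with more load, a common sizing consideration when planning topics.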

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Ahmedabad, Gujarat, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms (AWS, Azure, or GCP) for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Gurgaon, Haryana, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms (AWS, Azure, or GCP) for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms (AWS, Azure, or GCP) for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Pune, Maharashtra, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms (AWS, Azure, or GCP) for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 1 month ago

Apply
Page 1 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

