
82 Kafka Streams Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We're Hiring: Senior Data Engineer (7+ Years Experience). Location: Gurugram, Haryana, India. Duration: 6 Months C2H (Contract to Hire). Apply Now: [HIDDEN TEXT]. What We're Looking For: 7+ years of experience in data engineering; strong expertise in building scalable, robust batch and real-time data pipelines; proficiency in AWS Data Services (S3, Glue, Athena, EMR, Kinesis, etc.); advanced SQL skills and deep knowledge of file formats such as Parquet, Delta Lake, Iceberg, and Hudi; hands-on experience with CDC patterns; experience with stream processing (Apache Flink, Kafka Streams) and distributed frameworks like PySpark; expertise in Apache Airflow for workflow orchestration; a solid foundation in data warehousing concepts and experience with both relational and NoSQL databases; strong communication and problem-solving skills; and a passion for staying up to date with the latest in the data tech landscape.

Posted 16 hours ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Technology Lead at Standard Chartered Bank, you will play a crucial role in driving the strategy, business growth, process enhancements, talent development, risk management, and governance within the CASH Management Operations. Your responsibilities will involve tapping into the latest industry trends and innovative products to deliver effective and faster product capabilities. You will support the operations by leveraging technology to streamline processes, enhance productivity, reduce risk, and improve controls. In the business domain, you will ensure the timely delivery of business objectives while meeting cost and quality constraints. Your role will involve supporting respective businesses in growing return on investment, commercializing capabilities, monitoring usage, improving client experience, enhancing operations, and addressing defects. Additionally, you will be responsible for fostering an ecosystem of innovation and enabling business through technology. Your focus on processes will include developing and enhancing core product capabilities, ensuring compliance with regulatory mandates, supporting operational improvements and process efficiencies, and collaborating with business and technology leaders of other SCB systems for joint delivery. As a leader, you will set the tone and expectations for the team, bridge skill and capability gaps through learning and development, and ensure clear role descriptions and expectations for the entire team. You will also manage the blend and balance of in-house and vendor resources effectively. Risk management is a critical aspect of your role, requiring quick and decisive action when risk and control weaknesses surface. You will balance business delivery with risks and controls to maintain acceptable risk levels and ensure business continuity and disaster recovery planning for the technology portfolio. In terms of governance, you will promote an environment of compliance with internal control functions and the external regulatory framework. Your conduct will be exemplary, adhering to the Group's Values and Code of Conduct while taking personal responsibility for embedding the highest standards of ethics. Your interactions will be with key stakeholders in Group Cash Operations, and your technical expertise will include Java/Spring Boot, Kafka Streams, REST, JSON, Hazelcast, ELK, Oracle, and Postgres. Your ability to work with varied stakeholders, strong communication skills, and knowledge of JIRA and Confluence tools will be essential for success in this role. Standard Chartered Bank offers a purpose-driven career where you can make a positive impact. If you are passionate about driving commerce and prosperity through diversity and inclusion, we encourage you to join us in our journey of growth and continuous improvement.

Posted 2 days ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

You should have at least 10 years of experience in the field, with a strong background in Kafka Streams / KSQL architecture and the associated clustering model. Your expertise should include solid programming skills with Java, along with best practices in development, automation testing, and streaming APIs. Practical experience in scaling Kafka, KStreams, and Connector infrastructures is required, as well as the ability to optimize the Kafka ecosystem based on specific use-cases and workloads. As a developer, you should have hands-on experience in building producer and consumer applications using the Kafka API, and proficiency in implementing KStreams components. Additionally, you should have developed KStreams pipelines and deployed KStreams clusters. Experience in developing KSQL queries and understanding the best practices of using KSQL vs KStreams is essential. Strong knowledge of the Kafka Connect framework is necessary, including experience with various connector types such as HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, and Salesforce, and the ability to support wire-format translations. Familiarity with connectors available from Confluent and the community, as well as hands-on experience in designing, writing, and operationalizing new Kafka Connectors using the framework, is a plus. Knowledge of Schema Registry is also beneficial. Nice-to-have qualities include providing thought leadership for the team, excellent verbal and written communication skills, being a good team player, and willingness to go the extra mile to support the team. In terms of educational qualifications, a four-year college degree in Science, Engineering, Technology, Business, or Humanities is required. Candidates with a Master's degree and/or certifications in the relevant technologies are preferred. The working mode for this position is hybrid, full-time (3 days working from the office), and the notice period is a maximum of 30 days.
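
To make the KStreams work described above concrete, here is a minimal sketch of a Kafka Streams topology in Java that consumes from an input topic, filters and transforms records, and writes to an output topic. The topic names, application id, and the uppercase transformation are illustrative assumptions, not details from the posting.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaymentsTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payments-enricher");   // consumer group / state store prefix
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("payments.raw");          // input topic (assumed name)
        raw.filter((key, value) -> value != null && !value.isBlank())          // drop empty records
           .mapValues(value -> value.toUpperCase())                            // stand-in for real enrichment logic
           .to("payments.enriched");                                           // output topic (assumed name)

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```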

Posted 2 days ago

Apply

12.0 - 14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description Global Technology Partners is a premier partner for digital transformation, with a diverse team of software engineering experts in the US and India. They combine strategic thinking, innovative design, and robust engineering to deliver exceptional results for their clients. Job Summary We are seeking a highly experienced and visionary Principal/Lead Java Architect to play a pivotal role in designing and evolving our next-generation, high-performance, and scalable event-driven platforms. This role demands deep expertise in Java, extensive experience with Kafka as a core component of event streaming architectures, and a proven track record of leading architectural design and implementation across complex enterprise systems. You will be instrumental in defining technical strategy, establishing best practices, and mentoring engineering teams to deliver robust and resilient solutions. Key Responsibilities: Architectural Leadership: Lead the design, development, and evolution of highly scalable, resilient, and performant event-driven architectures using Java and Kafka. Define architectural patterns, principles, and standards for event sourcing, CQRS, stream processing, and microservices integration with Kafka. Drive technical vision and strategy for our core platforms, ensuring alignment with business objectives and long-term technology roadmap. Conduct architectural reviews, identify technical debt, and propose solutions for continuous improvement. Stay abreast of emerging technologies and industry trends, evaluating their applicability and recommending adoption where appropriate. Design & Development: Design and implement robust, high-throughput Kafka topics, consumers, producers, and streams (Kafka Streams/KSQL). Architect and design Java-based microservices that effectively integrate with Kafka for event communication and data synchronization. Lead the selection and integration of appropriate technologies and frameworks for event processing, data serialization, and API development. Develop proof-of-concepts (POCs) and prototypes to validate architectural choices and demonstrate technical feasibility. Contribute hands-on to critical path development when necessary, demonstrating coding excellence and leading by example. Kafka Ecosystem Expertise: Deep understanding of Kafka internals, distributed systems concepts, and high-availability configurations. Experience with Kafka Connect for data integration, Schema Registry for data governance, and KSQL/Kafka Streams for real-time stream processing. Proficiency in monitoring, optimizing, and troubleshooting Kafka clusters and related applications. Knowledge of Kafka security best practices (authentication, authorization, encryption). Technical Governance & Mentorship: Establish and enforce architectural governance, ensuring adherence to design principles and coding standards. Mentor and guide engineering teams on best practices for event-driven architecture, Kafka usage, and Java development. Foster a culture of technical excellence, collaboration, and continuous learning within the engineering organization. Communicate complex technical concepts effectively to both technical and non-technical stakeholders. Performance, Scalability & Reliability: Design for high availability, fault tolerance, and disaster recovery. Define and implement strategies for performance optimization, monitoring, and alerting across the event-driven ecosystem. Ensure solutions are scalable to handle significant data volumes and transaction rates. 
Required Skills & Experience: 12+ years of progressive experience in software development, with at least 5+ years in an Architect role designing and implementing large-scale enterprise solutions. Expert-level proficiency in Java (Java 8+, Spring Boot, Spring Framework). Deep and extensive experience with Apache Kafka: Designing and implementing Kafka topics, producers, and consumers. Hands-on experience with Kafka Streams API or KSQL for real-time stream processing. Familiarity with Kafka Connect, Schema Registry, and Avro/Protobuf. Understanding of Kafka cluster operations, tuning, and monitoring. Strong understanding and practical experience with Event-Driven Architecture (EDA) principles and patterns: Event Sourcing, CQRS, Saga, Choreography vs. Orchestration. Extensive experience with Microservices architecture principles and patterns. Proficiency in designing RESTful APIs and asynchronous communication mechanisms. Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra). Solid understanding of cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes). Experience with CI/CD pipelines (e.g., Jenkins, GitLab CI, Azure DevOps). Strong problem-solving skills, analytical thinking, and attention to detail. Excellent communication, presentation, and interpersonal skills.
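
As a small illustration of the event-driven building blocks this architect role covers, the sketch below shows an idempotent Kafka producer publishing a domain event from Java. The broker address, topic name, key, and payload are assumptions made for the example.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);   // avoid duplicates on broker-side retries
        props.put(ProducerConfig.ACKS_CONFIG, "all");                // wait for all in-sync replicas

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> event =
                new ProducerRecord<>("orders.events", "order-42", "{\"type\":\"ORDER_CREATED\"}");
            producer.send(event, (metadata, ex) -> {
                if (ex != null) {
                    ex.printStackTrace();                            // a real service would log and alert here
                } else {
                    System.out.printf("published to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```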

Posted 2 days ago

Apply

10.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary

Strategy: Develop the strategic direction and roadmap for SCPAY, aligning with Business Strategy, ITO Strategy and investment priorities. Tap into the latest industry trends and innovative products & solutions to deliver effective and faster product capabilities. Support CASH Management Operations by leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls.

Business: Work hand in hand with the Payments Business, taking product programs from investment decisions into design, specifications, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other SCB teams. Ensure delivery to business meeting time, cost and high quality constraints. Support respective businesses in growing return on investment, commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations and addressing defects & continuous improvement of systems. Foster an ecosystem of innovation and enable business through technology.

Processes: Responsible for the end-to-end deliveries of the technology portfolio comprising key business product areas such as Payments & Clearing. Own technology delivery of projects and programs across global SCB markets that develop/enhance core product capabilities, ensure compliance with regulatory mandates, support operational improvements, process efficiencies and the zero-touch agenda, and build a payments platform aligned with the latest technology & architecture trends, improved stability and scale. Interface with business & technology leaders of other SCB systems for collaborative delivery.

Key Responsibilities

People & Talent: Employ, engage and retain high quality talent to ensure the Payments Technology team is adequately staffed and skilled to deliver on business commitments. Lead through example and build appropriate culture and values. Set appropriate tone and expectations for the team and work in collaboration with risk and control partners. Bridge skill/capability gaps through learning and development. Ensure roles, job descriptions and expectations are clearly set and periodic feedback is provided to the entire team. Ensure the optimal blend and balance of in-house and vendor resources.

Risk Management: Be proactive in ensuring regular assurance that the Payments ITO Team is performing to acceptable risk levels and control standards. Act quickly and decisively when any risk and control weakness becomes apparent and ensure those are addressed within quick/prescribed timeframes and escalated through the relevant committees. Balance business delivery on time, quality and cost constraints with risks & controls to ensure that they do not materially threaten the Group's ability to remain within acceptable risk levels. Ensure business continuity and disaster recovery planning for the entire technology portfolio.

Governance: Promote an environment of compliance with internal control functions and the external regulatory framework.

Regulatory & Business Conduct: Display exemplary conduct and live by the Group's Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
Lead to achieve the outcomes set out in the Bank's Conduct Principles. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.

Key Stakeholders: Solution Architect SCPAY, SCPAY Programme Managers, Group Payments Product Development Heads, Group Cash Operations.

Qualifications: Minimum 10 years of experience in a development role; a couple of years of experience in a dev lead role is an added advantage, along with good knowledge of Java, Microservices and Spring Boot. Technical knowledge: Java/Spring Boot, Kafka Streams, REST, JSON, Netflix microservices suite (Zuul/Eureka/Hystrix, etc.), 12-Factor Apps, Oracle, PostgreSQL, Hazelcast & ELK. Ability to work with geographically dispersed and highly varied stakeholders. Very good communication and interpersonal skills to manage senior stakeholders and top management. Knowledge of JIRA and Confluence tools is desired.

Skills and Experience: Java/Spring Boot; Kafka Streams, REST, JSON; design principles; Hazelcast & ELK; Oracle & Postgres.

About Standard Chartered: We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do; never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well; are better together, so we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer: In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 2 days ago

Apply

10.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Responsibilities

Strategy: Develop the strategic direction and roadmap for SCPAY, aligning with Business Strategy, ITO Strategy and investment priorities. Tap into the latest industry trends and innovative products & solutions to deliver effective and faster product capabilities. Support CASH Management Operations by leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls.

Business: Work hand in hand with the Payments Business, taking product programs from investment decisions into design, specifications, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other SCB teams. Ensure delivery to business meeting time, cost and high quality constraints. Support respective businesses in growing return on investment, commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations and addressing defects & continuous improvement of systems. Foster an ecosystem of innovation and enable business through technology.

Processes: Responsible for the end-to-end deliveries of the technology portfolio comprising key business product areas such as Payments & Clearing. Own technology delivery of projects and programs across global SCB markets that develop/enhance core product capabilities, ensure compliance with regulatory mandates, support operational improvements, process efficiencies and the zero-touch agenda, and build a payments platform aligned with the latest technology & architecture trends, improved stability and scale. Interface with business & technology leaders of other SCB systems for collaborative delivery.

People & Talent: Employ, engage and retain high quality talent to ensure the Payments Technology team is adequately staffed and skilled to deliver on business commitments. Lead through example and build appropriate culture and values. Set appropriate tone and expectations for the team and work in collaboration with risk and control partners. Bridge skill/capability gaps through learning and development. Ensure roles, job descriptions and expectations are clearly set and periodic feedback is provided to the entire team. Ensure the optimal blend and balance of in-house and vendor resources.

Risk Management: Be proactive in ensuring regular assurance that the Payments ITO Team is performing to acceptable risk levels and control standards. Act quickly and decisively when any risk and control weakness becomes apparent and ensure those are addressed within quick/prescribed timeframes and escalated through the relevant committees. Balance business delivery on time, quality and cost constraints with risks & controls to ensure that they do not materially threaten the Group's ability to remain within acceptable risk levels. Ensure business continuity and disaster recovery planning for the entire technology portfolio.

Governance: Promote an environment of compliance with internal control functions and the external regulatory framework.

Regulatory & Business Conduct: Display exemplary conduct and live by the Group's Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
Lead to achieve the outcomes set out in the Bank's Conduct Principles. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.

Key Stakeholders: Solution Architect SCPAY, SCPAY Programme Managers, Group Payments Product Development Heads, Group Cash Operations.

Qualifications: Minimum 10 years of experience in a development role, with a payments background, stakeholder management experience, and Java, Microservices and Spring Boot. Technical knowledge: Java/Spring Boot, Kafka Streams, REST, JSON, Netflix microservices suite (Zuul/Eureka/Hystrix, etc.), 12-Factor Apps, Oracle, PostgreSQL, Cassandra & ELK. Ability to work with geographically dispersed and highly varied stakeholders. Very good communication and interpersonal skills to manage senior stakeholders and top management. Knowledge of JIRA and Confluence tools is desired.

Role-Specific Technical Competencies: Java/Spring Boot; Kafka Streams, REST, JSON; design principles; Hazelcast & ELK; Oracle & Postgres.

About Standard Chartered: We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do; never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well; are better together, so we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer: In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.

Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 2 days ago

Apply

6.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary

Strategy: Develop the strategic direction and roadmap for SCPAY, aligning with Business Strategy, ITO Strategy and investment priorities. Tap into the latest industry trends and innovative products & solutions to deliver effective and faster product capabilities. Support CASH Management Operations by leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls.

Key Stakeholders: Solution Architect SCPAY, SCPAY Programme Managers, Group Payments Product Development Heads, Group Cash Operations.

Qualifications: Minimum 6 years of experience in an automation testing role. Specialisation: Payments/Collections/Messaging. Automation: Selenium, Java, UFT. Ability to work with geographically dispersed and highly varied stakeholders. Very good communication and interpersonal skills to manage senior stakeholders and top management. Knowledge of JIRA and Confluence tools is desired.

Role-Specific Technical Competencies: Java/Spring Boot; Kafka Streams, REST, JSON; design principles; Hazelcast & ELK; Oracle & Postgres.

About Standard Chartered: We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together we: do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do; never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well; are better together, so we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer: In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 2 days ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description: Standing up and administering an on-premises Kafka cluster. Ability to architect and create reference architecture for Kafka implementation standards. Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy and Kafka Control Center. Ensure optimum performance, high availability and stability of solutions. Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices. Create stubs for producers, consumers and consumer groups to help onboard applications from different languages/platforms. Provide administration and operations of the Kafka platform, such as provisioning, access lists, Kerberos and SSL configurations. Use automation tools for provisioning, such as Docker, Jenkins and GitLab. Ability to perform data-related benchmarking, performance analysis and tuning. Strong skills in in-memory applications, database design, and data integration. Participate in design and capacity review meetings to provide suggestions on Kafka usage. Solid knowledge of monitoring tools and fine-tuning alerts on Splunk, Prometheus and Grafana. Set up security on Kafka. Provide naming conventions, backup & recovery and problem determination strategies for the projects. Monitor, prevent and troubleshoot security-related issues. Provide strategic vision in engineering solutions that touch the messaging queue aspect of the infrastructure. QUALIFICATIONS: Demonstrated proficiency and experience in design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure. Hands-on experience with recovery in Kafka. 2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities. Good scripting knowledge/experience with one or more tools (e.g., Chef, Ansible, Terraform). Good programming knowledge/experience with one or more languages (e.g., Java, Node.js, Python). Considerable experience in implementing Kerberos security. Support a 24x7 model and be available for rotational on-call work. Competent working in one or more environments highly integrated with an operating system. Experience implementing and administering/managing technical solutions in major, large-scale system implementations. High critical thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy. Ability to manage tasks independently and take ownership of responsibilities. Ability to learn from mistakes and apply constructive feedback to improve performance. Ability to adapt to a rapidly changing environment. Proven leadership abilities including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness and displaying appropriate levels of assertiveness. Ability to communicate highly complex technical information clearly and articulately for all levels and audiences. Willingness to learn new technologies/tools and train your peers. Proven track record of automation.
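
For a flavour of the provisioning work described above, the following hedged sketch uses the Kafka AdminClient to create a topic with explicit partition, replication, and retention settings. The broker address, topic name, and chosen values are assumptions for illustration only.

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class TopicProvisioner {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            NewTopic topic = new NewTopic("app.audit.events", 12, (short) 3)   // 12 partitions, replication factor 3
                .configs(Map.of(
                    TopicConfig.RETENTION_MS_CONFIG, "604800000",              // retain data for 7 days
                    TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));
            admin.createTopics(List.of(topic)).all().get();                    // block until the brokers confirm
            System.out.println("created: " + topic.name());
        }
    }
}
```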

Posted 6 days ago

Apply

5.0 - 8.0 years

18 - 20 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer (Java & Kafka) Location: Bangalore, India Job Type: Full-Time Experience: 5+ Years About the Role: We are looking for a highly skilled Data Engineer with solid experience in Java and Apache Kafka to join our growing data team in Bangalore. As a key member of our engineering team, you will be responsible for building and optimizing our data pipeline architecture, as well as developing and maintaining data systems to enable scalable, real-time data processing. Key Responsibilities: Design, develop, and maintain scalable, high-performance data pipelines and streaming systems using Java and Apache Kafka. Build and manage reliable data ingestion processes from diverse data sources. Work closely with Data Scientists, Analysts, and other Engineers to integrate and optimize data workflows. Implement data quality, monitoring, and alerting solutions. Ensure robust data governance, security, and compliance standards. Optimize data systems for performance, scalability, and cost efficiency. Participate in code reviews and architecture discussions, and contribute to best practices. Required Skills & Experience: Minimum 5 years of experience in Data Engineering or related backend roles. Strong programming experience with Java (mandatory). Expertise in working with Apache Kafka for real-time streaming and event-driven architectures. Proficiency in building ETL/ELT pipelines and handling large volumes of data. Experience with data storage systems such as HDFS, Hive, HBase, Cassandra, or PostgreSQL. Familiarity with cloud platforms (AWS/GCP/Azure) is a plus. Good understanding of distributed systems, data partitioning, and scalability challenges. Strong problem-solving skills and ability to work independently and in a team. Preferred Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. Experience with other streaming technologies like Apache Flink, Spark Streaming, or Kafka Streams is a plus. Exposure to containerization tools like Docker and orchestration platforms like Kubernetes.
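
As a rough sketch of the ingestion side of such a pipeline, the example below shows a plain Java Kafka consumer that polls a topic and commits offsets only after processing, giving at-least-once semantics. The topic and group names are assumptions, not details from the posting.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ClickstreamIngestor {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "clickstream-ingestor");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");          // commit only after processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("clickstream.raw"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // stand-in for writing to the downstream store or pipeline
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
                consumer.commitSync();                                          // at-least-once delivery
            }
        }
    }
}
```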

Posted 1 week ago

Apply

8.0 - 13.0 years

16 - 22 Lacs

Hyderabad

Work from Office

Looking for a Data Engineer with 8+ years of experience to build scalable data pipelines on AWS/Azure, work with Big Data tools (Spark, Kafka), and support analytics teams. Must have strong coding skills in Python/Java and experience with SQL/NoSQL and cloud platforms. Required candidate profile: strong experience in Java/Scala/Python; worked with big data technologies such as Spark, Kafka, Flink, etc.; built real-time and batch data pipelines; cloud: AWS, Azure, or GCP.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Backend Developer, you will be responsible for building, testing, and deploying software using standard CI/CD pipelines. Your role will involve building microservices to process data, interact with databases, expose data to other applications, and more. You will work with architectural patterns such as event-based data streaming, request-response web services, and file transport jobs based on specific context requirements. Ensuring the creation of relevant logs, sharing them with the central logging platform, and setting up necessary alerts will be part of your responsibilities. Testing software for functionality, quality, fault-tolerance, performance, and scalability will be crucial. You will integrate security features like federated authentication, role-based access control, and similar mechanisms into the solution. Collaborating within a guild for backend developers to share knowledge, technical patterns, and best practices across product teams will be encouraged. The tech stack includes Spring Boot, Spring Boot JPA, Spring Boot Actuator, PostgreSQL, Kafka, Keycloak, an observability platform, Maven, log4j2, Kafka Streams, JUnit, Kubernetes, and Azure. To excel in this role, you should be service-minded, customer-driven, and possess effective communication skills in English. Strong organizational, interpersonal, time management, and communication abilities are essential. Working well in a team environment to meet strict deadlines and comply with criteria defined by various teams is necessary. You should also thrive under pressure and be proficient in multitasking.
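
Given the Spring Boot and Kafka stack listed above, a typical microservice component might look like the hedged sketch below: a Spring Kafka listener that consumes events from a topic. The topic name and group id are assumptions; broker settings would normally come from the application's spring.kafka.* properties.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderEventsListener {

    private static final Logger log = LoggerFactory.getLogger(OrderEventsListener.class);

    // Consumes string payloads from an assumed topic; real processing would
    // deserialize the event and persist it via JPA, with error handling / DLT wiring.
    @KafkaListener(topics = "orders.events", groupId = "order-backend")
    public void onOrderEvent(String payload) {
        log.info("received order event: {}", payload);
    }
}
```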

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

As a backend engineer at Latinum, you will play a crucial role in designing and developing robust and scalable systems to address complex business challenges. You will be part of a dynamic team of high-performance engineers focused on creating efficient and high-throughput solutions. To excel in this role, you must have at least 7 years of hands-on experience in backend engineering. You should demonstrate proficiency in Core Java backend engineering and Microservices and Cloud architecture. Candidates with expertise in both areas will be considered for senior positions within the team. In the realm of Java & Backend Engineering, you should be well-versed in Java 8+ concepts such as Streams, Lambdas, Functional Interfaces, and Optionals. Additionally, you should have experience with Spring Core, Spring Boot, object-oriented principles, multithreading, and collections. Knowledge of Kafka, JPA, RDBMS/NoSQL, and design patterns is essential for this role. In the Microservices, Cloud & Distributed Systems domain, familiarity with REST APIs, OpenAPI/Swagger, Spring Boot, and Kafka Streams is required. Experience with event-driven patterns, GraphQL, cloud-native applications on AWS, CI/CD pipelines, and observability tools like ELK and Prometheus will be beneficial. Moreover, additional skills in Node.js, React, Angular, Golang, Python, and web platforms like AEM and Sitecore are considered advantageous. Proficiency in TDD, mocking, security testing, and architecture artifacts is a plus. Your key responsibilities will include designing and developing scalable backend systems using Java and Spring Boot, building event-driven microservices and cloud-native APIs, and implementing secure and high-performance solutions. You will collaborate with cross-functional teams to define architecture, conduct code reviews, and ensure production readiness. Additionally, troubleshooting, optimizing, and monitoring distributed systems and mentoring junior engineers will be part of your role, especially for senior positions. Join Latinum's team of dedicated engineers and be part of a challenging and rewarding environment where you can contribute to cutting-edge solutions and innovative technologies.
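
Since the role calls out Java 8+ features (Streams, lambdas, Optionals), here is a small self-contained illustration that combines them to pick the highest-value active order. The Order record and sample data are invented for the example.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public class Java8FeaturesDemo {
    // A compact record (Java 16+) standing in for a domain entity.
    record Order(String id, double amount, boolean active) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
            new Order("A-1", 120.0, true),
            new Order("A-2", 340.5, false),
            new Order("A-3", 210.0, true));

        Optional<Order> largestActive = orders.stream()
            .filter(Order::active)                                   // lambda via method reference
            .max(Comparator.comparingDouble(Order::amount));         // terminal operation returning Optional

        largestActive.ifPresentOrElse(
            o -> System.out.println("largest active order: " + o.id()),
            () -> System.out.println("no active orders"));
    }
}
```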

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune

Work from Office

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.

Do:

1. Be instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage and work processes, and investigating problem areas, following the software development life cycle. Facilitate root cause analysis of system issues and the problem statement. Identify ideas to improve system performance and impact availability. Analyze client requirements and convert requirements to feasible design. Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements. Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development. Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software. Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases. Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. Analyze information to recommend and plan the installation of new systems or modifications of an existing system. Ensure that code is error free, with no bugs or test failures. Prepare reports on programming project specifications, activities and status. Ensure all the codes are raised as per the norms defined for the project/program/account with clear descriptions and replication patterns. Compile timely, comprehensive and accurate documentation and reports as requested. Coordinate with the team on daily project status and progress and document it. Provide feedback on usability and serviceability, trace the results to quality risk and report them to concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution. Capture all the requirements and clarifications from the client for better quality work. Take feedback on a regular basis to ensure smooth and on-time delivery. Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements. Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code. Document all necessary details and reports in a formal way for proper understanding of the software from client proposal to implementation. Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Mandatory Skills: Kafka Integration. Experience: 3-5 years.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Chennai

Work from Office

Role Purpose: The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.

Do:

1. Be instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage and work processes, and investigating problem areas, following the software development life cycle. Facilitate root cause analysis of system issues and the problem statement. Identify ideas to improve system performance and impact availability. Analyze client requirements and convert requirements to feasible design. Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements. Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development. Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software. Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases. Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces. Analyze information to recommend and plan the installation of new systems or modifications of an existing system. Ensure that code is error free, with no bugs or test failures. Prepare reports on programming project specifications, activities and status. Ensure all the codes are raised as per the norms defined for the project/program/account with clear descriptions and replication patterns. Compile timely, comprehensive and accurate documentation and reports as requested. Coordinate with the team on daily project status and progress and document it. Provide feedback on usability and serviceability, trace the results to quality risk and report them to concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution. Capture all the requirements and clarifications from the client for better quality work. Take feedback on a regular basis to ensure smooth and on-time delivery. Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members. Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements. Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code. Document all necessary details and reports in a formal way for proper understanding of the software from client proposal to implementation. Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette etc. Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Mandatory Skills: Kafka Integration. Experience: 3-5 years.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

3 - 12 Lacs

Mumbai, Maharashtra, India

On-site

4+ years of experience developing medium to large Java applications. Experience working with Git. Experience working in a CI/CD environment. Experience in streaming data applications (Kafka). Experience with Docker/Kubernetes and development of containerized applications. Experience working in an Agile development methodology. Experience with project management tools: Rally, JIRA, Confluence, Bitbucket. Excellent communication skills, verbal & written. Self-motivated, passionate, well-organized individual with demonstrated problem-solving skills. Experience in building distributed Machine Learning systems.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Role Overview: Seeking a highly skilled Kafka Developer with deep expertise in Kafka KSQL/Streams-based transformations and hands-on experience working with Oracle, MS SQL, and PostgreSQL databases, especially in environments with non-standard schemas (e.g., tables without primary keys, with/without constraints, and foreign keys). The ideal candidate will be responsible for designing and implementing scalable, real-time data pipelines and ensuring robust CDC (Change Data Capture) mechanisms for incremental data replication. Key Responsibilities: • Design and implement Kafka Streams and KSQL-based transformations for real-time data processing. • Develop and maintain CDC pipelines using Kafka Connect, Debezium, or custom connectors. • Handle initial data loads from large relational datasets via manual exports and ensure a seamless transition to incremental CDC. • Work with complex relational schemas, including tables without primary keys, tables with/without constraints, and foreign key relationships. • Optimize data ingestion and transformation pipelines for performance, reliability, and scalability. • Collaborate with data architects, DBAs, and application teams to ensure data integrity and consistency. • Document technical designs, data flow diagrams, and operational procedures. • Communicate effectively with cross-functional teams and stakeholders. Required Skills and Experience: • Strong hands-on experience with Apache Kafka, Kafka Streams, and KSQL. • Proficiency in Kafka Connect and CDC tools (Debezium, Confluent). • Deep understanding of Oracle, MS SQL Server, and PostgreSQL internals and schema design. • Experience handling non-standard table structures and resolving challenges in CDC replication. • Familiarity with manual data export/import strategies and their integration into streaming pipelines. • Strong knowledge of data serialization formats (Avro, JSON, Protobuf). • Proficient in Java or Scala for custom Kafka development. • Excellent communication skills, both written and verbal. Preferred Qualifications: • Experience with schema registry, data governance, and data quality frameworks. • Familiarity with CI/CD pipelines, GitOps, and containerized deployments (Docker, Kubernetes). • Prior experience in data architecture or data platform engineering roles.
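
To illustrate the CDC pipeline work described above, the hedged sketch below registers a Debezium PostgreSQL source connector through the Kafka Connect REST API from Java. The Connect URL, connector name, and database settings are assumptions; a production Debezium configuration needs additional options (replication slot, credential management, transforms, and so on).

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterCdcConnector {
    public static void main(String[] args) throws Exception {
        // Connector definition posted to Kafka Connect; values are placeholders.
        String body = """
            {
              "name": "orders-postgres-cdc",
              "config": {
                "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
                "database.hostname": "db.internal",
                "database.port": "5432",
                "database.user": "cdc_user",
                "database.password": "changeit",
                "database.dbname": "orders",
                "topic.prefix": "cdc",
                "table.include.list": "public.orders"
              }
            }""";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://connect.internal:8083/connectors"))   // Connect REST endpoint (assumed host)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response =
            HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```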

Posted 2 weeks ago

Apply

3.0 - 8.0 years

18 - 22 Lacs

Navi Mumbai, Mumbai (All Areas)

Work from Office

1. Education: B.E./B.Tech/MCA in Computer Science. 2. Experience: Must have 7+ years of relevant experience in the field of database administration. 3. Mandatory Skills/Knowledge: The candidate should be technically sound in multiple distributions such as Cloudera, Confluent, and open-source Kafka, as well as in Kafka and ZooKeeper. The candidate should be well versed in capacity planning and performance tuning, and have expertise in implementing security in the ecosystem (Hadoop security: Ranger, Kerberos, SSL). The candidate should have expertise in DevOps tools such as Ansible, Nagios, shell scripting, Python, Jenkins, Git, and Maven to implement automation, and should be able to monitor, debug and perform RCA for any service failure. Knowledge of network infrastructure (e.g., TCP/IP, DNS, firewall, router, load balancer). Creative analytical and problem-solving skills. Provide RCAs for critical & recurring incidents. Provide on-call service coverage within a larger group. Good aptitude in multi-threading and concurrency concepts. 4. Preferred Skills/Knowledge: Expert knowledge of database administration and architecture. Hands-on with operating system commands. Kindly share CVs at snehal.sankade@outworx.com

Posted 2 weeks ago

Apply

13.0 - 20.0 years

30 - 45 Lacs

Pune

Hybrid

Hi, Wishes from GSN!!! Pleasure connecting with you!!! We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT / non-IT clients in India, and have been successfully delivering results for our clients' varied needs for the last 20 years. At present, GSN is hiring a DATA ENGINEERING - Solution Architect for one of our leading MNC clients. PFB the details for your better understanding: 1. WORK LOCATION: PUNE 2. Job Role: DATA ENGINEERING - Solution Architect 3. EXPERIENCE: 13+ yrs 4. CTC Range: Rs. 35 LPA to Rs. 50 LPA 5. Work Type: WFO Hybrid ****** Looking for SHORT JOINERS ****** Job Description: Who are we looking for: Architectural Vision & Strategy: Define and articulate the technical vision, strategy and roadmap for Big Data, data streaming, and NoSQL solutions, aligning with overall enterprise architecture and business goals. Required Skills: 13+ years of progressive experience in software development, data engineering and solution architecture roles, with a strong focus on large-scale distributed systems. Expertise in Big Data technologies: Apache Spark: deep expertise in Spark architecture, Spark SQL, Spark Streaming, performance tuning, and optimization techniques; experience with data processing paradigms (batch and real-time). Hadoop ecosystem: strong understanding of HDFS, YARN, Hive and other related Hadoop components. Real-time data streaming: Apache Kafka: expert-level knowledge of Kafka architecture, topics, partitions, producers, consumers, Kafka Streams, KSQL, and best practices for high-throughput, low-latency data pipelines. NoSQL databases: in-depth experience with Couchbase (or MongoDB or Cassandra), including data modeling, indexing, querying (N1QL), replication, scaling, and operational best practices. API design & development: extensive experience in designing and implementing robust, scalable and secure APIs (RESTful, GraphQL) for data access and integration. Programming & code review: hands-on coding proficiency in at least one relevant language (Python, Scala, Java), with a preference for Python and/or Scala for data engineering tasks; proven experience in leading and performing code reviews, ensuring code quality, performance, and adherence to architectural guidelines. Cloud platforms: extensive experience in designing and implementing solutions on at least one major cloud platform (AWS, Azure, GCP), leveraging their Big Data, streaming, and compute services. Database fundamentals: solid understanding of relational database concepts, SQL, and data warehousing principles. System design & architecture patterns: deep knowledge of various architectural patterns (e.g., Microservices, Event-Driven Architecture, Lambda/Kappa Architecture, Data Mesh) and their application in data solutions. DevOps & CI/CD: familiarity with DevOps principles, CI/CD pipelines, infrastructure as code (IaC) and automated deployment strategies for data platforms. ****** Looking for SHORT JOINERS ****** Interested? Don't hesitate to call NAK @ 9840035825 / 9244912300 for an IMMEDIATE response. Best, ANANTH | GSN | Google review: https://g.co/kgs/UAsF9W

Posted 2 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do: Oversee and support the process by reviewing daily transactions on performance parameters. Review the performance dashboard and the scores for the team. Support the team in improving performance parameters by providing technical support and process guidance. Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions. Ensure standard processes and procedures are followed to resolve all client queries. Resolve client queries as per the SLAs defined in the contract. Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting. Document and analyze call logs to spot the most frequently occurring trends to prevent future problems. Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution. Ensure all product information and disclosures are given to clients before and after the call/email requests. Avoid legal challenges by monitoring compliance with service agreements. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Kafka Integration. Experience: 5-8 Years.

Posted 2 weeks ago

Apply

3.0 - 10.0 years

18 - 22 Lacs

Hyderabad

Work from Office

WHAT YOU'LL DO Lead the development of scalable data infrastructure solutions. Leverage your data engineering expertise to support data stakeholders and mentor less experienced Data Engineers. Design and optimize new and existing data pipelines. Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs. WHAT YOU'LL NEED 8 to 10 years of hands-on experience building and maintaining large-scale ETL systems. Deep understanding of database design and data structures (SQL & NoSQL). Fluency in Python. Experience working with cloud-based data platforms (we use AWS). SQL and data warehousing skills: able to write clean and efficient queries. Ability to make pragmatic engineering decisions in a short amount of time. Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results. WHAT WILL MAKE YOU STAND OUT Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, & Looker. Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies. Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB. Comfort with containerization technologies: Docker, Kubernetes, etc.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

12 - 16 Lacs

Hyderabad

Work from Office

WHAT YOU'LL DO
- Build scalable data infrastructure solutions
- Design and optimize new and existing data pipelines
- Integrate new data sources into our existing data architecture
- Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs

WHAT YOU'LL NEED
- 3 to 5 years of hands-on experience building and maintaining large-scale ETL systems
- Deep understanding of database design and data structures (SQL and NoSQL)
- Fluency in Python
- Experience working with cloud-based data platforms (we use AWS)
- SQL and data warehousing skills -- able to write clean and efficient queries
- Ability to make pragmatic engineering decisions in a short amount of time
- Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results

WHAT WILL MAKE YOU STAND OUT
- Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, and Looker
- Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies
- Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB
- Comfort with containerization technologies: Docker, Kubernetes, etc.
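For the "integrate new data sources" responsibility above, teams often register a source connector through Kafka Connect's REST API instead of writing a bespoke pipeline. The sketch below posts a JDBC source connector config with Python's requests library; the Connect URL, database details, and config keys are illustrative and should be checked against the specific connector's documentation.

```python
import requests

CONNECT_URL = "http://connect:8083"  # placeholder Kafka Connect REST endpoint

# Illustrative JDBC source config: copies new rows (by incrementing id) into Kafka topics.
connector = {
    "name": "orders-db-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db:5432/shop",
        "connection.user": "etl",
        "connection.password": "********",
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "topic.prefix": "shop.",
        "tasks.max": "1",
    },
}

# Register the connector, then poll its status to confirm the task actually started.
resp = requests.post(f"{CONNECT_URL}/connectors", json=connector, timeout=30)
resp.raise_for_status()

status = requests.get(f"{CONNECT_URL}/connectors/{connector['name']}/status", timeout=30)
print(status.json())
```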

Posted 3 weeks ago

Apply

5.0 - 9.0 years

10 - 17 Lacs

Bengaluru

Work from Office

Role name: Kafka Platform Engineer
Years of experience: 5+ years of relevant skill

Detailed JD:
We are seeking a highly skilled and motivated Kafka Platform Engineer to join our team. As a Kafka Platform Engineer, you will be responsible for operating and managing our Kafka cluster, ensuring its scalability, reliability, and security. You will collaborate with cross-functional teams to design, implement, and optimize Kafka solutions that meet the needs of our business. This is a key role in modernizing our application infrastructure and adopting industry best practices.

Primary Skills:
- Strong expertise in operating and administering Kafka clusters
- Experience in performance tuning and troubleshooting of middleware technologies, applying them to infrastructure
- Proficiency in shell scripting and/or Python, with specific experience in administering Kafka
- Experience with Java application servers on cloud platforms is a significant advantage
- Provide operational support for the Kafka cluster, ensuring high availability and stability 24/7 (on-call support)
- Utilize infrastructure as code (IaC) principles to provision and manage Kafka infrastructure

Work Location: Bangalore (no remote access; need to operate from the base location)
Client Interview / F2F Applicable: Yes
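Much of day-to-day Kafka cluster administration is spotting unhealthy partitions before clients do. The sketch below is one way to do that from Python with the confluent-kafka AdminClient, flagging under-replicated or leaderless partitions; the broker address is a placeholder and the checks are deliberately simple.

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "broker:9092"})  # placeholder brokers
md = admin.list_topics(timeout=10)

print(f"cluster {md.cluster_id}: {len(md.brokers)} brokers, {len(md.topics)} topics")

for name, topic in sorted(md.topics.items()):
    if topic.error is not None:
        print(f"{name}: metadata error {topic.error}")
        continue
    # A partition is under-replicated when its in-sync replica set is smaller
    # than its assigned replica set; leader == -1 means it currently has no leader.
    under_replicated = [p.id for p in topic.partitions.values()
                        if len(p.isrs) < len(p.replicas)]
    leaderless = [p.id for p in topic.partitions.values() if p.leader == -1]
    if under_replicated or leaderless:
        print(f"{name}: under-replicated={under_replicated} leaderless={leaderless}")
```

Wired into cron or a monitoring agent, a check like this gives the 24/7 on-call rotation an early signal that a broker has dropped out of the ISR.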

Posted 3 weeks ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Bengaluru

Work from Office

We are seeking an experienced Tech Lead with a passion for crafting scalable backend services and intuitive front-end experiences. This is an exciting opportunity to contribute to the design and development of complex, high-performance enterprise applications, particularly within our loyalty platform ecosystem. You will work in a collaborative Agile environment, take ownership of technical components, and mentor junior engineers.

How You'll Make an Impact:
- Agile Development: Actively participate in all phases of Agile development, including planning, backlog grooming, coding, testing, and retrospectives
- End-to-End Ownership: Own the development and integration of loyalty platform components, including REST APIs, batch jobs, and message queues
- Domain Expert: Serve as a domain expert in at least one technology area, demonstrating leadership and ownership across feature development
- Cross-Functional Collaboration: Collaborate closely with Product Owners and QA engineers to understand and refine acceptance criteria and technical specifications
- Design & Architecture Leadership: Drive design and architecture discussions, contributing simple yet scalable solutions to complex business problems
- Documentation: Create and maintain detailed documentation for business logic, configuration settings, and integration points
- TDD & Testing: Develop unit and integration tests using TDD practices and frameworks like JUnit and Mockito
- Mentorship: Guide junior developers through code reviews, pair programming, and knowledge-sharing sessions
- Coding Best Practices: Promote coding best practices, clean architecture, and SOLID principles across the team
- Effort Estimation: Accurately estimate effort, flag risks early, and ensure timely delivery of features within scope
- Continuous Improvement: Proactively identify areas for improvement in code quality, performance, and DevOps practices
- Production Support: Support application deployment, monitoring, and issue resolution in production environments

What You Need to Be Successful:

Technical Skills:
- Backend Development: Proficient in Java (preferably JDK 17+), Spring Boot, and Spring Batch
- Frontend Development: Experience with Angular (version 7+), HTML5, CSS3, and TypeScript
- REST APIs: Strong experience in designing and developing RESTful APIs using JSON
- Cloud & Microservices: Hands-on experience building cloud-native microservices on AWS, Azure, or Oracle Cloud
- Database: Strong SQL skills, including experience with multi-table queries and query optimization using execution plans (preferably with Oracle or PostgreSQL)
- Messaging Queues: Familiarity with RabbitMQ, Kafka Streams, or ActiveMQ
- Containerization & Orchestration: Exposure to Docker, Kubernetes, and using kubectl for cluster configuration
- DevOps & CI/CD: Experience with Git/Bitbucket, Gradle, Bamboo, or similar CI/CD tools

Soft Skills:
- Excellent written and verbal communication skills
- Strong analytical and problem-solving capabilities
- Ability to work independently and collaboratively in a cross-functional team environment
- Mentorship mindset and willingness to support peer development
- A proactive attitude toward continuous learning and innovation

Preferred Experience:
- Prior experience in Loyalty, Banking, Accounting, or other transactional domains
- Working knowledge of monitoring tools, debugging distributed systems, and performance tuning

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Mumbai

Work from Office

Role Overview:
Seeking an experienced Apache Airflow specialist to design and manage data orchestration pipelines for batch/streaming workflows in a Cloudera environment.

Key Responsibilities:
* Design, schedule, and monitor DAGs for ETL/ELT pipelines
* Integrate Airflow with Cloudera services and external APIs
* Implement retries, alerts, logging, and failure recovery
* Collaborate with data engineers and DevOps teams

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
* Experience: 3-8 years
* Expertise in Airflow 2.x, Python, Bash
* Knowledge of CI/CD for Airflow DAGs
* Proven experience with Cloudera CDP, Spark/Hive-based data pipelines
* Integration with Kafka, REST APIs, databases
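To make the "retries, alerts, logging, and failure recovery" responsibility concrete, here is a minimal Airflow 2.x DAG sketch that chains Spark and Hive steps with retry and failure-alert settings. The task commands, schedule, and e-mail address are hypothetical placeholders, not details from the posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-eng",
    "retries": 3,                          # automatic failure recovery
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,              # alerting hook (requires SMTP configured)
    "email": ["data-alerts@example.com"],  # placeholder address
}

with DAG(
    dag_id="daily_orders_etl",
    description="Batch ETL on Cloudera: extract -> transform -> load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",         # nightly at 02:00
    catchup=False,
    default_args=default_args,
    tags=["etl", "cloudera"],
) as dag:
    extract = BashOperator(
        task_id="extract_orders",
        bash_command="spark-submit /opt/jobs/extract_orders.py",    # placeholder job
    )
    transform = BashOperator(
        task_id="transform_orders",
        bash_command="spark-submit /opt/jobs/transform_orders.py",  # placeholder job
    )
    load = BashOperator(
        task_id="load_orders",
        bash_command="beeline -f /opt/jobs/load_orders.hql",        # placeholder Hive load
    )

    extract >> transform >> load
```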

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Chennai

Hybrid

We are looking for someone with:
- Strong and demonstrable problem-solving ability
- Comfort with self-management and on-the-job learning
- Ability to share knowledge across team(s)
- Demonstrable initiative and logical thinking
- Passion for emerging technologies and self-development
- Strong computer science fundamentals
- A collaborative work ethic
- Strong problem-solving and analytical skills
- Excellent communication skills
- Knowledge of applying object-oriented and functional programming styles to real-world problems

Ideally (but not restrictively) you should have:
- Hands-on experience (5+ years) using Java and/or Scala
- Knowledge of continuous integration and continuous delivery
- Knowledge of microservice architecture
- Working experience with TDD & BDD
- Experience building REST APIs
- Experience working with Docker
- General knowledge of agile software development concepts and processes
- Proficient understanding of code versioning tools, such as Git
- Working experience with Jira and Confluence

Nice to haves:
- Special interest in functional programming
- Knowledge of the Reactive Manifesto
- Knowledge of streaming data
- Experience with Akka, Play Framework, or Lagom
- Experience working with Kafka
- Knowledge of NoSQL
- Cloud-based development with AWS, Microsoft Azure, Google Cloud, etc.
- Commercial exposure to the ELK stack

Posted 3 weeks ago

Apply