
293 Apache Kafka Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

15 - 25 Lacs

Bengaluru

Work from Office

We are looking for a Java Developer with 5+ years of hands-on experience in building robust, scalable backend systems. The ideal candidate will have strong expertise in Java 8, experience with messaging systems like Kafka or JMS, and a good command of writing unit tests using JUnit. The role requires someone who can work independently, deliver quality code, and contribute to system design and optimization.

**Key Responsibilities:**
- Design, develop, and maintain enterprise-grade applications using Java 8.
- Integrate and manage messaging systems using Apache Kafka or JMS.
- Write and maintain unit test cases using JUnit to ensure code quality and reliability.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Participate in code reviews, troubleshoot production issues, and optimize application performance.
- Follow best practices in software engineering, including Agile methodologies and continuous integration.

**Required Skills:**
- Strong proficiency in Java 8.
- Hands-on experience with Kafka or JMS (at least one is mandatory).
- Experience with JUnit for writing and executing unit tests.
- Solid understanding of OOP, multi-threading, and design patterns.
- Experience in RESTful API development.
- Familiarity with build tools like Maven or Gradle.
- Strong problem-solving and analytical skills.

**Good to Have:**
- Experience with the Spring Framework (Spring Boot).
- Exposure to microservices architecture.
- Familiarity with CI/CD tools like Jenkins and Git.
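For context on the Kafka skills this posting asks for, here is a minimal producer sketch using the standard kafka-clients API; the broker address, topic name, and payload are illustrative assumptions, not details from the listing.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address and topic below are assumptions for the sketch.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all"); // wait for full acknowledgement for durability

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
            // send() is asynchronous; the callback reports success or failure per record
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Sent to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```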

Posted 20 hours ago

Apply

2.0 - 4.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

2-4 years of experience in designing, developing, and implementing data streaming solutions using Apache Kafka (AWS MSK).

- Build and maintain reliable and scalable Kafka topics, producers, and consumers.
- Develop data integration pipelines using Kafka Connect to link Kafka with other systems (databases, applications, cloud services).
- Implement data transformation and processing logic using Kafka Streams.
- Develop microservices using Spring Boot and Kafka.
- Develop CI/CD pipelines for deploying Kafka topics and configurations.
- Monitor the health, performance, and scalability of Kafka clusters and streaming applications.
- Troubleshoot and resolve issues related to Kafka cluster operations, data flow, and application connectivity.
- Ensure the security and reliability of Kafka-based data pipelines.
- Collaborate with data architects, data engineers, developers, and operations teams to integrate Kafka into the overall data architecture.
- Apply domain knowledge to understand data challenges and requirements within the telecom sector.
- Contribute to best practices for Kafka development, deployment, and operations.
- Optimize Kafka configurations and infrastructure for cost-efficiency and performance.
- Excellent problem-solving and debugging skills; good communication and collaboration abilities.
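As an illustration of the Kafka Streams work described above, a minimal topology sketch follows; the application id, topic names, and the toy transformation are assumptions for the example, not requirements from the posting.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class EnrichmentTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-enricher"); // hypothetical id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("raw-events");
        source
            .filter((key, value) -> value != null && !value.isEmpty()) // drop empty records
            .mapValues(String::toUpperCase)                            // stand-in transformation
            .to("enriched-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on JVM shutdown
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```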

Posted 21 hours ago

Apply

4.0 - 10.0 years

0 Lacs

Maharashtra

On-site

As a Senior Azure Cloud Engineer at Pristine Retail Solutions, you will play a crucial role in developing and migrating end-to-end solutions that empower our customers to succeed in their markets. Your primary responsibilities will include collaborating with engineering teams, evaluating optimal cloud solutions, migrating applications to the Azure cloud platform, modifying existing systems, and educating teams on new cloud technologies. You will also design, develop, and deploy modular cloud-based systems, ensuring efficient data storage and processing functions in accordance with best practices in cloud security.

**Key Responsibilities:**
- Collaborate with engineering teams to evaluate and identify optimal cloud solutions.
- Migrate existing applications to the Azure cloud platform.
- Modify and improve existing systems.
- Educate teams on new cloud technologies and initiatives.
- Design, develop, and deploy modular cloud-based systems.
- Develop and maintain cloud solutions in accordance with best practices.
- Ensure efficient functioning of data storage and processing functions in line with company security policies.
- Identify, analyze, and resolve infrastructure vulnerabilities and application deployment issues.
- Review business requirements and propose solutions for improvement.
- Interact with clients, provide cloud support, and make recommendations based on client needs.

**Technical Expertise and Familiarity:**
- Cloud technologies: Azure, AWS, VMware, Google Cloud, Oracle Cloud.
- Real-time streaming: Apache Kafka.
- Java: Core Java, Java EE/J2EE, JSP, Servlets, JDBC, JMS, AJAX, JSON, XML, Apache Axis, JUnit.
- Microsoft: .NET Core, ASP.NET, C#, NUnit.
- Frameworks/technologies: Spring, Spring Boot, Spring Batch, Hibernate/MyBatis, JPA, MVC, microservices, web services, REST APIs, JavaScript, jQuery, CSS, ReactJS/AngularJS, testing frameworks, CI/CD pipelines, Jenkins, Docker.
- App servers: JBoss/WebLogic/WebSphere/Tomcat. Web servers: Apache/IIS.
- IDEs: Eclipse, STS, Android Studio, Visual Studio, RStudio, PyCharm.
- Databases: Oracle, MySQL, NoSQL databases (e.g., MongoDB), Azure databases, Big Data stores.
- O/S: Linux, Unix, Windows.
- Experience in Big Data technologies and advanced coursework in AI and Machine Learning is a plus.
- Azure, AWS, and GCP certifications are preferred.

In addition to technical skills, the following soft skills and abilities are crucial for success in this role:

**Soft Skills:**
- Adaptability
- Communication
- Teamwork
- Time Management
- Critical Thinking

**Abilities:**
- Architectural view: the ability to see the big picture and relationships.
- Ability to provide pragmatic solutions considering current realities.
- Excellent written and verbal communication skills.
- Strong presentation capabilities to technical and non-technical audiences.
- Adherence to standards and best practices.
- Ability to gain team respect and build consensus through knowledge sharing and empathy towards stakeholders.

**Key Experiences:**
- Data center migration to cloud infrastructure.
- Migrating applications in .NET, Java, R, and Python to the Azure cloud platform.
- Experience in real-time streaming technologies.
- Experience with CI/CD pipelines and Jenkins.
- Complete solution life cycle participation.
- Developing enterprise solutions using REST APIs and microservices.
- Deep knowledge of design and architecture.

Joining Pristine Retail Solutions will provide you with a start-up environment where you can shape the future of the industry. You will work with talented retail business practitioners, consumer behavior analysts, AI scientists, and engineers. You will also have opportunities for personal growth, with annual study time, competitive salaries, and generous equity. Apply now by sending your resume to career@pristineinfotech.com and describe why you are supremely qualified for this role.

Posted 2 days ago

Apply

3.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Role Overview:
Our engineering team at Adobe develops the Adobe Experience Platform, focusing on innovative data management and analytics. Developing a reliable and resilient system at large scale is crucial for us. We utilize Big Data and open-source technologies for Adobe's services, supporting large enterprise products across geographies. The data needs to be easily accessible at very low latency to cater to various scenarios and use cases. We are looking for candidates with deep expertise in building low-latency services at high scale to lead us in accomplishing our vision.

Key Responsibilities:
- Lead the technical design and implementation strategy for major systems and components of the Adobe Experience Platform
- Evaluate and drive the architecture and technology choices for major systems/components
- Design, build, and deploy products with outstanding quality
- Innovate on the current system to enhance robustness, ease, and convenience
- Articulate design and code choices to cross-functional teams
- Mentor and guide a high-performing team
- Review and provide feedback on features, technology, architecture, design, time and budget estimates, and test strategies
- Engage in creative problem-solving
- Develop and evolve engineering best practices to improve the team's efficiency
- Partner with other teams across Adobe to achieve common goals

Qualifications Required:
- 8+ years of design and development experience in data-driven large distributed systems
- 3+ years as an architect building large-scale data-intensive distributed systems and services
- Relevant experience building application layers on top of Apache Spark
- Strong experience with Hive SQL and Presto DB
- Experience leading architecture designs while collaborating with multiple stakeholders, dependencies, and internal/external customer requirements
- In-depth work experience with open-source technologies like Apache Kafka, Apache Spark, Kubernetes, etc.
- Experience with big data technologies on public clouds such as Azure, AWS, or Google Cloud Platform
- Experience with in-memory distributed caches like Redis, Memcached, etc.
- Strong coding (design patterns) and design proficiencies, setting examples for others; contributions to open source are highly desirable
- Proficiency in data structures and algorithms
- Cost consciousness around computation and memory requirements
- Strong verbal and written communication skills
- BTech/MTech/MS in Computer Science
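As a flavor of the "application layers on top of Apache Spark" requirement, here is a minimal Spark SQL sketch in Java; the input path, view name, and columns are illustrative assumptions, not details from the posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EventCounts {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("event-counts")
            .master("local[*]") // local mode for the sketch; omit on a real cluster
            .getOrCreate();

        // Path and schema are assumptions for the example.
        Dataset<Row> events = spark.read().parquet("s3a://example-bucket/events/");
        events.createOrReplaceTempView("events");

        // Aggregate with Spark SQL, the kind of layer the posting describes
        Dataset<Row> counts = spark.sql(
            "SELECT event_type, COUNT(*) AS cnt FROM events GROUP BY event_type");
        counts.show();

        spark.stop();
    }
}
```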

Posted 3 days ago

Apply

5.0 - 10.0 years

1 - 6 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Preferred candidate profile: Java, Spring Boot, FHIR, Swagger, Apache Kafka, Apache Camel, MongoDB, HL7, Maven, SQL/stored procedures, streaming technologies (Kafka), and cloud (AWS).
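To illustrate one item on this list, a minimal Spring Kafka listener sketch follows (it assumes a Spring Boot application with the spring-kafka dependency); the topic name, group id, and the HL7-to-FHIR comment are hypothetical placeholders.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class Hl7MessageListener {
    // Topic and group id are assumptions; in a real Spring Boot app they
    // would typically come from application configuration.
    @KafkaListener(topics = "hl7-messages", groupId = "fhir-transformer")
    public void onMessage(String payload) {
        // A real handler might map the HL7 payload to a FHIR resource here.
        System.out.println("Received: " + payload);
    }
}
```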

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You will be working at a company dedicated to changing the world through digital experiences, empowering individuals to create exceptional digital content and revolutionize customer interactions across various platforms. As part of the team, you will have the opportunity to contribute to the development of innovative features and technologies.

**Key Responsibilities:**
- Own the development of features of medium to large complexity by translating requirements into architectural and feature specifications.
- Contribute to the analysis, design, prototyping, and implementation of new features while enhancing existing ones.
- Address architecture and design issues of current and future products, providing strategic direction for evaluating new technologies.
- Collaborate with product management and engineering leads to identify and prioritize new features.
- Demonstrate proactive and self-starting behavior to develop methods, techniques, and evaluation criteria for achieving results.
- Specialize in one or more platforms, understand cross-platform issues and customer requirements, and contribute significantly to the team's advancement.
- Ensure the production of high-quality code and related documentation.

**Qualifications Required:**
- A B.Tech/M.Tech degree from a reputable institute with 3 to 6 years of hands-on design and development experience in software development, preferably within a product development environment.
- Proficiency in Java programming.
- Hands-on experience with REST APIs and the message pub/sub model.
- Familiarity with frameworks such as Spring Boot, Apache Kafka, Apache Storm, etc.
- Knowledge of software fundamentals, including algorithm design and analysis, data structures, implementation, documentation, and unit testing.
- Strong understanding of object-oriented design, product life cycles, and associated issues.
- Excellent computer science fundamentals, with a sound grasp of architecture, design, and performance.
- Ability to work proactively and independently with minimal supervision.
- Strong teamwork skills with effective written and oral communication abilities.

Please note that Adobe is dedicated to creating an inclusive work environment where everyone is respected and has access to equal opportunities. If you require any accommodations for accessibility during the application process, please contact accommodations@adobe.com or call (408) 536-3015.
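To ground the REST API requirement, a minimal Spring Boot controller sketch is shown below; the resource name, paths, and the Feature record are hypothetical illustrations, not part of the posting.

```java
import java.util.List;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/features")
public class FeatureController {
    // Resource shape and data are placeholders for illustration only.
    @GetMapping("/{id}")
    public Feature getFeature(@PathVariable String id) {
        return new Feature(id, "example");
    }

    @GetMapping
    public List<Feature> listFeatures() {
        return List.of(new Feature("1", "example"));
    }

    record Feature(String id, String name) {}
}
```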

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Junior to Mid-level AI/ML Engineer at VectorStack, you will play a crucial role in designing, deploying, and optimizing machine learning solutions for diverse business challenges. Here's what you need to know:

**Key Responsibilities:**
- Develop, implement, and maintain machine learning models, focusing on time series forecasting and predictive analytics.
- Build and optimize scalable data pipelines using Apache Kafka and AWS services.
- Design, develop, and integrate Flask APIs to serve ML models in production-ready applications.
- Collaborate with cross-functional teams to translate business requirements into AI-powered solutions.
- Evaluate models, perform hyperparameter tuning, and apply optimization strategies for maximum efficiency.
- Deploy, monitor, and maintain ML models in production, addressing data and performance issues proactively.
- Document methodologies, experiments, and development processes for consistency and knowledge sharing.
- Stay updated with advancements in AI, ML, and cloud technologies, recommending and implementing new approaches where relevant.

**Qualifications and Skills:**
- Proficiency in time series analysis, including forecasting techniques and handling complex temporal datasets.
- Strong experience in predictive modeling to analyze trends and enable data-driven decisions.
- Hands-on expertise in deploying and managing ML models on AWS infrastructure and tools.
- Practical knowledge of Apache Kafka for real-time data streaming and scalable data pipelines.
- Strong programming skills in Python for model development, testing, and deployment.
- Experience with Flask for building and deploying ML model endpoints and RESTful services.
- Knowledge of model optimization techniques to improve predictive performance and efficiency.
- Strong debugging, troubleshooting, and production maintenance capabilities for AI/ML systems.
- Demonstrated teamwork, openness to learning new technologies, and strong written and verbal communication skills.

VectorStack is a dynamic leader in the IT Services and Consulting sector, headquartered in Bangalore. With a team of 51-200 professionals, VectorStack delivers high-quality solutions, consulting, and support services, driven by innovation and excellence for its clients. Learn more at [VectorStack](https://vectorstack.co).

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Senior Backend Engineer at our company, you will be responsible for designing, developing, and maintaining the infrastructure powering our generative AI applications. You will collaborate with AI engineers, platform teams, and product stakeholders to build scalable and reliable backend systems supporting AI model deployment, inference, and integration. This role will challenge you to combine traditional backend engineering expertise with cutting-edge AI infrastructure challenges to deliver robust solutions at enterprise scale.

Key Responsibilities:
- Design and implement scalable backend services and APIs for generative AI applications using microservices architecture and cloud-native patterns.
- Build and maintain model serving infrastructure with load balancing, auto-scaling, caching, and failover capabilities for high-availability AI services.
- Deploy and orchestrate containerized AI workloads using Docker, Kubernetes, ECS, and OpenShift across development, staging, and production environments.
- Develop serverless AI functions using AWS Lambda, ECS Fargate, and other cloud services for scalable, cost-effective inference.
- Implement robust CI/CD pipelines for automated deployment of AI services, including model versioning and gradual rollout strategies.
- Create comprehensive monitoring, logging, and alerting systems for AI service performance, reliability, and cost optimization.
- Integrate with various LLM APIs (OpenAI, Anthropic, Google) and open-source models, implementing efficient batching and optimization techniques.
- Build data pipelines for training data preparation, model fine-tuning workflows, and real-time streaming capabilities.
- Ensure adherence to security best practices, including authentication, authorization, API rate limiting, and data encryption.
- Collaborate with AI researchers and product teams to translate AI capabilities into production-ready backend services.

Required Qualifications:
- Strong experience with backend development using Python, with familiarity in Go, Node.js, or Java for building scalable web services and APIs.
- Hands-on experience with containerization using Docker and orchestration platforms including Kubernetes, OpenShift, and AWS ECS in production environments.
- Proficiency with cloud infrastructure, particularly AWS services (Lambda, ECS, EKS, S3, RDS, ElastiCache) and serverless architectures.
- Experience with CI/CD pipelines using Jenkins, GitLab CI, GitHub Actions, or similar tools, including Infrastructure as Code with Terraform or CloudFormation.
- Strong knowledge of databases including PostgreSQL, MongoDB, and Redis, and experience with vector databases for AI applications.
- Familiarity with message queues (RabbitMQ, Apache Kafka, AWS SQS/SNS) and event-driven architectures.
- Experience with monitoring and observability tools such as Prometheus, Grafana, DataDog, or equivalent platforms.
- Knowledge of AI/ML model serving frameworks like MLflow, Kubeflow, TensorFlow Serving, or Triton Inference Server.
- Understanding of API design principles, load balancing, caching strategies, and performance optimization techniques.
- Experience with microservices architecture, distributed systems, and handling high-traffic, low-latency applications.
- Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 4+ years of experience in backend engineering with a focus on scalable, production systems.
- 2+ years of hands-on experience with containerization, Kubernetes, and cloud infrastructure in production environments.
- Demonstrated experience with AI/ML model deployment and serving in production systems.

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

Telangana

On-site

In this role, you will be a part of the "Platform" team responsible for maintaining platform stability, providing offshore support, enhancing documentation, ensuring backend services are up-to-date and secure, performing technology upgrades, and making platform enhancements. You will collaborate directly with the Team Leader and Tech Lead to receive work assignments and plan your tasks. Your responsibilities will include meeting commitments, delivering high-quality work, taking ownership of your contributions, providing constructive feedback, and effectively documenting your work. Additionally, you will be expected to learn new technologies and methodologies.

Key Responsibilities:
- Maintain applications and microservices using the Java Spring Boot framework and its ecosystem
- Demonstrate a good understanding of Apache Kafka concepts including topics, producers, consumers, partitioning, and brokers in event-driven systems
- Work with both SQL and MongoDB databases to create optimized databases and collections
- Deploy, manage, and troubleshoot applications on the Kubernetes platform
- Utilize GitHub Actions for building, testing, and deploying applications
- Develop frontends using React

Qualifications Required:
- Minimum of 6 years of relevant experience
- Proficiency in Java Spring Boot, Apache Kafka, databases (SQL and Mongo), Kubernetes, GitHub Actions, and React
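By way of illustration for the MongoDB bullet above, here is a minimal collection-tuning sketch using the MongoDB Java sync driver; the connection string, database, collection, and field names are all assumptions for the example.

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Indexes;
import org.bson.Document;

public class CollectionSetup {
    public static void main(String[] args) {
        // Connection string and names below are illustrative assumptions.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> events = client
                .getDatabase("platform")
                .getCollection("events");
            // A compound index like this supports queries filtered by tenant
            // and sorted by recency, a common access pattern for event data.
            events.createIndex(Indexes.compoundIndex(
                Indexes.ascending("tenantId"), Indexes.descending("createdAt")));
        }
    }
}
```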

Posted 5 days ago

Apply

5.0 - 9.0 years

0 - 0 Lacs

Maharashtra

On-site

You are required for one of Uplers' clients - Datakrew.

**Responsibilities:**
- Design, build, and maintain server-side logic and databases.
- Implement APIs to support front-end applications and external services.
- Design and optimize database schemas.
- Build responsive and user-friendly frontend applications using C# and Blazor.
- Write clean, maintainable, and well-documented code.
- Optimize code and database queries for performance, scalability, and reliability.
- Troubleshoot and resolve issues.
- Implement and manage secure user authentication and access control using JWT tokens.
- Create and maintain real-time dashboards using Grafana and Blazor Server/WASM apps for data visualization.
- Integrate APIs and webhooks to connect and automate workflows between different systems.
- Use Git for version control and collaborate with the team on code management.

**Required Skills:**
- Proven experience writing efficient database queries (both SQL and NoSQL).
- Proficiency in Blazor Server/WASM apps for creating and maintaining real-time dashboards.
- Knowledge of frontend frameworks in JavaScript (e.g., React, Angular, Vue.js) and CSS.
- Experience with stream processing platforms such as Apache Kafka.
- Experience with API and webhook integrations using REST and gRPC.
- Proficiency in using Git for version control.
- Ability to work independently and as part of a team, with strong problem-solving skills.

**Qualification:**
- Bachelor's or higher degree in Computer Science, full stack development, or a related field.
- 5+ years of experience in a similar role.
- Strong problem-solving skills and ability to work independently as well as in a team.
- Experience with additional programming languages or technologies (e.g., Python, Java).
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with no-code and low-code software development tools (e.g., Bubble, Webflow, Retool).
- Excellent communication skills, both verbal and written.
- Experience with other IoT platforms and technologies is a plus.

**Benefits:**
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and innovative work environment.
- Flexible work hours and remote work options.

If you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. Datakrew and Uplers are waiting for you!

Posted 5 days ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

The role you will be taking on involves being part of the "Platform" team, with responsibilities that include ensuring the stability of the platform, providing support during offshore hours, creating and improving documentation, keeping backend services current and secure, upgrading technologies, and enhancing the platform. Collaborating closely with the Team Leader and Tech Lead, you will receive assigned tasks and collaborate on work planning. Your role will require you to fulfill commitments, produce high-quality work, take ownership of your contributions, offer constructive feedback, and document your work efficiently. Additionally, you will be expected to adapt to new technologies and methodologies.

To excel in this position, you should be proficient in the following key areas:
- Java Spring Boot: maintaining applications and microservices using the Spring Boot framework and its ecosystem
- Apache Kafka: a solid grasp of Kafka concepts such as topics, producers, consumers, partitioning, and brokers, plus an understanding of event-driven systems
- Databases: both SQL and Mongo, including the ability to create optimized databases and collections
- Kubernetes: deploying, managing, and troubleshooting applications on Kubernetes
- GitHub Actions: building, testing, and deploying applications
- React: frontend development

This role requires a minimum of 6 years of relevant experience.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

MongoDB's mission is to empower innovators to create, transform, and disrupt industries by unleashing the power of software and data. We enable organizations of all sizes to easily build, scale, and run modern applications by helping them modernize legacy workloads, embrace innovation, and unleash AI. Our industry-leading developer data platform, MongoDB Atlas, is the only globally distributed, multi-cloud database and is available in more than 115 regions across AWS, Google Cloud, and Microsoft Azure. Atlas allows customers to build and run applications anywhere - on premises or across cloud providers. With offices worldwide and over 175,000 new developers signing up to use MongoDB every month, it's no wonder that leading organizations, like Samsung and Toyota, trust MongoDB to build next-generation, AI-powered applications.

MongoDB is expanding a development team in Sydney working on tooling that helps customers migrate data from relational databases to MongoDB. Tools developed by the Data Migration team help application developers with schema modeling, type conversions, data sync, change data capture, and so on. MongoDB is looking for a software engineer with experience in the Java ecosystem and streaming systems to join the team. Our main technology stack includes Java, Spring Boot, Apache Kafka, and React. A successful candidate will collaborate closely with product management and engineers on the team to help drive the design and implementation of a cutting-edge product. This role will be based in our India office in Gurugram and offers a hybrid working model.

The ideal candidate for this role will have:
- 2-3 years of commercial software development experience with at least one JVM language such as Java, preferably using the Spring ecosystem
- Experience with relational data modeling and at least one SQL database (Postgres, MySQL, etc.)
- Basic familiarity with streaming systems such as Apache Kafka, AWS SQS, etc.
- Basic familiarity with client-side technologies such as React
- A good understanding of algorithms and data structures and their time and space complexity
- Curiosity, a positive attitude, and a drive to continue learning
- Excellent verbal and written communication skills

Position Expectations:
- Collaborate with product management, product designers, and other engineers
- Contribute high-quality and well-tested backend code to the data migration engine and its surrounding services
- Participate in code reviews and team technical discussions
- Give and solicit feedback on technical design documents and pull requests
- Perform tasks related to process such as CI/CD, quality, testing, etc.

Success Measures:
Within the first three months, you will have:
- Familiarized yourself with the MongoDB database and aggregation language
- Familiarized yourself with the backend tech stack including Java, Spring Boot, and Kafka
- Set up software development infrastructure (tech stack, build tools, etc.) to enable development using the relevant tech stacks
- Started collaborating with your peers and contributed to code reviews

Within six months, you will have:
- Familiarized yourself with the rest of our codebase including the frontend stack, Confluent plugins, GitHub workflows, etc.
- Worked on and delivered a medium-scale feature in the product
- Contributed to and helped deliver a few releases of the product
- Reviewed and contributed to scope and technical design documents

Within 12 months, you will have:
- Familiarized yourself with the work of other teams within the department
- Delivered at least one large-scale feature that spans the entire tech stack
- Helped recruit and interview new members of the team

To drive the personal growth and business impact of our employees, we're committed to developing a supportive and enriching culture for everyone. From employee affinity groups to fertility assistance and a generous parental leave policy, we value our employees' wellbeing and want to support them along every step of their professional and personal journeys. Learn more about what it's like to work at MongoDB, and help us make an impact on the world!

MongoDB is committed to providing any necessary accommodations for individuals with disabilities within our application and interview process. To request an accommodation due to a disability, please inform your recruiter. MongoDB is an equal opportunities employer.

Req ID: 425547

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

You are invited to join HCL Software, the Product Development Division of HCLTech. We are dedicated to delivering cutting-edge software solutions that cater to the evolving needs of clients worldwide. Our portfolio includes award-winning software products in the domains of AI, Automation, Data & Analytics, Security, and Cloud.

As part of our team, you will focus on the Unica Marketing Platform, a powerful tool that empowers our clients to execute precise and high-performance marketing campaigns across channels such as social media, AdTech platforms, mobile applications, and websites. Characterized by its data-driven and AI-centric approach, the Unica Marketing Platform enables our clients to craft hyper-personalized offers and messages to drive customer acquisition, enhance product awareness, and boost customer retention.

We are currently seeking a Lead Java Developer with a strong background in Spring Boot, Apache Kafka, JDBC, REST, Postman, and Swagger, along with at least 8 years of experience in Java development in enterprise settings.

Key Qualifications & Experiences:
- Over 8 years of hands-on Java development expertise in enterprise application integration and enterprise data integration spanning web applications, middleware, databases, and transactional systems.
- Proficiency in integration tools and frameworks such as Spring Boot, Postman, Swagger, and API gateways.
- Thorough understanding of REST, JSON, XML, and SOAP protocols.
- Experience with Customer Data Platforms (CDP) like Treasure Data, Epsilon, Tealium, Adobe, or Salesforce is advantageous.
- Proficiency in API development, encompassing best practices for performance optimization, error handling, security protocols, automated testing, version control, governance, observability, monitoring, and deployment automation.
- Experience designing solutions for data ingestion (real-time and batch), transformation, enrichment, cleansing, and export using ETL and other methodologies.
- Familiarity with message brokers like Apache Kafka and RabbitMQ.
- Ability to thrive in an agile team and apply agile methodologies effectively.
- Experience working with and integrating cloud platforms such as GCP, AWS, Azure, and OpenShift is a plus.
- Strong communication and interpersonal skills.
- Ability to mentor and guide team members.
- A bachelor's degree in Computer Science or IT is a prerequisite.
- Java certification would be considered advantageous.

Location: Belapur (Navi Mumbai). Please note that work is scheduled on the 1st, 3rd, and 5th Saturdays.

Compensation: Competitive base salary with additional bonus incentives.

Key Responsibilities:
- Design, develop, and implement integration solutions for HCL's Marketing Platform (Unica) and Customer Data Platform (CDP) utilizing Java, microservices, REST APIs, and API gateways.
- Translate conceptual system and business requirements into precise technical data and integration specifications.
- Create efficient, scalable, and dependable APIs and connectors to seamlessly integrate diverse applications, middleware, systems, and data sources.
- Develop automated unit tests and monitoring scripts to uphold reliability and high quality in the delivered integrations.
- Engage in research activities and provide recommendations on integration products and services.
- Proactively monitor integration processes to identify and resolve performance or data quality issues.
- Provide continual maintenance and support for integration solutions.

Join us at HCL Software and be part of a dynamic team where your expertise will make a significant impact on our innovative software solutions.
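As a flavor of the JDBC work listed above, here is a minimal batch-ingestion sketch; the connection URL, credentials, and table are illustrative assumptions, not details from the posting.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchIngest {
    public static void main(String[] args) throws SQLException {
        // URL, credentials, and table are assumptions for the example.
        String url = "jdbc:postgresql://localhost:5432/exampledb";
        try (Connection conn = DriverManager.getConnection(url, "user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO offers (id, segment) VALUES (?, ?)")) {
            conn.setAutoCommit(false); // run the whole batch in one transaction
            for (int i = 0; i < 1000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "segment-" + (i % 10));
                ps.addBatch(); // queue rows instead of round-tripping per insert
            }
            ps.executeBatch();
            conn.commit();
        }
    }
}
```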

Posted 6 days ago

Apply

1.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You should have at least 5 to 10 years of overall work experience, including a minimum of 1 year in Generative AI. As a Backend Developer focusing on Generative AI applications, your responsibilities will include:
- Developing and maintaining scalable backend services using Java and Spring Boot
- Designing and implementing APIs for LLM integration and AI agent orchestration
- Building microservices architecture to support RAG systems and LLM-powered applications
- Implementing caching strategies and optimization for vector database operations
- Integrating with multiple LLM providers and managing API rate limiting and failover mechanisms
- Developing real-time streaming capabilities for conversational AI applications
- Ensuring system observability and implementing comprehensive logging and monitoring

Required Skills:
- Backend development with 3+ years of experience in Java and the Spring Boot framework
- Strong understanding of RESTful API design and microservices architecture
- Experience with Spring Security, Spring Data JPA, and Spring Cloud
- Proficiency in database technologies like PostgreSQL, MongoDB, and Redis
- Knowledge of message queuing systems such as RabbitMQ and Apache Kafka
- Experience with caching mechanisms and performance optimization

Generative AI Integration skills should include:
- Experience integrating LLM APIs into backend systems (OpenAI, Gemini, Anthropic)
- Knowledge of vector database integration and semantic search implementations
- Experience with AI agent frameworks and orchestration (LangGraph, etc.)
- Understanding of RAG architecture and implementation patterns
- Experience with streaming responses and WebSocket connections for real-time AI interactions
- Knowledge of prompt management and template systems

Good To Have:
- Experience with fine-tuning workflows and model deployment pipelines
- Knowledge of self-hosted LLM integration and management
- Experience with observability tools like LangSmith and custom monitoring solutions
- Understanding of natural-language-to-SQL query systems

Additional Requirements:
- Experience with containerization (Docker) and orchestration (Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices
- Cloud platform experience, with certifications preferred (AWS, GCP, Azure)
- Understanding of authentication and authorization patterns
- Experience with testing frameworks like JUnit, Mockito, and TestContainers
- Knowledge of system design and scalability patterns
- Familiarity with Elasticsearch or similar search technologies

We offer:
- Opportunity to work on bleeding-edge projects
- Collaboration with a highly motivated and dedicated team
- Competitive salary
- Flexible schedule
- Benefits package including medical insurance and sports facilities
- Corporate social events
- Professional development opportunities

About Us:
Grid Dynamics is a leading provider of technology consulting, AI, and advanced analytics services. With our technical vision and business acumen, we help enterprise companies solve technical challenges and achieve positive business outcomes. Our expertise in enterprise AI, data, analytics, cloud & DevOps, and customer experience sets us apart. Founded in 2006, Grid Dynamics is headquartered in Silicon Valley with offices across the Americas, Europe, and India.
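To make the "integrating LLM APIs into backend systems" line concrete, here is a minimal Java HTTP sketch using the standard java.net.http client; the endpoint URL, header, and JSON body are hypothetical placeholders, since each provider (OpenAI, Gemini, Anthropic) defines its own API shape.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LlmClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Endpoint and payload below are hypothetical; consult the actual
        // provider's API documentation for the real request shape.
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://llm.example.com/v1/generate"))
            .header("Authorization", "Bearer " + System.getenv("LLM_API_KEY"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(
                "{\"prompt\":\"Summarize this ticket\"}"))
            .build();
        HttpResponse<String> response =
            client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```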

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Java Backend Developer with Kafka experience, you will need strong proficiency in Core Java, Spring Boot, and microservices architecture. Your role will involve hands-on work with Apache Kafka, including Kafka Streams, Kafka Connect, Schema Registry, and Confluent. You will also work with REST APIs, JSON, and event-driven systems.

In this position, you should have knowledge of SQL databases such as MySQL and PostgreSQL, as well as NoSQL databases like MongoDB, Cassandra, and Redis. Familiarity with Docker, Kubernetes, and CI/CD pipelines is essential for success in this role. Experience in multi-threading, concurrency, and distributed systems will be beneficial, and an understanding of cloud platforms such as AWS, Azure, or GCP is desired. You should possess strong problem-solving skills and excellent debugging abilities to address complex technical challenges effectively.

Join our team in Bangalore (WFO) and contribute your expertise to our dynamic projects.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a software engineer at UBS, your role will involve designing and building next-generation business applications using the latest technologies to create a new digital banking experience. You will be responsible for iteratively refining user requirements, removing ambiguity, and providing technology solutions that solve business problems and strengthen the organization's position as a digital leader in financial services. The role requires you to analyze business requirements, design sustainable solutions using modern programming languages, and provide technical expertise in assessing new software projects.

Joining the Banker Desktop Technology crew, you will work on the Digital Banking Initiative in Wealth Management Americas, adopting the latest Agile practices in a fast-paced environment. The team uses a suite of Microsoft products such as D365 and Dataverse, aiming to provide the best user experience. You will be involved in integrating with third-party systems and internal/external choreography service layers.

Ideally, you should have 8+ years of experience in building enterprise-grade back-end services and APIs using Java, with proficiency in Java, the Spring Framework, various data stores, RESTful APIs, and microservices architecture. Experience with cloud technologies (Azure, AWS), containerization, orchestration tools, and security best practices is desired. Knowledge of event-driven frameworks like Apache Kafka and writing unit tests using JUnit is a plus. Strong communication skills, technical writing abilities, and the capacity to explain technical concepts to non-technical stakeholders are essential.

UBS is the world's largest and the only truly global wealth manager, operating through four business divisions, with a presence in major financial centers in over 50 countries. At UBS, we prioritize our people and their diverse skills, experiences, and backgrounds. We offer new challenges, a supportive team, growth opportunities, and flexible working options. Our inclusive culture fosters collaboration and empowers our employees throughout their career journey. UBS is an Equal Opportunity Employer that values and supports the diverse cultures, perspectives, skills, and experiences within our workforce. Should you require reasonable accommodations throughout the recruitment process, please feel free to contact us.
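Since the posting calls out unit testing with JUnit, here is a minimal JUnit 5 sketch; the AccountService class and its behavior are hypothetical, included only to make the example self-contained.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class AccountServiceTest {
    // AccountService is a hypothetical class, defined below so the
    // sketch compiles on its own.
    private final AccountService service = new AccountService();

    @Test
    void depositIncreasesBalance() {
        service.deposit(100);
        assertEquals(100, service.balance());
    }

    @Test
    void negativeDepositIsRejected() {
        assertThrows(IllegalArgumentException.class, () -> service.deposit(-5));
    }

    // Minimal stand-in implementation for the sake of the example.
    static class AccountService {
        private long balance;
        void deposit(long amount) {
            if (amount < 0) throw new IllegalArgumentException("negative amount");
            balance += amount;
        }
        long balance() { return balance; }
    }
}
```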

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Bangalore, Karnataka

On-site

As a Senior Software Engineer at the StoneX Group, you will play a crucial role in developing high-performing, scalable, enterprise-grade applications. Your responsibilities will involve working on low-latency, mission-critical applications as part of a talented engineering team. You will be tasked with application architecture and development across all software development lifecycle stages, from concept and design to testing. Collaboration with fellow engineers and stakeholders is key to your success in this role.

Your primary duties will include developing high-quality, scalable, and maintainable enterprise applications using a microservices architecture. You will participate in all phases of development within an Agile methodology, writing efficient, testable code. Collaborating with cross-functional teams, you will contribute to designing, developing, and deploying applications. Furthermore, you will engage in code reviews, testing, and bug fixing, while also exploring new technologies for architectural enhancements.

To excel in this role, you must have extensive experience in developing complex distributed event-based microservices using Java/Spring Boot. Proficiency in developing front-end applications using ReactJS/Angular, as well as RESTful APIs and gRPC services, is essential. Experience with containerization (Docker, Kubernetes), cloud platforms (Azure, AWS), and distributed messaging platforms (Apache Kafka) will be advantageous. Familiarity with CI/CD pipelines, build and release automation, testing frameworks, and databases (both SQL and NoSQL) is required.

Your standout qualities will include 7+ years of experience, particularly in the financial domain. Exposure to Python and NodeJS will be a plus. You should hold a BS/MS degree in Computer Science, Engineering, or a related field.

In this hybrid working environment, you will have the opportunity to work both independently and collaboratively with global teams distributed across geographic locations. Travel is not required for this role.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

You will be working at Paras Twin Tower, Gurgaon as a full-time employee for Falcon, a Series A-funded, cloud-native, AI-first banking technology and processing platform. Falcon specializes in assisting banks, NBFCs, and PPIs to efficiently launch cutting-edge financial products like credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since its inception in 2022, Falcon has processed over USD 1 billion in transactions, collaborated with 12 of India's top financial institutions, and generated revenue exceeding USD 15 million. The company is backed by prominent investors from Japan, the USA, and leading Indian ventures and banks. To learn more about Falcon, visit https://falconfs.com/.

As an Intermediate Data Engineer with 5-7 years of experience, your responsibilities will include:
- Designing, developing, and maintaining scalable ETL processes using open-source tools and data frameworks such as AWS Glue, AWS Athena, Redshift, Apache Kafka, Apache Spark, Apache Airflow, and Pentaho Data Integration (PDI)
- Designing, creating, and managing data lakes and data warehouses on the AWS cloud
- Optimizing data pipeline architecture and formulating complex SQL queries for big data processing
- Collaborating with product and engineering teams to develop a platform for data modeling and machine learning operations
- Implementing data structures and algorithms to meet functional and non-functional requirements
- Ensuring data privacy and compliance, and developing processes for monitoring and alerting on data quality issues
- Staying updated with the latest data engineering trends by evaluating new open-source technologies

To qualify for this role, you must have a Bachelor's or Master's degree in Computer Science or an MCA from a reputable institute, at least 4 years of experience in a data engineering role, proficiency in Python, Java, or Scala for data processing (Python preferred), a deep understanding of SQL and analytical data warehouses, experience with database frameworks like PostgreSQL, MySQL, and MongoDB, knowledge of AWS technologies such as Lambda, Athena, Glue, and Redshift, experience implementing ETL or ELT best practices at scale, familiarity with data pipeline tools like Airflow, Luigi, Azkaban, and dbt, proficiency with Git for version control, familiarity with Linux-based systems and cloud services (preferably AWS), strong analytical skills, and the ability to work in an agile and collaborative team environment.

Preferred skills include certification in open-source big data technologies, expertise in Apache Hadoop, Apache Hive, and other open-source big data technologies, familiarity with data visualization tools like Apache Superset, Grafana, and Tableau, experience with CI/CD processes, and knowledge of containerization technologies like Docker or Kubernetes.

If you have these skills and this experience, we encourage you to explore this opportunity further.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Programmer Analyst position is an intermediate-level role where you will be involved in establishing and implementing new or updated application systems and programs in collaboration with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

As a Full Stack Engineering Developer, you will be responsible for designing, developing, and testing application modules using Angular and Java. You should possess strong skills in designing and developing business modules utilizing Angular, Java, and related components. This role requires an understanding of standard development practices and implementation strategies. Additionally, you will be working as part of distributed teams, necessitating proficiency in writing, communication, and time management, as well as excellent technical skills.

Your responsibilities will include utilizing your knowledge of applications development procedures and concepts to identify necessary system enhancements, analyzing issues, making recommendations, and implementing solutions. You will also need to use your understanding of business processes, system processes, and industry standards to solve complex problems. Furthermore, you will be required to conduct testing and debugging, write basic code for design specifications, and assess risk in business decisions.

Your primary skills should include strong knowledge of Angular, HTML5/CSS, and JavaScript, proficiency in Spring frameworks, microservices, REST APIs, and Java core concepts, and familiarity with Git and code repository hosting platforms. Experience in PL/SQL, Oracle databases, OOP design principles, Jenkins, Lightspeed, containerization, and related technologies is considered a secondary skill. Good reasoning and logical thinking, plus knowledge of CI/CD pipelines, DevOps concepts, Apache Kafka, and Agile development methodology, are beneficial.

Qualifications for this role include 2 years of relevant experience, experience in programming/debugging for business applications, and working knowledge of industry practices and standards. Your educational background should include a Bachelor's degree or equivalent experience. This job description provides an overview of the work performed, and additional job-related duties may be assigned as needed.

If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review Accessibility at Citi.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a skilled Azure Integration API Developer responsible for designing, developing, and maintaining scalable API and integration solutions on the Microsoft Azure platform. Your role involves utilizing Azure Integration Services such as Azure API Management, Logic Apps, Service Bus, Event Grid, and other related technologies to ensure seamless data flow and system interoperability.

Key Responsibilities:
- Design, develop, test, and deploy APIs and integration workflows using Azure services including Azure API Management, Logic Apps, Functions, Service Bus, Event Grid, and Event Hubs.
- Build and maintain scalable, secure, and reliable integration solutions connecting various enterprise applications and cloud/on-premises systems.
- Collaborate with business analysts, architects, and other developers to gather requirements and translate them into technical solutions.
- Implement API gateways, security policies, throttling, and versioning using Azure API Management.
- Create and manage messaging and event-driven architectures using Azure Service Bus and Event Grid.
- Monitor, troubleshoot, and optimize integration workflows and APIs to ensure high availability and performance.
- Document API interfaces, integration designs, and configuration details for knowledge sharing and future reference.
- Support CI/CD pipeline implementation for automated deployment of integration solutions.
- Stay current with new Azure integration capabilities and recommend improvements or adoption as appropriate.
- Participate in code reviews and design discussions, and contribute to the development of best practices for integration development.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in integration development and API management.
- Strong expertise with Microsoft Azure integration services: Azure API Management, Logic Apps, Functions, Service Bus, Event Grid, etc.
- Proficiency in RESTful API design and development.
- Experience with API security standards such as OAuth 2.0, OpenID Connect, and JWT.
- Solid knowledge of JSON, XML, and related data formats.
- Familiarity with Azure DevOps for source control, build, and release pipelines.
- Strong scripting or programming skills in languages like C#, JavaScript, or Python.
- Experience with monitoring and troubleshooting tools within Azure.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Azure certifications such as Azure Developer Associate (AZ-204) or Azure Solutions Architect.
- Experience with microservices architecture and containerization (Docker, Kubernetes).
- Knowledge of other integration platforms like BizTalk, MuleSoft, or Apache Kafka.
- Familiarity with DevOps culture and automation tools.

Join NTT DATA, a trusted global innovator of business and technology services committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team and a focus on digital and AI infrastructure, we are at the forefront of the digital future. Visit us at us.nttdata.com.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

Your opportunity to make a real impact and shape the future of financial services is waiting for you. Let's push the boundaries of what's possible together. As a Senior Director of Software Engineering at JPMorgan Chase within the Consumer and Community Banking division, you will be responsible for leading various technical domains, overseeing the activities of multiple departments, and fostering cross-functional collaboration. Your technical expertise will be utilized across different teams to promote the adoption and implementation of advanced technical methods, helping the firm stay ahead of industry trends, best practices, and technological advancements.

In this role, you will lead multiple technology and process implementations across departments to achieve firmwide technology objectives, directly manage multiple areas with a strategic transactional focus, and provide leadership and high-level direction to teams while overseeing employee populations across multiple platforms, divisions, and lines of business. You will act as the primary interface with senior leaders, stakeholders, and executives, driving consensus across competing objectives; manage multiple stakeholders, complex projects, and large cross-product collaborations; influence peer leaders and senior stakeholders across the business, product, and technology teams; and champion the firm's culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification in data management concepts and 10+ years of applied experience
- 5+ years of experience leading technologists to manage, anticipate, and solve complex technical items within your domain of expertise
- Proven experience in designing and developing large-scale data pipelines for batch and stream processing
- Strong understanding of data warehousing, data lakes, ETL processes, and Big Data technologies (e.g., Hadoop, Snowflake, Databricks, Apache Spark, PySpark, Airflow, Apache Kafka, Java, open file and table formats, Git, CI/CD pipelines, etc.)
- Expertise with public cloud platforms (e.g., AWS, Azure, GCP) and modern data processing and engineering tools
- Excellent communication, presentation, and interpersonal skills
- Experience developing or leading large or cross-functional teams of technologists
- Demonstrated prior experience influencing across highly matrixed, complex organizations and delivering value at scale
- Experience leading complex projects supporting system design, testing, and operational stability
- Experience with hiring, developing, and recognizing talent
- Extensive practical cloud-native experience
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field

Preferred qualifications, capabilities, and skills:
- Experience working at the code level and the ability to be hands-on performing PoCs and code reviews
- Experience in data modeling (the ability to design conceptual, logical, and physical models and ERDs, and proficiency in data modeling software like ERwin)
- Experience with data governance, data privacy and subject rights, data quality, and data security practices
- Strong understanding of data validation and data quality
- Experience supporting large-scale AI/ML data requirements
- Experience in data visualization and BI tools is a huge plus

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Thane, Maharashtra

On-site

You will be joining Gigatech Global Services Pvt Ltd as an Offshore Java Fullstack Developer in either Mumbai or Pune. In this role, you will seamlessly combine front-end and back-end development to create scalable and efficient web applications that cater to the requirements of our clients. Your collaboration with cross-functional teams will be essential in delivering top-notch software solutions. Leveraging your expertise in Java and modern front-end frameworks, you will be responsible for designing, developing, testing, and maintaining applications to ensure their functionality, performance, and responsiveness.

The ideal candidate will possess strong technical skills, a knack for problem-solving, and a drive for innovation. Effective communication and teamwork are crucial, as you will be working alongside project managers, designers, and fellow developers. Your contributions will play a significant role in achieving project milestones and aligning software applications with strategic initiatives to meet overall business goals efficiently and effectively.

Your key responsibilities will include:
- Designing and developing high-quality, scalable web applications using Java and JavaScript technologies
- Implementing front-end components with React and creating RESTful APIs for seamless integration
- Participating in code reviews and collaborating with UX/UI designers
- Optimizing application performance and conducting unit and integration tests
- Utilizing version control systems like Git and engaging in Agile/Scrum methodologies
- Documenting application processes, troubleshooting and debugging applications, and staying updated with emerging technologies
- Contributing to project planning, working with database systems, and providing technical guidance to junior developers

To excel in this role, you are required to have proficiency in Java, Java EE, Spring Boot, front-end technologies (HTML, CSS, JavaScript), modern JavaScript frameworks (React), RESTful services, SQL, relational databases, microservices architectures, and Agile/Scrum methodologies, along with strong problem-solving skills, an analytical mindset, familiarity with Git, good communication skills, and a willingness to learn new technologies. Experience with deployment and CI/CD processes is a plus, and adaptability to changing project requirements will be crucial for success in this position.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced Ruby on Rails developer with 6-9 years of experience, you will have the opportunity to collaborate with the Product team to design, develop, and maintain robust solutions in Ruby on Rails. Your responsibilities will include implementing scalable, high-performance systems for real-time event processing in the supply chain, warehouse, and inventory management domains. You will also contribute to the ongoing improvement and optimization of the core platform and customer-facing APIs.

In this role, you will work on diverse technical challenges, leveraging your expertise in Ruby on Rails to enhance real-time visibility and data processing from the IoT fleet. You will actively participate in grassroots innovation and contribute to decentralized decision-making within the engineering team. A data-centric mindset is crucial in this role, ensuring that exceptional ideas are welcomed and considered, regardless of the source.

To be successful in this position, you must have experience working with relational databases like PostgreSQL, MySQL, or similar. Experience with Apache Kafka or similar streaming services is required, as is experience with a JavaScript library or framework like Hotwire (Turbo + StimulusJS), React, Angular, or Vue. Familiarity with Rails ActiveJob for background job processing and strong knowledge of software development fundamentals are essential. You should also be pragmatic, combining a strong understanding of technology and product needs to arrive at the best solution for a given problem.

As a candidate for this role, you should be highly entrepreneurial and thrive on taking ownership of your impact. Experience in leading a team of 4-8 members, guiding them, and resolving their queries is advantageous. This is an IC+Lead role, where you will have the opportunity to design and develop products in supply chain domains.

Nice-to-have skills include experience working with AWS services like EC2, ECS, S3, CloudFront, and CloudWatch, as well as Docker. Familiarity with GitHub Actions, Jenkins, and the AWS CDK (Cloud Development Kit) is a plus.

When you apply for this position, you can expect an introductory call followed by two technical rounds: one virtual and one face-to-face. This is a challenging and rewarding opportunity for a seasoned Ruby on Rails developer who is passionate about innovation and problem-solving.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

As a Data Engineer II at Media.net, you will be responsible for designing, executing, and managing large and complex distributed data systems. Your role will involve monitoring performance, optimizing existing projects, and researching and integrating Big Data tools and frameworks as required to meet business and data requirements. You will play a key part in implementing scalable solutions, creating reusable components and data tools, and collaborating with teams across the company to integrate with the data platform efficiently.

The team you will be part of ensures that every web page view is seamlessly processed through high-scale services, handling a large volume of requests across 5 million unique topics. Leveraging cutting-edge Machine Learning and AI technologies on a large Hadoop cluster, you will work with a tech stack that includes Java, Elasticsearch/Solr, Kafka, Spark, Machine Learning, NLP, Deep Learning, Redis, and Big Data technologies such as Hadoop, HBase, and YARN.

To excel in this role, you should have 2 to 4 years of experience with big data technologies like Apache Hadoop and relational databases (MS SQL Server/Oracle/MySQL/Postgres). Proficiency in programming languages such as Java, Python, or Scala is required, along with expertise in SQL (T-SQL/PL-SQL/Spark SQL/HiveQL) and Apache Spark. Hands-on knowledge of working with DataFrames, Datasets, RDDs, and the Spark SQL/PySpark/Scala APIs, together with a deep understanding of performance optimizations, will be essential. Additionally, you should have a good grasp of distributed storage (HDFS/S3), strong analytical and quantitative skills, and experience with data integration across multiple sources.

Experience with message queues like Apache Kafka, MPP systems such as Redshift/Snowflake, and NoSQL storage like MongoDB would be considered advantageous. If you are passionate about working with cutting-edge technologies, collaborating with global teams, and contributing to the growth of a leading ad tech company, we encourage you to apply for this challenging and rewarding opportunity.

Posted 1 week ago

Apply

8.0 - 10.0 years

20 - 22 Lacs

Pune

Work from Office

Core Java (Collections), Java 8 (Lambda, Streams), Spring Framework (Core/Boot/Integration), Apache Flink, Apache Kafka, ELK stack (Elasticsearch, Logstash & Kibana), BPMN/CMMN, Angular/JavaScript/React/Redux, CI/CD, Git, agile SDLC
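To illustrate the Java 8 Lambda and Streams items on this list, here is a minimal sketch that stays within the Java 8 API; the trade-type data is a made-up example.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamDemo {
    public static void main(String[] args) {
        List<String> trades = Arrays.asList("EQUITY", "BOND", "EQUITY", "FX");
        // Group and count with the Streams API and a lambda, Java 8 style
        Map<String, Long> byType = trades.stream()
            .collect(Collectors.groupingBy(t -> t, Collectors.counting()));
        System.out.println(byType); // e.g. {BOND=1, EQUITY=2, FX=1}
    }
}
```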

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
