
293 Apache Kafka Jobs - Page 6

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an AI Ops Expert, you will take full ownership of deliverables with defined quality standards, timelines, and budget constraints. Your primary role will be to design, implement, and manage AIOps solutions that automate and optimize AI/ML workflows. You will collaborate with data scientists, engineers, and stakeholders to ensure the seamless integration of AI/ML models into production environments. Your duties also include monitoring and maintaining the health and performance of AI/ML systems, developing and maintaining CI/CD pipelines tailored for AI/ML models, implementing best practices for model versioning, testing, and deployment, and troubleshooting and resolving issues in the AI/ML infrastructure or workflows.

To excel in this role, you are expected to stay abreast of the latest AIOps, MLOps, and Kubernetes tools and technologies. Your core skills should include proficiency in Python with experience in FastAPI, hands-on expertise in Docker and Kubernetes (or AKS), familiarity with MS Azure and its AI/ML services such as Azure ML Flow, and the ability to use DevContainer for development. You should also have knowledge of CI/CD tools such as Jenkins, Argo CD, Helm, GitHub Actions, or Azure DevOps, experience with containerization and orchestration tools like Docker and Kubernetes, proficiency in Infrastructure as Code (Terraform or equivalent), familiarity with machine learning frameworks like TensorFlow, PyTorch, or scikit-learn, and exposure to data engineering tools such as Apache Kafka, Apache Spark, or similar technologies.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

We are seeking talented and experienced Java Full Stack Developers to join our dynamic team at a reputed client location in Pune. If you are passionate about backend and frontend technologies, system design, and DevOps practices, this is your opportunity to make an impact!

Qualification:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Required Skills:
- Strong hands-on experience with Java, Spring Framework, and Spring Boot
- Expertise in building and consuming RESTful APIs
- Solid frontend development skills using React.js
- Proficiency in PostgreSQL and Apache Kafka
- Experience with CI/CD tools: Chef, Jenkins, Maven, SonarQube, Checkmarx
- Deep understanding of high- and low-level system design
- Experience in Domain-Driven Design and Event-Driven Architecture
- Strong debugging, performance tuning, and troubleshooting capabilities
- Excellent communication skills to collaborate with technical and business stakeholders
- Proven ability to lead, mentor, and coach development teams

Job Role & Responsibilities:
- Develop, enhance, and maintain full-stack applications using Java and React
- Design and implement robust, scalable, and secure backend services
- Integrate systems using Apache Kafka and manage event-driven workflows
- Ensure code quality with continuous integration, code reviews, and static analysis tools
- Participate in system design discussions and contribute to architecture decisions
- Lead and mentor junior developers, and promote coding standards and best practices
- Collaborate with cross-functional teams, including product managers and architects
- Troubleshoot issues across the stack and optimize application performance

Note: This is a Work from Office (5 days a week) opportunity at the client location in Yerwada, Pune.
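For illustration only (not part of the posting): a minimal sketch of the event-driven consumer work such a role typically involves, using Spring Boot with Spring Kafka. It assumes the spring-boot starter and spring-kafka dependencies plus a configured spring.kafka.bootstrap-servers property; the "orders" topic and "order-processors" group id are placeholder names.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class OrderEventsApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderEventsApplication.class, args);
    }
}

@Component
class OrderEventListener {
    // Consumes order events published by upstream services; the topic name
    // "orders" and group id "order-processors" are placeholder values.
    @KafkaListener(topics = "orders", groupId = "order-processors")
    public void onOrderEvent(String payload) {
        // A real service would deserialize the payload and trigger domain
        // logic; this sketch only logs it.
        System.out.println("Received order event: " + payload);
    }
}
```

In a production workflow, retries, dead-letter topics, and schema-aware deserialization would sit on top of this skeleton.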

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Senior Developer with 6 to 9 years of experience, you will play a crucial role in our growth story based at our Mangalore office in India. You will be at the forefront of driving innovation and working with cutting-edge technology to craft the vertical software of tomorrow. Our team is dedicated to bringing sustainable impact to customers and society, constantly striving for improvement and taking responsibility for our contributions. Facility & Energy management is a key focus area for us, where we integrate advanced technologies like IoT sensors and AI-driven analytics to help facilities optimize energy consumption, reduce costs, and enhance operational efficiency. You will be responsible for designing and implementing new features and enhancements, collaborating with cross-functional teams, and mentoring junior developers to ensure adherence to best practices and coding standards. In this dynamic role, you will participate in designing and developing existing products, making key technical decisions, reviewing and optimizing code, conducting code reviews, and providing constructive feedback. You will also collaborate with cross-functional teams to define and refine project stories, enforce coding standards, and best practices. The ideal candidate for this role will have experience in modern software development, particularly on the .Net tech stack. You should have a Bachelor's degree or higher in computer science/information science, or equivalent, along with expertise in .Net development, C#, SQL Server/Azure Database/PostgreSQL, Object-Oriented programming, and Test-Driven Development. Experience with Apache Kafka, Docker/Kubernetes, and microservices is considered an added advantage. If you are passionate about technology and delivering high-quality software solutions, this is an exciting opportunity to drive innovation and be part of a team dedicated to creating the best software solutions for Facility & Energy Management. Join us in this once-in-a-lifetime opportunity to make a difference every day.,

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

NTT DATA is looking for an AWS DevOps Engineer to join their team in Pune, Maharashtra (IN-MH), India. As an AWS DevOps Engineer, you will be responsible for building and maintaining a robust, scalable, real-time data streaming platform leveraging AWS and Confluent Cloud infrastructure. Your key responsibilities will include developing and building the platform, monitoring performance, collaborating with cross-functional teams, managing code using Git, applying Infrastructure as Code principles using Terraform, and implementing CI/CD practices using GitHub Actions.

The ideal candidate must have strong proficiency in AWS services such as IAM roles, access control (RBAC), S3, Lambda functions, VPC, security groups, RDS, CloudWatch, and more. Hands-on experience with Kubernetes (EKS) and expertise in managing resources and services such as Pods, Deployments, and Helm charts is required. Additionally, expertise in Datadog, Docker, Python, Go, Git, Terraform, and CI/CD tools is essential, along with an understanding of security best practices and familiarity with tools like Snyk, SonarCloud, and CodeScene. Nice-to-have skills include prior experience with streaming platforms like Apache Kafka, knowledge of unit testing around Kafka topics, and experience with Splunk integration for logging and monitoring. Familiarity with Software Development Life Cycle (SDLC) principles is a plus.

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. They are committed to helping clients innovate, optimize, and transform for long-term success. NTT DATA offers diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is dedicated to providing digital and AI infrastructure and is part of the NTT Group, which invests significantly in R&D to support organizations and society in moving confidently into the digital future. Visit us at us.nttdata.com.
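As a hedged sketch of the streaming work described above: a plain Java producer configured for a SASL_SSL cluster, the authentication style Confluent Cloud typically uses. The endpoint, API key and secret, and topic name are placeholders, and the kafka-clients dependency is assumed.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ConfluentCloudProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder endpoint and credentials; real values come from the
        // cluster settings and are usually injected via Terraform or CI.
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The topic name "platform-events" is illustrative.
            producer.send(new ProducerRecord<>("platform-events", "device-42", "{\"status\":\"ok\"}"));
        }
    }
}
```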

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are an experienced Tech Lead with 8+ years of full-stack software engineering experience. In this role, you will be responsible for developing and maintaining high-quality software solutions that meet business requirements and contribute to the overall success of the company. Collaboration with the CTO and cross-functional teams is essential to define technical direction, make strategic technology decisions, and ensure the delivery of high-quality, scalable software solutions that align with our business goals.

Your responsibilities will include leading the design, development, and deployment of scalable applications using Java and modern backend technologies. Collaboration with cross-functional teams to identify requirements, prioritize tasks, and develop solutions is crucial. Design and implementation of software solutions using industry best practices and design patterns will be part of your role, as will writing clean, efficient, and maintainable code that adheres to coding standards and guidelines. Conducting code reviews and providing technical guidance to ensure high-quality, scalable, and maintainable code is also expected. You will be responsible for developing and maintaining project plans, timelines, and budgets. Mentoring and training junior team members to ensure ongoing professional development is an essential part of this role, as is staying up to date with emerging trends and technologies in Java development and related areas.

Requirements for this role include 8+ years of experience in full-stack or backend development with Java, along with excellent communication and collaboration skills. Strong technical skills in software engineering, including database structure, design patterns, algorithms, and development best practices, are required. Experience with Agile development methodologies and tools such as JIRA and Confluence is preferred, and prior experience working with, or willingness to learn, Web3 technologies, particularly Hedera Hashgraph, is desired. Experience with logging frameworks, monitoring tools, and observability practices is beneficial, and experience with cloud-based technologies and platforms such as AWS or Azure is a plus. A strong understanding of Core Java versions 17 and 21, Spring Boot, Spring Security, JSON Web Tokens, Spring Data JPA, microservices, Hibernate ORM, data structures, and algorithms is essential. Experience with ORM frameworks such as Hibernate and Spring JDBC is preferred, and proficiency in PostgreSQL/MySQL and MongoDB is required. Additional knowledge of web technologies such as HTML, CSS, JavaScript, and ReactJS is an added advantage, as is experience in logging and analyzing log files, knowledge of Docker, and knowledge or working experience of blockchain/DLT technologies. Excellent problem-solving skills and attention to detail are necessary, and good knowledge of and experience with Git and Maven are preferred.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for translating complex functional requirements into technical requirements and implementing a coherent and progressive development strategy for our product line. Designing, developing, and maintaining complex systems using the best development practices and technology available will be a key part of your role. You will oversee the overall software development life cycle, ensuring the delivery of high-quality, scalable, and extensible systems and applications on time and within budget. It will be your responsibility to adopt and evolve software engineering practices and tools within the organization, staying up to date with the latest technology developments and open source offerings. Collaboration with other technology and business teams to provide efficient and robust solutions to problems is crucial. Additionally, driving and managing the bug triage process, as well as reporting on the status of product delivery and quality to management, customer support, and product teams, will be part of your daily tasks.

To be successful in this role, you should have a minimum of 5 years of hands-on experience in Java and a strong understanding of data structures and algorithms. A sound understanding of object-oriented programming, excellent software design skills, and experience with SOA/microservices/RESTful services and N-tier J2EE/Java Spring Boot applications (APIs) are required. You should also possess a strong understanding of database design, experience in writing optimized SQL queries, exposure to NoSQL databases, and familiarity with Apache Kafka, RabbitMQ, or other queueing systems. Good knowledge of caching technologies and experience in log processing and creating monitoring dashboards will be beneficial for this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Solution Designer (Cloud Data Integration) at Barclays within the Customer Digital and Data Business Area, you will play a vital role in supporting the successful delivery of location strategy projects. Your responsibilities will include ensuring projects are delivered according to plan, budget, quality standards, and governance protocols. By spearheading the evolution of the digital landscape, you will drive innovation and excellence, utilizing cutting-edge technology to enhance our digital offerings and deliver unparalleled customer experiences. To excel in this role, you should possess hands-on experience working with large-scale data platforms and developing cloud solutions within the AWS data platform. Your track record should demonstrate a history of driving business success through your expertise in AWS, distributed computing paradigms, and designing data ingestion programs using technologies like Glue, Lambda, S3, Redshift, Snowflake, Apache Kafka, and Spark Streaming. Proficiency in Python, PySpark, SQL, and database management systems is essential, along with a strong understanding of data governance principles and tools. Additionally, valued skills for this role may include experience in multi-cloud solution design, data modeling, data governance frameworks, agile methodologies, project management tools, business analysis, and product ownership within a data analytics context. A basic understanding of the banking domain, along with excellent analytical, communication, and interpersonal skills, will be crucial for success in this position. Your main purpose as a Solution Designer will involve designing, developing, and implementing solutions to complex business problems by collaborating with stakeholders to understand their needs and requirements. You will be accountable for designing solutions that balance technology risks against business delivery, driving consistency and aligning with modern software engineering practices and automated delivery tooling. Furthermore, you will be expected to provide impact assessments, fault finding support, and architecture inputs required to comply with the bank's governance processes. As an Assistant Vice President in this role, you will be responsible for advising on decision-making processes, contributing to policy development, and ensuring operational effectiveness. If the position involves leadership responsibilities, you will lead a team to deliver impactful work and set objectives for employees while demonstrating leadership behaviours focused on listening, inspiring, aligning, and developing others. Alternatively, as an individual contributor, you will lead collaborative assignments, guide team members, identify new directions for projects, consult on complex issues, and collaborate with other areas to support business activities. All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive. By demonstrating these values and mindset, you will contribute to creating an environment where colleagues can thrive and deliver consistently excellent results.,

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Lead Software Engineer specialized in .NET with secondary expertise in Java, you will play a pivotal role in spearheading the evolution and implementation of software products across multiple markets. Your responsibilities will include both hands-on development and mentorship of a team of engineers, ensuring a constant drive towards innovation and the enhancement of technology solutions. Your primary objective will be to lead the cloud-native transformation of the .NET platform, providing expert guidance on devops practices and cloud architecture design. Additionally, you will champion the enhancement of security measures and operational resilience within the platform, thereby improving its non-functional aspects significantly. To excel in this role, you must possess a deep understanding of both .NET and Java technologies, along with practical experience in designing and implementing cloud-native architectures. Proficiency in 12-factor principles, microservices architecture, and event-driven architecture is essential, with desirable experience in Apache Kafka. Your expertise should extend to implementing robust security models in API-based solutions, encompassing various authentication and encryption frameworks. A strong command over UML, design patterns, and architecture frameworks is crucial, with knowledge of TOGAF considered advantageous. Prior experience in migrating platforms from legacy on-prem architectures to cloud-native setups is highly beneficial. Moreover, a comprehensive understanding of domain-driven architecture and engineering principles is vital for success in this role. Effective communication skills, both written and verbal, are key, as you will be responsible for creating and disseminating architecture and design documentation. In alignment with Mastercard's security policies, you must uphold the confidentiality and integrity of information assets, report any security breaches promptly, and diligently complete all mandatory security training sessions. Your commitment to information security and proactive approach to risk mitigation will be integral to your role as a Lead Software Engineer at Mastercard.,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

The job requires you to translate complex functional requirements into technical requirements and implement a coherent and progressive development strategy for the product line. You will design, develop, and maintain complex systems using the best development practices and technology available. You will be responsible for the overall software development life cycle, ensuring the delivery of high-quality, scalable, and extensible systems and applications on time and within budget. It is essential to adopt and evolve software engineering practices and tools within the organization, staying updated with the latest technology developments and open-source offerings to solve business problems effectively. Collaboration with other technology and business teams is crucial to provide efficient and robust solutions. Managing the bug triage process and reporting on the status of product delivery and quality to management, customer support, and product teams are also part of the responsibilities.

The ideal candidate should have a minimum of 5 years of hands-on experience in Java, a strong understanding of data structures and algorithms, and sound knowledge of object-oriented programming with excellent software design skills. Experience with SOA/microservices/RESTful services and development of N-tier J2EE/Java Spring Boot applications is required, along with a strong understanding of database design and optimized SQL query writing. Exposure to NoSQL databases, Apache Kafka, RabbitMQ, or other queueing systems, caching technologies, log processing, and creating monitoring dashboards will be beneficial for this role.

Posted 1 month ago

Apply

0.0 - 4.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Junior Full Stack Developer (Java) at Barclays, you will be responsible for supporting the successful delivery of location strategy projects within the planned budget and to the agreed quality and governance standards. Your role will involve spearheading the evolution of the digital landscape, driving innovation and excellence to revolutionize the digital offerings and ensure unparalleled customer experiences.

To excel in this role, you should have proficiency in Java (3+ years of experience), with programming skills in reading, writing, and debugging multi-threaded code, as well as REST services. You must demonstrate your ability to work effectively in a team environment throughout the software development lifecycle. Additionally, a solid understanding of Java, J2EE, the Spring Framework, JDBC, REST services/microservices, CI, unit test frameworks, ORM technologies like Hibernate and Spring Data/JPA, Java profilers, memory dump analysis, messaging platforms such as MQ and Solace, XML/JSON technologies, SQL, and databases like MS SQL Server, Oracle, and MongoDB is required. Experience working with an Agile or Scrum SDLC model is a plus. Moreover, highly valued skills may include knowledge of Apache Kafka, Docker, Kubernetes, NoSQL MongoDB, React, and Angular, familiarity with DevOps fundamentals and practices, and proven experience in quality assurance techniques relevant to application development.

In this role based in Pune, your primary purpose will be to design, develop, and enhance software using various engineering methodologies to provide business, platform, and technology capabilities for customers and colleagues. Your key accountabilities will include developing and delivering high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring code scalability, maintainability, and performance optimization. You will collaborate cross-functionally with product managers, designers, and engineers to define software requirements, devise solution strategies, promote code quality, and facilitate knowledge sharing. Additionally, staying updated on industry technology trends, adhering to secure coding practices, implementing effective unit testing practices, and fostering a culture of technical excellence will be essential aspects of your role.

As an analyst, you will be expected to meet stakeholders' needs through specialist advice and support, perform activities in a timely manner and to a high standard, lead specific processes within a team, and supervise and support professional development. For individuals with leadership responsibilities, demonstrating leadership behaviours such as listening, inspiring, aligning, and developing others will be crucial. Individual contributors, on the other hand, will manage their own workload, implement systems and processes, collaborate on broader projects, ensure relevant rules and regulations are followed, and build a deep understanding of how various teams contribute to broader objectives. All colleagues are required to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behaviour.
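Since the posting emphasizes reading and writing multi-threaded Java, here is a small, self-contained sketch (not taken from the posting) that fans independent lookups out over a fixed thread pool with CompletableFuture. The desk names and the simulated lookup are invented for the example, and Stream.toList() assumes Java 16 or later.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ParallelLookupDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // Fan three independent lookups out across the pool, then join
            // the results; lookupPosition only simulates work with a sleep.
            List<CompletableFuture<String>> futures = List.of("FX", "EQ", "RATES").stream()
                    .map(desk -> CompletableFuture.supplyAsync(() -> lookupPosition(desk), pool))
                    .toList();
            futures.forEach(f -> System.out.println(f.join()));
        } finally {
            pool.shutdown();
        }
    }

    private static String lookupPosition(String desk) {
        try {
            Thread.sleep(100); // stand-in for a remote call or database read
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return desk + ": position loaded";
    }
}
```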

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Senior Engineer at Impetus Technologies, you will play a crucial role in designing, developing, and deploying scalable data processing applications using Java and Big Data technologies. Your responsibilities will include collaborating with cross-functional teams, mentoring junior engineers, and contributing to architectural decisions to enhance system performance and scalability.

Your key responsibilities will revolve around designing and maintaining high-performance applications, implementing data ingestion and processing workflows using frameworks like Hadoop and Spark, and optimizing existing applications for improved performance and reliability. You will also be actively involved in mentoring junior engineers, participating in code reviews, and staying updated with the latest technology trends in Java and Big Data.

To excel in this role, you should have strong proficiency in the Java programming language, hands-on experience with Big Data technologies such as Apache Hadoop and Apache Spark, and an understanding of distributed computing concepts. Additionally, you should have experience with data processing frameworks and databases, strong problem-solving skills, and excellent communication and teamwork abilities.

You will collaborate with a diverse team of skilled engineers, data scientists, and product managers who are passionate about technology and innovation. The team environment encourages knowledge sharing, continuous learning, and regular technical workshops to enhance your skills and keep you updated with industry trends.

Overall, as a Senior Engineer at Impetus Technologies, you will be responsible for designing and developing scalable Java applications for Big Data processing, ensuring code quality and performance, and troubleshooting and optimizing existing systems to enhance performance and scalability.

Qualifications:
- Strong proficiency in the Java programming language
- Hands-on experience with Big Data technologies such as Hadoop, Spark, and Kafka
- Understanding of distributed computing concepts
- Experience with data processing frameworks and databases
- Strong problem-solving skills
- Knowledge of version control systems and CI/CD pipelines
- Excellent communication and teamwork abilities
- Bachelor's or master's degree in Computer Science, Engineering, or a related field preferred

Experience: 7 to 10 years
Job Reference Number: 13131
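As a hedged illustration of the Spark-based processing such a role describes: a minimal batch aggregation using Spark's Java API. The input path, JSON format, and eventType column are assumptions, and a spark-sql dependency or spark-submit environment is required.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EventCountJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("event-count")
                .getOrCreate();

        // Path and column name are placeholders for whatever the ingestion
        // workflow actually lands on the data lake.
        Dataset<Row> events = spark.read().json("hdfs:///data/events/");
        events.groupBy("eventType").count().show();

        spark.stop();
    }
}
```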

Posted 1 month ago

Apply

6.0 - 15.0 years

0 Lacs

Karnataka

On-site

You have a unique opportunity to join as an Integration Architect specializing in Apache Nifi and Kubernetes. Your primary responsibilities will involve developing and performing automated builds, testing, and deployments in conjunction with Apache Nifi v2. This role requires a minimum of 15+ years of relevant experience and an in-depth understanding of various technologies and tools. Your expertise should include proficiency in the Linux CLI, extensive knowledge of SQL, and familiarity with ServiceNow CMDB. You must possess a good grasp of security principles, particularly OAuth basic/2.0, and IP networking concepts such as TCP, UDP, DNS, DHCP, firewalls, and IP routing. Additionally, experience with web services using SOAP/REST API, scripting languages like Bash/RegEx/Python/Groovy, and Java programming for custom code creation is essential. As an Integration Architect, you will be expected to build data integration workflows using NiFi, NiFi registry, and custom NiFi processors. Performance tuning of NiFi processing, working with Apache Kafka, and following Agile methodology are also crucial aspects of the role. Your responsibilities will extend to designing, deploying, and managing Kubernetes clusters, infrastructure-as-code tools like Crossplane, and container orchestration. Proficiency in GitOps practices, container monitoring/logging tools, networking principles, and identity/access management tools is highly desirable. You will play a pivotal role in maintaining Kubernetes clusters for open-source applications, implementing GitOps continuous delivery with ArgoCD, managing cloud resources with Crossplane API, and ensuring secure access with Keycloak. Your expertise in secrets management, API gateway management, persistent storage solutions, and certificate management will be invaluable for the organization. Furthermore, implementing security best practices, documenting procedures, and contributing to open-source projects are key elements of this dynamic role. The preferred qualifications for this position include a Bachelor's degree in computer science or a related field, Kubernetes certification, and knowledge of software-defined networking solutions for Kubernetes. Your soft skills, such as effective communication, stakeholder management, taking ownership, and autonomy, will be essential in leading technical discussions and resolving issues effectively. If you are passionate about integration architecture, possess a strong technical background, and are eager to work in a collaborative environment, this role offers a challenging yet rewarding opportunity to showcase your skills and contribute to cutting-edge projects in a fast-paced setting.,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for developing scalable web applications using Python (FastAPI), React.js, and cloud-native technologies. Specifically, you will work on building a low-code/no-code AI agent platform, designing an intuitive workflow UI, and integrating with LLMs, enterprise connectors, and role-based access controls. As a Full-Stack Developer, your responsibilities will include developing and optimizing APIs using FastAPI, integrating with LangChain, Pinecone/Weaviate vector databases, and enterprise connectors like Airbyte/Nifi for backend development. For frontend development, you will build an interactive drag-and-drop workflow UI using React.js along with supporting libraries like React Flow, D3.js, and TailwindCSS. You will also be tasked with implementing authentication mechanisms such as OAuth2, Keycloak, and role-based access controls for multi-tenant environments. Database design will involve working with PostgreSQL for structured data, MongoDB for unstructured data, and Neo4j for knowledge graphs. Your role will extend to DevOps and deployment using Docker, Kubernetes, and Terraform across various cloud platforms like Azure, AWS, and GCP. Performance optimization will be crucial as you strive to enhance API performance and frontend responsiveness for an improved user experience. Collaboration with AI and Data Engineers will be essential to ensure seamless integration of AI models. To excel in this role, you should have at least 5 years of experience in FastAPI, React.js, and cloud-native applications. A strong understanding of REST APIs, GraphQL, and WebSockets is required. Experience with JWT authentication, OAuth2, and multi-tenant security is essential. Proficiency in databases such as PostgreSQL, MongoDB, Neo4j, and Redis is expected. Knowledge of workflow automation tools like n8n, Node-RED, and Temporal.io will be beneficial. Familiarity with containerization tools (Docker, Kubernetes) and CI/CD pipelines is preferred. Any experience with Apache Kafka, WebSockets, or AI-driven chatbots would be considered a bonus.,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

StoneX Ltd is an FCA authorized and regulated firm that excels in trade execution, clearing, and advisory services, primarily focusing on the Commodities and Foreign Exchange sectors. As a proud member of the Fortune 500 StoneX Inc. family, we provide comprehensive services globally, spanning Commodities, Capital Markets, Currencies, and Asset Management. Our global team operates across Europe, the US, and Asia Pacific, where innovative minds collaborate in cross-functional teams to shape the future of financial markets. Technology is at the core of our competitive edge, driving innovation and value creation. Our engineering teams leverage cutting-edge tools to deliver impactful solutions to production at pace, through rapid iteration and close collaboration with business stakeholders.

We are currently seeking a hands-on Senior Software Engineer with experience in building high-performing, scalable, enterprise-grade applications. In this role, you will be involved in architecture and development across all tiers of the application stack, focusing on low-latency, mission-critical applications. You will work with a talented team of engineers, collaborating on application architecture, development, testing, and design. Additionally, you will technically lead a team of highly skilled software engineers and build relationships with key stakeholders across a diverse user base.

Key responsibilities include:
- Primary focus on server-side development
- Contributing to all phases of the development lifecycle within an Agile methodology
- Writing well-designed, testable, efficient code
- Ensuring designs align with specifications
- Preparing and releasing software components
- Supporting continuous improvement by exploring new technologies for architectural review

Qualifications required for this role include:
- Extensive experience in developing complex distributed event-based microservices using Java/Spring
- Designing and maintaining robust, scalable, high-performance Java applications
- Developing RESTful APIs and gRPC services
- Experience with containerization (Docker, Kubernetes) and cloud platforms (Azure, AWS)
- Exposure to distributed messaging/streaming platforms (Apache Kafka)
- Building CI/CD pipelines (Azure DevOps, GHE)
- Familiarity with TDD/BDD and testing frameworks
- Strong knowledge of relational (SQL) and NoSQL databases
- Working as part of a global Agile team
- Knowledge of reactive programming is a plus
- Proficiency in English

Standout factors for this role include:
- Minimum 5 years of experience, ideally within financial services or FinTech
- Experience with Datadog observability, APM, and alerting
- Open Policy Agent (OPA) experience

Education:
- BS/MS degree in Computer Science, Engineering, or a related subject

Working Environment:
- Hybrid (2 days from home, 3 days from the office)
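Purely as an illustration of the server-side REST development described above, a minimal Spring Boot controller sketch; the /quotes endpoint, symbol, and hard-coded prices are invented for the example, and the spring-boot-starter-web dependency is assumed.

```java
import java.util.Map;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class QuoteServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(QuoteServiceApplication.class, args);
    }

    // Illustrative endpoint; in a real trading service the quote would come
    // from a pricing component rather than hard-coded values.
    @GetMapping("/quotes/{symbol}")
    public Map<String, Object> quote(@PathVariable String symbol) {
        return Map.of("symbol", symbol, "bid", 1.0841, "ask", 1.0843);
    }
}
```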

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Kochi, Kerala

On-site

You will be joining Dbiz, a high-performing product and engineering company that collaborates with organizations to develop digital solutions using cutting-edge technology. Known for our innovative approach, we leverage technology in various impactful ways. As a Full Stack Developer at Dbiz, you will play a crucial role in producing scalable software solutions. Working within a cross-functional team, you will be involved in the entire software development life cycle, from conceptualization to deployment.

Your responsibilities will include developing, enhancing, modifying, and maintaining applications within the Corporate Communications domain. This entails designing, coding, testing, debugging, and documenting programs, as well as providing support for the corporate systems architecture. You will closely collaborate with business partners to define requirements for system applications, drawing upon your in-depth knowledge of development tools and languages. In this individual contributor role, you will be recognized as a content expert by your peers. The position typically requires a minimum of 6-8 years of relevant experience.

Key Responsibilities:
- Possess deep knowledge and hands-on experience with Java and the UI stack.
- Demonstrate proficiency in web technologies, frameworks, and tools such as HTML, CSS, JavaScript, React, NodeJS, XML, jQuery, and Spring.
- Be well-versed in state management, Redux, reducers, JavaScript, and CSS in ReactJS/UI.
- Have practical experience in test-driven development and constant refactoring within a continuous integration environment.
- Exhibit experience and knowledge of SQL and relational databases.
- Be familiar with agile practices such as Scrum, Kanban, or XP.
- Display expertise in functional analysis.
- Showcase excellent communication and teamwork skills.
- Take end-to-end ownership in driving the team towards successful delivery.
- Uphold a performance and productivity orientation to ensure high-quality outcomes.
- Demonstrate strong analytical skills and problem-solving abilities.
- Possess proactive and flexible working approaches.
- Apply innovative thinking to solve problems.
- Be efficient, well-organized, and detail-oriented, with strong interpersonal skills.

Mandatory Skill Sets:
- Minimum of 5 years of proven experience as a Java Developer or in a related role.
- Strong understanding of the Java programming language and related frameworks like Spring Boot and Hibernate.
- Solid knowledge of object-oriented programming principles, design patterns, and software development methodologies.
- Familiarity with UI technologies like Angular and TypeScript.
- Hands-on experience with microservices and use of tools like SonarQube and Jenkins.
- Understanding of databases like MySQL and MongoDB.
- Proficiency in Java development tools (e.g., Eclipse, IntelliJ) and version control systems (e.g., Git, SVN).
- Strong problem-solving skills and attention to detail.
- Excellent collaboration and communication skills.
- Ability to work effectively in a fast-paced, team-oriented environment.
- Experience with Agile development methodologies (e.g., Scrum) is advantageous.
- Hands-on experience deploying applications on AWS or similar cloud platforms is highly preferred.
- Good understanding of CI/CD pipelines.
- Exposure to messaging tools like Apache Kafka or RabbitMQ is desirable.
- Bachelor's or master's degree in Computer Science, Software Engineering, or a related field.
- 4-7 years of experience.

Life at Dbiz:
- Competitive salary and attractive benefits.
- Dynamic and innovative work environment.
- Opportunities for personal growth and development.
- Engaging and collaborative company culture.

This is a full-time position that offers a challenging and rewarding opportunity to contribute to the cutting-edge technology solutions developed by Dbiz.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

indore, madhya pradesh

On-site

The AM3 Group is looking for a highly skilled Senior Java Developer with a strong background in AWS cloud services to join our dynamic team. In this role, you will have the opportunity to create and manage modern, scalable, cloud-native applications using Java (up to Java 17), Spring Boot, Angular, and a comprehensive range of AWS tools.

As a Senior Java Developer at AM3 Group, your responsibilities will include developing full-stack applications utilizing Java, Spring Boot, Angular, and RESTful APIs. You will build and deploy cloud-native solutions with AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway, and design and implement microservices architectures for enhanced scalability and resilience. You will also create and maintain CI/CD pipelines using tools like Jenkins, GitHub Actions, AWS CodePipeline, and Terraform, as well as containerize applications with Docker and manage them through Kubernetes (EKS). Monitoring and optimizing performance using AWS CloudWatch, X-Ray, and the ELK Stack, working with Apache Kafka and Redis for real-time event-driven systems, and conducting unit/integration testing with JUnit, Mockito, and Jasmine, along with API testing via Postman, are also key aspects of the role. Collaboration within Agile/Scrum teams to deliver features in iterative sprints is an essential part of your responsibilities.

The ideal candidate should possess a minimum of 8 years of Java development experience with a strong understanding of Java 8/11/17, expertise in Spring Boot, Hibernate, and microservices, and solid experience with AWS infrastructure and serverless services (Lambda, EC2, S3, etc.). Frontend development exposure with Angular (v2-12), JavaScript, and Bootstrap; hands-on experience with CI/CD, GitHub Actions, Jenkins, and Terraform; familiarity with SQL (MySQL, Oracle) and NoSQL (DynamoDB, MongoDB) databases; and knowledge of SQS, JMS, and event-driven architecture are required skills. Additionally, familiarity with DevSecOps and cloud security best practices is essential. Preferred qualifications include experience with serverless frameworks (AWS Lambda), familiarity with React.js, Node.js, or Kotlin, and exposure to Big Data, Apache Spark, or machine learning pipelines.

Join our team at AM3 Group to work on challenging and high-impact cloud projects, benefit from competitive compensation and benefits, enjoy a flexible work environment, be part of a culture of innovation and continuous learning, and gain global exposure through cross-functional collaboration. Apply now to be part of a future-ready team that is shaping cloud-native enterprise solutions! For any questions or referrals, please contact us at careers@am3group.com. To learn more about us, visit our website at https://am3group.com/.
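As a hedged sketch of the serverless piece of this stack, a bare-bones AWS Lambda handler in Java; it assumes the aws-lambda-java-core dependency, and the generic map event shape, field name, and return value are illustrative rather than taken from the posting.

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import java.util.Map;

// A generic map stands in for whatever API Gateway or SQS actually delivers.
public class OrderEventHandler implements RequestHandler<Map<String, Object>, String> {
    @Override
    public String handleRequest(Map<String, Object> event, Context context) {
        context.getLogger().log("Received event: " + event);
        // A real handler would validate the payload and write to RDS/DynamoDB.
        return "processed " + event.getOrDefault("orderId", "unknown");
    }
}
```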

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a Software Development Engineer with expertise in Java and strong AWS experience, you will be responsible for designing, developing, and deploying robust cloud-native applications. With a minimum of 5+ years of Java development experience, including Java 8/11/17 features, you will utilize your skills in Spring Boot, Angular, and RESTful APIs to build full-stack Java applications. Additionally, you will work on AWS-based cloud solutions, leveraging services such as EC2, S3, Lambda, DynamoDB, API Gateway, and CloudFormation. Your key responsibilities will include developing and optimizing microservices architectures for high availability and scalability, implementing CI/CD pipelines using Jenkins, AWS CodePipeline, and Terraform, and collaborating in an Agile/Scrum environment to drive innovation and efficiency. You will also work with Docker & Kubernetes (EKS) for containerized applications, optimize system performance using monitoring tools like AWS CloudWatch, X-Ray, and ELK Stack, and ensure robust testing and quality assurance using tools like JUnit, Mockito, Jasmine, and Postman. You should have a strong understanding of AWS cloud services and infrastructure, experience with CI/CD pipelines, GitHub Actions, Terraform, and Jenkins, and proficiency in front-end development using Angular, JavaScript, and Bootstrap. Hands-on experience with SQL (Oracle, MySQL) and NoSQL (MongoDB, DynamoDB) databases, as well as knowledge of APIs, messaging systems (SQS, JMS), and event-driven architectures, will be essential for this role. Preferred qualifications include experience in serverless architecture using AWS Lambda, familiarity with React.js, Node.js, and Kotlin, and a background in machine learning, data analytics, or big data processing (Apache Spark, Storm). If you are looking to work in a dynamic environment where you can showcase your Java development skills and AWS expertise, this position offers an exciting opportunity to contribute to the development of cutting-edge cloud-native applications.,

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Kochi, Kerala

On-site

DifferentByte Technologies Pvt Ltd, a GENAI consultancy firm located in Cochin, Kerala, is in immediate need of a talented Python Developer/Backend Developer to join their team. The primary focus of this position is to contribute to the development of an AI-based high-frequency trading platform. This role presents a unique opportunity for individuals looking to establish a rewarding career in a technology-oriented trading organization. As a Python Developer at DifferentByte Technologies Pvt Ltd, your responsibilities will include writing effective and scalable code, enhancing back-end components to optimize responsiveness and performance, integrating user-facing features into applications, as well as testing and debugging programs. You will also be tasked with improving the functionality of existing systems, implementing security measures, evaluating and prioritizing feature requests, and collaborating with internal teams to understand user needs and offer technical solutions. The ideal candidate should possess a minimum of 2 years of demonstrable experience as a Python Developer within a product-based company. Proficiency in Python frameworks such as Fast API and Flask, along with knowledge of PostgreSQL and familiarity with Supabase, are essential requirements for this role. Additionally, expertise in queuing systems like Apache Kafka and Airflow, as well as strong problem-solving skills, are highly valued. Candidates with experience in artificial intelligence, machine learning, deep learning, and technologies like TensorFlow and PyTorch will be preferred. A degree in B.Tech/B.E. in Computer Science, Engineering, or a related field is a prerequisite. This position is open to candidates currently residing in Kerala or those willing to relocate to Cochin. Apart from the technical qualifications, DifferentByte Technologies Pvt Ltd offers a collaborative and informal work environment with a casual dress code and a flat structure. Joining the highly driven and Agile team at DifferentByte presents an excellent opportunity for career advancement, competitive salary, and a comprehensive benefits package. If you are looking to be a part of an innovative team that values code quality, testing, and making a significant impact on the future of AI-based trading platforms, apply now by contacting hr@differentbyte.in.,

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

BizViz is a company that offers a comprehensive view of a business's data, catering to various industries and meeting the diverse needs of business executives. With a dedicated team of over 50 professionals working on the BizViz platform for several years, the company aims to develop technological solutions that give our clients a competitive advantage. At BizViz, we are committed to the success of our customers, striving to create applications that align with their unique visions and requirements. We steer clear of generic ERP templates, offering businesses a more tailored solution.

As a Big Data Engineer at BizViz, you will join a small, agile team of data engineers focused on building an innovative big data platform for enterprises dealing with critical data management and diverse application stakeholders at scale. The platform handles data ingestion, warehousing, and governance, allowing developers to create complex queries efficiently. With features like automatic scaling, elasticity, security, logging, and data provenance, our platform empowers developers to concentrate on algorithms rather than administrative tasks. We are seeking engineers who are eager for technical challenges, to enhance our current platform for existing clients and develop new capabilities for future customers.

Key Responsibilities:
- Work as a Senior Big Data Engineer within the Data Science Innovation team, collaborating closely with internal and external stakeholders throughout the development process.
- Understand the needs of key stakeholders to enhance or create new solutions related to data and analytics.
- Collaborate in a cross-functional, matrix organization, even in ambiguous situations.
- Contribute to scalable solutions using large datasets alongside other data scientists.
- Research innovative data solutions to address real market challenges.
- Analyze data to provide fact-based recommendations for innovation projects.
- Explore Big Data and other unstructured data sources to uncover new insights.
- Partner with cross-functional teams to develop and execute business strategies.
- Stay updated on advancements in data analytics, Big Data, predictive analytics, and technology.

Qualifications:
- BTech/MCA degree or higher.
- Minimum 5 years of experience.
- Proficiency in Java, Scala, and Python.
- Familiarity with Apache Spark, Hadoop, Hive, Spark SQL, Spark Streaming, and Apache Kafka.
- Knowledge of predictive algorithms, MLlib, Cassandra, RDBMS (MySQL, MS SQL, etc.), NoSQL, columnar databases, and Bigtable.
- Deep understanding of search engine technology, including Elasticsearch/Solr.
- Experience with Agile development practices such as Scrum.
- Strong problem-solving skills for designing algorithms related to data cleaning, mining, clustering, and pattern recognition.
- Ability to work effectively in a matrix-driven organization under varying circumstances.
- Desirable personal qualities: creativity, tenacity, curiosity, and a passion for technical excellence.

Location: Bangalore

To apply for this position, interested candidates can send their applications to careers@bdb.ai.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

We are seeking an experienced software engineer to join our team at Grid Dynamics. The ideal candidate should be proficient in Java, Spring Framework, and either Google Cloud Platform (GCP) or Azure. Hands-on experience with BigQuery, Apache Kafka, and GitHub/GitHub Actions is preferred, along with a strong background in developing RESTful APIs. If you are passionate about working with cutting-edge cloud technologies and building scalable solutions, we would love to connect with you! The ideal candidate should have at least 6-9 years of experience in Java and extensive expertise in the Spring Boot Framework. A strong background working with either MS Azure, Google Cloud Platform (GCP), or AWS is required, along with a solid understanding of data integration patterns and ETL processes. Experience with unit and integration testing (e.g., JUnit, Mockito) is essential for this role, along with knowledge of distributed systems architecture. Strong analytical and problem-solving skills are necessary to tackle complex challenges effectively. Immediate joiners are preferred for this position, and it can be hired across Bangalore, Hyderabad, or Chennai.,

Posted 1 month ago

Apply

6.0 - 8.0 years

19 - 35 Lacs

Hyderabad

Work from Office

We are hiring a Senior Data Engineer with 6-8 years of experience. Education: Candidates from premier institutes such as IIT, IIM, IISc, NIT, IIIT, and other top-ranked institutions in India are highly encouraged to apply.

Posted 1 month ago

Apply

3.0 - 5.0 years

8 - 10 Lacs

Chennai

Work from Office

We are seeking a skilled Java Developer to join our dynamic team. The ideal candidate will have experience in designing, developing, and maintaining Java-based applications and will work with cross-functional teams to deliver quality software solutions. Benefits: Provident fund.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 - 0 Lacs

Coimbatore, Tamil Nadu

On-site

You have the opportunity to apply for the position of Senior ETL and Feature Engineer at PrivaSapien, based in Bangalore. PrivaSapien is at the forefront of Privacy Enhancing & Responsible AI Technologies, where you will play a crucial role in setting up the big data ecosystem for the world's first privacy red teaming and blue teaming platform. As an individual contributor, you will work on cutting-edge privacy platform requirements with clients globally, spanning across various industry verticals. Joining as one of the early employees, you will receive a significant ESOP option and collaborate with brilliant minds from prestigious institutions such as IISc and IIMs. Your responsibilities will include developing and maintaining ETL pipelines for processing large-scale datasets, creating a Python connector for ETL applications, and demonstrating proficiency in AWS Glue. You will be involved in ETL pipeline development for AI/ML workloads, orchestrating scaling, and resource management. Additionally, you will work on managing unstructured data tasks, optimizing query performance in SQL databases, and integrating multiple databases into the ETL pipeline within a multi-cloud environment. To be eligible for this role, you should have a minimum of 4 years of hands-on experience in setting up ETL and feature engineering pipelines on cloud or big data ecosystems. Proficiency in Apache Spark, pyspark, Apache Airflow, and AWS Glue is essential, along with expertise in at least one ETL tool. Strong programming skills in Python, familiarity with data manipulation libraries, and experience in handling various data types are required. Furthermore, you should possess knowledge in SQL databases, networking, security, and cloud platforms. The interview process will consist of a technical round with the Director, an assessment, an assessment review round with the Senior Backend person, and an HR round. To apply for this opportunity, you need to register or login on the portal, fill out the application form, clear the video screening, and click on "Apply" to be shortlisted. Your profile will then be shared with the client for the interview round upon selection. At Uplers, our aim is to simplify and expedite the hiring process, assisting talents in finding and applying for relevant contractual onsite opportunities. We provide support for any challenges faced during the engagement and assign a dedicated Talent Success Coach to guide you throughout the process. If you are prepared for a new challenge, a conducive work environment, and an opportunity to elevate your career, seize this chance today. We look forward to welcoming you aboard!,

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You should have hands-on experience in deploying and managing large-scale dataflow products such as Cribl, Logstash, or Apache NiFi. Additionally, you should be proficient in integrating data pipelines with cloud platforms like AWS, Azure, and Google Cloud, as well as on-premises systems. It is essential to have experience in developing and validating field extraction using regular expressions. A strong understanding of operating systems and networking concepts is required, including Linux/Unix system administration, HTTP, and encryption. You should possess knowledge of software version control, deployment, and build tools following DevOps SDLC practices, such as Git, Jenkins, and Jira. Strong analytical and troubleshooting skills are crucial for this role, along with excellent verbal and written communication skills. An appreciation of Agile methodologies, specifically Kanban, is also expected.

Desirable skills for this position include enterprise experience with a distributed event streaming platform like Apache Kafka, AWS Kinesis, Google Pub/Sub, or MQ. Experience in infrastructure automation and integration, preferably using Python and Ansible, would be beneficial. Familiarity with cybersecurity concepts, event types, and monitoring requirements is a plus, as is experience in parsing and normalizing data in Elasticsearch using the Elastic Common Schema (ECS).
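To make the regex-based field extraction concrete, here is a small Java sketch (not from the posting) that pulls fields out of an Apache-style access log line using named capture groups; the log format, pattern, and field names are assumptions about what such a pipeline might parse.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FieldExtractionDemo {
    // Illustrative pattern for an Apache-style access log line; real
    // pipelines (Cribl, Logstash, NiFi) would hold this in pipeline config.
    private static final Pattern ACCESS_LOG = Pattern.compile(
            "^(?<clientIp>\\S+) \\S+ \\S+ \\[(?<timestamp>[^\\]]+)\\] \"(?<method>\\S+) (?<path>\\S+) [^\"]*\" (?<status>\\d{3})");

    public static void main(String[] args) {
        String line = "203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] \"GET /health HTTP/1.1\" 200";
        Matcher m = ACCESS_LOG.matcher(line);
        if (m.find()) {
            // Extracted fields would normally be forwarded as structured events.
            System.out.println("client=" + m.group("clientIp")
                    + " method=" + m.group("method")
                    + " path=" + m.group("path")
                    + " status=" + m.group("status"));
        }
    }
}
```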

Posted 1 month ago

Apply

0.0 - 4.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Back End Developer at Cequence Security in India - Pune, you will contribute to building products that safeguard web applications and APIs worldwide from various threats including online fraud, business logic attacks, exploits, and sensitive data exposure. Our innovative platform caters to global enterprise customers in sectors like finance, banking, retail, social media, travel, and hospitality by offering a unified solution for runtime API visibility, security risk monitoring, and behavioral fingerprint-based threat prevention. Cequence Security stands out for its ability to consistently detect and prevent evolving online attacks without the need for extensive application integration, making it a trusted choice for enterprises seeking verified security solutions. If you are passionate about global security, enjoy collaborating with a dedicated team, and are eager to contribute to the growth of a dynamic organization, we welcome your application. As a Backend Developer at Cequence Security, your role involves designing, developing, and maintaining critical backend components of our security products. You will play a key part in architecting, designing, and implementing new product features from inception to final execution, including working on backend services, data pipelines, and network components. Collaboration with Architects, Engineers, Data Scientists, and Security experts will be essential as you tackle challenging problems and bring new features to life. Your responsibilities will include: - Overseeing projects from ideation to completion. - Designing server-side architecture enhancements for existing backend services. - Developing new services in alignment with the overall architecture. - Creating and implementing backend RESTful services and APIs. - Enhancing data pipelines for increased throughput and scalability. - Improving the high throughput data plane for our products. - Operating within an Agile framework to coordinate tasks and engage with team members. - Participating in a Test-Driven Development environment to produce reliable, well-documented code. Requirements for this role: - Bachelor's degree or equivalent experience in Computer Science or related fields. - Proficiency in JVM languages like Java, Kotlin & Scala. - Familiarity with big data tools such as Elasticsearch and Apache Kafka. - Experience with high-throughput networking components like proxies, firewalls, etc. - Background in network and/or application security domains is advantageous. - Strong problem-solving abilities and attention to detail. - Sound understanding of data structures, system design, and algorithms. - Exposure to Cloud services like AWS EC2, EMR, EKS, etc., is a bonus. - Knowledge of Docker and Kubernetes is desirable for this role.,

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
