1.0 - 5.0 years
0 Lacs
karnataka
On-site
The primary responsibility of Tax Operations is to calculate withholding and tax liability for clients across all business units in the firm. The team also handles client documentation and regulatory information reporting. Sedona, a high-volume transaction processing application, centralizes the tax logic, calculates withholding and tax in real time, and facilitates the timely reporting of millions of transactions to tax authorities globally.

You will be responsible for designing and architecting systems for Tax functions, including client documentation, reporting, and tax withholding, with high scalability, high throughput, and large data volumes. You will contribute to the technical design of the strategic engine by evaluating and employing distributed caching for multi-threaded, concurrent, highly scalable processing and complex matching algorithms for flexible, optimized, and extremely fast matching. Additionally, you will design and implement user workflows using BPMN notation, manage project deliverables and tasks during development, user acceptance testing (UAT), and production implementation phases, and perform memory, thread, and CPU profiling of all core features for high-volume processing.

Qualifications for this role include 3+ years of Java development, 1+ years of service-oriented architecture development, knowledge of web UI technologies such as React and Angular, familiarity with Spring frameworks, strong skills in databases such as DB2, Sybase IQ, and MongoDB, knowledge of BPMN modeling and design, and 1+ years of experience implementing systems with automated testing using JUnit, FitNesse, Karate, etc. Preferred qualifications include experience in the financial services industry and knowledge of messaging technologies such as JMS and Apache Kafka.

Goldman Sachs is committed to fostering diversity and inclusion in the workplace and beyond. The firm provides various opportunities for professional and personal growth, including training and development, firmwide networks, benefits, wellness programs, personal finance offerings, and mindfulness programs. The company is also dedicated to finding reasonable accommodations for candidates with special needs or disabilities during the recruiting process. Learn more about Goldman Sachs at GS.com/careers.
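The concurrent, cache-backed processing this posting describes can be illustrated with a minimal, generic Java sketch using java.util.concurrent; it is not Goldman Sachs's Sedona engine, and the class names, reference data, and rates below are hypothetical.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: cache per-country withholding rates and process a batch concurrently.
public class WithholdingMatcher {

    record Transaction(String clientId, String countryCode, double grossAmount) {}

    // Country -> withholding rate, computed at most once per key and shared across threads.
    private final Map<String, Double> rateCache = new ConcurrentHashMap<>();

    private double lookupRate(String countryCode) {
        // computeIfAbsent keeps the expensive lookup thread-safe and evaluated once per key.
        return rateCache.computeIfAbsent(countryCode, this::loadRateFromReferenceData);
    }

    private double loadRateFromReferenceData(String countryCode) {
        // Placeholder for a reference-data or database call; rates here are made up.
        return "US".equals(countryCode) ? 0.30 : 0.15;
    }

    public void process(List<Transaction> batch) throws InterruptedException {
        ExecutorService pool =
                Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
        for (Transaction txn : batch) {
            pool.submit(() -> {
                double withheld = txn.grossAmount() * lookupRate(txn.countryCode());
                System.out.printf("%s: withhold %.2f%n", txn.clientId(), withheld);
            });
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        new WithholdingMatcher().process(List.of(
                new Transaction("client-1", "US", 1_000.0),
                new Transaction("client-2", "IN", 2_500.0)));
    }
}
```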
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
We are seeking full stack Senior Solution Engineers with a solid background in .NET development to join our team. In this role, you will utilize your expertise to design, implement, and maintain complex software applications. We are looking for detail-oriented professionals who excel both independently and in team settings, delivering high-quality solutions with precision and passion. Fluency in UX, web services, database, and infrastructure layers is essential for success in this position. We value individuals with a strong work ethic, a sense of urgency, and a commitment to timely, quality delivery. This role is based in Mumbai, India, and knowledge of or interest in AI/ML, LLMs, and Prompt Engineering is a strong plus.

PristineAI is a leader in integrated optimization solutions for retail pricing, promotions, personalization, and assortment. Headquartered in Woburn, MA, with AI Engineering labs in Mumbai and Chennai, we are a team of innovators, subject-matter experts, and AI specialists dedicated to pushing the boundaries of AI to create groundbreaking enterprise solutions. Our platforms include Kai, a Conversational AI platform democratizing AI for decision-makers, and Presto, a Predictive Optimization platform that significantly improves retailers' profitability. Our SaaS platform processes over 30 million new customer interactions daily, representing 12,000 stores/websites and 30 million customers.

**Role**: Senior Solution Engineer - Microsoft .NET, Mumbai, India
**Experience**: 3 - 5 years
**Qualification**: Bachelor's or Master's in Computer Science/Information Technology

**Key Attributes and Abilities**:
- Strong work ethic, productivity focus, and client orientation
- Quality and process orientation
- Excellent communication and presentation skills
- Troubleshooting technical issues and performance engineering
- Exceptional problem-solving abilities
- Strong logical and technical design skills
- Proficiency in writing clean, reliable, efficient, and maintainable code
- Adaptability to dynamic environments
- Collaborative mindset
- Hands-on experience from concept to deployment and support
- Proficiency in rigorous reviews and offering quality feedback
- Analyzing and resolving software defects and performance bottlenecks
- Coursework in AI and ML is advantageous

**Hands-on Expertise**:
- Microsoft: .NET Core, ASP.NET, C#, NUnit
- MVC, Web Services, REST API, JavaScript, jQuery, CSS, React, testing frameworks
- Tomcat, IIS, Docker, Kubernetes
- Oracle, MySQL, NoSQL, Apache Kafka, in-memory processing
- GitHub Copilot, Git, JIRA, testRigor, JUnit, Jenkins, OAuth
- O/S: Linux, Unix, Windows

**Job Responsibilities**:
- Contribute to the full solution lifecycle
- Practice PristineAI's formal solution life cycle process
- Create, review, and certify requirements, designs, code, and user documentation
- Architect robust data structures and define specifications
- Code with passion and a focus on clean, scalable, and dependable solutions
- Speed up processes and automate tasks
- Create and execute meticulous test scenarios for reliability
- Focus on user experience and interface refinement
- Document technical intricacies and project milestones accurately

**Remuneration and Benefits**:
- Start-up environment shaping the industry's future
- Work with talented and driven Retail Experts, AI Experts, and Solution Engineers
- Two weeks of annual study time
- Competitive salaries, generous equity, and health insurance coverage

PristineAI is an equal opportunity employer. We value diversity and are committed to fostering an inclusive environment for all employees.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a Senior Full Stack Developer at Barclays, where you will spearhead the evolution of the digital landscape, driving innovation and excellence. You will utilize cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You will be assessed on the critical skills relevant for success in the role, including job-specific technical skillsets and the experience needed to meet business requirements.

To be successful as a Senior Full Stack Developer, you should have experience with:
- 6+ years of experience with Java 8+: core language proficiency, lambdas, streams.
- Spring Framework: Spring Boot, Spring MVC.
- Database skills: SQL, JPA/Hibernate, database design.
- RESTful APIs: design, development, and consumption.
- Build tools: Maven or Gradle.
- Version control: Git.
- CI/CD: Jenkins, GitLab.
- Testing: JUnit, Mockito, integration testing.
- Message queues: Apache Kafka, Solace.

Desirable skillsets:
- Frontend development: basics of JavaScript, Angular.
- Microservices: Spring Cloud, service mesh.
- Caching.
- Containerization: Docker, Kubernetes basics.
- Cloud platforms: AWS, Azure, or GCP.
- Monitoring: application performance monitoring.
- System design: scalability patterns, load balancing.
- Performance optimization: profiling, tuning.
- Soft skills: problem-solving, debugging, code review, collaboration.
- Agile/Scrum methodology.
- Communication with stakeholders.

This role will be based out of Pune.

Purpose of the role: to design, develop, and improve software, utilizing various engineering methodologies, that provides business, platform, and technology capabilities for customers and colleagues.

Accountabilities:
- Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools.
- Cross-functional collaboration with product managers, designers, and engineers to define software requirements and ensure alignment with business objectives.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and contributing to the organization's technology communities.
- Adherence to secure coding practices and implementation of effective unit testing practices.

Assistant Vice President expectations:
- Consult on complex issues and provide advice to support the resolution of escalated issues.
- Identify ways to mitigate risk and develop new policies/procedures.
- Take ownership for managing risk and strengthening controls.
- Collaborate with other areas and engage in complex analysis of data.
- Communicate complex information and influence stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values and the Barclays Mindset.
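As a small, generic illustration of the "lambdas, streams" proficiency listed above (the Trade record and field names are hypothetical, not Barclays code):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class TradeSummary {

    record Trade(String desk, double notional) {}

    // Group trades by desk and total the notional per desk, using streams and lambdas.
    static Map<String, Double> notionalByDesk(List<Trade> trades) {
        return trades.stream()
                .collect(Collectors.groupingBy(Trade::desk,
                        Collectors.summingDouble(Trade::notional)));
    }

    public static void main(String[] args) {
        List<Trade> trades = List.of(
                new Trade("rates", 1_000_000),
                new Trade("fx", 250_000),
                new Trade("rates", 500_000));
        System.out.println(notionalByDesk(trades)); // {rates=1500000.0, fx=250000.0}
    }
}
```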
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Software Engineering Manager at SAP, you will lead the App2App Integration team within SAP Business Data Cloud. Your role is pivotal in driving the design, development, and delivery of integration services that facilitate real-time, secure, and scalable data exchange across SAP's enterprise applications and unified data fabric.

Your primary responsibilities will include owning end-to-end delivery of App2App integration components and services using Java, Apache Kafka, RESTful APIs, and modern data pipelines. You will accelerate the App2App integration roadmap by identifying reusable patterns, driving platform automation, and establishing best practices. Additionally, you will drive the evolution of ETL pipelines and real-time data processing systems to support scalable, low-latency data flows across SAP's platforms.

Collaboration with product owners, solution architects, and global engineering teams to translate business requirements into technical execution plans will be a key aspect of your role. You will also drive the integration of modern data engineering tools such as Databricks, Spark, or similar technologies to enrich and accelerate data workflows. Ensuring the delivery of robust, secure, and performant services with strong observability, testing, and monitoring practices will be essential.

To be successful in this role, you should have 10+ years of hands-on experience in backend development using Java and distributed systems. Proven experience in ETL design, data transformation pipelines, and large-scale data integration systems is required. Technical expertise with messaging frameworks like Apache Kafka and event-driven architecture is crucial, along with familiarity with modern cloud-native data processing platforms such as Databricks or Apache Spark. Moreover, experience working in large-scale enterprise software environments, ideally involving SAP applications or data platforms, is beneficial. Strong interpersonal and communication skills, the ability to influence across global, cross-functional teams, and proven leadership capabilities to build and lead diverse engineering teams in an agile environment are highly valued. A degree in Computer Science, Software Engineering, or a related field is preferred.

Join our team at SAP, where we foster a positive culture focused on collaboration, learning, accountability, and innovation. As part of the Business Data Cloud organization in Bangalore, India, you will contribute to cutting-edge engineering efforts in a collaborative, inclusive, and high-impact environment that enables innovation and integration across SAP's data platforms.

At SAP, we believe in unleashing all talent and creating a better and more equitable world. If you are ready to bring out your best and contribute to SAP's mission of helping the world run better through innovative technology solutions, we invite you to apply and be a part of our diverse and inclusive team. SAP is proud to be an equal opportunity workplace and an affirmative action employer. We are committed to Equal Employment Opportunity values and provide accessibility accommodations to applicants with disabilities. Should you require assistance during the application process, please reach out to the Recruiting Operations Team at Careers@sap.com.
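A minimal sketch of the kind of Kafka-based event publishing such App2App integration work typically involves, using the plain Apache Kafka producer client; the topic, key, broker address, and payload below are hypothetical and not SAP's actual interfaces:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class App2AppEventPublisher {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all");              // wait for all in-sync replicas
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true); // avoid duplicates on retry

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by business entity so events for the same entity stay ordered within a partition.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("app2app.business-events", "order-42",
                            "{\"event\":\"OrderCreated\",\"orderId\":\"42\"}");
            producer.send(record, (metadata, ex) -> {
                if (ex != null) ex.printStackTrace();
                else System.out.printf("sent to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
            });
        }
    }
}
```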
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will be joining a team of passionate researchers, engineers, and designers dedicated to transforming the way research-intensive projects are conducted. Our goal is to reduce cognitive load and facilitate the conversion of information into knowledge. As part of our engineering team, you will contribute to the development of a scalable platform that manages large volumes of data, incorporates AI processing capabilities, and engages users globally. We firmly believe in the power of research to improve the world, and we strive to make the research process accessible and enjoyable. As we continue to expand, we are seeking talented engineers to join our team.

We are currently in search of an experienced Backend Application Developer proficient in Java and Python. Your role will involve enhancing our product, ensuring seamless scalability to accommodate a growing global user base, and exploring new business opportunities.

Key Responsibilities:
- Develop high-performing, scalable, enterprise-grade applications using Java and Python (specifically Django).
- Utilize the Spring Boot framework and the Java Concurrency Utilities in application development.
- Implement micro-services architecture and design RESTful APIs.
- Collaborate with the team to explore additional technologies such as Apache Kafka and AWS cloud computing.
- Familiarity with front-end frameworks like Angular, Vue, or React is a plus.

Qualifications:
- 3-5 years of experience in backend application development, with a focus on scalability.
- Proficiency in Java and Python, with a strong emphasis on Python (Django).
- Working knowledge of the Spring Boot framework and the Java Concurrency Utilities.
- Understanding of micro-services architecture and RESTful APIs.
- Knowledge of or exposure to Apache Kafka and AWS cloud computing is advantageous.
- Experience with front-end frameworks like Angular, Vue, or React would be beneficial.

If you are a proactive and innovative developer with a passion for building cutting-edge solutions, we invite you to join our dynamic team and contribute to our mission of simplifying and enhancing the research process.
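As a generic illustration of the Java Concurrency Utilities mentioned above, here is a small fan-out/fan-in sketch with CompletableFuture; the enrichment call and document IDs are hypothetical:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.Collectors;

public class DocumentEnricher {

    // Hypothetical blocking call, e.g. an external metadata service.
    static String fetchMetadata(String docId) {
        return docId + ":metadata";
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(8);
        List<String> docIds = List.of("d1", "d2", "d3");

        // Fan out one asynchronous call per document, then join the results.
        List<CompletableFuture<String>> futures = docIds.stream()
                .map(id -> CompletableFuture.supplyAsync(() -> fetchMetadata(id), pool))
                .collect(Collectors.toList());

        List<String> enriched = futures.stream()
                .map(CompletableFuture::join)
                .collect(Collectors.toList());

        System.out.println(enriched); // [d1:metadata, d2:metadata, d3:metadata]
        pool.shutdown();
    }
}
```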
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are an experienced Full-Stack Developer with 5+ years of experience in building scalable web applications using Python (FastAPI), React.js, and cloud-native technologies. In this role, you will be responsible for developing a low-code/no-code AI agent platform, implementing an intuitive workflow UI, and integrating with LLMs, enterprise connectors, and role-based access controls.

Your responsibilities will include backend development, where you will develop and optimize APIs using FastAPI, integrating with LangChain, vector databases (Pinecone/Weaviate), and enterprise connectors (Airbyte/NiFi). Additionally, you will work on frontend development to build an interactive drag-and-drop workflow UI using React.js (React Flow, D3.js, TailwindCSS). You will also be involved in implementing OAuth2, Keycloak, and role-based access controls (RBAC) for multi-tenant environments.

Database design is a crucial part of this role, where you will work with PostgreSQL (structured data), MongoDB (unstructured data), and Neo4j (knowledge graphs). DevOps and deployment tasks will involve deploying using Docker, Kubernetes, and Terraform across multi-cloud environments (Azure, AWS, GCP) to ensure smooth operations. Performance optimization is another key area, where you will focus on improving API performance and optimizing frontend responsiveness for a seamless user experience. Collaboration with AI and Data Engineers is essential, as you will work closely with the Data Engineering team to ensure smooth AI model integration.

To be successful in this role, you are required to have 5+ years of experience in FastAPI, React.js, and cloud-native applications. Strong knowledge of REST APIs, GraphQL, and WebSockets is essential, along with experience in JWT authentication, OAuth2, and multi-tenant security. Additionally, proficiency in PostgreSQL, MongoDB, Neo4j, and Redis is expected. Knowledge of workflow automation tools (n8n, Node-RED, Temporal.io) and familiarity with containerization (Docker, Kubernetes) and CI/CD pipelines are also required. Bonus skills include experience with Apache Kafka, WebSockets, or AI-driven chatbots.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be responsible for developing and maintaining Node.js applications, utilizing your expertise in SQL and Kubernetes. Your role will involve deploying and managing applications on Kubernetes, working with Google Cloud Platform (GCP) services, and building and maintaining RESTful APIs or backend services using Node.js. You should also be familiar with Apache Kafka for message production and consumption, as well as PostgreSQL or similar relational databases for writing queries and basic schema design. Proficiency in Git and GitHub workflows for version control, and experience using Visual Studio Code (VSCode) or similar IDEs, are essential.

In addition to technical skills, clear communication in English, the ability to work effectively in distributed teams, and a strong problem-solving mindset are crucial for this role. You should be willing to learn new technologies and adapt to changing requirements. Experience with CI/CD pipelines, Agile methodologies, and monitoring/logging tools like Prometheus, Grafana, and the ELK stack is preferred but not mandatory.

This is a full-time position with a shift from 2 PM to 11 PM, located in Hyderabad, Chennai, Bengaluru, or Pune. The annual budget for this role is 18 LPA, including variable components. Health insurance is among the benefits offered, and the work schedule follows a UK shift pattern.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Full Stack Developer located in Pune, you will be responsible for designing, developing, and maintaining scalable Java-based backend services using Spring Boot and a microservices architecture. You will also build rich and dynamic front-end applications with React.js, integrating them with RESTful APIs. Leveraging Apache Kafka to build event-driven, real-time microservices and ensure asynchronous communication between services will be a crucial part of your role.

Your responsibilities will also include participating in the design and implementation of secure, scalable, and cloud-native solutions on platforms like AWS or similar. You must apply best practices for performance, reliability, scalability, and security throughout the software development lifecycle. Working with DevOps tools and containerization technologies such as Docker and Kubernetes to streamline deployment and CI/CD processes will be essential.

Collaboration with cross-functional teams to gather and evaluate user requirements and translate them into technical specifications is a key aspect of the role. You will be expected to write efficient, clean, and testable code, perform thorough code reviews, and document application components. You will also support the development of training materials for QA and end users, and ensure seamless integration of frontend and backend systems using JSON-based REST APIs.

To excel in this role, you must possess strong communication skills, both verbal and written, along with relationship-building, collaborative, and organizational skills. Working effectively as a member of a matrix-based, diverse, and geographically distributed project team is crucial. Demonstrating ethics and values to foster high team trust is highly valued in this position.

In return, we offer you the opportunity to drive impactful projects in a dynamic environment, continuous learning and career advancement opportunities, acknowledgment for innovative contributions, and the chance to lead initiatives with global impact. Our benefits package includes flexible schedules prioritizing well-being, relocation support, global opportunities for seamless transitions and international exposure, performance-based bonuses, annual rewards, and comprehensive well-being benefits such as Provident Fund and health insurance. Come join our team and make a difference with your expertise and skills.
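A minimal sketch of an event-driven Spring Boot service of the kind this role describes, assuming the Spring for Apache Kafka (spring-kafka) dependency and a configured KafkaTemplate; the topic names and payloads are hypothetical:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Consume one topic, do some work, and publish a follow-up event asynchronously.
@Service
public class OrderEventsService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderEventsService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @KafkaListener(topics = "orders.created", groupId = "billing-service")
    public void onOrderCreated(String payload) {
        // Transform/handle the incoming JSON event, then emit a new event for downstream services.
        String invoiceEvent = "{\"invoiceFor\":" + payload + "}";
        kafkaTemplate.send("invoices.requested", invoiceEvent);
    }
}
```

The pattern keeps services decoupled: the producer of "orders.created" does not need to know which services react to it.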
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a DevOps engineer at C1X AdTech Private Limited, a global technology company, your primary responsibility will be to manage the infrastructure, support development pipelines, and ensure system reliability. You will play a crucial role in automating deployment processes, maintaining server environments, monitoring system performance, and supporting engineering operations throughout the development lifecycle. Our objective is to design and manage scalable, cloud-native infrastructure using GCP services, Kubernetes, and Argo CD for high-availability applications. Additionally, you will implement and monitor observability tools such as Elasticsearch, Logstash, and Kibana to ensure full system visibility and support performance tuning. Enabling real-time data streaming and processing pipelines using Apache Kafka and GCP DataProc will be a key aspect of your role. You will also be responsible for automating CI/CD pipelines using GitHub Actions and Argo CD to facilitate faster, secure, and auditable releases across development and production environments.

Your responsibilities will include:
- Building, managing, and monitoring Kubernetes clusters and containerized workloads using GKE and Argo CD
- Designing and maintaining CI/CD pipelines using GitHub Actions integrated with GitOps practices
- Configuring and maintaining real-time data pipelines using Apache Kafka and GCP DataProc
- Managing logging and observability infrastructure using Elasticsearch, Logstash, and Kibana (ELK stack)
- Setting up and securing GCP services including Artifact Registry, Compute Engine, Cloud Storage, VPC, and IAM
- Implementing caching and session stores using Redis for performance optimization
- Monitoring system health, availability, and performance with tools like Prometheus, Grafana, and ELK
- Collaborating with development and QA teams to streamline deployment processes and ensure environment stability
- Automating infrastructure provisioning and configuration using Bash, Python, or Terraform
- Maintaining backup, failover, and recovery strategies for production environments

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Engineering, or a related technical field and have 4-8 years of experience in DevOps, Cloud Infrastructure, or Site Reliability Engineering. Also essential for this role are:
- Strong experience with Google Cloud Platform (GCP) services including GKE, IAM, VPC, Artifact Registry, and DataProc
- Hands-on experience with Kubernetes, Argo CD, and GitHub Actions for CI/CD workflows
- Proficiency with Apache Kafka for real-time data streaming
- Experience managing the ELK Stack (Elasticsearch, Logstash, Kibana) in production
- Working knowledge of Redis for distributed caching and session management
- Scripting/automation skills using Bash, Python, Terraform, etc.
- A solid understanding of containerization, infrastructure-as-code, and system monitoring
- Familiarity with cloud security, IAM policies, and audit/compliance best practices
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Performance Tester with 6-9 years of experience, you will play a crucial role in ensuring the responsiveness, scalability, and stability of applications under various load conditions. Your expertise in performance testing tools and methodologies, combined with a strong understanding of Java, Apache Kafka, and Microsoft technologies, will be instrumental in analyzing and improving application performance.

Your key responsibilities will include designing, developing, and executing performance test plans, scripts, and scenarios. You will conduct load, stress, scalability, and endurance tests using industry-standard tools, analyze the results, and collaborate with development teams to resolve bottlenecks. Your role will also involve collaborating with cross-functional teams to understand system architecture and performance requirements, monitoring system performance during test execution, and contributing to continuous improvement of the performance testing strategy.

To excel in this role, you must possess strong hands-on experience with performance testing tools like JMeter, LoadRunner, or Gatling. Proficiency in Java for scripting and backend logic analysis is essential, along with experience in Apache Kafka, including knowledge of producers, consumers, and message flow. Familiarity with Microsoft technologies such as .NET applications, Azure environments, or SQL Server will be advantageous. A good understanding of software development life cycles (Agile/Scrum) and CI/CD pipelines, along with the ability to identify performance bottlenecks across different system layers, will be key to your success in this role.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Technical Security Architect specializing in Confluent Cloud and AWS Data Platform, you will be responsible for supporting security design, governance, and assurance activities for a project hosted on Confluent Cloud and AWS. Your role will involve collaborating with platform engineers, data architects, and application teams to ensure security is integrated into the solution design and deployment process. Additionally, you will lead threat modeling and risk assessments for key components and integrations.

Your key responsibilities will include defining and documenting secure architecture patterns and guardrails for Confluent Cloud, Kafka pipelines, and associated AWS infrastructure. You will review and write technical design and security documentation, ensuring compliance with internal security policies, industry best practices, and regulatory requirements. Furthermore, you will map security control requirements to organizational policies and contribute to the development of platform-specific security controls and operational playbooks.

In terms of skills and experience, you should have at least 10 years of overall experience in cloud technologies, with a minimum of 5 years in a security architect role. You must have proven experience as a Security Architect on cloud-native platforms, preferably AWS, and familiarity with AWS security services such as IAM, KMS, VPC, GuardDuty, and Security Hub. A strong understanding of security principles across identity, network, data, and application layers is essential, along with knowledge of Confluent Cloud, Apache Kafka, and secure data streaming practices. Experience in writing and reviewing technical documentation clearly and concisely is required, as is experience working with cross-functional DevOps, data, and infrastructure teams. Desirable skills include experience in Agile project environments and tools like Jira and Confluence, as well as previous involvement in enterprise data platform or streaming data projects.

NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100. Committed to helping clients innovate, optimize, and transform for long-term success, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, and application, infrastructure, and connectivity management. As one of the leading providers of digital and AI infrastructure, NTT DATA is part of the NTT Group, investing significantly in R&D to support organizations and society in confidently moving into the digital future.
Posted 1 month ago
10.0 - 12.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Budget: As per industry standards. Please share Current CTC, Expected CTC, and Notice Period when applying.

We are seeking an experienced Java Backend Developer / Technical Lead with strong hands-on expertise in designing and implementing scalable backend services, RESTful APIs, and microservices using the Spring ecosystem.

What You Will Do:
- Design and develop scalable RESTful web services.
- Lead development using Spring Boot, the Spring Framework, and microservices architecture.
- Work with databases such as Oracle, PostgreSQL, SQL Server, or MySQL.
- Integrate real-time messaging solutions like Apache Kafka.
- Collaborate with DevOps for CI/CD using tools like Jenkins, GitLab, etc.
- Leverage cloud platforms (AWS, Azure) for scalable deployments.
- Follow Agile practices and ensure quality-first development and clean architecture.
- Architect, design, and implement complex backend systems and mentor junior engineers.

Primary Skills Required:
- Java/J2EE, Spring Boot, Spring AOP/DI
- Kafka or similar messaging frameworks
- REST API development
- Oracle / SQL Server / PostgreSQL / MySQL
- CI/CD tools (Jenkins, GitLab)
- Cloud experience: AWS or Azure
- Strong understanding of Agile methodologies
- Proven experience leading or mentoring software teams

Qualifications:
- B.Tech in Computer Science or equivalent
- Minimum 10+ years of relevant backend development experience
- Excellent communication, leadership, and technical design skills

Apply Now: if interested, please share your resume along with your Current CTC, Expected CTC, Notice Period, and Current Location / Willingness to Relocate to Hyderabad.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
You will be working with MCX, where your career growth path will be sharpened to help you excel. At MCX, success is attributed to the employees' domain expertise and commitment. The recruitment process at MCX focuses on finding the right fit between roles and profiles, offering exciting and challenging career opportunities for ambitious and result-oriented professionals. It provides a platform to realize your potential in your chosen area of expertise.

As a Manager - Java Full Stack Developer, based in Mumbai, you are required to have a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 8 to 10 years of overall experience in software development, including at least 6 years as a full stack developer. You should have proficiency in React for frontend development, strong experience in backend development using Java Spring Boot, expertise in designing and consuming RESTful APIs, hands-on experience with Apache Kafka, and knowledge of relational and non-relational databases. Additionally, you should possess strong problem-solving skills, attention to detail, and domain knowledge in trading exchanges, financial markets, and surveillance systems.

Your responsibilities will include full stack development: building responsive and interactive frontend interfaces using React, designing and implementing secure, scalable, and efficient backend services using Java Spring Boot, developing RESTful APIs for seamless frontend-backend communication, integrating APIs with third-party services and trading exchange systems, managing real-time data streaming solutions using Apache Kafka, and incorporating domain knowledge to enhance features like fraud detection, trading behavior monitoring, and risk management. You will collaborate with cross-functional teams, provide technical guidance to team members, monitor system performance, resolve application issues, optimize performance for both frontend and backend components, implement secure coding practices, and ensure compliance with financial industry standards and regulatory guidelines.

If you need further assistance, you can contact MCX at 022-67318888 / 66494000 or email careers@mcxindia.com.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
We are searching for a talented Java Kafka Developer to join our dynamic engineering team. As a Senior Java Kafka Developer, your responsibilities will include designing, developing, and maintaining real-time data streaming applications using Apache Kafka and Java technologies. Your role will be vital in constructing scalable and fault-tolerant event-driven systems that cater to our business requirements. This is a full-time employment opportunity based in Hyderabad.

In this role, you will be expected to design, develop, and maintain robust Java applications utilizing the Spring Framework. You will also be responsible for developing and managing event-driven systems with Apache Kafka, as well as implementing enterprise integration solutions using Apache Camel. Collaborating with cross-functional teams to gather and analyze requirements, optimizing performance, conducting code reviews, writing unit/integration tests, and mentoring junior developers are also key aspects of this role.

To be successful in this position, you should possess 6 to 10 years of hands-on experience in Java development. Strong expertise in the Spring Framework (Spring Boot, Spring MVC, Spring Cloud), solid experience with Apache Kafka, hands-on experience with Apache Camel, and an understanding of RESTful APIs, microservices architecture, messaging systems, build tools, version control, CI/CD pipelines, and database technologies are essential. Additionally, strong problem-solving skills, the ability to work in a fast-paced environment, and excellent communication and interpersonal skills are required.

In return, we offer a competitive salary and benefits package, a culture focused on talent development with quarterly promotion cycles, opportunities to work with cutting-edge technologies, employee engagement initiatives, annual health check-ups, and various insurance coverages. We are committed to fostering diversity and inclusion in the workplace, providing hybrid work options, flexible hours, and ensuring accessible facilities for employees with disabilities.

At Persistent, we strive to create a values-driven and people-centric work environment where employees can accelerate growth, impact the world positively, enjoy collaborative innovation, and unlock global opportunities. If you are ready to unleash your full potential, join us at Persistent, an Equal Opportunity Employer that values diversity and prohibits discrimination and harassment.
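As a generic illustration of the Apache Camel integration work mentioned above, a small route that bridges files into Kafka; it assumes the camel-file and camel-kafka components are on the classpath, and the directory, topic, and broker values are hypothetical:

```java
import org.apache.camel.builder.RouteBuilder;

// Minimal Camel route sketch: poll a directory, convert each file to text, publish to Kafka.
public class FileToKafkaRoute extends RouteBuilder {

    @Override
    public void configure() {
        from("file:/data/inbox?noop=true")                      // consume files without deleting them
            .routeId("file-to-kafka")
            .convertBodyTo(String.class)
            .log("Publishing ${file:name} to Kafka")
            .to("kafka:orders.incoming?brokers=localhost:9092"); // hypothetical topic and broker
    }
}
```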
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As an experienced Integration Architect, your primary responsibility will be to define and implement integration architecture and strategies that align with the product ecosystem. You will lead technical discussions with customer teams to understand their business, functional, and non-functional requirements, and then translate them into integration designs. Your expertise will be crucial in architecting, designing, and developing system integrations using API-first, event-driven, and publish-subscribe models.

In this role, you will provide technical guidance on best practices for API management, security, scalability, and data governance. Working closely with cross-functional teams, such as product engineering, professional services, and customer success, you will ensure smooth integration implementations. It will be your responsibility to implement data validation, transformation, and normalization processes to maintain data integrity and comply with the data model.

Your role will also involve troubleshooting, debugging, and optimizing integration performance to ensure scalability and resilience. You will stay updated with emerging technologies in cloud integration, API management, and middleware solutions, integrating them into the existing framework. Additionally, you will develop and maintain integration documentation, best practices, and coding standards.

To excel in this role, you should have over 10 years of experience in integration architecture, middleware solutions, and API management. A strong understanding of cloud architecture, hybrid integration, and microservices-based integrations is essential. Hands-on experience with API design, RESTful services, SOAP, webhooks, and event-driven architectures is required. Expertise in data formats like JSON, XML, and EDI, along with integration security protocols, is crucial. Experience with integration frameworks such as MuleSoft, Boomi, Apache Kafka, or equivalent is preferred. Knowledge of integrating with enterprise applications like SAP, Oracle NetSuite, Salesforce, Oracle Utilities, and other industry-specific platforms will be beneficial.

You should possess strong problem-solving skills and the ability to collaborate effectively with internal teams and customers to overcome technical challenges. The role requires you to manage multiple projects with competing priorities and deliver within deadlines. Excellent communication skills are necessary to engage effectively with both technical and non-technical stakeholders. Your contribution to ensuring seamless connectivity and data flow across the ecosystem will enhance the overall customer experience and platform efficiency.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
NTT DATA is looking for a Technical Security Architect with expertise in Confluent Cloud and AWS Data Platform to join their team in Bangalore, Karnataka, India. As a Technical Security Architect, you will be responsible for supporting security design, governance, and assurance activities for a Confluent Cloud and AWS-hosted data platform project. Your role will involve defining secure architecture patterns, collaborating with various teams, and ensuring security is integrated into solution design and deployment.

Key Responsibilities:
- Define and document secure architecture patterns and guardrails for Confluent Cloud, Kafka pipelines, and associated AWS infrastructure.
- Collaborate with platform engineers, data architects, and application teams to embed security in solution design.
- Lead threat modeling and risk assessments for key components and integrations.
- Review and write technical design and security documentation, ensuring adherence to internal security policies and compliance requirements.
- Map security control requirements to organizational policies and regulatory frameworks.
- Support project management by acting as a key security stakeholder, tracking security deliverables, and communicating effectively with project managers and stakeholders.

Skills & Experience:

Mandatory:
- 10+ years of experience in cloud technologies with at least 5 years in a security architect role.
- Proven experience as a Security Architect on cloud-native platforms, preferably AWS.
- Familiarity with AWS security services such as IAM, KMS, VPC, GuardDuty, and Security Hub.
- Strong understanding of security principles across identity, network, data, and application layers.
- Knowledge of Confluent Cloud, Apache Kafka, and secure data streaming practices.
- Ability to write and review technical documentation clearly and concisely.
- Experience working with cross-functional DevOps, data, and infrastructure teams.

Desirable:
- Experience in Agile project environments and tools like Jira and Confluence.
- Previous involvement in enterprise data platform or streaming data projects.

Join NTT DATA, a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team of experts in over 50 countries, NTT DATA offers services in business and technology consulting, data and artificial intelligence, industry solutions, and application development and management. Be part of a leading provider of digital and AI infrastructure and contribute to a sustainable digital future. Visit us at us.nttdata.com.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Python Back End Developer, you will be responsible for various day-to-day tasks related to back-end web development, software development, and object-oriented programming (OOP). This contract role offers a hybrid work model, with the primary location being Hyderabad and some flexibility for work from home.

You should possess proficiency in back-end web development and software development, along with a strong understanding of object-oriented programming (OOP). Basic skills in front-end development, solid programming skills, and experience with cloud platforms such as GCP or AWS are essential for this role. Additionally, excellent problem-solving and analytical skills are required, enabling you to work effectively both independently and as part of a team. Experience with Python frameworks like Django or Flask would be advantageous. It would also be beneficial if you have worked on data pipelines using tools like Airflow or Netflix Conductor, and have experience with Apache Spark/Beam and Kafka.

This role offers an exciting opportunity for a skilled Python Back End Developer to contribute to a dynamic team and work on challenging projects in a collaborative environment.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You will be responsible for developing and maintaining high-performance server-side applications in Python following SOLID design principles. You will design, build, and optimize low-latency, scalable applications and integrate user-facing elements with server-side logic via RESTful APIs. Maintaining ETL and data pipelines, implementing secure data handling protocols, and managing authentication and authorization across systems will be crucial aspects of your role. Additionally, you will ensure security measures, set up efficient deployment practices using Docker and Kubernetes, and leverage caching solutions for enhanced performance and scalability.

To excel in this role, you should have strong experience in Python and proficiency in at least one Python web framework such as FastAPI or Flask. Familiarity with ORM libraries, asynchronous programming, event-driven architecture, and messaging tools like Apache Kafka or RabbitMQ is required. Experience with NoSQL and vector databases, Docker, Kubernetes, and caching tools like Redis will be beneficial. Additionally, you should possess strong unit testing and debugging skills and the ability to use monitoring and logging frameworks effectively.

You should have a minimum of 1.5 years of professional experience in backend development roles with Python. Your expertise in setting up efficient deployment practices, handling data securely, and optimizing application performance will be essential for success in this position.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As an experienced Java Developer with over 5 years of expertise, you have a strong background in building scalable, distributed, and high-performance microservices utilizing Spring Boot and Apache Kafka. Your proficiency lies in designing and developing event-driven architectures, RESTful APIs, and integrating real-time data pipelines. You are well-versed in the full software development life cycle (SDLC), CI/CD practices, and Agile methodologies.

Your key skills include Java (8/11/17), Spring Boot, Spring Cloud, Apache Kafka (Producer, Consumer, Streams, Kafka Connect), microservices architecture, RESTful web services, Docker, Kubernetes (basic knowledge), CI/CD (Jenkins, Git, Maven), relational and NoSQL databases (MySQL, PostgreSQL, MongoDB), monitoring (ELK Stack, Prometheus, Grafana - basic), Agile/Scrum methodology, and unit and integration testing (JUnit, Mockito).

In your professional journey, you have developed and maintained multiple Kafka-based microservices handling real-time data ingestion and processing for high-volume applications. Your expertise extends to implementing Kafka consumers/producers with error handling, retries, and idempotency for robust message processing. Additionally, you have designed and deployed Spring Boot microservices integrated with Kafka, PostgreSQL, Redis, and external APIs, showcasing your leadership in performance tuning and optimization to ensure low-latency and fault-tolerant behavior.

If you are passionate about leveraging your skills in Java, Spring Boot, Apache Kafka, and microservices architecture to drive impactful projects and contribute to cutting-edge technologies, this opportunity might be the perfect match for you.

Thank you for considering this role.

Best regards,
Renuka Thakur
renuka.thakur@eminds.ai
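The "error handling, retries, and idempotency" pattern mentioned above can be sketched generically with the plain Kafka consumer API: disable auto-commit, commit offsets only after processing, and skip keys already seen. The topic, group id, broker address, and in-memory duplicate tracking below are illustrative assumptions; a production system would persist processed keys in a durable store.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PaymentEventsConsumer {

    private static final Set<String> processedKeys = ConcurrentHashMap.newKeySet();

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "payments-service");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit manually after processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payments.events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Skip duplicate deliveries (e.g. redelivery after a retry); assumes keyed messages.
                    if (record.key() != null && !processedKeys.add(record.key())) {
                        continue;
                    }
                    process(record.value());
                }
                consumer.commitSync(); // offsets advance only once the batch has been handled
            }
        }
    }

    private static void process(String payload) {
        System.out.println("handled " + payload);
    }
}
```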
Posted 1 month ago
11.0 - 15.0 years
0 Lacs
haryana
On-site
You are an experienced software engineer with over 11 years of total experience. You have a strong background in architecture and development using Java 8 or higher. Your expertise includes working with Spring Boot, Spring Cloud, and related frameworks. You possess a deep understanding of object-oriented programming and various design patterns. You have hands-on experience in building and maintaining microservices architecture in cloud or hybrid environments.

Your skills include working with REST APIs, caching systems like Redis, multithreading, and cloud development. You are proficient in Apache Kafka, including Kafka Streams, Kafka Connect, and Kafka clients in Java. Additionally, you have worked with SQL and NoSQL databases such as MySQL, PostgreSQL, and MongoDB. You are familiar with DevOps tools and technologies like Ansible, Docker, Kubernetes, Puppet, Jenkins, and Chef. You have proven expertise in CI/CD pipelines using Azure DevOps, Jenkins, or GitLab CI/CD. Your experience includes using build automation tools like Maven, Ant, and Gradle. You have hands-on experience with cloud technologies such as AWS, Azure, or GCP, and have knowledge of Snowflake or equivalent cloud data platforms. Understanding predictive analytics and basic ML/NLP workflows is part of your skill set. You have a strong grasp of UML and design patterns. With excellent problem-solving skills and a passion for continuous improvement, you can communicate effectively and collaborate with cross-functional teams.

Your responsibilities include writing and reviewing high-quality code, analyzing clients' needs, and envisioning solutions for functional and non-functional requirements. You will implement design methodologies, coordinate application development activities, and lead/support UAT and production rollouts. Additionally, you will estimate effort for tasks, address issues promptly, and provide constructive feedback to team members. You will troubleshoot and resolve complex bugs, propose solutions during code/design reviews, and conduct POCs to validate design/technologies.

You hold a bachelor's or master's degree in Computer Science, Information Technology, or a related field.
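As a generic illustration of the Kafka Streams experience listed above, a minimal topology that filters one topic into another; the application id, topic names, broker address, and filter condition are hypothetical:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Read an input topic, keep only records of interest, and write them to an output topic.
public class HighValueOrdersStream {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "high-value-orders");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> value.contains("\"amount\":")) // crude content check for the sketch
              .to("orders.flagged");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```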
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
indore, madhya pradesh
On-site
Are you passionate about building scalable data pipelines and working with real-time streaming platforms? Join our growing team as a Data Engineer and help power next-gen data solutions!

As a Data Engineer, you will be responsible for designing and maintaining real-time data pipelines using Apache Kafka. You will write efficient and optimized SQL queries for data extraction and transformation. Additionally, you will build robust ETL/ELT processes for structured and unstructured data and collaborate with analysts, data scientists, and developers to deliver insights. Ensuring data quality, security, and performance optimization will be a crucial part of your role. Integration with tools like Spark, Airflow, or Snowflake (as applicable) will also be part of your responsibilities.

We value proficiency in Apache Kafka, Kafka Streams or Kafka Connect, strong skills in SQL, Python/Scala, and cloud platforms (AWS/GCP/Azure), experience with data lakes, message queues, and large-scale systems, as well as a problem-solving mindset and a passion for clean, efficient code.

Working with us will involve exciting projects with global clients, a collaborative and innovation-driven environment, flexible working options, and competitive compensation. If you are excited about this opportunity, apply now at yukta.sengar@in.spundan.com or tag someone perfect for this role!
Posted 1 month ago
7.0 - 12.0 years
10 - 20 Lacs
Bengaluru
Work from Office
8+ years of experience in database technologies: AWS Aurora (PostgreSQL), NoSQL, DynamoDB, MongoDB, and Erwin data modeling. Experience with pg_stat_statements and query execution plans. Experience with Apache Kafka, AWS Kinesis, Airflow, and Talend. AWS experience with CloudWatch, Prometheus, and Grafana.
Required candidate profile: experience with GDPR, SOC2, Role-Based Access Control (RBAC), and encryption standards; experience with AWS Multi-AZ, read replicas, failover strategies, and backup automation; experience with Erwin, Lucidchart, Confluence, and JIRA.
Posted 1 month ago
5.0 - 6.0 years
5 - 6 Lacs
Pune
Work from Office
• Design, develop, and maintain scalable applications using .NET Core and C#.
• Work with Entity Framework Core and LINQ.
• Develop and optimize complex SQL queries.
• Integrate and manage Apache Kafka.
• Develop and maintain Windows Services.
Posted 1 month ago
2.0 - 5.0 years
4 - 8 Lacs
Kolkata
Work from Office
Responsibilities:
- Hands-on development in Golang to deliver trustworthy and smooth functionalities to our users
- Monitor, debug, and fix issues in production at high velocity based on user impact
- Maintain good code coverage for all new development, with well-written and testable code
- Write and maintain clean documentation for software services
- Integrate software components into a fully functional software system
- Comply with project plans with a sharp focus on delivery timelines

Requirements:
- Bachelor's degree in computer science, information technology, or a similar field
- Must have 3+ years of experience in developing highly scalable, performant web applications
- Strong problem-solving skills and experience in application debugging
- Hands-on experience in RESTful services development using Golang
- Hands-on working experience with databases: SQL (PostgreSQL / MySQL)
- Working experience with message streaming/queuing systems like Apache Kafka and RabbitMQ
- Cloud experience with Amazon Web Services (AWS)
- Experience with serverless architectures (AWS) would be a plus
- Hands-on experience with an API / Echo framework
Posted 1 month ago
2.0 - 5.0 years
4 - 8 Lacs
Nashik
Work from Office
Responsibilities:
- Hands-on development in Golang to deliver trustworthy and smooth functionalities to our users
- Monitor, debug, and fix issues in production at high velocity based on user impact
- Maintain good code coverage for all new development, with well-written and testable code
- Write and maintain clean documentation for software services
- Integrate software components into a fully functional software system
- Comply with project plans with a sharp focus on delivery timelines

Requirements:
- Bachelor's degree in computer science, information technology, or a similar field
- Must have 3+ years of experience in developing highly scalable, performant web applications
- Strong problem-solving skills and experience in application debugging
- Hands-on experience in RESTful services development using Golang
- Hands-on working experience with databases: SQL (PostgreSQL / MySQL)
- Working experience with message streaming/queuing systems like Apache Kafka and RabbitMQ
- Cloud experience with Amazon Web Services (AWS)
- Experience with serverless architectures (AWS) would be a plus
- Hands-on experience with an API / Echo framework
Posted 1 month ago