Lead Analyst/Developer – One Identity Manager (OIM)

We're seeking a highly skilled Lead Analyst/Developer with deep development experience in One Identity Manager (OIM) to join our global ISD team. The ideal candidate will have hands-on development expertise (not just support) in OIM, strong database scripting knowledge, and a firm grasp of Identity and Access Management (IAM) concepts.

📍 Locations: Pune | Bhopal | Bengaluru | UK | Poland

🛠 Key Responsibilities:
- Lead development, analysis, and L3 support for One Identity Manager (OIM)
- Manage a pipeline of enhancements and changes through the full software development lifecycle
- Perform requirements gathering, development, regression testing, deployment, and documentation
- Mentor L2 resources and provide handovers
- Collaborate with stakeholders across infrastructure and application teams
- Design and develop OIM-related integrations, reports, and workflows
- Manage environments and coordinate UAT and production releases

✅ Must-Have Skills:
- 5+ years of strong development experience in One Identity Manager (OIM)
- SQL, MS SQL Server, VB.NET, PowerShell
- Solid understanding of IAM concepts – JML, RBAC, PAM, recertification
- Experience with complex database systems and data architecture
- Strong troubleshooting, documentation, and application testing skills
- Familiarity with Jira, Confluence, and service desk platforms

➕ Desirable Skills:
- Experience with Ikasan, C#, HTML, Angular, C++
- Understanding of Active Directory, LDAP, and web services
- Experience in investment banking or financial services
- Exposure to SAP, Oracle ERP, or middleware solutions
- Familiarity with software configuration management

🎓 Qualifications:
- Degree in Computer Science, Engineering, Mathematics, or a related field
Experience Range: 4–14 years
Location: Hyderabad (hybrid work mode)
Interview Mode: Face-to-face (11th October)

About the Role
We are looking for a highly skilled Backend Developer with strong expertise in Java and modern backend technologies. The ideal candidate will be responsible for designing, developing, and deploying scalable backend solutions while working on cloud-native architectures and microservices.

Key Responsibilities
- Design, develop, and maintain high-performance backend services using Java and Spring Boot.
- Build and manage microservices-based architectures, ensuring scalability and reliability.
- Deploy and manage applications using Docker and Kubernetes.
- Work with cloud platforms (AWS / GCP / Azure) for application development and deployment.
- Collaborate with cross-functional teams (Frontend, DevOps, QA) to deliver end-to-end solutions.
- Write clean, efficient, and testable code while following best practices.
- Troubleshoot and optimize system performance and scalability.

Required Skills
- Strong programming experience in Java (Core and Advanced).
- Hands-on expertise with the Spring Boot framework.
- Proven experience with microservices architecture.
- Containerization experience with Docker and orchestration with Kubernetes.
- Exposure to any major cloud platform (AWS / Azure / GCP).
- Strong problem-solving and debugging skills.
- Good understanding of software development best practices, CI/CD, and Agile methodologies.

Good to Have
- Knowledge of REST APIs, Kafka, or other messaging systems.
- Familiarity with database technologies (SQL / NoSQL).
- Experience with monitoring and logging tools.
Role: SDET Engineer
Location: Hyderabad
Mandatory Skills: Automation Testing, Playwright, Testing Dynamics 365 CRM Applications, API Testing

- 4+ years of experience in automation testing, Playwright, and API testing
- Demonstrable experience with Java as a programming language
- Experience working with BDD frameworks
- Experience with continuous integration and delivery (CI/CD, Jenkins, Git)
- Experience working on Agile projects
- Experience with SQL (PostgreSQL), Cassandra, and Elasticsearch
- Good communication skills

The candidate should have excellent soft skills and strong technical ability, with a passion for learning and a growth mindset.
We are seeking a highly skilled Data Engineer to design, develop, and optimize scalable data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, SQL, and Python/Java, along with hands-on experience building robust data pipelines to support advanced analytics and business intelligence.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
- Work extensively with GCP services, including BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Composer.
- Optimize data models and queries for performance and cost efficiency in BigQuery.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, governance, and security across all data pipelines and storage systems.
- Troubleshoot and resolve data pipeline issues in real time.
- Contribute to architecture design discussions and share data engineering best practices.

Required Skills
- Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.
- Proven expertise in building and maintaining data pipelines.
- Strong SQL skills for query optimization and large-scale data manipulation.
- Proficiency in Python or Java for developing scalable ETL/ELT solutions.
- Good understanding of data modeling, partitioning, and performance tuning.
- Experience with workflow orchestration tools (e.g., Airflow / Cloud Composer) is a plus.
- Familiarity with CI/CD, version control (Git), and agile methodologies.

Good to Have
- Exposure to streaming data pipelines (Pub/Sub, Kafka).
- Knowledge of machine learning pipeline integration.
- Experience with other cloud platforms (AWS / Azure).

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work independently and as part of a team in a fast-paced environment.