The Team: The team is responsible for building a carbon trading platform using emerging tools and technologies. The team works in a stimulating environment that offers ample opportunity to apply creative ideas to complex analytical problems. Every single day you will work with people from a wide variety of backgrounds and develop a close team dynamic with coworkers from around the globe.

The Impact: You will make meaningful contributions to solutions spanning user interfaces, web services, APIs, and data processing. The work you do will enable platform users to trade carbon credits.

What's in it for you:
- Build a career with a global company
- Work on code that fuels the global carbon markets
- Grow and improve your skills by working on enterprise-level products and new technologies
- Attractive benefits package (medical services, special discounts for gyms, meal vouchers)
- Ongoing education (participation in conferences and training)
- Access to the most interesting information technologies
- Flexible working hours

Responsibilities:
- Architect, design, and develop solutions within a multi-functional Agile team to support key business needs
- Design and implement software components for different IT systems
- Perform analysis and articulate solutions
- Design underlying engineering for use in multiple product offerings supporting a large volume of end users
- Manage and improve existing solutions
- Solve a variety of complex problems and evaluate possible solutions, weighing the costs and benefits
- Engineer components and common services based on standard corporate development models, languages, and tools
- Apply software engineering best practices while leveraging automation across all elements of solution delivery
- Collaborate effectively with technical and non-technical stakeholders
- Document and demonstrate technical solutions through documentation, diagrams, code comments, etc.
What We're Looking For:

Basic Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Science, or equivalent
- 10 to 16 years of full-stack experience with Java, Spring Boot, AWS, API development, RESTful services, data modelling, persistence stores, and ORMs
- Hands-on experience with Java and related technologies
- Excellent communication and interpersonal skills
- Strong analytical skills and learning agility
- Hands-on coding experience, specifically with NodeJS and related technologies
- Ability to work in a collaborative environment
- Team leadership experience
- Knowledge and experience of deploying to cloud services, preferably AWS
- Strong expertise and knowledge in microservices
- Cloud experience in AWS or Azure

Optional Qualifications:
- Other JavaScript frameworks, such as Angular
- Proficiency with software development lifecycle (SDLC) methodologies such as Agile and test-driven development

Flexible Working (optional): We pride ourselves on our agility and diversity, and we welcome requests to work flexibly. For most roles, flexible hours and/or an element of remote working are usually possible. Please talk to us at interview about the type of arrangement that is best for you. We will always try to be adaptable wherever we can.

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return to Work initiative, we encourage enthusiastic and talented returners to apply, and we will actively support your return to the workplace.
Data Analyst

The Team: The Data Analyst I is responsible for the collection, update, and quality control of macroeconomic, financial, and industry data, as well as the maintenance of data ingestion tools.

The Impact: The Data Analyst I role contributes to the business by supporting our macroeconomic and financial product, delivering critical data and helping our customers make investment decisions. The data gathered and maintained underpins many S&P Global Market Intelligence businesses and is crucial for GIA forecasts and analysis.

What's in it for you:
- Development opportunities
- Great atmosphere and teamwork
- Possibility to learn foreign languages

Responsibilities:
- Update macroeconomic, financial, and industry data within the specified time frame
- Rebase data and recalculate history as needed, or work on data revisions in case of methodology changes
- Add data as requested by the manager
- Perform quality control of all supported data
- Collaborate with analysts to help with day-to-day questions and issues
- Provide second-level support to customers via the ticket system

What We're Looking For:
- Bachelor's degree in Economics (a Master's in Economics would be an advantage)
- 0-2 years' experience with economic and financial data
- Excellent computer and analytical skills
- Proficiency in Microsoft Office products
- European language specialization (e.g., French, Spanish, Russian, German, Italian) preferred

Preferred candidate profile: European language skills (both reading and writing)
The Team: The Assembly Operations team is responsible for managing, optimizing, and enhancing daily production operations within the Data Factory, focusing on the development and implementation of data processes for Mobility data products. It thrives on collaboration, encouraging open communication and teamwork, which fosters a dynamic environment where innovative solutions are valued. The team is distinguished by its commitment to data integrity and operational excellence, ensuring that client needs are met with precision and efficiency.

The Impact: This role is crucial for maintaining the efficiency of production processes, directly contributing to the quality of deliverables that serve our clients in the automotive sector. By managing large datasets and ensuring data accuracy, the position enhances client satisfaction and trust, ultimately impacting market performance and the company's reputation.

What's in it for you:
- Growth opportunities: Gain exposure to advanced data management techniques and tools, enhancing your technical skillset.
- Networking: Collaborate with policy makers and market leaders, expanding your professional network and industry knowledge.
- Global exposure: Work in a multinational environment, engaging with teams across different regions and cultures.
- Innovative environment: Be part of a team that values creativity and encourages innovative thinking to solve complex data challenges.

Responsibilities:
- Manage and support daily production operations, ensuring optimal performance of data processes.
- Develop and implement scripts and modifications based on business specifications.
- Conduct thorough data analysis to support business requirements and enhance operational efficiency.
- Collaborate with cross-functional teams, including ETL and DBA, for seamless code changes and deployments.
- Monitor production processes post-release to ensure accuracy and address any issues promptly.
- Create comprehensive documentation for processes, procedures, and project management.

What We're Looking For:

Key Qualifications:
- Bachelor's degree in Computer Science or a related field, with 2-3 years of relevant experience.
- Proficiency in SQL, PL/SQL, and Python.
- Hands-on experience with AWS-native services, including Airflow, S3, EMR Serverless, and Athena.
- Strong analytical skills, with experience managing large volumes of transactional data.
- Familiarity with Oracle databases and tools such as Toad or SQL Developer.

Key Soft Skills:
- Excellent communication skills, able to convey technical concepts to non-technical stakeholders.
- Strong team player with the ability to work independently and collaboratively.
- Capable of managing multiple tasks under tight deadlines while maintaining attention to detail.
- Proactive problem solver who can anticipate challenges and propose effective solutions.
Role Summary

As a Director, Quality Engineering, you will provide strategic, delivery, and people leadership for the Analysis pillar within Ratings Technology. You will be accountable for leading a large, high-performing QE organization while delivering complex, business-critical Capital Markets platforms in a regulated environment. You will bring deep Capital Markets domain expertise, modern engineering practices, and forward-looking AI capabilities to build and scale intelligent quality systems that go beyond defect detection to defect prevention, continuous feedback, and product excellence.

This role requires a hands-on leader who balances enterprise-level QE strategy with day-to-day delivery execution, stakeholder management, and direct people leadership. You will lead senior engineering and QE leaders, drive a data-driven quality culture, and ensure predictable, high-quality outcomes across a complex delivery portfolio.

Key Responsibilities

People Leadership & Culture
- Lead, mentor, and develop a distributed organization of senior QE and engineering professionals (50-75+), fostering engagement, accountability, and continuous improvement.
- Build a strong, inclusive culture focused on engineering excellence, operational efficiency, and quality ownership.
- Own talent strategy, including hiring, onboarding, performance management, succession planning, and leadership development.
- Navigate and resolve complex interpersonal and organizational challenges with empathy, professionalism, and clarity.
- Lead teams through change, ambiguity, and transformation while maintaining morale and delivery confidence.
- Leverage people, delivery, and engagement metrics to guide coaching, development plans, capacity planning, and conflict resolution.

Delivery & Capital Markets Execution
- Own end-to-end delivery for a complex, multi-program portfolio supporting Capital Markets and analytical platforms.
- Ensure consistent, predictable delivery across global, cross-functional teams in regulated financial services environments.
- Apply deep Capital Markets knowledge to drive risk-based quality strategies for data-intensive, market-critical systems.
- Balance speed, quality, regulatory compliance, and operational risk in day-to-day delivery decisions.

Stakeholder & Executive Management
- Act as a trusted partner to Product, Engineering, Data, Risk, and Compliance leadership.
- Translate quality, delivery, and operational risks into clear business impact for senior and executive stakeholders.
- Represent Quality Engineering in governance forums, audits, and regulatory interactions.

Quality Engineering Transformation
- Lead the evolution from QA to engineering-led QE, embedding shift-left, automation-first, and continuous testing practices.
- Establish enterprise QE standards, operating models, and tooling strategies.
- Champion TDD, BDD, DevSecOps, and CI/CD-integrated quality practices across teams.

AI-Enabled & Data-Centric Quality
- Drive adoption of AI-powered quality engineering, including agentic test generation, defect triage, and predictive quality analytics.
- Define responsible AI usage standards aligned with security, compliance, and auditability.
- Lead test data and data quality strategies, including synthetic data, masking, lineage, migration validation, and cataloging.
- Use quality and delivery data to continuously improve outcomes and organizational effectiveness.

Operational & Financial Accountability
- Own QE budget planning, vendor management, and investment prioritization.
- Measure and improve cost of quality, automation ROI, and delivery efficiency using objective metrics.

Qualifications
- 15+ years in Quality Engineering or engineering leadership within Financial Services / Capital Markets.
- Proven experience leading leaders and scaling large, complex delivery organizations.
- Demonstrated ability to manage challenging people situations, organizational change, and high-stakes delivery.
- Strong track record of data-driven decision-making across delivery, people management, and operations.
- Deep expertise in automation frameworks, CI/CD, test data management, and AI-enabled testing.
- Excellent executive communication and stakeholder management skills.

Why This Role
- Lead a high-impact organization at the intersection of Capital Markets, analytics, engineering, data, and AI.
- Shape both people outcomes and platform quality at enterprise scale.
- Define how modern Quality Engineering operates in a regulated, data-driven financial environment.
Role

We are seeking a senior technology leader (Senior Principal Engineer / Chief Architect & Technology Strategy) to partner closely with the Head of Technology for Enterprise Solutions in shaping the technology direction of a complex, regulated financial-services estate. This is a hands-on principal engineer and architect role, not a pure governance or advisory architecture function. The role exists to ensure technology strategy is grounded in engineering reality, deep domain understanding, and operational outcomes.

You will work directly with the Head of Technology to shape and refine strategy, stress-testing ideas against engineering, regulatory, and operational constraints. You will engage closely with senior engineering leaders, platform teams, and product partners to turn strategy into real designs that can be built and run. You will get involved early in complex or high-risk initiatives and step in when it matters most, including during major incidents and recovery efforts. A key part of the role is knowing when to push forward, when to simplify, and when to slow down or stop.

You will act as the engineering authority across Enterprise Solutions, leading primarily through influence rather than line management. The role carries broad responsibility for shaping architecture, design, and delivery decisions across multiple teams and domains. You will balance long-term strategy with near-term delivery constraints, operational resilience, and regulatory expectations, and represent the technology direction with clients and senior stakeholders, explaining design choices and trade-offs clearly and credibly. Overall, this role strengthens the technology function by making strategy buildable, operable, and defensible.
Core Technical & Domain Expertise
- Experienced in setting technology strategy and architecture for complex, regulated, data-driven platforms, with strong experience in banking and financial services (public and private markets, syndicated lending, securities processing, counterparty and client lifecycle, regulatory reporting, or similar domains).
- Experienced in defining and owning multi-year technology strategies and roadmaps, setting clear priorities and sequencing change based on business outcomes, regulatory needs, and engineering reality.
- Sets and governs reference architectures, design standards, and architectural guardrails, ensuring teams build in a consistent, scalable, and sustainable way.
- Strong understanding of how financial products, operational workflows, data, and controls work end to end, and uses that understanding to make sound system and platform design decisions.
- Has demonstrated strong judgment designing distributed, event-driven, and data-intensive systems, including how they scale, fail, recover, and isolate risk.
- Experienced with cloud platforms (AWS, Azure, GCP) and the trade-offs around managed services, multi-region and multi-tenant designs, data residency, cost, performance, and resilience.
- Knowledge of advanced and emerging architectures and design paradigms (e.g., agent-based systems, service abstractions, semantic data models, knowledge graphs), and where to apply them, and where not to.
- Experienced in ensuring AI-enabled systems are explainable, auditable, and operationally safe, with clear judgment on when AI is appropriate versus deterministic or rules-based approaches.
- Strong understanding of analytics and AI in production (e.g., data quality, anomaly detection, fraud/AML, workflow automation) and of what it takes to run these reliably in regulated environments.
- Designs with security and controls built in, with a strong understanding of identity, data protection, failure containment, and regulatory risk.
Experience & Mindset
- 20+ years building and operating systems in tier-1 banks, capital markets infrastructure, or similarly regulated, high-availability environments.
- Has owned or deeply influenced production systems under real load, is comfortable reading and writing code, and works effectively alongside senior engineers.
- Acts as a coach and multiplier for senior engineers and architects, raising the bar through hands-on problem solving, design reviews, and practical guidance rather than formal line management.
- Pragmatic, direct, and outcome-focused. Leads through earned credibility and influence, is comfortable shaping decisions without authority, and communicates complex technical topics clearly to business, client, and regulatory audiences.
- Measures success by outcomes and evidence, not intent or presentation: technology strategy that translates into systems that work in production, reduce late-stage surprises and repeat incidents, scale and recover as designed, and stand up to real operational and regulatory scrutiny.

Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction.
The Team: The team of Data Analysts works on various research reports and company documents to collect information and generate meaningful consensus from the collected data. This effort is coupled with real-time monitoring of global industry trade publications and websites/news aggregators. Different teams support different data groups, such as Fundamentals, Entity Management, People Data, Estimates & Market Data, Public Ownership, and many more.

The Impact: We provide the highest-quality content, which is essential for our clients to make decisions with conviction. As a Data Analyst, you will support the integrity and comprehensiveness of the dataset by utilizing internal and external public research sources, such as government and regulatory documents, stock exchanges, industry journals, analyst reports, and our internal research tools, to collect, summarize, and synthesize relevant information. With exciting learning opportunities at its core, we'll help you focus on building the essential skills needed for a successful and meaningful transition into the professional world. This position is an excellent stepping stone to understanding global market dynamics, allowing you to gain a comprehensive understanding of the market and learn the various facets of the assigned industry. Once you have developed a strong fundamental understanding of the dataset and proficiency in the workflows, this role will involve working with new talent to develop and enhance your skillset, as well as working on process improvement projects, including LEAN/automation projects.
Responsibilities:
- Collate, analyze, and extract high-quality data (financial and non-financial) and enter it into work tools per guideline specifications for the assigned vertical
- Understand the workings of the dataset, be aware of the workflows, and have strong working knowledge of the work tools
- Provide input and ideas for new collection methods and product enhancements related to the dataset
- Deliver on predefined individual and team targets, including delivering outcomes with quality and excellence
- Create technical expertise within the department
- Troubleshoot problems or issues and support the team in enhancing workflows and processes for the department
- Review feedback involving transaction content to help correct errors and establish or refine procedures and processes to improve accuracy

What We're Looking For:
- Fresh graduates with a commerce (BBA/BMS/BCom) background (passing year 2025), OR PGDM/MBA/M.Com specializing in Finance/Accounts (passing year 2025 or 2026)
- Excellent communication skills, both written and oral
- Strong understanding of corporate finance and accountancy, including financial statements and annual reports
- Well versed in secondary research sources
- Willingness to work in a 24x5 environment on rotational shifts (including night shifts)
- Certification or knowledge/experience in MS Office (Excel, Word, PowerPoint)
- Strong quantitative, analytical, and interpretive skills
- Ability to conduct efficient thematic online research
The Impact: This position directly interacts with the business team to understand their requirements and the value added for business customers. The role provides exposure to the business and functional model of the company. The applications being worked on are highly business-centric and revenue-generating.

What's in it for you: You will work within flat hierarchies in a young and dynamic team with flexible working hours. You will benefit from a breadth of career-enhancing opportunities. You will have very good opportunities to shape your own working environment, combined with very good compensation and benefits, and will experience the advantages of both a big enterprise and a small start-up at the same time. You will also be a key person in growing our team, and you should be motivated to introduce new, innovative processes and tools into an existing global enterprise structure.

Roles & Responsibilities
- Develop and maintain scalable web applications using modern front-end frameworks (React, Angular).
- Build robust backend services and APIs using Python and/or Java.
- Implement cloud-native solutions, preferably on AWS.
- Develop and maintain CI/CD pipelines for automated deployment to cloud environments.
- Collaborate with cross-functional teams to deliver high-quality software solutions.
- Utilize AI-powered coding tools (e.g., GitHub Copilot, GenAI tools) to enhance productivity and code quality.
- Ensure application security, performance, and reliability.
- Troubleshoot, debug, and optimize code across the stack.
- Work hands-on with relational databases (RDBMS) such as Oracle and PostgreSQL.

Requirements
- 6+ years of experience as a Full Stack Engineer or in a similar role.
- Expertise in front-end development using React and Angular.
- Strong backend development skills in Python and/or Java.
- Solid experience with AWS cloud services and architecture.
- Proven track record in building and maintaining CI/CD pipelines (DevOps practices).
- Experience using AI coding assistants (GitHub Copilot, GenAI tools, etc.).
- Hands-on experience with RDBMS (Oracle, PostgreSQL).
- Familiarity with containerization and microservices architecture.
- Excellent problem-solving and communication skills.

What We're Looking For:
- 6+ years of proven experience as a Python Developer, with expertise in the Flask and Django frameworks.
- Strong background in backend data engineering and data processing.
- Hands-on experience with AWS services, including ECS, Fargate, S3, SNS, and Load Balancers.
- Proficiency in containerization.
- DevOps experience with CI/CD pipelines.
- Experience working with RDBMS databases such as Oracle and PostgreSQL.
Position Summary

Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do

You will be part of the Mastermind software engineering team. As part of this agile team, you will work in our cloud-native environment to:
- Develop and maintain applications using C# and .NET Core for the backend.
- Collaborate with cross-functional teams to design, implement, and deliver high-quality software solutions.
- Write clean, scalable, and maintainable code following best practices and coding standards.
- Integrate APIs and services to ensure seamless communication between frontend and backend systems.
- Work with cloud platforms (preferably GCP) for deployment, monitoring, and scaling of applications.
- Participate in code reviews, unit testing, and debugging to ensure robust and reliable software.
- Contribute to architecture discussions and help improve system performance and security.
- Stay updated with emerging technologies and frameworks to continuously enhance product capabilities.

Who you are
- 1-3 years of experience building enterprise-level applications.
- Strong back-end developer with experience building high-performance back-end services and jobs with C# and .NET Core.
- Knowledge of RESTful APIs, microservices, design principles (SOLID), and design patterns.
- Has implemented both relational and non-relational databases, such as Microsoft SQL Server or Postgres, and MongoDB.
- Good to have: awareness of Test-Driven Development (TDD) and testing frameworks.
- Comfortable using Azure DevOps or similar CI/CD tools, and Git.
- Good to have: some frontend experience using frameworks like ReactJS, Angular, etc.
- Full project life-cycle experience and understanding of the software development lifecycle (SDLC).
- Strong communication skills.
- Problem-solving mindset with strong analytical and debugging skills.
- Bachelor's or an advanced degree in Computer Science or a related engineering discipline required.
- Good to have: some exposure to cloud platforms (Azure/AWS/GCP) and understanding of CI/CD pipelines.
- Good to have: knowledge of message brokers or pub-sub systems like Kafka, RabbitMQ, etc.
- Good to have: knowledge of Docker, Kubernetes, and Terraform.
- Good to have: knowledge of Python and experience working on applications that process and analyze large amounts of data.
- Agile software development methodologies.
- Working in multi-functional, multi-location teams.

Hybrid Model: twice a week work from office
Shift Time: 12 pm to 9 pm IST
About the Role: Data Security Lead

The Team: The IP Security team safeguards S&P Global's proprietary assets, including source code, models, datasets, design artifacts, architecture, and know-how, across applications and infrastructure. We set the standards, controls, and guardrails that prevent leakage, misuse, or unauthorized commercialization of intellectual property, and we enable product teams to innovate quickly while staying compliant with enterprise policy and regulatory obligations.

Responsibilities and Impact: The Data Security Engineering Lead, Intellectual Property (IP), is accountable for designing, implementing, and operating S&P Global's enterprise capabilities for protecting intellectual property. This role converts the IP security architecture into scalable, resilient engineering solutions that provide deep visibility into where critical data and IP artifacts reside, what type of data they represent, how they are accessed, and how they are utilized across applications and AI systems. The Engineering Lead oversees the day-to-day technical execution of IP discovery, classification, access monitoring, and protection controls. The role includes leading a team of engineers and partnering closely with architecture, cloud, data, and AI platform teams to ensure comprehensive and consistent safeguarding of the company's intellectual property.
What We're Looking For:

Engineering & Platform Implementation
- Lead the design, deployment, and operation of data security platforms supporting discovery, classification, and monitoring
- Implement enterprise-scale DSPM capabilities across cloud, on-prem, SaaS, and third-party environments
- Ensure platforms are reliable, scalable, and integrated with the broader security ecosystem

Data Discovery & Classification Execution
- Build and operate pipelines to identify where data resides across structured and unstructured sources
- Implement automated data classification and tagging aligned with enterprise standards
- Ensure continuous discovery and reclassification as data changes

Data Sources & Access Monitoring
- Integrate data security tools with IAM, cloud services, and data platforms
- Monitor and surface who has access to data, including privileged, service, and AI identities
- Detect over-permissioned access, stale entitlements, and anomalous data usage

AI Data Consumption Controls
- Implement controls to monitor and govern data used by AI agents, ML models, and training pipelines
- Instrument visibility into data flows supporting AI prompts, training, inference, and outputs
- Partner with AI platform teams to enforce approved data usage patterns

Intellectual Property (IP) Protection Engineering
- Implement IP protection mechanisms, including data watermarking, tagging, fingerprinting, and metadata controls
- Integrate IP protections into data pipelines and AI workflows
- Support detection of and response to potential IP leakage events

Operational Security & Incident Response
- Operate and tune detection mechanisms for data exposure and misuse
- Support security incident investigations by rapidly identifying impacted data and access paths
- Partner with SOC and IR teams to automate alerts and response workflows

Engineering Leadership
- Lead, mentor, and grow a team of data security engineers
- Establish engineering standards, coding practices, and operational runbooks
- Drive automation to reduce manual effort and improve consistency

Cross-Functional Collaboration
- Partner with Data Security Architects, Cloud Security, IAM, Privacy, Legal, and AI teams
- Translate architectural requirements into deployable engineering solutions
- Provide technical input into roadmap planning and platform evaluations

Basic Required Qualifications:
- 10+ years of experience in security engineering, data security, or platform security
- Strong hands-on experience implementing data discovery, classification, and monitoring solutions
- Deep expertise in cloud platforms (AWS, Azure, GCP) and data services
- Experience integrating security platforms with IAM, logging, SIEM, and SOAR
- Practical experience supporting AI/ML platforms and data pipelines
- Familiarity with IP protection techniques, including watermarking or tagging
- Strong scripting or programming skills (Python, Go, or similar)
- Experience operating production security platforms at scale

Preferred Qualifications:
- Experience with DSPM platforms (e.g., Cyera, Wiz DSPM, BigID)
- Background in regulated industries (financial services, data providers)
- Knowledge of DLP, CASB, encryption, and key management
- Security or cloud certifications (CISSP, CCSP, cloud provider certifications)

Location: Gurugram, Noida (Uttar Pradesh), Hyderabad
The Impact: Be a part of a growing global business which empowers its people to provide insight and guidance into how multiple teams interact and how data sets are created and then utilized by our clients.

What's in it for you:
- Join a team which fosters collaboration and client focus, enabling high performers to grow within the team.
- Direct exposure to industry-leading clients who demand operational perfection.
- Further your knowledge, solidifying a foundation of best practices alongside industry experts.

Responsibilities: You will be part of a dynamic team responsible for all client-facing operations-related items, data research, and processing functions on the WSO Specialized Account Services team, which primarily supports WSO users and their loan processing needs. As a Specialized Account Services Sr. Specialist, you are expected to support our clients and the business by orchestrating workflow and prompt resolution between internal and external teams. This includes data reconciliation and validation of trades and positions to ensure syndicated facility data integrity. You will audit and correct any accounts that are material to the fund and/or flagged via a tolerance check for critical reporting needs. You will support the month-end close process by ensuring timely and accurate operational processing and completion of critical audits by business day two, reconcile various accounts and promptly research and correct any variances, monitor and resolve aged exceptions across accounts in partnership with multiple internal and external parties, and assist in the opening and structuring of fund onboarding, ensuring all standard operating procedures are followed.
Additional tasks include but are not limited to: Accurately reviewing and maintaining asset and contract data Overseeing new deal creation and ensuring data point accuracy Processing amendments, investigating discrepancies, and resolving issues Monitoring and resolving exceptions Performing additional syndicated loan maintenance Coaching and assisting other analysts and leadership as needed Maintaining strict adherence to processing deadlines, quality-controlled operational processes, and LSTA/LMA/Private Deals/Middle Market standards for loan transactions and/or trades Proactively identify internal and external processing and system-related issues. Collaborate with internal partners and management on resolution plans and escalations. Lead research and projects on cross-product integrations and workflow enhancements to support client business requirements and strategic growth. This includes direct client engagement, project management, and presentation coordination. Capture and channel user feedback through effective communication and listening. Partner with leaders, peers, product associates, sales, and relationship managers on projects to improve products and services. Support team members with daily project management, data validation, and data deliverables as needed. Be willing to cross-train with neighboring divisions to provide an agile workforce capable of handling issues and engaging with joint clients. What We're Looking For: Basic Required Qualifications: Bachelor's Degree or equivalent work experience 4-6 years' 
experience within financial services in the syndicated loan industry Must be able to handle stressful customer situations with patience and poise Ability to organize and prioritize complex issues and projects to completion Ability to think laterally, provide proper issue analyses, and question current processes Ability to provide guidance and clarification to support team deliverables Performs work effectively with little to no guidance Positive, proactive attitude and ability to work well in teams Exceptional skills in listening to clients, articulating ideas and complex information in a clear and concise manner Proven record of maintaining strong relationships with senior members of client organizations, addressing their needs, and maintaining a high level of client satisfaction Can resolve issues that are often varied and non-routine Additional Preferred Qualifications: WSO and/or Loan IQ experience Public accounting experience from a Big 4 firm Substantial syndicated loan knowledge Strong MS Office (Word, Excel, PowerPoint) skills Data research and analytics Shift Timings - US Shift
The Role: In this position, you will act as both a problem solver and subject matter expert, providing valuable insights to stakeholders and external clients on AI and data capabilities related to credit risk. The Impact: We are looking for a Lead AI Researcher to join our innovative thought leadership team within Credit Solutions. This role uniquely combines applied AI, problem-solving, and technical innovation. The ideal candidate will possess extensive expertise in quantitative disciplines, machine learning, and cloud technologies, while remaining abreast of the rapidly evolving AI landscape, including Retrieval-Augmented Generation (RAG), Agentic AI, and enterprise applications. Responsibilities: Domain Expertise: Serve as a domain expert on Generative and Agentic AI for Credit Solutions. Attain fluency in AI applications, including the use of open-source and internal tools, to enhance AI expertise within Credit Solutions. Hands-On Application: Utilize internal and external AI applications offered by S&P Global to support credit risk analysis and decision-making. Technology Monitoring: Closely monitor emerging technologies, industry adoption, and associated risks, including strengths, limitations, and regulatory guidelines concerning Gen-AI in credit risk. Use Case Development: Develop and test new credit risk use cases utilizing data and AI applications within S&P Global across multiple platforms, including data wrangling and cross-referencing. Data Science and Statistics: Analyze quantitative and qualitative data from various content sets linked to credit ratings within S&P Global to understand their contributions to credit risk surveillance and risk management workflows. Collaborate with the Credit Solutions Thought Leadership team to develop and work on internal applications within a sandbox environment. 
Thought Leadership: Author thought leadership content independently and represent S&P Global at webinars and conferences, focusing on AI advancements in credit risk management. This role offers invaluable exposure to client workflows and is pivotal in driving Credit Solutions' digital transformation journey. You will have the opportunity to evolve into a thought leader in AI and credit risk management while gaining hands-on experience with our internal and external AI tools. What We're Looking For: Master's or Ph.D. degree in Data Science, Statistics, Quantitative Finance, Artificial Intelligence, or a related quantitative/computational field. 3-5 years of relevant experience in AI/ML, quantitative roles, or data science. Ph.D. graduates in a statistics/data science/quant finance discipline from an internationally accredited university with hands-on experience in Gen-AI projects may be considered with 0-3 years of experience. Proficient in Python, R, and SQL. Experience in accessing data via feeds, cloud, or REST API preferred. Prior experience in developing predictive analytics, data wrangling of structured and unstructured data, Natural Language Processing, or other emerging AI and automation use cases in financial services is highly advantageous. General knowledge of GenAI frameworks (e.g., RAG, Agentic AI) and libraries, and of testing and validating AI models (e.g., via cross-validation, A/B testing, and performance metrics) will be helpful. Strong communication skills with a proven ability to solve problems and engage effectively across functions. Self-starter with a proactive mindset, capable of working independently across multiple groups and stakeholders. Location - Gurugram, Mumbai, Maharashtra, Hyderabad
Your Role You will be a key member of the ES-Markets Professional Services Team. Key responsibilities include: Executing product analysis to identify specific requirements and use cases with Private Equity Enterprise software clients. Collaborating with internal team members to identify risks in the process, raise them, and get them resolved appropriately. Acting as an extension of the APAC/US/EMEA client deployment team, helping to implement, customize, and enhance the Portfolio Monitoring and Reporting services provided to Private Equity clients. Independently managing multiple implementation projects using best practices regarding system setup and utilization. Preparing financial reports and templates, and configuring the portal per PE/VC clients' requirements. Balancing multiple client implementations simultaneously and efficiently while maintaining accountability to all stakeholders. Regularly collaborating with Senior Team members, Global Project Managers, and Team Managers regarding forward-looking initiatives. Solving problems quickly and providing solutions to the team and to clients on the product and its functions. Communicating weekly updates internally to the Implementations TLs. About You Are you an analytical thinker who enjoys working with clients and is looking to excel in the highly dynamic and innovative environment of the FinTech and Private Capital Markets space? Then this is the right place for you. Key Qualifications and Skills: MBA in Finance/Business Analytics is required, with a minimum of 3 years of Financial Services industry experience. Professional certifications like CA, CMA, CFA, or FRM are a plus. 3-5 years of experience in software, financial services vendor, or tech/management project management consulting services will be an added advantage for the role. Knowledge of S&P MI's proprietary software (e.g., QVAL, iLEVEL, WSO/EDM) is an added advantage for the role. Experience in the Private Capital Market space is a plus. 
Proficiency in Excel is a must. Thorough understanding of financial reporting and operating statement analysis. Collaborative attitude, with the ability to work cross-functionally with diverse personalities. Intellectual curiosity and inquisitiveness. Excellent communication skills and comfort in driving discussions and presentations with the team, clients, and global stakeholders on a regular basis.
About the Role As a GCP Observability Engineer, you will play a critical role in the design and implementation of enterprise-scale applications within the Google Cloud Platform (GCP) environment. Your expertise in GCP observability services will ensure that our systems are reliable, scalable, and efficient, delivering exceptional value to our internal customers. The Team The Observability and Application Services team, part of Digital Technology Services (DTS), is responsible for designing, implementing, and maintaining large-scale data-based systems that power corporate applications, shared services, and information security systems. This is a rare opportunity to join a dynamic and impactful team providing tangible value to our internal customers, leveraging modern cloud-based platforms and services to deliver insights and drive efficiency. Responsibilities and Impact Serve as the GCP technical lead within a team rapidly growing its GCP usage. Design and implement technical integrations to support enterprise-scale observability and the data pipelines feeding GCP. Develop and maintain CI/CD pipelines and Infrastructure as Code (IaC) practices to ensure robust deployment processes. Deploy OpenTelemetry at scale for monitoring, APM tracing, and performance optimization. Create detailed documentation and architectural diagrams, and present solutions to peers and stakeholders. What We're Looking For Basic Required Qualifications Minimum Bachelor's degree in a technical subject. Proven expertise in Google Cloud Platform (GCP) at enterprise scale, with relevant certifications. Proven scripting/coding experience in at least one of: Bash, Python, PowerShell. Experience working with code repositories (e.g., Git) and CI/CD pipelines. Soft Skills Required Ability to present ideas and designs to both technical and non-technical audiences at all levels. Strong written and verbal communication skills. Strong documentation skills. 
Additional Preferred Qualifications Experience working in an Agile (Scrum) team. Experience with Terraform. Experience with enterprise observability tools and platforms. Knowledge of Agentic AI workflows and understanding of LLM Observability concepts.
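The APM tracing work described above boils down to wrapping units of work in named, timed spans. The following is a minimal, stdlib-only sketch of that idea; a real deployment would use OpenTelemetry's tracer and exporters rather than this toy context manager, and all names here are illustrative assumptions.

```python
import time
from contextlib import contextmanager

@contextmanager
def span(name, spans):
    """Toy tracing span: appends (name, duration_in_seconds) to `spans`.

    Stand-in for an OpenTelemetry span; used only to illustrate the
    span/duration model behind APM tracing and performance monitoring.
    """
    start = time.perf_counter()
    try:
        yield
    finally:
        spans.append((name, time.perf_counter() - start))
```

A pipeline step could then be timed with `with span("load", spans): ...`, and the collected durations fed into dashboards or alerting.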
What You'll Do You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud-native environment to: Build and support data ingestion and processing pipelines in the cloud. This will entail extraction, load, and transformation of big data from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies. Partner with the product team to assemble large, complex data sets that meet functional and non-functional business requirements; ensure build-out of Data Dictionaries/Data Catalogues and detailed documentation and knowledge around these data assets, metrics, and KPIs. Warehouse this data; build data marts, data aggregations, metrics, KPIs, and business logic that lead to actionable insights into our product efficacy, marketing platform, customer behaviour, retention, etc. Build real-time monitoring dashboards and alerting systems. Coach and mentor other team members. Who you are 6 to 8 years of experience in Big Data and Data Engineering. Strong knowledge of advanced SQL, data warehousing concepts, and data mart design. Strong programming skills in SQL, Python, etc. Experience in design and development of data pipelines and ETL/ELT processes, on-premises and in the cloud. Experience with one of the cloud providers: GCP, Azure, AWS. Experience with relational SQL and NoSQL databases, including Postgres. Experience with workflow management tools: AWS Data Pipeline, Google Cloud Composer, etc. Experience with distributed version control environments such as Git or Azure DevOps. Experience building Docker images and fetching/promoting/deploying them to production. Experience integrating with container orchestration frameworks (ConfigMaps, deployments) using Terraform. Should be able to convert business queries into technical documentation. Strong problem-solving and communication skills. Bachelor's or an advanced degree in Computer Science or a related engineering discipline. 
Good to have: exposure to Business Intelligence (BI) tools like Tableau, Dundas, Power BI, etc.; Agile software development methodologies; working in multi-functional, multi-location teams. Hybrid Model: work from the office twice a week. Shift Time: 12 pm to 9 pm IST.
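To make the warehouse/KPI responsibilities above concrete, here is a minimal, hedged sketch of a batch transform that rolls raw product events up into a daily active-users metric. The field names (`date`, `feature`, `user_id`) are hypothetical, not the team's real schema; in a real pipeline this step would run after extraction from batch or streaming sources.

```python
from collections import defaultdict

def aggregate_daily_active_users(events):
    """Roll raw events up into a per-(date, feature) distinct-user count.

    `events` is an iterable of dicts with hypothetical keys
    'date', 'feature', and 'user_id'.
    """
    daily_users = defaultdict(set)
    for event in events:
        daily_users[(event["date"], event["feature"])].add(event["user_id"])
    # KPI: number of distinct active users per feature per day
    return {key: len(users) for key, users in daily_users.items()}
```

The same shape of aggregation is what a data mart or dashboard query would compute; using a set per key deduplicates repeat events from the same user.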
The Role Project Manager, Collection Platforms and AI The Team The Collection Platforms and AI team is responsible for driving key automation projects, AI/machine learning based solutions, collection platforms, and lean initiatives across the Enterprise Data Organization. We are responsible for creating, planning, and delivering transformational projects for the company using state-of-the-art technologies and data science methods, developed either in-house or in partnership with vendors. We are transforming the way we collect the essential intelligence our clients need to make decisions with conviction, delivering it faster and at scale while maintaining the highest quality standards. The Impact This role will be a part of IT Product Management and will work in a team responsible for leading the project management side of strategic initiatives related to Data Science, ML, Gen AI/LLM-based solutions, and Data Analytics. The role involves creating and maintaining momentum with stakeholders, focusing on value and incremental results while continuously looking for the latest technologies and machine learning solutions available for various use cases. What's in it for you: You will be part of a team at the center of the transformation of a Fortune 500 company. You will be spearheading strategic projects, witnessing the whole project life cycle, and deploying end-to-end machine learning based data extraction solutions. You will be working closely with stakeholders of various teams to understand their business challenges and to provide cutting-edge solutions to them. You will be getting immense exposure to commercial discussions with vendors and will actively participate in discussions with Legal, Finance, and Technology teams. Collaborate with global stakeholders to solve complex business challenges using data analytics and AI/ML technologies. You will be part of a diverse, global, and dynamic team. 
Responsibilities Interpret project requirements, analyze and develop project plans, and lead end-to-end execution of AI/ML and analytics projects. Engage with business leads and peers, and interface with senior leaders across the organization. Build strong internal and external relationships and ensure service quality and customer satisfaction. Build and present strategic insights using statistical analysis and data visualization to senior leadership. Act as an extension of our client service model, ensuring seamless service delivery across Business Operations for our clients. Actively participate in the evaluation of machine learning based and data analytics solutions, and lead the proof of concept for those solutions in collaboration with the Data Operations team and vendors. Build, synthesize, and present strategic insights and recommendations to senior stakeholders, and ensure continuous communication with project stakeholders. Partner with Data teams and SMEs to drive the adoption of data standards, definitions, structures, hierarchies, taxonomies, attributes, and models to support and optimize end-to-end business solutions. Partner with IT and Scrum teams to develop documentation and communication of system and process requirements to ensure smooth implementations. Act as a link between Scrum teams and vendors for any technical requirement. What are we looking for? Basic Qualifications 7+ years of project management experience with a proven track record of leading and/or collaborating with teams, operations, and vendors in complex global matrix structures and business models. Technical experience/background in similar projects, especially related to the implementation of Data Science, machine learning, Gen AI/LLM, and Data Analytics based projects. Strong understanding of statistics, hypothesis testing, and predictive modeling. Familiarity with SQL, Python/R, and data visualization tools (Power BI, Tableau). 
Passion for understanding, learning, and identifying the latest technology trends and machine learning based solutions. Ability to drive initiatives independently, with minimal supervision, while ensuring transparency to team members, SMEs, and other stakeholders. Ability to effectively communicate and collaborate with key stakeholders to generate value for the organization. Open-minded professional who enjoys and is used to working in a multicultural company. Preferred Qualifications Certified Project Management Professional (PMP) or PRINCE2 is a plus. Machine Learning, Gen AI/LLM, Python, and cloud exposure. A demonstrable track record of continuous self-learning is valued. A degree in a relevant field such as Computer Science, Engineering, Data Science, or Statistics is preferred. SAFe/Agile experience.
OSTTRA India The Role: Associate, Order to Cash (Collections) The Team: The OSTTRA Finance Team is based out of the UK, India, Penang, Sweden, and the US. It is an entrepreneurial team that is highly energetic, flexible, and dynamic, and works beyond boundaries to deliver real value. We are standing up an independent global finance team. The Impact: OSTTRA has the largest financial institutions in the world as clients. We believe that long-term value to the client is the most important goal of our team, and technology is the tool to bring that value; your contribution in the form of delivery, improvements, and innovations can have a significant impact on the financial system. What's in it for you: The Associate - Order to Cash (OTC) Collections role will be part of the OSTTRA Finance Team described above, and this is a great opportunity to be part of OSTTRA. Work Hours: UK Shift Responsibilities: The Collections Associate will focus on the execution and accurate reporting of collections activities. Core Collections & Account Execution Execute the daily collections process, focusing on timely follow-up for outstanding invoices and resolution of basic client issues. Maintain complete, accurate control of client emails and the list of open issues, clearly documenting actions for end-to-end closure. Assist in developing and implementing standard, realistic payment plans and schedules. Proactively identify and resolve basic discrepancies or issues that may arise during the collection cycle, such as billing errors or missing PO information. Systems and Documentation Execute the collections process with a good understanding of OSTTRA businesses and product data. Support the maintenance of relevant SOPs (Standard Operating Procedures). 
Develop a working understanding of the different billing systems, specifically SAP, and log issues with the respective technical teams. Support the Billing Team by assisting with the data gathering necessary for invoice amendments. Stakeholder Management & Reporting Collaborate with internal departments (Sales, Technology, Finance) and customers to resolve routine billing errors and payment disputes and help expedite the collections process. Assist in generating data and preparing reports for the Commercial Dashboard. Assist in producing and tracking daily/weekly metrics including, but not limited to: Cash Inflow, Collection Effectiveness Index (CEI), and Recovery Rate. Process & Audit Support Provide support during audit engagements for the collections process by gathering required documentation. Assist in identifying basic opportunities to enhance and streamline existing processes and close gaps. Key Metrics: Responsible for producing metrics including but not limited to: Cash Inflow forecast to Treasury Collection Effectiveness Index (CEI) Recovery Rate Right Party Contact (RPC) Rate What We're Looking For: Basic Required Experience: To be successful in this role, the Associate - Order to Cash (OTC) Collections must possess a mix of technical and professional competencies: Experience: 5-7 years of relevant experience in an OTC Collections process with a global organisation. Financial Services experience will be preferred. Stakeholder Management: Managing key stakeholders across teams such as Sales, Technology, and Finance across multiple locations. Customer Management: Ability to resolve customer queries with effective resolution within agreed timescales. Teamwork: A collaborative and cooperative attitude, with the ability to work effectively with cross-functional teams. 
Communication: Exceptional verbal and written communication skills, with the ability to present complex billing information clearly and concisely to key stakeholders. Good presentation skills are a must. Analytical skills: Strong ability to analyse billing data, identify trends, and explain variances. Attention to Detail: A meticulous approach to reviewing historical outstanding invoicing data and identifying and resolving issues with stakeholders and customers. Candidates should be highly accurate in data entry and record keeping. Adaptability: Ability to thrive in a fast-paced, dynamic environment and manage competing priorities effectively. Problem-solving: A proactive and solution-oriented mindset, with the ability to identify issues and implement effective resolutions, plus strong analytical and critical thinking skills to troubleshoot and resolve complex issues. Organizational Skills: Ability to multitask, prioritize, and manage a high volume of work in a fast-paced environment. Technological proficiency: Strong proficiency in Microsoft Excel (VLOOKUP, pivot tables, advanced formulas) and extensive experience with SAP, HighRadius, and Salesforce environments for billing functionalities. Qualifications: Graduate in Finance; MBA preferred. Specialization in the Order to Cash management process for at least the last 5-7 years. Proven specialization in the collections process.
As an Associate Director, you will be essential in driving customer satisfaction by delivering tangible business results to customers. You will be working for the Enterprise Data Organization and will be an advocate and problem solver for the customers in your portfolio as part of the Collection Platforms and AI team. You will use communication and problem-solving skills to support customers on their automation journey, using emerging automation tools to build and deliver end-to-end automation solutions for them. What are we looking for? You will lead the design, development, and scaling of AI-driven agentic pipelines to transform workflows across S&P Global. This role requires a strategic leader who can architect end-to-end automation solutions using agentic frameworks, cloud infrastructure, and orchestration tools while managing senior stakeholders and driving adoption at scale. A visionary technical leader with knowledge of designing agentic pipelines and deploying AI applications in production environments. Understanding of cloud infrastructure (AWS/Azure/GCP), orchestration tools (e.g., Kubeflow), and agentic frameworks (e.g., LangChain, AutoGen). Technical implementation of: Interoperability amongst external tools/APIs, for example leveraging the Model Context Protocol (MCP). Context engineering: particularly applied short/long-term memory of multi-step interactions that maintains context. Observability and evaluation: keeping track of agent quality, performance, and monitoring. Agent-to-agent interactions and protocols, whether in prototyping or production, to scale multi-agent systems. Proven ability to translate business workflows into automation solutions, with emphasis on financial/data services use cases. An independent, proactive person who is innovative, adaptable, creative, and detail-oriented, with high energy and a positive attitude. 
Exceptional skills in listening to clients and articulating ideas and complex information in a clear and concise manner. A proven record of creating and maintaining strong relationships with senior members of client organizations, addressing their needs, and maintaining a high level of client satisfaction. Ability to understand what the right solution is for all types of problems, understanding and identifying the ultimate value of each project. Operationalize this technology across S&P Global, delivering scalable solutions that enhance efficiency, reduce latency, and unlock new capabilities for internal and external clients. Exceptional communication skills with experience presenting to C-level executives. Responsibilities Engage with multiple client areas (external and internal), truly understand their problems, and then deliver and support solutions that fit their needs. Understand the existing S&P Global products and leverage them as necessary to deliver a seamless end-to-end solution to the client. Evangelize agentic capabilities through workshops, demos, and executive briefings. Educate and spread awareness within the external client base about automation capabilities to increase usage and idea generation. Increase automation adoption by focusing on distinct users and distinct processes. Deliver exceptional communication to multiple layers of management for the client. Provide automation training, coaching, and assistance specific to a user's role. Demonstrate strong working knowledge of automation features to meet evolving client needs. Maintain extensive knowledge of the suite of products and services offered, including ongoing enhancements and new offerings, and how they fulfill customer needs. Establish monitoring frameworks for agent performance, drift detection, and self-healing mechanisms. Develop governance models for ethical AI agent deployment and compliance. 
Preferred Qualifications 12+ years of work experience with 5+ years in the Automation/AI space. Knowledge of: cloud platforms (AWS SageMaker, Azure ML, etc.), orchestration tools (Prefect, etc.), and agentic toolkits (LangChain, LlamaIndex, AutoGen). Experience in productionizing AI applications. Strong programming skills in Python and common AI frameworks. Experience with multi-modal LLMs and integrating vision and text for autonomous agents. Excellent written and oral communication in English. Excellent presentation skills with a high degree of comfort speaking with senior executives, IT management, and developers. Hands-on ability to build quick prototypes/visuals to assist with high-level product concepts and capabilities. Experience in the deployment and management of applications utilizing cloud-based infrastructure. A desire to work in a fast-paced and challenging work environment. Ability to work in cross-functional, multi-geographic teams.
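The context-engineering point above (short-term memory across multi-step interactions) can be sketched as a small agent loop. The `llm` callable below stands in for a real model endpoint (as exposed by frameworks like LangChain or AutoGen); everything here is an illustrative assumption, not S&P Global's actual pipeline.

```python
class ScratchpadAgent:
    """Toy agent that keeps a bounded short-term memory of conversation turns."""

    def __init__(self, llm, max_turns=10):
        self.llm = llm            # callable: list of (role, text) -> reply text
        self.memory = []          # short-term context window
        self.max_turns = max_turns

    def step(self, user_input):
        self.memory.append(("user", user_input))
        # Trim the oldest turns so the context passed to the model stays bounded.
        self.memory = self.memory[-self.max_turns:]
        reply = self.llm(self.memory)
        self.memory.append(("agent", reply))
        return reply
```

Production multi-agent systems would add the observability and evaluation hooks mentioned above around the `llm` call, and would persist long-term memory outside this in-process list.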
Position Summary: The QA Engineer contributes to the design, implementation, and maintenance of enterprise-scale ML/AI systems. This role focuses on hands-on development, system optimization, and technical execution while supporting strategic initiatives. The Team: The candidate will join a growing team of Engineers and Data Scientists as a member of the Data Science and Platforms team in Market Intelligence (MI) Enterprise Solutions (ES) at S&P Global. We work alongside product teams across MI ES on breakthrough ideas using tools and techniques spanning the entire spectrum of Data Science, Statistics, Machine Learning, Deep Learning, Gen AI, Operations Research, and Data and Machine Learning Engineering. We are responsible for the end-to-end data science life cycle, right from ideation to developing, building, testing, and maintaining production data science/ML/AI applications, products, and features. The Impact: This is a great time to be joining a truly global team on a great technology journey. This is a great product covering multi-asset classes in both a pre- and post-trade capacity with real-time insights. If you want to be an integral part of this forward-thinking team, with a drive to succeed and the opportunity to enhance your development career and expand your technical skill sets, then this is the role for you. Your challenge will be reducing the time to market for products without compromising quality, using innovation and technical skills. You will be a member of the team and will have the opportunity to write, design, and execute various test plans as well as plan and implement automation to speed up our testing and regression cycles. 
What's in it for you: Be a part of an industry-leading, Fortune 500 company Be a part of a GREAT PLACE TO WORK Certified firm Be a part of a People First organization that values Partnership, Integrity, and Discovery to Accelerate Progress Develop and deliver industry-leading software solutions using cutting-edge technologies and the latest toolsets. Plenty of training and development programs that support continuous learning and skill enhancement. Build a fulfilling career with a truly global and leading provider of financial market intelligence, data, and analytics. Responsibilities Analyse requirements, define and design test scenarios, and write and execute test cases. Develop and maintain modular, reusable test frameworks aligned with best practices. Collaborate with developers and DevOps teams to ensure high-quality releases. Run regression on solutions. Perform performance and load testing using JMeter, analyze results, and provide optimization recommendations. Troubleshoot test failures, perform root cause analysis, and maintain test stability. Participate with the team in developing user stories, bugs, and tasks. Contribute and adhere to QA best practices. Prepare and maintain test documentation. Contribute to continuous improvement of testing processes and tools. What We're Looking For: You're a self-motivated, passionate, and high-performing engineer who shows a passion for testing technologies new and old. Thorough understanding of test principles such as unit, smoke, functional, user acceptance, and usability testing. You have excellent communication skills with a strong ability to articulate technically complex subjects to audiences of any technical level. The ideal candidate must be technical, be able to troubleshoot and problem-solve, have a great attitude, be able to work both in a team and independently, and possess leadership qualities. Basic Qualifications: Strong understanding of SDLC, STLC, and Agile methodologies. 
4+ years' experience as a QA engineer Degree in MIS or CS Strong experience in test automation frameworks like Selenium and Pytest Strong proficiency in Python (OOP concepts, scripting, libraries) Expertise in API testing (REST, Swagger validations) Knowledge of CI/CD tools (GitLab, Jenkins) Familiarity with cloud platforms (Azure, AWS) Hands-on performance testing with JMeter (spike testing and load testing) VCS familiarity (GitHub, GitLab, or similar) Strong debugging and analytical skills Required Qualifications: Bachelor's degree in Computer Science Engineering or a related Engineering field / Master's in Computer Applications. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction.
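As a small illustration of the API-testing skills listed above, here is a hedged, stdlib-only contract check of the kind a Pytest suite might assert against a REST response. The field names are hypothetical, not a real API schema; in practice the payload would come from an HTTP client call and the required fields from a Swagger/OpenAPI definition.

```python
def validate_response(payload, required_fields):
    """Minimal API contract check: every required field present and non-null.

    Returns (ok, missing_fields) so a failing test can report exactly
    which fields of the (hypothetical) response contract were violated.
    """
    missing = [field for field in required_fields if payload.get(field) is None]
    return (len(missing) == 0, missing)
```

A Pytest test would then read `ok, missing = validate_response(resp.json(), ["id", "status"]); assert ok, missing`.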
The Team: The CIQ External API team is responsible for managing all external APIs that enable clients to access and consume data from multiple data sources, including CIQ, CIQ Pro, and IPD. These APIs support over 800 active external clients, making this a critical client-facing component of our business. The team is currently focused on expanding integrations with new data providers such as ChatIQ, Visible Alpha, and SNL Financial, as well as enhancing existing functionality for client consumption and improving overall API performance. The team also ensures the GDS API framework scales with growing usage, modernizes it where necessary, and performs mandatory maintenance, BAU work, and general systems upkeep. Further goals are to improve the Disaster Recovery/High Availability posture and ultimately migrate CIQ to CIQ Pro. The team works in a challenging environment that gives ample opportunity to apply innovative ideas to complex problems. Every day you will work with people from a wide variety of backgrounds and develop a close team dynamic with coworkers from around the globe.

The Impact: This role is critical: the individual will contribute to the development and delivery of ChatIQ and GQL onboarding in GDS, as well as delivery to external clients. They will also manage the GDS infrastructure, ensure that existing API functionality is not broken, and support any client issues.

What's in it for you: Are you looking for an exciting and engaging place to work? A place where you can continue to grow your skills, in a stable environment where we put our people first.
- Build a career with a global company
- Opportunity to grow personally and professionally
- Grow and improve your skills by working on enterprise-level products and new technologies
- Exposure to new technologies (AWS, GQL, MCP, OKTA)
Responsibilities:
- Design and develop software based on business requirements, workflows, and use cases; execute and maintain test cases and document test results
- Drive the CIQ Pro migration from CIQ
- Drive the setup, configuration, and maintenance of the environment and framework
- Actively participate in continuous improvement of testing processes and procedures by analyzing reported bugs/issues, with a strong emphasis on using automation to shorten the SDLC
- Identify, prioritize, and execute tasks in an Agile software development environment
- Mentor and provide technical guidance to team members
- Communicate effectively regarding issues, testing status, and potential risks
- Be eager to learn new technologies and approaches and be part of a very progressive test team

What We're Looking For:
- Bachelor's degree in Computer Science, Information Systems, or Engineering
- 5-9 years of experience building web applications, with strong hands-on expertise in Java, Spring Boot, and JavaScript
- Proven experience implementing RESTful APIs using Java
- Strong working knowledge of SQL Server or Oracle databases
- Familiarity with modern frontend frameworks such as React.js or Angular
- Proficiency with SDLC methodologies, including Agile and Test-Driven Development (TDD)
- Excellent problem-solving and troubleshooting skills
- Experience working in cloud environments, especially AWS