Jobs
Interviews

1127 Data Extraction Jobs - Page 18

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 8.0 years

5 - 11 Lacs

Chennai, Bengaluru

Hybrid

Greetings! We have an immediate opening for a Test Data Management position. Mandatory skills: Test Data Management, K2view, Data Extraction, Data Masking, Data Generation, Data Maintenance, Data Reservation, SQL. Experience: 5+ years (5+ years relevant). Work location: Chennai, Bangalore. Mode of interview: first round virtual, second round face to face. Job description: Must have at least 5 years of relevant experience in Test Data Management (TDM), especially with K2View TDM software. Experience implementing TDM solutions, including data extraction, data masking, data generation, and on-demand data provisioning enabled with data reservation and data maintenance (data purging, update scripts, and reuse). Strong at writing SQL per business requirements to provision the right test data for each test scenario. Experienced working in an Agile model and managing client and application stakeholders. Good to have: experience provisioning data from mainframe applications to lower test environments using TDM software. Interested candidates, please share your updated resume to the given mail ID: elakkiya.t@truetechdigital.com Thanks & Regards, T.Elakkiya Sr. TAG

Posted 1 month ago

Apply

0.0 - 5.0 years

4 - 9 Lacs

Thane

Remote

Managing master data, including creation, updates, and deletion. Managing users and user roles. Providing quality assurance of imported data, working with quality assurance analysts where necessary. Required candidate profile: Ability to work with stakeholders to assess potential risks. Analyze existing tools and databases and provide software solution recommendations. Understanding of addressing and metadata standards. Perks and benefits: Flexible work arrangements. Bonuses.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office

Relevant experience required: 6+ years of total experience, with a minimum of 3 years as an Analytics Engineer or in a similar role. Your department: As an Analytics Engineer within the Enabling Functions department of Vanderlande, you support Finance, HR, and IT by delivering robust, scalable, and future-ready data solutions. Our mission is to empower these departments with high-quality, governed data products that drive operational excellence and strategic decision-making. At the core of our work is the Vanderlande Data Platform, a cloud-native solution built on Azure and aligned with Data Mesh principles. This platform enables us to ingest, transform, combine, and enrich data from diverse source systems, creating trusted, reusable data assets and a reliable source of truth tailored to the needs of our Enabling Functions stakeholders. By utilizing advanced analytics techniques and the latest technology, our focus on innovation empowers us to continuously improve. Operating as a scaled Agile organization, we embrace the SAFe framework and work collaboratively across different teams, all united by a common backlog and shared roadmap. This approach allows us to be highly adaptable, continuously adjusting to the evolving strategic needs of stakeholders within Vanderlande. Responsibilities: As an Analytics Engineer at Vanderlande you'll be at the forefront of driving data-driven solutions and shaping the future of our organization by working on the data platform.
Your key responsibilities will involve translating business needs into functional requirements, designing and developing data products, pipelines, and reports, and analyzing data to solve use cases, resulting in optimized business processes and fact-based decision-making. As part of a cross-functional full-stack team, you and your colleagues are responsible for creating and delivering end-to-end solutions to our business stakeholders. In this role, you are a tech-savvy, action-oriented, and collaborative colleague who can wear multiple hats: part Data Engineer, part Data Analyst. Even though you have an area of expertise, you can fulfil each of those roles up to a point. On a day-to-day basis, you will focus on creating and maintaining data products and data pipelines using Python, PySpark, and SQL, and dashboards using tools like Qlik or Power BI. Working in an Agile environment, you proactively contribute to scrum events, ensuring seamless coordination with the team and swift adaptability to changing priorities, and you thrive in a fast-paced, iterative development environment where constant feedback and continuous improvement are key. In this role, you: Translate business needs into functional requirements, providing essential information on business use cases. Translate functional requirements into thorough and feasible data products, analytics solutions, and dynamic dashboards. Utilize Python, PySpark, SQL, or R for data retrieval and manipulation. Develop, test, and maintain data products and pipelines using the Azure stack and Databricks, ensuring data reliability and quality. Design and implement architectures for efficient data extraction and transformation. Work on creating and maintaining landing zones in the data platform.
Actively participate in and contribute to Continuous Integration and Continuous Deployment (CI/CD) practices, ensuring smooth and efficient development and deployment processes within the data platform. Integrate data pipelines and reports into testing frameworks, allowing for rigorous performance testing and validation. Monitor and maintain data pipeline stability, offering support when required. Analyze, interpret, and visualize data to drive business process optimization and fact-based decision-making. Create, deploy, and maintain interactive dashboards and visualizations using Qlik or Power BI. Perform comprehensive analyses and proactively implement solutions to assess and enhance data quality and reliability. Strive for continuous enhancement of processes and development of our data products. Stay updated with the latest developments in the analytics field and share knowledge with the team. Your qualifications and skills: If you're an experienced, enthusiastic, and versatile Analytics Engineer, you will bring: A minimum of a bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). 6+ years of total experience, with a minimum of 3 years working as an Analytics Engineer or in a similar role. Ability to work effectively as part of a cross-functional international full-stack team, collaborating with other developers and stakeholders. Experience writing code in Python, PySpark, SQL, or R for data retrieval and manipulation is a strong preference. Demonstrated proficiency with the Azure stack and Databricks is required for this role.
Proficiency or interest in using DevOps practices and tools for continuous integration, continuous delivery (CI/CD), and automated deployment of data products and data pipelines. Strong communication skills to convey complex technical concepts clearly to both technical and non-technical stakeholders. Enthusiastic, proactive, driven, and actively seeking opportunities for personal development and growth. Demonstrated curiosity and commitment to staying updated with the latest trends, tools, and best practices in the data & analytics field. Familiarity with agile methodologies and comfort in their dynamic, collaborative environment. Knowledge of visualization tools like Power BI or Qlik is a plus.

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Hybrid

IRB Scorecard Analyst (Python). Location: Bangalore (Hybrid). Experience: 3-8 years. Joining: immediate to 30 days. What You'll Do: Develop and maintain Basel IRB scorecards (PD/LGD/EAD) end-to-end using Python. Perform feature engineering, binning, weight-of-evidence transformation, and model calibration. Back-test and monitor scorecard performance; investigate PSI/KS drift and data anomalies. Extract and profile data via SQL; prepare regulator-ready documentation. Must have: 3-8 years in IRB scorecard development or credit-risk analytics at a bank/consultancy. Expert in Python for statistical modeling (pandas, scikit-learn, statsmodels). Strong SQL skills for data extraction and profiling. Solid understanding of Basel II/III IRB guidelines and risk-weight parameters. Nice to have: Hands-on experience with SAS or R for supplementary analyses. Familiarity with model-risk management frameworks (SR 11-7/RBI MRM). Experience automating workflows via Python scripting or CI/CD. If interested, kindly share your resume to my email ID: divya.sehgal@thethreeacross.com
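The binning and weight-of-evidence transformation mentioned in this listing can be sketched in a few lines of Python; the income bins and good/bad counts below are invented purely for illustration, not taken from any real scorecard:

```python
import math

def weight_of_evidence(bins):
    """WoE per bin: ln((good_i / total_good) / (bad_i / total_bad))."""
    total_good = sum(good for good, bad in bins.values())
    total_bad = sum(bad for good, bad in bins.values())
    return {
        name: math.log((good / total_good) / (bad / total_bad))
        for name, (good, bad) in bins.items()
    }

# Hypothetical income bins: (non-defaulters, defaulters) in each bin
bins = {"low": (100, 40), "mid": (300, 30), "high": (600, 10)}
woe = weight_of_evidence(bins)
```

A positive WoE marks a bin with a lower-than-average default rate, which is why the transformation feeds naturally into logistic-regression scorecards.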

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 12 Lacs

Bengaluru

Hybrid

Data/Business Analyst. Experience: 5-8 years. Bangalore, hybrid/remote. Shift: night shift. Notice: immediate joiners only. Tool proficiency: Microsoft Excel, Google Sheets, Microsoft Power BI. Role: Business/Data Analyst. Our main work is to gather and interpret data to solve specific problems. The role includes plenty of time spent with data but entails communicating findings too. Gather data: Analysts often collect data themselves. Clean data: Raw data might contain duplicates, errors, or outliers; cleaning the data means maintaining data quality in a spreadsheet. Model data: This entails creating and designing the structures of a database. You might choose what data types to store and collect, establish how data categories are related, and work through how the data appears. Interpret data: Interpreting data involves finding patterns or trends that help you answer the question at hand. Present: Communicating the results of your findings is a crucial part of the job. You create visualizations like charts and graphs, write reports, and present information to interested parties. Drop an email with your updated resume and basic details to js001102065@techmahindra.com if you find this role and criteria relevant.
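The "clean data" step described here, removing duplicates and flagging outliers, can be sketched with the Python standard library alone; the order rows and the MAD-based outlier rule are illustrative choices, not this team's actual method:

```python
import statistics

def clean(rows, key):
    """Drop exact duplicate rows, then drop outliers using the robust MAD rule."""
    deduped = list({tuple(sorted(r.items())): r for r in rows}.values())
    values = [r[key] for r in deduped]
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return deduped  # no spread at all, nothing to flag
    # 1.4826 * MAD approximates one standard deviation for normal data
    return [r for r in deduped if abs(r[key] - med) <= 3 * 1.4826 * mad]

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 1, "amount": 120.0},        # duplicate entry
    {"order_id": 2, "amount": 95.0},
    {"order_id": 3, "amount": 110.0},
    {"order_id": 4, "amount": 9_000_000.0},  # likely data-entry outlier
]
cleaned = clean(rows, "amount")
```

The median-based rule is used here instead of a plain 3-sigma cut because a single extreme value inflates the standard deviation enough to hide itself in small samples.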

Posted 1 month ago

Apply

3.0 - 8.0 years

12 - 22 Lacs

Bengaluru

Hybrid

IRB Scorecard Analyst (Python). Location: Bangalore (Hybrid). Experience: 3-8 years. Joining: immediate to 30 days. What You'll Do: Develop and maintain Basel IRB scorecards (PD/LGD/EAD) end-to-end using Python. Perform feature engineering, binning, weight-of-evidence transformation, and model calibration. Back-test and monitor scorecard performance; investigate PSI/KS drift and data anomalies. Extract and profile data via SQL; prepare regulator-ready documentation. Must have: 3-8 years in IRB scorecard development or credit-risk analytics at a bank/consultancy. Expert in Python for statistical modeling (pandas, scikit-learn, statsmodels). Strong SQL skills for data extraction and profiling. Solid understanding of Basel II/III IRB guidelines and risk-weight parameters. Nice to have: Hands-on experience with SAS or R for supplementary analyses. Familiarity with model-risk management frameworks (SR 11-7/RBI MRM). Experience automating workflows via Python scripting or CI/CD. If interested, kindly share your resume to my email ID: simran.salhotra@portraypeople.com

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

JD for Azure Databricks. Role description: A minimum of 5 years' experience with large SQL data marts and expert relational database experience. Candidates should demonstrate the ability to navigate massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners. Experience in troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues and user access/management; designing database backup, archiving, and storage; performance tuning; ETL import of large volumes of data extracted from multiple systems; capacity planning. Competencies: Python, Databricks, PySpark, Azure Data Factory, MySQL. Experience (years): 6-8. Essential skills: Strong knowledge of Extraction, Transformation and Loading (ETL) processes using frameworks like Azure Data Factory, Synapse, or Databricks; establishing cloud connectivity between systems like ADLS, ADF, Synapse, and Databricks. Databricks Architect, Cloud Architect, Python, SQL. Desirable skills: Design and develop ETL processes based on functional and non-functional requirements in Python/PySpark within the Azure platform.
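The core of this role is ETL on Azure Data Factory/Synapse/Databricks. As a deliberately tool-agnostic sketch of the extract-transform-load pattern itself (SQLite stands in for a real data mart, and the table names and rejection rule are invented):

```python
import sqlite3

def run_etl(conn):
    cur = conn.cursor()
    # Extract: pull raw order rows from the source table
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: aggregate revenue per customer, rejecting malformed rows
    totals = {}
    for customer, amount in rows:
        if amount is None or amount < 0:
            continue  # quarantine bad records instead of loading them
        totals[customer] = totals.get(customer, 0) + amount
    # Load: write the cleaned aggregate into the reporting table
    cur.execute("CREATE TABLE IF NOT EXISTS revenue"
                " (customer TEXT PRIMARY KEY, total REAL)")
    cur.executemany("INSERT OR REPLACE INTO revenue VALUES (?, ?)", totals.items())
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 50.0), ("globex", -1.0)])
run_etl(conn)
```

In a real Databricks pipeline the same three stages would be PySpark reads, DataFrame transformations, and Delta table writes; the shape of the job is the same.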

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

About DrinkPrime: We're DrinkPrime, a startup based in Bengaluru that's all about making clean, safe, and healthy drinking water accessible to everyone. We provide cutting-edge IoT-enabled customized water purifiers on a subscription basis, and we currently serve more than one lakh users across Bengaluru, Delhi NCR, and Hyderabad. We're not just another boring water purifier manufacturing company: we're a forward-thinking drinking water solution provider that speaks directly to our target audience, and this has earned us the support of some big VCs in the industry. We're looking for awesome tech professionals who want to join us in creating history and making a real difference. So why not come and be a part of our journey towards a better future? Job overview: The Senior Data Analyst will play a key role in collecting, analyzing, and interpreting complex data to inform business decisions. The ideal candidate will have extensive experience with data analytics, statistical analysis, and visualization tools. This position requires a deep understanding of business operations, excellent problem-solving skills, and the ability to communicate data-driven insights to stakeholders at all levels. Key Responsibilities: - Analyze large datasets to identify trends, patterns, and insights that can inform business decisions. - Develop and maintain interactive dashboards and reports to monitor key business metrics. - Collaborate with business units to understand data needs and translate them into actionable analyses. - Design and implement data models for predictive and prescriptive analytics. - Build and maintain forecasting models to support strategic planning. - Develop and enhance reporting processes and metrics to ensure they align with business goals. - Ensure data integrity and accuracy by performing data validation and cleaning tasks. - Collaborate with data engineering teams to improve data pipelines and data governance practices.
- Lead ad-hoc data analysis projects, ensuring timely delivery and quality of insights. Qualifications: - Experience - 5+ years of experience in data analysis, business intelligence, or a related field. - Proven track record of working with large datasets and deriving actionable insights. - Experience in data visualization, dashboard creation, and reporting (e.g., Tableau, Power BI, Looker, etc.). - Technical Skills: - Proficiency in SQL for data extraction and manipulation. - Strong knowledge of statistical analysis and techniques. - Expertise in Excel or other spreadsheet tools for complex data analysis. - Familiarity with scripting languages such as Python, R, or similar. - Experience with data warehousing, ETL processes, and big data tools is a plus.
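Among the duties above is building forecasting models to support strategic planning. A minimal trailing-moving-average forecast, with invented signup numbers and a window size chosen only for illustration, might look like:

```python
def moving_average_forecast(series, window=3, horizon=2):
    """Forecast the next `horizon` points as the mean of the trailing window."""
    history = list(series)
    forecasts = []
    for _ in range(horizon):
        avg = sum(history[-window:]) / window
        forecasts.append(avg)
        history.append(avg)  # feed the forecast back in for multi-step output
    return forecasts

monthly_signups = [120, 130, 125, 140, 150]  # hypothetical KPI series
forecast = moving_average_forecast(monthly_signups)
```

A production model would add trend and seasonality handling (e.g. exponential smoothing), but the feed-forward structure for multi-step horizons is the same.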

Posted 1 month ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

As a BI Developer, you'll turn raw data into stunning visual stories that drive decisions. Collaborate with clients, create jaw-dropping dashboards, and lead end-to-end BI projects. If you love transforming insights into action and thrive in a vibrant consulting environment, we want you on our team! What You'll Tackle Each Day: End-to-End BI Implementation: Develop and manage the full BI lifecycle, from data modelling and report building to delivery and post-implementation support. Tableau Development: Design, develop, and maintain interactive and visually compelling dashboards and reports in Tableau. SQL Expertise: Write efficient SQL queries for data extraction, transformation, and analysis; PySpark experience is an added advantage. Ability to independently manage end-to-end dashboard development projects with minimal supervision, taking full ownership of design, data integration, and deployment activities. Business Knowledge: Collaborate with clients to understand their business needs and provide actionable insights through BI solutions. Cross-Tool Integration: Experience with other BI tools such as Power BI or Qlik Sense. With 5 to 7 years of experience in Business Intelligence, focusing on Tableau development and SQL, you consistently deliver impactful BI solutions. A strong understanding of data visualization best practices.
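As an illustration of the "efficient SQL for data extraction" this role calls for, here is the kind of aggregate-in-the-database query that typically feeds a dashboard; the schema and figures are invented, and SQLite stands in for the real warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('South', '2024-01', 100), ('South', '2024-02', 150),
  ('North', '2024-01', 80),  ('North', '2024-02', 60);
""")

# Aggregate in the database, not in the BI tool, so only the
# summary rows (one per region) travel to the dashboard layer.
query = """
SELECT region, SUM(amount) AS total, AVG(amount) AS avg_sale
FROM sales
GROUP BY region
ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
```

Pushing the GROUP BY into the warehouse is what keeps Tableau extracts small and refreshes fast as the fact table grows.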

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Mumbai

Work from Office

The role: The Data Analyst, Investment Team is a vital role within the Blenheim Chalcot portfolio and BCI Finance. A Data Analyst on the Investment Team supports investment professionals by analyzing financial, market, and economic data to identify trends, risks, and opportunities. They build models, dashboards, and reports to guide investment decisions, ensuring strategies are data-driven and aligned with performance goals. You will gain hands-on experience in a fast-paced and progressive environment, where you will support us in building our next generation of GenAI-enabled tech businesses. Key responsibilities and duties: Run-off modelling: Build, maintain, and analyse run-off models to assess credit security against borrower loan portfolios. Contribute to quarterly IFRS 9 provisioning by updating run-off models. Run scenario and sensitivity analysis for potential new deal structures. Borrower data analysis: Conduct loan-level analysis to identify performance trends, portfolio risks, and concentration issues. Support investment memo preparation with credit risk analysis. Update and maintain Power BI dashboards for ongoing borrower monitoring. Feed back on emerging trends in BCI's portfolio during monthly monitoring sessions. Data management and platform building: Manage data uploads from borrower Excel files into a PostgreSQL database; maintain ongoing data integrity. Help map new borrower data into the existing data framework. Assist in developing Python-based analytics tools for internal use by non-technical team members. Technical experience: Necessary tools: Excel for complex model building and analysis; Python for in-depth data analysis and development of complex models; AI tools (Cursor, ChatGPT, etc.)
used to accelerate analysis and scripting. Nice-to-haves: SQL (PostgreSQL) for data extraction, uploads, and structuring; Power BI for dashboarding and trend analysis. Support in SQL, Python, and Power BI will be available from BCI's India team; a good Excel background with some knowledge of or experience with Python is required. About you: The ideal candidate will have a track record of delivering results in a fast-moving business and hence be comfortable with change and uncertainty. Excellent stakeholder management experience is essential to success in this role. Qualifications and technical or professional experience: Strong quantitative background (STEM or finance-heavy academic background). Solid experience with Excel and financial model logic. Working knowledge of Python and/or SQL. Basic understanding of credit risk and investment structures. Ability and experience working with large datasets. About Blenheim Chalcot: Blenheim Chalcot is one of the leading venture builders in the world. We have been building exciting and disruptive businesses for over 26 years across sectors including FinTech, EdTech, GovTech, Media, Sport, Charity, and more. These companies are all GenAI-enabled and are some of the most innovative companies in the UK and, increasingly, around the world. The BC team in India has been instrumental to the growth and success of Blenheim Chalcot. Established in 2014, Blenheim Chalcot India serves as a pivotal launchpad for those aiming to make a difference in the realm of innovation and entrepreneurship. Blenheim Chalcot India is driven by a mission to empower visionaries to lead, innovate, and build disruptive solutions. We support our diverse portfolio of ventures and create impactful solutions that shape global trends.
We provide a range of services to help new businesses get off the ground, including technology, growth (marketing and sales), talent, HR, finance, legal, and tax, plus so much more! One of our FinTech ventures, BCI Finance, is scaling fast and we're looking to hire high-energy, motivated, and curious talent to support them on that journey! About BCI Finance: BCI Capital, part of Blenheim Chalcot, is a private-credit-specialist investment manager focused on supporting high-growth fintechs with flexible debt solutions. With a strong track record through its Credit Opportunities Fund, BCI aims to build long-term borrower relationships by offering empathetic, growth-focused funding. The role involves supporting the Loan Operations team with the daily administration and oversight of the existing loan portfolio. What we can offer you: Be part of the world's leading digital venture builder. Be a part of, and learn from, the incredibly diverse talent in BC. Be exposed to the right mix of challenges, within a culture that promotes continuous learning and development, with the opportunity to work with GenAI. A fun and open, if a little cricket-obsessed, atmosphere: we own the Rajasthan Royals IPL team! 24 days of annual leave & 10 public holidays. Private medical cover for you and your immediate family & life insurance for yourself. Important: At Blenheim Chalcot, we strive to create an environment where differences are not only accepted but greatly valued; where everyone can make the most of their capabilities and potential. We promote meritocracy, competence, and the sharing of ideas and opinions. We are driven by data and believe the diversity, agility, generosity, and curiosity of our people is what sets us apart as an organisation. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talent.
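The run-off modelling this role describes, projecting how a borrower loan book amortises so that credit security can be assessed, can be sketched in miniature; the balance, repayment rate, and default rate below are invented, and a real model would work loan-by-loan rather than on a single pooled balance:

```python
def run_off(balance, monthly_repayment_rate, default_rate, months):
    """Project remaining loan-book balance and monthly losses."""
    schedule = []
    for _ in range(months):
        losses = balance * default_rate            # written off this month
        repaid = balance * monthly_repayment_rate  # returned to the lender
        balance -= losses + repaid
        schedule.append((round(balance, 2), round(losses, 2)))
    return schedule

projection = run_off(balance=1_000_000, monthly_repayment_rate=0.05,
                     default_rate=0.01, months=3)
```

Scenario and sensitivity analysis of the kind mentioned above amounts to re-running this projection under stressed repayment and default assumptions and comparing the resulting schedules.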

Posted 1 month ago

Apply

5.0 - 7.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Job Description: As a BI Developer, you'll turn raw data into stunning visual stories that drive decisions. Collaborate with clients, create jaw-dropping dashboards, and lead end-to-end BI projects. If you love transforming insights into action and thrive in a vibrant consulting environment, we want you on our team! What You'll Tackle Each Day: End-to-End BI Implementation: Develop and manage the full BI lifecycle, from data modelling and report building to delivery and post-implementation support. Tableau Development: Design, develop, and maintain interactive and visually compelling dashboards and reports in Tableau. SQL Expertise: Write efficient SQL queries for data extraction, transformation, and analysis; PySpark experience is an added advantage. Ability to independently manage end-to-end dashboard development projects with minimal supervision, taking full ownership of design, data integration, and deployment activities. Business Knowledge: Collaborate with clients to understand their business needs and provide actionable insights through BI solutions. Cross-Tool Integration: Experience with other BI tools such as Power BI or Qlik Sense. Qualifications: With 5 to 7 years of experience in Business Intelligence, focusing on Tableau development and SQL, you consistently deliver impactful BI solutions. A strong understanding of data visualization best practices.

Posted 1 month ago

Apply

3.0 - 8.0 years

20 - 25 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are seeking an experienced Oracle Cloud Data Migration Consultant to join our rapidly growing Oracle consultancy. The ideal candidate will have a strong background in Oracle Cloud data migration, with expertise in ETL processes, data transformation, and integration methodologies. They should be proficient in handling data migration across multiple Oracle Cloud applications and ensuring seamless transitions for enterprise systems. This role will involve delivering end-to-end data migration solutions, working across various Oracle Cloud environments to support project implementations, technical development, and system optimisation. Mandatory skills & experience: 3+ years of experience in Oracle Cloud data migration projects. Good understanding of Oracle ERP and HCM application processes and background technical data dictionaries. Strong expertise in PL/SQL, SQL, and ETL processes for Oracle Cloud applications. Experience working with Oracle FBDI and integration technologies. Understanding of data security, governance, and compliance considerations in cloud environments. Excellent communication skills and ability to collaborate across technical and functional teams. Ability to work in a fast-paced, multi-tasking environment with periods of high pressure. Flexibility to support new technologies and evolving data migration approaches. Responsible for the development, testing, and support of Oracle Cloud data migration processes, ensuring seamless transition and integration between legacy systems and Oracle Cloud environments. Tasks may include: Development of SQL and PL/SQL components for data extraction, transformation, and loading (ETL) processes. Design and execution of data migration strategies, ensuring data integrity and accuracy. Utilization of Oracle Cloud tools such as FBDI and cloud-based ETL solutions for efficient data transfer. Integration using REST and SOAP web services, APIs, and Oracle Cloud-specific communication frameworks.
Validation and reconciliation processes to confirm successful data migration and resolve discrepancies. Candidates are expected to have experience in multiple areas but are not required to be proficient in all technologies. We value individuals with diverse skill sets and a willingness to learn and adapt to emerging technologies. Key deliverables: Development and execution of data migration components aligned with functional and technical requirements. Unit testing and validation of migrated data to ensure consistency and accuracy. Support for existing data structures, troubleshooting migration issues, and implementing enhancements. Adherence to best practices and documented development standards. Collaboration with Oracle Support and stakeholders to resolve technical challenges.
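The validation and reconciliation step called out above, confirming that the target matches the source after a migration run, can be sketched generically; a toy SQLite schema stands in for the real Oracle legacy and cloud tables, and the table/column names are invented:

```python
import sqlite3

def reconcile(conn, source, target, key):
    """Compare key sets between source and target tables after a migration."""
    # Note: table/column names are interpolated here only because this is a
    # toy sketch with trusted inputs; values should always go via placeholders.
    src_keys = {r[0] for r in conn.execute(f"SELECT {key} FROM {source}")}
    tgt_keys = {r[0] for r in conn.execute(f"SELECT {key} FROM {target}")}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
        "counts_match": len(src_keys) == len(tgt_keys),
    }

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE legacy_suppliers (supplier_id INTEGER);
CREATE TABLE cloud_suppliers (supplier_id INTEGER);
INSERT INTO legacy_suppliers VALUES (1), (2), (3);
INSERT INTO cloud_suppliers VALUES (1), (2);
""")
report = reconcile(conn, "legacy_suppliers", "cloud_suppliers", "supplier_id")
```

A fuller reconciliation would also hash or checksum row contents per key, not just compare key sets, so silently corrupted rows are caught as well as missing ones.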

Posted 1 month ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Chennai

Work from Office

This position is responsible for life insurance modelling and exposure management services for all insurance books of the company. These services include, but are not limited to, end-to-end life modelling, data testing, and basis and model developments. The ideal candidate should understand life insurance industry terminology, be proficient in R and Python, and stay abreast of industry trends, emerging technologies, and changes in the regulatory landscape to drive continuous improvement in modelling practices. Experience: 3 to 5 years of experience in life insurance. Education: Bachelor's degree in Actuarial Science, Mathematics, or a related field; must have cleared actuarial papers CM1 & CS1. Role and responsibilities: Provide analytical support to life modelling team operations by sharing knowledge and information. Perform data extraction and analysis using R and Python to support actuarial projects. Provide timely and frequent feedback to team members. Maintain a high standard of professionalism, ethics, and integrity in all actuarial activities. Participate in pre-specified modelling changes while learning to appreciate best-practice modelling guidelines. Raise queries with clients for missing pieces of information. Perform quality checks and provide feedback. Participate in system testing activities and construct results dashboards to aid walkthroughs of results analyses with stakeholders. Communicate progress on activities to relevant stakeholders and delivery managers. Prepare complete documentation and audit packs supporting developments and overall model releases. Ensure process activities are consistent with defined process manuals, and update procedural manuals when necessary. Ensure timely completion of all deliverables. Train and mentor team members. Requirements: Extensive experience in life insurance actuarial modelling. Excellent analytical and problem-solving skills, with keen attention to detail.
Good hands-on experience with Prophet or other modelling tools. Excellent communication skills. Technical skills: Proficiency in R and Python; strong working knowledge of VBA, Access, and SQL; experience with actuarial software and tools; strong Excel skills and MS Office apps. Qualifications: Bachelor's degree in Actuarial Science, Mathematics, or a related field. Must have cleared actuarial papers CM1 & CS1.
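The modelling work described here is done in dedicated tools such as Prophet; as a deliberately simplified flavour of the underlying actuarial mathematics (the mortality rates, interest rate, and sum assured below are invented), the expected present value of a term-assurance death benefit can be computed as:

```python
def term_assurance_epv(sum_assured, qx, interest):
    """EPV of a death benefit paid at the end of the year of death.

    qx: annual mortality rates for each policy year of the term.
    """
    v = 1 / (1 + interest)  # annual discount factor
    survival = 1.0          # probability of being alive at the year start
    epv = 0.0
    for year, q in enumerate(qx, start=1):
        epv += sum_assured * survival * q * v ** year
        survival *= 1 - q   # survive to the next policy year
    return epv

# Hypothetical 3-year term: 100k sum assured, 4% interest
epv = term_assurance_epv(100_000, qx=[0.001, 0.0012, 0.0015], interest=0.04)
```

Production models extend this same survival-times-discount structure across thousands of model points, with lapses, expenses, and stochastic scenarios layered on top.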

Posted 1 month ago

Apply

0.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Primary skills: Proven experience working with PeopleSoft HCM and CRM modules. Strong technical knowledge of SQR, Application Designer, Application Engine, and PeopleCode. Proficiency in SQL for data extraction and manipulation. Secondary skills: Experience with data archival projects and enterprise archival solutions is a plus. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities.

Posted 1 month ago

Apply

5.0 - 7.0 years

6 - 10 Lacs

Mumbai, Bengaluru, Delhi

Work from Office

Must-have skills: Java, Groovy, SQL, AWS, Data Engineering, Agile, Databases. Good-to-have skills: Machine Learning, Python, CI/CD, Microservices, Problem Solving. Intro and job overview: As a Senior Software Engineer II, you will join a team working with next-gen technologies on geospatial solutions in order to identify areas for future growth, new customers, and new markets in the geocoding data-integrity space. You will work on the distributed computing platform to migrate the existing geospatial dataset creation processes, bringing more value to Precisely's customers and growing market share. Responsibilities and duties: You will work on the distributed computing platform to migrate the existing geospatial data processes, including SQL scripts and Groovy scripts. Strong concepts in object-oriented programming and development languages, including Java, SQL, and Groovy/Gradle/Maven. You will work closely with domain/technical experts and drive the overall modernization of the existing processes. You will be responsible for driving and maintaining the AWS infrastructure and other DevOps processes. Participate in design and code reviews within a team environment to eliminate errors early in the development process. Participate in problem determination and debugging of software product issues, using technical skills and tools to isolate the cause of the problem efficiently and in a timely manner. Provide documentation needed to thoroughly communicate software functionality. Present technical features of the product to customers and stakeholders as required. Ensure timelines and deliverables are met. Participate in the Agile development process. Requirements and qualifications: UG B.Tech/B.E. or PG M.S./M.Tech in Computer Science, Engineering, or a related discipline. At least 5-7 years of experience implementing and managing geospatial solutions. Expert level in the programming languages Java and Python; Groovy experience is preferred.
Expert level in writing optimized SQL queries, procedures and database objects to support data extraction and manipulation in a data environment. Strong concepts in object-oriented programming and development languages, Java, including SQL, Groovy/Gradle/Maven. Expert in script automation in Gradle and Maven. Problem solving and troubleshooting: proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively. Experience in SQL, data warehousing and data engineering concepts. Experience with AWS platform-provided Big Data technologies (IAM, EC2, S3, EMR, Redshift, Lambda, Aurora, SNS, etc.). Strong analytical, problem-solving, data analysis and research skills. Good knowledge of continuous build integration (Jenkins and GitLab pipelines). Experience with agile development and working with agile engineering teams. Excellent interpersonal skills. Knowledge of microservices and cloud-native frameworks. Knowledge of Machine Learning/AI. Knowledge of the Python programming language.

Posted 1 month ago

Apply

1.0 - 2.0 years

5 - 8 Lacs

Mumbai

Work from Office

Role: ESG Data Analyst. Position type: Offroll (Quess Corp payroll). During the work from home (WFH) phase, candidates should have continuously stable internet connectivity (broadband, minimum 100 Mbps) and a laptop/PC for working. Experience: 6 months to 1.5 years (in ESG). Work profile: Extraction of ESG-related data from annual reports, company websites and Google search. Building databases of the extracted data. Should have working knowledge of Excel and Word. Working on ad hoc tasks involving the use of Excel/databases relating to financial and non-financial information on corporates, business entities, industry performance, etc. Skills: Data extraction skills. Experience in secondary research. Proficiency in MS Excel/spreadsheets is a must. Proficient in communication. Working with colleagues towards the achievement of individual and team-level goals.
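The extraction work described above often reduces to pulling figures out of report text. A minimal sketch, assuming a hypothetical report snippet and metric name (the pattern and values are illustrative only, not from any real filing):

```python
import re

# Hypothetical annual-report snippet; metric name and figures are made up.
text = "Scope 1 emissions: 12,450 tCO2e in FY2023."

match = re.search(r"Scope 1 emissions:\s*([\d,]+)\s*tCO2e", text)
value = int(match.group(1).replace(",", ""))  # normalise "12,450" -> 12450
print(value)  # 12450
```

In practice a table of such extracted values, one row per company and year, is what gets loaded into the database for later analysis.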

Posted 1 month ago

Apply

2.0 - 7.0 years

2 - 4 Lacs

Noida, Ghaziabad, New Delhi

Work from Office

Seeking an Excel & Google Sheets expert with deep knowledge of advanced functions such as Pivot Tables, Conditional Formatting, QUERY, ARRAYFORMULA, VLOOKUP, HLOOKUP, IMPORTRANGE, etc., and Mail Merge. Expert in Google Apps Script for automation and reporting.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Mandatory: 6+ years of expertise in Forms 6i, Forms 10g, Reports 6i or Forms 10g, and advanced PL/SQL. Mandatory: working experience in Oracle Financials or SCM modules. Knowledge of different modules of Oracle Financials. Knowledge of Tax & Customs processes is an advantage. Expertise in Forms 6i, Forms 10g, Reports 6i or Forms 10g and PL/SQL. Expertise in the O2C end-to-end process. Good knowledge of Oracle Apps standards, table structure, architecture & DB. Comprehensive knowledge/experience in SQL, PL/SQL, writing reports and forms, and debugging. As per business logic (MD50), modify objects (forms/reports/interfaces/conversions) and develop new objects. Expertise in Oracle Applications RICE components (Reports, Interfaces, Conversions, and Extensions). Proficient written and verbal communication skills with the ability to communicate conceptual ideas clearly and effectively. Should have good analytical skills. Knowledge of migrating objects. Knowledge of developing new packages, procedures and functions, and updating existing packages, procedures and functions. Knowledge of developing workflows. Knowledge of developing APIs per business logic for data updates. Knowledge of developing data extraction automation programs (.sql file development). Knowledge of Agile methodology and tools is mandatory.

Posted 1 month ago

Apply

13.0 - 15.0 years

45 - 55 Lacs

Bengaluru

Work from Office

Overview: An expert in their field, applying broad business knowledge and strategic insight surrounding emerging trends and technologies to solve complex problems and drive organizational results. Leads critical, high-impact projects and designs/implements innovative business strategies. Works with minimal oversight, frequently consulting senior leadership and influencing executive decisions. Serves as a mentor and assists others with challenging issues. Performs very complex data science work, creates new ways of analyzing data, builds very complex business models, and makes recommendations that impact an entire business unit or very large complex sector. Key Roles and Responsibilities: Typical tasks may include, but are not limited to, the following: Data Extraction and Preparation: Collect data from various structured and unstructured sources (data lakes, databases, data warehouses, on cloud, internal, external) and ensure its quality for analysis through cleaning and preprocessing. Designs, builds, and analyzes large (e.g. hundreds of terabytes or higher as technology advances) and complex data sets while thinking strategically about data use and data design. Tools can include Snowflake and Databricks. Coding Solutions, Algorithms and Feature Engineering: Create relevant features and conduct exploratory data analysis. Codes solutions following a typical workflow: data extraction, cleansing, feature engineering, exploratory data analysis, model selection/creation, hyper-parameter tuning, model interpretation, model retraining, business process and/or system implementations, high-level proof of concept and trials, visualization, deployment to production, and post-deployment MLOps monitoring/diagnosis/resolution. Coding proficiency required in at least one data science language (Python, R, Scala, etc.), as well as expertise with modern ML packages and libraries (Spark, scikit-learn, Pandas, PyTorch, Tidyverse, TensorFlow, Keras, Shiny, and/or AutoML tools).
Model Development, Deployment and Optimization: Build, evaluate, and optimize machine learning models through hyperparameter tuning. Implement models into production, continuously monitor their performance, and ensure they remain explainable and reliable to minimize model decay. Ability to develop custom Machine Learning (ML). Highly proficient in the full AI workflow, such as (1) data extraction, cleansing, feature engineering, exploratory data analysis, model selection/creation, hyper-parameter tuning, model interpretation, model retraining and (2) using tools like MLflow to log metrics. Well-versed in Interactive Development Environments (IDEs) such as Databricks Workspaces or Visual Studio Code. Proficiency in algorithm categories such as Supervised Learning, Unsupervised Learning, Optimization Algorithms, Deep Learning, AI-Computer Vision, Natural Language Processing, Deep Reinforcement Learning, Search Algorithms, and AI-Knowledge Graphs. Visualization and Collaboration: Create visualizations and reports for stakeholders while working closely with cross-functional teams to align efforts with business objectives. Can utilize advanced coding methods to produce visualizations (e.g. ggplot, D3.js, etc.). Generative AI: Develop and implement generative AI models, focusing on creating new content or augmenting existing data. Generative Models: understanding of GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and Transformers. Fine-Tuning: techniques for adapting pre-trained models to specific tasks using smaller, task-specific datasets. Agentics: understanding of agentic architecture, concepts and optimization of solutions. Prompt Engineering: crafting effective prompts to guide generative models in producing desired outputs. Retrieval-Augmented Generation (RAG): combining generative models with retrieval systems to enhance performance and relevance. Text Generation: proficiency in using models like GPT-3/4 for generating human-like text.
Image Generation: familiarity with tools like DALL-E and Stable Diffusion for creating images from text descriptions. Key Skills: Python, R, Scala, Generative AI, Credit, Model development, Fraud, trial analysis, Machine Learning (ML), Snowflake, Databricks, Spark, scikit-learn, Pandas, PyTorch, Tidyverse, TensorFlow, Keras, Shiny, AutoML. Job ID R-72967 Date posted 06/30/2025
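The hyperparameter tuning step in the workflow above can be sketched as a simple grid search. This is a toy stand-in: the error function and grid values are invented for illustration, not part of any real model:

```python
import itertools

# Toy stand-in for validation error as a function of two hyperparameters
# (learning rate and tree depth); values are made up for illustration.
def validation_error(lr, depth):
    return (lr - 0.1) ** 2 + (depth - 3) ** 2

# Exhaustively evaluate every (lr, depth) combination on the grid.
grid = list(itertools.product([0.01, 0.1, 1.0], [1, 3, 5]))
best_lr, best_depth = min(grid, key=lambda p: validation_error(*p))
print(best_lr, best_depth)  # 0.1 3
```

Real pipelines replace the toy function with cross-validated model training and typically log each trial's metrics to a tracker such as MLflow.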

Posted 1 month ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

Hyderabad

Work from Office

As a BI Developer, you'll turn raw data into stunning visual stories that drive decisions. Collaborate with clients, create jaw-dropping dashboards, and lead end-to-end BI projects. If you love transforming insights into action and thrive in a vibrant consulting environment, we want you on our team! What You'll Tackle Each Day: End-to-End BI Implementation: Develop and manage the full BI lifecycle, from data modelling and report building to delivery and post-implementation support. Tableau Development: Design, develop, and maintain interactive and visually compelling dashboards and reports in Tableau. SQL Expertise: Write efficient SQL queries for data extraction, transformation, and analysis. PySpark experience is an added advantage. Ability to independently manage end-to-end dashboard development projects with minimal supervision, taking full ownership of design, data integration, and deployment activities. Business Knowledge: Collaborate with clients to understand their business needs and provide actionable insights through BI solutions. Cross-Tool Integration: Experience with other BI tools such as Power BI or Qlik Sense. With 5 to 7 years of experience in Business Intelligence focusing on Tableau development and SQL, you consistently deliver impactful BI solutions. A strong understanding of data visualization best practices.
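The extraction/aggregation queries behind a dashboard tile look like the sketch below. The table and column names are hypothetical, and an in-memory SQLite database stands in for whatever warehouse the dashboard actually reads from:

```python
import sqlite3

# Hypothetical sales table; in a real project this lives in the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 120.0), ("South", 80.0), ("North", 50.0)])

# A typical aggregation feeding a ranked-regions dashboard tile.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('North', 170.0), ('South', 80.0)]
```

Tableau would consume the equivalent query as a custom SQL data source or an extract refresh.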

Posted 1 month ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Gurugram

Work from Office

Key Responsibilities Assist in executing internal audit assignments and reviews as per the audit plan Support in data extraction, sampling, and preparation of workpapers for testing key controls Review process documentation, SOPs, and policy compliance across departments Identify process gaps, control lapses, and potential fraud indicators Participate in walkthroughs and discussions with process owners Coordinate with consultants for execution of audits, information sharing, and status tracking Prepare draft observations, audit findings, and presentation material for review Track and follow up on audit recommendations and closure status Conduct Management Testing of Internal Controls over Financial Reporting (ICFR) Support in automation initiatives and use of analytics tools for testing controls Qualification - B.Com/M.Com or CMA/CA Inter or MBA Finance Experience of 1-3 years in internal audits, process reviews, risk management etc. Analytical mindset, attention to detail, strong documentation & communication
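The data extraction and sampling step for control testing can be sketched as below; the population size and sample size are hypothetical, chosen only to illustrate reproducible, without-replacement selection:

```python
import random

# Hypothetical population of transaction IDs pulled during data extraction.
population = list(range(1, 501))  # 500 transactions

random.seed(42)  # fixed seed so the sample can be reproduced in workpapers
sample = sorted(random.sample(population, 25))  # 25-item attribute sample

# random.sample draws without replacement, so there are no duplicates.
print(len(sample), len(set(sample)))  # 25 25
```

Each sampled ID would then be tested against the relevant control and the result documented in the workpapers.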

Posted 1 month ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We are looking for an experienced Azure Data Engineer to join our data engineering team. You will be responsible for designing, developing, and maintaining data pipelines and data platforms on Azure using modern tools and services including Microsoft Fabric, Azure Data Factory, Python notebooks, Azure Functions , and Data Warehouses . Your focus will be on building scalable, reliable, and secure data workflows that empower analytics and business intelligence. Key Responsibilities: Design and develop scalable ETL/ELT pipelines using Azure Data Factory (ADF) and Microsoft Fabric . Implement data integration from diverse sources into centralized data warehouses or lakehouses (e.g., Fabric OneLake, Synapse). Build and operationalize Python notebooks for data transformation, enrichment, and machine learning preparation. Develop and deploy Azure Function Apps for lightweight compute tasks, API integration, or orchestration. Collaborate with analysts and business stakeholders to deliver high-quality, trustworthy data models. Implement best practices in data governance, security, version control, and CI/CD for data pipelines. Monitor pipeline performance and ensure high availability and reliability of data services. Contribute to architecture discussions and continuous improvement of the data platform. Required Skills & Experience: 3-5 years of hands-on experience in data engineering with strong Azure expertise. Proficiency in: Microsoft Fabric (Data pipelines, Lakehouse, shortcuts, etc.) Azure Data Factory (ADF) Python notebooks (in Synapse, Fabric, or Azure ML) Azure Functions / Function Apps Data Warehousing concepts and tools (e.g., Synapse SQL, Snowflake, Fabric DWH) Strong SQL skills for data extraction, transformation, and modeling. Good understanding of modern data architectures: medallion architecture, Lakehouse, and data mesh principles.
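A transform step in such a pipeline, stripped to its essentials, looks like the sketch below. The schema and the data-quality rule are assumed; a real pipeline would read from OneLake/ADLS inside a Fabric or ADF-triggered notebook rather than an in-memory string:

```python
import csv
import io

# Assumed two-column schema; the raw string stands in for a lake file.
raw = "id,amount\n1,10.50\n2,\n3,7.25\n"

cleaned = []
for row in csv.DictReader(io.StringIO(raw)):
    if not row["amount"]:  # basic data-quality rule: drop null amounts
        continue
    cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})

total = sum(r["amount"] for r in cleaned)
print(len(cleaned), total)  # 2 17.75
```

The cleaned rows would then be written to a Lakehouse table (e.g., in the silver layer of a medallion architecture) for downstream modeling.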

Posted 1 month ago

Apply

3.0 - 6.0 years

20 - 25 Lacs

Pune

Work from Office

Join us as a Senior BI Analyst at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Senior BI Analyst you should have experience with: Experience of working within a Business Intelligence function in a large, complex organisation. Mathematical or computing educational background or relevant experience. Expertise in building dashboards on Tableau, sourced from a data warehouse (preferably Microsoft SQL Server). Expert in data management, including the ability to develop, understand and execute SQL queries. Knowledge of relational and dimensional database concepts and technologies, data warehousing, ETL tasks and their design patterns. Understanding of IT Service Management processes based on ITIL (i.e. ITIL Foundation). Proven ability to interact with senior stakeholders, with demonstrable communication and stakeholder management skills. Demonstrable ability to collaborate across teams whilst taking the lead on specific projects and topic areas. Desirable skills/Preferred Qualifications: Certification in Tableau or related data visualization tools. Experience with or knowledge of other reporting tools, e.g. Microsoft SQL Server Reporting Services (SSRS), QlikView, Microsoft Power BI, etc. Experience working with and reporting from Configuration Management Database and Asset Management tools (preferably ServiceNow). You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.
Purpose of the role: To transform raw data into useful and actionable insights that support strategic decision-making across the bank. Accountabilities: Delivery of Business Intelligence (BI) solutions, including multi-dimensional database models, data marts, data warehousing, data transforms, data analytics, and reporting solutions, to support the decision-making processes across the bank. Execution of data extraction, cleansing and maintenance initiatives from a range of resources including internal systems, external databases and market feeds. Development of data models, dashboards and reports to clearly communicate the bank's key business KPIs, performance, trends, and relationships to various levels of management within the bank. Provision of responses and solutions to business requests for data analysis by building custom queries and reports to address specific questions and problems. Monitoring KPIs against established objectives and benchmarks to identify trends, patterns and anomalies that may impact the bank's performance and provide recommendations to improve highlighted processes. Collaboration with various stakeholders to understand their data needs and provide solutions through new technologies and tools that are compliant with the bank's objectives, regulatory requirements, and data governance policies. Continual improvement and automation of reporting and metric provision across Technology. Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise. Thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. OR, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad, Ahmedabad, Gurugram

Work from Office

About the Role: Grade Level (for internal use): 11 The Team: As a member of the EDO, Collection Platforms & AI Cognitive Engineering team you will work on building GenAI-driven and ML-powered products and capabilities to power natural language understanding, data extraction, information retrieval and data sourcing solutions for S&P Global. You will define AI strategy, mentor others, and drive production-ready AI products and pipelines while leading by example in a highly engaging work environment. You will work in a (truly) global team and be encouraged to take thoughtful risks and show self-initiative. What's in it for you: Be a part of a global company and build solutions at enterprise scale. Lead and grow a highly skilled, hands-on technical team (including mentoring junior data scientists). Contribute to solving high-complexity, high-impact problems end-to-end. Architect and oversee production-ready pipelines from ideation to deployment. Responsibilities: Define the AI roadmap, tooling choices, and best practices for model building, prompt engineering, fine-tuning, and vector retrieval systems. Architect, develop and deploy large-scale ML and GenAI-powered products and pipelines. Own all stages of the data science project lifecycle, including: identification and scoping of high-value data science and AI opportunities; partnering with business leaders, domain experts, and end-users to gather requirements and align on success metrics; evaluation, interpretation, and communication of results to executive stakeholders. Lead exploratory data analysis, proof-of-concepts, model benchmarking, and validation experiments for both ML and GenAI approaches. Establish and enforce coding standards, perform code reviews, and optimize data science workflows. Drive deployment, monitoring, and scaling strategies for models in production (including both ML and GenAI services). Mentor and guide junior data scientists; foster a culture of continuous learning and innovation. Manage stakeholders across
functions to ensure alignment and timely delivery. Technical: Hands-on experience with large language models (e.g., OpenAI, Anthropic, Llama), prompt engineering, fine-tuning/customization, and embedding-based retrieval. Expert proficiency in Python (NumPy, Pandas, SpaCy, scikit-learn, PyTorch/TF 2, Hugging Face Transformers). Deep understanding of ML & deep learning models, including architectures for NLP (e.g., transformers), GNNs, and multimodal systems. Strong grasp of statistics, probability, and the mathematics underpinning modern AI. Ability to surf and synthesize current AI/ML research, with a track record of applying new methods in production. Proven experience on at least one end-to-end GenAI or advanced NLP project: custom NER, table extraction via LLMs, Q&A systems, summarization pipelines, OCR integrations, or GNN solutions. Familiarity with orchestration and deployment tools: Docker, Airflow, Kubernetes, Redis, Flask/Django/FastAPI, PySpark, SQL, R-Shiny/Dash/Streamlit. Openness to evaluate and adopt emerging technologies and programming languages as needed. Good to have: Master's or Ph.D. in Computer Science, Statistics, Mathematics, or a related field (minimum Bachelor's). 6+ years of relevant experience in Data Science/AI, with at least 2 years in a leadership or technical lead role. Prior experience in the Economics/Financial industry, especially with market-intelligence or risk analytics products. Public contributions or demos on GitHub, Kaggle, StackOverflow, technical blogs, or publications. What's In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
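The embedding-based retrieval skill named in the posting above can be illustrated with a toy cosine-similarity search. The two-dimensional vectors and document names are invented for illustration; production systems use learned embeddings of hundreds of dimensions from an encoder or LLM:

```python
import math

# Made-up document embeddings and query vector (2-D for readability).
docs = {"earnings_call": [0.9, 0.1], "esg_report": [0.2, 0.8]}
query = [0.85, 0.2]

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Retrieval = rank documents by similarity to the query embedding.
best = max(docs, key=lambda name: cosine(docs[name], query))
print(best)  # earnings_call
```

In a RAG pipeline the top-ranked documents would then be passed to the generative model as context.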

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Hyderabad, Ahmedabad, Gurugram

Work from Office

About the Role: Grade Level (for internal use): 12 The Team: As a leader in the EDO, Collection Platforms & AI Cognitive Engineering team, you will own the vision and delivery of enterprise-scale Data Science and GenAI solutions that power natural language understanding, data extraction, information retrieval, data sourcing, and speech-to-text services for S&P Global. You will define the data science strategy, champion best practices, mentor across functions, and drive production-ready AI/ML products from ideation through deployment. You'll work in a (truly) global team that values thoughtful risk-taking and self-initiative. What's in it for you: Lead data science strategy and build solutions at enterprise scale. Grow and mentor a world-class team of data scientists and ML engineers. Solve high-complexity, high-impact problems end-to-end. Define roadmaps for emerging AI capabilities (GenAI, ASR, speaker analytics). Responsibilities: Shape and execute the Data Science roadmap: tooling, methodologies, and best practices for ML, GenAI, and ASR. Architect, develop, and deploy large-scale ML and GenAI pipelines, including Voice Activity Detection (VAD), Speaker Diarization, and ASR models, for automated transcription services. Own the full data science lifecycle: opportunity identification, requirement gathering, modeling, evaluation, validation, deployment, monitoring, and optimization. Lead exploratory data analysis, proof-of-concepts, benchmarking, and production model validation for both text- and speech-based AI solutions. Establish and enforce coding standards, perform technical reviews, and optimize workflows for reproducibility and scalability. Drive MLOps practices: CI/CD for models, feature stores, monitoring, alerting, and automated rollback strategies. Mentor and develop junior and mid-level data scientists; foster a culture of continuous learning, innovation, and collaboration. Partner with cross-functional stakeholders (Engineering, Product, IT, Compliance) to align on project
goals, timelines, and SLAs Technical : 8+ years of hands-on experience in Data Science/AI, with at least 3 years in a senior or leadership role with strong hands on skills. Proven expertise in developing and deploying ASR systems: Training or fine-tuning end-to-end ASR models (e.g., Whisper, QuartzNet) Designing VAD pipelines for robust speech segmentation Implementing Speaker Diarization (e.g., Pyannote, UISR) and handling multi-speaker audio Optimizing transcription accuracy across accents, languages, and noisy environments Deep knowledge of large language models (OpenAI, Anthropic, Llama), prompt engineering, fine-tuning, and embedding-based retrieval Expert proficiency in Python (NumPy, Pandas, SpaCy, scikit-learn, PyTorch/TF 2, Hugging Face Transformers) Strong understanding of ML & deep learning architecturesNLP transformers, GNNs, multimodal systems Solid grasp of statistics, probability, and the mathematics underpinning modern AI Track record of synthesizing cutting-edge research into production solutions Experience with orchestration and deployment toolsDocker, Kubernetes, Airflow, Redis, Flask/Django/FastAPI, PySpark, SQL, and cloud services (AWS/GCP/Azure) Openness to evaluate and adopt emerging technologies and languages as needed Good to have: Master's or Ph.D. in Computer Science, Electrical Engineering, Statistics, Mathematics, or related field Prior experience in the Economics/Financial industry, especially market-intelligence or risk analytics Public contributions on GitHub, Kaggle, StackOverflow, technical blogs, or publications Familiarity with speaker embedding techniques, speech enhancement, and noise-robust modeling Whats In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technologythe right combination can unlock possibility and change the world.Our world is in transition and getting more complex by the day. 
We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you cantake care of business. We care about our people. Thats why we provide everything youand your careerneed to thrive at S&P Global. Health & WellnessHealth care coverage designed for the mind and body. Continuous LearningAccess a wealth of resources to grow your career and learn valuable new skills. Invest in Your FutureSecure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly PerksIts not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the BasicsFrom retail discounts to referral incentive awardssmall perks can make a big difference. For more information on benefits by country visithttps://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected andengaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. 
Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer:
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only:
The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

IFTECH103.2 - Middle Management Tier II (EEO Job Group)

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.