
1129 Data Extraction Jobs - Page 39

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Job Description: We are looking for a Data Scientist with expertise in Python, Azure Cloud, NLP, forecasting, and large-scale data processing. The role involves enhancing existing ML models; optimising embeddings, LDA models, RAG architectures, and forecasting models; and migrating data pipelines to Azure Databricks for scalability and efficiency.

Key Responsibilities:
Model Development & Optimisation: Train and optimise models for new data providers, ensuring seamless integration. Enhance models for dynamic input handling. Improve LDA model performance to handle a higher number of clusters efficiently. Optimise RAG (Retrieval-Augmented Generation) architecture to enhance recommendation accuracy for large datasets. Upgrade the Retrieval QA architecture for improved chatbot performance on large datasets.
Forecasting & Time Series Modelling: Develop and optimise forecasting models for marketing, demand prediction, and trend analysis. Implement time series models (e.g., ARIMA, Prophet, LSTMs) to improve business decision-making. Integrate NLP-based forecasting, leveraging customer sentiment and external data sources (e.g., news, social media).
Data Pipeline & Cloud Migration: Migrate the existing pipeline from Azure Synapse to Azure Databricks and retrain models accordingly (required only for the AUB role(s)). Address space and time complexity issues in embedding storage and retrieval on Azure Blob Storage, and optimise both for better efficiency.
MLOps & Deployment: Implement MLOps best practices for model deployment on Azure ML, Azure Kubernetes Service (AKS), and Azure Functions. Automate model training, inference pipelines, and API deployments using Azure services.

Experience: Experience in Data Science, Machine Learning, Deep Learning, and Gen AI. Design, architect, and execute end-to-end data science pipelines covering data extraction, data preprocessing, feature engineering, model building, tuning, and deployment. Experience leading a team and owning project delivery. Experience building end-to-end machine learning pipelines, with expertise in developing CI/CD pipelines using Azure Synapse pipelines, Databricks, Google Vertex AI, and AWS. Experience developing advanced natural language processing (NLP) systems, specialising in building RAG (Retrieval-Augmented Generation) models using LangChain, and deploying RAG models to production. Expertise in building machine learning pipelines and deploying forecasting, anomaly detection, market mix, classification, regression, and clustering models. Maintaining GitHub repositories and cloud computing resources for effective version control, development, testing, and production. Developing proof-of-concept solutions and assisting in rolling these out to our clients.

Required Skills & Qualifications: Hands-on experience with Azure Databricks, Azure ML, Azure Synapse, Azure Blob Storage, and Azure Kubernetes Service (AKS). Experience with forecasting models, time series analysis, and predictive analytics. Proficiency in Python (NumPy, Pandas, TensorFlow, PyTorch, Statsmodels, Scikit-learn, Hugging Face, FAISS). Experience with model deployment, API optimisation, and serverless architectures. Hands-on experience with Docker, Kubernetes, and MLflow for tracking and scaling ML models. Expertise in optimising time complexity, memory efficiency, and scalability of ML models in a cloud environment. Experience with LangChain (or equivalent), RAG, and multi-agent generation.

Location: DGS India - Bengaluru - Manyata N1 Block. Brand: Merkle. Time Type: Full time. Contract Type: Permanent.
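As an illustration of the forecasting stack this posting mentions (Prophet alongside ARIMA and LSTMs), here is a minimal, hedged sketch. It assumes the prophet and pandas packages and a hypothetical daily_sales.csv with date and units_sold columns; none of these names come from the listing itself.

```python
# Minimal illustration of Prophet-style forecasting, as mentioned above.
# Assumes the `prophet` package and a hypothetical daily_sales.csv with
# columns `date` and `units_sold`; the column names are placeholders.
import pandas as pd
from prophet import Prophet

df = pd.read_csv("daily_sales.csv", parse_dates=["date"])
train = df.rename(columns={"date": "ds", "units_sold": "y"})  # Prophet expects ds/y

model = Prophet(weekly_seasonality=True, yearly_seasonality=True)
model.fit(train)

future = model.make_future_dataframe(periods=90)   # forecast 90 days ahead
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```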

Posted 2 months ago

Apply

3.0 - 8.0 years

17 - 20 Lacs

Bengaluru

Work from Office

Data Engineer II
The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience with all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages, and product carousels. You will also build and scale data platforms that measure the effectiveness of Wayfair's ad costs and support media attribution, helping to decide on day-to-day and major marketing spends.

About the Role: As a Data Engineer, you will be part of the Data Engineering team. The role is inherently multi-functional: the ideal candidate will work with Data Scientists, Analysts, and Application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, who understands requirements clearly and can iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do: Build and launch data pipelines and data products focused on the SMART org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models. Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization. Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What you'll need: Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience. 3+ years of relevant work experience in the Data Engineering field with web-scale data sets. Demonstrated strength in data modeling, ETL development, and data lake architecture. Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.). Coding proficiency in at least one modern programming language (Python, Scala, etc.). Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, plus query performance tuning skills on large data sets. Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets. Strong business acumen. Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies on other cloud platforms such as AWS or Azure. Be a team player and introduce/follow best practices in the data engineering space. Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.

Good to have: Exposure to NoSQL databases and Pub-Sub architecture setup. Familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or similar.
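As a rough sketch of the pipeline orchestration mentioned above (Airflow among the listed big data technologies), the following assumes Apache Airflow 2.4+; the DAG id, schedule, and callables are illustrative placeholders, not any actual Wayfair pipeline.

```python
# Minimal Airflow sketch of a daily extract-and-load pipeline.
# Names and logic are placeholder assumptions for illustration only.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_clickstream(**context):
    # Placeholder: pull raw events from the source system for the run date.
    print("extracting events for", context["ds"])

def load_to_warehouse(**context):
    # Placeholder: write transformed rows to the analytics warehouse.
    print("loading partition", context["ds"])

with DAG(
    dag_id="ads_attribution_daily",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # requires Airflow 2.4+ for `schedule`
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_clickstream)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load
```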

Posted 2 months ago

Apply

5.0 - 7.0 years

25 - 30 Lacs

Chennai

Work from Office

Your role
Location: Chennai. Work Hours: 12 PM - 9 PM IST. Work Mode: Onsite, 5 days a week.
We are looking for a detail-oriented and proactive Senior Data Scientist for our Cross-Border line of business. In this pivotal role, you will work closely with the North America Quantitative Analytics leadership team to deliver high-impact data analysis and reporting in support of our Credit and Treasury functions. You will play a hands-on role in maintaining and enhancing our core reporting infrastructure, performing critical data analysis, and supporting operational excellence across our analytics efforts. This role is ideal for an experienced analyst ready to take ownership of high-impact deliverables and collaborate across global teams.

What you'll be doing: Own and deliver complex, business-as-usual (BAU) reports. Analyze large datasets to extract insights, identify trends, and spot improvement opportunities. Collaborate with North American stakeholders to gather requirements and deliver data-driven solutions. Utilize SQL, Python, and Excel for data extraction, transformation, and reporting automation. Work with internal partners to resolve data issues and ensure accuracy. Proactively troubleshoot anomalies and communicate findings and resolutions. Support data integration from new sources and adapt to evolving reporting needs. Contribute to continuous improvements in reporting efficiency, accuracy, and control measures.

Requirements: 5-7 years of experience in data analysis, business intelligence, or a related role. Bachelor's or Master's degree in Computer Science, Mathematics, Engineering, Statistics, or a related field. Proficiency in SQL, Python, and Excel. Working knowledge of VBA, data visualization, or AI tools is a plus. Strong attention to detail and data quality. Eagerness to learn about compliance and risk in a data context. Ability to work independently, manage deadlines, and collaborate effectively. Experience working with global stakeholders. Familiarity with financial services is preferred but not required. Willingness to work in office, during the specified shift hours, is mandatory.

About Corpay: Corpay is a global technology organisation that is leading the future of commercial payments with a culture of innovation that drives us to constantly create new and better ways to pay. Our specialized payment solutions help businesses control, simplify, and secure payment for fuel, general payables, toll, and lodging expenses. Millions of people in over 80 countries around the world use our solutions for their payments.

All offers of employment made by Corpay (and its subsidiary companies) are subject to the successful completion of satisfactory pre-employment vetting by an independent supplier (Experian). This is in accordance with Corpay's Resourcing Policy and includes employment referencing, identity, adverse financial, criminal, and sanctions list checks. We do this to meet our legal and regulatory requirements. Corpay is dedicated to encouraging a supportive and inclusive culture among our employees. It is within our best interest to promote diversity and eliminate discrimination in the workplace. We seek to ensure that all employees and job applicants are given equal opportunities.

Notice to Agency and Search Firm Representatives: Corpay will not accept unsolicited CVs from agencies and/or search firms for this job posting. Resumes submitted to any Corpay employee by a third-party agency and/or search firm without a valid written and signed search agreement will become the sole property of Corpay. No fee will be paid if a candidate is hired for this position as a result of an unsolicited agency or search firm referral. Thank you.

Posted 2 months ago

Apply

3.0 - 5.0 years

9 - 13 Lacs

Chennai

Work from Office

We are looking for a detail-oriented and data-driven Web Analytics & Reporting Specialist to join our digital analytics team. The ideal candidate will be responsible for tracking, analyzing, and reporting on user behavior across digital platforms to drive actionable insights. This role requires hands-on experience with web analytics tools and strong reporting skills to support data-informed decision-making across marketing, product, and UX teams. Job Description: Key Responsibilities: Implement, manage, and maintain web analytics tracking using tools such as Google Analytics , Adobe Analytics , or similar platforms. Develop and automate dashboards and reports that highlight key performance indicators (KPIs), trends, and user behavior insights. Collaborate with stakeholders to define analytics requirements and deliver timely, accurate reporting. Monitor website performance metrics, user engagement, traffic sources, and conversion funnels. Translate complex data into clear, actionable insights and present findings to both technical and non-technical audiences. Work with developers to validate data collection implementations using tools like Google Tag Manager or other tag management systems. Continuously optimize tracking setups to ensure data accuracy and completeness . Conduct A/B tests and performance analyses to inform marketing and UX strategies. Required Skills & Experience: 3-5+ years of experience in web analytics and reporting . Hands-on expertise with tools such as Google Analytics (GA4) , Adobe Analytics , Google Tag Manager , and Data Studio/Tableau/Power BI . Proficient in SQL and/or other query languages for data extraction and manipulation. Strong knowledge of website tagging, cookies, and user tracking methodologies. Excellent data visualization and reporting skills with the ability to synthesize data into insights. Familiarity with marketing platforms (Google Ads, Meta Ads, etc.) is a plus. Nice to Have: Experience with customer journey mapping and attribution modeling. Background in digital marketing or e-commerce analytics. Familiarity with JavaScript or debugging tools for tag validation (e.g., Chrome DevTools) Location: Chennai Brand: Paragon Time Type: Full time Contract Type: Permanent
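To make the conversion-funnel reporting above concrete, here is a small, hypothetical pandas example; the events.csv file, its columns, and the funnel steps are assumptions for illustration only.

```python
# Compute funnel step counts and conversion rates from page-level event data.
# The file and column names are invented for this sketch.
import pandas as pd

events = pd.read_csv("events.csv")  # columns: session_id, step ("visit"/"add_to_cart"/"purchase")

funnel_order = ["visit", "add_to_cart", "purchase"]
counts = (
    events.drop_duplicates(["session_id", "step"])
          .groupby("step")["session_id"].nunique()
          .reindex(funnel_order)
)
report = counts.to_frame("sessions")
report["conversion_from_top"] = (counts / counts.iloc[0]).round(3)
print(report)
```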

Posted 2 months ago

Apply

1.0 - 2.0 years

8 - 9 Lacs

Bengaluru

Work from Office

The HRIS Analyst will be part of the HRIS team and the point of contact for all global HR reporting requests. We are looking for an experienced HR Reporting Analyst who is quality-minded, ready to bring creative solutions, and passionate about turning data into information that is easily understood and utilized. Duties will include day-to-day/ad-hoc report creation and support as well as project-based work for the implementation or enhancement of reporting solutions. The HRIS Analyst will have proficiency with several data extraction tools/technologies and in developing reports and analyses using these tools.

Responsibilities: Create, maintain, and support a variety of reports and queries utilizing appropriate reporting tools. Assist in the development of standard reports for ongoing HR needs. Promote self-service reporting functionality with end users. Support the HRIS team in key projects and integrations as required. Provide quick turnaround of ad hoc report requests, prioritizing workload and adjusting as business needs require. Maintain documentation such as data dictionaries, report catalogs, and report specifications. Develop reporting procedures, guidelines, documentation, and job aids as required. Train new reporting users as required. Assure the accuracy of all data, reports, and analysis. Continuously seek opportunities to implement process, customer service, and reporting improvements. Demonstrate behaviour consistent with the company's Code of Ethics and Conduct. Proactively report any quality issues or flaws to senior management so that corrective measures can be implemented promptly and recurrence prevented. Duties may be modified or assigned at any time to meet the needs of the business.

Requirements: Proven experience developing queries, reports, metrics, and analysis to answer simple to complex HR operations questions. Hands-on experience with HCM systems preferred (Oracle Fusion preferred). Planning and organization skills, attention to detail, ability to handle multiple tasks, and ability to work in a fast-paced, time-sensitive environment to meet deadlines. Excellent communication skills with the ability to convey complex results to non-technical stakeholders in a clear manner. Understanding of HR processes and people data preferred. Demonstrated analytical and problem-solving skills, with experience applying these skills within complex programs to address business and reporting requirements.

Education and Experience: Bachelor's degree in Computer Science with 1-2 years of related experience and strong SQL knowledge, or equivalent work experience.

Key Skills: Experience with Oracle Fusion BI Publisher, Logical SQL, Microsoft Power Automate, and OTBI reports (added advantages). Strong knowledge of writing SQL queries. In-depth knowledge of Excel (macros, VLOOKUP, pivot tables and charts, mathematical and other functions). Experience with BI tools; Power BI preferred.
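As a toy illustration of the SQL-based reporting this role centres on, the sketch below runs a headcount query against an in-memory SQLite table standing in for the real HCM database; the schema and rows are invented for the example.

```python
# Self-contained headcount-by-department report using SQL.
# sqlite3 is only a stand-in for the production HR database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id INTEGER, department TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [(1, "Finance", "Active"), (2, "Finance", "Active"), (3, "HR", "Inactive")],
)

query = """
    SELECT department, COUNT(*) AS active_headcount
    FROM employees
    WHERE status = 'Active'
    GROUP BY department
    ORDER BY active_headcount DESC
"""
for department, headcount in conn.execute(query):
    print(department, headcount)
```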

Posted 2 months ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Faridabad

Work from Office

We are seeking a highly detail-oriented and technically adept 3D Data Annotation Specialist to join our growing team. This role is critical in shaping high-quality datasets for training cutting-edge AI and computer vision models, particularly in domains such as LiDAR data processing and 3D object detection.

Qualifications: B.Tech in Computer Science, IT, or a related field preferred (others may also apply; strong analytical and software-learning abilities are required). Strong analytical and reasoning skills, with attention to spatial geometry and object relationships in 3D space. Basic understanding of 3D data formats (e.g., .LAS, .LAZ, .PLY) and visualization tools. Ability to work independently while maintaining high-quality standards. Excellent communication skills and the ability to collaborate in a fast-paced environment. Attention to detail and the ability to work with precision in visual/manual tasks. Good understanding of basic geometry, coordinate systems, and file handling.

Preferred Qualifications: Prior experience in 3D data annotation or LiDAR data analysis. Exposure to computer vision workflows. Comfortable working with large datasets and remote sensing data.

Key Responsibilities: Annotate 3D point cloud data with precision using specialized tools (training will be provided). Label and segment objects within LiDAR data, aerial scans, or 3D models. Follow annotation guidelines while applying logical and spatial reasoning to 3D environments. Collaborate with ML engineers and data scientists to ensure annotation accuracy and consistency. Provide feedback to improve annotation tools and workflow automation. Participate in quality control reviews and conduct re-annotation as needed.

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 6 Lacs

Pune

Work from Office

Key Responsibilities: Validate and update master data requests, ensuring adherence to data quality rules and country-specific exceptions. Conduct periodic quality checks on master data to maintain accuracy and consistency. Update vendor and customer master data in alignment with Service Level Agreements (SLAs) and data guidelines. Identify, analyze, and resolve duplicate records within the master data system. Collaborate with Data Steward and Governance teams to support data quality initiatives and clean-up activities. Maintain and update Standard Operating Procedures (SOPs) for master data processes. Participate in discussions with business partners and stakeholders, addressing ad-hoc requirements with urgency and professionalism. Utilize SQL queries to extract and analyze data for quality checks and reporting purposes. Create and maintain Power BI dashboards to monitor and report on data quality. Support audit requirements and ensure compliance with master data guidelines. Continuously improve data management processes by identifying areas for enhancement.

Requirements: Proven experience in Master Data Management (MDM) or a related field. Excellent analytical and problem-solving skills. Proficiency in SQL for data extraction and analysis. Ability to work independently and collaboratively as part of a team. Strong communication skills for effective stakeholder interaction. Attention to detail with a proactive approach to resolving data issues. Ability to manage multiple tasks and prioritize work effectively.
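A minimal sketch of one way to flag candidate duplicate records, as the responsibilities above describe; it uses pandas rather than the SQL/MDM tooling named in the posting, and the vendors.csv file and matching columns are assumptions.

```python
# Flag likely duplicate vendor records by normalising match keys first.
# File name and columns are placeholders for illustration.
import pandas as pd

vendors = pd.read_csv("vendors.csv")  # columns: vendor_id, name, tax_id, country

# Normalise the fields used for matching before comparing rows.
vendors["name_key"] = vendors["name"].str.lower().str.strip()
vendors["tax_key"] = vendors["tax_id"].astype(str).str.replace(r"\W", "", regex=True)

dupes = vendors[vendors.duplicated(subset=["name_key", "tax_key", "country"], keep=False)]
print(dupes.sort_values(["name_key", "tax_key"]))
```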

Posted 2 months ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Pune

Work from Office

Job Description
Job Purpose: ICE is the leading cloud-based platform provider for the mortgage finance industry. ICE solutions enable lenders to originate more loans, reduce origination costs, and reduce the time to close, all while ensuring the highest levels of compliance, quality, and efficiency. The Associate will perform exception reviews on documents and perform audits at different times in the loan production process, in accordance with customer needs and regulations.

Responsibilities: Validate document recognition and data extraction. Perform data integrity checks on extracted data. Follow the Standard Operating Procedures. Engage with management to overcome any obstacles to service delivery. Ensure that data security is maintained. Ensure the defined SLAs are met as per requirements. Create bounding boxes around data and label the documents. Provide labelled datasets for training the model. Identify and raise anomalies in the software application. Meet the quality and productivity targets on a daily basis. Ability to work in 24/7 shifts.

Knowledge and Experience: Bachelor's degree or academic equivalent. 0 to 2 years of mortgage lending experience, or a combination of education and experience including processing, underwriting, closing, quality control, and/or compliance review. Preferred: Proficiency in mortgage document terminology. Proficiency with Microsoft Office (Excel and Word) and Microsoft Windows. Proficiency in using keyboard shortcuts. Strong attention to detail. Excellent time management and organizational skills. Ability to work efficiently. Ability to work under pressure and time constraints. Effective and clear communication in English, both spoken and written.

Posted 2 months ago

Apply

1.0 - 5.0 years

3 - 6 Lacs

Pune

Work from Office

Job Description
Job Purpose: ICE is the leading cloud-based platform provider for the mortgage finance industry. ICE solutions enable lenders to originate more loans, reduce origination costs, and reduce the time to close, all while ensuring the highest levels of compliance, quality, and efficiency. The Delivery Analyst will perform exception reviews on documents and perform audits at different times in the loan production process, in accordance with customer needs and regulations.

Responsibilities: Validate document recognition and data extraction. Perform data integrity checks on extracted data. Follow the Standard Operating Procedures. Engage with management to overcome any obstacles to service delivery. Ensure that data security is maintained. Ensure the defined SLAs are met as per requirements. Create bounding boxes around data and label the documents. Provide labelled datasets for training the model. Identify and raise anomalies in the software application. Focus on production administration, process improvement, and continuing to deliver on mass production tasks based on existing procedures. Perform testing on product improvements, expansions, and fixes, including accuracy-related functional testing and pre-production testing for client onboarding and production rollout. Perform complex tasks for multiple delivery tracks simultaneously, along with quality audits. Meet the quality and productivity targets on a daily basis. Act as Subject Matter Expert for the process. Perform regular analysis and take performance improvement initiatives independently. Manage and track the workflow queue. Prepare and conduct training; maintain job aids and SOPs. Conduct complex analyses: time and motion, field level, page level, accuracy, high-confidence skip, redundant field, etc. Perform POC-related analysis and validation tasks. Respond to associates' escalations and troubleshoot issues reported by team members. Ability to work in 24/7 shifts.

Knowledge and Experience: Bachelor's degree or academic equivalent. 3+ years of mortgage lending experience, or a combination of education and experience including processing, underwriting, closing, quality control, and/or compliance review. Proficiency in mortgage document terminology. Proficiency with Microsoft Office (Excel and Word) and Microsoft Windows. Proficiency in using keyboard shortcuts. Strong attention to detail. Excellent time management and organizational skills. Ability to work efficiently. Ability to work under pressure and time constraints. Effective and clear communication in English, both spoken and written.

Posted 2 months ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Hyderabad

Work from Office

Job Description
Job Purpose: ICE is the leading cloud-based platform provider for the mortgage finance industry. ICE solutions enable lenders to originate more loans, reduce origination costs, and reduce the time to close, all while ensuring the highest levels of compliance, quality, and efficiency. The Associate will perform exception reviews on documents and perform audits at different times in the loan production process, in accordance with customer needs and regulations.

Responsibilities: Validate document recognition and data extraction. Perform data integrity checks on extracted data. Follow the Standard Operating Procedures. Engage with management to overcome any obstacles to service delivery. Ensure that data security is maintained. Ensure the defined SLAs are met as per requirements. Create bounding boxes around data and label the documents. Provide labelled datasets for training the model. Identify and raise anomalies in the software application. Meet the quality and productivity targets on a daily basis. Ability to work in 24/7 shifts.

Knowledge and Experience: Bachelor's degree or academic equivalent. 0 to 2 years of mortgage lending experience, or a combination of education and experience including processing, underwriting, closing, quality control, and/or compliance review. Preferred: Proficiency in mortgage document terminology. Proficiency with Microsoft Office (Excel and Word) and Microsoft Windows. Proficiency in using keyboard shortcuts. Strong attention to detail. Excellent time management and organizational skills. Ability to work efficiently. Ability to work under pressure and time constraints. Effective and clear communication in English, both spoken and written.

Posted 2 months ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad

Work from Office

We are looking for an experienced Azure Data Engineer with 2+ years of hands-on experience in Azure Data Lake and Azure Data Factory. The ideal candidate will have a strong background in connecting data sources to the Data Lake, writing PySpark SQL code, and building SSIS packages. Additionally, experience in data architecture, data modeling, and creating visualizations is essential.

Key Responsibilities: Work with Azure Data Lake and Azure Data Factory to design, implement, and manage data pipelines. Connect various data sources (applications, databases, etc.) to the Azure Data Lake for storage and processing. Write PySpark SQL code and SSIS packages for data retrieval and transformation from different data sources. Design and develop efficient data architecture and data modeling solutions to support business requirements. Create data visualizations to communicate insights to stakeholders and decision-makers. Optimize data workflows and pipelines for better performance and scalability. Collaborate with cross-functional teams to ensure seamless data integration and delivery. Ensure data integrity, security, and compliance with best practices.

Skills and Qualifications: 2+ years of experience working with Azure Data Lake, Azure Data Factory, and related Azure services. Proficiency in writing PySpark SQL code for data extraction and transformation. Experience in developing SSIS packages for data integration and automation. Strong understanding of data architecture and data modeling concepts. Experience in creating effective and insightful data visualizations using tools like Power BI or similar. Familiarity with cloud-based storage and computing concepts and best practices. Strong problem-solving skills with an ability to troubleshoot and optimize data workflows. Ability to collaborate effectively in a team environment and communicate with stakeholders.

Preferred Qualifications: Certifications in Azure (e.g., Azure Data Engineer or similar) would be a plus. Experience with other Azure tools like Azure Synapse, Databricks, etc.
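To illustrate the PySpark-based extract-and-transform work described above, here is a minimal sketch; the ADLS container paths, column names, and filter logic are placeholders, not a real environment.

```python
# Minimal PySpark extract-clean-write step against ADLS Gen2 paths.
# Paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_transform").getOrCreate()

raw = spark.read.parquet("abfss://raw@yourlake.dfs.core.windows.net/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

cleaned.write.mode("overwrite").parquet(
    "abfss://curated@yourlake.dfs.core.windows.net/orders/"
)
```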

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Work from Office

About Firstsource: Firstsource Solutions is a leading provider of customized Business Process Management (BPM) services. Firstsource specialises in helping customers stay ahead of the curve through transformational solutions to reimagine business processes and deliver increased efficiency, deeper insights, and superior outcomes. We are trusted brand custodians and long-term partners to 100+ leading brands with a presence in the US, UK, Philippines, India, and Mexico. Our rightshore delivery model offers solutions covering the complete customer lifecycle across the Healthcare, Telecommunications & Media, and Banking, Financial Services & Insurance verticals. Our clientele includes Fortune 500 and FTSE 100 companies.

Fraud Technical Specialist (Band 4)
What You Will Do: Reporting into the Fraud Technical Lead, you will work as part of a team to help underpin the delivery of Sky's technical fraud strategy, ensuring Sky is afforded the highest level of protection via efficient and optimal use of best-in-class fraud controls. Undertake operational tasks that ensure the smooth running of Sky's fraud function across multiple systems. Continually build comprehensive knowledge of common fraud trends through a mixture of analyst feedback and proactive interrogation of data. Present rule configuration changes to the Fraud Technical Lead with the aim of optimising overall fraud detection. Respond to information requests by developing customised reports through a combination of SQL database queries and ad-hoc data extracts. Go beyond the figures and deliver meaningful insight in a way that can be easily digested by various stakeholders to help answer the important questions. Maintain quality assurance of fraud systems through user acceptance testing of upgrade deployments and/or system enhancements. Ensure functional deliveries match expected requirements and that all bugs and issues are removed prior to production release. Respond to system incidents by working with relevant support teams/vendors to resolve operational issues within the defined schedules and service level agreements. Apply commercial judgement to correctly prioritize operational malfunctions and ensure impactful system issues are appropriately escalated.

Key Skills & Experience: Highly numerate, preferably with experience in a data analysis role. Strong proven SQL experience is a must. Solid critical thinking and a methodical approach when dealing with technical issues. Demonstrates resilience when faced with competing priorities. Previous fraud experience is advantageous but not essential.

Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.

Posted 2 months ago

Apply

1.0 - 2.0 years

1 - 5 Lacs

Bhiwadi

Work from Office

Job Description: Power BI, Power Automate, Power Apps Specialist
Location: Bhiwadi, Rajasthan. Timings: 7 AM - 3 PM (majorly).
Position Overview: We are seeking a dedicated and experienced specialist in Power BI, Power Automate, and Power Apps. The ideal candidate will have a solid background in Office 365 skills, be capable of working on-site in Bhiwadi, and manage multiple projects and priorities.
Key Responsibilities: Develop and manage Power BI dashboards, Power Automate workflows, and Power Apps solutions. Ensure timely delivery of projects based on the schedule. Link and integrate data from different sources to provide comprehensive business solutions. Maintain operational discipline across all assigned tasks and projects. Collaborate with cross-functional teams to ensure the effective implementation of solutions.
Required Skills and Qualifications: Minimum of 1.5-2 years of experience with Office 365 skills, particularly in Power BI, Power Automate, and Power Apps. Strong domain knowledge in the mentioned skills. Technical capability to deliver projects on time based on the schedule. Excellent ability to link and integrate data from various sources. Strong operational discipline and organizational skills.
Preferred Qualifications: Previous experience in a similar role. Strong problem-solving and analytical skills. Excellent communication and interpersonal skills. Ability to manage multiple projects and priorities effectively.
Thanks & Regards, "Your Manpower Manager"
DIVYA SHARMA, Officer - TA | HR
Contact No: 6262000413
Ashkom.hr1@ashkom.com | Divya.ashkom@gmail.com
Ashkom Media India Private Limited. Website: www.ashkom.com

Posted 2 months ago

Apply

4.0 - 6.0 years

0 - 3 Lacs

Mumbai

Work from Office

Role & responsibilities: Data analysis and superior planning to support the Brand team: Extract multiple distribution and sales reports from different sources and collate them into the desired consolidated format. Take charge of dashboards and master data files, ensuring they are maintained in the agreed format monthly (channel/geography/market level). Track progress of initiatives via reports at geographic/channel levels. Create a monthly share analysis report based on defined segments (e.g., price tier, pack size) by market.
Lead coordination and collaboration with multi-function teams to ensure superior online and offline execution:
e-Content: Development and management of e-Content across key markets, customers, and SKUs. Ensure primary and secondary images are FFU, A+/e+ content is in place, and the claims are correct. Fill defined ECL templates, liaise with the digital transformation team, and upload content on aligned platforms in a timely manner for ease of access.
Pack shot/POSM design creation: Collaborate with the design agency to build and deliver monthly POSM (point of sale material) and product booklets. Manage the timeline and ensure agency output adherence; handle upload/removal on PIM & Brandstore.
Claims management: Lead claim management documentation in line with the Brand Manager's design brief; collaborate with the MFT team on approvals and corrections in forums, submission, and post-submission tracking.
Landscape tracking: Track, analyze, and report key category trends, new launches, money chart, and search across e-commerce platforms, social, and Google. Collaborate with internal teams on product pricing changes/new product launches and communicate these to the desired partners.
Manage and track delivery and creation of samples across markets; lead coordination for product samples for MOT as and when needed.
Be the administrative guide for the Category Team: setting up meetings within the Category Team; pack-shot management.

Posted 2 months ago

Apply

1.0 - 5.0 years

9 - 12 Lacs

Bengaluru

Work from Office

Role Overview: Support analytics solutions in SAP BI/BW or SAC environments. Assist in development, testing, and documentation.
Key Responsibilities: Build reports and dashboards in SAC, BW, or BO. Prepare data models and queries as per business needs. Support data extraction and analysis.
Skills Required: Exposure to SAP BW, HANA Modeling, or SAC. Understanding of ETL and visualization techniques. Willingness to learn and work on cross-functional analytics tools.
Work Details: Shift: UK Shift

Posted 2 months ago

Apply

12.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are looking for a highly skilled Solution Architect to lead the design and implementation of a Data and Machine Learning Platform spanning edge, cloud, and on-prem components. The ideal candidate will have deep experience in Azure cloud services, data engineering, edge computing, and ML lifecycle management.

Technical Expertise
Azure Cloud Stack & DevOps: Azure Databricks (including the ML workspace for Feature Store and Model Store); Azure Data Factory (ADF) for orchestration and compute; Azure Data Lake Storage (ADLS) implementing the medallion architecture (raw, bronze, silver, gold); Azure Event Hub, with experience defining topics, managing consumer groups, and integrating ETL events; Azure Stream Analytics for real-time processing of telemetry and operational data; Azure Key Vault; Azure App Service; Azure Container Registry (ACR); Azure IoT Hub for connecting edge devices; Azure DevOps; GitHub Actions (for CI/CD pipelines); GitHub self-hosted runners for ML workflow automation.
Edge and On-Prem Integration: Strong experience in OT-IT integration and data extraction from industrial systems. Edge VM deployment using Docker and Portainer for container orchestration, RabbitMQ for messaging (read/write services from the edge), and OPC UA for interfacing with PLCs (e.g., FX Filter, NH3 Compressor). IDMZ deployment practices and edge-to-cloud data service integration.
Machine Learning Platform (MLP) and MLOps: End-to-end ML lifecycle implementation: feature engineering, model training and validation, model export, versioning, and deployment. Hands-on with the ADB ML workspace, Feature Store, and Model Store. Monitoring deployed models at 1-minute intervals. Understanding of training vs. inference and cloud vs. edge deployment. Cadence for ML models (weekly refresh, monthly retrain, quarterly revamp). Use of a GitHub monorepo structure for managing model code.
Data Architecture & Integration: Implementation of the medallion architecture in the data platform. Integration with Unity Catalog (UC) for governance, data sharing, and cataloging. Experience with CDC tools (e.g., Aecorsoft) for real-time SAP data ingestion. Consumption layer design for BI, ML, and operational workloads. Familiarity with streaming and API-based ingestion from external environments. Template-driven ingestion and mapping using configurations.
Governance and Data Modeling: Define and enforce data governance standards using Unity Catalog and enterprise frameworks. Design scalable data models to support operational analytics and ML features. Implement policies for access control, quality, and metadata tagging across DLZ/zones.

Key Responsibilities
1. Architect Integrated Solutions: Lead architectural design across edge, cloud, and ML across zones.
2. Build and Govern the Data Platform: Oversee ingestion, transformation, and cataloging across the raw-to-gold layers, aligned to UC.
3. Enable a Scalable ML Platform: Support ML teams with infrastructure for feature storage, model ops, and deployment pipelines.
4. Edge Integration and Automation: Design robust and secure OT-IT interfaces with RabbitMQ, OPC UA, and container orchestration tools.
5. Monitor and Optimize Pipelines: Set up real-time monitoring for ML and ETL pipelines; optimize for performance and cost.
6. Governance and Security Compliance: Ensure enterprise compliance, tagging, and secure access across all zones and services.
7. Lead CI/CD Automation: Use GitHub Actions and Azure DevOps to streamline deployment of ML workflows and platform components.

Skills: DevOps, Azure, RabbitMQ, Data Modeling, Scala, Compliance, Machine Learning, Vault, Workflow, Docker, Cadence, Cloud Services
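As a hedged illustration of the medallion (bronze/silver/gold) flow referenced above, the sketch below promotes raw JSON telemetry to a cleaned Delta table with PySpark; the paths, schema, and table layout are assumptions, and a production pipeline would add schema enforcement, streaming checkpoints, and Unity Catalog governance.

```python
# Illustrative bronze-to-silver step in a medallion layout.
# Requires a Spark environment with Delta Lake (e.g. Databricks);
# container paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

# Bronze: raw telemetry landed as-is (for example, from Event Hub capture).
bronze = spark.read.json("abfss://bronze@lake.dfs.core.windows.net/telemetry/")

# Silver: cleaned, typed, deduplicated records ready for features and reporting.
silver = (
    bronze.dropDuplicates(["device_id", "event_ts"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .filter(F.col("sensor_value").isNotNull())
)

silver.write.format("delta").mode("append").save(
    "abfss://silver@lake.dfs.core.windows.net/telemetry/"
)
```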

Posted 2 months ago

Apply

3.0 - 8.0 years

2 - 5 Lacs

Gurugram

Work from Office

Conduct data research by collecting, analysing, and interpreting data sets. Be able to perform RCA and identify outliers in the data. Help customers troubleshoot issues related to the system. Utilise business tools for data extraction, modelling, and troubleshooting to uncover insights that help the customer grow their business. Collaborate with teams across functions to ensure smooth execution while adhering to timelines and quality standards. Align with organizational goals and values and contribute to long-term success. Communicate effectively with stakeholders in the US, sharing project progress and addressing inquiries to maintain alignment.

Requirements and skills: Experience of 1-3 years in technical support for software products (B2B SaaS). Experience in customer service, technical support, ticketing tools, and troubleshooting. Proficiency in Excel, including basic Python/SQL. Should understand the renewable energy business. Strong team player with the autonomy to manage their own workload. Excellent communication skills.

Qualifications: Bachelor's degree in Electrical Engineering or Electronics & Communication (only).

Posted 2 months ago

Apply

1.0 - 3.0 years

8 - 11 Lacs

Bengaluru

Work from Office

Key Responsibilities:
Data Collection & Preparation: Collect and organize data from various sources, such as databases, spreadsheets, and APIs. Perform data cleaning and preprocessing to ensure data accuracy and consistency.
Data Analysis & Reporting: Analyze datasets to identify trends, patterns, and insights to support business objectives. Create reports, dashboards, and visualizations to present findings clearly to stakeholders. Assist in tracking key performance indicators (KPIs) and metrics.
Collaboration: Work closely with cross-functional teams (e.g., Marketing, Sales, Operations) to understand data needs. Support business teams by providing data-driven recommendations.
Tools & Technology: Utilize tools like Excel, SQL, and BI platforms (e.g., Tableau and Power BI) for data analysis and visualization. Support database queries and data extraction processes.
Documentation & Process Improvement: Document data processes, methods, and findings to ensure repeatability and transparency. Identify opportunities for process improvements and contribute to their implementation.

Qualifications:
Education: Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, or a related field.
Technical Skills: Proficiency in Microsoft Excel (e.g., PivotTables, VLOOKUP). Strong SQL skills for querying databases. Familiarity with data visualization tools (e.g., Tableau, Power BI). Understanding of statistical analysis and basic concepts.
Soft Skills: Strong problem-solving and analytical skills. Excellent attention to detail and organizational skills. Effective communication skills to present data insights to non-technical audiences.
Experience: 1-3 years of relevant experience in data analysis or a related role.
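For a concrete flavour of the reporting tasks listed above, here is a small pandas sketch that aggregates transactional records into monthly KPIs and exports them to Excel; the orders.csv file and its columns are invented for the example.

```python
# Aggregate raw orders into a monthly KPI summary and export for a report.
# File and column names are placeholder assumptions.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

kpis = (
    orders.assign(month=orders["order_date"].dt.to_period("M").astype(str))
          .groupby("month")
          .agg(revenue=("amount", "sum"),
               orders=("order_id", "count"),
               avg_order_value=("amount", "mean"))
          .round(2)
)
kpis.to_excel("monthly_kpis.xlsx")  # requires the openpyxl package
print(kpis.tail())
```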

Posted 2 months ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Chennai

Remote

Execute and support R&D activities leveraging metadata from multiple databases, ETL/ELT products, reporting tools, etc. Develop, test, and deploy Python and SQL-based solutions to automate and optimize operational processes. Handle data analysis and reporting.
Required candidate profile: Provide hands-on programming support for AI-driven initiatives. Mastery of Python programming and advanced SQL proficiency. Strong grasp of analytical methodologies, statistical concepts, and data visualization techniques.

Posted 2 months ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Primary Responsibilities Gather and analyze requirements for clinical data conversion projects Collaborate with clients and vendors to define project scope, timelines, and deliverables Prepare and transform clinical data for conversion activities Address and resolve data-related issues reported by clients Develop and maintain documentation and specifications for data conversion processes Monitor project progress and ensure timely completion of milestones Troubleshoot common database issues and provide technical support Ensure compliance with US healthcare regulations and standards Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications Familiarity with US healthcare systems and regulations Knowledge of standard EHR/EMR clinical data workflows Understanding of healthcare clinical dictionaries Proficiency in EHR database architecture and data extraction/transformation using MS SQL Server Solid knowledge of stored procedures, triggers, and functions Proven excellent problem-solving and troubleshooting skills Solid communication and collaboration abilities

Posted 2 months ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Job Description: We are seeking an experienced Oracle Integration Cloud (OIC) Technical Consultant for the role of Manager - OIC Integrations. The ideal candidate will have a deep understanding of Oracle Integration Cloud and a proven track record in designing, developing, and managing integrations between various Oracle Cloud and third-party systems.

Mandatory Skills: Oracle Integration Cloud (OIC): Expertise in designing and building integrations using OIC. Strong experience working with OIC adapters (REST, SOAP, File, FTP, ERP, HCM, etc.). Proficiency in handling integration error management, orchestration, and mapping. Experience with scheduled and real-time integrations.

Nice to Have Skills: BI Publisher (BIP): Creating and customizing BIP reports; working with data models and bursting functionality. PL/SQL: Writing and optimizing SQL/PLSQL queries; experience working with Oracle databases for data extraction and manipulation.

Key Responsibilities: Lead and manage integration projects across Oracle Cloud applications. Design robust, scalable, and efficient integration solutions using OIC. Collaborate with functional and technical teams to gather integration requirements. Troubleshoot and resolve integration issues. Document technical designs, integration flows, and best practices.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. [X]+ years of hands-on experience with Oracle Integration Cloud. Strong analytical and problem-solving skills. Excellent communication and leadership abilities.

Posted 2 months ago

Apply

3.0 - 6.0 years

8 - 18 Lacs

Chennai

Work from Office

About Us : We are a dynamic and innovative team specializing in building scalable and efficient web scraping solutions for complex e-commerce platforms and diverse web environments. Our team is proficient in a wide range of technologies and advanced web scraping techniques, ensuring high-quality data extraction and delivery Job Summary: We are seeking a talented Web Scraping Engineer / Data Extraction Specialist to join our growing team. The ideal candidate will have a strong background in web scraping, data extraction, and backend technologies. You will be responsible for designing, developing, and maintaining robust web scraping solutions, handling dynamic content, and overcoming anti-crawling measures. Responsibilities: Develop and maintain scalable web scraping scripts using Python, JavaScript, and related frameworks (e.g., Scrapy, Selenium, Puppeteer, Beautiful Soup, Cheerio.js). Implement advanced web scraping techniques, including API interception, sitemap parsing, and handling dynamic content. Design and build data pipelines for efficient data extraction, processing, and storage. Manage and optimize data extraction workflows, ensuring high speed and accuracy. Implement anti-crawling solutions, including IP rotation, proxy management, and CAPTCHA bypassing. Collaborate with cross-functional teams to gather requirements and deliver complex data solutions. Utilize backend technologies such as Flask, FastAPI, Django, Node.js, Spring Boot, and relational databases (PostgreSQL, MySQL) for data storage and API development. Work with cloud platforms like Azure and leverage services such as AzureML and ADLS GEN2. Employ data processing techniques using libraries like NumPy and Pandas. Use tools like Postman, MITM, and DevTools for API testing and network traffic analysis. Apply machine learning and NLP techniques for data analysis and processing (e.g., sentiment analysis, content classification). Set up and manage server-side scraping environments. Monitor and troubleshoot scraping scripts to ensure optimal performance. Required Skills: Strong proficiency in Python and/or JavaScript. Experience with web scraping libraries and frameworks (Scrapy, Selenium, Puppeteer, Beautiful Soup, Cheerio.js). Knowledge of backend technologies (Flask, FastAPI, Django, Node.js, Spring Boot). Experience with relational databases (PostgreSQL, MySQL). Understanding of HTTP/HTTPS protocols and API communication. Familiarity with cloud platforms (Azure). Ability to handle dynamic content and JavaScript-heavy websites. Experience with anti-crawling techniques (IP rotation, proxies, CAPTCHA bypassing). Data processing and analysis skills (NumPy, Pandas). Experience with API testing tools (Postman). Knowledge of machine learning and NLP concepts is a plus. Strong problem-solving and debugging skills. Excellent communication and collaboration skills. Experience: 3+ years of professional experience in web scraping or related fields. Education: Bachelor's degree in Computer Science, Engineering, or a related field
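As a deliberately simple sketch in the spirit of the scraping stack described above (requests plus BeautifulSoup), the example below extracts title and price fields from a placeholder listing page; the URL and CSS selectors are assumptions, and a production crawler would add proxy rotation, retries, and rate limiting.

```python
# Minimal requests + BeautifulSoup scraping sketch with placeholder selectors.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # placeholder listing page
headers = {"User-Agent": "Mozilla/5.0 (compatible; demo-scraper/0.1)"}

response = requests.get(URL, headers=headers, timeout=15)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
rows = []
for card in soup.select("div.product-card"):          # assumed selector
    rows.append({
        "title": card.select_one("h2").get_text(strip=True),
        "price": card.select_one("span.price").get_text(strip=True),
    })

print(f"extracted {len(rows)} items")
```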

Posted 2 months ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Noida

Work from Office

Excellent opportunity for freshers.
Who should you be? A graduate with good academic records and excellent written English and comprehension skills. A dynamic personality with good communication skills. A keen player with an ability to learn fast and grow. A team player with a zeal to perform in a team and grow.
What do we offer? A fast-paced environment to learn and grow.
Roles and Responsibilities: Manage data extraction, analysis, and reporting using Excel. Ensure timely delivery of high-quality results. Manage large datasets by cleaning, organizing, and transforming data into a usable format.
Please connect with Ms. Shweta on 9873630477 if interested.

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 6 Lacs

Chandigarh

Work from Office

Job Summary: We are seeking a results-oriented and strategic team leader of Business Analytics to lead our Reporting Analytics team in harnessing data to drive business success. This role requires a blend of analytical expertise, leadership capabilities, and a deep understanding of business operations. The ideal candidate will empower teams to extract meaningful insights from data, support key initiatives, and foster a data-driven culture across the organization.

Key Responsibilities:
Team Leadership and Development: Recruit, train, and develop a high-performing analytics team, providing coaching and mentorship to foster professional growth. Conduct regular performance reviews, setting clear objectives and providing feedback to enhance team effectiveness. Responsible for the deliverables of the Reporting Analytics team, including leading 2-3 Associate Process Managers and a 10-15 member span of Reporting Analysts and Sr. Reporting Analysts.
Data Strategy and Governance: Define and execute a comprehensive data analytics strategy that aligns with organizational goals and industry best practices. Establish data governance protocols to ensure data quality, security, and compliance with relevant regulations.
Business Insights and Analysis: Partner with stakeholders to understand their needs, challenges, and goals, translating these into analytical projects that drive value. Conduct exploratory data analysis to uncover trends, patterns, and insights that inform strategic decisions.
Reporting, Visualization, and Communication: Design and implement effective reporting frameworks, utilizing advanced data visualization tools to present findings to diverse audiences. Prepare and deliver presentations to senior leadership, articulating insights and recommendations in a clear and actionable manner.
Performance Metrics and Dashboards: Develop and monitor dashboards and performance metrics to track the effectiveness of business initiatives, providing timely updates to stakeholders. Identify opportunities for improving operational efficiency through data-driven recommendations.
Project Management: Lead cross-functional analytics projects, coordinating resources and timelines to ensure successful project delivery. Manage project budgets and resources effectively, ensuring alignment with strategic priorities.
Continuous Improvement and Innovation: Stay abreast of industry trends, emerging technologies, and best practices in analytics, applying this knowledge to enhance team capabilities. Foster a culture of continuous improvement by encouraging team members to explore new methodologies and technologies.
Leadership and Interpersonal Skills: Strong leadership qualities with the ability to influence and inspire cross-functional teams. Excellent interpersonal skills with a knack for building relationships at all levels of the organization.
Analytical and Problem-Solving Skills: Strong analytical mindset with the ability to think critically and strategically to solve complex business problems.

Technical Skills (data analysis and reporting tools, database management, data visualization, ETL processes, data warehousing, scripting languages):
Must have: Proficiency in Microsoft Excel, PowerPoint, VBA, and SQL, including advanced functions and data visualization features.
Good to have: Knowledge of BI tools such as SSIS packages, Tableau, or Power BI for creating interactive dashboards and reports.
Good to have: Understanding of statistical methods; knowledge of forecasting and predictive analysis.
Must have: Strong knowledge of SQL for querying databases and extracting data, and familiarity with database management systems like Microsoft SQL Server.
Must have: Proficiency in visualization tools such as Excel dashboards, PowerPoint, and Tableau to create meaningful visual representations of data; the ability to create visually appealing and informative reports and dashboards that convey insights effectively; and an understanding of effective visualization design and best practices.
Must have: Knowledge of Extract, Transform, Load (ETL) processes and tools (e.g., SSIS packages) for data integration and preparation.
Good to have: Understanding of data warehousing concepts and architecture, and experience with data warehouse technologies and methodologies.
Must have: Proficiency in scripting languages like VBA and SQL for automating reporting tasks and data manipulation.

Posted 2 months ago

Apply

6.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Primary Skills: Proven experience working with PeopleSoft HCM and CRM modules. Strong technical knowledge of SQR, Application Designer, Application Engine, and PeopleCode. Proficiency in SQL for data extraction and manipulation.
Secondary Skills: Experience with data archival projects and enterprise archival solutions is a plus. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities.

Posted 2 months ago

Apply