
958 Data Cleansing Jobs - Page 10

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

3 - 5 Lacs

Vasai

Work from Office

Job Responsibilities:
- Collect, filter, and maintain accurate data from multiple sources.
- Prepare and deliver Daily, Weekly, and Monthly MIS reports as per business requirements.
- Generate dashboards and performance reports for management review, ensuring timely submission.
- Analyze data trends to provide actionable insights by highlighting variances and performance gaps.
- Perform data validation, reconciliation, and troubleshooting to ensure accuracy and consistency.
- Identify and implement reporting automation opportunities to improve efficiency.
- Collaborate with cross-functional teams to support data-driven decision-making.

Key Skills Required:
- Strong expertise in SQL, Power BI, and Advanced Excel (Pivot Tables, VLOOKUP, Power Query).
- Excellent data presentation and visualization skills.
- Strong analytical and problem-solving abilities.
- Attention to detail with the ability to manage large datasets.
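Illustrative only (not part of the posting): a minimal Python sketch of the kind of validation and reconciliation step described above, assuming a hypothetical daily sales extract in sales.csv and a reference ledger in ledger.csv.

```python
import pandas as pd

# Load the hypothetical daily extract and the reference ledger (file names are assumptions).
sales = pd.read_csv("sales.csv", parse_dates=["order_date"])
ledger = pd.read_csv("ledger.csv", parse_dates=["order_date"])

# Basic validation: drop exact duplicates and flag rows missing key fields.
sales = sales.drop_duplicates()
missing_key_fields = sales[sales[["order_id", "amount"]].isna().any(axis=1)]

# Reconciliation: compare daily totals in the extract against the ledger.
daily_sales = sales.groupby(sales["order_date"].dt.date)["amount"].sum()
daily_ledger = ledger.groupby(ledger["order_date"].dt.date)["amount"].sum()
variance = daily_sales.sub(daily_ledger, fill_value=0)

# Report rows needing attention before the MIS report is published.
print(f"Rows with missing key fields: {len(missing_key_fields)}")
print(variance[variance.abs() > 0.01])
```

In practice, the cleaned output of a check like this would feed the Power BI or Excel dashboards mentioned in the posting.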

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

Mumbai, Navi Mumbai

Work from Office

Sage Software and Solutions Pvt. Ltd. is looking for a Data Analyst Intern to join our dynamic team and embark on a rewarding career journey.
- Managing master data, including creation, updates, and deletion.
- Managing users and user roles.
- Providing quality assurance of imported data, working with quality assurance analysts if necessary.
- Commissioning and decommissioning of data sets.
- Processing confidential data and information according to guidelines.
- Helping develop reports and analysis.
- Managing and designing the reporting environment, including data sources, security, and metadata.
- Supporting the data warehouse in identifying and revising reporting requirements.
- Supporting initiatives for data integrity and normalization.
- Assessing tests and implementing new or upgraded software, and assisting with strategic decisions on new systems.
- Generating reports from single or multiple systems.
- Troubleshooting the reporting database environment and reports.
- Evaluating changes and updates to source production systems.
- Training end users on new reports and dashboards.
- Providing technical expertise in data storage structures, data mining, and data cleansing.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Playdawn Consulting is looking for a Data Analyst / APM to join our dynamic team and embark on a rewarding career journey.
- Managing master data, including creation, updates, and deletion.
- Managing users and user roles.
- Providing quality assurance of imported data, working with quality assurance analysts if necessary.
- Commissioning and decommissioning of data sets.
- Processing confidential data and information according to guidelines.
- Helping develop reports and analysis.
- Managing and designing the reporting environment, including data sources, security, and metadata.
- Supporting the data warehouse in identifying and revising reporting requirements.
- Supporting initiatives for data integrity and normalization.
- Assessing tests and implementing new or upgraded software, and assisting with strategic decisions on new systems.
- Generating reports from single or multiple systems.
- Troubleshooting the reporting database environment and reports.
- Evaluating changes and updates to source production systems.
- Training end users on new reports and dashboards.
- Providing technical expertise in data storage structures, data mining, and data cleansing.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

20 - 25 Lacs

Bengaluru

Work from Office

- Leadership and Collaboration: Lead and collaborate with various IT Centers of Excellence (COE) teams to foster innovation and synergies with the IR analytics team and their platform and solution capabilities.
- System and Architecture Expertise: Discrete-manufacturing SAP experience, including experience in master data, transactional data, finance, supply chain, or engineering analysis and data conversions, to drive architectural synergies and solution recommendations.
- Big Data Technologies: Demonstrate a relentless commitment to learning, implementing, and eventually training other teams on novel big data solutions available within Ingersoll Rand.
- Hands-On Approach: Exhibit a hands-on approach to learning through experience, creating a track record of solutions via individual contributions and collaboration with other COEs.
- Change Management Champion: Continue to grow as a change management champion, working with different teams on big data and analytics enablement and training.
- Integrated IT Function: Push the agenda of a more integrated IT function, promoting data literacy and architectural excellence that incorporates both data and information systems solutions.
- Innovation and Synergies: Identify and capitalize on opportunities for system/analytics architecture and solution synergies, starting with the SAP space and expanding to other information systems within Ingersoll Rand's operating system.

Posted 3 weeks ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mumbai

Work from Office

Apprentice Analyst

Roles and responsibilities:
- Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction.
- Data profiling and reporting (basic).
- Email communication with the client on request acknowledgment, project status, and responses to queries.
- Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective.
- Provide technical consulting to the customer's category managers on industry best practices for product data enhancement.

Technical and Functional Skills:
- Bachelor's degree in Engineering (Electrical, Mechanical, or Electronics stream).
- Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications.
- Intermediate knowledge of MS Office/Internet.
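Illustrative only (not part of the posting): a minimal sketch, assuming a hypothetical product-attribute table, of the standardization and categorization work described above: normalizing unit spellings and flagging records that need manual research.

```python
import pandas as pd

# Hypothetical raw product data with inconsistent units and casing (column names are assumptions).
raw = pd.DataFrame({
    "product": ["Pump A", "Motor B", "pump c"],
    "power": ["5 HP", "3.7 kw", "5hp"],
    "category": ["pumps", "Motors", None],
})

# Standardize the power attribute to a single unit (kW).
def to_kw(value):
    if not isinstance(value, str):
        return None
    v = value.lower().replace(" ", "")
    if v.endswith("kw"):
        return float(v[:-2])
    if v.endswith("hp"):
        return round(float(v[:-2]) * 0.7457, 3)  # 1 HP ≈ 0.7457 kW
    return None

clean = raw.assign(
    product=raw["product"].str.title(),
    power_kw=raw["power"].map(to_kw),
    category=raw["category"].str.title(),
)

# Flag gaps that would need research against catalogues or manufacturer websites.
print(clean[clean[["power_kw", "category"]].isna().any(axis=1)])
print(clean)
```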

Posted 3 weeks ago

Apply

2.0 - 4.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Role Overview: As a Data Scientist within IBM's Chief Analytics Office, you will support AI-driven projects across the enterprise. You will apply your technical skills in AI, machine learning, and data analytics to help implement data-driven solutions that align with business goals. This role involves working with team members to translate data insights into actionable recommendations.

Key Responsibilities:
- Technical Execution and Leadership: Develop and deploy AI models and data analytics solutions. Support the implementation and optimization of AI-driven strategies per business stakeholder requirements. Help refine data-driven methodologies for transformation projects.
- Data Science and AI: Design and implement machine learning solutions and statistical models, from problem formulation through deployment, to analyze complex datasets and generate actionable insights. Learn and utilize cloud platforms to ensure the scalability of AI solutions. Leverage reusable assets and apply IBM standards for data science and development.
- Project Support: Lead and contribute to various stages of AI and data science projects, from data exploration to model development. Monitor project timelines and help resolve technical challenges. Design and implement measurement frameworks to benchmark AI solutions, quantifying business impact through KPIs.
- Collaboration: Ensure alignment with stakeholders' strategic direction and tactical needs. Work with data engineers, software developers, and other team members to integrate AI solutions into existing systems. Contribute technical expertise to cross-functional teams.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Education: Bachelor's or Master's in Computer Science, Data Science, Statistics, or a related field is required; an advanced degree is strongly preferred.
- Experience: 2-4 years of experience in data science, AI, or analytics with a focus on implementing data-driven solutions. Experience with data cleaning, data analysis, A/B testing, and data visualization. Experience with AI technologies through coursework or projects.
- Technical Skills: Proficiency in SQL and Python for performing data analysis and developing machine learning models. Knowledge of common machine learning algorithms and frameworks: linear regression, decision trees, random forests, gradient boosting (e.g., XGBoost, LightGBM), neural networks, and deep learning frameworks such as TensorFlow and PyTorch. Experience with cloud-based platforms and data processing frameworks. Understanding of large language models (LLMs). Familiarity with IBM's watsonx product suite. Familiarity with object-oriented programming.
- Analytical Skills: Strong problem-solving abilities and eagerness to learn. Ability to work with datasets and derive insights.
- Other Requirements: Good communication skills, with the ability to explain technical concepts clearly. Enthusiasm for learning and applying new technologies. Strong project management skills, with the ability to balance multiple initiatives, prioritize tasks effectively, and meet deadlines in a fast-paced environment.

Preferred technical and professional experience: an advanced degree is strongly preferred.
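Illustrative only (not part of the posting): a small scikit-learn sketch of the kind of baseline modelling work listed above, showing a train/test split, a random forest classifier, and a simple evaluation on a synthetic dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a business dataset (real work would start from cleaned production data).
X, y = make_classification(n_samples=2_000, n_features=20, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Baseline model; in practice this would be compared against gradient boosting and other candidates.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out split and report the usual classification metrics.
print(classification_report(y_test, model.predict(X_test)))
```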

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office

Lead SAP data migration projects in Life Sciences. Perform ETL, data cleansing, validation, and conversion. Must have 7+ years' experience with SAP tools, strong analytical skills, and the ability to ensure data integrity across systems.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kolkata

Work from Office

Collect and validate data from various sources; analyze and process large datasets to identify trends and patterns for business growth and decisions. Maintain dashboards and reports in Power BI, Tableau, or Excel. Ensure data integrity, accuracy, and security.

Required candidate profile: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field, with experience in data analysis. Proficient in SQL, Excel, and at least one data visualization tool (e.g., Tableau, Power BI).

Posted 3 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Pune

Work from Office

Brief Job Summary: The Data Migration Item MDM role will manage the extract, transform, and load (ETL) of item-related data from/to Oracle PD/PIM and Oracle EBS.

Responsibilities:
- Work as part of the Data Migration team on ETL activities in Oracle Fusion Cloud PD and PIM related to updating item attributes and BOMs, and loading new item, document, attachment, and BOM information.
- Handle Oracle EBS-related migration of all master and transactional data, including updating item attribution and BOM information.
- Previous experience with data migration of item-related data in Oracle PD/PIM or Oracle EBS is a must.
- Adhere to a data migration strategy and use specific data migration tools.
- Identify risks and issues in a timely manner and escalate for resolution as needed.
- Manage data quality across different phases of the data migration and make sure that data is fit for purpose.
- Knowledge of Fusion data migration tools, including FBDI/HDL/ADFDI and Fusion Web Services.
- Work collaboratively to ensure data is cleansed in a timely manner.
- Substantial experience working with databases and ETL tools capable of data cleansing.
- Perform data migration audit, reconciliation, and exception reporting.
- Work with subject matter experts and the project team to identify, define, collate, document, and communicate the data migration requirements.
- Work across multiple functional work streams to understand data usage and implications for data migration.
- Support initiatives for data integrity and governance.
- Perform source data identification and analysis to manage source-to-target data mapping.
- Manage master and transactional data, including creation, updates, and deletion.

Requirements:
- Bachelor's degree in Information Technology, Process Management, or a related degree, or equivalent experience.
- At least 4 years of combined experience in item/product data migration, specifically extract, transform, and load.
- 2+ years of experience in Oracle Fusion and Oracle EBS data migration roles.
- Business Knowledge: Demonstrates strong knowledge of current and possible future policies, practices, trends, technology, and information related to the business and the organization.
- Communication: Demonstrates excellent listening and communication skills (written and verbal).
- Initiative: Works independently and is highly motivated to initiate and accept new challenges.
- Judgment/Decision Making: Makes solid decisions based on a mixture of analysis, wisdom, experience, and judgment.
- Managing & Adapting to Change: Readily adapts to changes in priority of initiatives and overall strategic direction within a multi-plant, geographically widespread organization.
- Professionalism: Exhibits appropriate attributes in all aspects of performance and demeanor.
- Teamwork: Organizes and directs effective teams at the cross-functional level that consistently achieve stated goals.
- Results Oriented: Bottom-line oriented and can be counted on to consistently meet and exceed goals.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

The Data Engineer exercises judgment when following general instructions and works with minimal instruction to support the integration and automation of data solutions. This role focuses on data massaging, reconciliation, and analysis, resolving routine to semi-routine issues. Responsibilities include creating optimized SQL queries, managing data pipelines, and collaborating with cross-functional teams to ensure data accuracy and availability. Note: this role may come into contact with confidential or sensitive customer information requiring special treatment in accordance with Red Hat policies and applicable privacy laws.
- Write optimized and scalable complex SQL queries.
- Automate data processing tasks using Python, focusing on cleaning and merging datasets.
- Manage data pipelines, including scheduling, monitoring, and debugging workflows.
- Collaborate with data engineers and IT teams to maintain data accessibility for stakeholders.
- Assist in developing automated tests to ensure the accuracy and integrity of data.
- Participate in version control and CI/CD processes for deploying and testing pipeline changes across environments.
- Work cross-functionally with analysts, engineers, and operations.
- Data stewardship, including data governance, data compliance, data transformation, data cleanliness, data validation, and data audit/maintenance.

Primary Job Responsibilities:
- Write complex, highly optimized SQL queries across large datasets; perform SQL query tuning and provide tuning recommendations.
- Experienced in data analytics, with hands-on experience of Python libraries such as NumPy and Pandas.
- Python development experience to massage and clean data and to automate data extracts and loads.
- Expertise in converting raw data to processed data by merging datasets and finding outliers, errors, trends, missing values, and distributions in the data.
- Expertise in creating, debugging, scheduling, and monitoring jobs using Airflow; resolve performance-tuning issues and queries.
- Foster collaboration among data engineers, IT, and other business groups to ensure data is accessible to the FP&A team.
- Schedule a regular hot backup process and participate in backup activities.
- Strong analytical and problem-solving skills, with the ability to represent complex algorithms in software.
- Develop automated unit tests, end-to-end tests, and integration tests to assist in quality assurance (QA) procedures.

Required Skills:
- Bachelor's or Master's degree in Computer Science, IT, Engineering, or equivalent.
- 5+ years of experience as a Data Engineer, BI Engineer, or Systems Analyst in a company with large, complex data sources.
- Working knowledge of dbt, Snowflake, Fivetran, and Git, plus SQL or Python programming skills for data querying, cleaning, and presentation.
- Experience building highly available, reliable, and secure API solutions, including REST API design and implementation.
- Working knowledge of relational databases (PostgreSQL, MSSQL, etc.) and experience with AWS services including S3, Redshift, EMR, and RDS.
- Ability to manage multiple projects at the same time in a fast-paced team environment, across time zones and cultures, while working as part of a team.
- Good troubleshooting skills and the ability to think through issues and problems in a logical manner; planning knowledge is an added advantage.
- Detail-oriented and enthusiastic, focused and diligent on delivering results.
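Illustrative only (not part of the posting): a minimal Pandas sketch of the cleaning-and-merging work described above, assuming hypothetical orders and customers extracts rather than real warehouse tables.

```python
import pandas as pd

# Hypothetical extracts; in practice these would come from a warehouse query or a pipeline task.
orders = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "customer_id": [10, 11, 11, 12],
    "amount": [100.0, None, 250.0, 75.5],
})
customers = pd.DataFrame({
    "customer_id": [10, 11, 13],
    "region": ["EMEA", "APAC", "AMER"],
})

# Clean: drop rows missing the amount, then remove duplicate order IDs.
orders = orders.dropna(subset=["amount"]).drop_duplicates(subset="order_id")

# Merge and reconcile: a left join keeps every order; missing regions flag reference-data gaps.
merged = orders.merge(customers, on="customer_id", how="left")
unmatched = merged[merged["region"].isna()]

print(merged)
print(f"Orders without a matching customer record: {len(unmatched)}")
```

A step like this would typically run inside an Airflow task, with the unmatched rows surfaced to stakeholders rather than silently dropped.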

Posted 3 weeks ago

Apply

13.0 - 18.0 years

14 - 19 Lacs

Bengaluru

Work from Office

- Data Collection: Gather and compile relevant data from various sources for marketing purposes.
- Data Cleansing & Organization: Ensure accuracy and structure of marketing and sales data.
- Marketing Coordination: Work closely with the marketing team on campaigns, brochures, content sharing, and outreach activities.
- Sales Coordination: Support the sales team with lead data.
- Market Research: Assist in identifying target audiences, competitors, and trends.
- Communication Support: Help with email campaigns, content distribution, and basic client communication.
- Reporting: Maintain and update marketing-related reports for management review.

Eligibility Criteria:
- Freshers only.
- Any graduate (BBA, B.Com, BA, MBA, etc.); a non-IT background is welcome.
- Strong communication and coordination skills.
- Basic knowledge of MS Excel / Google Sheets.
- Ability to multitask, stay organized, and work in a team.
- Must be willing to join immediately.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Pune

Work from Office

We are seeking an experienced Data Scientist with a strong foundation in Python, machine learning, and cloud-based analytics. You will play a key role in building data-driven energy optimization solutions across buildings and campuses. From time-series analysis to predictive maintenance, you'll work on impactful projects that accelerate energy efficiency and sustainability on a global scale. This position is for the Pune location.

You'll make a difference by:
- Collaborating with stakeholders to understand business objectives and develop data science strategies aligned with those objectives.
- Designing, developing, and implementing advanced machine learning and statistical models to solve complex business problems.
- Conducting exploratory data analysis, data cleansing, and feature engineering to prepare datasets for analysis.
- Exploring and utilizing various data mining and machine learning techniques to extract valuable insights and patterns from large datasets.
- Developing predictive/prescriptive models, algorithms, and prototypes to support the business.
- Having working knowledge of handling IoT and IIoT data and traditional use cases such as anomaly detection for a multi-sensor system.
- Having hands-on experience in forecasting using traditional ML and deep networks.
- Performing statistical analysis, hypothesis testing, and A/B testing to evaluate the effectiveness of models and algorithms.
- Communicating findings and insights to technical and non-technical stakeholders through reports, presentations, and data visualizations.
- Understanding CI/CD processes in product deployment and using them in delivery.
- Understanding Dockerization and REST APIs.
- Having working knowledge of software development processes.
- Staying up to date with the latest trends and advancements in data science, machine learning, and AI.

You'll win us over if you have:
- An advanced degree (Master's or Ph.D.) in a quantitative field such as Data Science, Computer Science, or Statistics, or a Bachelor's degree (BE, BTech, BS) with demonstrated equivalent practical experience.
- 5+ years of experience in data science and/or data analysis, with a proven track record of developing ML models and algorithms.
- A Master's or Ph.D. degree in Data Science, Computer Science, Statistics, or a related field.
- Strong programming skills in languages such as Python.
- Exposure to the industrial engineering domain (preferred).
- Knowledge of big data technologies such as Hadoop, Spark, or Hive (a plus).
- Proficiency in the use of data science tools.
- A collaborative, team-oriented attitude with a proactive mindset.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

18 - 25 Lacs

Gurugram

Work from Office

Job description: We are seeking a highly motivated and detail-oriented Data Analyst to join our growing team. This role is responsible for transforming data into meaningful insights that inform strategic decision-making across the organization. The ideal candidate will possess strong analytical capabilities, a deep understanding of data structures, and the ability to communicate findings clearly and effectively to both technical and non-technical stakeholders.

Key Responsibilities:
- Collect data from different sources such as databases, reports, and systems; clean and analyze structured and unstructured data from various internal and external sources.
- Develop, maintain, and optimize dashboards, reports, and performance metrics that support key business functions.
- Identify trends, patterns, and anomalies in data to provide actionable insights and support business planning.
- Collaborate cross-functionally with teams including product, marketing, finance, and operations to address analytical needs and provide data-driven recommendations.
- Assist in the development of predictive models, customer segmentation, and performance tracking frameworks.
- Ensure data integrity and accuracy through rigorous testing, validation, and documentation of data pipelines and processes.

Preferred Skills:
- Experience with cloud-based data platforms (e.g., Snowflake, BigQuery, Redshift).
- Knowledge of statistical techniques such as regression, clustering, and hypothesis testing.
- Familiarity with ETL processes and data warehousing concepts.
- Understanding of data governance, privacy, and compliance best practices.

Qualifications:
- Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, Data Science, or a related field.
- 5+ years of experience in a data analyst or business analyst role.
- Proficiency in SQL for data extraction, transformation, and analysis.
- Hands-on experience with data visualization tools such as Tableau, Power BI, or Looker.
- Proficient in Microsoft Excel; familiarity with Python or R is a plus.
- Strong analytical thinking and problem-solving skills.
- Excellent written and verbal communication skills, with the ability to present complex data in a clear and concise manner.

Shift days: Monday to Friday (WFO). Shift timings: 10:30 AM to 7:30 PM.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Mumbai, Pune

Work from Office

Looking for a challenging role? If you really want to make a difference, make it with us.

About the role: A Teamcenter PLM Data Migration Specialist is a professional who specializes in the movement of data from one system or platform to another. They understand Teamcenter PLM data structures and related PLM processes and play a crucial role in ensuring the successful transfer and integration of data. Teamcenter PLM Data Migration Specialists are responsible for analyzing data structures, mapping data fields, validating data integrity, and implementing efficient migration strategies. The main data formats are metadata, documents, and Creo CAD data. Source data systems include SAP, PTC Windchill PDMLink, Oracle Agile CADIM, SharePoint, and file storage systems. The role includes ensuring that the target system meets the requirements of the data to be migrated; gaps will be identified and worked out to a solution with the respective system teams.
- Specializes in the movement of data from one system or platform to another.
- Developing and adapting the migration tools.
- Migration dry runs.
- Assessment of the migration result (with the help of the respective roll-out project).
- Data cleansing (with the help of the respective roll-out project). Goal: identify and rectify inconsistencies, gaps, and errors within the data to meet the Teamcenter data model, in source systems, during the migration (as part of the transformation in the ETL process), and in the target system.

Main responsibilities of a data migration specialist:
- Analyze data structures and the target system data model.
- Analyze, prepare, and execute data cleansing services.
- Analyze, specify, and document data migration requirements with clients and internal teams.
- Collaborate with cross-functional teams.
- Collaborate with data architects to design data migration solutions.
- Conduct post-migration data validation.
- Conduct testing on migrated data to ensure client requirements are fulfilled.
- Coordinate with stakeholders to gather requirements.
- Create spreadsheets or use other data analysis tools with large numbers of figures without mistakes.
- Develop and execute test plans on migrated data.
- Develop templates for data migration objects that can be leveraged for multiple rollouts.
- Document data migration processes and procedures.
- Extract, transform, and load (ETL) are terms often used by the team.
- Handle escalated client complaints and concerns as needed; bug/issue tracking for data subjects.
- Identify and mitigate data risks.
- Identify data migration requirements.
- Maintain appropriate levels of data security and privacy relating to customer data.
- Maintain data migration documentation.
- Map data fields between source and target systems.
- Optimize data migration processes for efficiency within the data and migration team.
- Profile data results from legacy data sources.
- Troubleshoot and resolve data migration issues.

We don't need superheroes, just super minds:
- BE or BTech degree in Information Technology or Computer Science.
- Minimum 8+ years of support experience with Teamcenter Application 11.5 or higher, administration of the Teamcenter application / 4-tier architecture, Teamcenter deployments, and code quality reviews.
- Ability to create deployment scripts on Windows-based client/server platforms.
- Knowledge of Windows Batch scripting, PowerShell, and Python scripting languages is essential.
- Good understanding of Active Workspace, T4X, and SWIM in addition to Teamcenter server and client administration.
- Good understanding of the Teamcenter data model, including BMIDE code-full and codeless customization.
- Well versed with Teamcenter modules (Query Builder, Structure Manager, Multi-BOM Manager, Teamcenter PLMXML, Workflow Designer, Product Configurator).
- Installation, configuration, administration, and maintenance, including integration deployments with Teamcenter.
- Ability to document architecture / technical specifications.
- Exposure to migration projects / rollouts.
- Highly proficient in server-side customization (ITK).
- Highly proficient in client-side customization (RAC).
- Highly proficient in AWC customization, including style sheets and web services.
- Highly proficient in SOA development.
- Highly proficient in workflow handler development.

This role is based in Mumbai or Pune, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come.

Posted 3 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Pune

Work from Office

Role Description: As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting, and analytics for the Private Bank, to ensure that the necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

Your key responsibilities:
- Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions.
- Partner with service/backend engineers to integrate data provided by legacy IT solutions into your designed databases and make it accessible to the services consuming those data.
- Focus on the design and setup of databases, data models, data transformations (ETL), and critical online banking business processes in the context of Customer Intelligence, financial reporting, and performance controlling.
- Contribute to data harmonization as well as data cleansing.
- Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment.
- Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios.
- Together with your team, run and develop your application self-sufficiently.
- Collaborate with Product Owners and team members on the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
- When you see a process running with high manual effort, fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience (mandatory):
- Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
- Excellent knowledge of SQL and NoSQL databases.
- Experience working in a fast-paced and Agile work environment.
- Working knowledge of a public cloud environment.

Preferred Skills:
- Experience in Dataflow (Apache Beam), Cloud Functions, and Cloud Run.
- Knowledge of workflow management tools such as Apache Airflow/Composer.
- Demonstrated ability to write clear code that is well documented and stored in a version control system (GitHub).
- Knowledge of GCS buckets, Google Pub/Sub, and BigQuery.
- Knowledge of ETL processes in a data warehouse/data lake environment and how to automate them.

Nice to have:
- Knowledge of provisioning cloud resources using Terraform.
- Knowledge of shell scripting.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Knowledge of Google Cloud Monitoring & Alerting.
- Knowledge of Cloud Run, Dataform, and Cloud Spanner.
- Knowledge of the Data Vault 2.0 data warehouse approach.
- Knowledge of New Relic.
- Excellent analytical and conceptual thinking.
- Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams.
- Good communication and experience working with distributed teams (especially Germany + India).

Posted 3 weeks ago

Apply

2.0 - 6.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Description: Zeta Global is seeking a Solutions Associate for our Data Cloud Applications team to drive operational excellence, client support, and solution innovation. This role provides critical leverage to the team by supporting projects related to knowledge sharing, operational execution, and strategic solution enhancement. The Solutions Associate will work closely with Zeta's key partners to help win new business, grow existing accounts, and maintain their competitive edge. They will have the autonomy to develop unique working models that best fit their strengths and workflow preferences while maintaining strong collaboration with the broader Zeta team and client stakeholders. The Solutions Associate will play a key role in informing Zeta's product roadmap by capturing client feedback and identifying opportunities for greater efficiency and effectiveness. Success in this role will be measured by the ability to deliver on critical client requests and contribute meaningfully to client satisfaction and long-term growth.

Roles & Responsibilities:
- Develop a comprehensive understanding of the Zeta Data Cloud Identity Graph, attributes, and signals to support audience curation and data-related inquiries.
- Demonstrate a deep understanding of Zeta's Opportunity Explorer solutions, with the ability to demo these solutions internally and externally.
- Identify strategic opportunities from Data Cloud Intelligence solutions and present actionable findings to client stakeholders during insight readouts.
- Act as a primary point of contact for Data Cloud-related questions from client account teams, providing accurate and timely support.
- Offer strategic recommendations during RFP responses, identifying creative applications of Zeta's identity, intelligence, and activation solutions to differentiate client proposals.
- Train client account teams on how to leverage Data Cloud Intelligence solutions, enhancing client teams' ability to independently utilize platform features.
- Support day-to-day Data Cloud operational requests, ensuring smooth execution of client initiatives.
- Independently kick off and troubleshoot Data Cloud reports, ensuring timely and successful delivery to stakeholders.
- Audit and maintain client accounts, verifying that all requested solutions are accurately loaded and active.
- Capture client needs and feedback that align with the Zeta product roadmap, acting as a liaison between client teams and Zeta's Product team.
- Advocate for client-driven enhancements, ensuring client needs are communicated clearly to influence future platform developments.

Qualifications:
- Thrives in a challenging, fast-paced entrepreneurial environment with real-time impact on day-to-day business, championing a high-agency mindset.
- Highly organized and detail-oriented, with a proven ability to manage multiple projects and prioritize effectively under dynamic conditions.
- Analytical thinker, comfortable with quantitative analysis and data interpretation.
- Translates complex data findings into clear, concise, and compelling narratives tailored to various audiences.
- Creative problem-solver who can think outside the box to develop innovative solutions.
- Collaborative team player with strong independent working skills; self-motivated and dependable in driving initiatives forward.
- Proficient in Excel (VLOOKUPs, Pivot Tables, logic-based queries, data cleaning and filtering).
- Advanced in Microsoft PowerPoint for professional client-facing presentations.

Preferred Qualifications:
- Expert in Microsoft PowerPoint.
- Proficient in Tableau.
- Working understanding of SQL and relational databases.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Veeva Vault. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Offshore Migration Lead, you will oversee and coordinate offshore migration execution into the Veeva Vault platform. You will lead a team of migration specialists, analysts, and BAs, apply hands-on expertise in Vault migrations, SQL, and RDBMS, and collaborate with onshore counterparts to execute plans, manage timelines, resolve issues, and ensure compliance with quality standards.

Roles & Responsibilities:
- Lead and mentor a team of offshore migration specialists handling execution of document and metadata migration tasks.
- Review deliverables and ensure adherence to migration standards, best practices, and compliance expectations.
- Manage work allocation, backlog tracking, and progress reporting for offshore migration tasks.
- Monitor the completion of daily/weekly migration targets, ensuring on-time and accurate delivery.
- Perform root cause analysis on migration errors and coordinate with technical teams to resolve Vault Loader or API issues.
- Validate output quality through spot checks, sampling, and test case validations.
- Provide hands-on support when needed for migration jobs, SQL-driven data transformation, and validation checks.
- Troubleshoot migration errors using Vault logs and work with developers or Vault SMEs to resolve blockers.
- Act as the primary offshore contact for the onshore Migration Lead or Project Manager.
- Ensure the offshore team follows controlled migration procedures and documentation protocols.
- Maintain audit trails, job trackers, and version-controlled artifacts.

Professional & Technical Skills:
- Must-have: hands-on experience with Vault Loader and Vault REST APIs for document and object migration.
- Strong command of SQL for data extraction, transformation, and validation.
- Experience working with CSV, XML, and JSON payloads and migration packaging.
- Strong leadership and coordination skills in an offshore delivery model.
- Excellent communication skills for daily sync-ups, reporting, and issue escalations.
- Attention to detail, quality orientation, and the ability to manage workload under deadlines.
- Familiarity with regulatory requirements in GxP and 21 CFR Part 11 contexts.
- Familiarity with Vault metadata models, document types, lifecycles, and object structures.
- Experience with PromoMats / MedComms / Quality Suite / RIMS / Clinical and other Vault domains.
- Proficiency in working with RDBMSs such as Oracle, SQL Server, PostgreSQL, or MySQL.
- Experience writing complex joins, subqueries, case statements, and data cleansing scripts.
- Familiarity with legacy content/document systems such as Documentum, SharePoint, Calyx Insight, and OpenText.
- Experience leading offshore migration teams for Veeva Vault projects.
- Prior experience in regulated environments (GxP, 21 CFR Part 11) is required.
- A minimum of 3-5 years of experience in Vault migrations is expected.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Computer System Validation (CSV).
- This position is PAN-India based.
- 15 years of full-time education is required.
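Illustrative only (not part of the posting, and not the Vault Loader or Vault API format): a minimal sketch of the kind of pre-load validation and post-migration reconciliation described above, over hypothetical migration CSVs whose column names are assumptions.

```python
import pandas as pd

# Hypothetical migration manifest and target-system export (file and column names are assumptions).
manifest = pd.read_csv("migration_manifest.csv")
target = pd.read_csv("target_export.csv")

# Required metadata fields that every document row must carry before loading.
required = ["document_name", "document_type", "lifecycle_state", "source_file"]

# Validation: flag rows with missing required metadata or duplicate source files.
missing = manifest[manifest[required].isna().any(axis=1)]
duplicates = manifest[manifest.duplicated(subset="source_file", keep=False)]

# Reconciliation: compare migrated counts per document type against the target system's export.
source_counts = manifest.groupby("document_type").size().rename("source")
target_counts = target.groupby("document_type").size().rename("target")
recon = pd.concat([source_counts, target_counts], axis=1).fillna(0).astype(int)
recon["gap"] = recon["source"] - recon["target"]

print(f"Rows with missing required metadata: {len(missing)}")
print(f"Duplicate source files: {len(duplicates)}")
print(recon[recon["gap"] != 0])
```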

Posted 3 weeks ago

Apply

4.0 - 9.0 years

7 - 13 Lacs

Bengaluru

Remote

- Manage the lead acquisition strategy and capture process from sources with accuracy through regular validation, and deliver target leads
- Update the database, track interaction status, onboard new sources, and improve quality
- Analyze data for trends and sales opportunities

Required Candidate profile:
- Enforce data governance policies to ensure compliance
- Leverage CRM (Salesforce/HubSpot) and sales technology to manage and track leads
- Collaborate with marketing/sales teams to align data initiatives

Posted 3 weeks ago

Apply

6.0 - 10.0 years

12 - 18 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job Profile: As a member of the development group, you will become part of a team that develops and maintains one of Coupa's software products, developed using Ruby and React and built as a multi-tenant SaaS solution on cloud platforms such as AWS, Windows Azure, and GCP. We expect that you are a strong leader with extensive technical experience. You have a well-founded analytical approach to finding good solutions, a strong sense of responsibility, and excellent skills in communication and planning. You are proactive in your approach and a strong team player.

What you will do:
- Implement a cloud-native analytics platform with high performance and scalability.
- Build an API-first infrastructure for data in and data out.
- Build data ingestion capabilities for Coupa data as well as external spend data.
- Leverage data classification AI algorithms to cleanse and harmonize data.
- Own data modelling, microservice orchestration, and monitoring & alerting.
- Build solid expertise in the entire Coupa application suite and leverage this knowledge to better design application and data frameworks.
- Adhere to Coupa's iterative development processes to deliver concrete value each release while driving the longer-term technical vision.
- Engage with cross-organizational teams such as Product Management, Integrations, Services, Support, and Operations to ensure the success of overall software development, implementation, and deployment.

What you will bring to Coupa:
- Bachelor's degree in computer science, information systems, computer engineering, systems analysis, or a related discipline, or equivalent work experience.
- 4 to 8 years of experience building enterprise SaaS web applications using one or more modern frameworks/technologies: Java/.Net/C, etc.
- Exposure to Python and familiarity with AI/ML-based data cleansing, deduplication, and entity resolution techniques.
- Familiarity with an MVC framework such as Django or Rails.
- Full-stack web development experience, with hands-on experience building responsive UIs, single-page applications, and reusable components, and a keen eye for UI design and usability.
- Understanding of microservices and event-driven architecture.
- Strong knowledge of APIs and integration with the backend.
- Experience with relational SQL and NoSQL databases such as MySQL, PostgreSQL, AWS Aurora, and Cassandra.
- Proven expertise in performance optimization and monitoring tools.
- Strong knowledge of cloud platforms (e.g., AWS, Azure, or GCP).
- Experience with CI/CD tooling and software delivery and bundling mechanisms.
- Nice to have: expertise in Python and familiarity with AI/ML-based data cleansing, deduplication, and entity resolution techniques.
- Nice to have: experience with Kafka or other pub-sub mechanisms.
- Nice to have: experience with Redis or other caching mechanisms.

Candidate Profile:
- BE/BTech or MCA/BCA with a minimum of 5+ years' experience in Python, Django, and cloud platforms such as AWS, Windows Azure, and GCP.
- Ready for a 6-to-12-month contract role in Bangalore, Hyderabad, or Pune in hybrid mode.
- Can join within 15 days.

Posted 3 weeks ago

Apply

1.0 - 2.0 years

6 - 8 Lacs

Gurugram

Hybrid

Role & responsibilities:

Managed Accounts support:
- Develop a strong understanding of the clinical operations landscape and our Managed Accounts model.

Account Performance Management (under the guidance of senior members):
- Support account performance tracking, including metrics such as Gross Awards, Pipeline, Revenue, and operational KPIs.
- Assist in developing standardized tools and templates to support account planning and performance management.
- Provide research, analysis, and presentation support for strategic account initiatives.

Partnership Management: support partnership management activities, including:
- Learning Management System (LMS) administration and reporting.
- Governance meeting preparation and tracking.
- Partner engagement and documentation support.
- Design, set-up, and launch of surveys, newsletters, etc.
- Responding to ad-hoc analytical requests, ranging from bid proposals to reporting for internal decision-makers.

Reporting & Analytics Enablement:
- Design and build AI/BI solutions to support internal and client-facing reporting needs.
- Support operational data platforms, including Synopsis Rewired and other key reporting systems.
- Assist with technical documentation and continuous improvement of reporting solutions.

Preferred candidate profile:
- Strong proficiency in MS Excel and any reporting tool such as Power BI or Tableau.
- Self-motivated, with strong research and teamworking skills and an analytical, problem-solving mindset.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

18 - 32 Lacs

Indore, Pune, Bengaluru

Hybrid

At Globant, we are working to make the world a better place, one step at a time. We enhance business development and enterprise solutions to prepare them for a digital future. With a diverse and talented team present in more than 30 countries, we are strategic partners to leading global companies in their business process transformation. We seek a Senior Salesforce Business Analyst in Financial Services Cloud (FSC) with mid-level experience who shares our passion for innovation and change. This role is critical to helping our business partners evolve and adapt to consumers' personalized expectations in this new technological era.

Location: Bangalore/Pune/Indore/Ahmedabad/Hyderabad
Experience: 6 to 9 years
Skill: Senior Salesforce Business Analyst

What we expect from you (responsibilities):
1. Support Solution Architects in Salesforce projects (Service Cloud).
2. Serve as a trusted advisor to Globant clients.
3. Demonstrate knowledge of Salesforce and relevant business processes.
4. Elicit requirements and document user stories.
5. Be responsible for documentation, including requirements, user stories, designs, deployment plans, etc.
6. Support system test and user acceptance test activities.
7. Complete hands-on configuration activities.
8. Find creative solutions in order to successfully implement requirements.
9. Participate in and build relationships within the Salesforce community.
10. Salesforce Knowledge: Deep understanding of Salesforce platform features, functionalities, and capabilities. Familiarity with Salesforce objects, fields, workflows, automation, security, and reporting.
11. Configuration and Customization: Ability to configure Salesforce settings and customize the platform to meet business requirements. This includes creating and managing custom objects, fields, page layouts, validation rules, workflows, Process Builder, and customizing user interfaces.
12. Data Management: Proficiency in data management tasks, such as data import/export, data cleansing, data deduplication, and data security. Knowledge of data migration tools and best practices.
13. User Management: Experience in managing user profiles, roles, permission sets, and sharing settings. Understanding of the Salesforce security model and best practices for user access and data visibility.
14. Reporting and Dashboards: Ability to create and customize reports and dashboards to provide actionable insights to stakeholders. Knowledge of report types, report formulas, report filters, and dashboard components.

Interested candidates can share their CV at madhavi.jaju@globant.com.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

3 - 5 Lacs

Bengaluru

Work from Office

CAT Modelling: 2+ years' experience in CAT modelling. Up to 7 LPA. Immediate joiners to 30 days. Flexible with shifts. CAT modelling expertise in RMS RiskLink / AIR Touchstone, SQL, and MS Office (Word, Excel, and Access). Contact: Karishma.imaginator@gmail.com

Required Candidate profile: Preparing exposure data for multiple lines of business; data cleansing, enhancing, and analysis of COPE information; working on complex and large datasets using MS Excel; good communication skills.

Posted 3 weeks ago

Apply

10.0 - 16.0 years

20 - 30 Lacs

Hyderabad

Work from Office

You will work with insurance clients to gather data and analytics requirements, and then help design and develop solutions to provide business insights and manage business performance. The person will be responsible for gathering requirements from customers, analyzing those requirements, and assisting in the implementation of our in-house insurance data platform. The person must have insurance product and process knowledge and experience.
- Must have 8 years of Life or P&C insurance experience.
- Must have 8 years of data, reporting, and analytics requirements-gathering experience, with a heavy emphasis on data integration, reporting, and analytics.
- Strong experience/exposure to one or more of Microsoft, Oracle, Amazon, or Google data reporting and management tools is a plus.
- 3 years of data profiling and data cleansing is a plus.
- Possess comprehensive knowledge of insurance products and processes.
- Demonstrated ability to liaise closely with business and IT leadership to establish, defend, and persuade on implementing recommended solutions.
- Strong communication, organizational, interpersonal, and time management skills.
- Must work independently with minimal direction and have the ability to deal with a dynamic environment.
- Experience leading others through the full development life cycle.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

16 - 20 Lacs

Mumbai

Work from Office

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with year-2 statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience in the following coding languages: 1. R or Python (data cleaning, statistical, and modelling packages). 2. SQL. 3. VBA and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety and the Environment.
- Apply GET, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills: Analytical skills, negotiation, convincing skills.
Key Competencies: Never-give-up attitude, flexible, eye for detail.
Experience: Minimum 8 years of experience.
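Illustrative only (not part of the posting): a minimal sketch of the kind of statistical modelling listed above, fitting an ordinary least squares regression on invented VMI-style data and reporting the usual significance test.

```python
import numpy as np
from scipy import stats

# Hypothetical data: weekly order volume vs. stock-out count for a VMI programme (values invented).
rng = np.random.default_rng(0)
order_volume = rng.normal(1_000, 150, size=52)
stockouts = 0.02 * order_volume + rng.normal(0, 3, size=52)

# Ordinary least squares fit with the standard t-test on the slope.
result = stats.linregress(order_volume, stockouts)

print(f"slope={result.slope:.4f}, r^2={result.rvalue**2:.3f}, p-value={result.pvalue:.3g}")
# A small p-value would suggest order volume is a meaningful predictor of stock-outs in this
# toy example; real work would validate assumptions and compare against richer models.
```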

Posted 3 weeks ago

Apply

6.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Hybrid

Eligibility: Apply only if you are based in Bangalore and can join within 2 weeks (immediate to 2 weeks' notice period only).

Job Title: Data Analyst / Data Governance Specialist
Location: Bangalore (Hybrid, 3 days in office per week)

Key Responsibilities:
- Profile and analyze new/existing data sources for structure, content, quality, and consumption readiness.
- Conduct deep investigations into data lineage, anomalies, and inconsistencies.
- Work with Data Engineering and SMEs to document datasets, tables, fields, and business definitions.
- Perform metadata quality checks and enrich both business and technical metadata.
- Support logical and physical data model development and recommend schema improvements.
- Build dashboards/reports highlighting data quality, metadata completeness, and coverage metrics.

Requirements:
- Strong understanding of data profiling, metadata, and governance concepts.
- Hands-on experience in SQL, data modeling, and reporting tools.
- Analytical mindset with keen attention to detail.
- Ability to collaborate across data engineering, business, and governance teams.

Work Mode: Hybrid (Bangalore, 3 days in office per week)

Interested candidates can share their profiles at Vijay.S@xebia.com with the below details: Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period (immediate to 2 weeks only; apply if you can join early), Current Location, Preferred Location, LinkedIn Profile URL.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
