
1649 Data Processing Jobs - Page 23

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

9.0 - 10.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Source: Naukri

Overview
Execute Business Insights & Analytics responsibilities for the PepsiCo Europe Beverages Sector team as part of the broader Global Business Services function in Hyderabad, India. This role helps enable accelerated growth for PepsiCo by contributing to the Europe Beverages Sector team while working alongside the consumer marketing team to provide an integrated, holistic overview to the business. Primary responsibilities include creating and updating dashboards and Excel/Power BI reports, delivering periodic and on-demand brand reporting, and addressing ad-hoc requests based on internal and external data sources. In the short term, the role covers knowledge transfer from the business and flawless delivery of recurring reports. Once established, the role will optimize the data-based Insights & Analytics processes, including ad-hoc questions and overall automation of delivery where applicable.

Responsibilities
1. Build strong Business Insights & Analytics
- Execute market, portfolio, brand & promotion campaign performance reporting (using dashboards, templated decks, and reporting tools)
- Analyze and report category, brand & promotion performance drivers and optimization opportunities
- Bring impactful insights to the BU by integrating and leveraging multiple data sources such as internal sales and agency data (RMS, HHP, etc.)
- Translate complex data findings into actionable insights and strategic recommendations for decision-making
- Assist the team in analysing marketing expenses and budgets for better utilization of marketing investments
- Manage ad-hoc and follow-up deep dives into the data to address tactical performance issues and challenges
- Collaborate with stakeholders to develop analyses and reports offering strategic plans
2. Build strong Data Processing & Automation
- Integrate and optimize data sets and reporting systems to manage heavy data processing for routine reporting
- Explore automation opportunities with a higher focus on developing significant insights for the marketing teams
- Speed up business intelligence and insights for timely and impactful decision-making
- Help implement and automate Pan-Europe Quarterly Business Reviews
- Implement innovative solutions to enhance data analysis capabilities and efficiency

Qualifications
- 9-10 years of experience in Analytics with exposure to Global Fortune 500 FMCG companies
- Ability to work and think independently
- Good analytics and insights experience: end-to-end understanding of the best research approach
- Can synthesize multiple, disparate data sources into compelling growth strategies; formulates a strong POV, can articulate future scenarios, and is an exceptional storyteller
- Strong collaborator, interested in and motivated by working with others; actively creates and participates in opportunities to co-create solutions across markets or brands; willing and able to embrace Responsive Ways of Working
- Proven analytics, data research, consumer insights, or commercial experience in combination with strong analytical skills
- Good familiarity with CPG and Food & Beverage industry data sources, including Nielsen (POS and HH panel) and Kantar Worldpanel
- Deep understanding of FMCG industry business performance outputs and causal measures, their relationships, and how to bring business performance insights to life visually
- Proficient with PowerPoint and advanced Excel, including the ability to write complex formulas and create macros and dashboards
- Good to have: experience with Power BI and statistical analysis tool(s)
- Operational experience in the business services sector and/or consulting experience would be a plus
- Fluent English communication skills; confident and credible with senior stakeholders
- Strong storytelling and presentation skills to turn data into impactful insight and brand strategy that can drive the business forward

Posted 2 weeks ago

Apply

0.0 - 4.0 years

1 - 1 Lacs

Pune

Hybrid

Source: Naukri

We are looking for a Computer Operator who can perform defined tasks per documented instructions/processes. Both male and female candidates may apply; freshers and experienced candidates are welcome. Basic computer knowledge is a must, and the candidate should be hardworking.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

8 - 12 Lacs

Mumbai

Work from Office

Source: Naukri

Normally receives general direction/instructions on new assignments. Conducts surveys to ascertain the locations of natural features and human-made structures on the earth's surface, underground, and underwater, using electronic distance-measuring equipment and other surveying instruments. Operates and manages land-information computer systems, performing tasks such as storing data, making inquiries, and producing plots and reports. Reviews information from survey teams regarding measurement of distances, directions, angles between points, and elevation of points, lines, and contours on, above, and below the earth's surface. Researches legal records, looks for evidence of previous boundaries, and analyzes the data to determine the location of boundary lines. Records the results of surveys, verifies the accuracy of data, and prepares plots, maps, and reports. An experienced professional with full understanding of the area of specialization. Works on complex problems of diverse scope.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

Req ID: 327855
We are currently seeking a Python Django Microservices Lead to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties / Responsibilities:
- Lead the development of backend systems using Django
- Design and implement scalable and secure APIs
- Integrate Azure Cloud services for application deployment and management
- Utilize Azure Databricks for big data processing and analytics
- Implement data processing pipelines using PySpark
- Collaborate with front-end developers, product managers, and other stakeholders to deliver comprehensive solutions
- Conduct code reviews and ensure adherence to best practices
- Mentor and guide junior developers
- Optimize database performance and manage data storage solutions
- Ensure high performance and security standards for applications
- Participate in architecture design and technical decision-making

Minimum Skills Required / Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 8+ years of experience in backend development
- 8+ years of experience with Django
- Proven experience with Azure Cloud services
- Experience with Azure Databricks and PySpark
- Strong understanding of RESTful APIs and web services
- Excellent communication and problem-solving skills
- Familiarity with Agile methodologies
- Experience with database management (SQL and NoSQL)

Skills: Django, Python, Azure Cloud, Azure Databricks, Delta Lake and Delta tables, PySpark, SQL/NoSQL databases, RESTful APIs, Git, and Agile methodologies

Posted 2 weeks ago

Apply

8.0 - 10.0 years

12 - 16 Lacs

Noida, Pune, Chennai

Work from Office

Source: Naukri

Job Title: Lead Data Scientist

We are seeking a highly skilled and experienced Lead Data Scientist to join our dynamic team. In this role, you will be responsible for leading data-driven projects, mentoring junior data scientists, and guiding the organization in making strategic decisions based on data insights.

Key Responsibilities:
- Develop and implement advanced statistical models and algorithms to analyze complex data sets.
- Collaborate with cross-functional teams to identify business opportunities and translate them into data-driven solutions.
- Mentor and oversee a team of data scientists, providing guidance on best practices and techniques in data analysis and modeling.
- Communicate findings and insights to stakeholders through presentations, reports, and visualizations.
- Stay current with industry trends and emerging technologies in data science and analytics.
- Design and implement experiments to validate models and hypotheses.
- Ensure the quality and integrity of data throughout the analytic process.

Qualifications:
- Master's degree in Computer Science, Statistics, Mathematics, or a related field.
- Proven experience as a Data Scientist, with a strong portfolio of successful projects.
- Expertise in programming languages such as Python or R, as well as experience with machine learning frameworks.
- Strong knowledge of statistical analysis and modeling techniques.
- Excellent problem-solving skills and the ability to work with large and complex data sets.
- Strong communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- Experience in leading and managing teams is a plus.

We offer a competitive salary, comprehensive benefits, and the opportunity to work in a collaborative and innovative environment. If you are passionate about data science and eager to make a significant impact, we would love to hear from you.

Roles and Responsibilities:
1. Leading the design and implementation of advanced machine learning models and algorithms to address complex business problems.
2. Collaborating with cross-functional teams to define data strategies and identify opportunities for data-driven decision-making.
3. Overseeing data collection, cleaning, and preprocessing to ensure high-quality datasets for analysis and modeling.
4. Mentoring and guiding junior data scientists and analysts, fostering a culture of continuous learning and innovation within the team.
5. Communicating findings and insights to stakeholders through visualizations, presentations, and reports, ensuring clarity and understanding of complex analyses.
6. Staying current with industry trends, tools, and technologies in data science, and applying best practices to enhance team capabilities and project outcomes.
7. Developing and maintaining scalable data pipelines and architectures to support large-scale data processing.
8. Evaluating the performance of statistical models and machine learning algorithms to ensure accuracy and effectiveness in real-world applications.
9. Collaborating with IT and engineering teams to deploy models into production environments and monitor their performance.
10. Contributing to the strategic direction of the data science team and advising leadership on data-driven opportunities and potential risks.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

20 - 25 Lacs

Noida

Work from Office

Source: Naukri

Technical Requirements
- SQL (advanced level): strong command of complex SQL logic, including window functions, CTEs, and pivot/unpivot; proficient in stored procedure/SQL script development; experience writing maintainable SQL for transformations
- Python for ETL: ability to write modular and reusable ETL logic using Python; familiarity with JSON manipulation and API consumption
- ETL pipeline development: experienced in developing ETL/ELT pipelines, data profiling, validation, quality/health checks, error handling, logging and notifications, etc.

Nice-to-Have Skills
- Experience with AWS Redshift, Databricks, and Yellowbrick
- Knowledge of CI/CD practices for data workflows

Key Responsibilities
- Collaborate with analysts and data architects to develop and test ETL pipelines using SQL and Python in Databricks and Yellowbrick
- Perform related data quality checks and implement validation frameworks
- Optimize queries for performance and cost-efficiency

Roles and Responsibilities
- Leverage expertise in AWS Redshift, PostgreSQL, Databricks, and Yellowbrick to design and implement scalable data solutions
- Partner with data analysts and architects to build and test robust ETL pipelines using SQL and Python
- Develop and maintain data validation frameworks to ensure high data quality and reliability
- Optimize database queries to enhance performance and ensure cost-effective data processing
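The "Python for ETL" requirement above can be sketched with a toy example of modular, reusable ETL logic with validation, error handling, and logging. The field names (`id`, `amount`) and the validation rule are invented for illustration, not taken from the posting:

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def validate_record(rec):
    # Hypothetical quality check: non-empty id and a numeric amount.
    return bool(rec.get("id")) and isinstance(rec.get("amount"), (int, float))

def transform(rec):
    # Example transformation: normalize the id and round the amount.
    return {"id": str(rec["id"]).strip().upper(),
            "amount": round(float(rec["amount"]), 2)}

def run_pipeline(raw_json):
    """Extract -> validate -> transform, collecting rejects instead of crashing."""
    records = json.loads(raw_json)
    good, bad = [], []
    for rec in records:
        if validate_record(rec):
            good.append(transform(rec))
        else:
            bad.append(rec)
            log.warning("rejected record: %r", rec)
    return good, bad

raw = '[{"id": " a1 ", "amount": 10.456}, {"id": "", "amount": 5}]'
good, bad = run_pipeline(raw)
# good -> [{'id': 'A1', 'amount': 10.46}]; one record rejected
```

Collecting rejects instead of raising keeps one bad record from failing the whole batch, which is the usual shape of the data quality/health checks the posting mentions.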

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 5 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Proof of Concept (POC) development: develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions
- Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another
- Document solution architectures, design decisions, implementation details, and lessons learned
- Stay up to date with the latest trends and advancements in AI, foundation models, and large language models
- Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation

Preferred technical and professional experience
- Experience and working knowledge of COBOL & Java would be preferred
- Experience in code generation, code matching & code translation leveraging LLM capabilities would be a big plus
- A growth mindset to understand clients' business processes and challenges

Posted 2 weeks ago

Apply

1.0 - 5.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

We are looking for a Junior Data Engineer with 1-3 years of experience, primarily focused on database management and data processing using MySQL. The candidate will support the data engineering team in maintaining reliable data pipelines.

Posted 2 weeks ago

Apply

6.0 - 7.0 years

6 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies built on the platform
- Develop streaming pipelines
- Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services

Required technical and professional expertise
- 6-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering
- Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on cloud data platforms on Azure; experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server DB
- Good to excellent SQL skills

Preferred technical and professional experience
- Certification in Azure and Databricks, or Cloudera Spark certified developer
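As a rough illustration of the ingest-process-aggregate shape such Spark pipelines take, here is a stdlib-only stand-in; real work would use PySpark DataFrames, and the city/sales data below is made up:

```python
from collections import defaultdict

# Toy stdlib stand-in for a Spark aggregation. In PySpark this would be
# roughly df.groupBy("city").agg(F.sum("sales")); here we fold a list of
# dicts the same way to show the shape of the transformation.
def aggregate_sales(rows):
    totals = defaultdict(float)
    for row in rows:
        totals[row["city"]] += row["sales"]
    return dict(totals)

rows = [
    {"city": "Bengaluru", "sales": 120.0},
    {"city": "Pune", "sales": 80.0},
    {"city": "Bengaluru", "sales": 30.0},
]
# aggregate_sales(rows) -> {'Bengaluru': 150.0, 'Pune': 80.0}
```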

Posted 2 weeks ago

Apply

3.0 - 6.0 years

2 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Establish and implement best practices for DBT workflows, ensuring efficiency, reliability, and maintainability
- Collaborate with data analysts, engineers, and business teams to align data transformations with business needs
- Monitor and troubleshoot data pipelines to ensure accuracy and performance
- Work with Azure-based cloud technologies to support data storage, transformation, and processing

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Strong MS SQL and Azure Databricks experience
- Implement and manage data models in DBT, ensuring data transformations align with business requirements
- Ingest raw, unstructured data into structured datasets in a cloud object store
- Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting
- Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance

Preferred technical and professional experience
- Establish best DBT processes to improve performance, scalability, and reliability
- Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Databricks
- Proven interpersonal skills while contributing to team effort by accomplishing related results as required

Posted 2 weeks ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Chennai

Work from Office

Source: Naukri

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role Purpose: The purpose of the role is to define and develop Enterprise Data Structure along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do
1. Define and develop Data Architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and to protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by the clients
d. Engage all stakeholders to implement data governance models and ensure that implementation is done based on every change request
e. Ensure that data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with data modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and least manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional & technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, in order to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set up a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from technology, cost structure, and customer differentiation points of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness
c. Collect all structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all databases based on accessibility and technology
g. Maintain a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with the client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions.
Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 17 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Immediate hiring for UiPath Automation in a leading MNC company

Position: UiPath Automation
Experience: 5+ years
Location: Bangalore/Pune/Hyderabad/Chennai/Gurgaon
Notice Period: Immediate
Hybrid: 3 days WFO

Requirements:
- 4-5 years of hands-on experience in UiPath development
- Strong experience in Salesforce automation using UiPath (UI interaction, data processing; API experience preferred)
- Good understanding of UiPath REFramework, exception handling, and reusable components
- Exposure to support and enhancement work, and willingness to shift from development to support mode
- Strong understanding of Orchestrator, including managing schedules, assets, queues, and logs
- Ability to work independently in a second-shift environment with limited oversight
- Excellent analytical, problem-solving, and communication skills

Responsibilities:
- Design, develop, test, and deploy RPA solutions using UiPath for Salesforce automation
- Automate complex processes involving Salesforce UI, data entry, validations, report generation, etc.
- Collaborate with business and technical teams to understand automation needs and deliver scalable solutions
- Follow UiPath best practices, especially REFramework, for modular and maintainable development

Posted 2 weeks ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Jaipur

Work from Office

Source: Naukri

Job Summary:
We are seeking a proactive and detail-oriented Lead Support Executive. The ideal candidate will be responsible for generating leads, reaching out to potential clients through various channels, maintaining accurate records, and supporting client engagement efforts. This role plays a key part in building and maintaining our sales pipeline.

Key Responsibilities:
- Send professional email communications to potential and existing clients.
- Reach out to prospects via LinkedIn messages and maintain follow-ups.
- Identify relevant companies and fill out their website inquiry/contact forms.
- Maintain and update lead and client information in Excel sheets.
- Generate and qualify new business leads through online research and outreach.
- Schedule and coordinate meetings between clients and the internal team.
- Assist in tracking responses, follow-ups, and meeting outcomes.
- Support the business development team with any additional administrative tasks.

Requirements:
- Bachelor's degree.
- Strong written and verbal communication skills in English.
- Proficient in Microsoft Excel, Google Sheets, and basic online tools.
- Ability to manage multiple tasks with attention to detail.
- Comfortable working independently as well as part of a team.
- Basic knowledge of LinkedIn and email communication etiquette.

Preferred Skills:
- Experience in client outreach or lead generation roles is a plus.
- Good time management and organizational skills.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Gurugram

Work from Office

Source: Naukri

Alethe Labs is looking for a Data Scientist to join our dynamic team and embark on a rewarding career journey.
- Identify valuable data sources and automate collection processes
- Undertake collection and preprocessing of structured and unstructured data
- Analyze large amounts of information to discover trends and patterns
- Build predictive models and machine-learning algorithms to address business problems
- Combine models through ensemble modeling
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams
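The "combine models through ensemble modeling" duty above can be illustrated with the simplest combiner, a majority vote over several classifiers' predictions; the models and labels below are entirely hypothetical:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model predictions (one list per model) by majority vote.

    Ties are broken in favour of the label Counter encounters first."""
    combined = []
    for votes in zip(*predictions):  # one tuple of votes per sample
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Three hypothetical models' labels for four samples:
model_a = ["spam", "ham", "spam", "ham"]
model_b = ["spam", "spam", "spam", "ham"]
model_c = ["ham",  "ham",  "spam", "ham"]
print(majority_vote([model_a, model_b, model_c]))
# -> ['spam', 'ham', 'spam', 'ham']
```

Voting is only one ensembling strategy; averaging predicted probabilities or stacking a meta-model on top are common alternatives.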

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Strong Python skills, with experience in Jupyter notebooks and packages such as polars, pandas, numpy, scikit-learn, and matplotlib.

Must have:
Experience with the machine learning lifecycle, including data preparation, training, evaluation, and deployment.
Hands-on experience with GCP services for ML & data science.
Experience with Vector Search and Hybrid Search techniques.
Experience with embeddings generation using models like BERT, Sentence Transformers, or custom models.
Experience in embedding indexing and retrieval (e.g., Elastic, FAISS, ScaNN, Annoy).
Experience with LLMs and use cases like RAG (Retrieval-Augmented Generation).
Understanding of semantic vs. lexical search paradigms.
Experience with Learning to Rank (LTR) techniques and libraries (e.g., XGBoost, LightGBM with LTR support).
Awareness of evaluation metrics for search relevance (e.g., precision@k, recall, nDCG, MRR).

Should have:
Proficiency in SQL and BigQuery for analytics and feature generation.
Experience with Dataproc clusters for distributed data processing using Apache Spark or PySpark.
Experience deploying models and services using Vertex AI, Cloud Run, or Cloud Functions.
Comfort working with BM25 ranking (via Elasticsearch or OpenSearch) and blending it with vector-based approaches.
Understanding of how to build end-to-end ML pipelines for search and ranking applications.
Exposure to CI/CD pipelines and model versioning practices.

Good to have:
Familiarity with Vertex AI Matching Engine for scalable vector retrieval.
Familiarity with TensorFlow Hub, Hugging Face, or other model repositories.
Experience with prompt engineering, context windowing, and embedding optimization for LLM-based systems.
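The search-relevance metrics named above (precision@k and nDCG) are straightforward to compute by hand; a minimal sketch over a hypothetical four-document ranking with graded relevance:

```python
import math

def precision_at_k(relevant, ranked, k):
    """Fraction of the top-k ranked items that are relevant."""
    top_k = ranked[:k]
    return sum(1 for item in top_k if item in relevant) / k

def dcg_at_k(gains, k):
    """Discounted cumulative gain over the first k positions (log2 discount)."""
    return sum(g / math.log2(i + 2) for i, g in enumerate(gains[:k]))

def ndcg_at_k(relevance_by_item, ranked, k):
    """DCG of the ranking divided by DCG of the ideal ranking."""
    gains = [relevance_by_item.get(item, 0) for item in ranked]
    ideal = sorted(relevance_by_item.values(), reverse=True)
    idcg = dcg_at_k(ideal, k)
    return dcg_at_k(gains, k) / idcg if idcg > 0 else 0.0

# Hypothetical query: docs d1, d2, d4 have graded relevance; the non-relevant
# d3 was (wrongly) ranked first.
rels = {"d1": 3, "d2": 2, "d4": 1}
ranking = ["d3", "d1", "d2", "d4"]
print(precision_at_k(set(rels), ranking, 2))   # 0.5 -- only d1 is relevant in the top 2
print(round(ndcg_at_k(rels, ranking, 4), 3))
```

In practice these come from evaluation libraries (e.g., scikit-learn's `ndcg_score`), but knowing the arithmetic helps sanity-check ranking experiments.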

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Mumbai

Work from Office


Data Validation (DV) Specialist (Using SPSS) - Team Leader

Job Description:
Perform data quality checks and validation on market research datasets.
Develop and execute scripts and automated processes to identify data anomalies.
Collaborate with the Survey Programming team to review survey questionnaires and recommend efficient programming and an optimal layout that enhances user experience.
Investigate and document data discrepancies, working with the survey programming team and data collection vendors as needed.
Collaborate with Survey Programmers and internal project managers to understand survey requirements and provide guidance on quality assurance best practices.
Provide constructive feedback and suggestions for improving data quality, aiming to enhance overall survey quality.
Automate data validation processes where possible to enhance efficiency and reduce time spent on repetitive validation tasks.
Maintain thorough documentation of findings and recommendations to ensure transparency and consistency in quality practices.
Actively participate in team meetings to discuss project developments, quality issues, and improvement strategies, fostering a culture of continuous improvement.
Manage the pipeline and internal/external stakeholder expectations.
Train and mentor junior team members.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field.
At least 4 years of experience in data validation processes.
Familiarity with data validation using SPSS, Dimension, Quantum, or similar platforms.
A proactive team player who thrives in a fast-paced environment and enjoys the repetitive tasks that contribute to project excellence.
Programming knowledge in a major language such as R, JavaScript, or Python, with an interest in building automation scripts for data validation.
Excellent problem-solving skills and a willingness to learn innovative quality assurance methodologies.
A desire for continuous improvement, focusing on creating efficiencies that lead to scalable and high-quality data processing outcomes.

Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
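The "automate data validation" responsibility above might look like the following minimal Python sketch. The field names, ranges, and rules are invented for illustration; the role's actual checks would run against SPSS/Dimension/Quantum exports:

```python
# Hedged sketch of an automated validation pass over survey records:
# range checks, value-set checks, and a simple skip-logic check.

def validate(records):
    """Return a list of (row_index, field, problem) tuples."""
    issues = []
    for i, row in enumerate(records):
        if row.get("age") is None or not (18 <= row["age"] <= 99):
            issues.append((i, "age", "out of range or missing"))
        if row.get("q1") not in {1, 2, 3, 4, 5}:
            issues.append((i, "q1", "invalid Likert value"))
        # Hypothetical skip logic: q2 should only be answered when q1 >= 4
        if row.get("q2") is not None and row.get("q1", 0) < 4:
            issues.append((i, "q2", "answered despite skip condition"))
    return issues

data = [
    {"age": 34, "q1": 5, "q2": 2},    # clean record
    {"age": 17, "q1": 6, "q2": None}, # two problems: age range, q1 value
]
print(validate(data))
```

Each finding carries the row, field, and problem so it can feed directly into the validation reports the posting describes.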

Posted 2 weeks ago

Apply


5.0 - 10.0 years

7 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Catalyst Clinical Research provides customizable solutions to the biopharmaceutical and biotechnology industries through a full-service oncology CRO and through multi-therapeutic global functional and CRO services. The company's customer-centric flexible service model, innovative technology, expert team members, and global presence advance clinical studies.

The Machine Learning Engineer is a pivotal contributor responsible for designing and implementing cutting-edge machine learning solutions with a focus on generative AI technologies. You will drive the development and deployment of advanced models and pipelines that enable the creation of AI-driven applications and enhance organizational decision-making capabilities. Additionally, you will support data engineering initiatives to enable utilization of data across the organization. Collaborating closely with internal and external stakeholders, you will translate complex requirements into innovative solutions that advance Catalyst's AI strategies while ensuring alignment with broader enterprise goals.

Position Responsibilities/Accountabilities:
Design, build, and optimize machine learning workflows, with a focus on generative AI models such as large language models (LLMs) and diffusion-based architectures.
Develop and deploy scalable machine learning pipelines using frameworks like TensorFlow, PyTorch, and Databricks MLflow.
Develop AI solutions using tools like Azure AI/Copilot Studio and Databricks AI Builder.
Lead the creation of domain-specific generative AI models, ensuring ethical AI practices and bias mitigation throughout the model lifecycle.
Design, build, and maintain scalable data pipelines with Delta Live Tables for model integration into enterprise applications.
Enhance and expand CI/CD strategies for automated testing, model monitoring, and continuous delivery of ML artifacts.
Manage data preprocessing, feature engineering, and synthetic data generation for machine learning use cases.
Collaborate with cross-functional teams to align AI-driven solutions with business goals and ensure high availability of end-to-end systems.
Provide technical expertise in the exploration of novel generative AI methods, tools, and frameworks.
Support team members in understanding data science and AI best practices, encouraging a culture of innovation and continuous learning.
Represent AI as a key member of the Data & Architecture Review Committee.

Position Qualification Requirements:
Education: B.S. or M.S. in Computer Science, Engineering, Economics, Mathematics, a related field, or relevant experience.

Experience:
5+ years of experience in machine learning engineering, including model development and deployment.
Hands-on experience with generative AI models (e.g., GPT, GANs, VAEs) and frameworks like PyTorch or TensorFlow.
5+ years of experience with cloud computing technologies (Azure, AWS, GCP), especially AI and ML services.
Proficiency in developing data pipelines and integrating ML models into production environments.
Expertise in model evaluation and monitoring, including techniques for explainability and fairness in AI.
Experience collaborating with DevOps and MLOps teams to ensure scalability and reliability of AI solutions.
Familiarity with project management tools such as JIRA.

Required Skills:
Advanced proficiency in Python or PySpark for ML applications.
Deep understanding of generative AI principles, model architectures, and training methodologies.
Expertise in large-scale data processing and engineering using Spark, Kafka, and Databricks.
Proficiency with big data technologies and data formats such as Delta, Parquet, YAML, JSON, and HTML.
Strong knowledge of cloud-based AI platforms (e.g., Databricks, Azure ML).
Solid understanding of machine learning pipelines and MLOps practices.
Exceptional problem-solving and analytical skills.
Ability to manage priorities and workflow effectively.
Proven ability to handle multiple projects and meet tight deadlines.
Strong interpersonal skills with an ability to work collaboratively across teams.
Commitment to excellence and high standards.
Creative, flexible, and innovative team player.
Ability to work independently and as part of various committees and teams.

Nice to Have:
Data engineering experience, including webhooks, APIs, ELT/ETL, rETL, data lakehouse architecture, and event-driven architectures.
Familiarity with deep learning frameworks for generative AI (e.g., Hugging Face Transformers).
Knowledge of synthetic data generation techniques and tools.
Experience with data visualization tools (e.g., Tableau, Power BI) for AI model interpretability.
Familiarity with ethical AI principles, including explainability and bias reduction strategies.
Experience with containerization and orchestration tools like Docker and Kubernetes.
Background or familiarity with clinical trials or pharmaceutical development.

Working Hours:
Every day: 1:30 PM - 9:00 PM IST, OR
Monday, Wednesday, Friday: 2:30 PM - 10:30 PM IST; Tuesday, Thursday: 9:00 AM - 5:00 PM IST.
Note: Working hours may vary based on individual seniority, business demand, and ability to work independently. This will be evaluated on a case-by-case basis.
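At its core, the embedding-and-retrieval work this role touches (RAG-style search) reduces to nearest-neighbor lookup by cosine similarity. A toy sketch with hand-made three-dimensional "embeddings"; real systems would use model-generated vectors and an index such as FAISS, and the documents here are invented:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def retrieve(query_vec, corpus, k=1):
    """Return the k corpus entries whose embeddings are closest to the query."""
    scored = sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in scored[:k]]

docs = [
    {"text": "trial enrollment criteria", "vec": [0.9, 0.1, 0.0]},
    {"text": "adverse event reporting",   "vec": [0.1, 0.9, 0.2]},
]
print(retrieve([0.8, 0.2, 0.0], docs, k=1))  # -> ['trial enrollment criteria']
```

The retrieved passages would then be injected into an LLM prompt, which is the "retrieval-augmented" part of RAG.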

Posted 2 weeks ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Mumbai

Work from Office


About this role

VP Intellectual Property Counsel (Patents, Trade Secrets, Open-Source)

Job Description:

Your team
Elevate your career by joining the world's largest asset manager! Thrive in an environment that fosters positive relationships and recognizes outstanding performance! We know you want to feel valued every single day and be recognized for your contribution. At BlackRock, we strive to empower our employees and effectively engage your involvement in our success. With over USD $11.6 trillion in AUM, we have an outstanding responsibility: our technology and services empower millions of investors to save for retirement, pay for college, buy a home, and improve their financial well-being. BlackRock is one of the largest, most sophisticated global investment management firms and a leading provider of financial technology solutions to clients worldwide, including its class-leading Aladdin investment management platform for institutional investors, and the Aladdin Wealth and Advisor Center analytics platforms for financial advisors. Our Digital Enterprise Legal team provides legal support to BlackRock's revenue-generating technology business lines and financial advisor and institutional client engagement and education platforms; supports BlackRock's data, technology, and markets infrastructure; and enables innovation by managing the company's intellectual property strategy and assets.

Your role and impact
The IP Legal team within BlackRock's Digital Enterprise Legal team is seeking an experienced, business-minded patent attorney to join our team. You will partner closely with our product engineering and business teams globally in a fast-paced, cutting-edge environment to strategically protect BlackRock's growing intellectual property assets. You will help protect our innovations by counseling teams on patents, trade secrets, publications, and open-source software licensing.
You will ensure that BlackRock's patent strategy aligns with its business objectives by learning our businesses, counseling engineers, harvesting inventions, and partnering with external counsel to prosecute patents. You will be based in Mumbai, India or Mexico City, Mexico.

Your responsibilities
Helping to drive and develop BlackRock's patent strategy based on the company's strategic goals and competitive position.
Working closely with BlackRock's engineers and product teams to identify and harvest inventions capable of IP protection, and working with external counsel and other service providers to file, prosecute, and maintain patents.
Cultivating an IP-aware culture by counseling and educating internal partners on intellectual property issues, including patents, trade secrets, clean-room development, and open-source software licensing.
Working with external counsel to analyze and respond to third-party patent demands and licensing opportunities.
Conducting freedom-to-operate analyses to assess potential patent risks associated with product development and commercialization, including patent landscape analyses, evaluating claim scope and validity, and advising on risk mitigation strategies.
Actively contributing to BlackRock's overall IP strategy, including by refining our patent and trade secret-related policies and procedures.
Staying abreast of and communicating patent and trade secret legal developments that could impact BlackRock's business.
Advising on IP aspects of commercial agreements, including joint development agreements and technology transfers.
Working cross-functionally with security and engineering teams to advise on open-source software and open-source AI/LLM use, licensing, and compliance strategies, including developing efficient processes to enable compliant open-source use in BlackRock's products at scale.
Providing clean-room development guidance to ensure the protection and integrity of BlackRock products, including establishing protocols and conducting reviews to prevent unauthorized use of third-party IP.

You have
At least 7 to 10 years of patent prosecution experience in a top-tier law firm or in-house legal department.
Admission to the United States Patent and Trademark Office (USPTO) Bar and membership in good standing in at least one U.S. state bar.
Experience in strategically building patent portfolios, including harvesting inventions, counseling engineers and business teams, and prosecuting patents.
A degree or equivalent experience in a technical field, preferably Engineering, Computer Science, or Computer Engineering.
Experience in fintech, AI, machine learning, data processing, and/or software patents.
Strong knowledge of open-source licenses and related legal issues.
Experience in the asset management industry or a related financial industry is a plus, though not required.
Ability to work across time zones to support BlackRock innovators globally, including around 25% of working hours for meetings with US-based inventors and US-based external counsel.
Initiative, attention to detail, and a collaborative working style.
Excellent interpersonal and communication skills, including the ability to effectively and concisely communicate complex legal issues to a non-legal audience orally and in writing, and to be a dedicated business partner to internal stakeholders.
Strong organizational skills and an ability to manage multiple competing, and often evolving, priorities and deadlines.

Our benefits
Our hybrid work model
About BlackRock
This mission would not be possible without our smartest investment - the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.
For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

35 - 40 Lacs

Gurugram

Work from Office


About this role

Role Description
The Private Markets Insight Data Services (PDS) team seeks an Investor Account Services Lead for the India region. This individual will lead efforts around Private Markets data processing and provide high-quality service to our clients, leveraging technology and automation to drive scale alongside disciplined operations best practices. Insight's Managed Data Service is a key foundation of the growing Data & Analytics solutions delivered by the Insight business, and critical to maintaining the growth of the Insight business. The team is responsible for document retrieval, data extraction, normalization, and delivery for investors in private markets products including Private Equity, Private Credit, Real Estate, and Infrastructure.

Key responsibilities
Lead a team focused on creating the market-leading Private Markets database and analytics ecosystem for Cashflow & Capital Account Statement Services.
Manage a team of data analysts and work with team leads to manage workload and priorities.
Support a business growing by >30% per annum by ensuring scalable growth of services, including new document types, asset classes, and beyond.
Actively participate in the digital transformation of the business, including transformation of process and workflow to leverage Aladdin's patented data and document automation solution.
Partner closely with Insight's Client Success and Sales teams, planning for continued service delivery on time (client SLAs) and with quality, as well as supporting RFPs.
Create an inclusive environment oriented around trust, open communication, creative thinking, and cohesive team effort.
A leader who grows the next set of leaders in the business, with the ability to become a BlackRock Global Citizen.

Experience Required
Bachelor's or Master's degree (preferably in Economics, Organizational Sciences, Mathematics, or a related accounting background).
Demonstrated experience in running an end-to-end managed data service organization.
Demonstrated transformational leadership, with the ability and desire to influence people, process, and technology.
Experience in financial markets, preferably private markets, is preferred.
Ability to succinctly communicate KPI-driven progress to stakeholders and senior leadership.
Strong organizational and change management skills.
Excellent communication skills: fluency in English, both written and verbal.

Our benefits
Our hybrid work model
About BlackRock
This mission would not be possible without our smartest investment - the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

9 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Job Description
Design, develop, and optimize large-scale data processing pipelines using PySpark.
Work with Apache tools and frameworks (Hadoop, Hive, HDFS, etc.) to ingest, transform, and manage large datasets.
Ensure high performance and reliability of ETL jobs in production.
Collaborate with data scientists, analysts, and other stakeholders to understand data needs and deliver robust data solutions.
Implement data quality checks and data lineage tracking for transparency and auditability.
Work on data ingestion, transformation, and integration from multiple structured and unstructured sources.
Leverage Apache NiFi for automated and repeatable data flow management (if applicable).
Write clean, efficient, and maintainable code in Python and Java.
Contribute to architectural decisions, performance tuning, and scalability planning.

Required Skills:
5-7 years of experience.
Strong hands-on experience with PySpark for distributed data processing.
Deep understanding of the Apache ecosystem (Hadoop, Hive, Spark, HDFS, etc.).
Solid grasp of data warehousing, ETL principles, and data modeling.
Experience working with large-scale datasets and performance optimization.
Familiarity with SQL and NoSQL databases.
Proficiency in Python and basic-to-intermediate knowledge of Java.
Experience with version control tools like Git and with CI/CD pipelines.

Nice-to-Have Skills:
Working experience with Apache NiFi for data flow orchestration.
Experience building real-time streaming data pipelines.
Knowledge of cloud platforms like AWS, Azure, or GCP.
Familiarity with containerization tools like Docker or orchestration tools like Kubernetes.

Soft Skills:
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
Self-driven, with the ability to work independently and as part of a team.
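The extract-transform-load shape this role describes can be sketched in plain Python; in the job itself these stages would be PySpark DataFrame operations over Hadoop/Hive data, and the rows below are invented for illustration:

```python
# Minimal extract -> transform -> load sketch. Each stage is a stand-in for
# a distributed equivalent (reading from HDFS/Hive, DataFrame transforms,
# writing to a warehouse table), but the pipeline shape is the same.

def extract(rows):
    yield from rows  # stand-in for reading from a source system

def transform(rows):
    seen = set()
    for row in rows:
        if row["id"] in seen:          # dedupe on the primary key
            continue
        seen.add(row["id"])
        if row["amount"] is None:      # data quality check: drop null amounts
            continue
        yield {**row, "amount": round(row["amount"], 2)}

def load(rows):
    return list(rows)  # stand-in for writing to the target table

raw = [
    {"id": 1, "amount": 10.567},
    {"id": 1, "amount": 10.567},   # duplicate row
    {"id": 2, "amount": None},     # fails the quality check
]
print(load(transform(extract(raw))))  # -> [{'id': 1, 'amount': 10.57}]
```

Generators keep the sketch streaming-friendly: each row flows through the stages without materializing intermediate lists, loosely mirroring lazy DataFrame evaluation.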

Posted 2 weeks ago

Apply

4.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office


Requirements
Strong core Java and collections framework experience required.
Exposure to API architecture, API types, and API vulnerabilities required.
Experience with TCP/IP client-server architecture required.
ActiveX API implementation experience will be an added advantage.
File I/O and Inter-Application Communication knowledge required.
JSON & XML data processing experience required.

Skill Set: Java 8+, Spring Boot, Microservices, React, JUnit, SQL Server/MySQL, Design Patterns & API Development.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Mumbai

Work from Office


Fynd is India's largest omnichannel platform and a multi-platform tech company specializing in retail technology and products in AI, ML, big data, image editing, and the learning space. It provides a unified platform for businesses to seamlessly manage online and offline sales, store operations, inventory, and customer engagement. Serving over 2,300 brands, Fynd is at the forefront of retail technology, transforming customer experiences and business processes across various industries.

About You
As a TPM, you will act as a bridge between business and engineering teams. You will work on complex business constraints and translate them into product requirements and features. You bring technical knowledge to the team, taking projects from prototype to launch on tight timelines. A people person who can provide strong leadership and inspire teams to build world-class products.

What will you do at Fynd?
Gather requirements from diverse teams and stakeholders.
Work with Platform Architects and Engineers to convert these requirements into an implementation.
Work closely with engineers to prioritize product features and requirements.
Own the execution of the sprint by collaborating with multiple engineering and product teams within the org.
Be responsible for the delivery of these sprints, both from a timeline and a quality point of view.
Manage technical and product risks and unblock engineering by helping mitigate them.
Provide accurate visibility into feature readiness and issues with other engineering teams.

Who are we looking for?
You, if you can walk the talk and convince others to walk with you.
Someone who can distinguish between the important and the urgent, and make sure both are addressed.
A leader who has the vision to see more than 14 million outcomes and pick the one where Thanos is defeated and the Avengers thrive.
Juggling time, resources, and priorities feels as natural as data charts and spreadsheets.
Someone who listens to everybody, distils information, and makes stuff happen!

Some Specific Requirements
Basic knowledge of technology, and of data orchestration tools and frameworks such as Apache Airflow, API integrations, microservices architecture, CI/CD, etc.
Strong communication skills.
Knowledge of data modeling and ETL (Extract, Transform, Load) processes.
Familiarity with data streaming and real-time data processing technologies.
Proficiency in data visualization tools (e.g., Tableau, Power BI) to create reports and dashboards.
Ability to automate repetitive tasks and workflows using scripting or automation tools.
A commitment to staying current with evolving data technologies and industry trends.
Ability to explain technical concepts and flows to a non-technical audience.
Clear written communication skills; you must be able to articulate flows clearly enough for engineers and SDETs to understand them deeply.
Ability to build strong relationships and collaborate with a diverse team of engineering, product, and business stakeholders.
Effective delegation: must know how to build ownership and execution within the team without micro-managing.
Proficiency with data platform technologies, including database management systems (e.g., MySQL, PostgreSQL, or MongoDB).
Knowledge of server and storage hardware, virtualization, and cloud computing.
Strong attention to detail.
A growth mindset to learn new skills while performing the role.
4-7 years of experience in a Business Analyst/Project Manager role.
Some experience with bug tracking tools like JIRA, Confluence, Asana, Redmine, or Azure DevOps.
Strong analytical and problem-solving skills, with the ability to make data-driven decisions.
Excellent communication and collaboration skills, including the ability to work effectively with cross-functional teams.
Strong knowledge of data technologies, databases, and data analytics tools.
Familiarity with cloud-based data solutions (e.g., AWS, Azure, GCP).
Strong knowledge of ETL tools and techniques, including data extraction, transformation, and loading from various sources.
Experience with Change Data Capture (CDC) methodologies to capture real-time data changes for synchronization.
Deep understanding of machine learning concepts and their application to data-driven decision-making.
Proficiency in data integration tools, including Big DataOps platforms, to streamline data collection and management.
Familiarity with workflow management systems for process automation and orchestration.
Knowledge of artificial intelligence (AI) technologies and their integration into data platforms to enhance automation, prediction, and decision support.
Strong problem-solving skills to address complex technical and business challenges.
Ability to communicate and present complex technical concepts to non-technical stakeholders.
Leadership skills to guide cross-functional teams in product development.

What do we offer?
Growth
Growth knows no bounds, as we foster an environment that encourages creativity, embraces challenges, and cultivates a culture of continuous expansion. We are looking at new product lines, international markets, and brilliant people to grow even further. We teach, groom, and nurture our people to become leaders. You get to grow with a company that is growing exponentially.
Flex University: We help you upskill by organising in-house courses on important subjects.
Learning Wallet: You can also do an external course to upskill and grow; we reimburse it for you.
Culture
Community and team-building activities.
Weekly, quarterly, and annual events/parties.
Wellness
Mediclaim policy for you + parents + spouse + kids.
Experienced therapist for better mental health, improved productivity, and work-life balance.
We work from the office 5 days a week to promote collaboration and teamwork. Join us to make an impact in an engaging, in-person environment!
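The Change Data Capture (CDC) requirement in this posting can be illustrated with a snapshot-diff sketch. Production CDC usually tails a database transaction log (e.g., via Debezium) rather than diffing snapshots, and the tables below are invented:

```python
# Hedged sketch of snapshot-based Change Data Capture: compare two keyed
# snapshots of a table and emit insert/update/delete events that a downstream
# system could apply to stay in sync.

def capture_changes(old, new):
    """old/new map primary key -> row; returns a list of change events."""
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))
    return events

yesterday = {1: {"status": "open"}, 2: {"status": "open"}}
today     = {1: {"status": "closed"}, 3: {"status": "open"}}
print(capture_changes(yesterday, today))
```

The three event types (insert, update, delete) are exactly what log-based CDC tools emit, so the downstream handling is the same whichever capture method is used.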

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bhiwandi

Work from Office


Makwana World is looking for a Junior Accountant / Data Entry Executive to join our dynamic team and embark on a rewarding career journey.
Complying with all company, local, state, and federal accounting and financial regulations.
Compiling, analyzing, and reporting financial data.
Creating periodic reports, such as balance sheets, profit & loss statements, etc.
Presenting data to managers, investors, and other entities.
Maintaining accurate financial records.
Performing audits and resolving discrepancies.
Computing taxes.
Keeping informed about current legislation relating to finance and accounting.
Assisting management in the decision-making process by preparing budgets and financial forecasts.

Posted 2 weeks ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Mumbai

Work from Office


Data Validation (DV) Specialist (Using SPSS) - Analyst

Job Description:

Core Responsibilities:
Perform data quality checks and validation on market research datasets.
Develop and execute scripts and automated processes to identify data anomalies.
Collaborate with the Survey Programming team to review survey questionnaires and recommend efficient programming and an optimal layout that enhances user experience.
Investigate and document data discrepancies, working with the survey programming team and data collection vendors as needed.
Create and maintain detailed data documentation and validation reports.
Collaborate with Survey Programmers and internal project managers to understand data processing requirements and provide guidance on quality assurance best practices.
Provide constructive feedback and suggestions for improving data quality, aiming to enhance overall survey quality.
Automate data validation processes where possible to enhance efficiency and reduce time spent on repetitive validation tasks.
Maintain thorough documentation of findings and recommendations to ensure transparency and consistency in quality practices.
Actively participate in team meetings to discuss project developments, quality issues, and improvement strategies, fostering a culture of continuous improvement.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field.
At least 2 years of experience in data validation processes.
Familiarity with data validation using SPSS, Dimension, Quantum, or similar platforms.
A proactive team player who thrives in a fast-paced environment and enjoys the repetitive tasks that contribute to project excellence.
Programming knowledge in a major language such as R, JavaScript, or Python, with an interest in building automation scripts for data validation.
Excellent problem-solving skills and a willingness to learn innovative quality assurance methodologies.
A desire for continuous improvement, focusing on creating efficiencies that lead to scalable and high-quality data processing outcomes.

Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies