2443 Data Quality Jobs - Page 42

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 8.0 years

13 - 14 Lacs

Hyderabad

Work from Office

Summary: The Salesforce Data Cloud Analyst will play a crucial role in leveraging Salesforce Data Cloud to transform how our organization uses customer data. This position sits within the Data Cloud Business Enablement Team and focuses on building, managing, and optimizing our data unification strategy to power business intelligence, marketing automation, and customer experience initiatives.

About the Role
Location: Hyderabad

Key Responsibilities:
- Manage data models within Salesforce Data Cloud, ensuring optimal data harmonization across multiple sources
- Maintain data streams from various platforms into Data Cloud, including CRM, SFMC, MCP, Snowflake, and third-party applications
- Develop and optimize SQL queries to transform raw data into actionable insights
- Build and maintain data tables, calculated insights, and segments for use across the organization
- Collaborate with marketing teams to translate business requirements into effective data solutions
- Monitor data quality and implement processes to ensure accuracy and reliability
- Create documentation for data models, processes, and best practices
- Provide training and support to business users on leveraging Data Cloud capabilities

Essential Requirements:
- Advanced knowledge of Salesforce Data Cloud architecture and capabilities
- Strong SQL skills for data transformation and query optimization
- Experience with ETL processes and data integration patterns
- Understanding of data modeling principles and best practices
- Experience with Salesforce Marketing Cloud, MCI, and MCP
- Familiarity with APIs and data integration techniques
- Knowledge of data privacy regulations and compliance requirements (GDPR, CCPA, etc.)
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 3+ years of experience working with Salesforce platforms
- Salesforce Data Cloud certification preferred
- Demonstrated experience with data analysis and business intelligence tools
- Strong problem-solving abilities and analytical thinking
- Excellent communication skills to translate technical concepts for business users
- Ability to work collaboratively in cross-functional teams
- Experience working in regulated industries such as pharma is a plus

Preferred Requirements:
- Previous work with Customer Data Platforms (CDPs)
- Experience with Tableau CRM or other visualization tools
- Background in marketing technology or customer experience initiatives
- Salesforce Administrator or Developer certification
- Familiarity with Agile ways of working, Jira, and Confluence

This role offers the opportunity to shape how our organization leverages customer data to drive meaningful business outcomes and exceptional customer experiences.

Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally.
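The data harmonization described above (unifying customer records from several streams and keeping the freshest view per identity) can be sketched with plain SQL. This is a minimal illustration using sqlite3 as a stand-in for Data Cloud's SQL layer; the table names, columns, and matching rule (normalized email) are assumptions, not the actual Data Cloud schema.

```python
import sqlite3

# Stand-in tables for two inbound data streams (CRM and SFMC).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE crm_contacts (email TEXT, name TEXT, updated_at TEXT);
CREATE TABLE sfmc_subscribers (email TEXT, name TEXT, updated_at TEXT);
INSERT INTO crm_contacts VALUES
  ('Ada@Example.com', 'Ada L.', '2024-01-10'),
  ('bob@example.com', 'Bob',    '2024-02-01');
INSERT INTO sfmc_subscribers VALUES
  ('ada@example.com', 'Ada Lovelace', '2024-03-05');
""")

# Harmonize: union both streams, normalize the match key, and keep the
# most recently updated record per identity.
rows = conn.execute("""
WITH unified AS (
  SELECT LOWER(email) AS email, name, updated_at FROM crm_contacts
  UNION ALL
  SELECT LOWER(email) AS email, name, updated_at FROM sfmc_subscribers
)
SELECT email, name FROM unified u
WHERE updated_at = (SELECT MAX(updated_at) FROM unified WHERE email = u.email)
ORDER BY email
""").fetchall()

print(rows)  # Ada resolves to her most recent (SFMC) record
```

Real Data Cloud identity resolution uses configurable match rules rather than a single key, but the union-normalize-dedupe shape is the same.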

Posted 2 weeks ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Mumbai

Work from Office

Data Validation (DV) Specialist (Using SPSS) - Analyst

Job Description:

Core Responsibilities:
- Perform data quality checks and validation on market research datasets
- Develop and execute scripts and automated processes to identify data anomalies
- Collaborate with the Survey Programming team to review survey questionnaires and recommend efficient programming and an optimal layout that enhances user experience
- Investigate and document data discrepancies, working with the survey programming team and data collection vendors as needed
- Create and maintain detailed data documentation and validation reports
- Collaborate with Survey Programmers and internal project managers to understand data processing requirements and provide guidance on quality assurance best practices
- Provide constructive feedback and suggestions for improving data quality, aiming to enhance overall survey quality
- Automate data validation processes where possible to enhance efficiency and reduce time spent on repetitive tasks
- Maintain thorough documentation of findings and recommendations to ensure transparency and consistency in quality practices
- Actively participate in team meetings to discuss project developments, quality issues, and improvement strategies, fostering a culture of continuous improvement

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field
- 2+ years of experience in data validation
- Familiarity with data validation using SPSS, Dimension, Quantum, or similar tools
- A proactive team player who thrives in a fast-paced environment and enjoys the repetitive tasks that contribute to project excellence
- Programming knowledge in a major language such as R, JavaScript, or Python, with an interest in building automation scripts for data validation
- Excellent problem-solving skills and a willingness to learn innovative quality assurance methodologies
- A desire for continuous process improvement, focusing on creating efficiencies that lead to scalable, high-quality data processing outcomes

Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
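The anomaly-detection scripting described above can be sketched with two classic survey checks: out-of-range answers and "straightlining" (identical answers across a rating grid). Column names, the 1-5 scale, and the sample record are illustrative assumptions; a real DV script would read an exported dataset from SPSS or a similar tool.

```python
def validate_respondent(resp, grid_cols, scale=(1, 5)):
    """Return a list of human-readable issues for one survey record."""
    issues = []
    lo, hi = scale
    # Check each grid question for missing or out-of-range answers.
    for col in grid_cols:
        val = resp.get(col)
        if val is None:
            issues.append(f"{col}: missing answer")
        elif not lo <= val <= hi:
            issues.append(f"{col}: {val} outside {lo}-{hi} scale")
    # Straightlining: every answered grid item has the identical value.
    answers = [resp.get(c) for c in grid_cols if resp.get(c) is not None]
    if len(answers) >= 3 and len(set(answers)) == 1:
        issues.append("straightlining: identical answers across grid")
    return issues

record = {"q1_1": 3, "q1_2": 3, "q1_3": 3, "q1_4": 9}
problems = validate_respondent(record, ["q1_1", "q1_2", "q1_3", "q1_4"])
print(problems)  # flags the out-of-range answer on q1_4
```

In practice these checks run over every respondent and the per-record issues feed the validation report mentioned in the responsibilities.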

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Mumbai

Work from Office

The purpose of this role is to script surveys on the survey platform, ensuring accurate execution according to specifications, with a focus on on-time delivery and end-to-end quality assurance.

Job Description:
- Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field
- 2+ years of experience working with VOXCO, Forsta, or similar survey programming software
- Experience scripting complex multi-market projects
- Program and script high-complexity surveys on the survey platform, ensuring accurate execution according to specifications, with a focus on on-time delivery and end-to-end quality assurance
- Conduct thorough testing of surveys, review data, and provide high-quality links to clients
- Provide technical support and troubleshooting for survey-related issues
- Coordinate with internal project managers and client services team members to finalize materials; provide guidance on tool functionality and solutions
- Review survey questionnaires and recommend efficient programming and optimal data layout to improve data quality and user experience
- Develop, test, and implement innovative approaches, functions, and solutions to streamline survey programming and enhance project efficiency
- Strong understanding of JavaScript, HTML, CSS, and other relevant programming languages
- Comfortable working rotational night shifts, providing 24/7 operational support, and working weekends on a roster basis
- Client-focused with strong consulting, communication, and collaboration skills
- Emotionally intelligent, adept at conflict resolution, and thrives in high-pressure, fast-paced environments
- Demonstrates ownership, problem-solving ability, and effective multitasking and prioritization

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 2 weeks ago

Apply

1.0 - 3.0 years

20 - 25 Lacs

Hyderabad

Work from Office

What We Do

Internal Audit's mission is to independently assess the firm's internal control structure, including the firm's governance processes and controls, risk management, capital, and anti-financial-crime framework, and to raise awareness of control risk and monitor the implementation of management's control measures. In doing so, Internal Audit:
- Communicates and reports on the effectiveness of the firm's governance, risk management, and controls that mitigate current and evolving risks
- Raises awareness of control risk
- Assesses the firm's control culture and conduct risks
- Monitors management's implementation of control measures

Goldman Sachs Internal Audit comprises individuals from diverse backgrounds, including chartered accountants, developers, risk management professionals, cybersecurity professionals, and data scientists. We are organized into global teams of business and technology auditors covering all the firm's businesses and functions, including securities, investment banking, consumer and investment management, risk management, finance, cybersecurity and technology risk, and engineering.

Who We Look For

Goldman Sachs Internal Auditors demonstrate a strong risk, control, and analytical mindset, exercise professional skepticism, and effectively challenge the status quo on risks and control measures with management. We look for individuals who enjoy learning about audit, businesses, and processes; bring an innovative and creative mindset to adapting analytical techniques that enhance the audit function; develop teamwork and build relationships; and are able to evolve and thrive in a fast-paced global environment.

Embedded Data Analytics

In Internal Audit, we ensure that Goldman Sachs maintains effective controls by assessing the reliability of financial reports, monitoring the firm's compliance with laws and regulations, and advising management on developing smart control solutions. The Embedded Data Analytics team leverages its programming and analytical capabilities to build innovative, data-driven solutions. The team works closely with auditors to understand their pain points and develops data-centric solutions to address them.

Your Impact

As part of the third line of defense, you will independently assess the firm's overall control environment and its effectiveness against current and emerging risks, and communicate the results to local and global management. In doing so, you will support the provision of independent, objective, and timely assurance around the firm's internal control structure, helping the Audit Committee, Board of Directors, and Risk Committee fulfill their oversight responsibilities. We are looking for a strong data scientist, passionate about using data to challenge the norm, to join our Embedded Data Analytics team. The candidate will work closely with audit teams to build innovative and reusable analytical tools that make audit testing more efficient and provide meaningful insights into the firm's control environment.

Responsibilities
- Execute the DA strategy developed by IA management within the context of audit responsibilities, such as risk assessment, audit planning, creation of reusable tools, and providing innovative solutions to complex problems
- Partner with audit teams to identify risks associated with businesses, facilitate strategic data sourcing, and develop innovative solutions to increase the efficiency and effectiveness of audit testing
- Build production-ready analytical tools to automate repeatable and reusable processes within IA
- Build and manage relationships and communications with audit team members

Basic Qualifications
- 1-3 years of experience, with at least a Bachelor's degree in Computer Science, Math, or Statistics
- Experience with RDBMS/SQL
- Proficiency in programming languages such as Python, Java, or C++
- Knowledge of basic statistics, including descriptive statistics, data distribution models, time series analysis, correlation, and regression, and their application to data
- Strong team player with excellent written and oral communication skills; able to communicate what is relevant and important clearly and concisely, and to handle multiple tasks
- Strong contributing member of the Data Science team, helping build analytical capabilities for the Internal Audit Division
- Driven and motivated, constantly taking initiative to improve performance

Preferred Qualifications
- Experience with advanced data analytics tools and techniques
- Strong experience in RDBMS/SQL and data warehousing; exposure to ETL processes and data engineering
- Experience implementing data quality measures and entitlement models
- Familiarity with programming languages such as Python
- Self-driven and motivated to take up initiatives to improve our processes

We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html
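The statistics-driven audit testing referenced above (descriptive statistics applied to flag unusual items) can be sketched with a robust outlier check: the modified z-score based on the median and median absolute deviation (MAD), which tolerates the skewed populations typical of transaction data. The 3.5 cutoff and the sample amounts are illustrative assumptions.

```python
import statistics

def flag_outliers(amounts, cutoff=3.5):
    """Flag values whose modified z-score (median/MAD) exceeds cutoff."""
    med = statistics.median(amounts)
    mad = statistics.median(abs(a - med) for a in amounts)
    if mad == 0:
        # All values essentially identical: nothing to flag.
        return []
    # 0.6745 scales MAD to be comparable with a standard deviation.
    return [a for a in amounts if 0.6745 * abs(a - med) / mad > cutoff]

amounts = [100, 102, 98, 101, 99, 100, 1000]
print(flag_outliers(amounts))  # the 1000 transaction stands out
```

A mean/standard-deviation z-score would miss the same outlier here, since one extreme value inflates the standard deviation; that is the usual argument for MAD-based screens in audit analytics.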

Posted 2 weeks ago

Apply

0.0 - 2.0 years

1 - 4 Lacs

Mumbai

Work from Office

The purpose of this role is to script surveys on the survey platform, ensuring accurate execution according to specifications, with a focus on on-time delivery and end-to-end quality assurance.

Job Description:
- Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field
- 2+ years of experience working with VOXCO, Forsta, or similar survey programming software
- Experience scripting complex multi-market projects
- Program and script high-complexity surveys on the survey platform, ensuring accurate execution according to specifications, with a focus on on-time delivery and end-to-end quality assurance
- Conduct thorough testing of surveys, review data, and provide high-quality links to clients
- Provide technical support and troubleshooting for survey-related issues
- Coordinate with internal project managers and client services team members to finalize materials; provide guidance on tool functionality and solutions
- Review survey questionnaires and recommend efficient programming and optimal data layout to improve data quality and user experience
- Develop, test, and implement innovative approaches, functions, and solutions to streamline survey programming and enhance project efficiency
- Strong understanding of JavaScript, HTML, CSS, and other relevant programming languages
- Comfortable working rotational night shifts, providing 24/7 operational support, and working weekends on a roster basis
- Client-focused with strong consulting, communication, and collaboration skills
- Emotionally intelligent, adept at conflict resolution, and thrives in high-pressure, fast-paced environments
- Demonstrates ownership, problem-solving ability, and effective multitasking and prioritization

Posted 2 weeks ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mumbai

Work from Office

Data Validation (DV) Specialist (Using SPSS) - Analyst

Job Description:

Core Responsibilities:
- Perform data quality checks and validation on market research datasets
- Develop and execute scripts and automated processes to identify data anomalies
- Collaborate with the Survey Programming team to review survey questionnaires and recommend efficient programming and an optimal layout that enhances user experience
- Investigate and document data discrepancies, working with the survey programming team and data collection vendors as needed
- Create and maintain detailed data documentation and validation reports
- Collaborate with Survey Programmers and internal project managers to understand data processing requirements and provide guidance on quality assurance best practices
- Provide constructive feedback and suggestions for improving data quality, aiming to enhance overall survey quality
- Automate data validation processes where possible to enhance efficiency and reduce time spent on repetitive tasks
- Maintain thorough documentation of findings and recommendations to ensure transparency and consistency in quality practices
- Actively participate in team meetings to discuss project developments, quality issues, and improvement strategies, fostering a culture of continuous improvement

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field
- 2+ years of experience in data validation
- Familiarity with data validation using SPSS, Dimension, Quantum, or similar tools
- A proactive team player who thrives in a fast-paced environment and enjoys the repetitive tasks that contribute to project excellence
- Programming knowledge in a major language such as R, JavaScript, or Python, with an interest in building automation scripts for data validation
- Excellent problem-solving skills and a willingness to learn innovative quality assurance methodologies
- A desire for continuous process improvement, focusing on creating efficiencies that lead to scalable, high-quality data processing outcomes

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Job Overview

We are seeking a strategic and hands-on Manager of Business Intelligence (BI) and Data Governance to lead the development and execution of our enterprise-wide data strategy. This role will oversee data governance frameworks, manage modern BI platforms, and ensure the integrity, availability, and usability of business-critical data. Reporting into senior leadership, this role plays a pivotal part in shaping data-informed decision-making across functions including Finance, Revenue Operations, Product, and more. The ideal candidate is a technically proficient and people-oriented leader with a deep understanding of data governance, cloud data architecture, and SaaS KPIs. They will drive stakeholder engagement, enablement, and adoption of data tools and insights, with a focus on building scalable, trusted, and observable data systems.

About Us

When you join iCIMS, you join the team helping global companies transform business and the world through the power of talent. Our customers do amazing things: design rocket ships, create vaccines, deliver consumer goods globally, overnight, with a smile. As the Talent Cloud company, we empower these organizations to attract, engage, hire, and advance the right talent. We're passionate about helping companies build a diverse, winning workforce and about building our home team. We're dedicated to fostering an inclusive, purpose-driven, and innovative work environment where everyone belongs.

Responsibilities
- Data Governance Leadership: Establish and maintain a comprehensive data governance framework that includes data quality standards, ownership models, data stewardship processes, and compliance alignment with regulations such as GDPR and SOC 2.
- Enterprise Data Architecture: Oversee data orchestration across Salesforce (SFDC), cloud-based data warehouses (e.g., Databricks, Snowflake, or equivalent), and internal systems. Collaborate with the data engineering team on the development and optimization of ETL pipelines to ensure data reliability and performance at scale.
- Team Management & Enablement: Lead and mentor a team of BI analysts and governance specialists. Foster a culture of collaboration, continuous learning, and stakeholder enablement to increase data adoption across the organization.
- BI Strategy & Tools Management: Own the BI toolset (with a strong emphasis on Tableau), and define standards for scalable dashboard design, self-service reporting, and analytics enablement. Evaluate and incorporate additional platforms (e.g., Power BI, Looker) as needed.
- Stakeholder Engagement & Strategic Alignment: Partner with leaders in Finance, RevOps, Product, and other departments to align reporting and data strategy with business objectives. Translate business needs into scalable reporting solutions and drive enterprise-wide adoption through clear communication and training.
- Data Quality & Observability: Implement data quality monitoring, lineage tracking, and observability tools to proactively detect issues and ensure data reliability and trustworthiness.
- Documentation & Transparency: Create and maintain robust documentation for data processes, pipeline architecture, code repositories (via GitHub), and business definitions to support transparency and auditability for technical and non-technical users.
- Executive-Level Reporting & Insight: Design and maintain strategic dashboards that surface key SaaS performance indicators to senior leadership and the board. Deliver actionable insights to support company-wide strategic decisions.
- Continuous Improvement & Innovation: Stay current with trends in data governance, BI technologies, and AI. Proactively recommend and implement enhancements to tools, processes, and governance maturity.

Qualifications
- Data Governance Expertise: Proven experience implementing data governance frameworks, compliance standards, and ownership models across cross-functional teams.
- SQL Expertise: Advanced SQL skills with a strong background in ETL/data pipeline development across systems like Salesforce and enterprise data warehouses.
- BI Tools Mastery: Expertise in Tableau for developing reports and dashboards, and experience driving adoption of BI best practices across a diverse user base.
- Salesforce Data Proficiency: Deep understanding of SFDC data structure, reporting, and integration with downstream systems.
- Version Control & Documentation: Hands-on experience with GitHub and best practices in code versioning and documentation of data pipelines.
- Leadership & Stakeholder Communication: 3+ years of people management experience with a track record of team development and stakeholder engagement.
- Analytics Experience: 8+ years of experience in analytics roles, working with large datasets to derive insights and support executive-level decision-making.
- Programming Knowledge: Proficiency in Python for automation, data manipulation, and integration tasks.
- SaaS Environment Acumen: Deep understanding of SaaS metrics, business models, and executive reporting needs.
- Cross-functional Collaboration: Demonstrated success in partnering with teams like Finance, Product, and RevOps to meet enterprise reporting and insight goals.

EEO Statement

iCIMS is a place where everyone belongs. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at iCIMS.
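The data quality monitoring and observability responsibilities above usually boil down to declarative rules evaluated against each batch: completeness (null rate) and freshness (staleness) are the common starting points. The field names, thresholds, and sample batch below are illustrative assumptions, not an actual iCIMS schema.

```python
from datetime import date

def null_rate(records, field):
    """Fraction of records with a missing value for the given field."""
    missing = sum(1 for r in records if r.get(field) in (None, ""))
    return missing / len(records)

def run_checks(records, max_null_rate=0.05, max_staleness_days=2,
               today=date(2024, 6, 1)):
    """Evaluate simple completeness and freshness rules over a batch."""
    failures = []
    for field in ("account_id", "arr"):
        rate = null_rate(records, field)
        if rate > max_null_rate:
            failures.append(f"{field}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    newest = max(r["loaded_on"] for r in records)
    if (today - newest).days > max_staleness_days:
        failures.append(f"stale: newest load {newest} older than {max_staleness_days} days")
    return failures

batch = [
    {"account_id": "A1", "arr": 1200, "loaded_on": date(2024, 5, 20)},
    {"account_id": "A2", "arr": None, "loaded_on": date(2024, 5, 20)},
]
print(run_checks(batch))  # reports a null-rate breach and stale data
```

Dedicated observability platforms add lineage and anomaly detection on top, but the rule-evaluation core looks much like this.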

Posted 2 weeks ago

Apply

2.0 - 5.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Sanas is revolutionizing the way we communicate with the world's first real-time algorithm designed to modulate accents, eliminate background noise, and magnify speech clarity. Pioneered by seasoned startup founders with a proven track record of creating and steering multiple unicorn companies, our groundbreaking GDP-shifting technology sets a gold standard. Sanas is a 200-strong team, established in 2020. In this short span, we've successfully secured over $100 million in funding. Our innovations have been supported by the industry's leading investors, including Insight Partners, Google Ventures, Quadrille Capital, General Catalyst, Quiet Capital, and other influential investors. Our reputation is further solidified by collaborations with numerous Fortune 100 companies. With Sanas, you're not just adopting a product; you're investing in the future of communication.

We're looking for a sharp, hands-on Data Engineer to help us build and scale the data infrastructure that powers cutting-edge audio and speech AI products. You'll be responsible for designing robust pipelines, managing high-volume audio data, and enabling machine learning teams to access the right data fast. As one of the first dedicated data engineers on the team, you'll play a foundational role in shaping how we handle data end to end, from ingestion to training-ready features. You'll work closely with ML engineers, research scientists, and product teams to ensure data is clean, accessible, and structured for experimentation and production.

Key Responsibilities:
- Build scalable, fault-tolerant pipelines for ingesting, processing, and transforming large volumes of audio and metadata.
- Design and maintain ETL workflows for training and evaluating ML models, using tools like Airflow or custom pipelines.
- Collaborate with ML research scientists to make raw and derived audio features (e.g., spectrograms, MFCCs) efficiently available for training and inference.
- Manage and organize datasets, including labeling workflows, versioning, annotation pipelines, and compliance with privacy policies.
- Implement data quality, observability, and validation checks across critical data pipelines.
- Help optimize data storage and compute strategies for large-scale training.

Qualifications:
- 2-5 years of experience as a Data Engineer, Software Engineer, or similar role with a focus on data infrastructure.
- Proficient in Python and SQL, and experienced with distributed data processing tools (e.g., Spark, Dask, Beam).
- Experience with cloud data infrastructure (AWS/GCP), object storage (e.g., S3), and data orchestration tools.
- Familiarity with audio data and its unique challenges (large file sizes, time-series features, metadata handling) is a strong plus.
- Comfortable working in a fast-paced, iterative startup environment where systems are constantly evolving.
- Strong communication skills and a collaborative mindset; you'll be working cross-functionally with ML, infra, and product teams.

Nice to Have:
- Experience with data for speech models like ASR, TTS, or speaker verification.
- Knowledge of real-time data processing (e.g., Kafka, WebSockets, or low-latency APIs).
- Background in MLOps, feature engineering, or supporting model lifecycle workflows.
- Experience with labeling tools, audio annotation platforms, or human-in-the-loop systems.

Joining us means contributing to the world's first real-time speech understanding platform revolutionizing contact centers and enterprises alike. Our technology empowers agents, transforms customer experiences, and drives measurable growth. But this is just the beginning. You'll be part of a team exploring the vast potential of an increasingly sonic future.
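The audio-pipeline validation checks mentioned above can be sketched in a simplified, dependency-free form: given raw PCM samples normalized to [-1, 1], detect clipping and compute per-frame RMS energy, the kind of low-level feature that precedes spectrograms and MFCCs. The frame size, clipping threshold, and toy signal are illustrative assumptions; a real pipeline would decode files pulled from object storage.

```python
import math

def frame_rms(samples, frame_size=4):
    """RMS energy for consecutive non-overlapping frames."""
    frames = [samples[i:i + frame_size] for i in range(0, len(samples), frame_size)]
    return [math.sqrt(sum(s * s for s in f) / len(f)) for f in frames]

def clipping_ratio(samples, threshold=0.99):
    """Fraction of samples at or near full scale (a data-quality red flag)."""
    return sum(1 for s in samples if abs(s) >= threshold) / len(samples)

signal = [0.0, 0.5, -0.5, 0.0, 1.0, -1.0, 1.0, -1.0]
print(clipping_ratio(signal))  # second half of the signal is clipped
print(frame_rms(signal))
```

A validation stage in an ingestion pipeline would reject or quarantine files whose clipping ratio or silence (near-zero RMS) exceeds configured thresholds before they reach training.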

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office

As a Sr. Data Engineer, your role is to spearhead the data engineering teams and elevate them to the next level. You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet flexible enough to adapt to new business use cases, a tough but rewarding challenge!

Responsibilities
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build, and maintain the Data Platform
- Build a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results

Qualifications
- 7+ years of experience as a data engineer or in a similar role with Azure Synapse or ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field
- Must have experience e

Posted 2 weeks ago

Apply

5.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

As a Data Engineer, your role is to spearhead the data engineering teams and elevate them to the next level. You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet flexible enough to adapt to new business use cases, a tough but rewarding challenge!

Responsibilities
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build, and maintain the Data Platform
- Build a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results

Qualifications
- 3+ years of experience as a data engineer or in a similar role with Azure Synapse or ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field
- Must have experience e

Posted 2 weeks ago

Apply

7.0 - 11.0 years

6 - 10 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled and motivated Senior Snowflake Developer to join our growing data engineering team. In this role, you will be responsible for building scalable and secure data pipelines and Snowflake-based architectures that power data analytics across the organization. You'll collaborate with business and technical stakeholders to design robust solutions in an AWS environment and play a key role in driving our data strategy forward.

Responsibilities
- Design, develop, and maintain efficient and scalable Snowflake data warehouse solutions on AWS.
- Build robust ETL/ELT pipelines using SQL, Python, and AWS services (e.g., Glue, Lambda, S3).
- Collaborate with data analysts, engineers, and business teams to gather requirements and design data models aligned with business needs.
- Optimize Snowflake performance through best practices in clustering, partitioning, caching, and query tuning.
- Ensure data quality, accuracy, and completeness across data pipelines and warehouse processes.
- Maintain documentation and enforce best practices for data architecture, governance, and security.
- Continuously evaluate tools, technologies, and processes to improve system reliability, scalability, and performance.
- Ensure compliance with relevant data privacy and security regulations (e.g., GDPR, CCPA).

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum 5 years of experience in data engineering, with at least 3 years of hands-on experience with Snowflake.
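The ETL/ELT pipeline work described above commonly centers on the incremental-upsert pattern: land new rows in a staging table, then MERGE them into the target. The sketch below only generates the MERGE statement so it can run anywhere; the table and column names are illustrative, and a real pipeline would execute this through the Snowflake connector with proper identifier quoting and parameterization.

```python
def build_merge(target, staging, key, columns):
    """Generate a Snowflake-style MERGE that upserts staged rows by key."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    insert_cols = ", ".join([key] + columns)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

sql = build_merge("analytics.orders", "staging.orders", "order_id",
                  ["status", "amount"])
print(sql)
```

MERGE keeps the load idempotent: rerunning the same staged batch updates rather than duplicates, which is why it is the usual backbone of incremental warehouse loads.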

Posted 2 weeks ago

Apply

8.0 - 9.0 years

25 - 30 Lacs

Mumbai

Work from Office


Data Validation (DV) Specialist (Using SPSS) - Team Leader

Job Description:
Perform data quality checks and validation on market research datasets.
Develop and execute scripts and automated processes to identify data anomalies.
Collaborate with the Survey Programming team to review survey questionnaires and make recommendations for efficient programming and an optimal layout that enhances user experience.
Investigate and document data discrepancies, working with the survey programming team/data collection vendors as needed.
Collaborate with Survey Programmers and internal project managers to understand survey requirements and provide guidance on quality assurance best practices.
Provide constructive feedback and suggestions for improving the quality of data, aiming to enhance overall survey quality.
Automate data validation processes where possible to enhance efficiency and reduce time spent on repetitive data validation tasks.
Maintain thorough documentation of findings and recommendations to ensure transparency and consistency in quality practices.
Actively participate in team meetings to discuss project developments, quality issues, and improvement strategies, fostering a culture of continuous improvement.
Manage the pipeline and internal/external stakeholder expectations.
Train and mentor junior team members.

Qualification:
Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field.
At least 4+ years of experience in the data validation process.
Familiar with data validation using SPSS, Dimension, Quantum platform or similar tools.
A proactive team player who thrives in a fast-paced environment and enjoys repetitive tasks that contribute to project excellence.
Programming knowledge in a major programming language such as R, JavaScript, or Python, with an interest in building automation scripts for data validation.
Excellent problem-solving skills and a willingness to learn innovative quality assurance methodologies.
A desire for continuous improvement in processes, focusing on creating efficiencies that lead to scalable and high-quality data processing outcomes.

Location: Mumbai Brand: Merkle Time Type: Full time Contract Type: Permanent
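The automated anomaly checks this role describes can be sketched as follows. This is a language-agnostic illustration in Python rather than SPSS or Quantum syntax, and the question names, valid ranges, and respondent records are all invented:

```python
# Range and completeness checks on survey responses, the kind of
# validation otherwise scripted in SPSS/Dimension/Quantum.
VALID_RANGES = {"q1_age": (18, 99), "q2_satisfaction": (1, 5), "q3_nps": (0, 10)}

def validate_record(record):
    """Return a list of (question, issue) anomalies for one respondent."""
    issues = []
    for question, (lo, hi) in VALID_RANGES.items():
        value = record.get(question)
        if value is None:
            issues.append((question, "missing"))
        elif not lo <= value <= hi:
            issues.append((question, f"out of range: {value}"))
    return issues

records = [
    {"q1_age": 34, "q2_satisfaction": 4, "q3_nps": 9},   # clean record
    {"q1_age": 150, "q2_satisfaction": 3},               # bad age, missing NPS
]
# Report only respondents with at least one anomaly.
report = {i: validate_record(r) for i, r in enumerate(records) if validate_record(r)}
print(report)  # {1: [('q1_age', 'out of range: 150'), ('q3_nps', 'missing')]}
```

A production validator would also cover skip-logic/routing checks and straightlining detection, which follow the same per-record pattern.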

Posted 2 weeks ago

Apply

1.0 - 2.0 years

5 - 6 Lacs

Mumbai

Work from Office


Data Validation (DV) Specialist (Using SPSS) - Analyst

Job Description:
Core Responsibilities:
Perform data quality checks and validation on market research datasets.
Develop and execute scripts and automated processes to identify data anomalies.
Collaborate with the Survey Programming team to review survey questionnaires and make recommendations for efficient programming and an optimal layout that enhances user experience.
Investigate and document data discrepancies, working with the survey programming team/data collection vendors as needed.
Create and maintain detailed data documentation and validation reports.
Collaborate with Survey Programmers and internal project managers to understand data processing requirements and provide guidance on quality assurance best practices.
Provide constructive feedback and suggestions for improving the quality of data, aiming to enhance overall survey quality.
Automate data validation processes where possible to enhance efficiency and reduce time spent on repetitive data validation tasks.
Maintain thorough documentation of findings and recommendations to ensure transparency and consistency in quality practices.
Actively participate in team meetings to discuss project developments, quality issues, and improvement strategies, fostering a culture of continuous improvement.

Qualification:
Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field.
At least 2+ years of experience in the data validation process.
Familiar with data validation using SPSS, Dimension, Quantum platform or similar tools.
A proactive team player who thrives in a fast-paced environment and enjoys repetitive tasks that contribute to project excellence.
Programming knowledge in a major programming language such as R, JavaScript, or Python, with an interest in building automation scripts for data validation.
Excellent problem-solving skills and a willingness to learn innovative quality assurance methodologies.
A desire for continuous improvement in processes, focusing on creating efficiencies that lead to scalable and high-quality data processing outcomes.

Location: Mumbai Brand: Merkle Time Type: Full time Contract Type: Permanent

Posted 2 weeks ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office


Proficient with 4/5 Python (automation) and 4/5 Postman (API testing); 3/5 BI knowledge is good to have.

Job Description:
Key Responsibilities:

Test Planning and Test Execution:
Update and maintain existing test artifacts, including the E2E Test Strategy; define new Test Cases and Test Execution Plans.
Drive execution of Integration, End-to-End (E2E) and Regression tests.
Work with integration partners to drive defects to closure.
Assist with troubleshooting newly discovered data quality issues and clearly report findings in the correct level of detail depending on the audience.
Support the home-grown Data Quality Comparator Tool as needed; the tool is a Python-based data comparator for exporting JSON files to Excel, where transaction-level comparisons are made as part of test execution.

Test Automation (where applicable):
The client is interested in expanding automation of Point of Sale (POS) order entry and subsequent validation of data points in downstream systems. To do this, the client would like to leverage an existing automation proof of concept (POC) Qualitest developed using PyWinAuto, hence the need for Python programming skills should the automation scripting get approved to proceed.
Capture test automation requirements and refactor automation scripts and framework as needed.
Demo new automated test cases for leadership and stakeholders to ensure coverage.
Lead execution of automated tests; document and report test results.

Reporting:
Attend project-related meetings in addition to test status reporting meetings.
Test execution status and defect reporting.
Be prepared to speak to test execution metrics during the weekly status meeting.

Desired Skills:
Python: for maintenance of the Python-based Data Comparator Tool and potential test automation with PyWinAuto.
Visual Studio Code: source code editor used for maintaining the Data Comparator Tool.
Postman: for API testing of the multiple systems which support the Jack in the Box (JIB) Back of House system (from order entry to the general ledger (GL)).
SwaggerUI: also for API testing; similar to Postman, but the collections are always current. SwaggerUI is used for confirming updates are saved and written to the audit table. Environment names, collections and endpoints needed to run the API tests are provided by the team doing the integration. There are some instances where webhooks are used instead of APIs, so some knowledge of webhooks would be good.
DataDog: used for observability and messaging when updates are submitted via UI or API. GUIDs are captured in DataDog to use in SwaggerUI to confirm updates to the audit table.
Slack: a common direct messaging platform used by the numerous vendors at the client, as not all use MS Teams for messaging.
Monday: work management application used for project and task management.

Nice-to-Have Skills:
Project experience in the retail or restaurant vertical market, as the client is a fast food restaurant chain.
Inventory management skills and the ability to track inventory depletion as restaurant food orders are processed.
Basic accounting principles for testing various cash & sales metrics (Net Sales, Gross Sales, Adjusted Gross Sales, Tax, Discounts, etc.). The cash & sales metrics formulas are already documented; you only need to generate test cases from them and possibly automate if requested.
Be proactive, responsive and flexible, as client demands can rapidly change.
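A minimal sketch of the transaction-level comparison a tool like the Data Quality Comparator performs: pair records from two JSON exports by transaction ID and report fields whose values differ. The field names, the two "system" payloads, and the mismatch format are invented for illustration, and the real tool's Excel export step is omitted:

```python
import json

# Two hypothetical exports of the same transactions: one from the POS
# system, one from the downstream general-ledger feed.
pos_export = json.loads('[{"txn_id": "T1", "net_sales": 10.50, "tax": 0.84},'
                        ' {"txn_id": "T2", "net_sales": 7.25, "tax": 0.58}]')
gl_export = json.loads('[{"txn_id": "T1", "net_sales": 10.50, "tax": 0.84},'
                       ' {"txn_id": "T2", "net_sales": 7.25, "tax": 0.60}]')

def compare(source, target, key="txn_id"):
    """Pair records by key and list (key, field, source_val, target_val) diffs."""
    target_by_key = {rec[key]: rec for rec in target}
    mismatches = []
    for rec in source:
        other = target_by_key.get(rec[key], {})
        for field, value in rec.items():
            if field != key and other.get(field) != value:
                mismatches.append((rec[key], field, value, other.get(field)))
    return mismatches

print(compare(pos_export, gl_export))  # [('T2', 'tax', 0.58, 0.6)]
```

A production comparator would add tolerance handling for monetary rounding and report unmatched transaction IDs on either side.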

Posted 2 weeks ago

Apply

9.0 - 12.0 years

12 - 17 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled and experienced Ab Initio Lead with 10 years of hands-on experience in ETL development, preferably Ab Initio. The ideal candidate should have strong technical expertise, leadership qualities, and the ability to guide teams and collaborate with cross-functional stakeholders. This role involves end-to-end ownership of data integration and ETL processes using Ab Initio tools.

Responsibilities:
Lead the design, development, and implementation of complex ETL solutions using Ab Initio (GDE, EME, Co>Operating System, and related components).
Work closely with business analysts, data architects, and stakeholders to gather requirements and translate them into scalable data pipelines.
Optimize performance of ETL jobs and troubleshoot data issues in large-scale production environments.
Lead a team of Ab Initio developers and ensure adherence to development standards and best practices.
Manage code reviews, performance tuning, and deployment activities.
Ensure high levels of data quality and integrity through effective testing and validation procedures.
Work with DevOps and infrastructure teams to support deployments and release management.

Required Skills:
10+ years of hands-on experience with Ab Initio development (GDE, EME, Conduct>It, Co>Operating System, Continuous Flows, etc.).
Strong experience in data warehousing concepts and ETL design patterns.
Expertise in performance tuning and handling large volumes of data.
Knowledge of data modeling, data governance, and metadata management.
Strong problem-solving and debugging skills.
Excellent communication and stakeholder management skills.
Ability to lead and mentor a team of developers effectively.

Lead, Ab Initio

Posted 2 weeks ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office


Senior Developer, Power BI | Hollister Incorporated
Jun 6, 2025 | Gurugram, HR, IN, 122002 | Hollister Global Business Services India Private L

Summary:
The Senior Power BI Developer is responsible for designing, developing, and maintaining business intelligence solutions using Power BI. The role involves gathering requirements from stakeholders, creating data models, developing interactive dashboards, and optimizing report performance. This position requires strong skills in data analysis, DAX, SQL, and Power BI best practices to ensure accurate and efficient reporting. The Power BI Developer works closely with business teams to transform data into meaningful insights, ensuring reports are clear, secure, and aligned with business needs. Testing, validation, and continuous improvement are key aspects of the role to support data-driven decision-making.

Responsibilities:
Gather and analyze business requirements for reporting and data visualization needs.
Design and develop Power BI dashboards, reports, and data models to provide actionable insights.
Create and optimize DAX calculations for performance and accuracy.
Develop and maintain SQL queries to extract, transform, and load data.
Ensure data accuracy, consistency, and security within Power BI reports.
Collaborate with business users to refine dashboards and improve usability.
Optimize report performance by managing data sources, relationships, and query efficiency.
Conduct testing and validation to ensure reports meet business needs.
Provide documentation and training for end-users on Power BI solutions.
Stay updated on Power BI features and best practices to enhance reporting capabilities.
Configure and manage workspaces, data refresh schedules, row-level security (RLS), and permissions in Power BI Service.
Collaborate with Data Engineers and Architects to build scalable data models and reporting solutions.
Strong proficiency in efficient data modeling, ensuring optimized performance and scalability using techniques like aggregations, indexing and partitioning.
Ability to use Power BI APIs for scheduled refreshes, subscriptions, and monitoring usage analytics.
Advanced data transformations using Power Query.
Real-time reporting using DirectQuery and composite models.
Knowledge of AI & ML features in Power BI would be a bonus.
Familiarity with Azure DevOps, Git and CI/CD for Power BI version control and deployment pipelines.

Essential Functions of the Role:
Flexibility in work schedule, including off-hours for project implementation.
Travel via plane or automobile, both locally and internationally.

Work Experience Requirements:
Number of overall years necessary: 5-8.
A minimum of 3 years of experience in Microsoft Power BI.
A minimum of 3 years of experience in business process analysis and design.

Education Requirements:
BS/BA, or equivalent business experience in a business-related discipline.

Specialized Skills/Technical Knowledge:
In-depth Power BI expertise, including report development, data modeling, DAX calculations, and performance optimization.
Strong knowledge of SQL, including querying, data transformation, and performance tuning for Power BI datasets.
Understanding of enterprise-wide data structures, integrations, and key business processes relevant to reporting needs.
Experience working with various data sources such as SQL databases, Excel, APIs, and cloud-based data platforms.
Ability to analyze and translate business requirements into technical solutions using Power BI.
Familiarity with data governance, security, and compliance best practices within Power BI and related tools.
Proficiency in Microsoft Excel, including advanced formulas, pivot tables, and data visualization techniques.
Strong analytical and problem-solving skills to assess data quality, identify trends, and provide meaningful insights.
Effective communication skills to work with stakeholders, explain technical concepts in business terms, and document reporting solutions.
Ability to stay updated on Power BI advancements and apply new features to improve reporting efficiency.
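Using the Power BI APIs for scheduled refreshes, as the listing mentions, might look roughly like this. The endpoint shape follows the documented Power BI REST "refreshes" API, but the workspace ID, dataset ID, and token are placeholders, and the request is only constructed here, not sent:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(group_id, dataset_id, token):
    """Build (but do not send) a POST that triggers a dataset refresh."""
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    return urllib.request.Request(
        url,
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder IDs and token for illustration only.
req = build_refresh_request("ws-123", "ds-456", "<access-token>")
print(req.full_url)
# https://api.powerbi.com/v1.0/myorg/groups/ws-123/datasets/ds-456/refreshes
```

In practice the bearer token would come from an Azure AD app registration, and the request would be sent with `urllib.request.urlopen(req)` or an HTTP client of choice.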

Posted 2 weeks ago

Apply

10.0 - 13.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Position Summary: Full Stack Engineer - Data Engineering

Cigna, a leading Health Services company, is looking for an exceptional engineer in our Data & Analytics Engineering organization. The Full Stack Engineer is responsible for the delivery of a business need, starting from understanding the requirements to deploying the software into production. This role requires you to be fluent in some of the critical technologies, with proficiency in others, and to have a hunger to learn on the job and add value to the business. Critical attributes of being a Full Stack Engineer, among others, are ownership, eagerness to learn and an open mindset. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous improvement mindset, driving the adoption of CI/CD tools and supporting the improvement of the tool sets/processes.

Job Description & Responsibilities:
Behaviors of a Full Stack Engineer: Full Stack Engineers are able to articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They have ownership over their work tasks and embrace interacting with all levels of the team, raising challenges when necessary. We aim to be cutting-edge engineers, not institutionalized developers.

Experience Required:
11-13 years of experience in Python.
11-13 years of experience in data management & SQL expertise.
5+ years in Spark and AWS.
5+ years in Databricks.
Experience working in agile CI/CD environments.

Experience Desired:
Git, Teradata & Snowflake experience.
Experience working on analytical models and their deployment/production enablement via data & analytical pipelines.
Expertise with big data technologies: Hadoop, HiveQL, Scala/Python.
Expertise in cloud technologies: S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR.
Experience with BDD and TDD development methodologies.
Health care information domains preferred.

Education and Training Required: Bachelor's degree (or equivalent) required.

Primary Skills: Python, AWS and Spark; CI/CD, Databricks; data management and SQL.

Additional Skills:
Strong communication skills.
Take ownership and accountability.
Write referenceable & modular code.
Be fluent in some areas and have proficiency in many others.
Have a passion to learn.
Have a quality mindset: not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact.
Take risks and champion new ideas.

About Evernorth Health Services
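The idea of monitoring data to identify problems before they have business impact can be sketched with a simple null-rate check over a batch of records. The column names, the sample batch, and the 20% threshold are invented for illustration:

```python
# Flag columns whose null rate in a batch breaches a threshold, so a
# pipeline can alert before bad data reaches downstream reports.
def null_rates(rows):
    """Fraction of None values per column across a list of dict rows."""
    counts, total = {}, len(rows)
    for row in rows:
        for col, val in row.items():
            counts[col] = counts.get(col, 0) + (val is None)
    return {col: n / total for col, n in counts.items()}

def flag_columns(rows, threshold=0.2):
    """Sorted list of columns whose null rate exceeds the threshold."""
    return sorted(col for col, rate in null_rates(rows).items() if rate > threshold)

batch = [
    {"member_id": "M1", "claim_amount": 120.0},
    {"member_id": "M2", "claim_amount": None},
    {"member_id": None, "claim_amount": None},
    {"member_id": "M4", "claim_amount": 310.0},
]
print(flag_columns(batch))  # ['claim_amount', 'member_id']
```

The same shape of check scales up naturally: in a Spark/Databricks pipeline the per-column counts become aggregations over a DataFrame, and the flagged columns feed an alerting step in the CI/CD-managed job.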

Posted 2 weeks ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Purpose of the role
The Production Manager manages the end-to-end production of print and retail display projects. The PM will also be expected to mentor and upskill the Production Executives and face the client directly. You will work within your team and other relevant departments on the delivery of client production activities with the highest compliance and quality standards. You will have accountability for the accuracy, quality, timeliness and profitability of multiple campaigns and projects by managing the briefing, scheduling and delivery process on behalf of the client. You will help drive the effectiveness of our internal delivery processes and by doing so maintain client profitability. Manage and fulfill BTL (print, POSM, displays) needs of the marketing team of the company's global and local clients.

Strategic Thinking
Create a collaborative environment and stimulate discussion on upcoming projects and requirements in the production space. Confidently demonstrate your knowledge as a subject matter expert and be happy to field questions and find solutions.

Process Focus
Proactively propose continuous improvement initiatives to your line manager or wider team, as well as provide advice on strategic campaigns.

Campaign Management
Be comfortable with managing briefs, and comfortable questioning a brief to ensure you can effectively deliver it. Provide regular campaign updates and proactively communicate the next steps to the Senior Production Manager or Team Leader in advance. Produce clear campaign plans and manage client expectations confidently throughout the delivery of a project. Ensure all activity reports are accurate and deliver value to your client based on their reporting requirements. Provide adequate support to the Client Services team when required, from thorough spec completion to following the required ways of working.

Analytical Skills: Data Quality, Reporting and Insights
Understand the importance of data quality and its business impact; follow compliance processes to ensure data accuracy at all times. Present a monthly review of savings and projects delivered, highlighting key achievements, challenges and progress against reciprocal KPIs. Understand why savings are up/down and proactively take steps to collaborate with key stakeholders to improve performance and ensure data accuracy and completeness.

Negotiations Management
Manage team and supplier conversations to find mutually beneficial outcomes, be it in terms of savings, value engineering, quality, etc.

Skills, knowledge, experience and exposure
The Production Manager will likely have 7+ years of print & POSM production management experience. Relevant project management and supplier liaison experience will also be considered positively. Industry background or education will be strongly positive. Fluency, both written and spoken, in English. Client-interfacing skills for management of the account.

Mindset to navigate the role
Clear and effective communication skills at all levels in the hierarchy. A solution-orientated and resolution mindset, where problems are just the starting point for finding solutions. A problem-solving mindset, as the nature of the role will require navigating different types of challenges and new scenarios. A technology-oriented mindset, as you will have to use different technologies in your daily job. Attention to detail is critical in the Production Manager's role. Good mediation skills, as you will act as one of the links between Indicia Worldwide and the suppliers.

The softer skills that we believe will help you thrive in this role:
A positive "can-do" attitude at all times, setting an example for team members. A high level of energy when delivering and the ability to face challenges in a serene and collaborative manner. A curious mind, i.e. wanting to know all about internal operations and processes, aiming for continuous improvement. Accountability: you are a professional and we will treat you as such; you will be expected to manage your workload and to raise your hand when you need team support. Collaborative spirit: use "we", not "me"; if someone is in trouble, you will do your best to support. A good communication skillset with both internal and external teams. Exceptional prioritization skills. A strong ability to work under pressure and to comply with deadlines.

Role Requirements
An onsite model applies to this role, which requires being at the client's office in Bangalore alongside the marketing team five days a week. Travel to conferences or client events might be required on occasion. Whilst the contracted hours for the role are 9:00am to 5:30pm, Mon-Fri, the project-based nature of the role requires that some days you will have to work outside of these hours, and you will recover that time at other moments.

About Us
Indicia Worldwide is an insight and technology led communications agency with global production expertise.

Why we exist: we create new value. At Indicia Worldwide, our philosophy is one of "creating new value". We create new value at every step of the journey that a brand takes to market, by driving an increase in marketing performance and reducing costs in marketing execution. For our clients and their customers, we are building mutually beneficial partnerships. We see this proposition, built around the perfect balance of efficiency and effectiveness, as pioneering, entrepreneurial and, above all else, sustainable.

Our substantial investment in our technology and data science capability, and resource, provides our differentiation in the marketplace. Data insight and marketing technology give us the ability to measure our work, evidencing ROI as the most critical metric in today's environment. We see ROI where others don't. We are the only agency that combines creative, data and technology talent with production and procurement expertise to improve your marketing performance and efficiencies. We realise ROI for our clients by enabling them to deliver more engaging, cost-effective and sustainable customer experiences.

As a business, we draw on a rich heritage from the worlds of print, creative production, retail, data, digital, tech, and creative, bringing these disciplines together to support global brands with their omnichannel marketing activation needs. We believe in what we do. We believe this proposition makes us unique. We have the capacity to redefine the way marketing is activated for our clients across the globe. Now, and well into the future.

The output: improved client performance by engaging consumers with brand ideas better, faster, and more cost-effectively across every step of the journey that a brand takes to market.

We are an equal opportunities employer and as such will make any reasonable adjustments to accommodate the needs of all candidates. If you have any such needs or requirements in the context of your interview, please notify us so that we can make the appropriate arrangements.

Posted 2 weeks ago

Apply

12.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office


The Adobe DMe B2B analytics team is looking for a Senior Analytics Engineer to build the foundational data assets and analytical reports for B2B customer journey analytics. The data assets and reports will provide quantitative and qualitative analysis for acquisition, engagement and retention. In addition, the candidate will contribute to multi-functional projects to help the business achieve its potential in terms of revenue, customer success, and operational excellence.

What you'll do
Analyze complex business needs, profile diverse datasets, and optimize scalable data products on various data platforms, like Databricks, prioritizing data quality, performance, and efficiency.
Build and maintain foundational datasets and reporting solutions that power dashboards, self-service tools, and ad hoc analytics across the organization.
Partner closely with Business Intelligence, Product Management, and business stakeholders to understand data needs.
Guide and assist junior analytics engineers and analysts through code reviews, advocating standard methodologies and encouraging a culture of code simplicity, clarity, and technical excellence.

What you need to succeed
Strong analytical approach with critical thinking and problem-solving skills.
Ability to quickly understand Adobe products, business processes, existing data structures, and legacy code.
Minimum of 8 years of experience in data analytics, data engineering, or a related field.
Solid expertise in data warehousing, data modeling, and crafting scalable data solutions.
Proficiency in SQL and/or Python for building, optimizing, and fixing data pipelines.
Experience using AI tools (e.g., GitHub Copilot, Claude) to speed up and improve development workflows.
Excellent communication skills needed to effectively work with business collaborators, engineers, and analysts.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Mumbai

Work from Office


Working as an SME in data governance, metadata management and data catalog solutions, specifically on Collibra Data Governance. Client interfacing and consulting skills required.

Experience in data governance of a wide variety of data types (structured, semi-structured and unstructured data) and a wide variety of data sources (HDFS, S3, Kafka, Cassandra, Hive, HBase, Elastic Search).
Partner with Data Stewards for requirements, integrations and processes; participate in meetings and working sessions.
Partner with Data Management and integration leads to improve data management technologies and processes.
Working experience of the Collibra operating model, workflow BPMN development, and how to integrate various applications or systems with Collibra.
Experience in setting up people's roles, responsibilities and controls, data ownership, workflows and common processes.
Integrate Collibra with other enterprise tools: Data Quality Tool, Data Catalog Tool, Master Data Management Solutions.
Develop and configure all Collibra customized workflows.
Develop APIs (REST, SOAP) to expose the metadata functionalities to end-users.

Location: Pan India
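A Collibra REST integration of the kind described might start with a request like the one below. The path follows Collibra's REST 2.0 asset-search convention, but the host, the query values, and the `nameMatchMode` setting shown here are assumptions, and the request is only built, not sent:

```python
import urllib.parse
import urllib.request

def build_asset_search(base_url, name, limit=10):
    """Build (but do not send) a GET that searches assets by name."""
    query = urllib.parse.urlencode({"name": name,
                                    "nameMatchMode": "ANYWHERE",
                                    "limit": limit})
    return urllib.request.Request(f"{base_url}/rest/2.0/assets?{query}")

# Placeholder host and search term for illustration only.
req = build_asset_search("https://collibra.example.com", "Customer", limit=5)
print(req.full_url)
```

A real integration would add authentication headers (session or basic auth per the instance's configuration) and parse the JSON results into whatever metadata view the end-users need.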

Posted 2 weeks ago

Apply

6.0 - 12.0 years

11 - 15 Lacs

Pune

Work from Office


As a Data Architect, you'll design and optimize data architecture to ensure data is accurate, secure, and accessible. You'll collaborate across teams to shape the data strategy, implement governance, and promote best practices, enabling the business to gain insights, innovate, and make data-driven decisions at scale.

Your responsibilities
Responsible for defining the enterprise data architecture which streamlines, standardises, and enhances accessibility of organisational data.
Elicits data requirements from senior business stakeholders and the broader IS function, translating their needs into conceptual, logical, and physical data models.
Oversees the effective integration of data from various sources, ensuring data quality and consistency.
Monitors and optimises data performance, collaborating with Data Integration and Product teams to deliver changes that improve data performance.
Supports the Business, Data Integration Platforms team and wider IS management to define a data governance framework that sets out how data will be governed, accessed, and secured across the organisation; supports the operation of the data governance model as a subject matter advisor.
Provides advisory to Data Platform teams in defining the Data Platform architecture, advising on metadata, data integration, business intelligence, and data storage needs.
Supports the Data Integration Platforms team and other senior IS stakeholders to define a data vision and strategy, setting out how the organisation will exploit its data for maximum business value.
Builds and maintains a repository of data architecture artefacts (e.g., data dictionary).

What we're looking for
Proven track record in defining enterprise data architectures, data models, and database/data warehouse solutions.
Evidenced ability to advise on the use of key data platform architectural components (e.g., Azure Lakehouse, Databricks, etc.) to deliver and optimise the enterprise data architecture.
Experience in data integration technologies, real-time data ingestion, and API-based integrations.
Experience in SQL and other database management systems.
Strong problem-solving skills for interpreting complex data requirements and translating them into feasible data architecture solutions and models.
Experience in supporting the definition of an enterprise data vision and strategy, advising on implications and/or uplifts required to the enterprise data architecture.
Experience designing and establishing data governance models and data management practices, ensuring data is correct and secure whilst still being accessible, in line with regulations and wider organisational policies.
Able to present complex data-related initiatives and issues to senior, non-data-conversant audiences.
Proven experience working with AI and Machine Learning models preferred, but not essential.

What we can offer you
We support your growth within the role, department, and across the company through internal opportunities. We offer a hybrid working model, allowing you to combine remote work with the opportunity to connect with your team in modern, welcoming office spaces. We encourage continuous learning with access to online platforms (e.g., LinkedIn Learning), language courses, soft skills training, and various wellbeing initiatives, including workshops and webinars. Join a diverse and inclusive work environment where your ideas are valued and your contributions make a difference.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

25 - 27 Lacs

Bengaluru

Work from Office


The role will be responsible for setting up the data warehouses necessary to handle large volumes of data, create meaningful analyses, and deliver recommendations to leadership.

Core Responsibilities
Create and maintain optimal data pipeline architecture (ETL/ELT into structured data).
Assemble large, complex data sets that meet business requirements; create and maintain multi-dimensional models such as Star Schema and Snowflake Schema, including normalization, de-normalization, and joining of datasets.
Expert-level experience in creating a scalable data warehouse, including fact tables and dimension tables, and ingesting datasets into cloud-based tools.
Identify, design, and implement internal process improvements, including automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Collaborate with stakeholders to ensure seamless integration of data with internal data marts, enhancing advanced reporting.
Set up and maintain data ingestion, streaming, scheduling, and job-monitoring automation using AWS services; Lambda, CodePipeline (CI/CD), Glue, S3, Redshift, and Power BI need to be maintained for uninterrupted automation.
Build analytics tools that utilize the data pipeline to provide actionable insight into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
Utilize GitHub for version control, code collaboration, and repository management; implement best practices for code reviews, branching strategies, and continuous integration.
Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
Ensure data privacy and compliance with relevant regulations (e.g., GDPR) when handling customer data.
Maintain data quality and consistency within the application, addressing data-related issues as they arise.

Required
7-10 years of relevant experience.
Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases and cloud data warehouses such as AWS Redshift.
Experience creating scalable, efficient schema designs to support diverse business needs.
Experience with database normalization, schema evolution, and maintaining data integrity.
Proactively share best practices, contributing to team knowledge and improving schema design transitions.
Develop data models, create dimensions and facts, and establish views and procedures to enable automation and programmability.
Collaborate effectively with cross-functional teams to gather requirements, incorporate feedback, and align analytical work with business objectives.
Prior data modelling and OLAP cube modelling experience; data compression into Parquet to improve processing; fine-tuned SQL programming skills.
Experience building and optimizing big data pipelines, architectures, and data sets.
Experience performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Experience manipulating, processing, and extracting value from large, disconnected, unrelated datasets.
Strong analytic skills related to working with structured and unstructured datasets.
Working knowledge of message queuing, stream processing, and highly scalable big data stores.
Experience supporting and working with cross-functional teams and Global IT.
Familiarity with agile working models.

Preferred Qualifications/Expertise
Experience with relational SQL and NoSQL databases, especially AWS Redshift.
Experience with AWS cloud services, preferably S3, EC2, Lambda, Glue, EMR, and CodePipeline; experience with similar services on another platform would also be considered.

Education: Bachelor's or Master's degree with a Technology or Computer Science background.
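The star-schema modelling this role asks for (fact tables keyed to dimension tables, joined and aggregated for reporting) can be sketched with Python's built-in sqlite3 standing in for a cloud warehouse; all table and column names here are hypothetical, chosen only for illustration.

```python
import sqlite3

# In-memory database standing in for a warehouse such as AWS Redshift.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per customer.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    )
""")

# Fact table: additive measures keyed to the dimension (a star schema in miniature).
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    )
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "South"), (2, "Globex", "North")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# A typical analytical query: join the fact to its dimension, aggregate by region.
cur.execute("""
    SELECT d.region, SUM(f.amount) AS total_sales
    FROM fact_sales AS f
    JOIN dim_customer AS d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""")
print(cur.fetchall())  # [('North', 75.0), ('South', 350.0)]
```

The same fact/dimension split scales to Snowflake schemas by normalizing the dimension further (e.g., a separate region table referenced from dim_customer).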

Posted 2 weeks ago


0.0 - 2.0 years

12 - 14 Lacs

Perinthalmanna

Work from Office


The Data Analyst will play a key role in transforming raw data into actionable insights to support strategic decision-making and enhance operational efficiency. The ideal candidate is a detail-oriented professional with strong analytical skills, proficiency in data tools, and the ability to communicate findings effectively to diverse stakeholders.

Key Responsibilities
Collect, analyze, and interpret data from multiple sources to provide actionable insights.
Deliver regular and ad-hoc reports, dashboards, and analytical solutions to stakeholders.
Develop high-quality, user-friendly reports and dashboards with self-service capabilities.
Write SQL queries to manipulate and extract insights from large datasets.
Collaborate with teams to ensure data quality and improve data collection and reporting processes.
Present data-driven findings and recommendations in a clear and concise manner.
Work closely with leadership to understand business needs and provide tailored analytical solutions.
Identify trends, patterns, and opportunities to support operational and strategic goals.

Experience
Proven experience in data analysis, reporting, and deriving insights in a fast-paced environment.
Strong proficiency in manipulating and interpreting large datasets.
Experience working with business stakeholders to translate data into actionable outcomes.
Familiarity with data visualization and reporting tools.

Skills
Demonstrated ability to deliver high-quality analytical reports and insights.
Commitment to maintaining data accuracy and improving data processes.
Strong organizational skills with the ability to manage multiple tasks and deadlines.
Interpersonal skills to work effectively with cross-functional teams and stakeholders.
Experience with Zoho Analytics or similar tools is highly desirable but not mandatory.
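The kind of ad-hoc aggregation a report like those described above is built on can be sketched in a few lines of pure Python; the records and field names below are made up for illustration.

```python
from collections import defaultdict

# Hypothetical raw records, as might be extracted from a reporting database.
records = [
    {"team": "Ops",   "month": "Jan", "tickets": 120},
    {"team": "Ops",   "month": "Feb", "tickets": 95},
    {"team": "Sales", "month": "Jan", "tickets": 40},
    {"team": "Sales", "month": "Feb", "tickets": 55},
]

def summarise(rows):
    """Aggregate ticket counts per team -- the core of a simple ad-hoc report."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["team"]] += row["tickets"]
    return dict(totals)

print(summarise(records))  # {'Ops': 215, 'Sales': 95}
```

In practice the same grouping would be pushed down into SQL (GROUP BY) or a BI tool; the logic is identical.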

Posted 2 weeks ago


3.0 - 8.0 years

30 - 35 Lacs

Bengaluru

Work from Office


We are seeking a Data Scientist to join our data science team in Bangalore. This individual contributor role will be responsible for driving data-driven decision making across the organization, collaborating with business stakeholders to understand their problems and design data-based solutions for them.

Key responsibilities include:
1. Develop and implement machine learning models, perform data analysis, and create predictive models to support the Client Services teams.
2. Collaborate with team members and stakeholders to understand their data science needs and design solutions with guidance from the team.
3. Conduct research to identify new methods and techniques to improve existing models.
4. Create visualizations to communicate complex data and insights in a clear and effective manner.
5. Ensure data quality throughout all stages of acquisition and processing, including data cleaning, normalization, and transformation.
6. Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.

Basic Qualifications
Bachelor's degree, OR 3+ years of relevant work experience.

Preferred Qualifications
1 or more years of work experience with a Bachelor's degree or an advanced degree (e.g., Master's, MBA, JD, MD) in data science, machine learning, AI, or a related field.
Proficient in Python, R, SQL, and other data science tools.
Knowledge of big data platforms like Hadoop, Spark, or similar technologies.
Knowledgeable in machine learning and generative AI techniques and algorithms.
Familiarity with large language models such as OpenAI's GPT-3, and open-source models like Falcon or Llama 2.
Proficient in Machine Learning Operations (MLOps), including deployment and maintenance of machine learning models.
Excellent problem-solving skills and ability to think critically.
Strong communication skills to clearly articulate the message with team members and stakeholders.
Knowledge of cloud platforms like AWS, GCP, or Azure is good to have.
Familiarity with data visualization tools like Tableau, Power BI, or similar.
Ability to work in a team and independently as required.
Familiarity with containerization technologies like Docker or Kubernetes.
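A predictive model in the sense used above can be as small as a one-variable least-squares fit; this pure-Python sketch (with made-up ad-spend data) shows the basic shape of the workflow, fit then predict, without any ML library.

```python
def fit_linear(xs, ys):
    """Closed-form least squares for y = a*x + b with one feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var              # slope
    b = mean_y - a * mean_x    # intercept
    return a, b

# Hypothetical training data: weekly ad spend (units) vs. sign-ups (thousands).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.1, 6.1, 8.1]

a, b = fit_linear(xs, ys)

def predict(x):
    return a * x + b

print(round(predict(5.0), 2))  # 10.1
```

Real work would swap this for scikit-learn or similar tooling and add train/test splits, but the fit/predict contract stays the same.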

Posted 2 weeks ago


1.0 - 4.0 years

4 - 8 Lacs

Chennai

Work from Office


The analyst shall define the data validation process and perform manual reviews; review data sources and ensure alignment with data providers; ensure data quality standards are met; and perform gap analyses and design solutions for improving data quality. Experience with cost allocation methodology and infrastructure inventory/capacity data is expected.
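A data validation process of the kind this role describes often starts as a small set of rules applied per record; this sketch uses hypothetical infrastructure-inventory fields to show the idea.

```python
def validate_record(record):
    """Return a list of data quality issues found in one inventory record."""
    issues = []
    if not record.get("asset_id"):
        issues.append("missing asset_id")
    capacity = record.get("capacity_gb")
    if capacity is None:
        issues.append("missing capacity_gb")
    elif capacity <= 0:
        issues.append("capacity_gb must be positive")
    if record.get("cost_center") is None:
        issues.append("missing cost_center (needed for cost allocation)")
    return issues

records = [
    {"asset_id": "srv-01", "capacity_gb": 512, "cost_center": "CC-7"},
    {"asset_id": "", "capacity_gb": -1},
]

# A gap analysis is then just the set of records failing any rule.
report = {i: validate_record(r) for i, r in enumerate(records) if validate_record(r)}
print(report)
```

From here, each rule maps naturally to a remediation task with the relevant data provider.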

Posted 2 weeks ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies