
758 Metadata Jobs - Page 9

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

2.0 - 7.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Source: Naukri

This role will be part of a team that develops software interfaces with all the major TV streaming providers and downloads tens of thousands of movies and TV shows from US and international platforms. This team is at the heart of Nielsen's streaming measurement strategy and continually updates its software to keep pace with the quickly evolving streaming environment. The MTS is ultimately responsible for delivering technical solutions, from project onboarding through post-launch support, including development, testing, and user acceptance, and is expected to work with multiple distributed project teams across regions. You will work on our video and metadata capture systems, processing large audio files using proprietary algorithms to generate audio signatures/fingerprints. Your role will involve implementing and maintaining robust, scalable solutions that leverage Python/SQL code optimized for the AWS platform. You will play a key role in shaping the technical direction of our projects.

Responsibilities: System Deployment: Build new features in the existing video and metadata asset capture systems. CI/CD Implementation: Leverage CI/CD pipelines for automated build, test, and deployment processes; ensure continuous integration and delivery of features, improvements, and bug fixes. Code Quality and Best Practices: Adhere to coding standards, best practices, and design principles; participate in code reviews and provide constructive feedback to maintain high code quality. Performance Optimization: Assist in the identification of performance bottlenecks in both client-side and data upload components; optimize applications for remote/unassisted installation. Team Collaboration: Follow best practices and collaborate with cross-functional teams to ensure a cohesive and unified approach to software development. Security and Compliance: Implement security best practices for all tiers of the system; ensure compliance with industry standards and regulations related to AWS platform security.

Key Skills: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. Proven experience, with a minimum of 2 years of software development on the AWS platform. Experience with scripting languages such as Python. Good experience with SQL and a database system such as Postgres. Good understanding of CI/CD principles and tools; GitLab a plus. Good problem-solving and debugging skills. Good communication and collaboration skills, with the ability to communicate complex technical concepts and align the organization on decisions. Uses team collaboration to contribute to innovative solutions efficiently.

Other desirable skills: Knowledge of networking principles and security best practices. AWS certifications. Experience with test automation suites using the Selenium framework.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or other characteristics protected by law.
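As a loose illustration of the Python-on-AWS capture work described above, the sketch below hashes an audio file and stores the asset and its metadata in S3 with boto3. The bucket name, key layout, and the toy fingerprint function are assumptions for illustration, not the team's proprietary signature algorithm.

```python
# Illustrative only: a simplified capture-and-upload flow on AWS.
# Bucket names, key layout, and the toy "fingerprint" are assumptions.
import hashlib
import json

import boto3

s3 = boto3.client("s3")

def fingerprint_audio(path: str, chunk_size: int = 1 << 20) -> str:
    """Return a stable digest of the audio payload (a stand-in for a real signature)."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def capture_asset(local_audio: str, asset_id: str, bucket: str = "example-capture-bucket") -> None:
    """Store the raw asset and its metadata side by side in S3."""
    signature = fingerprint_audio(local_audio)
    metadata = {"asset_id": asset_id, "signature": signature}
    s3.upload_file(local_audio, bucket, f"assets/{asset_id}/audio.wav")
    s3.put_object(
        Bucket=bucket,
        Key=f"assets/{asset_id}/metadata.json",
        Body=json.dumps(metadata).encode("utf-8"),
    )

if __name__ == "__main__":
    capture_asset("sample.wav", "asset-0001")
```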

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Mohali

Work from Office

Source: Naukri

VNG Medical Innovation System is looking for a Data Analyst to join our dynamic team and embark on a rewarding career journey. Managing master data, including creation, updates, and deletion. Managing users and user roles. Providing quality assurance of imported data, working with quality assurance analysts if necessary. Commissioning and decommissioning of data sets. Processing confidential data and information according to guidelines. Helping develop reports and analysis. Managing and designing the reporting environment, including data sources, security, and metadata. Supporting the data warehouse in identifying and revising reporting requirements. Supporting initiatives for data integrity and normalization. Assessing tests and implementing new or upgraded software, and assisting with strategic decisions on new systems. Generating reports from single or multiple systems. Troubleshooting the reporting database environment and reports. Evaluating changes and updates to source production systems. Training end users on new reports and dashboards. Providing technical expertise in data storage structures, data mining, and data cleansing.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

2 - 6 Lacs

Visakhapatnam

Work from Office

Source: Naukri

Valita Technology Pvt Ltd is looking for a Data Analyst to join our dynamic team and embark on a rewarding career journey. Managing master data, including creation, updates, and deletion. Managing users and user roles. Providing quality assurance of imported data, working with quality assurance analysts if necessary. Commissioning and decommissioning of data sets. Processing confidential data and information according to guidelines. Helping develop reports and analysis. Managing and designing the reporting environment, including data sources, security, and metadata. Supporting the data warehouse in identifying and revising reporting requirements. Supporting initiatives for data integrity and normalization. Assessing tests and implementing new or upgraded software, and assisting with strategic decisions on new systems. Generating reports from single or multiple systems. Troubleshooting the reporting database environment and reports. Evaluating changes and updates to source production systems. Training end users on new reports and dashboards. Providing technical expertise in data storage structures, data mining, and data cleansing. Data Center Operations designs, installs, and maintains the world's largest cloud computing infrastructure.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

We are hiring a Data Governance Analyst (Level 3) for a US-based IT company located in Hyderabad. Candidates with experience in data governance can apply.

Job Title: Data Governance Analyst Level 3. Location: Hyderabad. Experience: 7+ years. CTC: 18 LPA - 20 LPA. Working shift: Day shift.

Description: We are seeking a seasoned and detail-focused Senior Data Governance Analyst (Level 3) to support and drive enterprise-wide data governance initiatives. This position will play a crucial role in implementing and managing data governance frameworks, ensuring data quality, and supporting regulatory compliance efforts across business units. The ideal candidate will bring deep expertise in data governance best practices, data quality management, metadata, and compliance standards within the financial services industry. As a senior team member, the analyst will collaborate closely with data stewards, business stakeholders, and technical teams to ensure consistent, accurate, and trusted use of enterprise data.

Key Responsibilities: Implement and enhance data governance policies, standards, and processes across the organization. Partner with business and technical teams to define and manage data ownership, stewardship, and accountability models. Maintain and improve metadata and data lineage documentation using tools such as Collibra, Alation, or similar platforms. Monitor key data quality metrics, conduct root cause analysis, and lead issue resolution efforts. Ensure compliance with regulatory data requirements (e.g., BCBS 239, GDPR, CCPA). Facilitate and lead data governance meetings, working groups, and stakeholder communications. Support the creation and deployment of data literacy initiatives across the enterprise. Document governance practices and develop reports for audits and executive leadership. Serve as a subject matter expert in data governance and promote data management best practices across departments.

Required Skills & Qualifications: 5+ years of experience in data governance, data quality, or data management roles. Proven experience in developing and managing data governance frameworks in complex organizational environments. Strong understanding of data quality principles, data standards, and issue management workflows. Experience with metadata management, data cataloging, and lineage tracking. Proficiency with governance tools like Collibra, Alation, or similar platforms. Solid grasp of data compliance and regulatory standards in the financial services sector. Excellent communication, stakeholder engagement, and documentation skills. Strong analytical thinking and problem-solving capabilities.

Preferred Qualifications: Experience in banking or financial services environments. Understanding of enterprise data architecture, Master Data Management (MDM), and BI/reporting systems. Knowledge of data privacy regulations such as GDPR, CCPA, etc. Experience working within Agile project methodologies.

For further assistance, contact/WhatsApp 9354909517 or write to hema@gist.org.in.
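For context on the data quality monitoring mentioned above, here is a small, hypothetical pandas sketch of the kind of completeness and uniqueness metrics a governance analyst might track; the column names and sample data are illustrative only.

```python
# Illustrative data-quality checks; column names and sample data are hypothetical.
import pandas as pd

def quality_metrics(df: pd.DataFrame, key_column: str) -> dict:
    """Compute simple completeness, uniqueness, and duplication metrics."""
    total = len(df)
    return {
        "row_count": total,
        "null_rate_by_column": df.isna().mean().round(4).to_dict(),
        "duplicate_key_rate": 0.0 if total == 0 else float(df[key_column].duplicated().mean()),
    }

if __name__ == "__main__":
    customers = pd.DataFrame(
        {"customer_id": [1, 2, 2, 4], "email": ["a@x.com", None, "c@x.com", "d@x.com"]}
    )
    print(quality_metrics(customers, key_column="customer_id"))
```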

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Source: Naukri

Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Multiplatform Front End Development (React), Amazon Web Services (AWS), React Native, React.js. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve working with Amazon Web Services (AWS), React.js, and React Native to develop multiplatform front-end applications.

Key Responsibilities: 1. Understand requirements and be involved in design and implementation. 2. Collaborate with peers who have domain expertise to build the right solution the business needs. 3. Be self-driven and capable of managing multiple priorities under pressure and ambiguity. 4. Work effectively in a fast-paced environment. 5. Bring a keen eye for usability, creating intuitive, visually appealing experiences. 6. The UI will be used by consumers to extract the relevant data from the metadata repository; development work in the search space using Elasticsearch.

Technical Experience: 1. 7+ years' experience developing with ReactJS. 2. State management with Redux. 3. Strong fundamental JavaScript skills (ES5 and ES6) and CSS skills. 4. Experience with TypeScript or ClojureScript is a plus. 5. Thorough understanding of React.js and its core principles; React combined with Flux/Redux experience is preferred. 6. Experience developing component-driven UIs. 7. Experience with data structure libraries. 8. Knowledge of performance optimization techniques. 9. Knowledge of AWS services and deployment is an advantage.

Additional Information: The candidate should have a minimum of 5 years of experience in Multiplatform Front End Development (React). The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful multiplatform front-end solutions. Ready to work in B shift (12 PM to 10 PM).

Qualifications: 15 years full time education.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Developer. Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Veeva Vault. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Offshore Migration Lead, you will oversee and coordinate offshore migration execution into the Veeva Vault platform. You will lead a team of migration specialists, analysts, and BAs, apply hands-on expertise in Vault migrations, SQL, and RDBMS, and collaborate with onshore counterparts to execute plans, manage timelines, resolve issues, and ensure compliance with quality standards.

Roles & Responsibilities:
- Lead and mentor a team of offshore migration specialists handling execution of document and metadata migration tasks.
- Review deliverables and ensure adherence to migration standards, best practices, and compliance expectations.
- Manage work allocation, backlog tracking, and progress reporting for offshore migration tasks.
- Monitor the completion of daily/weekly migration targets, ensuring on-time and accurate delivery.
- Perform root cause analysis on migration errors and coordinate with technical teams to resolve Vault Loader or API issues.
- Validate output quality through spot checks, sampling, and test case validations.
- Provide hands-on support when needed for migration jobs, SQL-driven data transformation, and validation checks.
- Troubleshoot migration errors using Vault logs and work with developers or Vault SMEs to resolve blockers.
- Act as the primary offshore contact for the onshore Migration Lead or Project Manager.
- Ensure the offshore team follows controlled migration procedures and documentation protocols.
- Maintain audit trails, job trackers, and version-controlled artifacts.

Professional & Technical Skills:
- Must-have: Hands-on experience with Vault Loader and Vault REST APIs for document and object migration.
- Strong command of SQL for data extraction, transformation, and validation.
- Experience working with CSV, XML, and JSON payloads and migration packaging.
- Strong leadership and coordination skills in an offshore delivery model.
- Excellent communication skills for daily sync-ups, reporting, and issue escalations.
- Attention to detail, quality orientation, and ability to manage workload under deadlines.
- Familiarity with regulatory requirements in GxP and 21 CFR Part 11 contexts.
- Familiarity with Vault metadata models, document types, lifecycles, and object structures.
- Experience with PromoMats, MedComms, Quality Suite, RIMS, Clinical, and other Vault domains.
- Proficiency in working with RDBMSs such as Oracle, SQL Server, PostgreSQL, or MySQL.
- Experience in writing complex joins, subqueries, case statements, and data cleansing scripts.
- Familiarity with legacy content/document systems such as Documentum, SharePoint, Calyx Insight, and OpenText.
- Experience leading offshore migration teams for Veeva Vault projects.
- Prior experience in regulated environments (GxP, 21 CFR Part 11) is required.
- A minimum of 3-5 years of experience in Vault migrations is expected.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Computer System Validation (CSV).
- This position is PAN-India based.
- A 15 years full-time education is required.

Qualification: 15 years full time education.
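As a small illustration of the SQL/CSV validation work such a migration lead reviews, here is a hypothetical pandas check over a Vault Loader-style manifest; the file name and required column names are assumptions, and this is not Veeva's own tooling.

```python
# Illustrative pre-migration check on a Vault Loader-style CSV manifest.
# The required columns and file name are assumptions for this sketch.
import pandas as pd

REQUIRED_COLUMNS = ["external_id__v", "name__v", "type__v", "lifecycle__v", "file"]

def validate_manifest(path: str) -> pd.DataFrame:
    """Return rows that would fail basic checks before a migration run."""
    manifest = pd.read_csv(path, dtype=str)
    missing = [c for c in REQUIRED_COLUMNS if c not in manifest.columns]
    if missing:
        raise ValueError(f"Manifest is missing required columns: {missing}")
    problems = manifest[
        manifest[REQUIRED_COLUMNS].isna().any(axis=1)          # incomplete metadata
        | manifest["external_id__v"].duplicated(keep=False)    # duplicate identifiers
    ]
    return problems

if __name__ == "__main__":
    bad_rows = validate_manifest("document_manifest.csv")
    print(f"{len(bad_rows)} rows need attention before loading")
```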

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Designer. Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements. Must-have skills: Veeva Vault. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Offshore Migration Lead, you will oversee and coordinate offshore migration execution into the Veeva Vault platform. You will lead a team of migration specialists, analysts, and BAs, apply hands-on expertise in Vault migrations, SQL, and RDBMS, and collaborate with onshore counterparts to execute plans, manage timelines, resolve issues, and ensure compliance with quality standards.

Roles & Responsibilities:
- Lead and mentor a team of offshore migration specialists handling execution of document and metadata migration tasks.
- Review deliverables and ensure adherence to migration standards, best practices, and compliance expectations.
- Manage work allocation, backlog tracking, and progress reporting for offshore migration tasks.
- Monitor the completion of daily/weekly migration targets, ensuring on-time and accurate delivery.
- Perform root cause analysis on migration errors and coordinate with technical teams to resolve Vault Loader or API issues.
- Validate output quality through spot checks, sampling, and test case validations.
- Provide hands-on support when needed for migration jobs, SQL-driven data transformation, and validation checks.
- Troubleshoot migration errors using Vault logs and work with developers or Vault SMEs to resolve blockers.
- Act as the primary offshore contact for the onshore Migration Lead or Project Manager.
- Ensure the offshore team follows controlled migration procedures and documentation protocols.
- Maintain audit trails, job trackers, and version-controlled artifacts.

Professional & Technical Skills:
- Must-have: Hands-on experience with Vault Loader and Vault REST APIs for document and object migration.
- Strong command of SQL for data extraction, transformation, and validation.
- Experience working with CSV, XML, and JSON payloads and migration packaging.
- Strong leadership and coordination skills in an offshore delivery model.
- Excellent communication skills for daily sync-ups, reporting, and issue escalations.
- Attention to detail, quality orientation, and ability to manage workload under deadlines.
- Familiarity with regulatory requirements in GxP and 21 CFR Part 11 contexts.
- Familiarity with Vault metadata models, document types, lifecycles, and object structures.
- Experience with PromoMats, MedComms, Quality Suite, RIMS, Clinical, and other Vault domains.
- Proficiency in working with RDBMSs such as Oracle, SQL Server, PostgreSQL, or MySQL.
- Experience in writing complex joins, subqueries, case statements, and data cleansing scripts.
- Familiarity with legacy content/document systems such as Documentum, SharePoint, Calyx Insight, and OpenText.
- Experience leading offshore migration teams for Veeva Vault projects.
- Prior experience in regulated environments (GxP, 21 CFR Part 11) is required.
- A minimum of 3-5 years of experience in Vault migrations is expected.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Computer System Validation (CSV).
- This position is PAN-India based.
- A 15 years full-time education is required.

Qualification: 15 years full time education.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Veeva Vault. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Offshore Migration Lead, you will oversee and coordinate offshore migration execution into the Veeva Vault platform. You will lead a team of migration specialists, analysts, and BAs, apply hands-on expertise in Vault migrations, SQL, and RDBMS, and collaborate with onshore counterparts to execute plans, manage timelines, resolve issues, and ensure compliance with quality standards.

Roles & Responsibilities:
- Lead and mentor a team of offshore migration specialists handling execution of document and metadata migration tasks.
- Review deliverables and ensure adherence to migration standards, best practices, and compliance expectations.
- Manage work allocation, backlog tracking, and progress reporting for offshore migration tasks.
- Monitor the completion of daily/weekly migration targets, ensuring on-time and accurate delivery.
- Perform root cause analysis on migration errors and coordinate with technical teams to resolve Vault Loader or API issues.
- Validate output quality through spot checks, sampling, and test case validations.
- Provide hands-on support when needed for migration jobs, SQL-driven data transformation, and validation checks.
- Troubleshoot migration errors using Vault logs and work with developers or Vault SMEs to resolve blockers.
- Act as the primary offshore contact for the onshore Migration Lead or Project Manager.
- Ensure the offshore team follows controlled migration procedures and documentation protocols.
- Maintain audit trails, job trackers, and version-controlled artifacts.

Professional & Technical Skills:
- Must-have: Hands-on experience with Vault Loader and Vault REST APIs for document and object migration.
- Strong command of SQL for data extraction, transformation, and validation.
- Experience working with CSV, XML, and JSON payloads and migration packaging.
- Strong leadership and coordination skills in an offshore delivery model.
- Excellent communication skills for daily sync-ups, reporting, and issue escalations.
- Attention to detail, quality orientation, and ability to manage workload under deadlines.
- Familiarity with regulatory requirements in GxP and 21 CFR Part 11 contexts.
- Familiarity with Vault metadata models, document types, lifecycles, and object structures.
- Experience with PromoMats, MedComms, Quality Suite, RIMS, Clinical, and other Vault domains.
- Proficiency in working with RDBMSs such as Oracle, SQL Server, PostgreSQL, or MySQL.
- Experience in writing complex joins, subqueries, case statements, and data cleansing scripts.
- Familiarity with legacy content/document systems such as Documentum, SharePoint, Calyx Insight, and OpenText.
- Experience leading offshore migration teams for Veeva Vault projects.
- Prior experience in regulated environments (GxP, 21 CFR Part 11) is required.
- A minimum of 3-5 years of experience in Vault migrations is expected.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Computer System Validation (CSV).
- This position is PAN-India based.
- A 15 years full-time education is required.

Qualification: 15 years full time education.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Lead. Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Collibra Data Governance. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure effective communication within the team and with stakeholders.

Professional & Technical Skills:
- Must-have: Proficiency in Collibra Data Governance.
- Strong understanding of data governance principles.
- Experience in implementing data governance frameworks.
- Knowledge of data quality management practices.
- Hands-on experience in data cataloging and metadata management.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Collibra Data Governance.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

9 - 12 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Work from Office

Source: Naukri

Hi, we are hiring for a leading ITES company for a Clinical Data Manager profile.

Role & responsibilities: The candidate should have 2-5 years of experience in Clinical Data Management (CDM), with experience in the Conduct scope of work. Perform day-to-day clinical data management activities. Work and coordinate with the team to perform data management activities and deliver an error-free, quality database in accordance with the data management plan and regulatory standards. Read and understand the study protocol and the timelines. Perform test data entry in the TEST environment, data listing review, data reconciliation, and query management tasks. Escalate/action discrepancies in the clinical data as appropriate. Perform external checks to handle manual discrepancies and action the same. Ensure error-free, quality data with no open queries. Escalate any discrepancy in the clinical data to the study lead as appropriate. Ensure timely completion of training and any other tasks deemed appropriate. Perform medical data collection and analysis of prostate cancer data using databases like HIS/EMR (Electronic Medical Record), Caisis, and Rave; CDM (startup, conduct, closeout). Client interaction and meetings. Bring up new ideas and execute new plans to cope with the backlog. Train new team members as and when required.

To apply, WhatsApp 'Hi' @ 9151555419 and follow the steps below:
> Click on the Start option to apply and fill in the details.
> Select the location as Other (to get multiple location options).
a) To apply for the above job role (Mumbai), type: Job Code # 205
b) To apply for the above job role (Pune), type: Job Code # 206
c) To apply for the above job role (Bangalore), type: Job Code # 207

Posted 2 weeks ago

Apply

1.0 - 5.0 years

12 - 16 Lacs

Chennai

Work from Office

Source: Naukri

Role Overview: We are seeking a Research Analytics Manager with a strong technical background and a passion for social impact. The candidate will have experience and the ability to work with large-scale, nationally representative household surveys and datasets, and must demonstrate experience in acquiring, structuring, analyzing, and cleaning data. This project is part of a broader effort to create informed, data-driven ways to estimate the real cost of ending extreme poverty. Using household survey data across multiple countries, the effort will provide detailed and actionable cost estimates to support governments and NGOs in planning poverty reduction programs. Ultimately, the work aims to impact global efforts to eliminate poverty by equipping decision-makers with more precise tools and insights.

Immediate engagement: This project requires supervision of the analysis, processing, and cleaning of around 20 large country household sample survey datasets. The supervisory role requires deep experience and familiarity with large-scale, nationally representative household surveys such as LSMS or DHS, or comparable Indian datasets such as NSSO, NFHS, or the Periodic Labour Force Survey (PLFS), as well as experience and interest in development economics, preferably in poverty mapping and analysis of the costs of ending extreme poverty.

Responsibilities: Determine relevant modules and variables from large-scale household surveys to compile datasets for the purpose of informing economic models of the cost of ending extreme poverty. Oversee and guide data analysts in cleaning, organizing, and preparing large-scale, nationally representative household survey datasets for analysis, providing technical support and troubleshooting as needed. Ensure clear documentation, data quality, and consistency across multiple modules and potentially multiple surveys; this includes preparing clear metadata tables and documentation on the inclusion or exclusion of modules and variables. Lead and oversee the data cleaning process to ensure high-quality, error-free datasets, including handling missing values (NaNs), identifying and resolving anomalies, and maintaining consistency across variables. Coordinate with research leads and external collaborators to align data preparation with the analytical goals of the poverty mapping study. Utilize Python, R, or Stata for statistical modelling, econometric analysis, and predictive analytics.

Additional responsibilities may include: Providing technical guidance and/or quality control on potential project proposals. Acting as technical lead/writer or subject matter expert for proposals. Identifying and meeting with economics experts, academics, and/or other research organizations on potential new research analytics opportunities.

Required Qualifications and Skills: Master's degree in economics, statistics, international development, public policy, social development, or another related field required. At least 5 years of experience with quantitative data analysis, program evaluation, policy analysis, and proposal and report writing. Passionate about using data for social impact; experience working in international development, public health, environmental sustainability, or related fields is a plus. Experience working with demographic datasets such as DHS, NSSO, World Bank data, etc. Desirable: experience and technical knowledge of poverty estimation in LMICs, and an understanding of data ethics and responsible data use in diverse contexts. Must have excellent verbal and writing skills. Must have strong Microsoft Word and Excel experience. Ability to work collaboratively in a team environment, with staff from all education and experience levels, as well as across various geographic locations.

Additional Requirements: The successful candidate must not be subject to employment restrictions from a former employer (such as a non-compete) that would prevent the candidate from performing the job responsibilities as described. This position may require successful completion of a reference check and employment verification.

Athena Infonomics is an Equal Opportunity Employer: Athena Infonomics is an equal opportunity employer with a commitment to diversity. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
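To ground the survey-cleaning responsibilities above, here is a minimal, hypothetical pandas sketch; the column names (household_id, cluster, consumption, household_size) and the imputation choice are assumptions, not the project's actual methodology.

```python
# A minimal sketch of household-survey cleaning steps; columns are hypothetical.
import pandas as pd

def clean_household_module(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize identifiers, handle missing values, and flag anomalies."""
    cleaned = df.copy()
    # Normalize household identifiers and drop rows without one.
    cleaned["household_id"] = cleaned["household_id"].astype("string").str.strip()
    cleaned = cleaned.dropna(subset=["household_id"])
    # Impute missing consumption with the survey-cluster median (illustrative choice).
    cleaned["consumption"] = cleaned.groupby("cluster")["consumption"].transform(
        lambda s: s.fillna(s.median())
    )
    # Flag implausible values instead of silently dropping them.
    cleaned["anomaly_flag"] = (cleaned["consumption"] < 0) | (
        cleaned["household_size"] > 30
    )
    return cleaned
```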

Posted 2 weeks ago

Apply

2.0 - 3.0 years

8 - 10 Lacs

Gurugram

Work from Office

Source: Naukri

Job Title: Performance Marketing Specialist. Industry: Education. Experience: 2-3 years. Location: Gurugram.

Job Summary: We are seeking a highly motivated Performance Marketing Specialist to drive digital marketing campaigns and optimize online lead generation for our education sector initiatives. The ideal candidate should have 2-3 years of experience in managing paid advertising across multiple platforms and a strong understanding of performance marketing, analytics, and ROI-driven strategies.

Key Responsibilities: 1. Plan, execute, and optimize performance marketing campaigns across Google Ads, Meta Ads, LinkedIn Ads, and other paid media channels. 2. Develop and manage paid search and social campaigns to drive high-quality leads for educational programs. 3. Conduct A/B testing on ad creatives, landing pages, and audience targeting to maximize campaign performance. 4. Track and analyze key metrics, including CTR, CPC, conversion rates, and ROI, and generate performance reports. 5. Collaborate with content, design, and development teams to enhance ad creatives and landing page effectiveness. 6. Implement audience segmentation, retargeting strategies, and bid optimization for improved campaign efficiency. 7. Stay updated with industry trends, platform updates, and best practices to maintain a competitive edge in digital marketing. 8. Manage budgets effectively, ensuring maximum ROI on ad spend. 9. Leverage marketing automation tools and CRM platforms for lead nurturing and pipeline management.

Required Skills & Qualifications: 1. 2-3 years of hands-on experience in performance marketing, specifically in the education sector or related industries. 2. Proficiency in Google Ads, Meta Business Suite, LinkedIn Ads, and other digital advertising platforms. 3. Strong analytical skills with experience in Google Analytics, Tag Manager, and other tracking tools. 4. Understanding of audience segmentation, retargeting, and bid management strategies. 5. Knowledge of SEO principles and their impact on paid campaigns is a plus. 6. Experience with marketing automation tools like HubSpot or Marketo, or CRM platforms such as Salesforce, is a plus. 7. Ability to work in a fast-paced environment and manage multiple campaigns simultaneously. 8. Excellent communication and problem-solving skills.

Benefits: 1. Competitive salary with performance-based incentives. 2. Opportunity to work with a dynamic and growing team in the education sector. 3. Professional development and training opportunities. 4. Flexible work environment (as per company policy).

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Role: SharePoint Online. Experience: 3 to 5 years. Location: Hyderabad. Mandatory skills: SharePoint Online, SharePoint OneDrive, Metadata, Content Types. Rotational shifts.

Job Description: SharePoint Online Admin
1. Should have strong knowledge of the SharePoint Online environment.
2. Should know the different types of sites used in SharePoint Online.
3. Must have knowledge of SharePoint Online and OneDrive for Business limitations and known issues.
4. Hands-on experience with metadata, crawled properties, managed properties, content types, workflows, user profiles, and SharePoint Online Search.
5. Must be aware of the latest and deprecated features in SharePoint Online.
6. Should have basic knowledge of DirSync, Azure AD Sync, or Azure AD Connect.
7. Must know how point-in-time restore works in SharePoint Online.
8. Good knowledge of retention policies, DLP, and eDiscovery holds.
9. Needs knowledge of OneDrive for Business sync issues.
10. Should have an understanding of Office 365 groups and permissions in SharePoint Online.
11. Office 365 licensing.
12. Should have basic knowledge of Fiddler, search query tools, and SharePoint Designer.

Posted 2 weeks ago

Apply

8.0 - 9.0 years

8 - 13 Lacs

Pune

Work from Office

Source: Naukri

As a Senior Developer, you will play a pivotal role in the implementation and enhancement of projects. You will leverage your expertise to design, develop, and deliver innovative software solutions. This role requires a good understanding of software development methodologies. You will work closely with the Product Owner, the development team, and other stakeholders to implement product features and enhancements. You will be part of a highly motivated development community consisting of skilled individuals.

Role Responsibilities: You have experience developing software in Java. You have experience with SQL, preferably PostgreSQL. You are interested in working with cloud technologies and know how to use containers. You enjoy learning new technologies. You are interested in data-centred applications. You like working in a collaborative team where there is collective ownership of the product. You like getting involved with every stage of the software development lifecycle. You see failure as a chance to learn and welcome feedback. You are happy to deploy and operate your application following a DevOps approach.

What you will be doing: Within 3 months - Get familiar with our technology stack; our applications are deployed to Kubernetes and virtual machines using Concourse. Start making minor changes to our codebase. Live our agile process and team ceremonies. Become familiar with the existing system documentation.

By 3-6 months you will - Be a supportive member of the development of our applications by using the right technology solutions to solve the problem at hand. Understand in detail how our applications are designed. Take part in developing new features as a member of the tech team. Help to improve our technology stack. Understand the team's context within the publishing business we are working in. Be able to properly understand and discuss business requirements with stakeholders. Hold technical discussions with the team to improve the product architecture and code quality. Contribute to blameless post-mortems.

By 6-12 months you will - Contribute to driving our applications and architecture forward. Understand the system's scope and how it connects to other systems. Confidently make changes and implement new features in our codebase. Transform high-level requirements into actionable work. Add unit tests to our applications. Proactively provide useful and actionable feedback to team members. Be able to explain and visualize the benefits and trade-offs of proposed solutions. Participate in user research to better understand our users' needs. Understand our products and how we at Springer Nature operate.

Experience, skills and qualifications: Minimum 8 years of relevant experience. Ability to work independently as part of a team. Excellent communication and interpersonal skills. Able to write technical documentation. Designing new solutions keeping in mind technology trends. Application monitoring and prompt action taking. Root cause analysis and hot fixing. Demonstrated experience in quickly adapting to new industries, business models, and project environments. Ability to prioritize tasks and pivot as project needs change.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

Overview - Emmes Group: Building a better future for us all. Emmes Group is transforming the future of clinical research, bringing the promise of new medical discovery closer within reach for patients. Emmes Group was founded as Emmes more than 47 years ago, becoming one of the primary clinical research providers to the US government before expanding into public-private partnerships and commercial biopharma. Emmes has built industry-leading capabilities in cell and gene therapy, vaccines and infectious diseases, ophthalmology, rare diseases, and neuroscience. We believe the work we do will have a direct impact on patients' lives and act accordingly. We strive to build a collaborative culture at the intersection of being a performance- and people-driven company. We're looking for talented professionals eager to help advance clinical research as we work to embed innovation into the fabric of our company. If you share our motivations and passion in research, come join us!

OptymEdge is a global leader in ophthalmic endpoint certification, partnering with leading biopharma sponsors and CROs to ensure the quality and consistency of visual function data in clinical trials. With a reputation built on scientific expertise, operational excellence, and global delivery, we've become a trusted name in advancing treatments for sight-threatening diseases. As the field evolves, so do we. OptymEdge is expanding into technology-driven product development, creating a new generation of platforms that redefine how ophthalmic data is captured, analyzed, and leveraged across the clinical trial lifecycle. Our innovations span AI-powered imaging, digital examiner certification, and intelligent operational tools designed to anticipate trial needs, streamline oversight, and enhance decision-making. This is a rare opportunity to help shape transformative technology at the intersection of science, software, and sight, driving real-world impact in a field where every data point can influence patient vision.

Primary Purpose: We are seeking a Senior Software Engineer with 5-7 years of experience to design, develop, and optimize advanced ophthalmology imaging applications. You will play a critical role in developing the image viewer, annotation tools, image management workflows, and frontend-backend integration necessary for a scalable, clinical-grade ophthalmology imaging system. You will work closely with product, clinical, and engineering teams in an agile environment to deliver innovative, production-grade imaging solutions that redefine clinical outcomes.

Responsibilities: Design, build, and optimize interactive image viewers for the ophthalmologic imaging solution. Develop robust annotation tools supporting manual segmentation, labeling, corrections, and metadata capture. Implement image upload/download workflows, versioning, session handling, and data security. Collaborate with AI imaging engineers to integrate segmentation outputs into the viewer and annotation modules. Work with backend engineers to define and consume secure APIs for imaging data management. Ensure clinical-grade performance for large image sets (responsive, low-latency frontend experience). Write clean, maintainable, and well-documented code following software engineering best practices.

Qualifications: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. 5-7 years of professional software development experience building full-stack or frontend applications. Strong expertise in ReactJS (hooks, state management, component design) and modern frontend frameworks. Experience working with WebGL, Canvas, or advanced rendering libraries for imaging applications. Solid backend integration experience with RESTful APIs or GraphQL. Working knowledge of Python services (Flask/FastAPI) for API communication (you don't need to be a Python backend expert, but you should know how to consume and work with it). Exposure to handling large imaging files (DICOM, TIFF, etc.) and browser-based optimization techniques. Understanding of basic cloud concepts (AWS S3, CloudFront, authentication flows).

CONNECT WITH US! Follow us on Twitter - @EmmesCRO. Find us on LinkedIn - Emmes.
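As a rough sketch of the Python service integration mentioned in the qualifications, here is a minimal FastAPI upload endpoint with metadata capture; the route, form fields, and local storage path are assumptions rather than the actual platform design.

```python
# Minimal FastAPI sketch of an image upload endpoint with metadata capture.
# Paths, fields, and storage location are assumptions for illustration.
from pathlib import Path
from uuid import uuid4

from fastapi import FastAPI, File, Form, UploadFile

app = FastAPI()
STORAGE_DIR = Path("uploads")
STORAGE_DIR.mkdir(exist_ok=True)

@app.post("/images")
async def upload_image(
    file: UploadFile = File(...),
    patient_id: str = Form(...),
    laterality: str = Form("OD"),  # OD/OS: right or left eye
):
    image_id = str(uuid4())
    destination = STORAGE_DIR / f"{image_id}_{file.filename}"
    destination.write_bytes(await file.read())
    # In a real system the metadata would go to a database and the file to object storage.
    return {
        "image_id": image_id,
        "patient_id": patient_id,
        "laterality": laterality,
        "size_bytes": destination.stat().st_size,
    }
```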

Posted 2 weeks ago

Apply

3.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Columbia Sportswear Company is looking for a Sr Master Data Mgmt Analyst to join our dynamic team and embark on a rewarding career journey. Managing master data, including creation, updates, and deletion. Managing users and user roles. Providing quality assurance of imported data, working with quality assurance analysts if necessary. Commissioning and decommissioning of data sets. Processing confidential data and information according to guidelines. Helping develop reports and analysis. Managing and designing the reporting environment, including data sources, security, and metadata. Supporting the data warehouse in identifying and revising reporting requirements. Supporting initiatives for data integrity and normalization. Assessing tests and implementing new or upgraded software, and assisting with strategic decisions on new systems. Generating reports from single or multiple systems. Troubleshooting the reporting database environment and reports. Evaluating changes and updates to source production systems. Training end users on new reports and dashboards. Providing technical expertise in data storage structures, data mining, and data cleansing.

Posted 2 weeks ago

Apply

17.0 - 18.0 years

50 - 55 Lacs

Bengaluru

Work from Office

Source: Naukri

Overview of 66degrees: 66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values not only guide us in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role: We are seeking an experienced Data Architect to design, develop, and maintain our Google Cloud data architecture. The ideal candidate will have a strong background in data architecture, data engineering, and cloud technologies, with experience in managing data across Google Cloud platforms.

Responsibilities: GCP Cloud Architecture: Design, implement, and manage robust, scalable, and cost-effective cloud-based data architectures on Google Cloud Platform (GCP), leveraging services like BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud Dataproc, Cloud Run, and Cloud Composer; experience designing cloud architectures on Oracle Cloud is a plus. Data Modeling: Develop and maintain conceptual, logical, and physical data models to support various business needs. Big Data Processing: Design and implement solutions for processing large datasets using technologies such as Spark and Hadoop. Data Governance: Establish and enforce data governance policies, including data quality, security, compliance, and metadata management. Data Pipelines: Build and optimize data pipelines for efficient data ingestion, transformation, and loading. Performance Optimization: Monitor and tune data systems to ensure high performance and availability. Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and provide architectural guidance. Innovation: Stay current with the latest technologies and trends in data architecture and cloud computing.

Qualifications: GCP Core Services: In-depth knowledge of GCP data services, including BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud Dataproc, Cloud Run, and Cloud Composer. Data Modeling: Expertise in data modeling techniques and best practices. Big Data Technologies: Hands-on experience with Spark and Hadoop. Cloud Architecture: Proven ability to design scalable, reliable, and cost-effective cloud architectures. Data Governance: Understanding of data quality, security, compliance, and metadata management. Programming: Proficiency in SQL, Python, and dbt (Data Build Tool). Problem-Solving: Strong analytical and problem-solving skills. Communication: Excellent written and verbal communication skills. A Bachelor's degree in Computer Science, Computer Engineering, Data, or a related field, or equivalent work experience, is required. GCP Professional Data Engineer or Cloud Architect certification is a plus.

66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status, or other legally protected class.
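For flavor on the BigQuery work listed above, here is a brief, hypothetical example using the google-cloud-bigquery Python client; the project, dataset, and table names are placeholders.

```python
# Illustrative BigQuery query via the official Python client.
# The project, dataset, and table names are assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# Run the query and iterate over the result rows.
for row in client.query(query).result():
    print(row.event_date, row.events)
```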

Posted 2 weeks ago

Apply

1.0 - 4.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Tech | Permanent | Job Description

Be part of something bigger. Decode the future. At Electrolux, as a leading global appliance company, we strive every day to shape living for the better for our consumers, our people, and our planet. We share ideas and collaborate so that together we can develop solutions that deliver enjoyable and sustainable living. Come join us as you are. We believe diverse perspectives make us stronger and more innovative. In our global community of people from 100+ countries, we listen to each other, actively contribute, and grow together.

All about the role: We are looking for an Engineer to help drive our global MarTech strategy forward, with a particular focus on data engineering and data science to design and scale our customer data infrastructure. You will work closely with cross-functional teams - from engineering and product to data and CX teams - ensuring scalable, future-ready solutions that enhance both consumer and business outcomes. Great innovation happens when complexity is tamed and possibilities are unleashed. That's what we firmly believe! Join our team at Electrolux, where we lead digital transformation efforts. We specialize in developing centralized solutions to enhance inter-system communications, integrate third-party platforms, and establish ourselves as the master data source within Electrolux. Our focus is on delivering high-performance and scalable solutions that consistently achieve top-quality results on a global scale. Currently operating in Europe and North America, we are expanding our footprint to all regions worldwide.

About the CDI Experience Organization: The Consumer Direct Interaction Experience Organization is a digital product organization responsible for delivering tech solutions to our end users and consumers across both pre-purchase and post-purchase journeys. We are organized into 15+ digital product areas, providing solutions ranging from Contact Center, E-commerce, Marketing, and Identity to AI. You will play a key role in ensuring the right sizing, right skillset, and core competency across these product areas.

What you'll do: Design and implement scalable, secure data architectures that support advanced marketing use cases across platforms such as BlueConic (CDP), SAP CDC (Identity & Consent), Iterable (Marketing Automation), Qualtrics (Experience Management), and Dynamic Yield (Personalization). Define and govern data pipelines for collecting, processing, and enriching first-party and behavioral data from digital and offline touchpoints. Partner with Data Science teams to productionize machine learning models for audience segmentation, propensity scoring, content recommendations, and predictive analytics. Collaborate with Data Engineering and Cloud teams to build out event-driven and batch data flows using technologies such as Azure Data Factory, Databricks, Delta Lake, Azure Synapse, and Kafka. Lead the integration of MarTech data with enterprise data warehouses and data lakes, ensuring consistency, accessibility, and compliance. Translate business needs into scalable data models and transformation logic that empower marketing, analytics, and CX stakeholders. Establish data governance and quality frameworks, including metadata management, lineage tracking, and privacy compliance (GDPR, CCPA). Serve as a subject matter expert in both MarTech data architecture and advanced analytics capabilities.

Who you are: You bring hands-on experience with the responsibilities above - designing secure, scalable data architectures across MarTech platforms; defining and governing data pipelines for first-party and behavioral data; productionizing machine learning models with Data Science teams; building event-driven and batch data flows with Azure Data Factory, Databricks, Delta Lake, Azure Synapse, and Kafka; integrating MarTech data with enterprise data warehouses and lakes; translating business needs into scalable data models; establishing data governance and quality frameworks; and serving as a subject matter expert in MarTech data architecture and advanced analytics.

Where you'll be: This is a full-time position, based in Bangalore, India.

Benefits highlights: Flexible work hours/hybrid work environment. Discounts on our award-winning Electrolux products and services. Family-friendly benefits. Extensive learning opportunities and a flexible career path.
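As a small sketch of the event-driven ingestion mentioned above, here is a hypothetical behavioral-event producer using kafka-python; the broker address, topic name, and event fields are assumptions, not Electrolux's actual schema.

```python
# Illustrative behavioral-event producer; broker, topic, and fields are assumptions.
import json
from datetime import datetime, timezone

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

def publish_web_event(user_id: str, event_type: str, properties: dict) -> None:
    """Publish a single first-party web event to the 'web-events' topic."""
    event = {
        "user_id": user_id,
        "event_type": event_type,
        "properties": properties,
        "ts": datetime.now(timezone.utc).isoformat(),
    }
    producer.send("web-events", value=event)

if __name__ == "__main__":
    publish_web_event("u-123", "product_view", {"sku": "EHX123", "page": "/ovens"})
    producer.flush()
```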

Posted 2 weeks ago

Apply

5.0 - 10.0 years

2 - 10 Lacs

Lucknow

Work from Office

Source: Naukri

Salary: 2.4 LPA - 10 LPA | 2 vacancies | 1-5+ years of experience. IGTAPPS - Python OCR Developer / Senior OCR Developer jobs in Lucknow, Uttar Pradesh. We offer custom software development and enterprise mobile app development.

Role: Python OCR Developer. Design, refine, and build an advanced, intelligent OCR engine that can work across multiple applications. Develop methodologies for the extraction, cleaning, organization, and discoverability of information from a variety of media types. Design and develop the OCR service with Textract, Google Tesseract, or Google Vision to extract information. Collaborate with the team to reduce error rates, failure rates, and thresholds. Apply knowledge of best practices in RPA and OCR data capture. Leverage established tools, assets, or techniques in support of project and product delivery. Demonstrate sound decision-making and make recommendations on a regular basis; communicate these decisions clearly to colleagues. Assist in the execution of an approach to solving complex client problems. Build OCR templates and design OCR batch processes. Collaborate with cross-functional teams to define, design, and deliver new features. Unit-test code for robustness, including edge cases, usability, and general reliability. Work on bug fixing, improve OCR engine performance, and implement new technologies. Respond promptly to client requests or inquiries. Work directly with process engineers and bot developers to build an integrated automated solution. Write scalable code using the Python programming language. Test and debug applications. Develop back-end components. Integrate user-facing elements using server-side logic. Assess and prioritize client feature requests. Integrate data storage solutions. Coordinate with front-end developers. Reprogram existing databases to improve functionality. Develop digital tools to monitor online traffic.

Skills and qualifications we are looking for: Bachelor's degree in science and technology. Total 3+ years of software development experience, including experience in Java or Python. Understanding and hands-on experience in scanning technology. Demonstrated knowledge of optical character recognition (OCR) software. Knowledge of extracting data via OCR and transforming metadata. Strong conceptual, communication, and technical skills. Work experience in Java/Python development. 2+ years of relevant OCR development experience. Work experience: 3+ years. Work from Office job.
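As a minimal illustration of the Tesseract-based extraction described above, the sketch below uses pytesseract on a single image; the file name is a placeholder, and a production engine would add preprocessing, zoning, template handling, and confidence thresholds.

```python
# Minimal OCR sketch with Tesseract via pytesseract; the image path is a placeholder.
from PIL import Image
import pytesseract

def extract_text(image_path: str, lang: str = "eng") -> str:
    """Run OCR on a single scanned page and return the raw text."""
    with Image.open(image_path) as img:
        # Convert to grayscale; real pipelines also deskew, denoise, and binarize.
        return pytesseract.image_to_string(img.convert("L"), lang=lang)

if __name__ == "__main__":
    print(extract_text("scanned_invoice.png"))
```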

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru, Belgaum

Work from Office

Source: Naukri

Position Overview: We are seeking a highly skilled and experienced Data Architect with expertise in cloud-based solutions. The ideal candidate will design, implement, and optimize our data architecture to meet the organization's current and future needs. This role requires a strong background in data modeling, transformation, and governance, along with hands-on experience with modern cloud platforms and tools such as Snowflake, Spark, data lakes, and data warehouses. The successful candidate will also establish and enforce standards and guidelines across data platforms to ensure consistency, scalability, and best practices. Exceptional communication skills are essential to collaborate across cross-functional teams and stakeholders.

Key Responsibilities: Design and Implementation: Architect and implement scalable, secure, and high-performance cloud data platforms, integrating data lakes, data warehouses, and databases. Develop comprehensive data models to support analytics, reporting, and operational needs. Data Integration and Transformation: Lead the design and execution of ETL/ELT pipelines using tools like Talend/Matillion, SQL, Big Data, Hadoop, AWS EMR, and Apache Spark to process and transform data efficiently. Integrate diverse data sources into cohesive and reusable datasets for business intelligence and machine learning purposes. Standards and Guidelines: Establish, document, and enforce standards and guidelines for data architecture, data modeling, transformation, and governance across all data platforms. Ensure consistency and best practices in data storage, integration, and security throughout the organization. Data Governance: Establish and enforce data governance standards, ensuring data quality, security, and compliance with regulatory requirements. Implement processes and tools to manage metadata, lineage, and data access controls. Cloud Expertise: Utilize Snowflake for advanced analytics and data storage needs, ensuring optimized performance and cost efficiency. Leverage modern cloud platforms to manage data lakes and ensure seamless integration with other services. Collaboration and Communication: Partner with business stakeholders, data engineers, and analysts to gather requirements and translate them into technical designs. Clearly communicate architectural decisions, trade-offs, and progress to both technical and non-technical audiences. Continuous Improvement: Stay updated on emerging trends in cloud and data technologies, recommending innovations to enhance the organization's data capabilities. Optimize existing architectures to improve scalability, performance, and maintainability.

Technical Skills: Strong expertise in data modeling (conceptual, logical, physical) and data architecture design principles. Proficiency in Talend/Matillion, SQL, Big Data, Hadoop, AWS EMR, and Apache Spark.
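To make the ETL/ELT work above concrete, here is a small, hypothetical PySpark job that reads raw files from a data lake, transforms them, and writes a curated table; the paths and column names are assumptions.

```python
# A minimal PySpark ETL sketch; input/output paths and columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-example").getOrCreate()

# Extract: read raw CSV files landed in the data lake.
orders = spark.read.option("header", True).csv("s3://example-lake/raw/orders/")

# Transform: type-cast, filter bad records, and aggregate to a daily summary.
daily_sales = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_sales"), F.count("*").alias("order_count"))
)

# Load: write a partitioned, analytics-ready table to the curated zone.
daily_sales.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-lake/curated/daily_sales/"
)
```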

Posted 2 weeks ago

Apply

9.0 - 14.0 years

35 - 40 Lacs

Pune

Work from Office

We are looking to hire Microsoft Power Apps professionals in the following areas: We are seeking a highly skilled Power Apps and SharePoint Technical Lead to spearhead the design and implementation of enterprise-grade business solutions using Microsoft Power Platform and SharePoint. The ideal candidate will have deep expertise in low-code/no-code development and the Microsoft 365 ecosystem, with a strong ability to lead development teams and drive digital transformation initiatives. Key Responsibilities: Lead end-to-end solution architecture, design, and development using Power Apps (Canvas & Model-Driven), Power Automate, SharePoint Online, and Microsoft 365. Collaborate with business analysts and stakeholders to gather requirements and translate them into technical solutions. Design and build complex workflows, integrations, and automation between Power Platform, SharePoint, Teams, Outlook, and external systems. Manage SharePoint Online site architecture, lists/libraries, content types, permissions, and metadata design. Drive implementation of governance, security, and best practices for Power Platform and SharePoint. Guide and mentor developers, conduct code reviews, and oversee solution deployments. Define and manage environments, solutions, and ALM (Application Lifecycle Management) processes. Ensure quality, performance, scalability, and security of developed solutions. Stay updated with the Microsoft roadmap and evaluate new features for business relevance. Required Skills & Qualifications: 9+ years of experience in IT with at least 3+ years in a technical leadership role. Expertise in Power Platform (Power Apps, Power Automate, Power BI) and SharePoint Online. Strong understanding of SharePoint Framework (SPFx), PnP, REST APIs, JSON, and modern UI development. Experience integrating Power Platform with SharePoint, Microsoft Teams, Outlook, and third-party services. Familiarity with governance, compliance, and security within Microsoft 365. Knowledge of Azure Logic Apps, Azure Functions, and PowerShell scripting is a plus. Excellent leadership, communication, and stakeholder management skills. Microsoft certifications in Power Platform or SharePoint are preferred. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
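As a rough illustration of the REST/JSON integration skills mentioned above, here is a hedged Python sketch that reads items from a SharePoint Online list via its REST API; the site URL, list title, and token acquisition are placeholders, and a real solution would more typically use Power Automate, SPFx, or the Microsoft Graph SDK.

```python
# Illustrative sketch: read items from a SharePoint Online list via the REST API.
# The site URL, list title, and bearer token are placeholders (hypothetical).
import requests

SITE_URL = "https://contoso.sharepoint.com/sites/ProjectHub"  # hypothetical site
LIST_TITLE = "Project Tracker"                                 # hypothetical list
ACCESS_TOKEN = "<token-acquired-via-azure-ad-app-registration>"  # placeholder


def get_list_items(site_url: str, list_title: str, token: str) -> list[dict]:
    """Fetch list items as JSON from the SharePoint REST endpoint."""
    endpoint = f"{site_url}/_api/web/lists/getbytitle('{list_title}')/items"
    headers = {
        "Accept": "application/json;odata=nometadata",
        "Authorization": f"Bearer {token}",
    }
    response = requests.get(endpoint, headers=headers, timeout=30)
    response.raise_for_status()
    return response.json().get("value", [])


if __name__ == "__main__":
    for item in get_list_items(SITE_URL, LIST_TITLE, ACCESS_TOKEN):
        print(item.get("Title"))
```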

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Join us for an exciting opportunity to lead and innovate in data engineering, enhancing your career in a dynamic environment. Job Summary As a Data Engineering VP at JPMorgan Chase within the strategic PNA Data platform, you will be responsible for leading the technical delivery. Your role will involve designing and implementing a cloud-native data platform to support Finance Planning and Analysis across more than 50 markets. You will also collaborate with cross-functional teams to build a robust data platform and manage the development infrastructure of the platform. Job Responsibilities Lead the design and implementation of a cloud-native PNA Data platform. Develop and maintain robust ETL processes for data integration. Collaborate with various teams to support data platform capabilities. Manage platform development infrastructure and support users. Design strategies for effective data lake utilization. Leverage ETL tools and DevOps practices for platform engineering. Ensure high-quality software delivery and platform stability. Required Qualifications, Capabilities, and Skills Formal training or certification in software engineering concepts with 5+ years of applied experience. Strong hands-on experience in Python, SQL, and database development. Proficiency in cloud platforms (AWS) and containerization technologies like Docker and Kubernetes. Solid understanding of data structures, caching, multithreading, and asynchronous communication. Strong collaboration skills and experience in leading teams. Preferred Qualifications, Capabilities, and Skills Experience with Databricks and data orchestrator tools like Airflow. Familiarity with data governance and metadata management. Exposure to NoSQL databases like MongoDB. Experience with messaging technologies like Kafka and Kinesis.
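The posting lists Airflow as a preferred orchestration tool; the following is a minimal, hedged sketch of what a daily ingestion DAG might look like. The DAG id, task name, and task logic are purely illustrative and assume Airflow 2.4+.

```python
# Minimal Airflow DAG sketch for a daily ETL job (names and logic are illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(**context):
    # Placeholder for the actual extract/load logic (e.g., pull from a source system,
    # land files in cloud storage, then load them into the data platform).
    print("running extract and load for", context["ds"])


with DAG(
    dag_id="pna_daily_ingest_sketch",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # "schedule" argument assumes Airflow 2.4+
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```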

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid

Location: Bengaluru (Hybrid) / Remote. Job Type: Full-time. Experience Required: 5+ Years. Notice Period: Immediate to 30 days. Role Overview: As a Collibra Expert, you will be responsible for implementing, maintaining, and optimizing the Collibra Data Governance Platform to ensure data quality, governance, and lineage across the organization. You will partner with cross-functional teams to develop data management strategies and integrate Collibra solutions with Google Cloud Platform (GCP) to create a robust, scalable, and efficient data governance framework for the retail domain. Key Responsibilities: - Data Governance Management: Design, implement, and manage the Collibra Data Governance Platform for data cataloging, data quality, and data lineage within the retail domain. - Collibra Expertise: Utilize Collibra for metadata management, data quality monitoring, policy enforcement, and data stewardship across various business units. - Data Cataloging: Lead the implementation and continuous improvement of data cataloging processes to enable a centralized, user-friendly view of the organization's data assets. - Data Quality Management: Collaborate with business and technical teams to ensure that data is high-quality, accessible, and actionable. Define data quality rules and KPIs to monitor data accuracy, completeness, consistency, and timeliness. - Data Lineage Implementation: Build and maintain comprehensive data lineage models to visualize the flow of data from source to consumption, ensuring compliance with data governance standards. - GCP Integration: Architect and implement seamless integrations between Collibra and Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, and Cloud Storage, ensuring data governance policies are enforced in the cloud environment. - Collaboration & Stakeholder Management: Collaborate with Data Engineers, Analysts, Business Intelligence teams, and leadership to define and implement data governance best practices and standards. - Training & Support: Provide ongoing training and support to business users and technical teams on data governance practices, Collibra platform usage, and GCP-based solutions. - Compliance & Security: Ensure data governance initiatives comply with internal policies, industry standards, and regulations (e.g., GDPR, CCPA). Key Requirements: - Proven Expertise in Collibra: Hands-on experience implementing and managing the Collibra Data Governance Platform (cataloging, lineage, data quality). - Google Cloud Platform (GCP) Proficiency: Strong experience with GCP tools (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.) and integrating them with Collibra for seamless data governance. - Data Quality and Lineage Expertise: In-depth knowledge of data quality frameworks, metadata management, and data lineage implementation. - Retail Industry Experience: Prior experience in data governance within the retail or eCommerce domain is a plus. - Technical Skills: Strong understanding of cloud data architecture and best practices for managing data at scale in the cloud (preferably in GCP). - Problem-Solving and Analytical Skills: Ability to analyze complex data governance issues and find practical solutions to ensure high-quality data management across the organization. - Excellent Communication Skills: Ability to communicate effectively with both technical and non-technical stakeholders to advocate for data governance best practices.
- Certifications: Relevant certifications in Collibra, Google Cloud, or data governance are highly desirable. Education & Experience: - Bachelor's degree (B.Tech/BE) mandatory, master's optional. - 5+ years of experience in data governance, with at least 3 years of specialized experience in Collibra and GCP. - Experience working with data teams in a retail environment is a plus.
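To give a flavour of the GCP side of such an integration, here is a hedged sketch that queries BigQuery's INFORMATION_SCHEMA for table-level technical metadata, the kind of information a catalog or lineage ingestion job might harvest into Collibra; the project and dataset names are assumptions.

```python
# Illustrative sketch: pull table-level technical metadata from BigQuery,
# similar to what a catalog/lineage ingestion job might feed into Collibra.
# Project and dataset names are hypothetical; credentials come from the environment.
from google.cloud import bigquery

client = bigquery.Client(project="example-retail-project")  # hypothetical project

query = """
    SELECT table_name, table_type, creation_time
    FROM `example-retail-project.sales_dw.INFORMATION_SCHEMA.TABLES`
    ORDER BY creation_time DESC
"""

# Iterate over the query results and print one line of metadata per table.
for row in client.query(query).result():
    print(row.table_name, row.table_type, row.creation_time)
```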

Posted 2 weeks ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Satara

Work from Office

NUTRIS CROP SOLUTIONS INDIA PVT.LTD is looking for DATA ANALYST to join our dynamic team and embark on a rewarding career journey. Managing master data, including creation, updates, and deletion. Managing users and user roles. Provide quality assurance of imported data, working with quality assurance analysts if necessary. Commissioning and decommissioning of data sets. Processing confidential data and information according to guidelines. Helping develop reports and analysis. Managing and designing the reporting environment, including data sources, security, and metadata. Supporting the data warehouse in identifying and revising reporting requirements. Supporting initiatives for data integrity and normalization. Assessing tests and implementing new or upgraded software and assisting with strategic decisions on new systems. Generating reports from single or multiple systems. Troubleshooting the reporting database environment and reports. Evaluating changes and updates to source production systems. Training end-users on new reports and dashboards. Providing technical expertise in data storage structures, data mining, and data cleansing.
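Several of these responsibilities (data integrity, normalization, cleansing) are commonly prototyped in a tool like pandas; the following is a hedged, illustrative sketch with a hypothetical customer file, not a description of this employer's actual stack.

```python
# Illustrative data-cleansing sketch in pandas (file name and columns are hypothetical).
import pandas as pd

df = pd.read_csv("customers_raw.csv")

# Basic integrity checks and cleansing: normalize email casing, drop exact
# duplicates, and flag rows missing the mandatory customer key.
df["email"] = df["email"].str.strip().str.lower()
df = df.drop_duplicates()
missing_key = df["customer_id"].isna()
print(f"{missing_key.sum()} rows missing customer_id")

# Keep only rows with a valid key and write the cleansed output.
clean = df[~missing_key]
clean.to_csv("customers_clean.csv", index=False)
```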

Posted 3 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Goodwin Financial Holdings (P) Ltd. is looking for Data Analyst to join our dynamic team and embark on a rewarding career journey. Managing master data, including creation, updates, and deletion. Managing users and user roles. Provide quality assurance of imported data, working with quality assurance analysts if necessary. Commissioning and decommissioning of data sets. Processing confidential data and information according to guidelines. Helping develop reports and analysis. Managing and designing the reporting environment, including data sources, security, and metadata. Supporting the data warehouse in identifying and revising reporting requirements. Supporting initiatives for data integrity and normalization. Assessing tests and implementing new or upgraded software and assisting with strategic decisions on new systems. Generating reports from single or multiple systems. Troubleshooting the reporting database environment and reports. Evaluating changes and updates to source production systems. Training end-users on new reports and dashboards. Providing technical expertise in data storage structures, data mining, and data cleansing.

Posted 3 weeks ago

Apply

Exploring Metadata Jobs in India

Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bengaluru
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi/NCR

These cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.

Average Salary Range

The average salary range for metadata professionals in India varies based on experience level:

  • Entry-level: ₹3-6 lakhs per annum
  • Mid-level: ₹6-12 lakhs per annum
  • Experienced: ₹12-20 lakhs per annum

Salaries may vary based on the company, location, and specific job responsibilities.

Career Path

In the metadata field, a career typically progresses as follows:

  • Metadata Analyst
  • Metadata Specialist
  • Metadata Manager
  • Metadata Architect

As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.

Related Skills

In addition to metadata management, professionals in this field are often expected to have skills in:

  • Data analysis
  • Database management
  • Data modeling
  • Information governance

Having a combination of these skills can make job seekers more attractive to potential employers.

Interview Questions

  • What is metadata? (basic)
  • How do you ensure data quality in metadata management? (medium)
  • Can you explain the difference between structured and unstructured metadata? (medium)
  • What tools or software have you used for metadata management? (basic)
  • Describe a challenging metadata project you worked on and how you overcame obstacles. (advanced)
  • How do you stay updated with the latest trends in metadata management? (basic)
  • Explain the importance of metadata in data governance. (medium)
  • Have you ever had to resolve conflicts between different metadata standards? How did you handle it? (advanced)
  • What is the role of metadata in data integration? (medium)
  • How do you ensure metadata security and compliance with regulations? (medium)
  • What are the benefits of using metadata in data analytics? (basic)
  • Can you discuss a successful metadata strategy you implemented in a previous role? (advanced)
  • Explain the concept of metadata harvesting. (medium)
  • How do you handle metadata versioning and updates? (medium)
  • Have you worked with ontologies and taxonomies in metadata management? (advanced)
  • How do you collaborate with other teams, such as data scientists or developers, in metadata projects? (medium)
  • What are the common challenges faced in metadata management, and how do you address them? (advanced)
  • How do you measure the effectiveness of metadata initiatives in an organization? (medium)
  • Can you give an example of how metadata enhances data search and retrieval processes? (medium)
  • What role does metadata play in data lineage and traceability? (medium)
  • Explain the difference between technical metadata and business metadata. (basic; see the sketch after this list)
  • How do you handle metadata migration when transitioning to a new system or platform? (advanced)
  • Describe a time when you had to prioritize metadata tasks based on business needs. (medium)
  • What are the best practices for documenting metadata to ensure consistency and accuracy? (medium)
  • How do you handle metadata conflicts or inconsistencies in a large dataset? (advanced)
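
As a concrete illustration of the technical-versus-business-metadata question above, here is a minimal, hedged sketch that lists a table's technical metadata (column names and types) using SQLAlchemy; the connection string and table name are placeholders. Business metadata, by contrast, covers definitions, ownership, and usage context, and typically lives in a catalog or glossary rather than in the schema itself.

```python
# Minimal sketch: list technical metadata (columns and types) for a table with SQLAlchemy.
# The connection URL and table name are placeholders used only for illustration.
from sqlalchemy import create_engine, inspect

engine = create_engine("postgresql://user:password@localhost:5432/exampledb")  # placeholder
inspector = inspect(engine)

# Technical metadata: structural facts the database itself can report.
for column in inspector.get_columns("orders"):  # hypothetical table
    print(column["name"], column["type"])
```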

Conclusion

As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!
