Home
Jobs

758 Metadata Jobs - Page 24

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3 - 6 years

13 - 18 Lacs

Pune

Work from Office

Naukri logo

Roche fosters diversity, equity and inclusion, representing the communities we serve. When dealing with healthcare on a global scale, diversity is an essential ingredient to success. We believe that inclusion is key to understanding people's varied healthcare needs. Together, we embrace individuality and share a passion for exceptional care. Join Roche, where every voice matters.

The Position: Senior Full Stack Developer - Optimized Study Design and Protocol Generation

Throughout our 125-year history, Roche has grown into one of the world's largest biotech companies and a global supplier of transformative, innovative solutions across major disease areas. We are looking for an IT specialist to join one of our teams in Roche Polska within the Roche Informatics division. In Roche Informatics we focus on delivering technology that evolves the practice of medicine and helps patients live longer, better lives. Poland plays the role of a Technology Co-creation and Acceleration Hub, building capabilities that drive digital innovation. We are a diverse team of open and friendly people, enthusiastic about technological novelties and optimal IT solutions. We share knowledge and experience, and appreciate different points of view.

Overview: We seek a highly skilled Full Stack Developer to join one of our product teams. This permanent role requires a strong development background, a core technical mindset, and a solid understanding of technical architecture.

Your Team: You will be part of a dynamic team dedicated to realizing the vision of the R&D Excellence Initiative, which aims to transform and streamline the study design and protocol generation process across Roche's R&D portfolio. The team is responsible for building cutting-edge tools and solutions, leveraging AI, data insights, and interoperable systems to address inefficiencies and enhance decision-making. Collaborating with cross-functional experts from various business units, the team develops and implements scalable, innovative technologies to optimise patient trial success (PTS), reduce amendments, and accelerate study timelines. This collaborative environment values knowledge-sharing and continuous learning, ensuring seamless integration of new team members and creating an ecosystem that drives innovation in protocol development.

Your key responsibilities:
- Develop and maintain full-stack applications, focusing on Angular for user-friendly, accessible, and responsive interfaces.
- Implement efficient workflows for metadata validation, enabling users to validate various metadata types using Angular, TypeScript, and Material Design.
- Build and manage backend systems with Python and Django, including data aggregation, processing, and generating summary documents.
- Integrate with Snowflake databases for efficient metadata storage and retrieval.
- Optimize tool performance and ensure secure authentication using OAuth mechanisms.

Your qualifications and experience:
- Proficiency in full-stack development with Angular, TypeScript, Python, and Django.
- Experience in building and maintaining CI/CD pipelines for reliable and consistent application deployment.
- Expertise in handling spreadsheet data using Python libraries like Pandas and OpenPyXL.
- Strong understanding of document generation using Python libraries (e.g., python-docx).
- Advanced English proficiency (C1 level) and excellent problem-solving skills.

Nice to have (considered an asset):
- Experience with Progressive Web Apps (PWAs) and server-side rendering (SSR) for enhanced performance and engagement.
- Knowledge of internationalization (i18n) for multi-language application support.
- Familiarity with UI/UX principles to improve user experience and collaboration with design teams.
- Background in metadata validation tools and integrating computational simulations.
- Proven ability to mentor team members and independently design technical solutions.

Who we are: At Roche, more than 100,000 people across 100 countries are pushing back the frontiers of healthcare. Working together, we've become one of the world's leading research-focused healthcare groups. Our success is built on innovation, curiosity and diversity. Roche is an Equal Opportunity Employer.
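The metadata-validation workflow this role describes can be sketched roughly as follows. This is a minimal illustration only; the rule names and metadata fields are hypothetical, not Roche's actual schema:

```python
# Minimal sketch of a metadata-validation step: each rule checks one
# field of a metadata record and reports any violations. The field
# names and rules here are illustrative assumptions.

def validate_metadata(record, rules):
    """Return a list of human-readable violations for one record."""
    errors = []
    for field, check, message in rules:
        value = record.get(field)
        if not check(value):
            errors.append(f"{field}: {message} (got {value!r})")
    return errors

# Example rules: a required study ID, and a phase drawn from a fixed vocabulary.
RULES = [
    ("study_id", lambda v: isinstance(v, str) and v.strip() != "",
     "must be a non-empty string"),
    ("phase", lambda v: v in {"I", "II", "III", "IV"},
     "must be a valid trial phase"),
]

record = {"study_id": "RO-001", "phase": "V"}
print(validate_metadata(record, RULES))
```

In a real deployment the rules would live server-side (e.g., in the Django backend) and the results would be surfaced to the Angular frontend, but the shape of the check is the same.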

Posted 3 months ago

Apply

2 - 5 years

7 - 11 Lacs

Bengaluru

Work from Office


Software Engineer - Java

The Fixed Income Applications development team builds and supports a reference data system. The team's responsibilities span request/response-based metadata distribution for various financial products (bonds, futures, options, FX spots/forwards, deposits, swaps, commodities, swaptions, CDX, CDS, equities, etc.), dealing with batch and on-demand security creation and updates, building infrastructure for keeping the metadata current and accurate, and providing multiple means of dissemination to downstream systems (such as analytics, risk, and trader systems). While not a low-latency system, it is a high-availability cluster capable of serving both existing securities and securities created upon request from external metadata. Team members interact directly with operations teams and other technology teams, so solid communication skills are essential. The team owns the entire software lifecycle, from requirements and design, through implementation, to production releases and support. Release cycles are tight, so in addition to strong development skills, you must have demonstrated the ability to adapt to changing conditions and learn quickly. There are no business analysts on the team, so we expect developers to have sufficient business and product knowledge to understand requirements on their own. That said, this is not a particularly quantitative role; a separate Analytics team undertakes valuation and related work. We focus more on building and supporting the technical infrastructure.

Required skills/experience:
- 4+ years of professional experience with Java
- 3+ years of SQL database development skills
- Solid grasp of multithreading, algorithms, and data structures
- Familiarity with event streaming platforms such as Kafka or RabbitMQ
- Results-oriented; can deliver quality code with quick turnaround
- Self-starter and critical thinker; takes ownership of projects and suggests improvements for the entire infrastructure

Preferred skills/experience:
- Fixed income product knowledge
- Spring/Spring Boot experience
- Experience with vendor feeds (Bloomberg SAPI/BPIPE, Markit)
- Distributed caching (e.g., Hazelcast, Redis, Memcached, Ignite, Ehcache)
- Python experience for unit testing and scripts
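The create-on-request pattern the description mentions, serving known securities from a store and building a record from external metadata when an unknown one is requested, can be sketched as follows. The role itself is Java; this small Python sketch only illustrates the flow, and the class, feed, and field names are assumptions:

```python
# Sketch of on-demand security creation: serve existing securities from
# the local store, and create a record from an external metadata source
# when an unknown security is requested. Names here are illustrative.

class SecurityMetadataService:
    def __init__(self, external_source):
        self._store = {}                    # security_id -> metadata record
        self._external = external_source    # callable: security_id -> raw metadata

    def get(self, security_id):
        """Serve an existing security, or create it on request."""
        if security_id not in self._store:
            raw = self._external(security_id)
            self._store[security_id] = {"id": security_id, **raw}
        return self._store[security_id]

def fake_vendor_feed(security_id):
    # Stand-in for an external feed (e.g., a vendor metadata lookup).
    return {"type": "bond", "currency": "USD"}

svc = SecurityMetadataService(fake_vendor_feed)
print(svc.get("US912828XX"))
```

A production system would add batch refresh, cache invalidation, and distribution to downstream consumers, which is where the event-streaming and distributed-caching experience listed above comes in.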

Posted 3 months ago

Apply

3 - 5 years

5 - 7 Lacs

Bengaluru

Work from Office


Company Overview

At Motorola Solutions, we're guided by a shared purpose - helping people be their best in the moments that matter - and we live up to our purpose every day by solving for safer. Because people can only be their best when they not only feel safe, but are safe. We're solving for safer by building the best possible technologies across every part of our safety and security ecosystem. That's mission-critical communications devices and networks, AI-powered video security & access control, and the ability to unite voice, video and data in a single command center view. We're solving for safer by connecting public safety agencies and enterprises, enabling the collaboration that's critical to connect those in need with those who can help. The work we do here matters.

Department Overview

Our IT organization isn't just here to support our business. We're here to reinvent it - by changing the way our customers, partners and employees interact with our company. To do that, we're looking for people who bring great ideas and who make our partners' ideas better. Intellectually curious advisors (not order takers) who focus on outcomes to creatively solve business problems. People who not only embrace change, but who accelerate it.

Job Description

As a dynamic technology enterprise that operates on a global scale, Motorola Solutions and its products present an attractive target for malicious actors. This role allows you to use your cybersecurity and software engineering skills to protect the people who protect us. Our customers are first responders: fire, police, paramedics, 911 call takers, and 911 dispatchers. And when we or our loved ones place that 911 call, we become the customer of our customers. We want that call to be answered and the communications between the dispatcher and the first responder to be available.

If you are passionate about securing mission-critical products and services and thrive in a global, collaborative, cross-functional team, we have an exciting opportunity for you implementing secure software development practices, vulnerability management, and other cutting-edge cybersecurity projects. In an ever-evolving threat landscape, the Product Security team is at the forefront of defending against cyber threats, helping the business ensure the confidentiality, integrity, and availability of our products. As a Vulnerability Management Software Developer, you will design, develop, and implement the tooling to integrate data from multiple sources to augment our vulnerability management platform. This provides product development teams with actionable insights to further safeguard data and systems from potential threats. MSI provides a work environment that encompasses workplace flexibility, continued professional growth through paid training and certifications, conferences and seminars, and education assistance. Our culture encourages the honing of current skills and the building of new capabilities. We prize flexibility, continuous improvement, and collaboration, both within the team and with industry peers.

Scope of Responsibilities/Expectations
- Strong team player with the ability to work with a geographically dispersed team
- Experience bringing order to the chaos of ingesting data from a wide variety of sources
- Implement, document, and maintain software solutions which may pull data from different REST APIs, perform calculations or data extraction, and store values to databases
- Design and implement web applications with a great user experience
- Develop and manage ongoing process improvements focusing on automation, accuracy, and timeliness of information
- Work with multiple data sources, including vulnerability and other enterprise data, for contextual enrichment to drive actionable outcomes
- Identify, root-cause, and remediate data quality issues
- Familiarity with parsing, extracting, and reshaping data, primarily in JSON format

Desired Background/Knowledge/Skills
- Strong background in software development and modern programming languages - preferably Python/JavaScript/TypeScript and SQL
- Extensive experience working with API technologies
- Basic knowledge of widely used application development technologies (e.g., React, FastAPI)
- Knowledge of common application vulnerabilities (e.g., OWASP Top 10), attack techniques, and remediation tactics/strategies
- Working knowledge of CI/CD workflows/pipelines and DevOps practices
- Basic knowledge of cloud services deployment, containers, etc.
- Skill in delivering and explaining technical concepts to a wide variety of audiences
- Knowledge of cybersecurity and secure coding principles and best practices
- Skill in using code analysis tools such as SAST, DAST, and SCA tools
- Ability to research and learn new topics and become functional with them quickly
- Aversion to repeated manual processes and a strong desire to automate them
- Attention to detail for data organization and consistency
- Interest in modeling data and relationships (semantic/property graphs)
- Curiosity about new and emerging technologies and how we can use them
- Knowledge of container-based workflows and deploying applications to Kubernetes
- Advanced skills in building processes supporting data transformation, data structures, metadata dependency, and workload management
- Familiarity with building REST or GraphQL APIs and data objects
- Familiarity with Elasticsearch, Elasticsearch Query Language, and Kibana a plus

Basic Requirements
- Bachelor's degree in a related field or equivalent work experience
- 3-5 years in software development

Travel Requirements: None
Relocation Provided: None
Position Type: Experienced

Motorola Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion or belief, sex, sexual orientation, gender identity, national origin, disability, veteran status or any other legally-protected characteristic. We're committed to providing an inclusive and accessible recruiting experience for candidates with disabilities, or other physical or mental health conditions. To request an accommodation, please email ohr@motorolasolutions.com.
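The core of the role above, ingesting JSON vulnerability data from heterogeneous sources and reshaping it into a common schema, can be sketched in a few lines. The source names, field names, and severity scales below are invented for illustration, not any specific vendor's API:

```python
# Sketch of normalizing vulnerability records pulled from different
# sources into one common shape. All field names and the severity
# conversion are illustrative assumptions, not a real vendor schema.
import json

def normalize(raw, source):
    """Map a source-specific record onto a common schema."""
    if source == "scanner_a":
        # This hypothetical source reports CVSS as a string.
        return {"id": raw["cve"], "score": float(raw["cvss"]), "host": raw["asset"]}
    if source == "scanner_b":
        # This one uses a 0-4 severity scale; rescale to 0-10.
        return {"id": raw["vuln_id"], "score": raw["severity"] * 2.5, "host": raw["hostname"]}
    raise ValueError(f"unknown source: {source}")

payload = json.loads('{"cve": "CVE-2024-0001", "cvss": "9.8", "asset": "web-01"}')
print(normalize(payload, "scanner_a"))
```

Once every source maps onto the same schema, the downstream enrichment and reporting steps described above only need to handle one record shape.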

Posted 3 months ago

Apply

2 - 5 years

1 - 4 Lacs

Bengaluru

Work from Office


Description: Associate Reporting Analyst - Bangalore, India

The Opportunity: Anthology delivers education and technology solutions so that students can reach their full potential and learning institutions thrive. Our mission is to empower educators and institutions with meaningful innovation that's simple and intelligent, inspiring student success and institutional growth. The Power of Together is built on having a diverse and inclusive workforce. We are committed to making diversity, inclusion, and belonging a foundational part of our hiring practices and who we are as a company. For more information about Anthology and our career opportunities, please visit www.anthology.com.

Primary responsibilities will include:
- Working with different databases and leading the design, migration, and programming effort for these platforms
- Creating information solutions covering data security, data privacy, metadata management, multi-tenancy, and mixed workload management within contemporary insights-based tools, technologies, frameworks, platforms, and deployment models
- Providing technical leadership to project teams across design-to-deployment activities: providing guidance, participating in reviews, and preventing and resolving technical issues
- Reviewing escalations from business stakeholders and assisting them with resolution
- Providing best practice recommendations
- Assisting customers with solving their data migration challenges
- Supporting customers to reach the desired migration goal
- Proactively identifying challenges and communicating them to business stakeholders with suggestions to overcome them
- Ensuring the delivered data matches the data provided
- Ability to travel frequently and work extra hours as needed
- Always striving for consistent, quality results

The Candidate - Required skills/qualifications:
- Great customer service and client engagement skills
- Excellent oral and written communication skills
- Familiarity with education-related technologies
- At least 2 years of relevant work experience
- Experience with SQL scripting
- Solid knowledge of SQL Server 2019, with development of complex stored procedures, views, and other SQL objects
- Able to analyze the output of SQL statements
- Critical thinking and problem-solving skills
- Fluency in written and spoken English

This job description is not designed to contain a comprehensive listing of activities, duties, or responsibilities that are required. Nothing in this job description restricts management's right to assign or reassign duties and responsibilities at any time. Anthology is an equal employment opportunity/affirmative action employer and considers qualified applicants for employment without regard to race, gender, age, color, religion, national origin, marital status, disability, sexual orientation, gender identity/expression, protected military/veteran status, or any other legally protected factor.
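One of the responsibilities above, ensuring the delivered data matches the data provided after a migration, usually starts with simple reconciliation queries. A minimal sketch, using SQLite in place of SQL Server and invented table names, is:

```python
# Sketch of one basic migration check: confirm that row counts in a
# source and target table match. Table names are illustrative; in
# practice the same idea would run against SQL Server, not SQLite.
import sqlite3

def row_count(conn, table):
    """Return the number of rows in a table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER)")
conn.execute("CREATE TABLE dst (id INTEGER)")
conn.executemany("INSERT INTO src VALUES (?)", [(i,) for i in range(5)])
conn.executemany("INSERT INTO dst VALUES (?)", [(i,) for i in range(5)])

match = row_count(conn, "src") == row_count(conn, "dst")
print("counts match:", match)
```

Row counts are only the first gate; real reconciliation would continue with checksums or column-level comparisons, but a count mismatch is the cheapest signal that delivered data diverges from the source.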

Posted 3 months ago

Apply

3 - 6 years

18 - 23 Lacs

Pune

Work from Office


Roche fosters diversity, equity and inclusion, representing the communities we serve. When dealing with healthcare on a global scale, diversity is an essential ingredient to success. We believe that inclusion is key to understanding people's varied healthcare needs. Together, we embrace individuality and share a passion for exceptional care. Join Roche, where every voice matters.

The Position: Senior Full Stack Developer - Optimized Study Design and Protocol Generation

Throughout our 125-year history, Roche has grown into one of the world's largest biotech companies and a global supplier of transformative, innovative solutions across major disease areas. We are looking for an IT specialist to join one of our teams in Roche Polska within the Roche Informatics division. In Roche Informatics we focus on delivering technology that evolves the practice of medicine and helps patients live longer, better lives. Poland plays the role of a Technology Co-creation and Acceleration Hub, building capabilities that drive digital innovation. We are a diverse team of open and friendly people, enthusiastic about technological novelties and optimal IT solutions. We share knowledge and experience, and appreciate different points of view.

Overview: We seek a highly skilled Full Stack Developer to join one of our product teams. This permanent role requires a strong development background, a core technical mindset, and a solid understanding of technical architecture.

Your Team: You will be part of a dynamic team dedicated to realizing the vision of the R&D Excellence Initiative, which aims to transform and streamline the study design and protocol generation process across Roche's R&D portfolio. The team is responsible for building cutting-edge tools and solutions, leveraging AI, data insights, and interoperable systems to address inefficiencies and enhance decision-making. Collaborating with cross-functional experts from various business units, the team develops and implements scalable, innovative technologies to optimise patient trial success (PTS), reduce amendments, and accelerate study timelines. This collaborative environment values knowledge-sharing and continuous learning, ensuring seamless integration of new team members and creating an ecosystem that drives innovation in protocol development.

Your key responsibilities:
- Develop and maintain full-stack applications, focusing on Angular for user-friendly, accessible, and responsive interfaces.
- Implement efficient workflows for metadata validation, enabling users to validate various metadata types using Angular, TypeScript, and Material Design.
- Build and manage backend systems with Python and Django, including data aggregation, processing, and generating summary documents.
- Integrate with Snowflake databases for efficient metadata storage and retrieval.
- Optimize tool performance and ensure secure authentication using OAuth mechanisms.

Your qualifications and experience:
- Proficiency in full-stack development with Angular, TypeScript, Python, and Django.
- Experience in building and maintaining CI/CD pipelines for reliable and consistent application deployment.
- Expertise in handling spreadsheet data using Python libraries like Pandas and OpenPyXL.
- Strong understanding of document generation using Python libraries (e.g., python-docx).
- Advanced English proficiency (C1 level) and excellent problem-solving skills.

Nice to have (considered an asset):
- Experience with Progressive Web Apps (PWAs) and server-side rendering (SSR) for enhanced performance and engagement.
- Knowledge of internationalization (i18n) for multi-language application support.
- Familiarity with UI/UX principles to improve user experience and collaboration with design teams.
- Background in metadata validation tools and integrating computational simulations.
- Proven ability to mentor team members and independently design technical solutions.

Who we are: At Roche, more than 100,000 people across 100 countries are pushing back the frontiers of healthcare. Working together, we've become one of the world's leading research-focused healthcare groups. Our success is built on innovation, curiosity and diversity. Roche is an Equal Opportunity Employer.

Posted 3 months ago

Apply

2 - 5 years

7 - 10 Lacs

Hyderabad

Work from Office


Primary Duties & Responsibilities
- Support and maintain Oracle EPM modules (PBCS, EPBCS, FCCS, PCMCS, ARCS).
- Create metadata, forms, reports, business rules, calculation scripts, and Groovy scripts.
- Manage user provisioning, security, and approval process flows.
- Manage data flows between EPM Cloud and other systems using data integration tools, ensuring data accuracy and consistency.
- Configure EPM Cloud integrations, including data exchanges, data mappings, and pipelines.
- Create and support Groovy and/or JavaScript customization and personalization within EPM Cloud.
- Serve as the liaison between the Accounting and IT departments to ensure the proper operation and maintenance of the financial systems in use.
- Support and troubleshoot installation of Oracle Smart View and EPM extensions.
- Serve as a technical point of contact for system maintenance and configuration, ensuring data integrity, system testing, reporting, and process improvements.
- Lead testing and verification efforts for quarterly production releases, executing unit test plans and verifying business user acceptance testing.

Education & Experience
- Bachelor's degree in Computer Science or a related field required; equivalent related experience may be considered in lieu of a degree.
- 5+ years of EPM Cloud/Hyperion Financial applications support and data integrations development experience.
- Smart View scripting and design experience preferred.
- Oracle and SAP experience a plus.
- Experience administering systems, user testing, and user training related to version upgrades and new releases.

Skills
- Ability to blend accounting knowledge and IT logic to recommend and implement improvements to processes and financial reporting.
- Demonstrated problem-solving and work prioritization skills.
- Ability to keep up to date with technology and apply it to the business strategic plan.
- Ability to achieve results independently or working with others.
- Excellent interpersonal and communication skills; ability to communicate effectively with end users and management, and serve as a liaison between the accounting and IT staff.
- Ability to handle multiple priorities involving internal customer requests and demands.
- Ability to excel in a cross-organizational, cross-cultural, global team environment.
- Ability to handle special assignments promptly and professionally.
- Set a high standard of ethics, professionalism, leadership, and competency.

Working Conditions
- Flexible on timings to support various time zones, primarily US time zones.
- The work mode of Finisar India is hybrid, i.e., 3 days at the office.

Culture Commitment
Ensure adherence to the company's values (ICARE) in all aspects of your position at Coherent Corp.:
- Integrity - Create an Environment of Trust
- Collaboration - Innovate Through the Sharing of Ideas
- Accountability - Own the Process and the Outcome
- Respect - Recognize the Value in Everyone
- Enthusiasm - Find a Sense of Purpose in Work

Coherent Corp. is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law. Finisar India (a subsidiary of Coherent Corp.) is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to gender identity, sexual orientation, race, color, religion, national origin, disability, or any other characteristic protected by law.

Posted 3 months ago

Apply

5 - 12 years

7 - 11 Lacs

Chennai

Work from Office


Title: Data Modeler
Mode of working: Full Time
Work Location: Chennai, India
Work Experience: 8-12 years

Job Description - Key skills required for the Data Modeler role:
- Data Modelling Expertise - Ability to analyse and translate business needs into long-term data models.
- Metadata Management - Strong knowledge of metadata management and related tools.
- Machine Learning Experience - 5-8+ years of experience with machine learning in production.
- Statistical Analysis - Knowledge of mathematical foundations and statistical methods.
- Database Systems - Evaluating and optimizing existing data systems.
- Data Flow Design - Creating conceptual data models and data flows.
- Coding Best Practices - Developing best practices for data coding to ensure consistency.
- System Optimization - Updating and troubleshooting data systems for efficiency.
- Collaboration Skills - Working with cross-functional teams (Product Owners, Data Scientists, Engineers, Analysts, Developers, Architects).
- Technical Documentation - Preparing training materials, SOPs, and knowledge base articles.
- Communication & Presentation - Strong interpersonal, communication, and presentation skills.
- Multi-Stakeholder Engagement - Ability to work with multiple stakeholders in a multicultural environment.
- Data Modelling Certification - Desirable but not mandatory.

Posted 3 months ago

Apply

3 - 8 years

3 - 7 Lacs

Noida

Work from Office


We are seeking a skilled Azure Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, working with Azure cloud services, and designing and implementing scalable data solutions. You will play a crucial role in developing, optimizing, and maintaining data pipelines and architectures, ensuring data quality and availability across various platforms.

Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Build and optimize data storage solutions using Azure Data Lake, Azure SQL Database, and Azure Cosmos DB.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Implement data quality checks, data governance, and security best practices across data platforms.
- Monitor, troubleshoot, and optimize data workflows for performance and scalability.
- Develop and maintain data models, data cataloging, and metadata management.
- Automate data integration and transformation processes using Azure DevOps and CI/CD pipelines.
- Stay up to date with emerging Azure technologies and data engineering trends.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering with a focus on Azure cloud services.
- Proficiency in Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database.
- Strong experience with SQL, Python, or other scripting languages.
- Familiarity with data modeling, ETL design, and big data tools such as Hadoop or Spark.
- Experience with data warehousing concepts, data lakes, and data pipelines.
- Understanding of data governance, data quality, and security best practices.
- Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.

Preferred Skills:
- Azure certification (e.g., Microsoft Certified: Azure Data Engineer Associate) is a plus.
- Experience with Azure Logic Apps, Azure Functions, and API Management.
- Knowledge of Power BI, Tableau, or other data visualization tools.
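The data-quality checks a pipeline step might run before loading a batch can be sketched as below. The column names, rules, and thresholds are illustrative assumptions; in an Azure pipeline this logic would typically live in a Databricks notebook or Data Factory activity:

```python
# Sketch of a pre-load data-quality gate: count null-key violations and
# out-of-range numeric values across a batch of dict-shaped rows.
# Column names and the range threshold are illustrative assumptions.

def quality_report(rows, required, numeric_range=None):
    """Count rule violations across a batch of rows."""
    issues = {"missing": 0, "out_of_range": 0}
    for row in rows:
        if any(row.get(col) is None for col in required):
            issues["missing"] += 1
        if numeric_range:
            col, lo, hi = numeric_range
            v = row.get(col)
            if v is not None and not (lo <= v <= hi):
                issues["out_of_range"] += 1
    return issues

batch = [
    {"id": 1, "amount": 50},      # clean row
    {"id": None, "amount": 20},   # missing key
    {"id": 3, "amount": 999},     # amount outside the allowed range
]
print(quality_report(batch, required=["id"], numeric_range=("amount", 0, 100)))
```

A pipeline would compare the report against agreed thresholds and either fail the run or route bad rows to a quarantine table, which is the "data quality checks and data governance" responsibility in practice.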

Posted 3 months ago

Apply

8 - 12 years

12 - 13 Lacs

Noida

Work from Office


We re looking for a Digital Marketing Manager to join our Product team in Noida. Working at Taazaa involves engaging with cutting-edge technology and innovative software solutions in a collaborative environment. We emphasize on continuous professional growth, offering workshops and training. Our employees often interact with clients to tailor solutions to business needs, working on diverse projects across industries. We promote work-life balance with flexible hours and remote options, fostering a supportive and inclusive culture. Competitive salaries, health benefits, and various perks further enhance the work experience. Looking ahead, we aim to expand our technological capabilities and market reach, investing in advanced technologies and expanding our service offerings. We plan to deepen our expertise in AI and machine learning, enhance our cloud services, and continue fostering a culture of innovation and excellence. Taazaa is committed to staying at the forefront of technology trends, ensuring it delivers impactful and transformative solutions for its clients. We are looking for a results-driven Digital Marketing Manager to develop, implement, track, and optimize our digital marketing campaigns across multiple channels. The ideal candidate should have a strong grasp of current marketing tools, trends, and best practices to lead integrated digital marketing strategies that drive brand awareness, lead generation, and customer engagement. What you ll do Develop and Execute Strategies: Create and manage digital marketing campaigns across SEO, SEM, email marketing, social media, and paid advertising. SEO SEM: Lead and mentor the SEO and SEM teams to optimize website content, structure, and metadata for improved search engine rankings and organic traffic growth. Oversee and strategize PPC campaigns on Google Ads and other platforms. Social Media Marketing: Develop and oversee social media strategies to increase engagement, brand awareness, and conversions. 
Content Marketing: Work with the content team to create and distribute engaging content, including blogs, videos, and email newsletters. Analytics Reporting: Monitor and analyze digital performance metrics, providing insights and recommendations for optimization. Lead Generation Conversion: Optimize digital channels for lead generation and implement CRO (Conversion Rate Optimization) strategies. Email Marketing: Design and execute email marketing campaigns for nurturing leads and engaging customers. Marketing Automation: Leverage marketing automation tools to streamline and scale digital campaigns. Website Management: Collaborate with web developers and designers to enhance user experience and site performance. Stay Updated: Keep up with digital marketing trends, algorithm updates, and emerging technologies to maintain a competitive edge. Your qualifications Bachelor s degree in Marketing, Business, Communications, or a related field. 8+ years of experience in digital marketing, with proven success in managing campaigns. Hands-on experience with SEO/SEM, Google Ads, and social media advertising. Proficiency in analytics tools (Google Analytics, SEMrush, HubSpot, etc.). Strong knowledge of email marketing, marketing automation, and CRM systems. Excellent written and verbal communication skills. Data-driven mindset with strong analytical skills. Preferred Qualifications Experience in B2B marketing, SaaS, or technology-related industries. Certifications in Google Ads, HubSpot, or other relevant platforms. Familiarity with A/B testing and conversion rate optimization techniques Behavioral: Here are five essential behavioral skills a Product Manager should possess: Adaptability : Digital marketing is constantly evolving with new tools, platforms, and trends. A successful manager must be adaptable to change, able to pivot strategies when necessary, and open to learning and experimenting with emerging technologies. 
Leadership and Team Management : A manager should be able to inspire and motivate their team, delegate tasks efficiently, and maintain a positive, collaborative work environment. Strong leadership skills ensure that the team is aligned with organizational goals and performs to the best of its ability. Analytical Thinking : Digital marketing requires a keen understanding of data and metrics to make informed decisions. A manager should possess strong analytical skills to interpret campaign results, understand customer behavior, and identify opportunities for improvement. Creativity and Problem-Solving : Creativity is essential for designing compelling campaigns, ads, and content that engage audiences. A digital marketing manager should also be a problem-solver, capable of thinking outside the box to overcome challenges and drive innovative solutions. Effective Communication : Clear and effective communication is crucial for coordinating with stakeholders, clients, and team members. A manager must be able to convey complex ideas in a simple way, listen actively, and ensure that there s consistent messaging across all channels. What you ll get in return Joining Taazaa Tech means thriving in a dynamic, innovative environment with competitive compensation and performance-based incentives. Youll have ample opportunities for professional growth through workshops and certifications, while enjoying a flexible work-life balance with remote options. Our collaborative culture fosters creativity and exposes you to diverse projects across various industries. We offer clear career advancement pathways, comprehensive health benefits, and perks like team-building activities. Who we are Taazaa Tech is a kaleidoscope of innovation, where every idea is a brushstroke on the canvas of tomorrow. Its a symphony of talent, where creativity dances with technology to orchestrate solutions beyond imagination. 
In this vibrant ecosystem, challenges are sparks igniting the flames of innovation, propelling us towards new horizons. Welcome to Taazaa, where we sculpt the future with passion, purpose, and boundless creativity.

Posted 3 months ago

Apply

7 - 8 years

9 - 10 Lacs

Chennai, Pune, Delhi

Work from Office

Naukri logo

Work with Technology and Business teams to gather data requirements and develop a plan for implementation Define and develop Data Quality requirements and controls for Financial Crime & Risk Management systems (Oracle - Transaction Monitoring and Customer Risk Rating solutions) Analyze Source-to-Target Mapping documents and data dictionaries for root cause analysis and further business understanding Engage relevant stakeholders and lead triage meetings to help resolve identified DQ issues Perform end-to-end root cause analysis across identified data issues and work with respective stakeholders as part of the Exception Management Process to remediate DQ issues Define test scripts and cases for SIT and UAT and perform testing Assist with streamlining the DQ exception management process and process improvements Develop detailed documentation on the end-to-end data processes that the team performs Job requirements (Minimum): University degree, preferably in technology/engineering Minimum of 5 years of experience working in a financial institution, preferably a global bank Minimum of 7 years of experience in data (data lifecycle, data governance, data quality, metadata, data issue resolution and other data concepts) Must be extremely comfortable working across different time zones (US & SG) Advanced SQL & Python programming experience (preferably in Oracle-based tools) Minimum of 3 years of experience in system testing, data issue exception management and resolution Understanding of the Compliance domain and concepts, e.g., Anti-Money Laundering (AML), Know Your Customer (KYC), Customer Risk Rating, etc., is a must Experience with the Informatica ETL tool is an added advantage Project management experience is an added advantage Skillset: Critical thinking and problem solving, good communication & articulation skills, adaptability, collaboration, interpersonal skills, decision making
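The DQ exception-management workflow described above can be sketched as a rule-driven check that routes failing records into a triage queue. A minimal sketch, assuming illustrative rule names and record fields (not the actual FCRM schema):

```python
# Minimal sketch of rule-driven data-quality checks feeding an exception
# queue for triage. Field names and rules are illustrative assumptions.

def run_dq_checks(records, rules):
    """Apply each named rule to each record; collect failures as exceptions."""
    exceptions = []
    for rec in records:
        for name, check in rules.items():
            if not check(rec):
                exceptions.append({"record_id": rec["id"], "rule": name})
    return exceptions

rules = {
    "customer_id_not_null": lambda r: r.get("customer_id") is not None,
    "amount_positive": lambda r: r.get("amount", 0) > 0,
}

records = [
    {"id": 1, "customer_id": "C001", "amount": 250.0},
    {"id": 2, "customer_id": None, "amount": 100.0},   # fails not-null rule
    {"id": 3, "customer_id": "C003", "amount": -5.0},  # fails positivity rule
]

issues = run_dq_checks(records, rules)
for issue in issues:
    print(issue)
```

In practice each exception row would carry enough lineage (source system, load timestamp) for the root cause analysis and stakeholder triage the role describes.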

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Master Data Management (MDM) Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: Experienced Master Data Management (MDM) Specialist required with a strong background in implementing and managing MDM solutions; experience using the Semarchy platform is an advantage. The ideal candidate will have hands-on experience in configuring, deploying, and maintaining MDM systems, with a focus on data governance, data quality, and data integration. This role is key to ensuring the consistency, accuracy, and integrity of our organization's critical business data. Key Responsibilities: MDM Solution Implementation: Lead and assist in the design, implementation, and maintenance of Semarchy MDM solutions, ensuring alignment with business needs and objectives. Data Governance: Implement and enforce data governance frameworks, ensuring that master data is standardized, accurate, and in compliance with internal policies and industry regulations. Data Modeling: Design and maintain logical and physical data models for master data entities, ensuring that data structures are optimized for usability, consistency, and performance within Semarchy. Data Quality Management: Monitor and improve data quality through the development and enforcement of data quality rules, validation processes, and metrics within the MDM platform. Integration with Source Systems: Collaborate with other teams to integrate Semarchy MDM with various source systems (ERP, CRM, legacy systems, etc.) to ensure seamless data flow and consistency across platforms. Metadata Management: Manage and maintain metadata to provide visibility into the lifecycle of master data and ensure its alignment with business processes and reporting.
User Training and Support: Provide ongoing support to business users on MDM best practices, data management tools, and data governance processes. Conduct training sessions to promote best practices in using the MDM system. Continuous Improvement: Identify opportunities to enhance MDM processes and optimize the use of Semarchy MDM. Stay up-to-date with industry trends and advancements in MDM technologies. Collaboration with Stakeholders: Work closely with data stewards, business analysts, IT teams, and business users to understand master data requirements, document them, and ensure they are translated into system capabilities. Troubleshooting and Issue Resolution: Address and resolve data-related issues, including data discrepancies, integration errors, and data quality issues. Qualifications 15 years full time education

Posted 3 months ago

Apply

4 - 8 years

30 - 34 Lacs

Mumbai

Work from Office


As a Quant Modelling Associate in our Risk Management and Compliance team, you'll play a pivotal role in maintaining JPMorgan Chase's strength and resilience. You'll anticipate emerging risks and use your expertise to tackle challenges affecting our company, customers, and communities. You'll be part of the Model Risk Governance and Review (MRGR) team, responsible for independent model review and governance activities to manage Model Risk. MRGR Trading focuses on valuation and risk-management models used within the Corporate & Investment Bank, particularly on Derivatives Instruments, involving complex and advanced modeling techniques. Job responsibilities Model Review Evaluate conceptual soundness of model specifications, reasonableness of assumptions, reliability of inputs, completeness of testing, correctness of implementation, and suitability and comprehensiveness of performance metrics and risk measures Perform independent testing of models by replicating or building benchmark models Design and implement experiments to measure the potential impact of model limitations, parameter estimation errors, and deviations from model assumptions; compare model outputs with empirical evidence or outputs from model benchmarks Document the model review findings and communicate them to stakeholders Model Governance Serve as the first point of contact for model governance related inquiries for the coverage area, and help identify and escalate issues to ensure that their resolutions are sound and timely Provide guidance on the appropriate usage of models to model developers, users, and other stakeholders in the firm Stay abreast of the ongoing performance testing outcomes for models used in the coverage area, and communicate those outcomes to stakeholders Maintain the model inventory and model metadata for the coverage area Keep up with the latest developments in the coverage area in terms of products, markets, models, risk management practices, and industry standards Required
qualifications, capabilities and skills PhD or Master's degree in a quantitative discipline such as Math, Physics, Engineering, Computer Science, Economics or Finance Excellence in probability theory, stochastic processes, statistical/economic modeling, partial differential equations, and numerical analysis Understanding of options and derivative pricing theory and risks Proficient in Python, R, Matlab, C++, or other programming languages Risk and control mindset: ability to ask incisive questions, assess materiality of model issues, and escalate issues appropriately Strong communication skills with the ability to interface with front office traders and other functional areas in the firm on model-related issues, and produce documents for internal and external (regulatory) consumption Strong analytical and problem-solving abilities
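Independent testing by "replicating or building benchmark models", as the responsibilities above describe, can be illustrated by checking a closed-form Black-Scholes call price against an independently coded Monte Carlo replication of the same payoff. The parameters below are illustrative, not taken from any actual review:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def mc_call(S, K, r, sigma, T, n_paths=200_000, seed=42):
    """Independent Monte Carlo benchmark of the same payoff under GBM."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S * math.exp((r - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)
        payoff_sum += max(ST - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

analytic = bs_call(100, 100, 0.05, 0.2, 1.0)
benchmark = mc_call(100, 100, 0.05, 0.2, 1.0)
print(f"analytic={analytic:.4f} benchmark={benchmark:.4f}")
```

Agreement within the Monte Carlo standard error supports the implementation's correctness; a material gap would be escalated as a model issue.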

Posted 3 months ago

Apply

3 - 6 years

22 - 27 Lacs

Pune

Work from Office


Assist in the technical planning & requirements gathering phases including estimate, develop, test, manage projects, architect and deliver Support and maintain the Informatica EDC installation that currently scans and profiles all data in the enterprise (approximately 600 apps) Leverage industry-standard practices to develop policies, guidelines, tools, metrics, and standards for managing metadata Plan, lead, and perform research to help address issues related to establishing metadata within the client Implement policies, guidelines, tools, metrics, and standards for managing metadata Actively participate in related technical and programmatic meetings and working groups Identify and mitigate metadata risk Communicate and improve the metadata enterprise architecture Work with peers to onboard up to 50 apps/month Work with data stewards to identify business terms Serve as a technical lead and mentor Provide technical support or leadership in the development and continual improvement of service Develop and maintain effective working relationships with team members Demonstrate the ability to adapt and work with team members of various experience levels All About You 3+ years' experience in data integration, preferably on the Informatica platform 3+ years' experience on the Informatica Enterprise Data Catalog platform EDC optimization and performance calibration preferred Skilled problem solvers with the desire and proven ability to create innovative solutions Flexible and adaptable attitude, disciplined to manage multiple responsibilities and adjust to varied environments Phenomenal communicators who can explain and present concepts to technical and non-technical audiences alike, including high-level decision makers Strong knowledge of designing, developing and deploying applications using the Informatica developer tool 3+ years working in a data governance discipline including business glossary use, data catalog and data dictionary curation Experience with shell scripting and Informatica command-line tools
such as PMCMD and INFACMD Experience in reference data collection and maintenance for reuse

Posted 3 months ago

Apply

9 - 11 years

11 - 13 Lacs

Bengaluru

Work from Office


Job Title Collibra Developer Responsibilities A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Technology->Data Management Meta Data->Collibra (Collibra Developer) Preferred Skills: Technology->Data Management Meta Data->Collibra Additional Responsibilities: Knowledge of more than one technology Basics of Architecture and Design fundamentals Knowledge of Testing tools Knowledge of agile methodologies Understanding of Project life cycle activities on development and maintenance projects Understanding of one or more Estimation methodologies, Knowledge of Quality processes Basics of business domain to understand the business requirements Analytical abilities, Strong Technical Skills, Good communication skills Good understanding of the technology and domain Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods Awareness of latest technologies and trends Excellent problem solving, analytical and debugging skills Educational Requirements Bachelor of Engineering Service Line Data & Analytics Unit * Location of posting is subject to business requirements

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office


Job Title DNA_DANSKE_Collibra Responsibilities A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional Requirements: Collibra Developer Preferred Skills: Technology->Data Management Meta Data->Collibra Additional Responsibilities: Knowledge of more than one technology Basics of Architecture and Design fundamentals Knowledge of Testing tools Educational Requirements Bachelor of Engineering Service Line Data & Analytics Unit * Location of posting is subject to business requirements

Posted 3 months ago

Apply

7 - 10 years

7 - 11 Lacs

Bengaluru

Work from Office


Location: Chennai, Coimbatore, Bangalore, Pun Key skills Azure Data Factory (primary), Azure Databricks Spark (PySpark, SQL) Experience: 7 to 10 years Must-have skills Cloud certified in one of these categories: Azure Data Engineer Azure Data Factory, Azure Databricks Spark (PySpark or Scala), SQL, data ingestion, curation Semantic modelling/optimization of data model to work within Rahona Experience in Azure ingestion from on-prem sources, e.g. mainframe, SQL Server, Oracle Experience in Sqoop/Hadoop Microsoft Excel (for metadata files with requirements for ingestion) Any other certificate in Azure/AWS/GCP and hands-on data engineering experience in cloud Strong programming skills with at least one of Python, Scala or Java Strong SQL skills (T-SQL or PL/SQL) Data file movement via mailbox Source-code versioning/promotion tools, e.g. Git/Jenkins Orchestration tools, e.g. Autosys, Oozie Source-code versioning with Git Nice-to-have skills Experience working with mainframe files Experience in Agile environment, JIRA/Confluence tools ADF (Azure Data Factory), Azure Databricks

Posted 3 months ago

Apply

6 - 8 years

12 - 17 Lacs

Noida

Work from Office


AI/ML Engineer: 6-8 years exp Strong hands-on experience in Python with Flask and APIs Understanding of machine learning concepts and algorithms, including supervised and unsupervised learning Proficiency in using Azure AI services such as Azure OpenAI, NER, Metadata Extraction, Document Parsing Hands-on experience setting up Azure VPC, private endpoints, App Services and Azure Functions Able to handle the Azure AD connection with Roles and Identity Management Knowledge of NLP techniques for text analysis, sentiment analysis, and language understanding; Security and Compliance Mandatory Competencies Python - Python, Python - Rest API, Data Science - Machine Learning (ML) At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
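The metadata-extraction and document-parsing work above would normally be done with Azure AI services; as a minimal stand-in sketch of the idea, a rule-based extractor over already-parsed document text (the field names and patterns are assumptions for illustration):

```python
import re

# Hedged stand-in for service-backed metadata extraction: pull a few
# assumed fields (date, invoice id, email) out of raw document text.
PATTERNS = {
    "date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
    "invoice_id": re.compile(r"\bINV-\d{4,}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def extract_metadata(text):
    """Return the first match per field, or None if the field is absent."""
    out = {}
    for field, pattern in PATTERNS.items():
        m = pattern.search(text)
        out[field] = m.group(0) if m else None
    return out

doc = "Invoice INV-20240117 issued 2024-01-17; contact billing@example.com."
meta = extract_metadata(doc)
print(meta)
```

A service such as Azure OpenAI or an NER model replaces the regexes in production, but the surrounding shape (parse, extract fields, return a structured record) is the same.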

Posted 3 months ago

Apply

7 - 10 years

14 - 19 Lacs

Chennai, Mumbai

Work from Office


The Data Engineering team within the AI, Data, and Analytics (AIDA) organization is the backbone of our data-driven sales and marketing operations. We provide the essential foundation for transformative insights and data innovation. By focusing on integration, curation, quality, and data expertise across diverse sources, we power world-class solutions that advance Pfizer's mission. Join us in shaping a data-driven organization that makes a meaningful global impact. Role Summary We are seeking a technically adept and experienced Data Solutions Engineering Manager with a passion for developing data products and innovative solutions to create competitive advantages for Pfizer's commercial business units. This role requires a strong technical background to ensure effective collaboration with engineering and developer team members. As a Data Solutions Engineer in our data lake/data warehousing team, you will play a crucial role in building data pipelines and processes that support data transformation, workload management, data structures, dependencies, and metadata management. You will work closely with stakeholders to understand their needs and ensure that ingested data meets business users' needs and is well modeled and organized to promote scalable usage and good data hygiene. You will work with complex and advanced data environments, employ the right architecture to handle data, and support various analytics use cases including business reporting, production data pipelines, machine learning, optimization models, statistical models, and simulations. The Data Solutions Engineering Manager will ensure data quality and integrity by validating and cleansing data, identifying and resolving anomalies, implementing data quality checks, and conducting system integration testing (SIT) and user acceptance testing (UAT).
The ideal candidate is a passionate and results-oriented product lead with a proven track record of delivering data-driven solutions for the pharmaceutical industry. Role Responsibilities Project solutioning, including scoping and estimation. Data sourcing, investigation, and profiling. Prototyping and design thinking. Developing data pipelines & complex data workflows. Actively contribute to project documentation and playbook, including but not limited to physical models, conceptual models, data dictionaries and data cataloging. Accountable for engineering development of both internal and external facing data solutions by conforming to EDSE and Digital technology standards. Partner with internal/external partners to design, build and deliver best-in-class data products globally to improve the quality of our customer analytics and insights and the growth of commercial in its role in helping patients. Demonstrate outstanding collaboration and operational excellence. Drive best practices and world-class product capabilities. Qualifications Bachelor's degree in a technical area such as computer science, engineering or management information science. 4+ years of combined data warehouse/data lake experience as a hands-on data engineer. 2+ years in developing data products and data features servicing analytics and AI use cases. Recent Healthcare Life Sciences (pharma preferred) and/or commercial/marketing data experience is highly preferred. Domain knowledge in the pharmaceutical industry preferred. Good knowledge of data governance and data cataloging best practices. Technical Skillset 4+ years of hands-on experience working with SQL, Python, and object-oriented scripting languages (e.g., Java, C++) in building data pipelines and processes. Proficiency in SQL programming, including the ability to create and debug stored procedures, functions, and views. 4+ years of hands-on experience delivering data lake/data warehousing projects.
Experience working with cloud-native SQL and NoSQL database platforms; Snowflake experience is desirable. Experience in AWS services (EC2, EMR, RDS, Spark) is preferred. Solid understanding of Scrum/Agile is preferred, with working knowledge of CI/CD, GitHub, and MLflow. Familiarity with data privacy standards, governance principles, data protection, and pharma industry practices/GDPR compliance is preferred. Great communication skills. Great business influencing and stakeholder management skills. Information & Business Tech #LI-PFE
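The SQL skills called out above (views, functions, validating and cleansing data) can be sketched end-to-end with an in-memory SQLite pipeline; the table, column, and function names below are illustrative assumptions, and SQLite's `create_function` stands in for a warehouse stored function:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Assumed toy schema: raw ingested sales records that may contain nulls.
conn.executescript("""
CREATE TABLE raw_sales (id INTEGER, region TEXT, amount REAL);
INSERT INTO raw_sales VALUES
  (1, 'EU', 120.0),
  (2, NULL, 80.0),
  (3, 'US', NULL),
  (4, 'US', 200.0);

-- A validation view that passes only clean rows downstream;
-- rows failing it would be routed to an exception table in practice.
CREATE VIEW clean_sales AS
  SELECT * FROM raw_sales WHERE region IS NOT NULL AND amount IS NOT NULL;
""")

# A user-defined SQL function, standing in for a stored function.
conn.create_function("usd_to_eur", 1, lambda usd: round(usd * 0.92, 2))

rows = conn.execute(
    "SELECT id, region, usd_to_eur(amount) FROM clean_sales ORDER BY id"
).fetchall()
print(rows)
```

The same pattern (raw layer, validation view, transformation function) scales up to the data lake/warehouse platforms the posting names.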

Posted 3 months ago

Apply

8 - 13 years

25 - 30 Lacs

Pune

Work from Office


Experience with processing large workloads and complex code on Spark clusters. Experience setting up monitoring for Spark clusters and driving optimization based on insights and findings. Understanding of designing and implementing scalable data warehouse solutions to support analytical and reporting needs. Strong analytic skills related to working with unstructured datasets. Understanding of building processes supporting data transformation, data structures, metadata, dependency, and workload management. Knowledge of message queuing, stream processing, and highly scalable big data stores. Knowledge of Python and Jupyter Notebooks. Knowledge of big data tools like Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools like Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services (EC2, EMR, RDS, and Redshift). Willingness to work from an office at least 2 times per week. Nice to have: Knowledge of stream-processing systems (Storm, Spark Streaming). Responsibilities: Optimize Spark clusters for cost, efficiency, and performance by implementing robust monitoring systems to identify bottlenecks using data and metrics. Provide actionable recommendations for continuous improvement. Optimize the infrastructure required for extracting, transforming, and loading data from various data sources using SQL and AWS big data technologies. Work with data and analytics experts to strive for greater cost efficiencies in the data systems.

Posted 3 months ago

Apply

8 - 9 years

10 - 11 Lacs

Hyderabad

Work from Office


As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. An AWS Data Modeler is responsible for designing and implementing data models that effectively organize and manage data within Amazon Web Services (AWS) environments. This role involves collaborating with data architects, analysts, and business stakeholders to translate business requirements into scalable and efficient data structures. Data Modeling: Develop conceptual, logical, and physical data models to support various business applications, ensuring alignment with organizational data standards. AWS Integration: Design and implement data models optimized for AWS services, including RDS, Redshift, DynamoDB, and S3, to ensure seamless data integration and retrieval. ETL Processes: Collaborate with ETL developers to design workflows that ensure accurate and efficient data extraction, transformation, and loading, maintaining data quality and consistency. Metadata Management: Administer and maintain metadata repositories to ensure data accuracy, consistency, and accessibility. Data Quality Assurance: Implement data validation and cleansing techniques to maintain high data quality standards. Documentation: Create and maintain comprehensive documentation of data models, data flow diagrams, and related processes to facilitate understanding and maintenance. About You To be considered for this role it is envisaged you will possess the following attributes: Educational Background: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 8+ years of experience in data modeling, with a focus on designing data structures for finance, commercial, supply chains, procurement, or customer analytics. AWS Proficiency: Extensive experience with AWS services, including RDS, Redshift, DynamoDB, and S3, with relevant AWS certifications being a plus.
SQL Expertise: Proficiency in SQL for querying databases, creating tables, and managing data relationships. Data Modeling Tools: Experience with data modeling tools such as ERWin or ER/Studio. ETL Knowledge: Understanding of ETL processes and experience with data transformation tools and technologies. Analytical Skills: Strong analytical and problem-solving skills, with the ability to translate complex business requirements into technical specifications. Communication: Excellent communication skills to effectively collaborate with technical teams and non-technical stakeholders.
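The progression from conceptual to physical model described above can be sketched with a tiny dimensional schema; SQLite stands in here for an AWS relational store such as RDS or Redshift, and the entity names are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

# Illustrative physical model: one dimension and one fact table,
# realizing the conceptual relationship "a customer places orders".
conn.executescript("""
CREATE TABLE dim_customer (
  customer_id INTEGER PRIMARY KEY,
  name        TEXT NOT NULL
);
CREATE TABLE fact_order (
  order_id    INTEGER PRIMARY KEY,
  customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
  amount      REAL NOT NULL
);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO fact_order VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 100.0);
""")

# An analytical query the model is shaped to serve.
totals = conn.execute("""
  SELECT c.name, SUM(o.amount)
  FROM fact_order o JOIN dim_customer c USING (customer_id)
  GROUP BY c.name ORDER BY c.name
""").fetchall()
print(totals)
```

On Redshift the same logical model would additionally carry distribution and sort keys; on DynamoDB it would instead be denormalized around access patterns.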

Posted 3 months ago

Apply

6 - 9 years

10 - 12 Lacs

Pune

Work from Office


Support client needs by delivering Tagetik consolidation or planning modules. Support client needs by integrating Tagetik with multiple source systems. Merge, customize and deploy Tagetik as per client business requirements Experience working with clients throughout various parts of the implementation lifecycle Proactive with a solution-oriented mindset, ready to learn new technologies for client requirements. Skills and attributes for success Deliver large/medium Tagetik programmes, demonstrating expert core consulting skills. Perform an appropriate role in business development in terms of presales and practice development, for example presales, internal engagement and/or knowledge management Demonstrate people management and an ability to conduct and lead consultancy assignments Should be open to adopting new technologies. To qualify for the role, you must have 6-9 years of relevant experience in implementing Planning and Consolidation modules. Should have 2-4 years in Tagetik products. Good understanding of Tagetik functionality and setup. Hands-on experience in AIH and predictive analytics. Know FST definitions and MDM calculations, and relate solutioning across all modules. Have exposure to Smart Now modules. Should have a good understanding of ETL with all mappings and calculations Must have worked on AIH as lead consultant, with analytical workspace experience. Develop and maintain solid knowledge of consolidations in Tagetik Should have worked on designing the workflow and metadata design in Tagetik Independently provide system solutions to Tagetik issues and enhancements; act as an escalation point to FAST support/Operations team for complex issues Lead the team in effective handling of Financial Planning/Consolidation month-end close Act as a techno-functional Tagetik solution architect.
Perform functional and technical testing of system projects, bug fixes and enhancements Should have written Tagetik codes/rules Participate in prioritizing system issues for the development team; participate in regular calls with the development team to keep track of progress Independently identify areas of improvement and make recommendations Ideally, you'll also have Strong understanding of the data close process. Proficient in building and extending metadata functionalities Maintaining end-to-end accountability and expectations for customer satisfaction and overall delivery excellence Prioritize deliveries in conjunction with the implementation team Adopt a proactive, logical and organized approach to problem resolution

Posted 3 months ago

Apply

2 - 4 years

2 - 6 Lacs

Chennai, Bengaluru

Work from Office


Role Description: Backend Developer Position Overview: We are seeking a highly skilled Backend Developer to join the Cloud Data Hub (CDH) Team, working closely with teams in Munich and BMW TechWorks India. The ideal candidate is a backend development expert with proficiency in Python, AWS, Kafka, Terraform, and Git, and a passion for building scalable and efficient systems. This role will involve designing, developing, and maintaining backend solutions for the CDH platform, contributing to BMW's transformation into a fully data-driven organization. About the project The Cloud Data Hub (CDH) is a cloud-based, centralized data lake developed by BMW, serving as the organization's central data landing zone. Designed to democratize data usage across all departments, the CDH consolidates data into a single source of truth, enabling providing and consuming entities to leverage data effectively and efficiently. It plays a pivotal role in BMW's transformation into a truly data-driven organization, supporting data acquisition, integration, processing, and analysis across its value chain. Key Responsibilities Design, develop, and maintain backend systems for the CDH platform, ensuring robust, scalable, and efficient solutions. Build and enhance serverless architectures and REST APIs using Python and AWS services. Implement and manage Kafka data streaming pipelines for real-time data processing and metadata orchestration. Develop and deploy infrastructure using Terraform for infrastructure-as-code automation on AWS. Utilize Git for version control and collaborate with the team on code reviews and CI/CD pipelines. Apply Test-Driven Development (TDD) principles to ensure code reliability, maintainability, and high-quality deliverables. Ensure the backend systems comply with BMW's security standards, performance metrics, and scalability requirements. Proactively identify, debug, and resolve performance bottlenecks and system issues.
Contribute to technical documentation and knowledge sharing to ensure project continuity and team alignment. Qualifications Expert-level proficiency in backend development with Python. Strong experience with AWS cloud services, including Lambda, S3, DynamoDB, API Gateway, and other serverless offerings. Hands-on expertise with Kafka for building and managing data streaming solutions. Advanced skills in Terraform for infrastructure automation and management. Proficient in writing optimized SQL queries and working with relational databases. In-depth knowledge of Git for version control and experience with CI/CD pipelines. Experience in building distributed systems and handling large-scale, real-time data processing workloads. Strong understanding of system design, scalability, and security best practices. Excellent debugging and problem-solving skills, with a detail-oriented mindset. Good communication and interpersonal skills to collaborate effectively with cross-functional teams. Preferred Skills Experience working with Docker and containerized environments. Familiarity with agile frameworks and participation in Scrum ceremonies. Knowledge of monitoring and observability tools like CloudWatch, Prometheus, or Grafana. Certification in AWS Solutions Architecture or related AWS certifications. Why Join Us This role provides an exciting opportunity to work on cutting-edge cloud-native technologies, contributing directly to BMW's Cloud Data Hub, a cornerstone of its data-driven transformation. As a Backend Developer, you will collaborate with talented teams across geographies to build solutions that drive real-world impact at a global scale.
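The Test-Driven Development practice named above can be sketched test-first: the assertions are written before the implementation, then the code is made to pass them. The payload validator and field names here are illustrative assumptions, not the actual CDH schema:

```python
# TDD sketch: the assertions at the bottom were written first, then the
# validator was implemented until they pass. Field names are illustrative.

REQUIRED_FIELDS = {"dataset_id", "owner", "schema_version"}

def validate_metadata(payload):
    """Return a sorted list of required fields missing from a metadata payload."""
    return sorted(REQUIRED_FIELDS - payload.keys())

# The "tests" written before the implementation existed:
assert validate_metadata({"dataset_id": "d1", "owner": "team-a",
                          "schema_version": 2}) == []
assert validate_metadata({"dataset_id": "d1"}) == ["owner", "schema_version"]
print("all metadata validation tests passed")
```

In the real pipeline these assertions would live in a test suite run by CI/CD on every commit, which is the loop TDD relies on.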

Posted 3 months ago

Apply

1 - 4 years

2 - 6 Lacs

Chennai, Bengaluru

Work from Office


Role Description: Backend Developer Position Overview: We are seeking a highly skilled Backend Developer to join the Cloud Data Hub (CDH) Team , working closely with teams in Munich and BMW TechWorks India . The ideal candidate is a backend development expert with proficiency in Python , AWS , Kafka , Terraform , and Git , and a passion for building scalable and efficient systems. This role will involve designing, developing, and maintaining backend solutions for the CDH platform, contributing to BMWs transformation into a fully data-driven organization. About the project The Cloud Data Hub (CDH) is a cloud-based, centralized data lake developed by BMW, serving as the organizations central data landing zone. Designed to democratize data usage across all departments, the CDH consolidates data into a single source of truth, enabling providing and consuming entities to leverage data effectively and efficiently. It plays a pivotal role in BMWs transformation into a truly data-driven organization, supporting data acquisition, integration, processing, and analysis across its value chain. Key Responsibilities Design, develop, and maintain backend systems for the CDH platform, ensuring robust, scalable, and efficient solutions. Build and enhance serverless architectures and REST APIs using Python and AWS services. Implement and manage Kafka data streaming pipelines for real-time data processing and metadata orchestration. Develop and deploy infrastructure using Terraform for infrastructure-as-code automation on AWS. Utilize Git for version control and collaborate with the team on code reviews and CI/CD pipelines. Apply Test-Driven Development (TDD) principles to ensure code reliability, maintainability, and high-quality deliverables. Ensure the backend systems comply with BMW s security standards , performance metrics , and scalability requirements . Proactively identify, debug, and resolve performance bottlenecks and system issues. 
  • Contribute to technical documentation and knowledge sharing to ensure project continuity and team alignment.

Qualifications

  • Expert-level proficiency in backend development with Python.
  • Strong experience with AWS cloud services, including Lambda, S3, DynamoDB, API Gateway, and other serverless offerings.
  • Hands-on expertise with Kafka for building and managing data streaming solutions.
  • Advanced skills in Terraform for infrastructure automation and management.
  • Proficiency in writing optimized SQL queries and working with relational databases.
  • In-depth knowledge of Git for version control and experience with CI/CD pipelines.
  • Experience building distributed systems and handling large-scale, real-time data processing workloads.
  • Strong understanding of system design, scalability, and security best practices.
  • Excellent debugging and problem-solving skills, with a detail-oriented mindset.
  • Good communication and interpersonal skills to collaborate effectively with cross-functional teams.

Preferred Skills

  • Experience working with Docker and containerized environments.
  • Familiarity with agile frameworks and participation in Scrum ceremonies.
  • Knowledge of monitoring and observability tools such as CloudWatch, Prometheus, or Grafana.
  • AWS Solutions Architect or related AWS certifications.

Why Join Us

This role provides an exciting opportunity to work on cutting-edge cloud-native technologies, contributing directly to BMW's Cloud Data Hub, a cornerstone of its data-driven transformation. As a Backend Developer, you will collaborate with talented teams across geographies to build solutions that drive real-world impact at a global scale.
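The serverless REST APIs this role describes are typically AWS Lambda handlers invoked through API Gateway's proxy integration. A minimal sketch in Python; the dataset names, catalog contents, and route shape are illustrative assumptions, not taken from the posting:

```python
import json

# Illustrative in-memory catalog standing in for a real store such as
# DynamoDB; the dataset names here are invented for this sketch.
DATASETS = {
    "vehicle-telemetry": {"owner": "team-a", "format": "parquet"},
    "sales-orders": {"owner": "team-b", "format": "json"},
}

def lambda_handler(event, context):
    """Handle a GET /datasets/{name} request from API Gateway.

    `event` follows the API Gateway proxy-integration shape, where path
    parameters arrive under the "pathParameters" key.
    """
    name = (event.get("pathParameters") or {}).get("name")
    dataset = DATASETS.get(name)
    if dataset is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps({"name": name, **dataset})}
```

Returning a plain dict with `statusCode` and a JSON-encoded `body` is what the proxy integration expects; swapping the in-memory dict for a DynamoDB lookup would not change the handler's contract.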

Posted 3 months ago

Apply

6 - 10 years

13 - 18 Lacs

Mumbai

Work from Office


Gracenote is the top provider of entertainment information, creating industry-leading databases of TV, movie, and music metadata for entertainment guides, applications, and in-car entertainment. Our technology serves billions of requests daily to hundreds of millions of devices around the world. Our customers include innovators like Apple, Twitter, Google, Spotify, M-GO, and Hulu, top consumer electronics and cable companies, and leading automotive manufacturers such as Ford and Toyota, throughout the US and the world. Simply put, our data gives you an opportunity to impact the evolution of the entire entertainment industry.

Job Purpose

Develop and enhance our flagship Video, Audio, Automotive, and Sports metadata software solutions. Design applications with a platform-first mentality, where scale, consistency, and reliability are at the core of every decision.

Responsibilities

  • Design, develop, and maintain scalable and robust Big Data pipelines and systems.
  • Architect and implement solutions for managing and processing large-scale datasets with fast refresh cycles, ensuring high performance, scalability, and accuracy.
  • Collaborate with cross-functional teams, including data scientists, engineers, and product managers, to define business requirements and translate them into technical solutions.
  • Write clean, maintainable, and efficient code following best practices and coding standards.
  • Conduct design and code reviews to ensure high-quality deliverables and adherence to best practices.
  • Troubleshoot and resolve complex issues in systems, ensuring reliability, availability, SLA compliance, observability, and minimal downtime.
  • Participate in the full software development lifecycle, including planning, development, review, testing, and deployment.
  • Stay up to date with emerging technologies and industry trends to continuously improve skills and knowledge.
  • Mentor and guide junior engineers, fostering a culture of learning and collaboration within the team.
Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 6 to 10 years of professional experience in Big Data engineering, with hands-on expertise in processing large-scale datasets.
  • Advanced programming skills in Python, Java, or Scala, with a focus on data processing and stream analytics.
  • Experience working with distributed data systems such as Spark or Flink.
  • Deep understanding of distributed storage systems (HDFS, S3, or ADLS) and modern file formats like Parquet, ORC, and Arrow.
  • Strong expertise in Lakehouse architectures and technologies like Delta Lake and Iceberg, and data orchestration tools like Airflow and Dagster.
  • Knowledge of database systems, including NoSQL stores (Cassandra, MongoDB), relational databases (PostgreSQL, MySQL), and SQL.
  • Working proficiency with Agile development methodologies and CI/CD practices.
  • Strong problem-solving skills and the ability to work independently as well as in a team environment.
  • Excellent communication and interpersonal skills.

Preferred Skillsets

  • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of CI/CD tools and practices.
  • Experience with test-driven development (TDD) and automated testing frameworks.
  • Familiarity with data visualization tools is good to have.

By connecting clients to audiences, we fuel the media industry with the most accurate understanding of what people listen to and watch. To discover what audiences love, we measure across all channels and platforms, from podcasts to streaming TV to social media. And when companies and advertisers are truly connected to their audiences, they can see the most important opportunities and accelerate growth. Do you want to move the industry forward with Nielsen? Our people are the driving force. Your thoughts, ideas, and expertise can propel us forward.
Whether you have fresh thinking around maximizing a new technology or you see a gap in the market, we are here to listen and act. Our team is made strong by a diversity of thoughts, experiences, skills, and backgrounds. You'll enjoy working with smart, fun, curious colleagues who are passionate about their work. Come be part of a team that motivates you to do your best work!
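The stream-analytics work in this listing would run on Spark or Flink in practice, but the core windowing logic those engines apply can be sketched in plain Python. A toy tumbling-window aggregation over timestamped events, assuming invented event names and a simple `(timestamp, key)` input shape:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per key in fixed, non-overlapping time windows.

    events: iterable of (timestamp_seconds, key) pairs.
    Returns a dict mapping (window_start, key) -> count.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Snap each timestamp down to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical playback events: (seconds, event type)
events = [(1, "play"), (3, "play"), (7, "pause"), (11, "play")]
print(tumbling_window_counts(events, 5))
# {(0, 'play'): 2, (5, 'pause'): 1, (10, 'play'): 1}
```

In Spark or Flink the same idea is expressed declaratively (e.g. a tumbling window on event time plus a keyed count); the engines add what this sketch omits: distribution, state management, and late-data handling.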

Posted 3 months ago

Apply

0 - 2 years

2 - 4 Lacs

Mumbai

Work from Office


Gracenote is the top provider of entertainment information, creating industry-leading databases of TV, movie, and music metadata for entertainment guides, applications, and in-car entertainment. We are the leading supplier of TV and movie entertainment data, supplying data to entertainment platforms and devices. We are presently looking for an Enrichment Editor.

ROLE

The Editor will be expected to research, gather, and process TV programs and movies from studios, distributors, websites, and related sources in a timely and accurate manner for entry into our internal database. Communication with broadcasters, content providers, and other internal teams will be expected in order to secure all necessary editorial requirements.

RESPONSIBILITIES:

  1. Curate and rewrite synopses and create a content database for movies, shows, and episodes
  2. Achieve and maintain high standards of quality and productivity for client satisfaction
  3. Take up ad-hoc projects along with daily tasks, as and when required
  4. Adapt to the in-house style guide rules and replicate them in the synopses
  5. Keep abreast of the latest changes in the style guide
  6. Proofread and edit content to ensure grammatical accuracy
  7. Stay up to date with best practices in writing and grammar usage
  8. Create new metadata records and/or enhance existing metadata for movies, TV programs, and episodes
  9. Investigate, confirm, and document questionable program content through research from reliable sources
  10. Work accurately in a fast-paced environment with stringent deadlines
  11. Maintain accurate database information, ensuring all output conforms to strict broadcast quality standards, editorial policies, and client service level agreements
  12. Self-manage Key Performance Indicators on a daily basis
  13. Identify cross-skilling opportunities to support other projects

SKILLS & EXPERIENCE:

  1. Passionate about TV, movies, and the latest programs. An interest in sports is also essential
  2. Ability to write concise, snappy, and what we call snackable content
  3. Bachelor's degree or equivalent work experience in Communications or Journalism
  4. Excellent command of the English language, experience working with an editorial style guide, and impeccable written and verbal skills
  5. Exceptional command of English grammar, punctuation, and syntax
  6. Experience using content management systems
  7. Strong research, rephrasing, and rewriting skills
  8. Problem-solving mindset and willingness to take initiative while mitigating risks
  9. Must understand and quickly adapt to new processes and training
  10. Ability to work independently with excellent project and time management skills
  11. Strong communication skills and ability to adjust to rapidly shifting deadlines while remaining detail-oriented
  12. Ready to work in a 24/7 operation, including evening, night, and weekend shifts
  13. The role is hybrid: employees work partially from home, in the same city as the Nielsen office they are employed with, and partially from a Nielsen office/site

ABOUT THE TEAM

Gracenote, a Nielsen company, provides music, video, and sports content along with technologies to the world's hottest entertainment products and brands. Gracenote is also a global standard for music and video recognition, supported by the largest source of entertainment data: descriptions of more than 200 million tracks, TV listings for 85+ countries, and statistics from 4,500 sports leagues and competitions.

Please be aware that job seekers may be targeted by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Posted 3 months ago

Apply

Exploring Metadata Jobs in India

Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bengaluru
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi/NCR

These cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.

Average Salary Range

The average salary range for metadata professionals in India varies by experience level:

  • Entry-level: ₹3-6 lakhs per annum
  • Mid-level: ₹6-12 lakhs per annum
  • Experienced: ₹12-20 lakhs per annum

Salaries may vary based on the company, location, and specific job responsibilities.

Career Path

In the metadata field, a career typically progresses as follows:

  1. Metadata Analyst
  2. Metadata Specialist
  3. Metadata Manager
  4. Metadata Architect

As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.

Related Skills

In addition to metadata management, professionals in this field are often expected to have skills in:

  • Data analysis
  • Database management
  • Data modeling
  • Information governance

Having a combination of these skills can make job seekers more attractive to potential employers.
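Data modeling in this context often means designing the shape of a metadata record itself. A minimal sketch in Python of one such record, separating technical from business metadata; the field names are illustrative, not any standard schema:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class MetadataRecord:
    # Technical metadata: describes the physical asset.
    asset_name: str
    file_format: str
    size_bytes: int
    # Business metadata: describes meaning and ownership.
    owner: str
    description: str = ""
    tags: list = field(default_factory=list)

# Hypothetical record for a quarterly sales extract.
record = MetadataRecord(
    asset_name="sales_2024.parquet",
    file_format="parquet",
    size_bytes=1_048_576,
    owner="finance-team",
    tags=["sales", "quarterly"],
)
print(asdict(record)["owner"])  # finance-team
```

The technical/business split mirrors a distinction that comes up in interviews (see the question list below the heading that follows): technical metadata is machine-derived, while business metadata requires human curation.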

Interview Questions

  • What is metadata? (basic)
  • How do you ensure data quality in metadata management? (medium)
  • Can you explain the difference between structured and unstructured metadata? (medium)
  • What tools or software have you used for metadata management? (basic)
  • Describe a challenging metadata project you worked on and how you overcame obstacles. (advanced)
  • How do you stay updated with the latest trends in metadata management? (basic)
  • Explain the importance of metadata in data governance. (medium)
  • Have you ever had to resolve conflicts between different metadata standards? How did you handle it? (advanced)
  • What is the role of metadata in data integration? (medium)
  • How do you ensure metadata security and compliance with regulations? (medium)
  • What are the benefits of using metadata in data analytics? (basic)
  • Can you discuss a successful metadata strategy you implemented in a previous role? (advanced)
  • Explain the concept of metadata harvesting. (medium)
  • How do you handle metadata versioning and updates? (medium)
  • Have you worked with ontologies and taxonomies in metadata management? (advanced)
  • How do you collaborate with other teams, such as data scientists or developers, in metadata projects? (medium)
  • What are the common challenges faced in metadata management, and how do you address them? (advanced)
  • How do you measure the effectiveness of metadata initiatives in an organization? (medium)
  • Can you give an example of how metadata enhances data search and retrieval processes? (medium)
  • What role does metadata play in data lineage and traceability? (medium)
  • Explain the difference between technical metadata and business metadata. (basic)
  • How do you handle metadata migration when transitioning to a new system or platform? (advanced)
  • Describe a time when you had to prioritize metadata tasks based on business needs. (medium)
  • What are the best practices for documenting metadata to ensure consistency and accuracy? (medium)
  • How do you handle metadata conflicts or inconsistencies in a large dataset? (advanced)
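The harvesting question above is worth being able to illustrate concretely: harvesting means collecting metadata automatically from sources rather than entering it by hand. A stdlib-only sketch that harvests basic technical metadata from the files in a directory; the record field names are illustrative:

```python
import os
import tempfile

def harvest_file_metadata(directory):
    """Collect basic technical metadata for every file in a directory."""
    records = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            st = os.stat(path)  # size, timestamps, etc. from the filesystem
            records.append({
                "name": name,
                "size_bytes": st.st_size,
                "modified_at": st.st_mtime,
            })
    return records

# Demo against a throwaway directory with one known file.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "report.csv"), "w") as f:
        f.write("a,b\n1,2\n")
    print(harvest_file_metadata(d))
```

Real harvesters do the same thing against databases, file formats, or APIs (reading table schemas, Parquet footers, and so on), then load the records into a catalog.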

Conclusion

As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
