619 Metadata Jobs - Page 22

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2 - 4 years

6 - 10 Lacs

Hyderabad

Work from Office


Job Summary: We are seeking a Performance Engineer with expertise in SQL/PL-SQL query optimization, performance tuning, and the development of efficient PL/SQL stored procedures and functions that support data processing and calculations, following best practices within the BI environment. This role focuses on enhancing report performance, optimizing SQL queries, and improving ETL processes for efficient data processing and reporting. The ideal candidate has at least 8 years of experience in PL/SQL-based ETL development and database performance tuning, plus 2-4 years with BI Publisher and OAS RPD optimization. Familiarity with SQL Server is a plus.

Key Responsibilities:
Performance Optimization and BI Reporting: Identify and resolve slow-performing reports, dashboards, and queries in OAS/OBIEE. Optimize BI Publisher reports, ensuring efficient layouts, pagination, and caching strategies for fast rendering. Enhance query efficiency by tuning the OAS/OBIEE RPD metadata layers (Physical, Logical, and Presentation). Develop and maintain efficient PL/SQL stored procedures and functions that support data processing and calculations, using best practices within the BI environment. Monitor system performance metrics, including query response times, cache hit rates, and resource utilization.
PL/SQL-Based ETL Development and Optimization: Develop and maintain efficient PL/SQL stored procedures for ETL processes. Optimize ETL workflows, ensuring high-performance data transformations and aggregations. Troubleshoot data loading and transformation issues, ensuring seamless data integration.
Database and RPD Management: Manage the Oracle BI Repository (RPD), optimizing data models for reporting efficiency. Utilize query hints, materialized views, partitioning, and indexing to improve database performance; this requires a deep understanding of query execution plans, indexing, joins, partitioning, and caching. Implement best practices in data modeling, star schema design, and caching strategies. Collaborate with DBAs, BI developers, and report analysts to improve overall performance.

Qualifications and Experience: Strong expertise in PL/SQL-based ETL development and query optimization. Experience with OAS/OBIEE RPD metadata management and performance tuning. Ability to create BI Publisher reports with optimized layouts and data structures that ensure fast rendering times, and to analyze complex data sets to identify performance bottlenecks. Deep understanding of query execution plans, indexing, joins, and partitioning. Nice to have: familiarity with SQL Server query optimization; experience with cloud-based BI environments.
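
As a rough illustration of the tuning levers named above (execution plans, materialized views, and indexing), here is a minimal Python sketch using the python-oracledb driver. The connection details, table, and column names are hypothetical placeholders, not part of this posting.

```python
# Minimal sketch: inspect an execution plan, then pre-aggregate with a
# materialized view and add an index. Connection, table, and columns are hypothetical.
import oracledb

conn = oracledb.connect(user="bi_user", password="***", dsn="bi-db/ORCLPDB1")
cur = conn.cursor()

# 1. Capture the optimizer's plan for a slow report query.
cur.execute("EXPLAIN PLAN FOR SELECT region, SUM(amount) FROM sales GROUP BY region")
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)

# 2. Pre-aggregate with a materialized view and index the grouping column.
cur.execute("""
    CREATE MATERIALIZED VIEW mv_sales_by_region
    BUILD IMMEDIATE REFRESH COMPLETE ON DEMAND
    AS SELECT region, SUM(amount) AS total_amount FROM sales GROUP BY region
""")
cur.execute("CREATE INDEX idx_sales_region ON sales (region)")
conn.commit()
cur.close()
conn.close()
```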

Posted 2 months ago

Apply

6 - 8 years

10 - 17 Lacs

Pune

Work from Office


Experience: 6 to 7 years. Create and manage/edit metadata, hierarchies, ownership data, and data forms along with the rules attached to the forms. Create and manage/edit Financial Reporting (FR) and Smart View reports. Establish Smart View connections and create ad hoc analysis sheets.

Posted 3 months ago

Apply

4 - 6 years

6 - 8 Lacs

Gurgaon

Work from Office


BI Specialist / OBIEE Developer - Beamstacks, MG Road, Gurgaon. The Business Intelligence (BI) Specialist is responsible for the design, development, implementation, management, and support of mission-critical enterprise BI reporting and Extract, Transform, Load (ETL) processes and environments. Exposure to one or more implementations using OBIEE development and administration. Must have 6 years of development experience in PL/SQL. Experience developing the OBIEE Repository at all three layers (Physical, Business Model, and Presentation), along with interactive dashboards with drill-down capabilities using global and local filters, and security setups. Must have 3 years of experience in data modeling, ETL development (preferably OWB), ETL and BI tools installation and configuration, and Oracle APEX. Experience developing OBIEE Analytics interactive dashboards with drill-down capabilities using global and local filters, OBIEE security setup (users/groups, access/query privileges), configuring OBIEE Analytics metadata objects (Subject Area, Table, Column), and Presentation Services/Web Catalog objects (Dashboards, Pages, Folders, Reports). Hands-on development experience with OBIEE (version 11g or higher) and data modeling. Experience installing and configuring Oracle OBIEE in multiple lifecycle environments. Experience creating system architecture design documentation and presenting system architectures to management and technical stakeholders. Technical and functional understanding of Oracle OBIEE technologies. Good knowledge of OBIEE administration, best practices, and DWBI implementation challenges. Understanding and knowledge of data warehousing. Must have OBIEE certification on version 11g or higher. Experience with ETL tools. Experience with HP Vertica. Domain knowledge in Supply Chain, Retail, or Manufacturing. Responsibilities include developing architectural solutions utilizing OBIEE, working with project management to provide effort estimates and timelines, interacting with business and IT team members to move the project forward on a daily basis, leading the development of OBIEE dashboards and reports, and working with internal stakeholders and development teams throughout the project lifecycle.

Posted 3 months ago

Apply

1 - 6 years

3 - 8 Lacs

Kozhikode

Work from Office


We are looking for an experienced SEO Analyst who will improve the company's website visibility and rankings on search engines. This role involves developing and executing effective strategies, identifying opportunities for improvement, and staying on top of industry trends. The ideal candidate is highly organized, detail-oriented, and has excellent communication skills. In this role, the ideal candidate must continuously research and monitor performance metrics and analytics to measure campaign success. They must interpret the data appropriately to make informed decisions.

Roles and Responsibilities: Develop, implement, and monitor effective SEO strategies to increase website visibility. Conduct keyword research, content optimization, link building, and other initiatives. Create detailed reports on search engine performance metrics and analytics. Stay up to date with the latest trends and changes in the SEO industry and identify new opportunities for improvement. Work closely with web developers and content writers to ensure the website is optimized for search engine algorithms. Ensure that existing content is up-to-date and help create new content that adheres to SEO guidelines. Optimize SEO tags and metadata. Work with PPC campaigns to increase website visibility and engagement and reduce costs associated with advertising. Delegate tasks and oversee the execution of tasks. Act as the point of contact for clients on all matters related to SEO. Provide guidance and insights on the best strategies to improve website ranking.

Requirements: The role of an SEO Analyst requires a mix of technical and creative skills. The ideal candidate will have the following: Bachelor's degree in computer science, web development, or a related field. Experience with website optimization, link building, and content creation. Knowledge of SEO tools such as Google Analytics, Search Console, and SEMrush. Knowledge of HTML, CSS, and other web technologies. Excellent written communication skills. Analytical skills with the ability to interpret analytics data and make strategic decisions based on the results. Ability to handle multiple tasks and shift priorities. Strong problem-solving skills with the ability to come up with innovative solutions for improving website ranking. Passion for SEO and digital marketing. Ability to identify opportunities for improvement and lead the team in the right direction.

Benefits: Competitive salary and benefits package. Opportunities for professional growth and skill development. Work in a dynamic and innovative product-focused environment. Collaborative team culture with opportunities to make a significant impact. Flexible work hours.
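
For the on-page work described above (titles, meta descriptions, and other SEO tags), a quick audit script can flag missing or oversized metadata. This is a minimal sketch using the widely available requests and BeautifulSoup libraries; the URL list and length thresholds are illustrative assumptions, not part of the posting.

```python
# Minimal sketch: audit <title> and meta description tags for a list of pages.
# URLs and length limits are illustrative assumptions.
import requests
from bs4 import BeautifulSoup

PAGES = ["https://example.com/", "https://example.com/products"]
TITLE_MAX, DESC_MAX = 60, 160  # common (not official) display limits

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "").strip() if desc_tag else ""

    issues = []
    if not title:
        issues.append("missing <title>")
    elif len(title) > TITLE_MAX:
        issues.append(f"title too long ({len(title)} chars)")
    if not desc:
        issues.append("missing meta description")
    elif len(desc) > DESC_MAX:
        issues.append(f"description too long ({len(desc)} chars)")

    print(url, "->", "; ".join(issues) or "OK")
```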

Posted 3 months ago

Apply

2 - 5 years

4 - 7 Lacs

Gurgaon

Work from Office


Required Skill Set: Hands-on experience with the ERWIN Data Modeler and working on the Enterprise Data Mart. Understanding of Metadata Management principles (Data Dictionary activities). Hands-on experience creating ETL mappings. Should understand the impact of model changes on databases and be able to validate that a model supports straightforward requirements. Hands-on experience with 3NF data modeling and normalized schemas. Should be able to analyze business requirements to develop data models with minimal oversight. Hands-on experience implementing the Data Warehouse presentation layer, including design and definition of logical tables, physical sources, columns, and aggregation rules. Hands-on experience writing SQL queries and views, and optimizing SQL queries. Comprehensive knowledge of Bill Inmon's data modeling concepts (logical and physical), including 3NF, supertypes, subtypes, abstraction, and associative entities/tables. Requires strong decision-making, organizational, planning, and problem-solving skills. Nice to have: Knowledge of other data modeling tools such as PowerDesigner, Visio, and ER/Studio. Comprehensive knowledge of Ralph Kimball's data modeling concepts, including transaction dimensions, conformed dimensions and facts, slowly changing dimensions, multi-star/snowflake schema modeling, and different types of fact tables. Implementing a virtual semantic layer (virtual star schema) using views on a multi-tiered normalized Integrated Operational Data Store is a huge plus. 3NF (relational) logical and physical modeling.

Posted 3 months ago

Apply

2 - 4 years

4 - 6 Lacs

Bengaluru

Work from Office


About Lowe's: Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts, and providing disaster relief to communities in need. For more information, visit Lowes.com.

About the Team: The Lowe's Business Intelligence team delivers valuable insights to our Marketing partners. The team delivers quality analysis and reports that demonstrate a deep understanding of campaign performance and of our customer behavior, helping Lowe's win with its most strategic customer groups.

Job Summary: The purpose of the SEO Analyst role is to support the SEO Manager by executing and tracking SEO strategies to improve traffic to Lowes.com and Lowe's stores. The analyst will be responsible for analyzing and identifying gaps on the website and optimizing pages for improved search results and rankings through content and technical tactics such as link building, content briefs, metadata, copy blocks, duplicate content, page speed, and redirects.

Roles & Responsibilities: Execute and track data-driven Search Engine Optimization (SEO) strategies to improve traffic to Lowe's. Research and analyze performance, competition, and market trends to proactively identify gaps and opportunities and take corrective actions. Monitor KPIs such as visits, clicks, rank, conversion, time spent, and bounce rate on a regular cadence from multiple platforms such as Google Search Console, SEO Clarity, Botify, and Adobe. Conduct keyword research using internal and external tools to optimize the current keyword scope. Conduct site audits to improve user experience and search engine relevancy by optimizing on-page elements such as attributes, taxonomy structure, content, linking strategy, metadata descriptions, titles, and headlines.

Years of Experience: 2 - 4 years of experience working in core SEO tactics, content or technical. Experience with web analytics tools such as Adobe Analytics or Google Analytics. Experience with SEO tools such as SEO Clarity, Botify, Google Search Console, Screaming Frog, etc. Proven track record of delivering qualified search engine traffic.

Education Qualification & Certifications (optional) - Required Minimum Qualifications: Graduate in Business, Marketing, Communications, Business Management, Digital, E-Commerce, Engineering, or Data Analytics.

Skill Set Required - Primary Skills (must have): Proficiency in a minimum of 2 SEO tools such as Google Search Console, SEO Clarity, or Botify. Ability to interpret insights from data analytics tools. Strategic and analytical thinking, with attention to detail. Strong understanding of current and emerging search engine and leading algorithm technologies and factors influencing brand visibility. Operational excellence, quality outcomes, platform mastery, timely deliveries, and a proactive approach to problem solving. Proactive in achieving goals and driving measurable outcomes. Detail-oriented mindset with a focus on continuous improvement, staying up to date with industry trends and best practices. Team player with strong interpersonal skills and the ability to collaborate effectively with cross-functional teams.
Secondary Skills (desired): 2+ years of experience in Digital/E-commerce is highly desirable. Experience in omnichannel retail. Prior background in website analytics. Data visualization skills using Excel or Power BI to better present organic performance analyses and reports to stakeholders would be a bonus.

Posted 3 months ago

Apply

6 - 9 years

4 - 7 Lacs

Chennai

Work from Office


6 - 9 years of relevant experience in Oracle EPM FCCS and HFM. Minimum of 2 end-to-end hands-on implementations of the Oracle EPM FCCS application. Hands-on implementation experience with Hyperion Financial Management (HFM), Hyperion Financial Close Management (FCM), FDMEE, Smart View, and Hyperion Financial Reporting (HFR). Hands-on experience implementing FCCS for statutory reporting, management reporting, multi-GAAP, IFRS, CbCR reporting, allocations, intercompany eliminations, and currency translation requirements. Expert understanding of and experience with the functional aspects of EPM with respect to Income Statement, Balance Sheet, and Cash Flow reporting. FCCS/HFM rules writing: read, write, amend, and understand the impact on the system. FCCS/HFM metadata: read, write, amend, and understand the impact on the system. Experience building integrations between FCCS, ERP, and HCM using Data Management or FDMEE. Experience writing Business Rules, Calc Scripts, and Calc Manager rules to cater to various business functionalities. Understanding of and experience with the setup of security, user groups, and provisioning.

Posted 3 months ago

Apply

2 - 5 years

2 - 6 Lacs

Bengaluru

Work from Office


Job Details: Job Description - Site Reliability/Data Observability Engineer. The role of Site Reliability Engineer in the Data Analytics organization is to develop and support frameworks and mechanisms that monitor data, data pipelines, usage, user operations, data infrastructure, and compute statistics to proactively ensure data accuracy, reliability, system scalability, health, and performance of our analytics platforms on GCP/AWS. The role will play a crucial part in designing and implementing robust, scalable solutions while working with the Cloud/Data Engineering team. We are seeking a Site Reliability Engineer with over 3-5 years of technical expertise in data observability, data platforms, cloud infrastructure, and cloud operations. At Tyson Foods, the Site Reliability Engineer will have the opportunity to work on cutting-edge technologies and collaborate with talented professionals in a dynamic environment. We offer a culture that values innovation, growth, and work-life balance, along with opportunities for career advancement and professional development.

Primary Job Responsibilities: Collaborate with cross-functional teams to solve data and data pipeline issues and optimize system performance, capacity, and cost-efficiency. Implement and manage monitoring, logging, and observability solutions to ensure proactive identification and resolution of data and platform issues. Leverage or build mechanisms for root cause analysis and implement corrective actions to prevent recurrence. Develop the data collection strategy, metric definitions, anomaly detection, alerting, data pipeline monitoring, data quality assessment, and visualization creation. Develop automation scripts and leverage open-source or market tools to implement the Data Observability framework, streamline operational processes, and improve efficiency. Drive continuous collaboration and improvement initiatives to enhance the reliability, scalability, and performance of the data platform. This role requires strong technical expertise as well as excellent problem-solving and communication skills.

Qualifications: Minimum 3+ years of experience in a Site Reliability Engineering or similar role, with a strong focus on data observability, cloud infrastructure, and operations. Experience with data observability tools (e.g., Monte Carlo, Bigeye, Collibra, Acceldata, OpenMetadata, DataHub, Collate, Anomalo) or observability tools (e.g., Prometheus, Grafana, Datadog, New Relic). 2+ years and deep expertise in cloud platforms such as GCP, AWS, or Azure. Experience or strong knowledge of data analytics, data ingestion (e.g., Fivetran, HANA/SLT), data processing (e.g., dbt), and visualization tools (e.g., Power BI). Experience or strong knowledge of data quality. Experience with CI/CD pipelines and DevOps practices. 2+ years of scripting and programming skills (e.g., Python, Bash). Excellent communication skills with the ability to collaborate effectively across teams and influence stakeholders. Strong analytical and troubleshooting skills, with the ability to analyze complex issues and drive solutions. Bachelor's degree in Computer Science, Engineering, or a related field. Familiar with Agile methodology concepts.

Good to have: Master's degree in Computer Science or a related engineering field. Experience with containerization and orchestration technologies (e.g., Kubernetes, Docker). Proficiency in infrastructure-as-code (IaC) tools such as Terraform, CloudFormation, or Ansible. Experience with security best practices and compliance requirements in cloud environments. Certification in relevant cloud platforms (GCP, AWS, Azure, or Palantir). Familiarity with ITIL or other service management frameworks.

Relocation Assistance Eligible: No
Work Shift:
Tyson is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will be considered without regard to race, national origin, color, religion, age, genetics, sex, sexual orientation, gender identity, disability or veteran status.
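
As a concrete illustration of the monitoring and anomaly-detection responsibilities above, the sketch below checks a table's freshness and row-count volume and emits an alert when thresholds are breached. It is a minimal, hedged example: the connection, table name, and thresholds are hypothetical, and a real deployment would hand alerts to one of the tools listed in the posting.

```python
# Minimal sketch of a data observability check: freshness and row-count anomaly.
# Connection, table, and thresholds are hypothetical.
import datetime
import sqlite3  # stand-in for the analytics warehouse driver

FRESHNESS_SLA_HOURS = 6
ROW_DROP_THRESHOLD = 0.5  # alert if volume falls below 50% of the recent average

def check_table(conn, table: str) -> list[str]:
    alerts = []
    cur = conn.cursor()

    # Freshness: how old is the newest record?
    cur.execute(f"SELECT MAX(loaded_at) FROM {table}")
    latest = datetime.datetime.fromisoformat(cur.fetchone()[0])
    age_hours = (datetime.datetime.utcnow() - latest).total_seconds() / 3600
    if age_hours > FRESHNESS_SLA_HOURS:
        alerts.append(f"{table}: stale data ({age_hours:.1f}h old)")

    # Volume: compare today's row count with the 7-day average.
    cur.execute(
        f"""SELECT DATE(loaded_at) AS d, COUNT(*) FROM {table}
            GROUP BY d ORDER BY d DESC LIMIT 8"""
    )
    rows = cur.fetchall()
    today, history = rows[0][1], [r[1] for r in rows[1:]]
    if history and today < ROW_DROP_THRESHOLD * (sum(history) / len(history)):
        alerts.append(f"{table}: row count anomaly ({today} rows today)")
    return alerts

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    for alert in check_table(conn, "orders"):
        print("ALERT:", alert)  # in practice, push to Datadog/Grafana/etc.
```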

Posted 3 months ago

Apply

8 - 12 years

9 - 14 Lacs

Bengaluru

Work from Office


Certain roles at Tyson require background checks. If you are offered a position that requires a background check, you will be provided additional documentation to complete once an offer has been extended.

Job Details: SUMMARY: The Lead Data Modeler is responsible for the development of high-performance, scalable enterprise data models on a cloud platform. Should possess strong SQL skills along with excellent data modeling skills, and be well versed in the Kimball methodology. Should participate in various activities throughout the systems development lifecycle, participate in support activities, take part in POCs, and present the outcomes in an acceptable manner. Also responsible for analyzing, architecting, designing, programming, and debugging existing and new products, as well as mentoring team members. Takes ownership and demonstrates high professional and technical ethics, including a consistent focus on emerging and predominant technologies that help the organization.

Essential Duties and Responsibilities: 10+ years of work experience in data modeling or engineering. Define, design, and implement enterprise data models. Build Kimball-compliant data models in the analytic layer of the data warehouse. Build third normal form (3NF) compliant data models in the hub layer of the data warehouse. Translate tactical/strategic requirements to ensure effective solutions that meet business needs. Demonstrated ability to take ownership of initiatives while remaining comfortable seeking help. Participate in and provide consultation on complex initiatives. Comfortable tackling new problems and learning along the way. Review specifications and coach to ensure consistency in approach and use. Research improvements in coding standards and participate in code reviews. Refactor code to improve testability and maintainability when needed. Perform detailed technical design, development, and unit testing of custom applications and data flows in the context of projects, releases, and production support. Deliver high-quality code for features and bug fixes. Demonstrated ability to effectively adapt to changing technology. Mentor and coach team members.

Requirements - Technical Skills: Hands-on experience in SQL query optimization, working on RDBMS and data warehouses (ER and dimensional modeling). Experience modeling data into star schemas using the Kimball methodology. Experience modeling data into third normal form. Experience with Agile methodology. Experience with CI/CD frameworks and common DevOps practices. Experience working in an onsite-offshore model. Awareness of the metadata, data quality, and governance needs of data and analytics platforms.
Soft Skills: Strong leadership, analytical, engineering, problem-solving, presentation, and communication skills. Ability to work with a broad group of team members with diverse backgrounds. Ability to take a decision and articulate it back to leadership as well as business teams. Ability to guide other team members through a complex problem or deliverable.
Education: Bachelor's degree in Computer Science, Information Systems, or another technical area (preferably B.E. in Computer Science/Information Technology).
Nice to have skills: Apache Spark. Python. Experience with graph databases. Experience in data identification, ingestion, transformation, and consumption related work. Good knowledge of data visualization. Familiarity with SAP Enterprise S/4HANA. Programming language skills (Python, NodeJS, Unix scripting). Experience with the GCP cloud ecosystem (BigQuery, Dataflow, Airflow, Cloud Storage, and equivalent tools on other cloud platforms; design/build data services). Experience in all aspects of software engineering across these deliverables: define, architect, build, test, and deploy.

Relocation Assistance Eligible: No
Work Shift:
Tyson is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will be considered without regard to race, national origin, color, religion, age, genetics, sex, sexual orientation, gender identity, disability or veteran status.
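
To make the Kimball-style expectations above concrete, here is a minimal sketch of a star schema DDL (one fact table plus dimensions, with simple SCD Type 2 columns) issued from Python. Table and column names are illustrative assumptions, and SQLite stands in for the cloud warehouse mentioned in the posting.

```python
# Minimal sketch: a Kimball-style star schema (fact + dimensions).
# SQLite stands in for the warehouse; names are illustrative.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,      -- e.g. 20240131
    full_date    TEXT NOT NULL,
    fiscal_week  INTEGER,
    fiscal_year  INTEGER
);
CREATE TABLE dim_product (
    product_key    INTEGER PRIMARY KEY,
    product_id     TEXT NOT NULL,          -- natural/business key
    category       TEXT,
    effective_from TEXT,                   -- SCD Type 2 tracking columns
    effective_to   TEXT,
    is_current     INTEGER DEFAULT 1
);
CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key  INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity     INTEGER,
    net_amount   REAL,
    PRIMARY KEY (date_key, product_key)
);
"""

conn = sqlite3.connect("analytics.db")
conn.executescript(DDL)
conn.commit()
conn.close()
```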

Posted 3 months ago

Apply

6 - 10 years

7 - 11 Lacs

Pune

Work from Office


Responsibilities: Working with architects, R&D, Product Managers, and third-party vendors to understand the high-level design, cross-interface impact, and architectural or non-functional features, and to detail the requirements. Defining detailed functional product requirements (source-to-target mapping, transformation rules, business logic, and data integration requirements) in alignment with business needs and product requirements. Defining non-functional product requirements around performance, serviceability, etc. Interfacing with customers to understand their data requirements and actively participating in the implementation of the product in customers' environments. Facilitating mutual understanding between multiple product and engineering teams through clear communication. Prioritizing and scoping user stories with Product Management and Engineering to ensure on-time delivery of agreed products and features. Defining relevant acceptance criteria for user stories based on product usage behavior. Conducting internal product demos and assisting stakeholders in demo activities. Defining relevant functional test considerations for all user stories based on business needs. Providing support to Customer Support, the Product team, and Technical Presales teams as needed to address product questions and issues. Making product feature, function, and design recommendations as needed to accomplish strategic goals.

YOUR SKILLS AND EXPERIENCE: Experience - 5 to 10 years. Good exposure to BFSI - A MUST. Knowledge of at least one of these domains - Anti-Money Laundering (AML), Fraud Prevention, Capital Markets, or Market Surveillance - A MUST. Well versed in writing business use cases/scenarios and functional test cases - A MUST. Ability to multitask and prioritize work. Ability to understand technical solutions. Ability to see the big picture and translate it into detailed needs. Fast learner with the ability to scale up to relevant technology. Demonstrated ability to analyze, model, and communicate business requirements. Hands-on experience mapping source-to-target fields, transformation rules, and data management, and defining non-functional requirements for a data transformation engagement. Good understanding of schemas, metadata, and standard banking interfaces available in the market. Familiarity with data integration involving data quality and validation. Experience creating specifications from business requirements and delivering and explaining them to software teams. Good communication and presentation skills (English). Strong understanding of SQL. Strong analytical skills with an ability to influence decision making. A team player who demonstrates a strong work ethic, creativity, assertiveness, and flexibility. Should have played the role of Business Analyst/Data Analyst in a couple of engagements. Experience with Agile development methodologies, user stories, acceptance criteria, feature prioritization, and defining product specifications. Experience with tools such as MS Teams, JIRA, Aha!, MS Excel, MS Access, and MS Visio.

Posted 3 months ago

Apply

3 - 7 years

10 - 14 Lacs

Hyderabad

Work from Office


Job title: Principal Data Standards Analyst
Location: IN / Hyderabad
Grade: L2-2
Hiring Manager: Manuel Anido

About the job: Our Hubs are a crucial part of how we innovate, improving performance across every Sanofi department and providing a springboard for the amazing work we do. Build a career here and you can be part of transforming our business while helping to change millions of lives. Ready? As a member of the Clinical Information Governance (CIG) team, the Principal Data Standards Analyst acts as a metadata expert, providing in-depth knowledge and guidance on clinical data standards (CDISC) and best practices for metadata management across global and study-specific levels. The analyst understands the CDISC models (CDASH, SDTM, External Data, Controlled Terminology) and all regulatory requirements regarding data standards, and actively leads Therapeutic Area Working Groups. The Principal Data Standards Analyst bridges the gaps between the global metadata strategy and its adoption within therapeutic area studies, represents Sanofi in internal and external network initiatives (CDISC, TransCelerate, etc.), actively monitors Health Authority requirements, and promotes data standards knowledge and best practices within Sanofi. At the study level, the Principal Data Standards Analyst supports clinical teams in supervising the review of study-specific metadata based on customer needs and provides effective solutions through regular analysis of information reported from a broad variety of sources. Are you ready to shape the future of medicine? The race is on to speed up drug discovery and development to find answers for patients and their families. Your skills could be critical in helping our teams accelerate progress. Join our Clinical Information Governance team as Principal Data Standards Analyst and you'll help shape the future of clinical data standards at Sanofi and across our industry.

Main responsibilities: Acts as a metadata lead expert for CDASH, SDTM, External Data, and Controlled Terminology. Understands the application of the CDISC models across the life cycle of a trial. Leads global/study request review meetings. Leads eCRF review (study level) with the study team and ensures study-specific metadata are aligned with the CIG Clinical Data Standards strategy. Reviews study requests (Data Collection, External Data, Controlled Terminology, SDTM). Develops CDISC-compliant end-to-end metadata specifications for new study-specific forms. May lead the creation and upload of study-specific controlled terminology (CT) in the clinical data repository system. Supports Clinical Data Standard Leaders with the management of global requests (review, meetings, implementation in Sanofi metadata, metadata QC, standard documentation updates). Participates in core and TA standard needs definition (SDTM and TA Working Groups). Contributes to global governance process definition and/or optimization. Monitors study-specific forms developed with standard potential at the therapeutic area level.

About you - Experience: Relevant professional experience in the pharmaceutical industry, with strong involvement in the clinical data flow, specialized in data standards management. Strong knowledge of industry data standards and practices (e.g., CDISC/CDASH/SDTM). High-level CDISC skills and metadata governance practices (CDISC certification in one or more of the models preferred). Familiar with end-to-end clinical data flows and data structures.
Soft and technical skills: Strong English skills (verbal and written), with the ability to exchange fluently in a global environment. Efficient communication skills and good organization skills. Ability to negotiate and gain acceptance from others. Ability to coordinate/oversee multiple tasks simultaneously. Project team collaboration, interacting with internal or external partners in and outside the department and with their leaders. Self-motivated and results driven, with attention to detail and quality.
Education: Bachelor's degree or above, preferably in Life Science or a related field.
Languages: Excellent English language knowledge - written and spoken.

Why choose us: Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention, and wellness programs and at least 14 weeks of gender-neutral parental leave. This role is critical to our team's success and provides exposure to industry-wide developments of data standards. It is an opportunity to become a leader within a network of subject matter experts collaborating to shape the future of data standards for clinical research. Pursue progress, discover extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people. At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, ability or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Posted 3 months ago

Apply

10 - 15 years

20 - 30 Lacs

Hyderabad

Work from Office


Experience needed: 12+ years. Type: Full-Time. Mode: 100% WFO (Monday to Friday). Shift: General Shift. Location: Hyderabad, India.

Job Summary: We are seeking an experienced Data Modeller Lead to design, develop, and manage data models that support business intelligence, analytics, and data governance initiatives. The ideal candidate will have a strong background in data architecture, data modeling (conceptual, logical, and physical), and database design. They will collaborate with stakeholders across the organization to ensure data consistency, quality, and compliance with industry standards.

Skills and Experience: 12+ years of experience in data analysis, business systems, or a similar role. Strong leadership qualities, project/task management skills, and analytical skills, with the ability to work with distributed teams. Work with business stakeholders, SMEs, and technical teams to develop business and technical requirements. Experience analyzing complex data systems and mapping them to Azure Data Lake. Create data transformation and mapping documents in collaboration with the client. Good understanding of database principles and hands-on experience with advanced SQL to perform complex analysis. Good understanding of data movement and optimization of ETL processing of vast datasets within specified SLAs. Expert knowledge of data modeling concepts. Experience with data modeling and generating metadata to support relational and non-relational database implementations. Experience building logical and physical data models. Documentation of data platforms for easy business consumption. Strong communication skills, both written and verbal, and skilled at adjusting communication style and vocabulary to an audience's role/function and technical familiarity. Knowledge and understanding of financial services, preferably related to equipment financing.

Posted 3 months ago

Apply

5 - 8 years

9 - 14 Lacs

Hyderabad

Work from Office


Position: Knowledge and Content Management
Location: Hyderabad
Duration: 6 months - extension depends on performance and business
Budget: 80,865.60 - 120,483.20 per month

5+ years of experience with business consulting, knowledge management, content management, or related work in a consulting, corporate, or enterprise environment. Experience working in multinational, multicultural environments required. Strong experience with information, knowledge, and/or content management, DAM systems, metadata, taxonomies, and databases. Project management skills and experience a plus. Excellent communication and collaboration skills.

JD - Description: Key Responsibilities
Stakeholder management and engagement: Strategic stakeholder management: develop and own relationships, build partnerships, communicate clearly and effectively, and influence toward positive outcomes aligned with solution principles. Training and support: effectively explain solution requirements, functionalities, and processes to assigned stakeholders. Communities of Practice: participate or lead, direct toward clear actions, and focus others on impact, value, and principles. Support change and adoption management activities.
Knowledge and content management: Knowledge architecture: understand and maintain knowledge content organization within solutions (e.g., knowledge pages, zones, menus, taxonomies). Knowledge capture: request/receive, upload, tag, organize, and regularly review content. Knowledge curation: identify and organize content to meet specific business needs (e.g., configure targeted feeds or alerts, build and distribute newsletters). Data privacy/compliance: commit to learning about the various data privacy, legal, ethics, risk, and compliance requirements around different types of content and how to facilitate stakeholders in abiding by these requirements in relation to KM solutions.
Continuous improvement: Data and insights: analyze data related to knowledge utilization as well as user feedback to develop insights and propose actions to improve knowledge relevancy, usage, and impact. Reporting and project management: generate regular reports/updates to communicate status on key activities and initiatives. Continuous improvement: collaborate with the wider team and business stakeholders to manage continuous improvement activities around the knowledge content or aspects of the knowledge solution. Value-add: proactively collaborate with the team to identify, design, develop, and pilot technical, process, governance, or adoption improvements that add value for the organization.

Interested? Share CV: busiraju.sindhu@manpower.co.in

Posted 3 months ago

Apply

2 - 5 years

7 - 11 Lacs

Bengaluru

Work from Office


Software Engineer - Java. The Fixed Income Applications development team is focused on building and supporting a reference data system. The team's responsibilities span request/response-based metadata distribution for various financial products (bonds, futures, options, FX spots/forwards, deposits, swaps, commodities, swaptions, CDX, CDS, equities, etc.), dealing with batch and on-demand security creation and updates, building infrastructure for keeping the metadata current and accurate, and providing multiple means of dissemination to downstream systems (such as analytics, risk, and trader systems). While not a low-latency system, it is perceived as a high-availability cluster capable of serving both existing securities and securities created upon request based on external metadata. Team members interact directly with operations teams and other technology teams, so solid communication skills are essential. The team owns the entire software lifecycle, from requirements and design, through implementation, to production releases and support. Release cycles are tight, so in addition to strong development skills, you must have demonstrated the ability to adapt to changing conditions and learn quickly. There are no business analysts on the team, so we expect developers to have sufficient business and product knowledge to understand the requirements on their own. That said, this is not a particularly quantitative role - there is a separate Analytics team that undertakes valuation and related work. We focus more on building up and supporting the technical infrastructure.

Required skills/experience: 4+ years of professional experience with Java. 3+ years of SQL database development skills. Solid grasp of multithreading, algorithms, and data structures. Familiarity with event streaming platforms such as Kafka, RabbitMQ, etc. Results-oriented; can deliver quality code with quick turnaround. Self-starter and critical thinker who takes ownership of projects and makes improvement suggestions for the entire infrastructure.

Preferred skills/experience: Fixed income product knowledge would be a plus. Spring/Spring Boot experience. Experience with vendor feeds (Bloomberg SAPI/BPIPE, Markit). Distributed caching (e.g., Hazelcast, Redis, Memcached, Ignite, Ehcache). Python experience for unit testing and scripts.

Posted 3 months ago

Apply

3 - 5 years

5 - 7 Lacs

Bengaluru

Work from Office


Company Overview: At Motorola Solutions, we're guided by a shared purpose - helping people be their best in the moments that matter - and we live up to our purpose every day by solving for safer. Because people can only be their best when they not only feel safe, but are safe. We're solving for safer by building the best possible technologies across every part of our safety and security ecosystem. That's mission-critical communications devices and networks, AI-powered video security and access control, and the ability to unite voice, video, and data in a single command center view. We're solving for safer by connecting public safety agencies and enterprises, enabling the collaboration that's critical to connect those in need with those who can help. The work we do here matters.

Department Overview: Our IT organization isn't just here to support our business. We're here to reinvent it - by changing the way our customers, partners, and employees interact with our company. To do that, we're looking for people who bring great ideas and who make our partners' ideas better. Intellectually curious advisors (not order takers) who focus on outcomes to creatively solve business problems. People who not only embrace change, but who accelerate it.

Job Description: As a dynamic technology enterprise that operates on a global scale, Motorola Solutions and its products present an attractive target for malicious actors. This role allows you to use your cybersecurity and software engineering skills to protect the people who protect us. Our customers are first responders: fire, police, paramedics, 911 call takers, and 911 dispatchers. And when we or our loved ones place that 911 call, we become the customer of our customers. We want that call to be answered and the communications between the dispatcher and the first responder to be available. If you are passionate about securing mission-critical products and services and thrive in a global, collaborative, cross-functional team, we have an exciting opportunity for you that involves secure software development practices, vulnerability management, and other cutting-edge cybersecurity projects. In an ever-evolving threat landscape, the Product Security team is at the forefront of defending against cyber threats, helping the business ensure the confidentiality, integrity, and availability of our products. As a Vulnerability Management Software Developer, you will design, develop, and implement the tooling to integrate data from multiple sources to augment our vulnerability management platform. This provides product development teams with actionable insights to further safeguard data and systems from potential threats. MSI provides a work environment that encompasses workplace flexibility, continued professional growth through paid training and certifications, conferences and seminars, and education assistance. Our culture encourages the honing of current skills and the building of new capabilities. We prize flexibility, continuous improvement, and collaboration, both within the team and with industry peers.

Scope of Responsibilities/Expectations: Strong team player with the ability to work with a geographically dispersed team. Experience bringing order to the chaos of ingesting data from a wide variety of sources. Implement, document, and maintain software solutions which may pull data from different REST APIs, perform calculations or data extraction, and store values to databases. Design and implement web applications with a great user experience. Develop and manage ongoing process improvements focusing on automation, accuracy, and timeliness of information. Work with multiple data sources, including vulnerability and other enterprise data, for contextual enrichment to drive actionable outcomes. Identify, root-cause, and remediate data quality issues. Familiarity with parsing, extracting, and reshaping data, primarily in JSON format.

Desired Background/Knowledge/Skills: Strong background in software development and modern programming languages - preferably Python/JavaScript/TypeScript and SQL. Extensive experience working with API technologies. Basic knowledge of widely used application development technologies (e.g., ReactJS, FastAPI). Knowledge of common application vulnerabilities (e.g., OWASP Top 10), attack techniques, and remediation tactics/strategies. Working knowledge of CI/CD workflows/pipelines and DevOps practices. Basic knowledge of cloud services deployment, containers, etc. Skill in delivering and speaking to technical concepts for a wide variety of audiences. Knowledge of cybersecurity and secure coding principles and best practices. Skill in using code analysis tools such as SAST, DAST, and SCA tools. Ability to research and learn new topics and become functional with them quickly. Aversion to repeated manual processes and a strong desire to automate them. Attention to detail for data organization and consistency. Interest in modeling data and relationships (semantic/property graphs). Curiosity about new and emerging technologies and how we can use them. Knowledge of container-based workflows and deploying applications to Kubernetes. Advanced skills in building processes supporting data transformation, data structures, metadata, dependency, and workload management. Familiarity with building REST or GraphQL APIs and data objects. Familiarity with Elasticsearch, Elasticsearch Query Language, and Kibana a plus.

Basic Requirements: Bachelor's degree in a related field or equivalent work experience. 3-5 years in software development.
Travel Requirements: None
Relocation Provided: None
Position Type: Experienced
Motorola Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion or belief, sex, sexual orientation, gender identity, national origin, disability, veteran status or any other legally-protected characteristic. We're committed to providing an inclusive and accessible recruiting experience for candidates with disabilities, or other physical or mental health conditions. To request an accommodation, please email ohr@motorolasolutions.com.
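
The responsibilities above centre on pulling vulnerability data from REST APIs, reshaping the JSON, and persisting it for enrichment. Below is a minimal, hedged sketch of that flow using the standard requests and sqlite3 libraries; the endpoint, token, and field names are hypothetical placeholders rather than any specific vendor's API.

```python
# Minimal sketch: ingest vulnerability findings from a REST API, reshape the
# JSON, and store them for later enrichment. Endpoint and fields are hypothetical.
import sqlite3
import requests

API_URL = "https://vuln-scanner.example.com/api/v1/findings"  # placeholder
API_TOKEN = "***"  # placeholder; use a secrets manager in practice

def fetch_findings() -> list[dict]:
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"status": "open"},
        timeout=30,
    )
    resp.raise_for_status()
    # Keep only the fields downstream teams act on.
    return [
        {
            "cve_id": item.get("cve", {}).get("id"),
            "severity": item.get("severity"),
            "asset": item.get("asset", {}).get("hostname"),
            "first_seen": item.get("first_seen"),
        }
        for item in resp.json().get("findings", [])
    ]

def store(findings: list[dict]) -> None:
    conn = sqlite3.connect("vulns.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS findings
           (cve_id TEXT, severity TEXT, asset TEXT, first_seen TEXT)"""
    )
    conn.executemany(
        "INSERT INTO findings VALUES (:cve_id, :severity, :asset, :first_seen)",
        findings,
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    store(fetch_findings())
```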

Posted 3 months ago

Apply

2 - 5 years

7 - 10 Lacs

Hyderabad

Work from Office


Primary Duties and Responsibilities: Support and maintain Oracle EPM modules (PBCS, EPBCS, FCCS, PCMCS, ARCS). Create metadata, forms, reports, business rules, calculation scripts, and Groovy scripts. Manage user provisioning, security, and approval process flows. Manage data flows between EPM Cloud and other systems using data integration tools, ensuring data accuracy and consistency. Configure EPM Cloud integrations, including data exchange, data mappings, and pipelines. Create and support Groovy and/or JavaScript customization and personalization within EPM Cloud. Serve as the liaison between the Accounting and IT departments to ensure the proper operation and maintenance of the financial systems in use. Support and troubleshoot installation of Oracle Smart View and EPM extensions. This role serves as a technical point of contact for system maintenance and configuration, ensuring data integrity, system testing, reporting, and process improvements. Lead testing and verification efforts for quarterly production releases, executing unit test plans and verifying business user acceptance testing.

Education and Experience: Bachelor's degree in Computer Science or another related field required; equivalent related experience may be considered in lieu of a degree. 5+ years of EPM Cloud/Hyperion Financial applications support and data integrations development experience. Smart View scripting and design experience preferred. Oracle and SAP experience a plus. Experience administering the system, user testing, and user training related to version upgrades and new releases.

Skills: Ability to blend accounting knowledge and IT logic to recommend and implement improvements to processes and financial reporting. Demonstrated problem solving and work prioritization skills. Ability to keep up to date with technology and apply it to the business strategic plan. Ability to achieve results independently or working with others. Excellent interpersonal and communication skills; ability to communicate effectively with end users and management and serve as a liaison between the accounting and IT staff. Ability to handle multiple priorities involving internal customer requests and demands. Ability to excel in a cross-organizational, cross-cultural, global team environment. Ability to handle special assignments promptly and professionally. Set a high standard of ethics, professionalism, leadership, and competency.

Working Conditions: Flexibility on timings is required to support various time zones, primarily US time zones. The work mode of Finisar India is hybrid, i.e. 3 days at the office.

Culture Commitment: Ensure adherence to the company's values (ICARE) in all aspects of your position at Coherent Corp.: Integrity - Create an Environment of Trust. Collaboration - Innovate Through the Sharing of Ideas. Accountability - Own the Process and the Outcome. Respect - Recognize the Value in Everyone. Enthusiasm - Find a Sense of Purpose in Work.

Coherent Corp. is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law. Finisar India (a subsidiary of Coherent Corp) is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to gender identity, sexual orientation, race, color, religion, national origin, disability, or any other characteristic protected by law.

Posted 3 months ago

Apply

5 - 12 years

7 - 11 Lacs

Chennai

Work from Office


Title: Data Modeler
Mode of working: Full Time
Work Location: Chennai, India
Work Experience: 8-12 years

Job Description - Key Skills Required for the Data Modeler Role: Data Modeling Expertise - Ability to analyze and translate business needs into long-term data models. Metadata Management - Strong knowledge of metadata management and related tools. Machine Learning Experience - 5-8+ years of experience in machine learning in production. Statistical Analysis - Knowledge of mathematical foundations and statistical methods. Database Systems - Evaluating and optimizing existing data systems. Data Flow Design - Creating conceptual data models and data flows. Coding Best Practices - Developing best practices for data coding to ensure consistency. System Optimization - Updating and troubleshooting data systems for efficiency. Collaboration Skills - Working with cross-functional teams (Product Owners, Data Scientists, Engineers, Analysts, Developers, Architects). Technical Documentation - Preparing training materials, SOPs, and knowledge base articles. Communication and Presentation - Strong interpersonal, communication, and presentation skills. Multi-Stakeholder Engagement - Ability to work with multiple stakeholders in a multicultural environment. Data Modeling Certification - Desirable but not mandatory.

Posted 3 months ago

Apply

8 - 12 years

12 - 13 Lacs

Noida

Work from Office


We're looking for a Digital Marketing Manager to join our Product team in Noida. Working at Taazaa involves engaging with cutting-edge technology and innovative software solutions in a collaborative environment. We emphasize continuous professional growth, offering workshops and training. Our employees often interact with clients to tailor solutions to business needs, working on diverse projects across industries. We promote work-life balance with flexible hours and remote options, fostering a supportive and inclusive culture. Competitive salaries, health benefits, and various perks further enhance the work experience. Looking ahead, we aim to expand our technological capabilities and market reach, investing in advanced technologies and expanding our service offerings. We plan to deepen our expertise in AI and machine learning, enhance our cloud services, and continue fostering a culture of innovation and excellence. Taazaa is committed to staying at the forefront of technology trends, ensuring it delivers impactful and transformative solutions for its clients. We are looking for a results-driven Digital Marketing Manager to develop, implement, track, and optimize our digital marketing campaigns across multiple channels. The ideal candidate should have a strong grasp of current marketing tools, trends, and best practices to lead integrated digital marketing strategies that drive brand awareness, lead generation, and customer engagement.

What you'll do: Develop and Execute Strategies: Create and manage digital marketing campaigns across SEO, SEM, email marketing, social media, and paid advertising. SEO and SEM: Lead and mentor the SEO and SEM teams to optimize website content, structure, and metadata for improved search engine rankings and organic traffic growth; oversee and strategize PPC campaigns on Google Ads and other platforms. Social Media Marketing: Develop and oversee social media strategies to increase engagement, brand awareness, and conversions. Content Marketing: Work with the content team to create and distribute engaging content, including blogs, videos, and email newsletters. Analytics and Reporting: Monitor and analyze digital performance metrics, providing insights and recommendations for optimization. Lead Generation and Conversion: Optimize digital channels for lead generation and implement CRO (Conversion Rate Optimization) strategies. Email Marketing: Design and execute email marketing campaigns for nurturing leads and engaging customers. Marketing Automation: Leverage marketing automation tools to streamline and scale digital campaigns. Website Management: Collaborate with web developers and designers to enhance user experience and site performance. Stay Updated: Keep up with digital marketing trends, algorithm updates, and emerging technologies to maintain a competitive edge.

Your qualifications: Bachelor's degree in Marketing, Business, Communications, or a related field. 8+ years of experience in digital marketing, with proven success in managing campaigns. Hands-on experience with SEO/SEM, Google Ads, and social media advertising. Proficiency in analytics tools (Google Analytics, SEMrush, HubSpot, etc.). Strong knowledge of email marketing, marketing automation, and CRM systems. Excellent written and verbal communication skills. Data-driven mindset with strong analytical skills.

Preferred Qualifications: Experience in B2B marketing, SaaS, or technology-related industries. Certifications in Google Ads, HubSpot, or other relevant platforms. Familiarity with A/B testing and conversion rate optimization techniques.

Behavioral: Here are five essential behavioral skills a Digital Marketing Manager should possess: Adaptability: Digital marketing is constantly evolving with new tools, platforms, and trends. A successful manager must be adaptable to change, able to pivot strategies when necessary, and open to learning and experimenting with emerging technologies. Leadership and Team Management: A manager should be able to inspire and motivate their team, delegate tasks efficiently, and maintain a positive, collaborative work environment. Strong leadership skills ensure that the team is aligned with organizational goals and performs to the best of its ability. Analytical Thinking: Digital marketing requires a keen understanding of data and metrics to make informed decisions. A manager should possess strong analytical skills to interpret campaign results, understand customer behavior, and identify opportunities for improvement. Creativity and Problem-Solving: Creativity is essential for designing compelling campaigns, ads, and content that engage audiences. A digital marketing manager should also be a problem-solver, capable of thinking outside the box to overcome challenges and drive innovative solutions. Effective Communication: Clear and effective communication is crucial for coordinating with stakeholders, clients, and team members. A manager must be able to convey complex ideas in a simple way, listen actively, and ensure that there's consistent messaging across all channels.

What you'll get in return: Joining Taazaa Tech means thriving in a dynamic, innovative environment with competitive compensation and performance-based incentives. You'll have ample opportunities for professional growth through workshops and certifications, while enjoying a flexible work-life balance with remote options. Our collaborative culture fosters creativity and exposes you to diverse projects across various industries. We offer clear career advancement pathways, comprehensive health benefits, and perks like team-building activities.

Who we are: Taazaa Tech is a kaleidoscope of innovation, where every idea is a brushstroke on the canvas of tomorrow. It's a symphony of talent, where creativity dances with technology to orchestrate solutions beyond imagination. In this vibrant ecosystem, challenges are sparks igniting the flames of innovation, propelling us towards new horizons. Welcome to Taazaa, where we sculpt the future with passion, purpose, and boundless creativity.

Posted 3 months ago

Apply

6 - 8 years

12 - 17 Lacs

Noida

Work from Office


AI/ML Engineer: 6-8 years of experience. Strong hands-on experience in Python with Flask and APIs. Understanding of machine learning concepts and algorithms, including supervised and unsupervised learning. Proficiency in using Azure AI services such as Azure OpenAI, NER, metadata extraction, and document parsing. Hands-on experience setting up Azure VPCs, private endpoints, App Services, and Azure Functions. Able to handle Azure AD connections with roles and identity management. Knowledge of NLP techniques for text analysis, sentiment analysis, and language understanding, plus security and compliance.

Mandatory Competencies: Python - Python. Python - REST API. Data Science - Machine Learning (ML).

At Iris Software, we offer world-class benefits designed to support the financial, health, and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
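
As an illustration of the Flask-plus-NER stack this posting asks for, here is a minimal sketch of a named-entity extraction endpoint. It uses spaCy purely as a stand-in for the Azure AI/NER services named above; the model name and route are assumptions, not part of the original listing.

```python
# Minimal sketch: a Flask endpoint that returns named entities for posted text.
# spaCy stands in for the Azure AI NER service; model/route names are assumptions.
from flask import Flask, jsonify, request
import spacy

app = Flask(__name__)
nlp = spacy.load("en_core_web_sm")  # small general-purpose English model

@app.route("/ner", methods=["POST"])
def extract_entities():
    payload = request.get_json(silent=True) or {}
    text = payload.get("text", "")
    if not text:
        return jsonify({"error": "missing 'text' field"}), 400

    doc = nlp(text)
    entities = [
        {"text": ent.text, "label": ent.label_, "start": ent.start_char, "end": ent.end_char}
        for ent in doc.ents
    ]
    return jsonify({"entities": entities})

if __name__ == "__main__":
    app.run(port=8080, debug=True)
```

A POST of {"text": "..."} to /ner would return the detected organizations, people, and locations as JSON, which is the shape of metadata-extraction service the role describes.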

Posted 3 months ago

Apply

8 - 13 years

25 - 30 Lacs

Pune

Work from Office

Naukri logo

Experience with processing large workloads and complex code on Spark clusters. Experience setting up monitoring for Spark clusters and driving optimization based on insights and findings. Understanding of designing and implementing scalable data warehouse solutions to support analytical and reporting needs. Strong analytical skills related to working with unstructured datasets. Understanding of building processes supporting data transformation, data structures, metadata, dependency, and workload management. Knowledge of message queuing, stream processing, and highly scalable big data stores. Knowledge of Python and Jupyter Notebooks. Knowledge of big data tools like Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools like Azkaban, Luigi, Airflow, etc. Experience with AWS cloud services (EC2, EMR, RDS, and Redshift). Willingness to work from an office at least 2 times per week.

Nice to have: Knowledge of stream-processing systems (Storm, Spark Streaming).

Responsibilities: Optimize Spark clusters for cost, efficiency, and performance by implementing robust monitoring systems to identify bottlenecks using data and metrics. Provide actionable recommendations for continuous improvement. Optimize the infrastructure required for extracting, transforming, and loading data from various data sources using SQL and AWS big data technologies. Work with data and analytics experts to strive for greater cost efficiencies in the data systems.
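
The optimization responsibilities above lend themselves to a short illustration. The sketch below, with placeholder paths and column names, shows a typical PySpark tuning step: inspecting the physical plan, repartitioning on a join key, and caching a reused DataFrame.

    # Sketch of the kind of Spark tuning this role describes. The input path,
    # column names, and shuffle-partition setting are placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("workload-tuning-sketch")
        .config("spark.sql.shuffle.partitions", "200")  # tune per workload size
        .getOrCreate()
    )

    events = spark.read.parquet("s3://example-bucket/events/")  # placeholder path
    # Repartition by the join key to reduce shuffle skew, then cache for reuse.
    events_by_user = events.repartition("user_id").cache()

    daily_counts = events_by_user.groupBy("user_id", "event_date").count()
    daily_counts.explain()  # review the physical plan before scheduling the job
    daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")

In practice the shuffle-partition setting and the repartition key would come from monitoring data rather than fixed constants.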

Posted 3 months ago

Apply

2 - 4 years

2 - 6 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Role Description: Backend Developer

Position Overview: We are seeking a highly skilled Backend Developer to join the Cloud Data Hub (CDH) Team, working closely with teams in Munich and BMW TechWorks India. The ideal candidate is a backend development expert with proficiency in Python, AWS, Kafka, Terraform, and Git, and a passion for building scalable and efficient systems. This role will involve designing, developing, and maintaining backend solutions for the CDH platform, contributing to BMW's transformation into a fully data-driven organization.

About the project: The Cloud Data Hub (CDH) is a cloud-based, centralized data lake developed by BMW, serving as the organization's central data landing zone. Designed to democratize data usage across all departments, the CDH consolidates data into a single source of truth, enabling providing and consuming entities to leverage data effectively and efficiently. It plays a pivotal role in BMW's transformation into a truly data-driven organization, supporting data acquisition, integration, processing, and analysis across its value chain.

Key Responsibilities: Design, develop, and maintain backend systems for the CDH platform, ensuring robust, scalable, and efficient solutions. Build and enhance serverless architectures and REST APIs using Python and AWS services. Implement and manage Kafka data streaming pipelines for real-time data processing and metadata orchestration. Develop and deploy infrastructure using Terraform for infrastructure-as-code automation on AWS. Utilize Git for version control and collaborate with the team on code reviews and CI/CD pipelines. Apply Test-Driven Development (TDD) principles to ensure code reliability, maintainability, and high-quality deliverables. Ensure the backend systems comply with BMW's security standards, performance metrics, and scalability requirements. Proactively identify, debug, and resolve performance bottlenecks and system issues. Contribute to technical documentation and knowledge sharing to ensure project continuity and team alignment.

Qualifications: Expert-level proficiency in backend development with Python. Strong experience with AWS cloud services, including Lambda, S3, DynamoDB, API Gateway, and other serverless offerings. Hands-on expertise with Kafka for building and managing data streaming solutions. Advanced skills in Terraform for infrastructure automation and management. Proficient in writing optimized SQL queries and working with relational databases. In-depth knowledge of Git for version control and experience with CI/CD pipelines. Experience in building distributed systems and handling large-scale, real-time data processing workloads. Strong understanding of system design, scalability, and security best practices. Excellent debugging and problem-solving skills, with a detail-oriented mindset. Good communication and interpersonal skills to collaborate effectively with cross-functional teams.

Preferred Skills: Experience working with Docker and containerized environments. Familiarity with agile frameworks and participation in Scrum ceremonies. Knowledge of monitoring and observability tools like CloudWatch, Prometheus, or Grafana. Certification in AWS Solutions Architecture or related AWS certifications.

Why Join Us: This role provides an exciting opportunity to work on cutting-edge cloud-native technologies, contributing directly to BMW's Cloud Data Hub, a cornerstone of its data-driven transformation.
As a Backend Developer, you will collaborate with talented teams across geographies to build solutions that drive real-world impact at a global scale.
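
To make the serverless responsibilities concrete, here is a hedged sketch of an AWS Lambda handler that reacts to an S3 upload and records a simple catalog entry in DynamoDB. The bucket, table, and key names are illustrative assumptions, not details of the actual CDH platform.

    # Sketch of a serverless ingestion handler: a Lambda triggered by an S3
    # event that registers the new object in a DynamoDB metadata table.
    # Table and key names are hypothetical placeholders.
    import json
    import urllib.parse

    import boto3

    dynamodb = boto3.resource("dynamodb")
    catalog_table = dynamodb.Table("cdh-dataset-catalog")  # placeholder table name

    def handler(event, context):
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            size = record["s3"]["object"].get("size", 0)
            # Register the new object as a simple metadata catalog entry.
            catalog_table.put_item(
                Item={"dataset_key": f"{bucket}/{key}", "size_bytes": size}
            )
        return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}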

Posted 3 months ago

Apply

1 - 4 years

2 - 6 Lacs

Chennai, Bengaluru

Work from Office

Naukri logo

Role Description: Backend Developer

Position Overview: We are seeking a highly skilled Backend Developer to join the Cloud Data Hub (CDH) Team, working closely with teams in Munich and BMW TechWorks India. The ideal candidate is a backend development expert with proficiency in Python, AWS, Kafka, Terraform, and Git, and a passion for building scalable and efficient systems. This role will involve designing, developing, and maintaining backend solutions for the CDH platform, contributing to BMW's transformation into a fully data-driven organization.

About the project: The Cloud Data Hub (CDH) is a cloud-based, centralized data lake developed by BMW, serving as the organization's central data landing zone. Designed to democratize data usage across all departments, the CDH consolidates data into a single source of truth, enabling providing and consuming entities to leverage data effectively and efficiently. It plays a pivotal role in BMW's transformation into a truly data-driven organization, supporting data acquisition, integration, processing, and analysis across its value chain.

Key Responsibilities: Design, develop, and maintain backend systems for the CDH platform, ensuring robust, scalable, and efficient solutions. Build and enhance serverless architectures and REST APIs using Python and AWS services. Implement and manage Kafka data streaming pipelines for real-time data processing and metadata orchestration. Develop and deploy infrastructure using Terraform for infrastructure-as-code automation on AWS. Utilize Git for version control and collaborate with the team on code reviews and CI/CD pipelines. Apply Test-Driven Development (TDD) principles to ensure code reliability, maintainability, and high-quality deliverables. Ensure the backend systems comply with BMW's security standards, performance metrics, and scalability requirements. Proactively identify, debug, and resolve performance bottlenecks and system issues. Contribute to technical documentation and knowledge sharing to ensure project continuity and team alignment.

Qualifications: Expert-level proficiency in backend development with Python. Strong experience with AWS cloud services, including Lambda, S3, DynamoDB, API Gateway, and other serverless offerings. Hands-on expertise with Kafka for building and managing data streaming solutions. Advanced skills in Terraform for infrastructure automation and management. Proficient in writing optimized SQL queries and working with relational databases. In-depth knowledge of Git for version control and experience with CI/CD pipelines. Experience in building distributed systems and handling large-scale, real-time data processing workloads. Strong understanding of system design, scalability, and security best practices. Excellent debugging and problem-solving skills, with a detail-oriented mindset. Good communication and interpersonal skills to collaborate effectively with cross-functional teams.

Preferred Skills: Experience working with Docker and containerized environments. Familiarity with agile frameworks and participation in Scrum ceremonies. Knowledge of monitoring and observability tools like CloudWatch, Prometheus, or Grafana. Certification in AWS Solutions Architecture or related AWS certifications.

Why Join Us: This role provides an exciting opportunity to work on cutting-edge cloud-native technologies, contributing directly to BMW's Cloud Data Hub, a cornerstone of its data-driven transformation. As a Backend Developer, you will collaborate with talented teams across geographies to build solutions that drive real-world impact at a global scale.
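
This listing repeats the Kafka streaming requirement, so rather than duplicating the Lambda sketch above, here is a complementary sketch of a producer publishing metadata events using the kafka-python package. The broker address, topic name, and event payload are placeholders.

    # Minimal sketch of a Kafka producer for metadata events (kafka-python).
    # Broker address, topic name, and payload are illustrative assumptions.
    import json

    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # placeholder broker
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    event = {"dataset": "vehicle-telemetry", "action": "partition_added", "rows": 12345}
    producer.send("cdh-metadata-events", value=event)  # placeholder topic
    producer.flush()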

Posted 3 months ago

Apply

5 - 8 years

4 - 7 Lacs

Pune

Work from Office

Naukri logo

Role: IDAM AD and Birthright Provisioning Engineer
Location: EON Pune

Who are we looking for? The cyber security delivery team that owns the managed security services for this client has an opening for an IDAM AD Provisioning Engineer specialized in Radiant Logic and LDAP.

Technical Skills: Radiant Logic and LDAP user provisioning tool experience is a must. Basic configuration in web services for target file provisioning. Knowledge and hands-on experience with the following IAM components: Lifecycle Manager, Compliance Manager, Application On-Boarding, Access Request, Automated Provisioning, Password Management, Workflows. Prior experience with user/application on-boarding and provisioning/de-provisioning. User provisioning on any IAM tool or even AD. Operational experience with Active Directory and authentication processes. Knowledge of audit support. Knowledge of UNIX. Familiarity with web services. Familiarity with enterprise directories (LDAP, Active Directory and SQL).

Process Skills (Directory Services): Manage and support different Directory Service integrations with VDS. Troubleshooting of synchronization issues between the directory and VDS. Implementations, upgrades, enhancements and conversions involving VDS. Support for extending data attributes based on requirements. Troubleshooting multiple identity correlation and directory storage. Directory health checks and monitoring. Report generation for access and audit. Managing roles, metadata and mapping of LDAP objects and attributes. Modifying existing RBAC roles. ARP updates (adding entitlements, role and policy creation, workflow creation, workflow testing).

Birthright Tool: ISIM
1. Troubleshooting onboarding issues
2. Operational support for manual onboarding ad hoc requests
3. Handling deprovisioning requests
4. ISIM rules and policies
5. Data feed file consistency checks
6. Defining and modifying workflows, rules, email templates, and policies as per the requirement
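
As a rough illustration of the provisioning work described, the sketch below uses the ldap3 package to add a user entry during a joiner (birthright) workflow. The host, bind DN, and attribute values are placeholders; an actual Radiant Logic VDS deployment would expose its own virtual directory endpoint and schema.

    # Generic LDAP user-provisioning sketch with the ldap3 package.
    # Host, bind credentials, DNs, and attributes are hypothetical placeholders.
    from ldap3 import ALL, Connection, Server

    server = Server("ldaps://directory.example.com", get_info=ALL)
    conn = Connection(
        server,
        user="cn=svc-provisioning,ou=service,dc=example,dc=com",
        password="change-me",  # placeholder credential
        auto_bind=True,
    )

    # Create a new user entry as part of a joiner (birthright) workflow.
    conn.add(
        "uid=jdoe,ou=people,dc=example,dc=com",
        ["inetOrgPerson"],
        {"cn": "Jane Doe", "sn": "Doe", "uid": "jdoe", "mail": "jane.doe@example.com"},
    )
    print(conn.result)  # inspect the directory's response for audit logging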

Posted 3 months ago

Apply

4 - 9 years

6 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

Job Summary: We are looking for an experienced Informatica Engineer to join our team. In this role, you will be responsible for managing and optimizing the Intelligent Data Management Cloud (IDMC) environment. You will work with multiple modules within IDMC, including Cloud Data Governance & Catalog (CDGC), Cloud Data Marketplace (CDMP), and Metadata Command Center. The ideal candidate will have hands-on experience in creating and configuring scan resources, metadata extraction, and metadata profiling, along with a strong understanding of business term association, attribute management, data domains, and rules. If you are passionate about data governance and cloud technologies, we encourage you to apply!

Key Responsibilities: IDMC Modules Management: Oversee and manage the CDGC, CDMP, and Metadata Command Center modules within the Informatica Intelligent Data Management Cloud (IDMC). Scan Resources Configuration: Create and configure scan resources of various types, including metadata extraction, metadata profiling, business term association, attribute management, and data domains. Data Governance: Define and manage data rules, ensuring effective data governance, classification, and cataloguing of data assets within the cloud environment. Collaboration and Support: Work closely with stakeholders and teams to ensure effective usage of the IDMC platform, assisting in the setup, configuration, and troubleshooting of any platform-related issues. System Optimization: Continuously monitor and improve the configuration of IDMC modules to ensure optimal performance and data governance capabilities.

Preferred Skills and Qualifications: Hands-on experience with the following IDMC modules (minimum 1-2 years): CDGC (Informatica Cloud Data Governance & Catalog), CDMP (Cloud Data Marketplace), Metadata Command Center. Ability to create and configure scan resources of various types. Perform metadata extraction and metadata profiling. Associate business terms, manage attributes, and configure data domains and rules within the platform. Strong problem-solving and analytical skills. Excellent communication skills, both written and verbal, with the ability to convey technical concepts to non-technical stakeholders. Ability to manage multiple tasks and priorities in a dynamic and fast-paced environment.

Desired Skills: Prior experience with Informatica Enterprise Data Catalog (EDC) and/or Axon tools. Strong understanding of Informatica software security models and overall IDMC architecture.

Roles and Responsibilities: Informatica Engineer

Posted 3 months ago

Apply

4 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

As an Oracle EPM consultant, you are responsible for client interaction for requirement gathering, driving workshops, conducting training in various EPM modules, configuring profiles, and reconciling financial data. You will handle application activities including metadata management, security/user role provisioning, and building reports in multiple EPM reporting tools, coordinate with onsite/offshore technical teams, and prepare project documentation including requirements, design, training, and UAT test scripts. You must work across the Oracle EPM Account Reconciliation module. You should have good experience in the adoption of standard out-of-the-box Oracle modules and functionalities, and be able to work as an independent team member with minimal supervision, capable of applying judgment to plan and execute your tasks.

Skill Requirement:

Core Application Skills: Strong techno-functional knowledge of planning tools including ARCS. Strong accounting and financial reconciliation knowledge to identify gaps and provide recommendations to users. In-depth knowledge of the data loading process. Ability to build complex reports.

Soft Skills: Strong understanding of basic accounting and the financial consolidation process. Strong interpersonal skills, including oral and written communication with internal and external global stakeholders. A team player who is accountable and committed to flawless delivery, and a collaborative, flexible person who can adapt to dynamic business environments. Strong presentation skills to support business meetings and internal stakeholder discussions. Excellent problem-solving skills and the ability to provide recommendations. Open to learning, to deepen and widen knowledge in Oracle EPM and other skills as required from time to time. Advanced knowledge of Microsoft Office tools including Excel, Word and PowerPoint, and able to leverage the Smart View tool to build reports and ad hoc analysis. An advanced Microsoft Office user who can develop complex reports and use the tools in the appropriate project phase; knowledge of advanced macros is an added advantage.

Preferred Skills: Oracle certification in EPBCS, FCCS and ARCS. Certification in financial standards and project management.

Keyword search: FCCS, ARCS, Oracle Data Management, Financial Data Management Enterprise Edition, FDMEE.

Posted 3 months ago

Apply

Exploring Metadata Jobs in India

Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bengaluru
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi/NCR

These cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.

Average Salary Range

The average salary range for metadata professionals in India varies based on experience level:

  • Entry-level: ₹3-6 lakhs per annum
  • Mid-level: ₹6-12 lakhs per annum
  • Experienced: ₹12-20 lakhs per annum

Salaries may vary based on the company, location, and specific job responsibilities.

Career Path

In the metadata field, a career typically progresses as follows:

  1. Metadata Analyst
  2. Metadata Specialist
  3. Metadata Manager
  4. Metadata Architect

As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.

Related Skills

In addition to metadata management, professionals in this field are often expected to have skills in:

  • Data analysis
  • Database management
  • Data modeling
  • Information governance

Having a combination of these skills can make job seekers more attractive to potential employers.

Interview Questions

  • What is metadata? (basic)
  • How do you ensure data quality in metadata management? (medium)
  • Can you explain the difference between structured and unstructured metadata? (medium)
  • What tools or software have you used for metadata management? (basic)
  • Describe a challenging metadata project you worked on and how you overcame obstacles. (advanced)
  • How do you stay updated with the latest trends in metadata management? (basic)
  • Explain the importance of metadata in data governance. (medium)
  • Have you ever had to resolve conflicts between different metadata standards? How did you handle it? (advanced)
  • What is the role of metadata in data integration? (medium)
  • How do you ensure metadata security and compliance with regulations? (medium)
  • What are the benefits of using metadata in data analytics? (basic)
  • Can you discuss a successful metadata strategy you implemented in a previous role? (advanced)
  • Explain the concept of metadata harvesting. (medium)
  • How do you handle metadata versioning and updates? (medium)
  • Have you worked with ontologies and taxonomies in metadata management? (advanced)
  • How do you collaborate with other teams, such as data scientists or developers, in metadata projects? (medium)
  • What are the common challenges faced in metadata management, and how do you address them? (advanced)
  • How do you measure the effectiveness of metadata initiatives in an organization? (medium)
  • Can you give an example of how metadata enhances data search and retrieval processes? (medium)
  • What role does metadata play in data lineage and traceability? (medium)
  • Explain the difference between technical metadata and business metadata. (basic) (a technical-metadata sketch follows this list)
  • How do you handle metadata migration when transitioning to a new system or platform? (advanced)
  • Describe a time when you had to prioritize metadata tasks based on business needs. (medium)
  • What are the best practices for documenting metadata to ensure consistency and accuracy? (medium)
  • How do you handle metadata conflicts or inconsistencies in a large dataset? (advanced)
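
For the question on technical versus business metadata, a small illustration helps. The sketch below harvests technical metadata (tables, columns, types) from a relational database using SQLAlchemy's inspector; the connection string is a placeholder, and business metadata such as ownership and definitions would be maintained separately in a catalog tool.

    # Illustrative sketch of harvesting technical metadata with SQLAlchemy.
    # The SQLite URL is a placeholder; any supported database URL would work.
    from sqlalchemy import create_engine, inspect

    engine = create_engine("sqlite:///example.db")  # placeholder connection string
    inspector = inspect(engine)

    catalog = {}
    for table in inspector.get_table_names():
        catalog[table] = [
            {"name": col["name"], "type": str(col["type"]), "nullable": col["nullable"]}
            for col in inspector.get_columns(table)
        ]

    for table, columns in catalog.items():
        print(table, columns)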

Conclusion

As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
