
758 Metadata Jobs - Page 28

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 8 years

11 - 16 Lacs

Bengaluru

Work from Office


About us: Gracenote, the content data division of Nielsen, powers innovative entertainment experiences for leading media companies worldwide. Our advanced entertainment metadata and connected IDs enable seamless content navigation and discovery, helping consumers connect to the content they love and explore new favorites. With a global presence, Gracenote delivers global and localized video, audio, automotive and sports content solutions across the Americas, Asia-Pacific, Europe, the Middle East, and Africa. Some of the biggest names that trust Gracenote are Jio, Amazon, NBC, BMW, Mercedes-Benz, Samsung, and Sony.

We are seeking a highly skilled and experienced Engineering Manager to join our innovative team. The ideal candidate will bring deep expertise in designing and implementing scalable, high-performance systems, particularly in the domains of media data processing, web crawling, and metadata curation and serving. This role involves technical leadership, people management, engineering and operational excellence, and collaboration across teams to deliver impactful solutions to media companies and other stakeholders.

What is in it for you? We are setting out to modernise our metadata processing pipelines through platformization and to build intelligent solutions using data science for Video, Audio, Automotive and Sports. Be part of high-impact, high-visibility projects that will be key revenue drivers for Gracenote. Contribute to next-gen metadata platforms that combine technology, data science and innovation.

Key Responsibilities:
- Build high-performing agile teams; set the direction for the team and support people in their career growth.
- Own talent segmentation and succession planning.
- Provide technical leadership to identify and define the right component/system abstractions and capabilities for current and future product/platform tasks.
- Drive strategy and vision in collaboration with Product.
- Build scalable and reliable technology solutions at a rapid pace.
- Understand business goals, contribute to product strategy, and take accountability for moving key business metrics.
- Drive execution using Agile methodologies, removing impediments along the way with the big picture in mind.

Required Skills and Expertise:
- Strong expertise in one or more programming languages such as Java, Python, or Golang.
- Experience with databases like Postgres, MySQL, MongoDB, Cassandra, and caching layers such as Redis or Memcached.
- Expertise in designing and implementing diverse data delivery solutions, including real-time streaming architectures, efficient batch processing systems, and robust API delivery mechanisms.
- Proficiency in building RESTful APIs and working with GraphQL to serve metadata efficiently.
- Knowledge of cloud ecosystems (AWS, Azure, GCP) and cloud-native architectures for scalable deployments.
- Exposure to NFR concepts such as availability, recoverability, performance and scalability, and to different design and architecture patterns.
- Demonstrated ability to design distributed systems with high availability and fault tolerance.
- Understanding of software development methodologies (Agile, Scrum, etc.).
- Web crawling: experience with frameworks such as Apache Nutch, Scrapy, or custom crawlers using Java/JavaScript (see the crawler sketch below).
- Experience with big data frameworks like Apache Spark, Hadoop, or Flink for processing and transforming large-scale datasets.
- Expertise in designing systems for curated metadata storage, search, and delivery using relational and NoSQL databases.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of software development experience, including 2+ years in a people management role.
- Strong understanding of algorithms, data structures, and system design principles.
- Experience in delivering end-to-end solutions for web crawling, data processing, and metadata curation.
- Excellent problem-solving skills and the ability to work effectively in a fast-paced environment.
- Strong communication skills with the ability to convey complex technical concepts to diverse audiences.

Preferred Qualifications:
- Experience using GenAI tools for code creation and for video and voice data processing.
- Experience with Natural Language Processing (NLP) and Machine Learning (ML) techniques to analyze and extract insights from large-scale datasets.
- Understanding of security best practices for data protection and system integrity.
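For readers unfamiliar with the web-crawling side of this role, here is a minimal, hedged sketch of a Scrapy spider of the kind the listing alludes to. The target site, CSS selectors, and item fields are illustrative assumptions, not Gracenote's actual sources or schema.

```python
# Hedged sketch: a tiny metadata-harvesting spider (all URLs and selectors are placeholders).
import scrapy


class TitleMetadataSpider(scrapy.Spider):
    name = "title_metadata"
    start_urls = ["https://example.com/titles"]  # placeholder catalogue page

    def parse(self, response):
        # Extract one metadata record per listed title.
        for card in response.css("div.title-card"):
            yield {
                "title": card.css("h2::text").get(),
                "release_year": card.css("span.year::text").get(),
                "genres": card.css("span.genre::text").getall(),
            }

        # Follow pagination so the crawl covers the whole catalogue.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Such a spider could be run with `scrapy runspider` and its items fed into a curation pipeline; a production crawler would add politeness settings, deduplication, and schema validation.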

Posted 3 months ago

Apply

5 - 7 years

7 - 10 Lacs

Bengaluru

Work from Office


- Maintain and develop data APIs with a focus on efficiency and scalability, implementing best practices, and create automated tests for API performance and result validation, using Python-based API technologies such as Flask/FastAPI with Snowflake as the data source.

Required Candidate profile:
- Data API development experience with a focus on efficiency and scalability, using Python-based frameworks like Flask/FastAPI.
- Experience with Snowflake and cost optimization techniques.
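As a minimal sketch of the stack this listing describes, the snippet below exposes a FastAPI endpoint backed by Snowflake. The database, table, column names, and environment variables are hypothetical, and production concerns (authentication, connection pooling, pagination, error handling) are omitted.

```python
# Minimal sketch, assuming hypothetical table and credential names.
import os

import snowflake.connector
from fastapi import FastAPI

app = FastAPI()


def get_connection():
    # Credentials are read from environment variables in this sketch.
    return snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "COMPUTE_WH"),
    )


@app.get("/products/{product_id}")
def get_product(product_id: int):
    # Parameter binding avoids SQL injection and keeps the query plan reusable.
    query = "SELECT id, name, price FROM demo_db.public.products WHERE id = %s"
    conn = get_connection()
    try:
        cur = conn.cursor()
        cur.execute(query, (product_id,))
        row = cur.fetchone()
    finally:
        conn.close()
    if row is None:
        return {"error": "not found"}
    return {"id": row[0], "name": row[1], "price": float(row[2])}
```

An automated test of the kind the listing mentions would typically call this endpoint via FastAPI's test client and assert on status code, payload shape, and response time.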

Posted 3 months ago

Apply

5 - 7 years

7 - 10 Lacs

Bengaluru

Hybrid


- Maintain and develop the data pipelines required for the extraction, transformation, cleaning, pre-processing, aggregation and loading of data from a wide variety of data sources, using Python, SQL, DBT and other data technologies.

Required Candidate profile:
- Working experience with Snowflake; hands-on experience with Snowflake utilities, Streamlit and SnowSQL; must have worked on Snowflake cost optimization scenarios.
- Overall solid programming skills.
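A hedged sketch of one extract-transform-load step in such a pipeline follows. The source file, column names, target table, and credentials are illustrative assumptions; in a DBT-centred setup, the SQL transformations would normally live in DBT models downstream of a raw load like this one.

```python
# Hedged ETL sketch: land a cleaned CSV into a Snowflake staging table.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def load_orders(csv_path: str) -> None:
    # Extract: read the raw file.
    df = pd.read_csv(csv_path)

    # Transform: basic cleaning / pre-processing before landing the data.
    df = df.dropna(subset=["order_id"])
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df.columns = [c.upper() for c in df.columns]  # Snowflake identifiers default to upper case.

    # Load: append into a raw/staging table in Snowflake.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="RAW",
        schema="SALES",
    )
    try:
        # auto_create_table requires a reasonably recent connector version.
        write_pandas(conn, df, table_name="ORDERS_STAGE", auto_create_table=True)
    finally:
        conn.close()


if __name__ == "__main__":
    load_orders("orders.csv")
```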

Posted 3 months ago

Apply

8 - 13 years

30 - 35 Lacs

Chennai, Pune, Delhi

Work from Office


Description: The Senior Data Engineer is responsible for leading the design, development, and optimization of data infrastructure and systems to meet complex business needs. They play a strategic role in developing and managing large-scale data pipelines, ensuring scalability, reliability, and performance. As a key technical expert, the Senior Data Engineer collaborates with cross-functional teams to deliver innovative data solutions and ensures adherence to best practices in data management and governance.

Responsibilities:
- Leads the design and development of advanced data pipelines, ensuring seamless ETL/ELT of data into enterprise storage systems.
- Oversees data integration from diverse sources, maintaining data consistency, integrity, and compliance with enterprise standards.
- Transforms raw data into usable formats through cleansing, aggregation, filtering, and enrichment techniques.
- Optimizes data pipelines and workflows for scalability, performance, and cost efficiency.
- Establishes robust data validation, quality checks, and error-handling mechanisms to ensure enterprise data integrity (see the sketch below).
- Mentors junior data engineers, promoting technical growth and team development.
- Implements and enforces best practices in data governance, including metadata management and lineage tracking.
- Drives the adoption of real-time data streaming and processing workflows to enable advanced analytics and decision-making.
- Partners with stakeholders to identify and address complex data challenges, delivering scalable and innovative solutions.
- Stays updated with emerging data technologies, evaluating their potential to enhance enterprise capabilities.
- Develops and enforces data security protocols to protect sensitive information.

Job Requirements:
Education: A bachelor's degree in computer science, data science, software engineering, information systems, or a related quantitative field; master's degree preferred.
Experience: 8+ years of data engineering experience, including expertise in data integration, pipeline optimization, and enterprise-scale data solutions. Proven leadership in implementing big data solutions (e.g., Snowflake, Databricks) and distributed data systems (e.g., Apache Spark, Flink).
Skills:
- Advanced proficiency in Apache technologies (Kafka, Airflow, Spark) and programming languages (Python, Java, Scala).
- Expertise in data query tools (SQL, Hive) and database technologies (NoSQL, Hadoop, Teradata).
- Strong knowledge of cloud platforms (AWS, Azure, GCP) and modern data architectures.
- Experience with real-time streaming data workflows and AI/ML analytics initiatives.
- Exceptional analytical and problem-solving skills with debugging expertise in complex systems.
- Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
- Strong leadership abilities to mentor teams and influence cross-functional initiatives.

We welcome talent at all career stages and are dedicated to understanding and supporting additional needs. We're proud to be an equal opportunity employer, committed to creating an inclusive and open environment for everyone.
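As a hedged illustration of the "data validation, quality checks, and error-handling" responsibility, the sketch below checks one batch before it is loaded. The column names, thresholds, and fail-fast policy are assumptions for the example only, not this employer's actual rules.

```python
# Hedged sketch of a pipeline data-validation step on a pandas batch.
import pandas as pd


class DataQualityError(Exception):
    """Raised when a batch fails its quality checks."""


def validate_orders(df: pd.DataFrame) -> pd.DataFrame:
    errors = []

    # Completeness: key columns must not contain nulls.
    for col in ("order_id", "customer_id", "amount"):
        nulls = int(df[col].isna().sum())
        if nulls:
            errors.append(f"{nulls} null values in required column '{col}'")

    # Uniqueness: the business key must not repeat within the batch.
    dupes = int(df["order_id"].duplicated().sum())
    if dupes:
        errors.append(f"{dupes} duplicate order_id values")

    # Range check: amounts should be positive.
    negative = int((df["amount"] <= 0).sum())
    if negative:
        errors.append(f"{negative} non-positive amounts")

    if errors:
        # Fail fast so bad data never reaches downstream consumers.
        raise DataQualityError("; ".join(errors))
    return df
```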

Posted 3 months ago

Apply

5 - 7 years

5 - 9 Lacs

Chennai, Pune, Delhi

Work from Office


The Opportunity at Komodo Health: Help deliver high-quality customer-facing Life Science products that exceed our customers' expectations. You will own key parts of the quality planning, strategy, and product development workflow for our Mavens product team. You will help establish new web and API testing standards, develop automation tooling, and build upon existing product quality processes and practices at Mavens Komodo Health. Delivering quality products for our customers is essential to business growth and company success at Komodo Health. You will be highly collaborative, achieving product quality in partnership with Product Managers, Program Managers, Designers, Engineers, and even clinical experts as part of Mavens Komodo Health's talented Engineering team.

Looking back on your first 12 months at Komodo Health, you will have:
- Become an expert with the Mavens product portfolio
- Collaborated with Product Managers, Product Designers, Program Managers, Data Scientists, and Data Engineers on defining and building new product features
- Developed and enhanced our product quality testing framework and increased overall test coverage
- Implemented process improvements to enhance overall product quality
- Analyzed, debugged, and documented problems and observations, and worked with the Engineering team to identify root causes and resolutions
- Adhered to and promoted product quality standards

You will accomplish these outcomes through the following responsibilities:
- Create and execute test cases and test scenarios to validate Salesforce application functionality and usability using manual and automated test cases (a hedged automated-check example follows below)
- Conduct functional, regression, integration, and system testing
- Perform exploratory testing to uncover hidden defects and user experience issues
- Develop and maintain up-to-date test documentation (creation, control, updating, and prioritization)
- Collaborate with the team on refining high-quality user stories
- Work closely with developers and product owners to investigate, troubleshoot, and resolve defects in a timely manner

What you bring to Komodo Health:
- 5+ years of hands-on testing (functional, behavioral, and E2E testing) experience
- 2+ years of test automation experience on the Salesforce Platform using the Robot Framework and CumulusCI
- 4+ years of test experience on data-centric applications and systems; experience creating test data for healthcare datasets such as provider, beneficiary/patient, etc.
- Experience with Agile methodologies
- Experience with Atlassian Jira/Confluence
- Strong analytical skills and innovative thinking
- Experience with TestRail as a test management tool
- Excellent command of written and spoken English
- Experience with version control systems
- Good understanding of Salesforce metadata
- Excellent problem-solving skills and attention to detail
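The listing's Salesforce automation is typically done in Robot Framework; purely as a Python illustration of automated API result validation, here is a hedged pytest sketch. The base URL, endpoint, parameters, and expected fields are hypothetical placeholders, not Komodo Health's actual API.

```python
# Hedged sketch of an automated API contract/result-validation test.
import requests

BASE_URL = "https://api.example.com"  # assumption: test-environment base URL


def test_patient_search_returns_expected_fields():
    resp = requests.get(
        f"{BASE_URL}/v1/patients",
        params={"state": "CA", "limit": 5},
        timeout=10,
    )

    # Contract checks: status, a rough latency budget, and response shape.
    assert resp.status_code == 200
    assert resp.elapsed.total_seconds() < 2.0, "response slower than the 2s budget"

    body = resp.json()
    assert isinstance(body.get("results"), list)
    for record in body["results"]:
        # Every record should expose the agreed-upon fields.
        assert {"patient_id", "provider_id", "state"} <= record.keys()
        assert record["state"] == "CA"
```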

Posted 3 months ago

Apply

4 - 7 years

19 - 21 Lacs

Bengaluru

Work from Office


At PwC, our people in cybersecurity focus on protecting organisations from cyber threats through advanced technologies and strategies. They work to identify vulnerabilities, develop secure systems, and provide proactive solutions to safeguard sensitive data. As a cybersecurity generalist at PwC, you will focus on providing comprehensive security solutions and experience across various domains, maintaining the protection of client systems and data. You will apply a broad understanding of cybersecurity principles and practices to address diverse security challenges effectively.

Responsibilities: Client-facing role to onboard a Data Lake to the DevSecOps platform and to create, formalize and stabilize a process for the Data Lake team to use the DevSecOps process effectively.

Professional Requirements:
- Design and manage ETL workflows using AWS Glue; develop and maintain data catalogs and metadata management in AWS Glue.
- Write and optimize Python and PySpark scripts for data transformations; integrate AWS Glue with other AWS services.
- Monitor and troubleshoot ETL jobs to ensure data quality and reliability.
- Design and manage data warehousing solutions using Snowflake; write and optimize SQL queries for data extraction and transformation in Snowflake.
- Build and manage data pipelines in a cloud environment; optimize query performance and manage Snowflake resources efficiently.
- Develop custom Snaps using the SnapLogic SDK; integrate on-premises and cloud applications using SnapLogic; implement real-time data integration and replication solutions; manage and monitor SnapLogic instances.
- Design and manage data pipelines using Apache Airflow; write and maintain Airflow DAGs (Directed Acyclic Graphs) in Python; use the Airflow UI and CLI for task scheduling and monitoring; debug and troubleshoot issues in data pipelines (a minimal DAG sketch follows below).
- Implement and manage CI/CD pipelines using GitHub Actions, Jenkins and JIRA or FreshService.
- Integrate security tools like Fortify and Trivy into the CI/CD pipelines to ensure continuous security testing.
- Collaborate with development and operations teams to incorporate security best practices into the software development lifecycle.
- Use Jira for project management and for tracking security-related tasks and issues.
- Perform code reviews and security assessments to identify and mitigate vulnerabilities.
- Automate security testing and monitoring processes to ensure compliance with security standards.
- Monitor and respond to security incidents and vulnerabilities in a timely manner.
- Develop and maintain documentation for security processes and procedures.
- Stay up to date with the latest security trends, tools, and technologies.
- Manage and secure cloud environments on AWS and Azure; implement cloud security best practices and ensure compliance with industry standards.
- Utilize Docker for containerization and manage containerized applications.
- Write and maintain Bash and shell scripts for automation tasks; apply basic Linux administration skills to manage and secure Linux-based systems.

Mandatory Skill Sets: DevSecOps
Preferred Skill Sets: DevSecOps
Years of experience required: 4
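The DAG sketch referenced above is shown here. The DAG id, schedule, and task bodies are illustrative assumptions; in practice each task would wrap the Glue, Snowflake, or SnapLogic steps the listing mentions, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# Hedged sketch of a small Apache Airflow DAG with three ordered tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw files from the source system")


def transform():
    print("clean and reshape the extracted data")


def load():
    print("load curated data into the warehouse")


with DAG(
    dag_id="example_metadata_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the steps strictly in order.
    t_extract >> t_transform >> t_load
```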

Posted 3 months ago

Apply

8 - 13 years

40 - 80 Lacs

Bengaluru

Work from Office


- Translate business requirements into queries and reporting
- Work with on-premises and cloud-hosted DBs
- Write automation scripts to extract and curate data
- Prepare data and perform data modeling
- Database systems, metadata management, automation, and machine learning
- Experience in writing efficient queries and performance optimization
- Familiarity with machine learning algorithms and solving business problems
- Perform automated and manual testing on deliverables
- Proficient in Python, Scala, Bash, and SQL
- Experience with database systems and metadata management
- Strong analytical and problem-solving skills
- Knowledge of performance optimization techniques
- Familiarity with machine learning algorithms
- Experience working on non-trivial business problems using ML
- Excellent communication and teamwork abilities
- Self-sufficient and able to lead when given the opportunity

Technical Skills: Big Data, Core Java, PySpark, Python, Scala, Docker, Jenkins, Kubernetes, shell scripting

Posted 3 months ago

Apply

8 - 13 years

40 - 45 Lacs

Chennai, Pune, Delhi

Work from Office


Description: The Senior Data Engineer is responsible for leading the design, development, and optimization of data infrastructure and systems to meet complex business needs. They play a strategic role in developing and managing large-scale data pipelines, ensuring scalability, reliability, and performance. As a key technical expert, the Senior Data Engineer collaborates with cross-functional teams to deliver innovative data solutions and ensures adherence to best practices in data management and governance.

Responsibilities:
- Leads the design and development of advanced data pipelines, ensuring seamless ETL/ELT of data into enterprise storage systems.
- Oversees data integration from diverse sources, maintaining data consistency, integrity, and compliance with enterprise standards.
- Transforms raw data into usable formats through cleansing, aggregation, filtering, and enrichment techniques.
- Optimizes data pipelines and workflows for scalability, performance, and cost efficiency.
- Establishes robust data validation, quality checks, and error-handling mechanisms to ensure enterprise data integrity.
- Mentors junior data engineers, promoting technical growth and team development.
- Implements and enforces best practices in data governance, including metadata management and lineage tracking.
- Drives the adoption of real-time data streaming and processing workflows to enable advanced analytics and decision-making.
- Partners with stakeholders to identify and address complex data challenges, delivering scalable and innovative solutions.
- Stays updated with emerging data technologies, evaluating their potential to enhance enterprise capabilities.
- Develops and enforces data security protocols to protect sensitive information.

Job Requirements:
Education: A bachelor's degree in computer science, data science, software engineering, information systems, or a related quantitative field; master's degree preferred.
Experience: 8+ years of data engineering experience, including expertise in data integration, pipeline optimization, and enterprise-scale data solutions. Proven leadership in implementing big data solutions (e.g., Snowflake, Databricks) and distributed data systems (e.g., Apache Spark, Flink).
Skills:
- Advanced proficiency in Apache technologies (Kafka, Airflow, Spark) and programming languages (Python, Java, Scala).
- Expertise in data query tools (SQL, Hive) and database technologies (NoSQL, Hadoop, Teradata).
- Strong knowledge of cloud platforms (AWS, Azure, GCP) and modern data architectures.
- Experience with real-time streaming data workflows and AI/ML analytics initiatives.
- Exceptional analytical and problem-solving skills with debugging expertise in complex systems.
- Excellent communication skills to effectively collaborate with technical and non-technical stakeholders.
- Strong leadership abilities to mentor teams and influence cross-functional initiatives.

Posted 3 months ago

Apply

3 - 6 years

7 - 11 Lacs

Hyderabad

Work from Office


Job Summary: We are seeking a highly skilled Canvas Application Developer with 7+ years of experience in application development and a strong command of the Power Platform. The ideal candidate will be responsible for designing, developing, and customizing Canvas applications while collaborating with business teams to deliver effective solutions. This role demands strong problem-solving skills, the ability to work independently, and flexibility to adapt to a process-oriented environment.

Must-Have Skillsets (Mandatory):
- Proficiency in Canvas Application development and customization.
- Strong experience with the Power Platform and Dataverse.
- Ability to independently analyze, design, plan, and develop solutions based on business requirements.
- Understanding of system architecture and involvement in technical upgrade activities.
- Effective communication skills (verbal and written) to interact with both technical and non-technical teams.
- Commitment to ownership and teamwork, with the ability to work independently.

Good-to-Have Skillsets (Optional):
- Experience in collaborating with business teams to understand both functional and non-functional requirements.
- Knowledge of process-oriented environments and the ability to adapt to flexible working hours.
- Proven ability to participate effectively across all project phases, from analysis to development.

Qualifications & Experience:
- A minimum of 7 years of experience in Canvas Application development and customization.
- Demonstrated ability to work as an individual contributor and take ownership of tasks.
- Experience in day-to-day application development activities with a focus on delivering quality outcomes.

Posted 3 months ago

Apply

5 - 8 years

2 - 3 Lacs

Hyderabad

Work from Office


Job Summary: The Salesforce Developer will be responsible for designing, developing, and implementing customized solutions on the Salesforce platform to support various business processes. This role involves working closely with project managers, CRM specialists, and other stakeholders to gather requirements, design technical solutions, and lead the development process. The developer will also be responsible for integration with other systems, data migration, release management, and ensuring best practices are followed. This hands-on role requires significant technical expertise in Salesforce.com, related cloud technologies, and experience in leading projects from concept to delivery.

Must-Have Skillsets (Mandatory):
- Salesforce Development: 5+ years of experience with Salesforce.com development, including Apex, Visualforce, Lightning Components, Aura, LWC, and SOQL.
- Salesforce Certifications: Salesforce Platform Developer I & II, Salesforce Administrator, Sales Cloud, and Service Cloud certifications.
- Custom Solutions: Experience in designing and developing custom solutions on the Salesforce platform.
- Integration: Proficiency in integrating Salesforce with other systems using REST/SOAP APIs and middleware tools like MuleSoft.
- Data Migration: Expertise in designing and implementing data migration strategies between legacy systems and Salesforce.
- Release Management: Experience with sandbox management, metadata and data migration, version control, and deployment.
- Salesforce Service Cloud & AppExchange: Proficient in developing and managing Salesforce Service Cloud and leveraging AppExchange products.
- Communication Skills: Excellent verbal and written communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
- Problem-Solving: Strong analytical and problem-solving skills, with a focus on delivering efficient and scalable solutions.

Good-to-Have Skillsets (Optional):
- Web Technologies: Knowledge of web technologies like HTML, CSS, JavaScript, jQuery, AJAX, XML, and JSON.
- Enterprise Integration Tools: Experience with ETL tools and enterprise integration patterns.
- Salesforce Communities: Experience with Salesforce Communities and Partner/Customer Portal implementation.
- Agile Methodologies: Familiarity with Agile development practices and Scrum certification.
- Advanced Salesforce Certifications: Force.com Developer, Force.com Advanced Developer, and additional Salesforce certifications.
- Data Modeling: Understanding of data structures, data modeling, and DB schema.
- Consulting Experience: Experience in performing requirement fit-gap analysis, creating business and functional requirement documents, and contributing to proposal development.
- Leadership and Mentoring: Experience in leading projects, mentoring junior developers, and providing guidance on best practices.
- Salesforce Security Model: Knowledge of the Salesforce security model and sharing rules.
- Salesforce Integration Patterns: Experience in implementing Salesforce Integration Patterns.

Qualifications and Experience:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field. A Master's degree and Salesforce certifications are preferred.
- Experience: 5-8 years of Salesforce development experience, with a strong focus on Apex, Visualforce, Lightning Components, LWC, and integration. Proven experience in Salesforce Service Cloud and customer portal development.
- Certifications: Salesforce Certified Platform Developer, Salesforce Administrator, Sales Cloud, and Service Cloud certifications.
- Location: Open to on-site roles with a budget range of 13k to 18k AED, depending on experience and qualifications.
- Additional Experience: Experience in designing and implementing business technology solutions on the Salesforce platform, with a focus on cloud-based applications.

Posted 3 months ago

Apply

2 - 5 years

5 - 8 Lacs

Bengaluru

Work from Office


How will you make an impact: As a Product Content Specialist on the Digital Content Platforms team, you will play a key role in handling content for new product launches on our websites. Your work will include improving product content and images provided by our internal and external partners and finding opportunities to improve content within our authoring system. We are looking for a dynamic and proactive professional with excellent attention to detail, an understanding of digital product content, experience leading multiple projects simultaneously, a willingness to quickly learn new tools, and strong communication skills.

What will you do:
- Work with internal and external collaborators to collect product specifications and relevant data/metadata and synthesize them into highly structured product descriptions
- Ensure product data is structured appropriately and that formatting is applied to support global websites
- Confirm content is consistent with corporate style requirements and brand guidelines and in compliance with legal requirements (domestic and international)
- Review accuracy of taxonomy categorizations to ensure products contain the applicable specifications (attributes)
- Encourage self-service reporting by collaborating with business groups to identify and prioritize content backfill to improve Search Engine Optimization (SEO)
- Communicate effectively with business partners, IT, and other colleagues for on-time delivery of product content
- Learn and apply process improvement strategies, including PPI

Minimum Requirements/Qualifications:
- Bachelor's degree in Technical Writing or a scientific field
- Basic knowledge of Microsoft Office programs, specifically Excel and Access, to work with and analyze large data sets

Knowledge, Skills, Abilities:
- 2 to 5 years of experience developing and maintaining content within content management systems
- Outstanding attention to detail
- Working knowledge of scientific terms/abbreviations and HTML formatting is a plus
- Strong spelling, grammar, and writing skills
- Self-motivated and able to work both independently and as part of a collaborative team
- Strong project and time management skills, including the ability to balance multiple tasks and time-critical situations
- Outstanding verbal and written communication skills, able to clearly convey project progress and timelines
- Strong problem-solving skills and the flexibility to adapt to changing priorities

Posted 3 months ago

Apply

2 - 6 years

6 - 10 Lacs

Bengaluru

Work from Office


Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive. Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning. Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems. Utilize AWS services including S3, EC2, and EMR to build and manage scalable, secure, and reliable cloud-based solutions. Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting. Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows. Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance. Collaborate closely with DevOps teams using SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle. Communicate effectively with both technical and non-technical stakeholders for handover, incident management reporting, etc. (A minimal Spark SQL sketch follows below.)

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive.
- Extensive experience with AWS services, including S3, EC2, and EMR.
- Strong expertise in data warehousing and SQL, with experience in performance optimization.
- Experience with ETL/ELT implementation (such as Talend).
- Proficiency in Linux, with a strong background in shell scripting.

Preferred technical and professional experience:
- Familiarity with scheduling tools like Airflow or Control-M.
- Experience with metadata-driven frameworks.
- Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence.
- Excellent communication skills and a willing attitude towards learning.
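The Spark SQL sketch referenced above is shown here. The S3 path, schema, table names, and aggregation are illustrative assumptions, not the employer's actual pipeline.

```python
# Hedged PySpark sketch: read raw data, run a Spark SQL aggregate, write a Hive table.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("orders-daily-aggregate")
    .enableHiveSupport()  # lets SQL queries and saveAsTable target Hive metastore tables
    .getOrCreate()
)

# Read raw data (here Parquet on S3) and expose it to Spark SQL as a view.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")
orders.createOrReplaceTempView("orders")

# Spark SQL transformation: daily revenue per product category.
daily_revenue = spark.sql(
    """
    SELECT order_date,
           category,
           SUM(amount) AS revenue,
           COUNT(*)    AS order_count
    FROM orders
    WHERE amount > 0
    GROUP BY order_date, category
    """
)

# Write the result back as a partitioned table for downstream reporting.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.daily_revenue")
)
```

On EMR this script would typically be submitted via spark-submit and scheduled from Airflow or Control-M, in line with the tooling the listing names.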

Posted 3 months ago

Apply

3 - 5 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Automation Engineer
Project Role Description: Apply innovative ideas to drive the automation of Delivery Analytics at the client level.
Must have skills: User Interface Development
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Minimum 15 years of full time education

Key Responsibilities:
A. Should have good knowledge of SAP S/4HANA, UI5 and SAP ABAP.
B. Should be able to perform configuration, design and development of SAP UI5, with experience in S/4.
C. Version control and Git usage.
D. Extending standard Fiori apps.
E. Configure the launchpad with tiles and shell plugin changes.
F. Should be able to resolve support issues related to SAP UI5, with experience in troubleshooting standard tiles, workflow issues and mitigation control issues.
G. Deployment of UI5 apps to S/4 and SCP/BTP.

Technical Experience:
A. Should have experience in Fiori, JS, web technologies, and OData/CDS.
B. Should have good knowledge of ABAP with S/4 environment experience; proficient in debugging SAP programs, good at enhancements and writing reports using SAP Fiori/UI5.
C. Should be familiar with WebIDE and Launchpad configurations.
D. Good understanding and implementation of OData services, metadata and filters.
E. Expert in the UI5 runtime and Fiori templates.
F. SAP UI5 binding, events and lifecycle events.

Professional Attributes:
a. Should have strong interpersonal and communication skills.
b. Should be eager to never stop learning.
c. Should have problem-solving skills.

Posted 3 months ago

Apply

7 - 9 years

5 - 9 Lacs

Noida

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Informatica Product 360 (PIM)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: As per Accenture standards

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Informatica Product 360 (PIM). Your typical day will involve working with the PIM tool, collaborating with cross-functional teams, and delivering impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications using Informatica Product 360 (PIM) to meet business process and application requirements.
- Collaborate with cross-functional teams to identify and prioritize requirements, ensuring that solutions are aligned with business needs.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Provide technical guidance and support to junior team members, ensuring that best practices are followed and that solutions are delivered on time and within budget.
- Stay updated with the latest advancements in Informatica Product 360 (PIM) and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
Must-have skills:
- Experience with Informatica Product 360 (PIM).
- Strong understanding of data modeling, data integration, and data quality concepts.
- Experience with ETL tools such as Informatica PowerCenter.
- Experience with SQL and relational databases such as Oracle, SQL Server, and MySQL.
- Experience with web services and APIs, including SOAP and REST.
- Experience with Agile development methodologies.
Good-to-have skills:
- Experience with cloud-based data integration platforms such as Informatica Cloud.
- Experience with master data management (MDM) solutions.
- Experience with data governance and metadata management tools.
- Experience with data profiling and data quality tools.
- Experience with data visualization tools such as Tableau or Power BI.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Informatica Product 360 (PIM). The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Noida office.

Qualification: As per Accenture standards

Posted 3 months ago

Apply

5 - 9 years

10 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP Master Data Governance (MDG) Tool
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of fulltime education

Summary: As an Application Lead for the SAP Master Data Governance (MDG) tool, you will be responsible for leading the effort to design, build, and configure applications. You will act as the primary point of contact and work closely with cross-functional teams to ensure successful project delivery. Your typical day will involve overseeing the development process, providing technical guidance, and ensuring adherence to project timelines and quality standards.

Roles & Responsibilities:
- Drive SAP MDG implementations end to end, from requirements gathering to solution design and deployment.
- Collaborate with data governance teams to establish data ownership, stewardship roles, and data quality metrics.
- Utilize SAP BTP Data Intelligence to enhance data profiling, quality monitoring, and metadata management within SAP MDG.
- Provide guidance on integrating SAP MDG with other systems to ensure data consistency and accuracy.
- Provide effective application support: this involves collaboration among these roles and a commitment to resolving issues promptly, minimizing disruptions, and continuously improving the performance and functionality of business applications. It is essential for businesses to have a well-defined support structure in place to ensure the reliability and efficiency of their software applications.
- Handle day-to-day production issues and calls with the business users.
- Work closely with client counterparts and the onshore team.

Professional & Technical Skills:
Must-have skills:
- 6+ years of experience in the SAP MDG module.
- Expertise in SAP MDG functionalities, including data modeling, workflow configuration, and data quality management.
- Proficiency in SAP MDG's integration capabilities, including Cloud Connector, IDocs, and web services.
Good-to-have skills:
- Configure SAP MDG to define and enforce data governance rules, workflows, and data quality standards.
- Coordinate the overarching technical implementation (planning, customizing, programming, incident handling, implementation of change requests, roles and rights concept, etc.) and roll-out activities.
- Consult on the implementation of data models and data domains within SAP MDG for consistent master data management.
- Collaborate with data stewards and business owners to define data governance policies and processes.
- Design and implement data validation, enrichment, and de-duplication strategies within SAP MDG.
- Integrate SAP MDG with other SAP and non-SAP systems to ensure consistent master data across the landscape.
- Provide user training and support to ensure effective utilization of SAP MDG by data stewards and business users.
- Strong knowledge of ITIL processes in service management, change management, incident management, and problem management.

Additional Information: The candidate should have a minimum of 5 years of experience in the SAP Master Data Governance (MDG) tool. The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful solutions. This position is based at our Pune office.

Qualification: Minimum 15 years of fulltime education

Posted 3 months ago

Apply

5 - 9 years

8 - 10 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Hyperion Financial Management (HFM)
Good to have skills: Energy Fundamentals
Minimum 5 year(s) of experience is required

Key Responsibilities:
1. Act as an expert team member and help other technical developers; perform analysis, conceptual design, and development/implementation of modifications.
2. Coordinate with the Functions or Business team to translate business requirements into technical documents, diagrams or a consolidated design document (import formats, locations, mappings, data load rules, batch data loading).
3. Resolve tickets; design and build extensions, interfaces and reports for HFM; design and build HFM Smart View reporting, Workspace task flows and Extended Analytics.
4. Perform fit-gap analysis for changed features in new releases and design the upgrade strategy.

Technical Experience:
1. Must have Oracle Hyperion Financial Management.
2. Good to have knowledge of creating Workspace task flows and Extended Analytics, designing Chart of Accounts hierarchies for HFM, and designing changes to metadata based on requirements.
3. Good to have knowledge of HFR and Smart View reporting.
4. Good to have knowledge of the FCM application and reporting.
5. Should have prior experience of working on maintenance or implementation projects.

Professional Attributes:
1. Excellent communication skills.
2. Quick learner.
3. Contributor.

Educational Qualification: Minimum 15 years of fulltime education as per organization standards

Posted 3 months ago

Apply

3 - 4 years

5 - 6 Lacs

Chennai

Work from Office


SFDC Developer
Experience: 5+ years
Job Mode: Full-time
Work Mode: Remote

Responsibilities:
- Design, code and unit test highly complex systems and/or applications on the Force.com platform that may integrate with other applications and/or platforms in PropertyGuru, considering all factors such as performance, user experience, reusability, scalability, mobile compatibility, testability, data integrity, operational effectiveness, cost and time to market.
- Perform enhancements to the existing implementation; support upgrades, maintenance and clean-up.
- Administer and configure the Salesforce organization.
- Contribute and adhere to technical standards/guidelines and best practices in design, application programming, code review, deployment, environment refreshes, data migration/integration, application monitoring and iterative refinement.
- Analyze requirements, document and communicate the optimal design/development approach to meet requirements with detailed design and technical specifications.
- Partner with Product Owner(s)/Business Analyst(s), QA and other Salesforce developers/specialists actively in Sprints to roll out deliverables on time. (A hedged SOQL query sketch follows below.)

Job Description:
- Minimum 4 years of extensive application design, development, and configuration experience in Salesforce, including but not limited to Apex programming, Visualforce, Aura Lightning Components, web services/APIs, SOQL, SOSL, Salesforce DX, JavaScript, jQuery, CSS3, HTML5, and UX design.
- Salesforce Certified Platform Developer I or equivalent.
- Excellent overall understanding of the Salesforce core product suite combined with deep expertise and experience in at least two of the major cloud products (Sales, Service, Marketing, Community, Commerce, Einstein, etc.) and their features. Knowledge of Salesforce integration is a plus.
- Experience with release management, source control, automated builds and deployment concepts, and technologies such as DX scratch orgs, Ant, the SFDC Metadata API, Jenkins, Git (CodeCommit), and DevOps in a Salesforce environment.
- Exposure to Agile methodology and project management tools like JIRA.
- Salesforce CPQ experience is a plus.
- Good problem-solving, analytical, and time management skills with a focus on efficiency and attention to detail.
- Interpersonal skills: exceptional verbal/written communication in English.
- Bachelor/Masters in Computer Engineering (or similar).

About Encora
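The SOQL sketch referenced above is shown here, written in Python with the third-party simple_salesforce library purely to keep the example in one language (the listing itself centres on Apex/Visualforce). Credentials, object names, and fields are placeholders.

```python
# Hedged sketch: run a SOQL query against a Salesforce org from Python.
import os

from simple_salesforce import Salesforce


def accounts_created_this_quarter():
    sf = Salesforce(
        username=os.environ["SF_USERNAME"],
        password=os.environ["SF_PASSWORD"],
        security_token=os.environ["SF_SECURITY_TOKEN"],
    )

    # SOQL: relationship fields and date literals look SQL-like but run
    # against Salesforce objects rather than database tables.
    result = sf.query(
        "SELECT Id, Name, Owner.Name "
        "FROM Account "
        "WHERE CreatedDate = THIS_QUARTER "
        "ORDER BY Name LIMIT 20"
    )
    for record in result["records"]:
        print(record["Id"], record["Name"], record["Owner"]["Name"])


if __name__ == "__main__":
    accounts_created_this_quarter()
```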

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office


Salesforce Developer (IC)

Important Information:
Location: Bangalore
Experience: 5+ years
Job Mode: Full-time
Work Mode: Hybrid

Job Summary:
- Design, code and unit test highly complex systems and/or applications on the Force.com platform that may integrate with other applications and/or platforms in PropertyGuru, considering all factors such as performance, user experience, reusability, scalability, mobile compatibility, testability, data integrity, operational effectiveness, cost and time to market.
- Perform enhancements to the existing implementation; support upgrades, maintenance and clean-up.
- Administer and configure the Salesforce organization.
- Contribute and adhere to technical standards/guidelines and best practices in design, application programming, code review, deployment, environment refreshes, data migration/integration, application monitoring and iterative refinement.
- Analyze requirements, document and communicate the optimal design/development approach to meet requirements with detailed design and technical specifications.
- Partner with Product Owner(s)/Business Analyst(s), QA and other Salesforce developers/specialists actively in Sprints to roll out deliverables on time.

What we would like you to have:
- Minimum 4 years of extensive application design, development, and configuration experience in Salesforce, including but not limited to Apex programming, Visualforce, Aura Lightning Components, web services/APIs, SOQL, SOSL, Salesforce DX, JavaScript, jQuery, CSS3, HTML5, and UX design.
- Salesforce Certified Platform Developer I or equivalent.
- Excellent overall understanding of the Salesforce core product suite combined with deep expertise and experience in at least two of the major cloud products (Sales, Service, Marketing, Community, Commerce, Einstein, etc.) and their features. Knowledge of Salesforce integration is a plus.
- Experience with release management, source control, automated builds and deployment concepts, and technologies such as DX scratch orgs, Ant, the SFDC Metadata API, Jenkins, Git (CodeCommit), and DevOps in a Salesforce environment.
- Exposure to Agile methodology and project management tools like JIRA.
- Salesforce CPQ experience is a plus.
- Good problem-solving, analytical, and time management skills with a focus on efficiency and attention to detail.
- Interpersonal skills: exceptional verbal/written communication in English.
- Bachelor/Masters in Computer Engineering (or similar).

About Encora: Encora is a global company that offers Software and Digital Engineering solutions, with more than 9000 Encorians around the world. Our technology practices include Cloud Services, Product Engineering & Development, Data Modernization & Engineering, Digital Experience, DevSecOps, Cybersecurity, Quality Engineering, and Generative AI, among others. At Encora Inc. we hire professionals based solely on their skills and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality.

Posted 3 months ago

Apply

8 - 13 years

40 - 45 Lacs

Bengaluru

Work from Office


In this position, you will play the pivotal role of developing software programs, algorithms, and automated processes to identify meaningful operational insights from large volumes of data and metadata. You'll work with our customers, engineering, and internal stakeholders to gain operational insight across SaaS, PaaS and IaaS.

Other responsibilities include:
- Increasing ops efficiency by reducing noise and enriching incident data
- Increasing system uptime by predicting failures and taking proactive action (a hedged sketch follows below)
- Reliability engineering, performance analytics and capacity forecasting
- Leading and participating in planning meetings and providing leadership on the direction of products
- Communicating the vision and goals of existing and new products to internal stakeholders, the analyst community, and customers
- Influencing engineering organizations to focus on new feature innovation
- Working closely with the SaaS and Platform teams and being their internal and external champion
- Defining, implementing, and analyzing success metrics

Minimum Qualifications:
- Bachelor's Degree in Computer Science, Engineering, Statistics or a related discipline
- 4+ years of experience as a Data Scientist or ML Engineer
- Attention to detail; proven ability to manage multiple competing priorities simultaneously
- Ability to think both strategically and tactically in a high-energy, fast-paced environment

Preferred Qualifications:
- MS in Computer Science or Statistics
- Experience in data science or related technology
- Experience in designing and launching Cloud products
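The sketch referenced above illustrates the "predict failure from operational metrics" idea with scikit-learn on synthetic data. The features, the toy labelling rule, and the model choice are assumptions for demonstration only, not this team's actual approach.

```python
# Hedged sketch: train a classifier on synthetic host metrics to flag likely failures.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic host metrics: CPU %, memory %, error rate per minute.
n = 2000
X = np.column_stack([
    rng.uniform(0, 100, n),   # CPU utilisation
    rng.uniform(0, 100, n),   # memory utilisation
    rng.exponential(2.0, n),  # error rate
])
# Toy ground truth: failures are more likely under high load with many errors.
y = ((X[:, 0] > 85) & (X[:, 2] > 4)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# In production this score would feed alerting or auto-remediation rather than a report.
print(classification_report(y_test, model.predict(X_test)))
```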

Posted 3 months ago

Apply

3 - 7 years

12 - 16 Lacs

Bengaluru

Work from Office


- Develop and maintain the information architecture for the website.
- Own site navigation, taxonomy, and URL structures to improve usability and SEO.
- Collaborate with UX, content, SEO, and design teams for a seamless customer journey.
- Define and manage metadata and tagging strategies for improved searchability.
- Conduct user research, card sorting, and tree testing to validate IA decisions.
- Analyze web analytics and user behavior to identify improvement opportunities.
- Ensure content organization consistency aligned with brand messaging and goals.
- Partner with web engineering on implementation of IA improvements.
- Lead content migration and restructuring during platform transitions or updates.
- Stay updated on best practices in information architecture, UX, and SEO.

What You Bring:
- 5+ years of experience in information architecture, UX, content strategy, or web strategy in B2B SaaS or enterprise environments.
- Strong understanding of website navigation, taxonomy, and user experience principles.
- Experience in conducting user research and usability testing.
- Familiarity with web analytics tools (e.g., Google Analytics, Adobe Analytics, Hotjar).
- Understanding of SEO best practices, including site structure and internal linking.
- Experience with CMS platforms (e.g., AEM, WordPress) and structured content models.
- Proven ability to collaborate cross-functionally with various teams.
- Excellent problem-solving and communication skills, capable of simplifying complex structures.

Nice to Have:
- Experience with Adobe Experience Manager (AEM) or other enterprise CMS platforms.
- Background in conversion rate optimization (CRO) and demand generation.
- Familiarity with AI-driven search, personalization, and content recommendation systems.

Posted 3 months ago

Apply

12 - 16 years

13 - 18 Lacs

Mumbai

Work from Office


Job Summary: We are seeking an experienced Data Architect with a minimum of 10 years of professional experience, preferably in the Indian BFSI industry. The successful candidate will be responsible for designing and implementing robust, scalable, and secure data architectures to support the organization's data-driven initiatives and business objectives, with experience in cloud (AWS, Azure, GCP) and on-premises (Cloudera/CDP) environments.

Key Responsibilities:

Must Have:
1. Analyze and understand the organization's data landscape, including existing data sources, data flows, and data storage solutions in both cloud (AWS, Azure, GCP) and on-premises (Cloudera) environments.
2. Design and develop comprehensive data architectures that address the organization's current and future data requirements, ensuring data integrity, security, and compliance across cloud and on-premises infrastructure.
3. Collaborate with cross-functional teams, including business analysts, application developers, and IT infrastructure teams, to align data architecture with the overall IT and business strategies.
4. Define data integration and transformation processes, ensuring efficient and reliable data flows between various systems and applications, both on-premises and in the cloud (AWS, Azure, GCP).
5. Implement data governance and data quality management practices to ensure data consistency, accuracy, and reliability across the organization, irrespective of the data storage location (cloud or on-premises).
6. Assess and recommend appropriate data storage technologies, such as relational databases, data warehouses, and big data platforms (e.g., Cloudera), based on the organization's requirements, considering both cloud (AWS, Azure, GCP) and on-premises options.

Good to Have:
7. Develop and maintain data models, data dictionaries, and metadata repositories to provide a comprehensive understanding of the organization's data assets across cloud and on-premises environments.
8. Evaluate and select appropriate data management tools and technologies to support the data architecture, ensuring seamless integration and scalability across cloud (AWS, Azure, GCP) and on-premises (Cloudera) environments.
9. Provide technical leadership and mentorship to the data engineering and analytics teams, contributing to the overall data management capabilities, with a focus on both cloud and on-premises data solutions.
10. Stay updated with industry trends, best practices, and emerging technologies in the data management domain, and recommend innovative solutions to enhance the organization's data capabilities, including cloud-based (AWS, Azure, GCP) and on-premises (Cloudera) solutions.

Qualifications and Experience:
- Minimum 8-10 years of experience in data architecture, with a strong focus on the Indian BFSI industry.
- Proficient in designing and implementing data architectures using a variety of data storage solutions, such as relational databases, data warehouses, and big data platforms (e.g., Cloudera), in both cloud (AWS, Azure, GCP) and on-premises environments.
- Extensive knowledge of data integration and transformation techniques, including ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes, with experience in both cloud-based (AWS, Azure, GCP) and on-premises (Cloudera) approaches.
- Familiarity with data governance frameworks, data quality management, and metadata management practices, and their application across cloud (AWS, Azure, GCP) and on-premises (Cloudera) environments.
- Hands-on experience with data modeling, database design, and performance optimization, considering the unique requirements of cloud (AWS, Azure, GCP) and on-premises (Cloudera) data architectures.
- Strong understanding of data security, privacy, and compliance regulations in the BFSI industry, and the ability to apply these principles in both cloud (AWS, Azure, GCP) and on-premises (Cloudera) environments.
- Proven ability to collaborate with cross-functional teams and effectively communicate technical concepts to both technical and non-technical stakeholders.
- Experience in leading and mentoring data engineering and analytics teams, with a focus on cloud (AWS, Azure, GCP) and on-premises (Cloudera) data management capabilities.
- Degree in Computer Science, Information Technology, or a related field. Advanced degrees or certifications in data management or architecture, with a focus on cloud (AWS, Azure, GCP) and on-premises (Cloudera) solutions, are preferred.

If you have the required skills and experience, we encourage you to apply for this exciting opportunity to join our dynamic and innovative team.

Posted 3 months ago

Apply

8 - 13 years

40 - 80 Lacs

Hyderabad

Work from Office


Interested in building the next generation of financial systems that can handle billions of dollars in transactions? Interested in building highly scalable next-generation systems on the Amazon cloud? Massive data volume and complex business rules in a highly distributed, service-oriented architecture make this a world-class information collection and delivery challenge. Our challenge is to deliver the software systems that accurately capture, process, and report on the huge volume of financial transactions generated each day as millions of customers make purchases, as thousands of Vendors and Partners are paid, as inventory moves in and out of warehouses, as commissions are calculated, and as taxes are collected in hundreds of jurisdictions worldwide.

The ideal candidate will draw upon exemplary analytical, critical thinking and problem-solving skills, deep software development experience, and a passion for creating maintainable, highly reliable and scalable user-facing applications that are accessed by thousands of external Vendors and internal Customers. Successful members of this team collaborate effectively with internal customers and other dependent development teams in Amazon to develop new applications against high operational standards of system availability and reliability. In the space of workflow management tools, engineers on this team solve problems for the first time and have the opportunity to convert them into generic, reusable components for the broader engineering community. We look for engineers who are excellent communicators, self-motivated, flexible, hardworking, and who like to have fun. This candidate also plays an active role in reviewing technical designs from the team and in mentoring other developers on the team. You will have the opportunity to play a key role in building new software products and features from the ground up. Your work will allow you to utilize a melting pot of technologies, programming languages, and systems and require you to keep up with the ever-changing technological landscape. Your responsibilities will include all aspects of Data Engineering, with the freedom and encouragement to explore your own ideas and the reward of seeing your contributions benefit Amazon worldwide. The Finance Automation team is looking for a talented software development engineer who can tackle large, complex projects.

Key job responsibilities:
- Design, build and own components of a high-volume data warehouse.
- Build efficient data models using industry best practices and metadata for ad hoc and pre-built reporting.
- Interface with business customers, gathering requirements and delivering complete data and reporting solutions, owning the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards to drive key business decisions.
- Continually improve ongoing reporting and analysis processes, automating and simplifying self-service support capabilities for our customers.
- Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources.
- Own the functional and non-functional scaling of software systems in your area.
- Provide input and recommendations on technical issues to other engineers, business stakeholders, and data analysts.
- Collaborate with data scientists to continue to build and enhance new or existing ML programs.
- Experience as a data engineer or in a related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of manipulating, processing, and extracting value from large datasets.

Posted 3 months ago

Apply

5 - 9 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Multiplatform Front End Development React, Amazon Web Services (AWS), JavaScript, Java Enterprise Edition
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for ensuring the successful delivery of projects and providing technical guidance to the team. Your typical day will involve collaborating with stakeholders, analyzing requirements, designing solutions, and overseeing the development process. You will also be involved in troubleshooting and resolving any issues that arise during the project.

Key Responsibilities:
1. Understand requirements and be involved in design and implementation.
2. Collaborate with peers who have domain expertise to build the right solution for the business.
3. Self-driven and capable of managing multiple priorities under pressure and ambiguity.
4. Ability to work effectively in a fast-paced environment.
5. Keen eye for usability, creating intuitive, visually appealing experiences.
6. The UI will be used by consumers to extract relevant data from the metadata repository; development work in the search space uses Elasticsearch.

Technical Experience:
1. 5+ years of experience developing with ReactJS; candidates should be strong at coding (a coding test is mandatory).
2. Strong fundamental JavaScript skills (ES5 and ES6) and CSS skills.
3. Thorough understanding of ReactJS and its core principles; React combined with Flux/Redux experience is preferred.
4. Experience with TypeScript or ClojureScript is a plus.
5. Experience developing component-driven UIs.
6. Experience with data structure libraries.
7. Knowledgeable in performance optimization techniques.
8. State management with Redux (see the sketch below).
9. Knowledge of AWS services and deployment is an advantage.

Additional Information: The candidate should have a minimum of 5 years of experience in Multiplatform Front End Development React. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful multiplatform front-end solutions. Ready to work in B shift (12 PM to 10 PM). A coding test is mandatory for every resource.

Qualification: 15 years full time education
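
As a rough illustration of the Redux-style state management this listing asks for, here is a minimal reducer sketch in TypeScript. The state shape, action names, and MetadataRecord fields are hypothetical; a production app would use the redux or @reduxjs/toolkit packages and wire the store to React components with react-redux.

```typescript
// Minimal sketch of Redux-style state management for a metadata search UI.
// The state shape, action names, and MetadataRecord fields are hypothetical.

interface MetadataRecord {
  id: string;
  title: string;
}

interface SearchState {
  query: string;
  results: MetadataRecord[];
  loading: boolean;
}

type SearchAction =
  | { type: "search/started"; query: string }
  | { type: "search/succeeded"; results: MetadataRecord[] }
  | { type: "search/failed" };

const initialState: SearchState = { query: "", results: [], loading: false };

// Pure reducer: the same (state, action) pair always yields the same next state.
function searchReducer(state: SearchState = initialState, action: SearchAction): SearchState {
  switch (action.type) {
    case "search/started":
      return { ...state, query: action.query, loading: true };
    case "search/succeeded":
      return { ...state, results: action.results, loading: false };
    case "search/failed":
      return { ...state, results: [], loading: false };
    default:
      return state;
  }
}

// Tiny stand-in for a Redux store, just to exercise the reducer.
let state = initialState;
function dispatch(action: SearchAction): void {
  state = searchReducer(state, action);
}

dispatch({ type: "search/started", query: "drama series" });
dispatch({ type: "search/succeeded", results: [{ id: "m-1", title: "Example Title" }] });
console.log(state); // { query: "drama series", results: [...], loading: false }
```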

Posted 3 months ago

Apply

5 - 9 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Multiplatform Front End Development React, Amazon Web Services (AWS), Java Enterprise Edition, JavaScript
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Multiplatform Front End Development React Application Lead, you will be responsible for designing, building, and configuring applications using Amazon Web Services (AWS), Java Enterprise Edition, and JavaScript. Your typical day will involve leading the effort to design, build, and configure applications, acting as the primary point of contact, and collaborating with cross-functional teams to deliver high-quality solutions.

Key Responsibilities:
1. Understand requirements and be involved in design and implementation.
2. Collaborate with peers who have domain expertise to build the right solution for the business.
3. Self-driven and capable of managing multiple priorities under pressure and ambiguity.
4. Ability to work effectively in a fast-paced environment.
5. Keen eye for usability, creating intuitive, visually appealing experiences.
6. The UI will be used by consumers to extract relevant data from the metadata repository; development work in the search space uses Elasticsearch (see the query sketch below).

Technical Experience:
1. 7+ years of experience developing with ReactJS.
2. State management with Redux.
3. Strong fundamental JavaScript skills (ES5 and ES6) and CSS skills.
4. Experience with TypeScript or ClojureScript is a plus.
5. Thorough understanding of ReactJS and its core principles; React combined with Flux/Redux experience is preferred.
6. Experience developing component-driven UIs.
7. Experience with data structure libraries.
8. Knowledgeable in performance optimization techniques.
9. Knowledge of AWS services and deployment is an advantage.

Additional Information: The candidate should have a minimum of 5 years of experience in Multiplatform Front End Development React. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful multiplatform front-end solutions. Ready to work in B shift (12 PM to 10 PM).

Qualification: 15 years full time education
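
As a rough illustration of the Elasticsearch-backed search work this listing mentions, here is a minimal query sketch in TypeScript. The index name, field name, host, and result shape are assumptions for illustration only; authentication and error handling are omitted.

```typescript
// Minimal sketch of the kind of Elasticsearch query a metadata search UI might
// issue. The "metadata" index, the "title" field, and the local host are
// hypothetical; a real deployment would sit behind an authenticated gateway.

interface SearchHit {
  _id: string;
  _source: { title?: string; [key: string]: unknown };
}

async function searchMetadata(term: string): Promise<SearchHit[]> {
  // Elasticsearch's standard _search endpoint with a simple match query.
  const response = await fetch("http://localhost:9200/metadata/_search", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: { match: { title: term } },
      size: 10,
    }),
  });
  if (!response.ok) {
    throw new Error(`Search failed with status ${response.status}`);
  }
  const body = await response.json();
  return body.hits.hits as SearchHit[];
}

// Example usage (assumes a local Elasticsearch instance with a "metadata" index).
searchMetadata("drama")
  .then((hits) => hits.forEach((h) => console.log(h._id, h._source.title)))
  .catch(console.error);
```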

Posted 3 months ago

Apply

5 - 9 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Collibra Data Governance
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Bachelor in Computer Science

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models using Collibra Data Governance.

Key Responsibilities:
- Knowledge of the Collibra operating model, workflow BPMN development, and how to integrate various applications or systems with Collibra; good communication skills.
- Design the Data Governance organization, including the steering committee, data governance office, stewardship layer, and other working groups.
- Set up people and processes, including relevant roles, responsibilities and controls, data ownership, workflows, and common processes.

Technical Experience:
- Experience in data governance across a wide variety of data types (structured, semi-structured, and unstructured) and data sources (HDFS, S3, Kafka, Cassandra, Hive, HBase, Elasticsearch).
- Working experience with the Collibra operating model, workflow BPMN development, and integrating various applications or systems with Collibra.
- Experience in setting up people's roles, responsibilities and controls, data ownership, workflows, and common processes.
- Integrate Collibra with other enterprise tools: data quality tools, data catalog tools, and master data management solutions.
- Develop and configure all customized Collibra workflows.
- Develop APIs (REST, SOAP) to expose metadata functionality to end users, as illustrated in the sketch below.

Professional Experience:
- Work as an SME in data governance, metadata management, and data catalog solutions, specifically Collibra Data Governance.
- Client interface and consulting skills required.
- Partner with Data Stewards on requirements, integrations, and processes; participate in meetings and working sessions.
- Partner with Data Management and integration leads to continuously improve Data Management technologies and processes.

Qualification: Bachelor in Computer Science
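
As a rough illustration of exposing metadata over REST, here is a minimal TypeScript sketch using Node's built-in http module. It is a generic example, not Collibra's actual API; the asset fields, endpoint path, and in-memory store are hypothetical.

```typescript
// Minimal sketch of a REST endpoint that exposes metadata assets to end users.
// Generic illustration only: the asset fields and in-memory store are made up,
// and a real service would query the governance platform instead.

import { createServer, IncomingMessage, ServerResponse } from "node:http";

interface MetadataAsset {
  id: string;
  name: string;
  domain: string;
  steward: string;
}

// Stand-in for a metadata repository.
const assets: MetadataAsset[] = [
  { id: "a-1", name: "customer_orders", domain: "Sales", steward: "data.steward@example.com" },
  { id: "a-2", name: "title_metadata", domain: "Content", steward: "metadata.team@example.com" },
];

function handle(req: IncomingMessage, res: ServerResponse): void {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (req.method === "GET" && url.pathname === "/assets") {
    // Optional ?domain= filter, e.g. GET /assets?domain=Sales
    const domain = url.searchParams.get("domain");
    const result = domain ? assets.filter((a) => a.domain === domain) : assets;
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(result));
  } else {
    res.writeHead(404, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "not found" }));
  }
}

// Example: curl "http://localhost:8080/assets?domain=Sales"
createServer(handle).listen(8080, () => console.log("metadata API listening on :8080"));
```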

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies