Jobs
Interviews

16 Data Dictionaries Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Kenvue is a company dedicated to the power of everyday care, rooted in a rich heritage and scientific expertise. With iconic brands like NEUTROGENA, AVEENO, TYLENOL, LISTERINE, JOHNSON'S, and BAND-AID, Kenvue is committed to delivering the best products to customers globally. As a Kenvuer, you will be part of a diverse team of 22,000 individuals focused on insights, innovation, and making a positive impact on millions of lives daily.

As a Senior Data Modeler at Kenvue Data Platforms, based in Bengaluru, you will collaborate with various teams including Business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) to drive innovative data products for end users. Your role involves developing solution architectures, defining data models, and ensuring that acquisition, ingestion, and reporting requirements are met efficiently.

Key Responsibilities:
- Provide expertise in data architecture and modeling to build next-generation product capabilities that drive business growth.
- Collaborate with Business Analytics leaders to translate business needs into optimal architecture designs.
- Design scalable, reusable data models adhering to FAIR principles for different functional areas.
- Work closely with data engineers, solution architects, and stakeholders to optimize data models.
- Create and maintain metadata rules, data dictionaries, and lineage details for data models.

Qualifications:
- Undergraduate degree in Technology, Computer Science, or related fields; advanced degree preferred.
- Strong interpersonal and communication skills to collaborate effectively with various stakeholders.
- 3+ years of experience in data architecture and modeling in Consumer/Healthcare Goods companies.
- 5+ years of progressive experience in Data & Analytics initiatives.
- Hands-on experience with cloud architecture (Azure, GCP, AWS) and cloud-based databases.
- Expertise in SQL, Erwin / ER Studio, and data modeling techniques and methodologies.
- Familiarity with NoSQL and graph databases, and with data catalogs.
- Experience with Agile methodology (Scrum/Kanban) within a DevSecOps model.
- Proven track record of contributing to high-profile projects with changing requirements.

Join Kenvue in shaping the future and making a difference in the world of data and analytics. Proud to be an equal opportunity employer, Kenvue values diversity and inclusion in its workforce.

Location: Bangalore, India
Job Function: Digital Product Development
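The data dictionary and lineage maintenance that roles like this describe is often prototyped in plain code before being loaded into a catalog tool. A minimal sketch (all table, column, and source names are hypothetical, not from any real Kenvue model):

```python
from dataclasses import dataclass, field

# A minimal data-dictionary entry: one record per column of a governed table.
@dataclass
class DictionaryEntry:
    table: str
    column: str
    dtype: str
    description: str
    pii: bool = False          # flags columns needing masking / access controls
    lineage: list = field(default_factory=list)  # upstream "table.column" sources

entries = [
    DictionaryEntry("dim_customer", "customer_id", "VARCHAR(36)",
                    "Surrogate key for a customer", lineage=["crm.contact.id"]),
    DictionaryEntry("dim_customer", "email", "VARCHAR(255)",
                    "Primary contact email", pii=True, lineage=["crm.contact.email"]),
]

# A simple completeness rule: every entry must carry a description and lineage.
def incomplete(dictionary):
    return [e for e in dictionary if not e.description or not e.lineage]

print([f"{e.table}.{e.column}" for e in entries if e.pii])  # ['dim_customer.email']
print(len(incomplete(entries)))                             # 0 -> dictionary complete
```

In practice the same structure ends up as rows in a catalog such as Collibra or Alation; the dataclass form just makes the metadata rules testable.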

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Business Analyst specializing in Product & Pricing Master Data, you will play a crucial role in the large-scale Finance Transformation at a FTSE 20 company resulting from LSEG's 2021 acquisition of Refinitiv. The finance organization has become complex because it operates on two Enterprise Resource Planning (ERP) systems, Oracle EBS (LSEG) and SAP ECC6 (Refinitiv), leading to dual business processes, control environments, and data structures. The organization has chosen Oracle Fusion as its single ERPM platform and is currently working on Global Design with PwC, the implementation partner.

Your responsibilities will include:
- Eliciting and analyzing business requirements from product owners and functions, and identifying data needs across product functions to enhance reusability and flexibility.
- Leading the conceptual design of comprehensive data models, integration methodology, and taxonomies to support product & pricing in Oracle Product Hub.
- Collaborating with multi-functional teams (finance, technology, operations, regulatory, sales, marketing, and product groups) to ensure successful design and implementation of the Product Data model on the new platform.
- Documenting data flows, entity relationships, and data dictionaries; identifying data integrity issues and proposing solutions.
- Creating and maintaining data models using industry-standard tools and methodologies, and organizing design workshops.
- Establishing testing scope and test scripts for Product Hub, and validating data models against business requirements and use cases.
- Supporting implementation teams with guidance on data architecture, ensuring data models comply with privacy and security requirements, and ensuring scalability and reusability of Product data model designs.

To be successful in this role, you should have:
- 5+ years of demonstrable experience in Product Modelling in Oracle Product Hub.
- 10+ years of experience in successful implementation of Product and Pricing Master Data in large, complex programs.
- Proven expertise in simplifying and standardizing complex product structures.
- Experience integrating product & pricing data with Q2C processes.
- Good stakeholder management skills, an open-minded approach to driving solutions, and a collaborative, positive attitude.

Joining LSEG means being part of a diverse and inclusive organization of over 25,000 people across 70 countries. The company values individual perspectives and believes that a diverse workforce is a strength that fosters collaboration, creativity, and new ideas. LSEG offers a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.

Please note that this job description is subject to the privacy notice of the London Stock Exchange Group (LSEG), which outlines the use of personal information and data protection rights. If you are a Recruitment Agency Partner, it is your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.

Posted 1 week ago

Apply

5.0 - 15.0 years

0 Lacs

Karnataka

On-site

The role of Talend Developer and Architect at our company involves designing, developing, testing, and deploying integration processes using Talend. Your responsibilities will include collaborating with team members to understand requirements; coding, debugging, and optimizing code for performance; maintaining documentation for processes; and contributing to technology-improvement initiatives.

As a Talend Developer and Architect, you will design and develop robust data integration solutions in Talend Studio to meet business requirements. You will also implement data governance frameworks and policies, configure Talend Data Catalog, manage metadata repositories, data quality rules, and data dictionaries, and optimize data pipelines for performance and scalability.

To excel in this role, you should have a background in Computer Science; proficiency in back-end web development and software development; strong programming skills with an emphasis on Object-Oriented Programming (OOP); and experience with ETL tools, particularly Talend. Excellent analytical and problem-solving skills, along with good communication and teamwork abilities, are essential. A Bachelor's degree in Computer Science, Information Technology, or a related field is required.

You will work closely with data stewards, business analysts, data engineers, data scientists, and business stakeholders to understand and fulfill data integration requirements. If you are looking for a challenging opportunity to showcase your skills and contribute to the success of our organization, this role is perfect for you.

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You will be responsible for defining and implementing the data governance strategy within the Telecom domain. This role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.

As a Telecom Data Governance Lead, you will:
- Define and implement an enterprise-wide data governance framework.
- Own the metadata catalog, ensuring consistency across business and technical assets.
- Develop and manage KPI registries, data dictionaries, and lineage documentation.
- Collaborate with data stewards and domain owners to establish attribute ownership.
- Lead efforts around data standardization, quality rules, and classification of sensitive data.
- Ensure privacy and compliance by enforcing tagging, masking, and access rules.
- Define access control rules, oversee governance for data products and federated data domains, support audits, and coordinate with various teams.

To qualify for this position, you should have:
- A Bachelor's or Master's degree in Computer Science, Telecommunications Engineering, Data Science, or a related technical field.
- A minimum of 10 years of experience in data governance roles, with at least 3-4 years specifically in the telecommunications industry.
- Experience integrating governance with modern data stacks such as Databricks and Snowflake.
- Proficiency in data governance tools like Alation, Unity Catalog, and Azure Purview, and a proven understanding of metadata management, data lineage, and data quality frameworks.
- Experience implementing federated governance models and data stewardship programs, and knowledge of compliance requirements (GDPR, HIPAA, PII, etc.).

Familiarity with data mesh principles and data contract approaches, excellent communication and stakeholder management skills, and a background in telecom, networking, or other data-rich industries are beneficial. Certification in data governance or management frameworks is a plus.
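The tagging and masking controls this kind of role enforces are often prototyped as simple transformations before being configured in a governance tool. A minimal sketch, where the column tags and the masking policy (redact all but the last two characters) are illustrative assumptions, not any catalog's real rules:

```python
# Columns tagged as sensitive are masked before data is shared downstream.
SENSITIVE_TAGS = {"pii", "phi"}

def mask(value: str) -> str:
    """Keep the last 2 characters; redact the rest."""
    return "*" * max(len(value) - 2, 0) + value[-2:]

def apply_policy(row: dict, column_tags: dict) -> dict:
    """Mask every value whose column carries a sensitive tag."""
    return {
        col: mask(str(val)) if SENSITIVE_TAGS & column_tags.get(col, set()) else val
        for col, val in row.items()
    }

# Hypothetical telecom columns: a subscriber number (PII) and a plan name.
tags = {"msisdn": {"pii"}, "plan": set()}
row = {"msisdn": "919876543210", "plan": "prepaid"}
print(apply_policy(row, tags))  # {'msisdn': '**********10', 'plan': 'prepaid'}
```

In production, platforms like Unity Catalog or Purview attach such policies to column classifications so the masking is applied at query time rather than in application code.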

Posted 1 week ago

Apply

5.0 - 7.0 years

1 - 5 Lacs

Hyderabad, Telangana, India

On-site

Job description: The Data & Analytics team is responsible for integrating new data sources, creating data models, developing data dictionaries, and building machine learning models for Wholesale Bank. The primary objective is to design and deliver data products that help squads at Wholesale Bank achieve business outcomes and generate valuable business insights. Within this job family, we distinguish between Data Analysts and Data Scientists.

Requirements: We are seeking a highly skilled Data Science and Machine Learning specialist with 2+ years of experience in Advanced Analytics and statistical and ML model development. In this role, you will leverage data-driven insights and machine learning techniques to solve complex business problems, optimize processes, and drive innovation. The ideal candidate will be skilled in working with large datasets.

Key Responsibilities:
- Extract and analyze data from company databases to drive the optimization and enhancement of product development and marketing strategies.
- Analyze large datasets to uncover trends, patterns, and insights that can influence business decisions.
- Leverage predictive and AI/ML modeling techniques to enhance and optimize customer experience, boost revenue generation, improve ad targeting, and more.
- Design, implement, and optimize machine learning models for a wide range of applications such as predictive analytics, natural language processing, recommendation systems, and more.
- Conduct experiments to fine-tune machine learning models and evaluate their performance using appropriate metrics.

Qualifications:
- Bachelor's, Master's, or Ph.D. in Computer Science, Data Science, Mathematics, Statistics, or a related field.
- 2+ years of experience in analytics, machine learning, and deep learning.
- Proficiency in programming languages such as Python, and familiarity with machine learning libraries (e.g., NumPy, Pandas, TensorFlow, Keras, PyTorch, scikit-learn).
- Strong experience with data wrangling: cleaning and transforming raw data into structured, usable formats.
- Hands-on experience developing, training, and deploying machine learning models for various applications (e.g., predictive analytics, recommendation systems, anomaly detection).
- In-depth understanding of machine learning algorithms (supervised, unsupervised, reinforcement learning) and their appropriate use cases.
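The train-then-evaluate loop described in the responsibilities above can be sketched with scikit-learn, one of the libraries the posting names. The dataset is synthetic and the model choice and hyperparameters are illustrative assumptions, not anything prescribed by the role:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a business dataset (e.g., a churn-prediction table).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit a baseline model, then evaluate it on a held-out set with a metric
# suited to the problem; a separate holdout guards against overfitting.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"ROC AUC: {auc:.3f}")
```

Swapping in a different estimator or metric (F1 for imbalanced classes, RMSE for regression) is the "appropriate metrics" judgment the posting asks for.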

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are a skilled and detail-oriented Data Governance Analyst looking to join the Data Lakehouse program team of a leading insurance and investments firm in Pune (hybrid; three days a week in office). Your primary responsibility will be to ensure data integrity, quality, and compliance across the organization, focusing on Data Ownership/Stewardship, Metadata Management, Data Quality, and Reference Data Management.

Your key responsibilities will include:

Metadata Management:
- Review and validate metadata documents and ingestion templates.
- Analyze and recommend improvements to data dictionaries, business glossaries, access controls, etc.
- Ensure metadata accuracy and completeness across all data assets.

Data Ownership and Stewardship:
- Collaborate with Data Owners and Stewards to align data governance standards with business requirements.
- Facilitate communication between technical teams and business stakeholders.

Data Quality:
- Review and enforce data quality requirements.
- Develop data quality metrics and monitoring processes.
- Identify and address data quality issues in collaboration with relevant teams.

Reference Data Management:
- Review and standardize reference data and Lists of Values.
- Ensure proper maintenance and version control of reference data.
- Collaborate with business units to define and implement reference data standards.

Cross-functional Collaboration:
- Work closely with teams such as Business Systems Analysts and Data Architects.
- Participate in data governance meetings and initiatives.
- Contribute to the development and implementation of data governance policies and procedures.

Preferred Qualifications:
- Professional certifications in data governance or data management.
- Experience with data lakehouse architectures and technologies.
- Familiarity with Agile methodologies and project management practices.
- Experience with data governance tools and applications.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data governance, data management, or a similar role.
- Strong understanding of data governance principles, metadata management, and data quality concepts.
- Experience with data dictionaries, business glossaries, and data classification methodologies.
- Familiarity with insurance and investment industry data standards and regulations.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Proficiency in data governance tools and technologies.
- Knowledge of data privacy regulations and best practices.

Posted 1 week ago

Apply

5.0 - 12.0 years

0 Lacs

Karnataka

On-site

Are you looking for a new career challenge? With LTIMindtree, are you ready to embark on a data-driven career? Working for a global leading manufacturing client to provide an engaging product experience through best-in-class PIM implementation, and building rich, relevant, and trusted product information across channels and digital touchpoints so end customers can make informed purchase decisions, will surely be a fulfilling experience.

This position involves managing Alation Platform Support, including:
- Managing users, roles, and groups and assigning required privileges.
- Working with vendors to understand the roadmap and influencing the Alation roadmap to include use cases and feature requests.
- Scheduling MDE, Data Sampling, and QLI, and onboarding new projects onto the Alation platform.
- Creating and managing Alation domains, sub-domains, and document hubs, and setting up new connectors or data sources.
- Configuring SSO/SAML authentication for the Alation platform and syncing LDAP with security groups and roles.
- Uploading data dictionaries for bulk curation through the GUI.
- Coordinating with the Alation vendor on product bugs, capabilities, feature requests, and bug fixes.
- Preparing and maintaining a tracker for configuration changes and performance tweaks.
- Reviewing Alation API access and granting appropriate access case by case.
- Configuring Lexicon Alli bots for automatic curation of business and technical terms.
- Supporting projects migrating from on-prem to cloud; performing platform upgrades, patches, and bug fixes.
- Creating support and change management groups, and assisting data stewards with issues and guidance.
- Defining the security model and enforcing security best practices.
- Onboarding the Alation platform to Asset Management, working with TQ to qualify the Alation on-prem and cloud platforms, maintaining security/firewall rules, and performing other related tasks.

The ideal candidate should have 5-12 years of experience and be available for an immediate to 30 days' notice period. Join us to work on industry-leading implementations for Tier-1 clients, experience accelerated career growth and global exposure, work in a collaborative, inclusive environment rooted in innovation, gain exposure to a best-in-class automation framework, and be part of an innovation-first culture that embraces automation, AI insights, and clean data.

If you know someone who fits this role perfectly, tag them and let's connect the right talent with the right opportunity. For more details, you can reach out to Santosh.M@ltimindtree.com. This position is located at PAN India LTIM locations.

Posted 1 week ago

Apply

2.0 - 7.0 years

7 - 17 Lacs

Hyderabad

Work from Office

About this role: Wells Fargo is seeking a Data Product Management Consultant.

In this role, you will:
- Participate in less complex analysis to identify and remediate data quality or integrity issues and process or control gaps.
- Adhere to data governance standards and procedures.
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality.
- Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, and procedures to ensure data control and remediation for companywide data management functions.
- Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates.
- Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis.
- Provide support for regulatory analysis and reporting requirements.
- Recommend plans for the development and implementation of initiatives that assess the quality of new data sources.
- Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls.
- Consult with clients to assess the current state of data quality within the area of assigned responsibility.

Required Qualifications:
- 2+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or the equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Experience in large enterprise data initiatives.
- Experience managing data entry, cleansing, and updating processes across core systems, and identifying and resolving data inconsistency or quality issues.
- Banking business or technology experience.
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.).
- Knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing.
- Awareness of ETL concepts, data integration, transformation, and data load techniques.
- BI concepts and solutions, analytical dashboards, and different types of reporting.
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.).
- Agile principles and ceremonies, the Software Development Lifecycle, JIRA, Scrum/Kanban.

Skills: SQL (Teradata, Snowflake), Python, Regression & Clustering, Alteryx & LLM.

Job Expectations:
- Assist in implementing the data process.
- Monitor data flows and perform regular audits to maintain data integrity.
- Ensure consistent data definition and usage across systems.
- Collaborate with data engineers and architects to optimize data flow and storage.
- Work with IT and business teams to implement data cleansing and validation rules.
- Identify, investigate, and resolve data quality issues through root cause analysis.
- Define and maintain data dictionaries, metadata, and business glossaries.
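Data quality audits of the kind this posting describes often start as plain SQL checks against a table. A minimal sketch using an in-memory SQLite database (the table, columns, and rules are hypothetical examples, not Wells Fargo's):

```python
import sqlite3

# Hypothetical customer table seeded with two common quality defects:
# a missing (NULL) email and a duplicate business key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (customer_id TEXT, email TEXT);
    INSERT INTO customer VALUES
        ('C1', 'a@example.com'),
        ('C2', NULL),
        ('C3', 'b@example.com'),
        ('C3', 'b@example.com');
""")

# Rule 1 (completeness): no NULL emails.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customer WHERE email IS NULL").fetchone()[0]

# Rule 2 (uniqueness): customer_id must be a key.
dupes = conn.execute("""
    SELECT customer_id, COUNT(*) AS n FROM customer
    GROUP BY customer_id HAVING n > 1
""").fetchall()

print(null_emails)  # 1 row fails the completeness rule
print(dupes)        # [('C3', 2)] fails the uniqueness rule
```

Run on a schedule, counts like these become the "data quality metrics" used to benchmark state over time; the same queries translate directly to T-SQL or Teradata SQL.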

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's , our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to , our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Lead Consultant - Business Analyst.

You will participate in and contribute to all stages of the end-to-end project life cycle, with a thorough understanding of and experience in both agile and waterfall-based project execution. You will review, analyze, and evaluate business processes and systems, both new and existing; work with stakeholders to elicit requirements and develop solutions; and understand requirements and acceptance criteria, connecting the dots for each assigned task.

Responsibilities:
- Experienced in Agile Scrum: story writing, managing requirements and acceptance criteria, and bug/issue management.
- Well-versed in project management tools such as JIRA and Confluence.
- Experienced in the development and execution of independent validation and verification test cases and plans, as well as production-based operational acceptance testing.
- Collaborate with development teams to ensure requirements are well understood and the solution being built is in line with requirements.
- Able to build use cases and present them to various teams.
- Understand and gather data requirements for items such as (but not limited to) new data demands, new regulatory consumption, attribute mapping, and defining aggregation rules.

Qualifications we seek in you!

Minimum Qualifications / Skills:
- B.E / B.Tech / Any Graduation

Preferred Qualifications / Skills:
- Experience in the Banking domain with exposure to Consumer Banking.
- Experience using ETL tools such as Datameer, Informatica, etc.
- Experience applying programming/computational languages: SQL, Python.
- Proficient in Microsoft Word, Advanced Excel (VB Macros/Pivots), PowerPoint, and Microsoft SharePoint.
- Experience in SDLC documentation: writing BRDs and FRDs, performing stakeholder analysis and process mapping, writing procedures, creating operational readiness guides, creating data dictionaries, attribute mapping, DQ lifecycles, cataloging, document naming conventions, version control, etc.
- Good understanding of databases and data structures.
- Experience in technology-based process mapping and current-to-target state analysis using MS Visio, Blueworks, etc.
- Basic knowledge of BI/reporting tools such as Tableau and Power BI and their various functionalities.

Why join Genpact?
- Lead AI-first transformation: build and scale AI solutions that redefine industries.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a skilled professional in MDM solutions using Profisee, you will play a crucial role in designing, developing, implementing, and maintaining our master data management systems. Your expertise in data governance, data quality, and data integration will ensure the accuracy, consistency, and completeness of our master data. This position requires not only strong technical skills but also excellent communication abilities to collaborate effectively with cross-functional teams.

Responsibilities:

Solution Design and Development:
- Lead the design and development of MDM solutions using Profisee, encompassing data models, workflows, business rules, and user interfaces.
- Translate business requirements into technical specifications and MDM solutions.
- Configure and customize the Profisee platform to align with specific business needs.
- Develop and implement data quality rules and processes within Profisee to uphold data accuracy and consistency.
- Design and execute data integration processes between Profisee and other enterprise systems (e.g., ERP, CRM, Data Warehouse) using diverse integration techniques (API, ETL, etc.).

Implementation and Deployment:
- Engage in the complete MDM implementation lifecycle, from requirements gathering to design, development, testing, deployment, and support.
- Create and execute test plans and scripts to validate the functionality and performance of the MDM solution.
- Troubleshoot and resolve issues pertaining to MDM data, processes, and infrastructure.
- Deploy and configure Profisee environments (development, test, production).

Data Governance and Stewardship:
- Contribute to formulating and enforcing data governance policies and procedures.
- Collaborate with data stewards to define data ownership and accountability.
- Help establish and update data dictionaries and metadata repositories.
- Ensure adherence to data privacy regulations and security policies.

Maintenance and Support:
- Monitor the performance and stability of the MDM environment.
- Offer ongoing support and maintenance for the MDM solution, encompassing bug fixes, enhancements, and upgrades.
- Develop and maintain documentation for MDM processes, configurations, and procedures.
- Proactively identify and resolve potential issues related to data quality and MDM performance.

Collaboration and Communication:
- Work closely with business users, IT staff, and other stakeholders to understand data requirements and implement efficient MDM solutions.
- Communicate proficiently with both technical and non-technical audiences.
- Participate in project meetings and provide regular status updates.
- Mentor and train junior team members on MDM best practices and the Profisee platform.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will have a pivotal role in implementing and embracing the data governance framework at Amgen, which aims to revolutionize the company's data ecosystem and establish Amgen as a pioneer in biopharma innovation. The position makes use of cutting-edge technologies such as Generative AI, Machine Learning, and integrated data. Your domain expertise, technical knowledge, and understanding of business processes will be crucial in providing exceptional support for Amgen's data governance framework. Collaboration with business stakeholders and data analysts will be essential to ensure successful implementation and adoption of the framework, and you will work closely with the Product Owner and other Business Analysts to guarantee operational support and excellence from the team.

Responsibilities:
- Implement the data governance and data management framework within a specific domain of expertise, such as Research, Development, or Supply Chain.
- Operationalize the enterprise data governance framework and align a broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, master data management, data sharing, communication, and change management.
- Collaborate with Enterprise MDM and Reference Data teams to enforce standards and data reusability.
- Drive cross-functional alignment in your area of expertise to ensure adherence to data governance principles.
- Maintain privacy policies and procedures to safeguard sensitive data and ensure compliance, conducting regular privacy risk assessments and audits to identify and mitigate potential risks as required.
- Maintain documentation on data definitions, data standards, data flows, legacy data structures, common data models, and data harmonization for the assigned domains.
- Ensure compliance with data privacy, security, and regulatory policies for the assigned domains, including GDPR, CCPA, and other relevant legislation.
- Together with technology teams, business functions, and enterprise teams, define the specifications shaping the development and implementation of data foundations, and build strong relationships with key business leads and partners to ensure their needs are met.

Must-have functional skills:
- Technical knowledge of pharma processes with specialization in a domain.
- In-depth understanding of data management, data quality, master data management, data stewardship, and data protection, and familiarity with data protection laws and regulations.
- Experience in the development life cycle of data products and proficiency in tools like Collibra and Alation.
- Strong problem-solving skills, excellent communication, and experience working with data governance frameworks.

Good-to-have functional skills:
- Experience with data governance councils and Agile software development methodologies.
- Proficiency in data analysis and quality tools.
- 3-5 years of experience in data privacy or compliance.

Soft skills required include integrity, adaptability, proactivity, leadership, organization, and analytical skills; the ability to work effectively with teams, manage multiple priorities, and build business relationships; ambition to develop your skills and career; an understanding of end-to-end data use and needs; and interpersonal skills, initiative, self-motivation, presentation skills, attention to detail, time management, and customer focus.

Basic qualifications: any degree and 9-13 years of experience.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

vadodara, gujarat

On-site

Wipro Limited is a leading technology services and consulting company dedicated to creating innovative solutions that cater to the most complex digital transformation needs of clients. Our extensive portfolio includes consulting, design, engineering, and operations capabilities, empowering clients to achieve their ambitious goals and establish sustainable businesses for the future. With a workforce of over 230,000 employees and business partners spanning 65 countries, we are committed to helping our customers, colleagues, and communities thrive in an ever-evolving world. For more information, please visit www.wipro.com.

Responsibilities:
- Develop and maintain applications using ABAP Workbench, ensuring the delivery of robust and scalable solutions.
- Customize SAP HR schemas and rules to align with specific business processes and compliance requirements.
- Apply Object-Oriented Programming (OOP) principles within ABAP to enhance modularity and maintainability.
- Design and manage Data Dictionaries, and create Reports and Forms for HR data processing and presentation.
- Integrate and query SQL databases for data extraction, transformation, and reporting.

Requirements:
- Proficiency in ABAP Workbench development.
- Experience with SAP customization, including Schemas and Rules.
- Background in Object-Oriented Programming (OOP) / ABAP / Software Engineering.
- Familiarity with Data Dictionaries, Reports, and Forms.
- Knowledge of SAP customization related to Wage Types.
- Experience working with SQL databases.

Nice to have:
- Exposure to SAP Fiori.
- Understanding of NoSQL databases.
- Knowledge of MS Azure Databricks / Azure Data Factory.
- Familiarity with Scaled Agile methodologies.
- Experience with GitLab, TDD, DevOps, and CI/CD pipelines.

Equal Opportunity Employer: Wipro IT Services Poland Sp. z o.o. upholds the principles of equal opportunity employment. We are committed to fostering a diverse and inclusive workplace environment.

Internal Reporting and Whistleblower Protection: Wipro IT Services Poland Sp. z o.o. complies with Internal Reporting and Whistleblower Protection Regulations. Candidates can submit internal reports via email to ombuds.person@wipro.com, through the Internet at www.wiproombuds.com, or by post/courier to Wipro's registered office.

Mandatory Skills: SAP ABAP
Experience: 5-8 Years

Reinvent your world. Join us at Wipro as we build a modern digital transformation partner with bold ambitions. We are seeking individuals inspired by reinvention, eager to evolve themselves, their careers, and their skills. At Wipro, we embrace change as a fundamental part of our DNA. Be a part of a purpose-driven business that empowers you to craft your own reinvention journey. Realize your ambitions with us. We welcome applications from individuals with disabilities.

Posted 2 weeks ago

Apply

13.0 - 17.0 years

0 Lacs

punjab

On-site

You will lead and support enterprise master data programs to deliver Bunge's strategic initiatives covering Digital programs, Data Integration, and S4 HANA implementation for the Finance data domain. As the Global Lead - Finance and Local Techno-Functional Lead for Core Finance Master Data, you will be accountable for developing business solutions, ensuring data quality, and successfully implementing solutions across all geographic regions and Bunge's businesses. Your role will involve driving alignment across multiple business functional areas to define and execute project scope and deliverables.

As a techno-functional expert in Master Data Management for Finance data types such as Cost Center, GL, Profit Center, and Company Code, you will collaborate with Bunge stakeholders globally from Business, IT, and other areas to define and achieve mutually agreed outcomes in the master data domains. You will work closely with Business Subject Matter Experts, Business Data Owners, business functional area leaders, IT teams, Solution Architects, Bunge Business Services leaders, Delivery Partner teams, and Enterprise Architecture (IT) to ensure seamless execution of integrated technology solutions aligned with business needs.

Key Functions:
- Own end-to-end business requirements, engaging with the business to gather requirements and define project scope and deliverables.
- Lead business UAT and manage the scope and deliverables of strategic projects.
- Drive implementation of master data solutions.
- Build relationships with internal and external service providers and guide project teams.
- Maintain an in-depth understanding of processes, and create and maintain data policies.
- Lead Continuous Improvement initiatives.

To be successful in this role, you should have a minimum of 13-15 years of professional data management experience, including at least 8-10 years of delivering business solutions and hands-on experience with SAP HANA and MDG/MDM. You should have strong leadership skills, experience managing project teams, and the ability to work in a virtual team across different locations and time zones. Knowledge of technologies such as SAP MDG, S4 HANA, Data Lake, Data Model, and MDM will be beneficial for this role.

Posted 3 weeks ago

Apply

12.0 - 15.0 years

35 - 40 Lacs

Gurugram, Bengaluru, Mumbai (All Areas)

Work from Office

Role: Data Solution Architect
Location: Bangalore, Chennai, Hyderabad, Kochi, Mumbai, Kolkata, Noida, Gurgaon (any office, but the candidate must be local)
Experience: 12-15 Years
Budget: 35-40 LPA
Only Immediate Joiners

Role Objectives:
- Design and maintain enterprise data architecture to support the analysis and interpretation of complex datasets from diverse sources, transforming them into actionable business insights.
- Lead deep technical investigations into complex data challenges to define solution architecture, propose target-state models, and identify key benefits.
- Apply industry-standard templates, methodologies, and best practices for data architecture, integration, and governance.
- Implement enterprise data management capabilities including data catalogues, business glossaries, metadata management, data lineage, and reference/master data frameworks.
- Drive adoption of cloud-native platforms such as Azure Data Platform, Informatica, and automation tools for scalable data solutions.
- Architect automated data processing workflows using tools like Power Automate, Python, macros, and custom scripts to streamline data operations.
- Identify and implement data architecture optimizations for efficiency, performance, and scalability.
- Collaborate with cross-functional teams to enable data-driven decision-making through architectural guidance and scalable data solutions.
- Enforce and promote data management best practices and architectural governance standards.

Stakeholder Management:
- Partner with Product Owners and Data Governance Leaders to promote adoption and alignment of data architecture with business objectives.
- Ensure data solutions are understandable and usable by technical and non-technical stakeholders.
- Act as a data architecture evangelist, capturing feedback and driving continuous improvement.

Governance and Compliance:
- Operate in alignment with organizational policies, data standards, and regulatory requirements.
- Ensure adherence to enterprise data governance frameworks, policies, and audit controls.

Essential Skills:
- Strong experience as a Data Solution Architect in Financial Services (banking, superannuation, insurance, etc.) and expertise in customer data architecture.
- Proficiency in data architecture, data modeling, data warehouses, data lake design, and advanced analytics platforms.
- Experience in data transformation logic, data ingestion patterns, and data quality frameworks.
- Expertise in documenting and aligning business data requirements with technical solutions.
- Strategic thinker with strong collaboration and leadership capabilities.
- Excellent communication skills, translating business objectives into scalable data solutions.
- Project and stakeholder management, ensuring architectural deliverables stay on track.
- Proven ability to manage multiple concurrent data architecture initiatives.

Specific Skills Candidates Should Possess and Demonstrate:
- Ability to lead and operate independently, while collaborating with business users, architects, data engineers, testers, and analysts.
- Strong understanding of business workflows and enterprise systems, and of gathering technical and business data requirements.
- Proficiency in creating data architecture artifacts, including ERDs, interface specs, data dictionaries, transformation logic, and lineage diagrams.
- Hands-on experience with complex SQL, ETL pipelines, and cloud-based data platforms.

Additionally, it would be advantageous if candidates can:
- Develop Power BI semantic models and design visual analytics dashboards for key business metrics.
- Evaluate and optimize data warehouse architecture and cloud migration strategies.
- Create data quality frameworks and reconciliation strategies across enterprise data sources.
- Lead or contribute to data governance initiatives and architectural enhancements.
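Among the architecture artifacts this role names are data dictionaries and lineage diagrams. As a hedged sketch of one common approach (the `customer` table and its columns are invented for illustration; SQLite's `PRAGMA table_info` stands in for a warehouse's information schema), a basic data dictionary can be generated straight from a live schema rather than maintained by hand:

```python
import sqlite3

# Build a toy schema to profile; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        full_name   TEXT NOT NULL,
        dob         DATE
    )
""")

def data_dictionary(conn, table):
    """Return one row per column: (name, declared type, nullable?)."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
    return [(name, ctype, not notnull) for _, name, ctype, notnull, _, _ in cols]

for name, ctype, nullable in data_dictionary(conn, "customer"):
    print(f"{name:12} {ctype:8} nullable={nullable}")
```

Generating the artifact from the schema keeps the dictionary from drifting out of sync with the database, which is the usual failure mode of hand-maintained documents.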

Posted 3 weeks ago

Apply

7.0 - 12.0 years

22 - 27 Lacs

Pune

Hybrid

Job Title: AVP Data Designer
Location: Pune
Package: up to 27 LPA

Key Responsibilities:
- Translate business data needs into scalable data models, schemas, and flows.
- Lead the design and implementation of logical and physical data models across platforms.
- Conduct data profiling and quality analysis to ensure data integrity.
- Collaborate with cross-functional teams to define data requirements and ensure smooth integration with existing systems.
- Maintain and update metadata, data dictionaries, and design specifications.
- Support the bank's Data & Analytics strategy by enabling use-case-driven data solutions.
- Ensure data solutions comply with governance, risk, and security frameworks.
- Optimize data structures for performance, scalability, and business insight generation.

Must-Have Skills:
- 5-8 years of experience in data design, data modeling, or data architecture.
- Proficiency in SQL and working with databases like Oracle, MySQL, and SQL Server.
- Hands-on experience with Kafka, AWS, or other cloud/data streaming platforms.
- Strong understanding of data profiling, quality checks, and remediation.
- Excellent communication skills — ability to work with both technical and non-technical teams.

Nice-to-Have:
- Bachelor's degree in Data Science, Computer Science, or a related field.
- Knowledge of data warehousing and ETL concepts.
- Experience in the financial services or financial crime domain.
- Familiarity with data governance tools and frameworks.
- Exposure to tools like Power BI, Tableau, or data catalog platforms.

For more details, call Kanika on 9953939776 or email your resume to kanika@manningconsulting.in
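Data profiling, one of the must-have skills above, usually starts with simple per-column metrics such as row, null, and distinct counts. A minimal, self-contained sketch of that idea (the `txn` table and its rows are invented; this is not any particular bank's quality framework):

```python
import sqlite3

# Toy transactions table with deliberate gaps to profile.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (txn_id INTEGER, account TEXT, amount REAL);
    INSERT INTO txn VALUES (1, 'A-100', 250.0), (2, 'A-100', NULL),
                           (3, NULL, 75.5), (4, 'B-200', 75.5);
""")

def profile(conn, table, columns):
    """Per-column row count, null count, and distinct (non-null) count."""
    stats = {}
    for col in columns:
        total, nulls, distinct = conn.execute(
            f"SELECT COUNT(*), "
            f"       SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END), "
            f"       COUNT(DISTINCT {col}) FROM {table}"
        ).fetchone()
        stats[col] = {"rows": total, "nulls": nulls, "distinct": distinct}
    return stats

report = profile(conn, "txn", ["account", "amount"])
print(report)
```

Metrics like these feed the "quality checks and remediation" step: columns whose null rate or cardinality drifts from expectations get flagged for follow-up.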

Posted 3 weeks ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks.
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to standard methodologies for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help improve ETL platform performance.
- Participate in sprint planning meetings and provide estimates for technical implementation.
- Work with data engineers on data quality assessment, data cleansing, and data analytics.
- Share and discuss findings with team members, practicing the SAFe Agile delivery model.
- Work as a Data Engineer on a team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting the R&D functional area.
- Manage the Enterprise Data Lake on the AWS environment to ensure that service delivery is cost-effective and business SLAs around uptime, performance, and capacity are met.
- Proactively work on challenging data integration problems by implementing optimal ETL patterns and frameworks for structured and unstructured data.
- Automate and optimize data pipelines and frameworks for an easier and more cost-effective development process.
- Advise and support project teams (project managers, architects, business analysts, and developers) on cloud platforms (AWS, Databricks preferred), tools, technology, and methodology related to designing, building, and maintaining scalable and efficient Data Lake and other Big Data solutions.
- Stay up to date with the latest data technologies and trends.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.
- 3-5 years of experience in the Pharmaceutical Industry.
- 3-5 years of experience in Mulesoft development.
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning for big data processing.
- Hands-on experience with various Python/R packages for EDA, feature engineering, and machine learning model training.
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools.
- Excellent problem-solving skills and the ability to work with large, complex datasets.
- Solid understanding of data governance frameworks, tools, and standard methodologies.
- Knowledge of data protection and pharmaceutical regulations and compliance requirements (e.g., GxP, GDPR, CCPA).
- Demonstrated hands-on experience with the AWS cloud platform and its technologies, such as EC2, RDS, S3, Redshift, and IAM roles.
- Extensive hands-on experience with data ingestion methods such as Batch, API, and Streaming.
- Demonstrated experience performing data integrations using Mulesoft.
- Solid understanding of ETL, Data Modeling, and Data Warehousing concepts.
- Ability to work independently with little supervision.
- Ability to effectively present information to collaborators and respond to their questions.

Preferred Qualifications:
- Knowledge of clinical data in the pharmaceutical industry.
- Knowledge of the CT.gov and EUCTR.gov portals.
- Knowledge of the Disclose application from Citeline.
- Experience with ETL tools such as Apache Spark and various Python packages related to data processing and machine learning model development.
- Solid understanding of data modeling, data warehousing, and data integration concepts.
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.
- Proficiency with data orchestration tools like Kubernetes and Docker.
- Familiarity with SQL/NoSQL databases and vector databases for Large Language Models.

Professional Certifications:
- SAFe for Teams certification (preferred)
- Databricks certification (preferred)
- AWS Certified Data Engineer (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Shift Information: This position requires working a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts as required, based on business requirements.
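The responsibilities above revolve around ETL pipelines with data quality checks. A deliberately tiny extract-transform-load sketch using only the standard library (the file contents, column names, and the missing-value rule are invented; a production pipeline at this employer would use Spark/Databricks per the listing):

```python
import csv
import io
import sqlite3

# Extract: read a toy source file (an in-memory CSV stands in for a real feed).
raw = io.StringIO("subject_id,weight_kg\nS-01,70.5\nS-02,\nS-03,82.0\n")
rows = list(csv.DictReader(raw))

# Transform: apply a simple quality rule, dropping rows with a missing weight
# and casting the surviving values to the target type.
clean = [
    {"subject_id": r["subject_id"], "weight_kg": float(r["weight_kg"])}
    for r in rows if r["weight_kg"]
]

# Load: insert the cleaned rows into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE subjects (subject_id TEXT, weight_kg REAL)")
conn.executemany(
    "INSERT INTO subjects VALUES (:subject_id, :weight_kg)", clean
)
loaded = conn.execute("SELECT COUNT(*) FROM subjects").fetchone()[0]
print(f"loaded {loaded} of {len(rows)} rows")
```

Logging how many rows each quality rule rejects, as the final line hints at, is the kind of audit trail the listing's data governance requirements call for.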

Posted 2 months ago

Apply