
2451 Data Integration Jobs - Page 2


3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a member of the risk and compliance team at PwC, your primary focus will be on maintaining regulatory compliance and managing risks for clients. You will provide valuable advice and solutions to help organizations navigate complex regulatory landscapes and enhance their internal controls effectively. In the realm of enterprise risk management, your role will involve identifying and mitigating potential risks that could impact an organization's operations and objectives. You will play a crucial part in developing business strategies to manage and navigate risks in today's rapidly changing business environment.

Joining PwC Acceleration Centers (ACs) presents a unique opportunity to actively support various services, including Advisory, Assurance, Tax, and Business Services. Within our innovative hubs, you will engage in challenging projects and deliver distinctive services to enhance client engagements through quality and innovation. Moreover, you will participate in dynamic training programs designed to enhance your technical and professional skills.

As a part of the Enterprise Risk Management team, your responsibilities will include designing and implementing data-driven solutions to enhance decision-making processes. In the role of a Senior Associate, you will be tasked with developing interactive dashboards, creating data models, and collaborating with cross-functional teams to drive strategic initiatives and improve organizational performance.
Key Responsibilities:
- Design and implement data-driven solutions to support decision-making
- Develop interactive dashboards for visualizing key insights
- Enhance data models to improve performance and usability
- Collaborate with cross-functional teams to align on strategic initiatives
- Analyze data to derive insights that enhance organizational performance
- Utilize various tools and methodologies to solve complex problems
- Ensure the accuracy and integrity of data used in analyses
- Maintain a focus on continuous improvement in data processes

Requirements:
- Bachelor's degree
- 3 years of relevant experience
- Proficiency in oral and written English

Desired Skills:
- Proficiency in Power BI development and data visualization
- Experience in building and maintaining semantic data models
- Familiarity with data integration and ETL processes
- Effective collaboration with cross-functional teams
- Clear communication of status updates and test results
- Proficiency in SQL for data management and transformation
- Engagement in Agile methodologies and ceremonies

This role presents an exciting opportunity to contribute to risk management and decision-making processes while enhancing organizational performance through data-driven solutions and strategic initiatives.
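The desired skills above call for SQL for data management and transformation. As an illustrative sketch only (the `risk_events` table and its columns are invented for this example, not taken from the role), the common pattern of aggregating raw records into dashboard-ready metrics looks like:

```python
import sqlite3

# In-memory database standing in for a reporting data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE risk_events (unit TEXT, severity INTEGER)")
conn.executemany(
    "INSERT INTO risk_events VALUES (?, ?)",
    [("ops", 3), ("ops", 5), ("finance", 2), ("finance", 4), ("finance", 1)],
)

# A typical transformation: aggregate raw events into per-unit metrics
# that a dashboard (for example, Power BI) could consume.
rows = conn.execute(
    """
    SELECT unit, COUNT(*) AS events, AVG(severity) AS avg_severity
    FROM risk_events
    GROUP BY unit
    ORDER BY unit
    """
).fetchall()

for unit, events, avg_severity in rows:
    print(unit, events, round(avg_severity, 2))
```

In practice the same GROUP BY transformation would run against a warehouse table rather than an in-memory database, but the shape of the query is the same.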

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Tableau Architect, you will be responsible for leveraging your advanced expertise in Tableau products, including Tableau Desktop, Tableau Prep, and Tableau Server. Your strong background in data architecture, data modeling, and ETL processes will be essential in designing and modeling data for effective visualization. Proficiency in SQL and experience with relational databases such as SQL Server and Oracle will enable you to integrate data from multiple sources into Tableau for insightful analytics.

Your ability to optimize dashboard performance for large datasets and familiarity with cloud platforms such as AWS and Azure for data storage and analytics will be crucial in ensuring efficient data processing. Additionally, your experience in training junior analysts and mentoring team members, particularly in Pune, India, will play a key role in skill development and best-practice adherence. Collaboration with cross-functional teams, including remote support, will require strong communication skills to effectively troubleshoot technical issues and provide solutions for data visualization and analytics. Your proficiency in creating clear and comprehensive documentation and training materials for internal and external purposes will be vital in maintaining transparency and knowledge sharing.

As a Tableau Architect, your responsibilities will include leading and mentoring junior team members in Pune, designing and architecting scalable Tableau dashboards and data solutions, optimizing Tableau environments for efficiency, scalability, and performance, and training junior staff in data visualization best practices through hands-on workshops and one-on-one mentorship sessions.

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Senior Talend Developer at Sedin Technologies, you will be responsible for designing, developing, and deploying ETL processes using Talend for data integration. Your primary focus will be on analyzing business requirements and translating them into technical solutions, collaborating with data architects and analysts to build scalable data pipelines, optimizing existing ETL processes, and integrating data from various sources. You will also participate in code reviews, testing, and deployment activities, working closely with cross-functional teams in a remote, night-shift environment.

To excel in this role, you should have 8 to 12 years of IT experience with a minimum of 5 years in Talend development. Strong SQL skills, proficiency in working with large datasets, and knowledge of data warehousing concepts are essential. Experience with cloud platforms like AWS, Azure, or GCP, familiarity with other ETL tools and CI/CD processes, and problem-solving abilities in a fast-paced setting will be advantageous. Excellent communication skills and the flexibility to work during night shifts in US time zones are also required. Preferred qualifications include Talend certifications, experience with Talend Big Data or Talend Cloud, and exposure to Agile methodologies.

In return, we offer a 100% remote work opportunity, competitive salary, a collaborative and innovation-driven work culture, and continuous learning and growth prospects. Join us at Sedin Technologies to leverage your Talend expertise, contribute to cutting-edge ETL projects, and be part of a dynamic team committed to excellence and professional development.
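Talend jobs are assembled in its graphical designer rather than hand-coded, but the extract-transform-load pattern this role revolves around can be sketched in plain Python. The record fields and the reject-flow idea shown here are illustrative assumptions, not details from the posting:

```python
def extract():
    # Stand-in for reading from a source system (database, file, or API).
    return [
        {"id": 1, "amount": "120.50", "country": "in"},
        {"id": 2, "amount": "80.00", "country": "us"},
        {"id": 3, "amount": "bad", "country": "in"},
    ]

def transform(rows):
    # Cleanse and normalise; route unparseable rows to a reject list,
    # much like a reject flow in an ETL tool.
    clean, rejects = [], []
    for row in rows:
        try:
            clean.append({
                "id": row["id"],
                "amount": float(row["amount"]),
                "country": row["country"].upper(),
            })
        except ValueError:
            rejects.append(row)
    return clean, rejects

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)

warehouse = []
clean, rejects = transform(extract())
load(clean, warehouse)
print(len(warehouse), "loaded,", len(rejects), "rejected")
```

Separating the three stages like this is also what makes each step individually testable, which matters for the code-review and testing duties the posting mentions.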

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kerala

On-site

At EY, you will have the opportunity to shape a career that is as unique as you are. With our global reach, support, inclusive culture, and cutting-edge technology, you have the tools to become the best version of yourself. We value your distinctive voice and perspective, and we rely on it to help EY continuously improve. By joining us, you will not only create an exceptional experience for yourself but also contribute to building a better working world for all.

Your primary responsibilities and accountabilities will include:
- Demonstrating experience in implementing EPBCS cloud, with a strong background in Application Development processes on PBCS/EPBCS and expertise in consolidation/reconciliation processes.
- Possessing experience in various modules such as Finance, Workforce, and Capex.
- Having a solid grasp of data management and the ability to integrate with source systems directly.
- Hands-on experience in crafting complex business rules and Groovy scripting.
- Overseeing the successful implementation, integration, and management of Oracle Enterprise Performance Management (EPBCS) solutions.
- Leading a team of EPBCS specialists and collaborating with cross-functional teams to deliver comprehensive solutions aligned with business objectives.
- Working closely with Finance, IT, and business leaders to ensure the successful deployment and optimization of Oracle EPBCS solutions.
- Leading or participating in end-to-end implementations of Oracle EPBCS modules, including requirements gathering, design, configuration, testing, and deployment.
- Collaborating with stakeholders to understand business requirements and translating them into solutions following best practices and industry standards.
- Engaging with leadership, business unit heads, and key stakeholders to provide strategic guidance and align Oracle EPBCS initiatives with organizational objectives.
Your experience should include:
- A proven track record of at least 5 years in customer-facing implementation projects, particularly in EPBCS.
- Solid knowledge and experience in leading the technical implementation of EPBCS tools such as Oracle Cloud EPBCS, PCMCS, and Narrative Reporting.
- Genuine passion for supporting customers in their digital finance transformation journey.

Key competencies and skills required for this role are:
- Strong leadership and team management skills to motivate and guide team members.
- Effective customer handling skills and the ability to lead and mentor team members.
- Project management skills to lead Oracle EPBCS implementation projects within defined timelines and budgets.
- Strong communication skills to translate requirements into design documents.
- Excellent organizational, time management, analytical, and problem-solving skills.

You should be a graduate from a reputable educational institution, preferably with a background in finance. Possession of an Oracle certification is considered an added advantage. Additionally, you must have a valid passport for business travel that may involve work at client sites.

Join EY in building a better working world where long-term value is created for clients, people, and society, and trust is built in the capital markets. Our diverse teams in over 150 countries, enabled by data and technology, provide assurance and help clients grow, transform, and operate across various sectors including assurance, consulting, law, strategy, tax, and transactions. At EY, we ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

About SKF
SKF started its operations in India in 1923. Today, SKF provides industry-leading automotive and industrial engineered solutions through its five technology-centric platforms: bearings and units, seals, mechatronics, lubrication solutions, and services. Over the years, the company has evolved from being a pioneer ball bearing manufacturing company to a knowledge-driven engineering company helping customers achieve sustainable and competitive business excellence. SKF's solutions provide sustainable ways for companies across the automotive and industrial sectors to achieve breakthroughs in friction reduction, energy efficiency, and equipment longevity and reliability. With a strong commitment to research-based innovation, SKF India offers customized value-added solutions that integrate all its five technology platforms. To know more, please visit: www.skf.com/in

SKF Purpose Statement
Together, we re-imagine rotation for a better tomorrow by creating intelligent and clean solutions for people and the planet.

Job Description
Position Title: Solution Owner
Reports To: Digital Lead PX, Legal & Sustainability
Role Type: Individual Contributor
Location: Bangalore

Purpose of the Role
As a Solution Owner, you will be responsible for gathering business requirements from the Sustainability team, focusing on Data Warehousing development, and creating a roadmap to deliver it on time to ensure the final product meets these requirements. Collaboration with the Data Management Team, engineers, architects, and the Sustainability Functional team is essential to track the progress of development and ensure timely delivery of solutions. Additionally, you will provide technical guidance and mentorship to the team while working with stakeholders across the organization to understand and address their needs effectively.

Job Responsibilities
- Own and drive end-to-end sustainability-focused data analytics solutions, ensuring alignment with business and regulatory requirements.
- Engage with stakeholders across business, IT, and sustainability teams to understand needs, define priorities, and drive solution implementation.
- Develop and oversee data analytics frameworks using Power BI and other visualization tools to support sustainability reporting and decision-making.
- Lead Agile teams, ensuring adherence to Agile best practices such as Scrum, SAFe, or Kanban.
- Manage the solution backlog, prioritize features, and ensure timely delivery of high-value outcomes.
- Ensure data accuracy and integrity, working closely with data engineers and analysts to implement robust data governance and quality frameworks.
- Monitor performance metrics and continuously improve solutions based on user feedback and business impact.

Skills and Knowledge Requirements
- 10+ years of experience in solution ownership, data analytics, and sustainability initiatives.
- Experience in working with various data sources and formats, such as relational databases, NoSQL databases, flat files, and web services.
- Experience in working with various data integration patterns and scenarios, such as batch, real-time, event-driven, and streaming.
- Knowledge of various data analytics tools and platforms, such as Informatica, Snowflake, and dbt.
- Experience in working with various cloud services and platforms, such as AWS, Azure, and GCP.
- Experience in Agile methodologies, backlog management, and iterative delivery.
- Experience in IT project management.
- Excellent communication, collaboration, and problem-solving skills.
- Ability to work independently and as part of a team.

Nice to Have
- ITIL and PMP certifications

Education
- Bachelor's or master's degree in computer science, engineering, or a related field.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate for this position in Noida (work from office) should have 2-6 years of experience and possess the following skills and qualifications.

You will be part of a team analyzing millions of merchants' device data to gain valuable insights. The key requirements for this role include:
- Advanced SQL experience.
- Basic knowledge of Python.
- Working experience with Google Looker or another visualization tool.
- Prior experience with automation projects related to data integration, extraction, and visualization.
- Advanced Excel knowledge.

Additionally, it is crucial to have strong analytics and critical thinking skills. Experience with Looker Studio for visualization is a plus.

To succeed in this role, the candidate should have:
- A Bachelor's degree in Computer Science, Engineering, or a related field.
- At least 2 years of experience as a Product Analyst, preferably in the financial services or technology industry.
- Strong analytical and problem-solving skills to understand complex business problems and translate them into technical requirements.
- Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
- Experience in market research and analysis would be advantageous.
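As a rough sketch of the kind of device-data analysis described above (the merchant names, device types, and fields are invented for illustration), basic Python aggregation of the sort that feeds a Looker dashboard might look like:

```python
from collections import Counter
from statistics import mean

# Toy sample standing in for merchants' device-level records.
events = [
    {"merchant": "m1", "device": "pos", "latency_ms": 120},
    {"merchant": "m1", "device": "pos", "latency_ms": 95},
    {"merchant": "m2", "device": "qr",  "latency_ms": 40},
    {"merchant": "m2", "device": "pos", "latency_ms": 60},
]

# Counts by device type -- the kind of cut a dashboard would chart.
by_device = Counter(e["device"] for e in events)

# Average latency per merchant.
avg_latency = {
    m: mean(e["latency_ms"] for e in events if e["merchant"] == m)
    for m in {e["merchant"] for e in events}
}
print(by_device, avg_latency)
```

At the scale the posting mentions, the same aggregations would be pushed down into SQL rather than done in Python, which is why the listing pairs the two skills.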

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Bhubaneswar

On-site

The Informatica Master Data Management (MDM) Expert plays a critical role in the organization by ensuring the integrity, consistency, and accuracy of master data across all business units. This position is essential for driving data governance initiatives and for supporting various data integration and management processes. As an MDM Expert, you will leverage your knowledge of Informatica tools to develop and implement MDM strategies that align with organizational goals. You will collaborate with cross-functional teams, providing expertise in data modeling, quality management, and ETL processes. This role requires a deep understanding of master data concepts as well as the ability to address complex data challenges, ensuring reliable data inputs for analytical and operational needs.

In addition, you'll drive improvements in data processes, lead troubleshooting efforts for MDM-related incidents, and train other team members in best practices. Your contributions will not only enhance data quality but will also support strategic decision-making and business outcomes across the organization.

Key Responsibilities
- Design and implement Informatica MDM solutions according to business requirements.
- Lead the development of data governance frameworks and best practices.
- Integrate MDM with existing data management and analytics solutions.
- Collaborate with IT and business stakeholders to gather requirements.
- Perform data profiling and analysis to ensure governance standards are met.
- Develop and maintain data quality metrics and KPIs.
- Document data management processes, data flows, and MDM-related architecture.
- Provide troubleshooting support for MDM incidents and data discrepancies.
- Facilitate data model design and validation with stakeholders.
- Conduct training sessions for users on MDM tools and procedures.
- Stay current with industry trends and best practices in MDM.
- Coordinate with ETL teams to ensure smooth data integration.
- Manage ongoing MDM projects, ensuring timely delivery and quality.
- Support audit and compliance efforts related to data governance.
- Enhance and optimize existing MDM processes for efficiency.

Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data management, with a focus on MDM.
- Proven expertise in Informatica MDM and the Informatica toolset.
- Strong understanding of data governance principles and practices.
- Proficiency in SQL and relational database management.
- Experience with data modeling concepts and best practices.
- Knowledge of ETL processes and tools, particularly Informatica PowerCenter.
- Familiarity with XML and data transformation techniques.
- Prior experience with cloud-based data solutions is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal abilities.
- Ability to train and mentor junior team members.
- Hands-on experience with data quality tools and methodologies.
- Strong organizational skills with the ability to manage multiple projects.
- Experience in Agile project management methodologies.
- Relevant certifications in Informatica or data governance are desirable.

Skills: MDM, Informatica MDM, Informatica, master data, data management, data governance, data modeling, data integration, data quality, data quality metrics, data profiling, data transformation techniques, ETL, ETL processes, SQL, relational database management, cloud-based data solutions, Agile project management methodologies, analytical skills, problem-solving skills, communication skills, interpersonal skills, organizational skills
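Among the responsibilities above are developing data quality metrics and performing data profiling. A minimal, tool-agnostic sketch of two such metrics, completeness and uniqueness (the sample customer records are invented; in this role, Informatica's own profiling tools would normally compute these), could look like:

```python
def profile(rows, column):
    """Basic completeness and uniqueness metrics for one attribute."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        # Share of records where the attribute is populated.
        "completeness": len(non_null) / len(values),
        # Share of populated values that are distinct.
        "uniqueness": len(set(non_null)) / len(non_null) if non_null else 0.0,
    }

customers = [
    {"customer_id": "C1", "email": "a@x.com"},
    {"customer_id": "C2", "email": ""},
    {"customer_id": "C3", "email": "a@x.com"},
    {"customer_id": "C4", "email": "b@x.com"},
]
metrics = profile(customers, "email")
print(metrics)
```

Tracked over time per attribute, metrics like these become the KPIs the posting asks the MDM Expert to maintain.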

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Kolkata, West Bengal

On-site

About KPMG in India
KPMG entities in India are professional services firm(s) affiliated with KPMG International Limited, established in August 1993. Our professionals leverage the global network of firms, understanding local laws, regulations, markets, and competition. With offices across India in cities like Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, KPMG entities in India offer services to national and international clients across different sectors. We aim to provide rapid, performance-based, industry-focused, and technology-enabled services, reflecting a shared knowledge of global and local industries and our experience in the Indian business environment.

As a member of the software development team at KPMG India, you will work on a variety of projects in a highly collaborative, fast-paced environment. You will be responsible for the full life cycle of the software development process, including developing code, unit testing, and working closely with Technical Architects, Business Analysts, user interaction designers, and other software engineers to create new product offerings and enhance existing ones. Additionally, you will ensure that all development practices adhere to KPMG's best practices policies and procedures, requiring quick adaptation to new technologies when necessary.

Role: Intelligent Data Management Cloud (IDMC)
Location: Kolkata and Bangalore
Experience: 4 to 6 years

Responsibilities:
- Data Integration: Oversee the integration of data across platforms and environments using IDMC tools.
- Data Quality: Ensure data accuracy, consistency, and reliability through IDMC's data quality solutions.
- Data Governance: Implement and manage data governance policies to maintain data integrity and compliance.
- AI and Machine Learning: Utilize AI and machine learning capabilities within IDMC for enhanced data management processes.
- Collaboration: Work closely with cross-functional teams to understand data needs and provide solutions.
- Performance Monitoring: Monitor and optimize the performance of data management systems.
- Training and Support: Provide training and support to team members on IDMC tools and best practices.

Requirements:
- Experience: Demonstrated experience in data management, data integration, and data governance.
- Technical Skills: Proficient in Informatica IDMC, cloud platforms (AWS, Azure, Google Cloud), and data management tools.
- Analytical Skills: Strong analytical and problem-solving abilities.
- Communication: Excellent communication and interpersonal skills.
- Certifications: Relevant certifications in data management or Informatica tools are a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Equal Opportunity Employer

Posted 6 days ago

Apply

14.0 - 18.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Technical Architect at Salesforce Professional Services, you will play a crucial role in serving as a strategic advisor and Salesforce product and platform expert to the company's largest and most complex enterprise customers. Your responsibilities will include being a trusted advisor to the client, leading internal strategic initiatives to grow the consulting practice, and guiding customers and colleagues in deploying emerging technologies for increased business value. You will collaborate closely with Delivery Managers, Solution Architects, and clients to architect technology solutions that meet client needs. Additionally, you will lead the technical architecture team for enterprise-level customer engagements and participate in pre-sales activities such as discovery sessions and Proof-of-Concept development with prospects.

To excel in this role, you should have a minimum of 14 years of enterprise architecture or consulting experience, strong application design skills, and expertise in data, integration, and security architecture. Your presentation skills should be top-notch, and you should be able to communicate effectively with diverse audiences. A detail-oriented approach, rapid learning ability, and innovative problem-solving skills are essential for success. Preferred qualifications include a Bachelor's degree in Computer Science, Engineering, or a related quantitative discipline. Certifications such as Application Architect, System Architect, or CTA are optional but beneficial. Experience in Field Service implementation is preferred.

As a key member of the Salesforce Professional Services team, you will embody the core values of trust, collaboration, and effective communication. Your leadership skills, ability to build strong relationships, and thought leadership will contribute to the success of the team and the organization as a whole. Join Salesforce today to unleash your potential and be limitless in all areas of your life. Our benefits and resources will support you in finding balance and excelling in your role. Together, we will bring the power of Agentforce to organizations of all sizes and deliver exceptional customer experiences. Shape the future and redefine what's possible for yourself, AI, and the world by applying now.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Tamil Nadu

On-site

The Finance Data Steward is responsible for supporting the Finance Data Services team's governance and operations. You will be involved in measuring and reporting master data consumption, consolidated reports, volumes (BVI), process performance, and quality metrics (KPIs) for all the relevant finance data objects covered by the team, such as cost centers, project WBS, GL accounts, and internal business partners. Your role will require close collaboration with the finance data team, including data stewards, data SMEs, data maintainers, reporting teams, data curators, business analysts, and other stakeholders.

You will be responsible and accountable for gathering data requirements, setting up the data model design, architecture, and documentation, and developing scalable data models for finance data objects via Power BI dashboards, SQL programming, Power Automate, and other data analysis and processing tools. You will connect and integrate various data sources to create a unified view of Finance and Business Partner master data consolidated reports, volumes, process performance, and quality metrics, and design and implement data models and transformations to prepare that master data for performance and quality analysis, reporting, consolidation, and visualization. You will build interactive and insightful dashboards using data visualization tools such as Power BI, SQL, Power Automate, Databricks, and Azure.

Additionally, you will manage and execute technical activities and projects for Finance and Business Partner master data analytics use cases, prepare schedules and technical activity plans, seek cost-efficient solutions, write functional and technical specifications, proactively identify new data insights, maintain and improve current dashboards, work closely with business stakeholders, adhere to data governance policies, and develop and implement data quality checks and validation processes.
Key Skills and Experience:
- Deep understanding of master data management principles, processes, and tools, including data governance, data quality, data cleansing, and data integration.
- Programming and data visualization skills: knowledge of Power BI dashboards, data flows, and design (advanced), SQL, Databricks, Power Automate, HTML, Python, SharePoint, etc.
- Experience with data repositories such as EDP and Azure.
- Excellent written and oral communication in English.
- Hands-on experience with data analytics tools.
- Problem-solving aptitude and an analytical mindset.
- Effective networking capabilities; comfortable in a multicultural environment and virtual teams.
- Team player.

Join Nokia, a company committed to innovation and technology leadership across mobile, fixed, and cloud networks. Your career here will have a positive impact on people's lives and help build capabilities for a more productive, sustainable, and inclusive world. Nokia offers continuous learning opportunities, well-being programs, opportunities to join employee resource groups, mentoring programs, and highly diverse teams with an inclusive culture where people thrive and are empowered. Nokia is committed to inclusion and is an equal opportunity employer.

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

As a Senior Consultant Data Analyst/Data Modeler at Capco, a Wipro company, you will be part of a global technology and management consulting firm recognized for its deep transformation execution and delivery. With a presence in over 32 cities across the globe and 100+ clients in the banking, financial, and energy sectors, Capco offers you the opportunity to make a significant impact by providing innovative thinking, delivery excellence, and thought leadership to help clients transform their business.

You will be responsible for data warehousing migration programs involving cross-geography and multi-functional delivery, ensuring successful project delivery by aligning project timelines and providing support for data analysis, mapping, and profiling. Your role will include data requirement gathering, analysis, and documentation, mapping data attributes from different source systems to target data models, and interpreting use case requirements for the design of target data models/data marts. Additionally, you will profile data attributes to assess data quality, ensure compliance with data architecture principles, and perform data modeling for better data integration within the data warehouse platform.

Working closely with squad members, stakeholders, and internal development teams, you will manage different stakeholders to ensure project delivery aligns with the timeline for each milestone. You will analyze user requirements, profile data, and finalize requirements for delivery, transforming data requirements into data models through design and modeling in alignment with data warehousing standards and processes. Your responsibilities will also include creating data mapping templates, profiling data to assess quality, supporting data store inbound/outbound development activities, and providing guidance to the development team.

Moreover, you will participate in key decision-making discussions, perform SIT, support UAT, manage change requests effectively, align with bank processes and standards, and deliver functional documentation to the development team while collating requirements from stakeholders. You will also ensure alignment with the Data Quality Management Framework, including data management through lineage documentation and data controls to ensure data quality.

Joining Capco will give you the opportunity to work on engaging projects with some of the largest banks globally, transforming the financial services industry. You will be part of a work culture focused on innovation, with ongoing learning opportunities and a non-hierarchical structure that enables you to work with senior partners and clients directly in a diverse, inclusive, meritocratic culture.
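The data mapping templates this role produces pair each source attribute with a target model attribute and a transformation rule. A hypothetical sketch in Python, where every system, table, and column name is invented purely to show the shape of such a mapping:

```python
# Each mapping row links a source attribute to a target-model attribute
# with a named transformation rule, as in a data mapping document.
MAPPING = [
    {"source": "CRM.cust_nm",  "target": "DIM_CUSTOMER.customer_name", "rule": "trim"},
    {"source": "CRM.cntry_cd", "target": "DIM_CUSTOMER.country_code",  "rule": "upper"},
]

RULES = {"trim": str.strip, "upper": str.upper}

def apply_mapping(source_row):
    """Transform one source record into its target-model shape."""
    out = {}
    for m in MAPPING:
        src_field = m["source"].split(".")[1]
        tgt_field = m["target"].split(".")[1]
        out[tgt_field] = RULES[m["rule"]](source_row[src_field])
    return out

result = apply_mapping({"cust_nm": "  Acme Ltd ", "cntry_cd": "sg"})
print(result)
```

Keeping the mapping as data rather than code is what lets the same template drive documentation, profiling checks, and the development team's ETL build.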

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position is an intermediate-level role in which you will contribute to the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to assist in applications systems analysis and programming activities.

You will utilize your knowledge of applications development procedures and concepts, along with basic knowledge of technical areas, to identify and define necessary system enhancements. This includes using script tools, analyzing code, and consulting with users, clients, and other technology groups to recommend programming solutions. Additionally, you will install and support customer exposure systems and apply fundamental knowledge of programming languages for design specifications.

As an Intermediate Programmer Analyst, you will analyze applications to identify vulnerabilities and security issues, conduct testing and debugging, and serve as an advisor or coach to new or lower-level analysts. You will be responsible for identifying problems, analyzing information, and making evaluative judgments to recommend and implement solutions. Operating with a limited level of direct supervision, you will exercise independence of judgment and autonomy while acting as a subject matter expert to senior stakeholders and/or other team members.

In this role, it is crucial to appropriately assess risk when making business decisions, with a focus on safeguarding Citigroup, its clients, and assets. This includes driving compliance with applicable laws, rules, and regulations, adhering to policies, applying sound ethical judgment, and escalating, managing, and reporting control issues with transparency.
Qualifications:
- 4-6 years of proven experience in developing and managing Big Data solutions using Apache Spark and Scala is required
- Strong programming skills in Scala, Java, or Python
- Hands-on experience with technologies like Apache Hive, Apache Kafka, HBase, Couchbase, Sqoop, Flume, etc.
- Proficiency in SQL and experience with relational databases (Oracle/PL-SQL)
- Experience in working on Kafka, JMS/MQ applications
- Familiarity with data warehousing concepts and ETL processes
- Knowledge of data modeling, data architecture, and data integration techniques
- Experience with Java, web services, XML, JavaScript, microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Experience with developing frameworks and utility services, logging/monitoring, and high-quality software delivery
- Experience creating large-scale, multi-tiered, distributed applications with Hadoop and Spark
- Profound knowledge of implementing different data storage solutions such as RDBMS, Hive, HBase, Impala, and NoSQL databases

Education:
- Bachelor's degree or equivalent experience

This job description provides a high-level overview of the responsibilities and qualifications for the Applications Development Intermediate Programmer Analyst position. Other job-related duties may be assigned as required.
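The core of the Spark work described above is distributed keyed aggregation (the shape of a `reduceByKey` or `groupBy().agg()`). As a framework-agnostic sketch in plain Python, with invented sample data standing in for a cluster-scale dataset:

```python
from collections import defaultdict

def aggregate_by_key(records):
    """Sum values per key - the same computation shape a Spark
    reduceByKey performs in parallel across partitions."""
    totals = defaultdict(int)
    for key, value in records:
        totals[key] += value
    return dict(totals)

# Hypothetical event stream of (event_type, count) pairs.
events = [("click", 3), ("view", 10), ("click", 2), ("purchase", 1)]
print(aggregate_by_key(events))  # {'click': 5, 'view': 10, 'purchase': 1}
```

In Spark the same logic would be expressed declaratively and executed across executors; the sketch only shows the aggregation semantics, not the distribution.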

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

At H.E. Services' vibrant tech center in Hyderabad, you will have the opportunity to contribute to technology innovation for Holman Automotive, a leading American fleet management and automotive services company. Our goal is to continue investing in people, processes, and facilities to ensure expansion in a way that allows us to support our customers and develop new tech solutions. Holman has come a long way during its first 100 years in business. The automotive markets Holman serves include fleet management and leasing; vehicle fabrication and upfitting; component manufacturing and productivity solutions; powertrain distribution and logistics services; commercial and personal insurance and risk management; and retail automotive sales as one of the largest privately owned dealership groups in the United States. Join us and be part of a team that's transforming the way Holman operates, creating a more efficient, data-driven, and customer-centric future.

Roles & Responsibilities:
- Design, develop, and maintain data pipelines using Databricks, Spark, and other Azure cloud technologies.
- Optimize data pipelines for performance, scalability, and reliability, ensuring high speed and availability of the data warehouse.
- Develop and maintain ETL processes using Databricks and Azure Data Factory for real-time or trigger-based data replication.
- Ensure data quality and integrity throughout the data lifecycle, implementing new data validation methods and analysis tools.
- Collaborate with data scientists, analysts, and stakeholders to understand and meet their data needs.
- Troubleshoot and resolve data-related issues, providing root cause analysis and recommendations.
- Manage a centralized data warehouse in Azure SQL to create a single source of truth for organizational data, ensuring compliance with data governance and security policies.
- Document data pipeline specifications, requirements, and enhancements, effectively communicating with the team and management.
- Leverage AI/ML capabilities to create innovative data science products.
- Champion and maintain testing suites, code reviews, and CI/CD processes.

Must Have:
- Strong knowledge of Databricks architecture and tools.
- Proficient in SQL, Python, and PySpark for querying databases and data processing.
- Experience with Azure Data Lake Storage (ADLS), Blob Storage, and Azure SQL.
- Deep understanding of distributed computing and Spark for data processing.
- Experience with data integration and ETL tools, including Azure Data Factory.

Advanced-level knowledge and practice of:
- Data warehouse and data lake concepts and architectures.
- Optimizing performance of databases and servers.
- Managing infrastructure for storage and compute resources.
- Writing unit tests and scripts.
- Git, GitHub, and CI/CD practices.

Good to Have:
- Experience with big data technologies, such as Kafka, Hadoop, and Hive.
- Familiarity with Azure Databricks Medallion Architecture with DLT and Iceberg.
- Experience with semantic layers and reporting tools like Power BI.

Relevant Work Experience:
- 5+ years of experience as a Data Engineer, ETL Developer, or similar role, with a focus on Databricks and Spark.
- Experience working on internal, business-facing teams.
- Familiarity with agile development environments.

Education and Training:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
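The "ensure data quality and integrity" responsibility above usually takes the form of a validation gate that splits incoming rows into loadable and rejected sets before they reach the warehouse. A minimal, framework-agnostic sketch (the field names `id` and `amount` are hypothetical; in Databricks this would typically run as PySpark or Delta Live Tables expectations):

```python
def validate_rows(rows, required=("id", "amount")):
    """Split rows into loadable and rejected, recording why each
    rejected row failed - a simple data-quality gate before load."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            rejected.append((row, f"missing: {', '.join(missing)}"))
        elif not isinstance(row["amount"], (int, float)):
            rejected.append((row, "amount is not numeric"))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 9.5},
    {"id": 2, "amount": None},    # rejected: missing amount
    {"id": 3, "amount": "oops"},  # rejected: non-numeric amount
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # 1 2
```

Keeping the rejection reason alongside each failed row is what makes the later "root cause analysis" responsibility tractable.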

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

delhi

On-site

As an IT Expert in our organization, you will play a pivotal role in leading and managing backend digital operations across all departments. Your expertise in Shopify management, advanced Excel operations, and cross-functional IT integration will be crucial in ensuring efficiency and connectivity between key departments such as Sales, Marketing, Accounts, Inventory, Customer Support, and E-commerce. This strategic and hands-on role is ideal for someone who thrives in a dynamic environment and can effectively align technology with business goals.

Your key responsibilities will include managing and optimizing Shopify-based e-commerce operations, overseeing product uploads, app integrations, theme updates, and backend customizations. You will also be responsible for creating automated reports, troubleshooting platform-related issues, maintaining and automating Excel-based tracking systems, developing custom dashboards and reports, and performing data analysis using advanced functions such as Pivot Tables, VLOOKUP, Macros, and Power Query.

Furthermore, you will facilitate cross-departmental IT integration by managing tools like CRM, accounting software, HR systems, and inventory management. Your role will also involve overseeing IT infrastructure, troubleshooting software, hardware, and network-related issues, ensuring cybersecurity practices, regular backups, and system maintenance are enforced, identifying and implementing automation opportunities, and designing workflows to optimize departmental productivity.

Key Skills & Qualifications:
- Strong command over Shopify backend, theme settings, and plugin management.
- Expertise in Microsoft Excel (VLOOKUP, Pivot Tables, Macros, Power Query).
- Familiarity with automation platforms such as Zapier, Integromat, or Google Workspace tools.
- Understanding of web technologies (HTML, CSS, APIs) is an added advantage.
- Solid grasp of data integration, workflow mapping, and file management systems.
- Strong analytical skills with a problem-solving mindset.
- Effective communication and training skills.

Educational Requirements:
- Bachelor's degree in IT, Computer Science, or a related technical field.
- Additional certifications in Shopify, Excel, or automation tools are preferred.

Experience:
- Minimum of 3-5 years in IT operations or backend tech roles, ideally in an e-commerce or multi-departmental environment.

If you are a tech-savvy problem-solver passionate about creating streamlined systems and enabling business performance through smart IT practices, we would love to hear from you. Join our growing team that values innovation, autonomy, and impact. This is a full-time, permanent position with a day shift schedule. The work location is in person. Apply now to be part of our team!

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Solution Design Specialist focusing on Kinaxis Rapid Response, your primary responsibility will be to collaborate with clients in Pune, Bangalore, Kolkata, and Hyderabad to comprehend their supply chain challenges. By gathering requirements and understanding their specific needs, you will design and provide Kinaxis Rapid Response solutions tailored to address their unique challenges. Analyzing business processes, data flows, and system integrations will be crucial aspects of your role to deliver comprehensive design solutions. You will be configuring the Kinaxis solution, setting up data integrations, and customizing functionalities to align with the requirements of each client. Additionally, you will assess the feasibility of technology for business proposals and ideas, providing technical leadership for the business and Scrum team. Continuous improvement initiatives and innovation are key focuses of this position. You will ensure adherence to best practices among developers, serving as a subject matter expert on Kinaxis Rapid Response and staying updated on its functionalities and best practices. Offering guidance and support to clients and internal teams on technical configuration, data modeling, and system integration options will also be part of your responsibilities. Building strong relationships with clients, understanding their business objectives and challenges, and providing continuous improvement recommendations will be essential. Collaborating closely with clients, conducting training sessions, and workshops to enhance their knowledge and skills in using Kinaxis Rapid Response will also be part of your role. To qualify for this position, you should hold a Bachelor's or Master's degree in supply chain management or a related field. An MBA or other advanced certifications would be advantageous. A strong understanding of supply chain processes, including inventory management, demand planning, procurement, and logistics, is required. 
Additionally, holding Kinaxis Rapid Response Solution Consultant Level-2 or Level-3 certification is a mandatory requirement.,

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

ANP is a leading consulting firm currently seeking professionals in its OneStream practice to join its dynamic team. This role is ideal for an experienced professional aiming to enhance and optimize financial planning, forecasting, and business processes using OneStream. You will be instrumental in OneStream model solutioning and implementations, business planning process optimization, and stakeholder collaboration to provide effective planning solutions. This position offers valuable hands-on experience and professional growth in the enterprise performance management (EPM) and planning ecosystem.

Location: PAN India

Key Responsibilities:
- Implementing OneStream solutions covering requirements and design, development, testing, training, and support.
- Assisting in pre-sales meetings with potential clients, including supporting client demos and proof-of-concept projects.
- Collaborating effectively with internal and client-side resources and communicating efficiently across various audiences.
- Demonstrating proficiency in Anaplan, multi-dimensional modeling, Excel, data integration tools, and ETL processes.
- Approaching challenges creatively and leveraging technology to address business issues.
- Adhering to clients' delivery methodology and project standards to ensure timely completion of project deliverables.
- Thriving in a fast-paced, dynamic environment and effectively navigating ambiguity.
- Embracing the client's culture of "All Business is personal" and taking full ownership of tasks with an outcome-driven strategy.

Qualifications:
- Educational background: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field; or Chartered Accountant / MBA Finance.
- 3+ years of OneStream experience and a total of 5+ years of EPM implementations.
- Certified OneStream Professional.
- Proficiency in OneStream, multi-dimensional modeling, Excel, data integration tools, and ETL processes.
- Solid understanding of financial and accounting processes, including experience with financial close, consolidations, financial reporting, and FP&A.
- Experience in data integration between different systems/sources, with REST API knowledge as an advantage.

Preferred Skills:
- Strong client-facing skills; organized and detail-oriented.
- Excellent communication and interpersonal abilities.
- Proven capability to thrive in a demanding, fast-paced environment and manage high workloads.
- Familiarity with data visualization tools like Oracle.
- Experience with data visualization tools such as Tableau or Power BI.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You should have a strong knowledge of AWS services, including S3, AWS DMS (Database Migration Service), and AWS Redshift Serverless. Experience in setting up and managing data pipelines using AWS DMS is required. Proficiency in creating and managing data storage solutions using AWS S3 is a key aspect of this role. You should also be proficient in working with relational databases, particularly PostgreSQL, Microsoft SQL Server, and Oracle. Experience in setting up and managing data warehouses, particularly AWS Redshift Serverless, is important for this position.

Your responsibilities will include utilizing analytical and problem-solving skills to analyze and interpret complex data sets. You should have experience in identifying and resolving data integration issues such as inconsistencies or discrepancies. Strong problem-solving skills are needed to troubleshoot and resolve data integration and migration issues effectively.

Soft skills are also essential for this role. You should be able to work collaboratively with database administrators and other stakeholders to ensure integration solutions meet business requirements. Strong communication skills are required to document data integration processes, including data source definitions, data flow diagrams, and system interactions. Additionally, you should be able to participate in design reviews and provide input on data integration plans.

A willingness to stay updated with the latest data integration tools and technologies and recommend upgrades when necessary is expected. Knowledge of data security and privacy regulations is crucial, and experience in ensuring adherence to data security and privacy standards during data integration processes is required. AWS certifications such as AWS Certified Solutions Architect or AWS Certified Database - Specialty are a plus for this position.
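"Identifying and resolving data integration issues such as inconsistencies or discrepancies" largely means reconciling a source table against its migrated copy. A toy sketch of that check (the keys and row hashes are invented; in practice this runs as DMS validation or SQL against both databases):

```python
def reconcile(source, target):
    """Compare two {primary_key: row_hash} snapshots and report
    rows missing from the target or differing between the two."""
    missing = sorted(k for k in source if k not in target)
    mismatched = sorted(k for k in source
                        if k in target and source[k] != target[k])
    return {"missing": missing, "mismatched": mismatched}

# Hypothetical snapshots: row 2 drifted, row 3 never arrived.
source = {1: "a1f", 2: "9c0", 3: "77d"}
target = {1: "a1f", 2: "b22"}
print(reconcile(source, target))  # {'missing': [3], 'mismatched': [2]}
```

Hashing each row rather than comparing full contents keeps the snapshot small enough to pull from both sides of a migration.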

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior Architect AI Products supporting Novartis' Commercial function, you will play a crucial role in driving the architectural strategy that facilitates the seamless integration of data and AI products across key areas such as omnichannel engagement, customer analytics, field operations, and real-world insights. Your responsibilities will involve collaborating with commercial business domains, data platforms, and AI product teams to design scalable, interoperable, and compliant solutions that enhance the impact of data and advanced analytics on healthcare professional and patient engagement.

You will be tasked with defining and implementing the reference architecture for commercial data and AI products, ensuring alignment with enterprise standards and business priorities. Additionally, you will architect the integration of data products with AI products and downstream tools, promoting modular, scalable design to encourage reuse and interoperability across different markets and data domains within the commercial landscape. Stakeholder alignment will be a key aspect of your role, as you will partner with various teams to guide solution design, delivery, and lifecycle evolution.

Your role will also involve supporting the full lifecycle of data and AI, including ingestion, transformation, model training, inference, and monitoring within secure and compliant environments. It will be essential to ensure that the architecture complies with governance, data privacy, and commercial requirements while constantly seeking opportunities for architectural improvements, modern technologies, and integration patterns to enhance personalization, omnichannel engagement, segmentation, targeting, and performance analytics.
To excel in this position, you are expected to demonstrate proven leadership in cross-functional architecture efforts, possess a good understanding of security, compliance, and privacy regulations in the commercial pharma sector, and have experience with pharmaceutical commercial ecosystems and data. A strong background in data platforms, pipelines, and governance, as well as knowledge of AI/ML architectures supporting commercial use cases, will be advantageous. Additionally, a bachelor's or master's degree in computer science, engineering, data science, or a related field, along with at least 10 years of experience in enterprise or solution architecture, are desirable qualifications for this role. Novartis is committed to diversity, equal opportunity, and inclusion, striving to build diverse teams that represent the patients and communities served. By joining Novartis, you will become part of a community of passionate individuals collaborating to achieve breakthroughs that positively impact patients' lives. If you are ready to contribute to creating a brighter future through innovation and collaboration, we invite you to explore career opportunities at Novartis.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a leading consulting firm, we are currently seeking professionals in our OneStream practice to join our dynamic team. This role is ideal for an experienced professional who is eager to make a significant impact by enhancing and optimizing financial planning, forecasting, and business processes through the power of OneStream. You will play a key role in OneStream model solutioning and implementations, optimizing business planning processes, and collaborating with stakeholders to deliver effective planning solutions. This position offers hands-on experience and opportunities for professional growth in the enterprise performance management (EPM) and planning ecosystem.

Location: PAN India

Responsibilities:
- Implement OneStream solutions covering requirements and design, development, testing, training, and support.
- Assist in pre-sales meetings with prospective clients, including supporting client demos and proof-of-concept projects.
- Collaborate seamlessly with internal and client-side resources and communicate effectively across various audiences.
- Demonstrate proficiency in Anaplan and an understanding of multi-dimensional modeling; basic knowledge of Excel, data integration tools, or ETL processes is a plus.
- Approach problems creatively and utilize technology to solve business challenges.
- Adhere to clients' delivery methodology and project standards, ensuring timely completion of project deliverables.
- Thrive in a fast-paced, dynamic environment and navigate ambiguity.
- Embrace the culture of "All Business is personal" and take full ownership of tasks by adopting an outcome-driven strategy.

Qualifications:
- Educational background: Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field, or Chartered Accountant / MBA Finance.
- 3+ years of OneStream experience and a total of 5+ years of EPM implementations.
- Certified OneStream Professional.
- Proficiency in OneStream and an understanding of multi-dimensional modeling; basic knowledge of Excel, data integration tools, or ETL processes is a plus.
- Good understanding of financial and accounting processes (account reconciliations, intercompany eliminations, currency translation, allocations, and top-side adjustments), including proficient experience with financial close, consolidations, financial reporting, and FP&A.
- Experience with data integration between different systems/sources, with REST API knowledge as an added advantage.

Preferred Skills:
- Strong client-facing skills; organized and detail-oriented.
- Excellent communication and interpersonal skills.
- Proven ability to work in a demanding, fast-paced environment and manage a high workload.
- Familiarity with data visualization tools like Oracle.
- Experience with data visualization tools like Tableau or Power BI.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

nagpur, maharashtra

On-site

As a Power BI Developer, you will be responsible for understanding business requirements in the BI context and designing data models to transform raw data into meaningful insights. You will create dashboards and interactive visual reports using Power BI, identifying key performance indicators (KPIs) and consistently monitoring them. Your role will involve analyzing data and presenting it through reports that aid decision-making. Additionally, you will be converting business requirements into technical specifications, creating relationships between data, and developing tabular and multidimensional data models. Chart creation and data documentation explaining algorithms, parameters, models, and relations will also be part of your responsibilities. To excel in this role, you should possess a Bachelor's degree in Computer Science, Business Administration, or a related field, along with a minimum of 6 to 8 years of experience in visual reporting development. You must have at least 6 years of Power BI development experience, expertise in SQL Server, and excellent Microsoft Office skills, including advanced Excel skills. Strong analytical, quantitative, problem-solving, and organizational skills are essential, along with attention to detail and the ability to coordinate multiple tasks, set priorities, and meet deadlines. Apply now for the Power BI Developer position in Nagpur/Pune if you are passionate about creating impactful data visualizations and driving insights through analytics. If you are an experienced professional with 6 to 8 years in the field of Business Intelligence, consider applying for the Power BI Lead role in Nagpur/Pune. You will be responsible for understanding business requirements, creating dashboards and visual reports, identifying key KPIs, and analyzing data to aid decision-making. Data cleansing, data quality processes, and developing data models will be key aspects of your responsibilities. 
Your skills in Analysis Services, building tabular and multidimensional models, and your Power BI development experience will be crucial for success in this role.

For those with 8+ years of experience in Business Intelligence and a proven track record as a Power BI Architect, we have an exciting opportunity in Nagpur/Pune. As a Power BI Architect, your responsibilities will include collaborating with business stakeholders to understand reporting and analytics requirements, designing end-to-end Power BI solutions, developing data integration pipelines, and creating visually appealing reports and dashboards. Performance optimization and enhancing user experiences will be key focus areas in this role. If you are passionate about innovation, growth, and high-impact careers, and possess the required skills and experience, we invite you to apply for the Power BI Architect position and be part of a dynamic team that thrives on learning and development opportunities. Join us in creating a collaborative work environment that fosters growth and success for all team members.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Marketing Cloud Engineer at SIXT, you will play a crucial role in understanding business requirements and providing high-quality technical solutions. With a focus on building custom reports, email templates, AMPscript-based dynamic templates, automation jobs, and comprehensive customer journeys, you will need to utilize your expertise in front-end development using HTML, CSS, JavaScript, and AMPscript. Your responsibilities will also include mastering SQL and having a robust understanding of Marketing Cloud's contact data model. You will work closely with customers and platform data, including SQL, system data views, and Send Log. Additionally, you will work with Mobile Studio, Datorama, Contact Builder, Journey Builder, and Email Studio.

Furthermore, you will configure data import via FTP, utilize Marketing Cloud Connect, and integrate with Salesforce. Knowledge of DMPs and data integration will be essential in building SFMC solutions with a focus on availability, redundancy, throughput, speed, and security. Building cross-channel communication for customers through email, SMS, and push will be a key aspect of your role. You will be responsible for iteratively improving the Marketing Cloud data model and architecture, as well as implementing automated solutions using Marketing Cloud Server-Side JavaScript (SSJS). Experience with Handlebars templating and strong HTML skills will be beneficial in successfully carrying out your duties.

To qualify for this position, you should have a minimum of 4 years of experience working with the Marketing Cloud platform. A Bachelor's or Master's degree in Engineering or a related field such as Computer Science or Information Science will be required. SIXT offers a work environment with a legacy of over a century, promoting a healthy work culture, continuous learning, team empowerment, and challenging opportunities to solve real-world problems.
As a Marketing Cloud Engineer, you will be part of a department that focuses on cutting-edge technology, developing and operating core systems in-house, and striving for a long-term technical approach. If you are excited about the opportunity to work with state-of-the-art frameworks, architectures, and play a significant role in revolutionizing the world of mobility, then apply now to join the SIXT team!,

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

This is a data engineer position where you will be responsible for designing, developing, implementing, and maintaining data flow channels and data processing systems to support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner in coordination with the Data & Analytics team. Your main objective will be to define optimal solutions for data collection, processing, and warehousing, particularly within the banking & finance domain. You must have expertise in Spark Java development for big data processing, Python, and Apache Spark. You will be involved in designing, coding, and testing data systems and integrating them into the internal infrastructure. Your responsibilities will include ensuring high-quality software development with complete documentation, developing and optimizing scalable Spark Java-based data pipelines, designing and implementing distributed computing solutions for risk modeling, pricing, and regulatory compliance, ensuring efficient data storage and retrieval using Big Data, implementing best practices for Spark performance tuning, maintaining high code quality through testing, CI/CD pipelines, and version control, working on batch processing frameworks for Market risk analytics, and promoting unit/functional testing and code inspection processes. You will also collaborate with business stakeholders, Business Analysts, and other data scientists to understand and interpret complex datasets. 
Qualifications:
- 5-8 years of experience working in data ecosystems
- 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks
- 3+ years of experience with relational SQL and NoSQL databases such as Oracle, MongoDB, and HBase
- Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
- Data integration, migration, and large-scale ETL experience
- Data modeling experience
- Experience building and optimizing big data pipelines, architectures, and datasets
- Strong analytic skills and experience working with unstructured datasets
- Experience with technologies such as Confluent Kafka, Red Hat jBPM, CI/CD build pipelines, Git, Bitbucket, Jira, external cloud platforms, container technologies, and supporting frameworks
- Highly effective interpersonal and communication skills
- Experience with the software development life cycle

Education:
- Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain

This is a full-time position in the Data Architecture job family group within the Technology sector.
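The "batch processing frameworks for market risk analytics" mentioned above typically follow a group-sort-quantile shape: collect daily P&L per portfolio, sort it, and read off a tail loss. A deliberately tiny stand-in for that Spark-scale job (portfolio name and figures are invented, and real VaR methodology is far richer than this historical-quantile sketch):

```python
from collections import defaultdict

def historical_var(pnl_by_day, confidence=0.95):
    """Per portfolio, return the daily P&L at the chosen historical
    quantile of the loss tail - the aggregation shape of a batch
    market-risk job, not a production VaR model."""
    grouped = defaultdict(list)
    for portfolio, pnl in pnl_by_day:
        grouped[portfolio].append(pnl)
    out = {}
    for portfolio, pnls in grouped.items():
        pnls.sort()  # ascending: worst losses first
        idx = int((1 - confidence) * len(pnls))  # e.g. 5% into the tail
        out[portfolio] = pnls[idx]
    return out

# 20 hypothetical daily P&L observations for one portfolio.
days = [("fx", -2.0), ("fx", -0.5), ("fx", 0.3), ("fx", 1.1),
        ("fx", -1.2), ("fx", 0.8), ("fx", -0.1), ("fx", 0.6),
        ("fx", 0.9), ("fx", -0.7), ("fx", 0.2), ("fx", 1.4),
        ("fx", -0.9), ("fx", 0.5), ("fx", 0.1), ("fx", -0.4),
        ("fx", 0.7), ("fx", 1.0), ("fx", -0.2), ("fx", 0.4)]
print(historical_var(days))  # {'fx': -1.2}
```

In Spark the group-and-sort happens per key across the cluster; the per-portfolio logic stays essentially this simple.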

Posted 1 week ago

Apply

3.0 - 7.0 years

0 - 0 Lacs

karnataka

On-site

The ideal candidate for this role should be an immediate joiner, with a salary range of 15-20 LPA. Your responsibilities will include the following:
- Design and Development: Gather requirements, design solutions, and create high-level design artifacts.
- Coding: Deliver high-quality code for assigned modules.
- Testing: Lead validation for all types of testing activities.
- Implementation: Support activities related to implementation, transition, and warranty.
- Data Quality: Establish data-quality metrics and requirements, and define policies and procedures for access to data.
- Platform Management: Design and implement MDM solutions, including data models, data integration, data quality, data governance, and data security.
- Documentation: Maintain comprehensive documentation for all service processes and incidents.
- Customer Collaboration: Work with customers on solution brainstorming and solution design.

The ideal candidate should have experience with MDM development; Informatica MDM tools (MDM Hub, Data Director, the provisioning tool, and ActiveVOS); data modeling principles; data integration; and SaaS implementation techniques.
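At the heart of the data-quality work described above is a standardize-then-match step: normalize the fields used for matching, then collapse records that agree on the match key. A minimal sketch (the field names and the first-record-wins survivorship rule are hypothetical; Informatica MDM's match engine uses far richer fuzzy rules):

```python
def standardize(record):
    """Normalize the fields used for matching (trim, casefold)."""
    return {k: v.strip().casefold() if isinstance(v, str) else v
            for k, v in record.items()}

def deduplicate(records, match_keys=("name", "email")):
    """Keep the first record seen per match-key combination -
    a toy survivorship rule standing in for real MDM match logic."""
    seen, golden = set(), []
    for rec in map(standardize, records):
        key = tuple(rec.get(k) for k in match_keys)
        if key not in seen:
            seen.add(key)
            golden.append(rec)
    return golden

customers = [
    {"name": "  Acme Corp ", "email": "OPS@ACME.COM"},
    {"name": "acme corp", "email": "ops@acme.com"},   # duplicate after standardization
    {"name": "Globex", "email": "info@globex.com"},
]
golden = deduplicate(customers)
print(len(golden))  # 2
```

Standardizing before matching is what lets the two spellings of "Acme Corp" collapse into one golden record.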

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

delhi

On-site

You will be joining the Master Data Management (MDM) team as a Scrum Master, where your primary responsibility will be to facilitate agile processes, ensure high-quality solutions delivery, and remove any blockers that may arise during the project lifecycle. Your role will involve working closely with cross-functional teams to manage the MDM program, focusing on data governance, data quality, data integration, and leveraging MDM technologies like Informatica MDM, IDMC, and other relevant tools. As a Scrum Master for MDM, your key responsibilities will include leading and facilitating Sprint Planning, Daily Stand-ups, Sprint Reviews, and Sprint Retrospectives for the MDM team. You will ensure the team is aligned with the goals of the MDM program and working efficiently within the agile framework. Additionally, you will foster a culture of collaboration, accountability, and continuous improvement within the team by adhering to agile principles and best practices. Proactively identifying and removing any blockers or impediments that hinder the team's progress will be crucial. This includes issues related to data integration, MDM tools, or collaboration problems. You will also facilitate collaboration between the MDM team, data architects, data stewards, developers, and business stakeholders to ensure smooth delivery of MDM projects and programs. Tracking the team's progress using burndown charts, velocity metrics, and other relevant KPIs will be part of your responsibilities. Providing regular updates to key stakeholders on the progress of MDM initiatives and sprint outcomes is essential. Conducting regular retrospectives to identify areas for improvement within the team's processes and workflows and encouraging the adoption of best practices for MDM data governance, data quality, and integration will also be your focus. 
Additionally, you will help onboard new team members, facilitate training on MDM concepts and tools, and foster a knowledge-sharing environment to keep the team up-to-date with the latest MDM and agile practices. Working with the Product Owner and other stakeholders to ensure the team's work aligns with the overall MDM strategy and business objectives is another critical aspect of the role. To qualify for this position, you should have a Bachelor's Degree in Computer Science, Information Technology, or a related field, along with Certified Scrum Master (CSM) or Professional Scrum Master (PSM) certification. Ideally, you should have 5-7 years of experience as a Scrum Master, preferably in an MDM or data-centric environment. Experience with MDM tools such as Informatica MDM, IDMC, Data Governance platforms, and related technologies is required. A strong understanding of agile methodologies, cross-functional teamwork, MDM processes, excellent communication skills, and problem-solving abilities are essential for success in this role.,

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As an MDM Project Manager, you will play a crucial role in planning, coordinating, and overseeing the implementation of Master Data Management (MDM) projects. You will ensure that projects are completed on time, within budget, and in alignment with business requirements, managing data governance, data quality, and stakeholder engagement throughout the project lifecycle. This involves collaborating closely with cross-functional teams to define data standards, cleanse data, and integrate MDM solutions with existing systems.

Key Responsibilities:

Project planning and execution:
- Develop detailed project plans, timelines, and budgets for MDM initiatives
- Define project scope, deliverables, and success metrics
- Manage project risks and mitigation strategies
- Monitor project progress and make adjustments to ensure timely delivery

Data governance and quality:
- Establish and enforce data governance policies and procedures to maintain data consistency and accuracy
- Define data quality standards and metrics
- Lead data cleansing and deduplication activities, identifying and addressing data quality issues throughout the project lifecycle
- Cleanse and standardize master data by applying data quality rules, enriching data with relevant information, and ensuring data integrity

Stakeholder management:
- Facilitate communication and collaboration across cross-functional teams, including business stakeholders, IT teams, and data stewards
- Gather business requirements and translate them into technical specifications for the MDM solution
- Manage stakeholder expectations and address concerns throughout the project

Technical expertise:
- Apply MDM concepts, best practices, and knowledge of available MDM tools
- Work with technical teams to design and implement MDM architecture, data integration processes, and data mapping
- Oversee data migration from legacy systems to the MDM platform
- Manage and maintain the MDM platform, including user access controls, data mapping, and workflow configurations
- Write SQL queries to access and manipulate data within relational databases

Training and support:
- Develop and deliver training programs for end-users on MDM processes and data management practices
- Provide ongoing support and troubleshooting for MDM-related issues

Desired Skills:
- Strong project management skills
- Deep understanding of Master Data Management principles and data governance best practices
- Excellent communication and stakeholder management abilities
- Technical proficiency in data integration, data modeling, and MDM tools
- Business acumen and the ability to translate business requirements into technical solutions
- Strong analytical and problem-solving skills
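The cleansing and deduplication work described above is typically codified as data-quality rules applied before records are merged into a golden record. A hedged Python sketch, where the field names, rules, and match key are hypothetical and real MDM platforms use far richer matching and survivorship logic:

```python
# Hypothetical master-data standardization and deduplication sketch.
import re

def standardize(record):
    """Apply simple data-quality rules: collapse whitespace, normalize case and phone."""
    rec = dict(record)
    rec["name"] = " ".join(rec["name"].split()).title()
    rec["email"] = rec["email"].strip().lower()
    rec["phone"] = re.sub(r"\D", "", rec.get("phone", ""))  # keep digits only
    return rec

def deduplicate(records, key=("email",)):
    """Keep the first record per match key (a stand-in for survivorship rules)."""
    seen, survivors = set(), []
    for rec in map(standardize, records):
        k = tuple(rec[field] for field in key)
        if k not in seen:
            seen.add(k)
            survivors.append(rec)
    return survivors

raw = [
    {"name": "  ada   lovelace", "email": "Ada@Example.com ", "phone": "+1 (555) 010-2000"},
    {"name": "Ada Lovelace", "email": "ada@example.com", "phone": "5550102000"},
]
print(deduplicate(raw))  # the two source records collapse to one
```

Standardizing before matching is the important design choice here: without it, the two records above would not share a match key and the duplicate would survive.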

Posted 1 week ago

Apply
