5.0 - 10.0 years
10 - 16 Lacs
mumbai, mumbai suburban, mumbai (all areas)
Work from Office
Location: Mumbai (Andheri East)
Designation: Data Engineering/Analytics

Job Summary: We are looking for a Data Engineer with up to 5 years of experience in ETL processes to lead the migration of MongoDB data to MSSQL. The ideal candidate will design, develop, and optimize data pipelines, ensuring seamless data transformation, integrity, and performance.

Key Responsibilities:
- Design and implement ETL workflows for migrating data from MongoDB to MSSQL.
- Develop data transformation and cleansing processes to ensure data integrity.
- Optimize data pipelines for performance, scalability, and reliability.
- Collaborate with database administrators and business teams to define migration strategies.
- Work with SQL Server tools (SSIS, T-SQL) for data integration.
- Monitor and troubleshoot ETL workflows to ensure smooth operations.
- Maintain documentation for migration processes, data models, and pipeline configurations.

Required Skills & Qualifications:
- Proficiency in MongoDB and MSSQL (schema design, indexing, query optimization).
- Strong knowledge of ETL tools (SSIS, Talend, Informatica).
- Experience in data migration strategies and performance tuning.
- Expertise in SQL Server (stored procedures, triggers, views).
- Understanding of data governance, security, and compliance.

Preferred Qualifications:
- Experience with cloud-based data migration (AWS RDS).
- Knowledge of MongoDB aggregation framework.
- Exposure to CI/CD pipelines for database deployments.
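As an illustration of the transformation step such a MongoDB-to-MSSQL migration involves, here is a minimal sketch of flattening one nested, document-shaped record into relational rows ready for bulk loading. The document shape, collection, and field names are invented for the example, not taken from the posting:

```python
# Sketch: flatten a nested MongoDB-style document into a parent row and
# child rows suitable for two MSSQL tables. Field names are hypothetical.

def flatten_order(doc):
    """Split one nested document into an order row and line-item rows."""
    order_row = {
        "order_id": str(doc["_id"]),
        "customer": doc["customer"]["name"],
        "status": doc.get("status", "UNKNOWN"),  # cleanse: default for a missing field
    }
    item_rows = [
        {
            "order_id": str(doc["_id"]),
            "sku": item["sku"],
            "qty": int(item["qty"]),  # enforce the target column type before load
        }
        for item in doc.get("items", [])
    ]
    return order_row, item_rows

doc = {
    "_id": 1001,
    "customer": {"name": "Acme"},
    "items": [{"sku": "A-1", "qty": "2"}, {"sku": "B-7", "qty": 1}],
}
order, items = flatten_order(doc)
print(order["customer"], len(items), items[0]["qty"])  # → Acme 2 2
```

In a real pipeline the returned rows would be handed to a bulk insert (e.g. via SSIS or a parameterized `executemany`); the flattening logic itself is the part that varies per collection.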
Posted 2 weeks ago
5.0 - 10.0 years
5 - 15 Lacs
hyderabad, chennai, bengaluru
Hybrid
Need joiners who can join in 15 days only.
Payroll Company: Anlage Infotech
Location: Chennai
Years of Experience: 6+ years
Project-specific prerequisite skills: Reltio MDM

Detailed JD:
- Architect and lead implementation of enterprise MDM solutions using STIBO, Reltio, or Semarchy MDM.
- Strong expertise in Operational MDM (Customer, Product, Vendor, etc.).
- Design and maintain the master data strategy and roadmap aligned with business goals.
- Lead the end-to-end MDM project lifecycle: requirement gathering, solution design, data modeling, integration, and deployment.
- Ensure high data quality, governance, and stewardship processes are built into the solution.
- Partner with business stakeholders to define master data domains, rules, and workflows.
- Design integrations with upstream/downstream systems (ERP, CRM, Data Lake, etc.).
- Collaborate with data governance and data quality teams to enforce MDM policies.
- Deep understanding of data governance, data quality, and metadata management.
- Provide technical leadership to MDM developers and analysts.
- Ensure security, scalability, and performance of the MDM solution.
- Conduct architecture reviews and code reviews to ensure best practices.

Mandatory Skills: Reltio MDM, MDM Conceptual
Posted 2 weeks ago
5.0 - 10.0 years
5 - 15 Lacs
mysuru, bangalore rural, bengaluru
Work from Office
Dear Candidate,

We are currently hiring for an SAP MDG Technical Consultant role and found your profile to be a potential match. Please find the details below:

Job Description: SAP MDG Technical
Experience: 5+ years overall, with a minimum of 4 years in SAP MDG
Location: Bangalore (WFO)
Notice Period: Immediate/short joiners preferred

Responsibilities:
- Designing & developing new features in the application
- Testing and validating developments done by others
- Designing & developing prototypes of the solution when required
- Assisting with Level 2 & Level 3 support
- Providing deep insight into the capability of MDG in the context of Schneider Electric's potential use cases
- Should have worked on implementation projects involving the SAP MDG solution for Material, Customer, Vendor, or Financial masters, etc.
- Experience in MDG process modeling, Change Request setup, and Change Request step definition based on requirements
- Knowledge of S/4HANA migration projects
- Experience supporting the UAT phase and go-live period of an MDG implementation project

Technical skills:
• MDG (configuration, data modelling, UI modelling, process modelling, ALE/IDocs, data replication techniques, MDG error handling)
• Web services
• Web Dynpro
• NetWeaver
• ABAP, ABAP reporting
• LSMW, migration tools
• BRF+
• BAdI implementations
• FPM
• ABAP object-oriented programming
• SAP Workflow

Soft skills:
• Good synthesis capabilities
• Good communication skills
• Ability to translate business requirements into technical specifications

Qualifications: Bachelor of Engineering degree.
Please share the following details for submission:
- Full Name
- Total Experience
- Relevant Experience
- Current Company
- Current Location
- Preferred Location
- Current CTC
- Expected CTC
- Official Notice Period
- Last Working Day (if serving)
- PAN Number
- Date of Birth
- Email ID
- Alternate Contact/Email ID

Documents Required (Mandatory):
1. Updated Resume
2. PAN Card (for background verification)
3. Any Government ID Proof (Aadhar/Driving License/Passport)
4. LWD Confirmation Mail (if applicable)
5. Passport Size Photo

Kindly treat this requirement as urgent and share the filled details and documents at the earliest. Looking forward to your application.
Posted 2 weeks ago
3.0 - 8.0 years
2 - 4 Lacs
gurugram
Work from Office
The CMDB Process & Technical Analyst is responsible for the governance, accuracy, and optimization of the Configuration Management Database (CMDB) within ServiceNow. This role ensures that CMDB processes are well defined, implemented, and adhered to, enabling accurate visibility into IT assets, services, and relationships. Working closely with IT operations, infrastructure, and application teams, the CMDB Process & Technical Analyst will combine process management expertise with strong technical skills to maintain data integrity, improve automation, and support IT Service Management (ITSM) outcomes.

Roles and Responsibilities:
- Own and maintain CMDB processes in alignment with ITIL best practices, ensuring data accuracy, completeness, and compliance.
- Define and enforce CMDB data governance standards, naming conventions, and relationship models.
- Configure and manage the CMDB in ServiceNow, including CI class structures, discovery sources, reconciliation rules, and data import sets.
- Partner with ServiceNow development teams to design and implement automated data integrations, discovery schedules, and reconciliation jobs.
- Monitor CMDB health using dashboards and KPIs; proactively address data quality issues.
- Collaborate with process owners (Incident, Change, Problem, Asset) to ensure the CMDB supports ITSM processes and business needs.
- Conduct regular audits of CMDB data and reconcile against authoritative data sources.
- Develop training materials and conduct workshops to promote CMDB process adoption and understanding across IT teams.
- Support audits and compliance efforts by providing accurate and timely CMDB data.

Qualifications:
- Bachelor's degree in Information Technology, Computer Science, or a related field.
- 3+ years of experience managing or supporting a CMDB, preferably in a ServiceNow environment.
- Strong understanding of ITIL Configuration Management principles and processes (ITIL v3/v4 Foundation or higher preferred).
- Hands-on experience configuring and maintaining a CMDB in ServiceNow, including discovery, integrations, and reconciliation.
- Proficiency in querying and analyzing CMDB data for reporting and health checks.
- Familiarity with IT asset management, service mapping, and dependency visualization.
- Strong problem-solving skills with attention to detail and data quality.
- Excellent communication skills with the ability to collaborate across technical and non-technical teams.
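To make the audit-and-reconcile responsibility concrete, here is a hedged sketch of comparing CMDB configuration items (CIs) against an authoritative source, flagging missing, orphaned, and drifted records. The record shapes and the `serial`/`owner` keys are invented for illustration; this is generic Python, not the ServiceNow API:

```python
# Sketch: reconcile CMDB CI records against an authoritative source.
# "missing"  = present in the source but absent from the CMDB
# "orphaned" = present in the CMDB with no source record
# "drifted"  = present in both, but an attribute disagrees

def reconcile(cmdb, authoritative, key="serial"):
    cmdb_idx = {ci[key]: ci for ci in cmdb}
    auth_idx = {r[key]: r for r in authoritative}
    missing = sorted(auth_idx.keys() - cmdb_idx.keys())
    orphaned = sorted(cmdb_idx.keys() - auth_idx.keys())
    drifted = sorted(
        k for k in cmdb_idx.keys() & auth_idx.keys()
        if cmdb_idx[k]["owner"] != auth_idx[k]["owner"]  # example attribute check
    )
    return {"missing": missing, "orphaned": orphaned, "drifted": drifted}

cmdb = [{"serial": "SN1", "owner": "netops"}, {"serial": "SN3", "owner": "dba"}]
auth = [{"serial": "SN1", "owner": "netops"}, {"serial": "SN2", "owner": "app"}]
print(reconcile(cmdb, auth))
# → {'missing': ['SN2'], 'orphaned': ['SN3'], 'drifted': []}
```

In practice the same three buckets would feed the CMDB health dashboards and KPIs the role is asked to monitor.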
Posted 2 weeks ago
4.0 - 7.0 years
15 - 25 Lacs
gurugram, bengaluru, mumbai (all areas)
Work from Office
Role & Responsibilities

The candidate must have deep expertise in data management maturity models, data governance frameworks, and regulatory requirements, ensuring businesses can maximize their data assets while complying with both local and international regulations. This is an exciting opportunity to work in a consulting environment, collaborating with industry leaders and driving data-driven business transformation. This role is based in India, with the expectation of traveling to Middle Eastern client locations as required.

1. Data Strategy & Advisory
- Develop and implement enterprise-wide data strategies aligned with business objectives.
- Assess data maturity levels using industry-standard frameworks and define roadmaps for data-driven transformation.
- Advise clients on data monetization, data quality, and data lifecycle management.

2. Data Governance & Compliance
- Define and implement data governance frameworks, policies, and best practices.
- Ensure compliance with local and international data regulations, including GDPR, HIPAA, and region-specific laws.
- Develop data stewardship programs, ensuring clear roles and responsibilities for data management.

3. Regulatory & Risk Management
- Provide expertise on data privacy, security, and risk management strategies.
- Align data strategies with regulatory frameworks such as ISO 27001, NIST, and other industry-specific compliance standards.
- Advise on data sovereignty and cross-border data transfer policies.

4. Consulting & Pre-Sales Support
- Conduct client workshops to define data strategy and governance models.
- Develop thought leadership, whitepapers, and strategic insights to support client engagements.
- Assist in business development efforts, including proposals and pre-sales discussions.

5. Team Mentorship & Leadership
- Mentor junior consultants on data governance and strategic advisory.
- Stay updated on emerging trends in data strategy, regulations, and governance technologies.
- Represent the company at industry events, conferences, and knowledge-sharing forums.

Preferred Candidate Profile

1. Education & Experience
- Bachelor's or Master's in Data Management, Business Analytics, Information Systems, or a related field.
- 5 years of experience in data strategy, governance, or regulatory compliance consulting.

2. Technical & Regulatory Expertise
- Deep understanding of data management maturity models (e.g., DAMA-DMBOK, CMMI for Data Management); should be DAMA certified.
- Basic proficiency in data governance tools such as Collibra, Informatica, or Azure Purview.
- Strong knowledge of local and international data regulations (e.g., GDPR, CCPA, PDPA, UAE's NDPL, KSA-NDMO, UAE DGE Data Regulations, Dubai Data Law).
Posted 2 weeks ago
13.0 - 19.0 years
40 - 45 Lacs
pune
Work from Office
Role Description

Project Management & Change Execution is responsible for delivering the value and strategic outcomes of their Change BoW, ensuring a successful transition into the organization's goals, processes, and technologies, and adopting values and principles that follow agreed processes and practices to allow teams to continually improve their performance.

Your key responsibilities:
- Lead end-to-end AI project implementation for the Investment Banking, Compliance, Risk & Finance domains.
- Drive cross-functional collaboration across Sales, CRM, Risk, and CIO teams to meet strategic milestones.
- Manage sprint planning, backlog grooming, and stakeholder alignment for features like financial spreading, trade data integration, and UI enhancements.
- Oversee vendor coordination for data integration and ensure timely API/data feed integrations.
- Ensure compliance with data guardrails and governance across internal and external data sources.
- Track KPIs and RAG status across dashboards, flashcards, and client engagement tools.
- Coordinate with senior business and tech stakeholders on project management meetings and documentation.
- Champion innovation through frugal engineering, scalable AI models, and user-centric design.

Your skills and experience:
- Proven ability to lead AI-driven initiatives across client insights, risk management, and sales enablement domains.
- Skilled in managing cross-functional teams and sprint pipelines involving data summarization, predictive analytics, and UI/UX enhancements.
- Expertise in integrating external data sources with internal systems to deliver actionable intelligence.
- Strong stakeholder engagement and cadence management with business sponsors, SMEs, and engineering leads.
- Experience in resource planning and hybrid team structures across geographies.
- Familiarity with regulatory frameworks and data governance for AI platforms in banking environments.
- Ability to articulate ROI-driven use cases and drive adoption through demos, feedback loops, and performance tracking.
Posted 2 weeks ago
10.0 - 17.0 years
0 Lacs
bengaluru
Work from Office
Job Description: End-to-end ownership of data ingestion, data pipelines, data lineage, data quality, data warehousing, data governance, and data reconciliation.

Essential Skills:
- Must have data architect experience and knowledge.
- Data Architect with over 10 years of hands-on experience in designing, developing, and managing large-scale data solutions.
- Proven expertise in building and optimizing ETL pipelines.
- Strong in data preprocessing and enhancing data quality; experience extracting events and processing large datasets (5 billion+ records) within a Spark-Hadoop cluster.
- Automated data processing tasks for a DaaS (Data as a Service) project, streamlining workflow efficiency.
- Configured file and client setups, ensuring smooth data integration and delivery.
- Managed end-to-end data processing pipelines, enhancing data quality and reducing processing time.
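The data reconciliation duty mentioned above typically boils down to comparing a loaded target against its source extract. A minimal sketch, with invented table contents and an `amount` column chosen purely for illustration:

```python
# Sketch: post-load reconciliation — compare row counts and a summed-column
# checksum between source and target, returning a list of discrepancies.

def reconcile_load(source_rows, target_rows, amount_key="amount"):
    """Return human-readable discrepancies; an empty list means the load balanced."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_sum = round(sum(r[amount_key] for r in source_rows), 2)
    tgt_sum = round(sum(r[amount_key] for r in target_rows), 2)
    if src_sum != tgt_sum:
        issues.append(f"checksum mismatch on '{amount_key}': {src_sum} vs {tgt_sum}")
    return issues

source = [{"amount": 10.5}, {"amount": 4.5}]
target = [{"amount": 10.5}]
print(reconcile_load(source, target))
```

At billions of records the same counts and sums would be computed by the engine (e.g. Spark aggregations) rather than in a Python loop, but the balancing logic is identical.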
Posted 2 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
hyderabad, pune, bengaluru
Hybrid
Responsibilities:
• Knowledge of more than one technology
• Basics of architecture and design fundamentals
• Knowledge of testing tools
• Knowledge of agile methodologies
• Understanding of project life cycle activities on development and maintenance projects
• Understanding of one or more estimation methodologies; knowledge of quality processes
• Basics of the business domain to understand business requirements
• Analytical abilities, strong technical skills, good communication skills
• Good understanding of the technology and domain
• Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
• Awareness of the latest technologies and trends
• Excellent problem solving, analytical, and debugging skills

Technical and Professional Requirements: Technology->Data Management - MDM->Stibo MDM
Preferred Skills: Technology->Data Management - MDM->Stibo MDM
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Principal Engineer/Architect at our company, you will play a crucial role in spearheading the development and expansion of our Data & AI practice. This senior leadership position requires a blend of technical proficiency, strategic acumen, and client-oriented leadership. You will be responsible for leading intricate data migration and transformation initiatives, establishing best practices and reusable frameworks, and influencing decision-making processes at both Fortune 500 companies and innovative startups. Your expertise will be utilized across modern cloud technologies such as AWS, Azure, GCP, Databricks, and advanced Data & AI tools. Your primary responsibilities will include designing and architecting scalable, high-performance data solutions on cloud platforms, leading enterprise-scale Data & AI projects from inception to delivery, crafting data architecture blueprints and reusable frameworks, as well as optimizing data pipelines for enhanced efficiency. Additionally, you will be instrumental in building and nurturing a high-performing Data & AI team by recruiting, mentoring, and training skilled professionals, developing standardized processes and best practices, and driving internal product development initiatives to create reusable tools. In your role, you will act as a trusted advisor to clients, offering strategic guidance and technical expertise, collaborating with sales and pre-sales teams to propose solutions for complex data projects, and representing the company at industry forums and technical conferences to showcase thought leadership and innovation. Moreover, you will provide thought leadership in data governance, data integration, and cloud data transformations, stay abreast of emerging trends and technologies in Data & AI, and foster a culture of continuous improvement and innovation within the team. 
To excel in this position, you should possess strong expertise in at least one cloud platform (AWS, Azure, or GCP), demonstrate the ability to design and implement high-performing data architectures, have experience in data engineering, data governance, and pipeline optimization, and be proficient in tools like Databricks and Apache Spark for large-scale data processing. Your track record should include successful data platform migrations and the development of reusable frameworks for large-scale deployments. Additionally, you should have a background in Enterprise Information Management programs or large cloud data migration initiatives, excellent client communication skills, and hands-on experience in MLOps, observability frameworks, and cloud security best practices.

Success in this role will be defined by your ability to design and deliver scalable data solutions, build and lead a high-impact team, drive standardization and reusability through frameworks and tools, establish yourself as a trusted advisor and thought leader in the Data & AI domain, and contribute to revenue growth through collaboration on pre-sales and solutioning activities.

Joining our team will offer you competitive compensation, performance bonuses, a 401k plan with employer matching, comprehensive health and wellness benefits, professional development and certification assistance, a flexible work environment with remote/hybrid options, exposure to industry-leading Data & AI projects, leadership and career growth opportunities, and the chance to work with cutting-edge technologies and innovative teams.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Manager, Material Master Data at Kenvue, you will be responsible for overseeing and executing day-to-day operations of Material Master Data Governance globally. Your role will involve designing and managing data workflows, ensuring data quality and compliance with internal and external regulations such as GXP and SOX. Your contributions will be integral in supporting the Kenvue Operations organization and Digital Transformation Office. Your key responsibilities will include developing and implementing Material Master Data Management strategy and governance framework, ensuring compliance with data quality standards and regulatory requirements, leading the standardization of Material Master Data Management processes, and collaborating with various teams to drive alignment and define data and process requirements. You will also be responsible for facilitating continuous improvement, implementing process enhancements, owning documentation, supporting transformation projects, and monitoring data quality metrics. The ideal candidate will possess a Bachelor's degree in business, IT, engineering, or a related field, along with a minimum of 8-10 years of business experience and at least 5 years of data management experience in large ERP platforms focusing on Material Master Data and Bill of Material data. Strong project management, stakeholder management, and communication skills are essential, along with the ability to drive data governance, quality, and compliance initiatives. Experience with SAP S4 Hana, SAP Master Data Governance (MDG), global ERP systems, and working in Consumer-Packaged Goods sector will be advantageous. Join Kenvue in shaping the future and impacting the lives of millions of people every day through your expertise and empathy. Embrace the opportunity to collaborate with diverse stakeholders, drive process improvements, and contribute to the success of our global team. 
Your role as a Manager, Material Master Data will be pivotal in delivering the best products to our customers and making a difference in the world of everyday care.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
It is exciting to work in a company where you truly BELIEVE in the mission and vision! At Fractal, we are dedicated to bringing passion and customer focus to our business. When you join us, you become part of a rapidly growing team that assists our clients in utilizing AI and behavioural sciences to enhance decision-making processes. As a strategic analytics partner to some of the most esteemed Fortune 500 companies globally, we empower every human decision within the enterprise through analytics, AI, and behavioural science. Fractal stands out as a leading player in the Artificial Intelligence field, with a mission to influence every human decision in the enterprise by harnessing the potential of AI to support Fortune 500 companies. Your responsibilities will include leading the analysis of employee data from various sources to identify significant trends and insights, and overseeing the delivery of the analytics calendar. You will design and maintain user-friendly dashboards and reports to communicate people insights and key performance indicators in an intuitive manner, primarily using Excel and Power BI. Collaboration with Business partnering, cross-functional, and operations teams is crucial for driving data-driven recommendations. Additionally, you will manage ad hoc projects, define metrics, and provide ongoing dashboards and reports to inform talent strategies. Ensuring the integrity, security, and quality of people analytics data through data governance frameworks is also part of your role. Moreover, you will offer leadership and training to team members on analytics best practices to cultivate a data-driven culture within the function. As for the required skills and qualifications, we are looking for candidates with an MBA and a bachelor's degree in engineering, Statistics, Data Science, or a related field. 
You should have at least 5 years of experience in business analysis and corporate strategy/planning, with hands-on experience in creating dashboards and insights presentations for C-Suite stakeholders. Proficiency in Excel, visualization techniques, and a working understanding of statistical techniques like regression are essential. Strong analytical skills, the ability to interpret complex data sets, and the ability to communicate findings clearly to diverse audiences are also key requirements.

In terms of competencies, we value problem-solving aptitude with attention to detail, the ability to align business strategy with talent strategy, and the capability to conduct in-depth analysis and develop impactful visualizations. If you thrive in a dynamic environment and enjoy collaborating with enthusiastic individuals, you will find a rewarding career at Fractal!

If this opportunity does not align with your current goals, feel free to express your interest in future openings by clicking on "Introduce Yourself" or setting up email alerts for relevant job postings.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
bhubaneswar
On-site
As a PySpark Data Engineer at Viraaj HR Solutions, you will be an integral part of our dynamic team in India. Your primary responsibility will be to design, develop, and maintain robust data pipelines using PySpark. You will play a crucial role in implementing ETL processes for data extraction, transformation, and loading, while ensuring data quality from structured and unstructured sources. Your expertise in optimizing data processing workflows will be essential for enhancing performance and efficiency. Collaboration with data scientists and analysts is key to understanding data requirements and creating data warehousing solutions for end-user access. You will integrate data from various sources into a unified platform and monitor data pipeline performance, troubleshooting any issues that arise. Data validation, cleansing, and documentation of processes and workflows will be part of your routine to maintain data accuracy and ensure team transparency. Your qualifications should include a Bachelor's degree in Computer Science, Engineering, or a related field, along with proven experience as a Data Engineer. Proficiency in PySpark, SQL, and database management systems is required, as well as familiarity with Hadoop and big data technologies. Experience with cloud platforms like AWS, Azure, or Google Cloud is preferred, along with a strong understanding of data modeling, database design concepts, and Python for data manipulation and analysis. As a PySpark Data Engineer, you should possess strong analytical and troubleshooting skills, excellent communication and teamwork abilities, and knowledge of data governance and compliance practices. Your role will involve implementing ETL and data warehousing solutions, working in a fast-paced environment, managing multiple priorities, and utilizing version control systems like Git. A willingness to learn and adapt to new technologies is essential for this position. 
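The validation and cleansing duties described above can be illustrated with a small rule-based sketch. It is written in plain Python so it stands alone; in the actual role the same per-record logic would typically be expressed as PySpark DataFrame filters. The rules and field names are invented for the example:

```python
# Sketch: rule-based record validation — partition incoming records into
# clean rows and rejects annotated with the rules they failed.

RULES = {
    "user_id": lambda v: isinstance(v, int) and v > 0,  # must be a positive integer
    "email": lambda v: isinstance(v, str) and "@" in v,  # crude shape check
}

def validate(records):
    """Return (clean, rejects); rejects carry the names of the failed rules."""
    clean, rejects = [], []
    for rec in records:
        failed = [f for f, rule in RULES.items() if not rule(rec.get(f))]
        if failed:
            rejects.append({"record": rec, "failed": failed})
        else:
            clean.append(rec)
    return clean, rejects

good = {"user_id": 7, "email": "a@b.com"}
bad = {"user_id": -1, "email": "nope"}
clean, rejects = validate([good, bad])
print(len(clean), rejects[0]["failed"])  # → 1 ['user_id', 'email']
```

Keeping rejects alongside their failure reasons, rather than silently dropping them, is what makes the downstream troubleshooting and data-quality reporting mentioned in the posting possible.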
This is an exciting opportunity to join a forward-thinking company that values expertise and innovation in the field of data engineering. If you are passionate about data and eager to contribute to transformative projects, we encourage you to apply today.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Lead, you will be responsible for developing and implementing data engineering projects, including enterprise data hubs or Big data platforms. Your role will involve defining reference data architecture, leveraging cloud-native data platforms in AWS or Microsoft stack, and staying updated on the latest data trends like data fabric and data mesh. You will play a key role in leading the Center of Excellence (COE) and influencing client revenues through innovative data and analytics solutions. Your responsibilities will include guiding a team of data engineers, overseeing the design and deployment of data solutions, and strategizing new data services and offerings. Collaborating with client teams to understand their business challenges, you will develop tailored data solutions and lead client engagements from project initiation to deployment. Building strong relationships with key clients and stakeholders, you will also create reusable methodologies, pipelines, and models for more efficient data science projects. Your expertise in data architecture solutions, data governance, and data modeling will ensure compliance with regulatory standards and support effective data management processes. You will be proficient in various data integration tools, cloud computing platforms, programming languages, data visualization tools, and big data technologies to process and analyze large volumes of data. In addition to technical skills, you will demonstrate strong people and interpersonal skills by managing a high-performing team, fostering a culture of innovation, and collaborating with cross-functional teams. Candidates for this role should have at least 10+ years of experience in information technology, with a focus on data engineering and architecture, along with a degree in relevant fields like computer science, data science, or engineering. 
Candidates should also possess experience in managing data projects, creating data and analytics solutions, and have a good understanding of data visualization, reporting tools, and normalizing data as per key KPIs and metrics. Strong problem-solving, communication, and collaboration skills are essential for success in this role.
Posted 2 weeks ago
12.0 - 18.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Services & Integration Manager, you should possess a Bachelor's Degree in Accounting, Finance, Business, or a recognized accounting qualification. Additionally, having a relevant data modeling certification and an architecture certification (e.g. TOGAF) is preferred. With 12 to 18 years of experience, you should have a strong understanding of the technical delivery of data and relevant infrastructure, preferably in Azure or AWS. Your experience should encompass systems like SAP, iBPM, Oracle, and Informatica. Proficiency in technologies, frameworks, and accelerators such as ERWIN, Sparks, Zachman, and industry data models is essential. You should also have expertise in catalog and metadata management, data ownership, stewardship, and governance, along with experience in operational and analytical settings.

In this role, you will be responsible for developing a corporate data model with a focus on utilizing the model to handle data as a reusable asset. You will drive consistency in the model, ownership, definition, and structure from operational build to analytic consumption. Ensuring data connectivity across all layers and supporting the creation of a corporate view of data through engagement with data change projects will be key aspects of your role.

Your responsibilities will include engaging with senior stakeholders, understanding business requirements, coordinating execution, and ensuring a comprehensive delivery approach. Your ability to influence collaborators, comprehend business needs, and drive execution will be crucial for success in this role.

For further details on this exciting opportunity, please contact 85916 09735 or email priyanshu@mmcindia.biz.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
salem, tamil nadu
On-site
Your role will involve understanding and applying the grade-specific skills and competencies required for the position. You will be expected to demonstrate proficiency in these skills to successfully perform your job responsibilities. Adapting to the grade-specific requirements and utilizing the competencies effectively will contribute to your overall performance and success in the role. Your ability to master these skills will directly impact your effectiveness and efficiency in fulfilling job duties.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
The role of a Data Governance Specialist at Hitachi Energy involves being a key enabler in shaping and operationalizing the enterprise-wide data governance framework. Your focus will be on the implementation and evolution of the Data Catalog, Metadata Management, and Data Compliance initiatives to ensure that data assets are trusted, discoverable, and aligned with business value.

You will play a critical role in defining and maintaining the roadmap for the Enterprise Data Catalog and Data Supermarket. This includes configuring and executing the deployment of cataloging tools such as metadata management, lineage, and glossary, while ensuring alignment with DAMA-DMBOK principles. Collaboration with Data Owners, Stewards, and Custodians will be essential in defining and enforcing data policies, standards, and the RACI model. Additionally, you will support the Data Governance Council and contribute to the development of governance artifacts like roles, regulations, and KPIs.

Partnering with domain experts, you will drive data profiling, cleansing, and validation initiatives to ensure data quality and support remediation efforts across domains. Providing training and support to business users on catalog usage and governance practices will be part of your responsibilities, acting as a liaison between business and IT to ensure data needs are met and governance is embedded in operations. Staying current with industry trends and tool capabilities like Databricks and SAP MDG, you will propose enhancements to governance processes and tooling based on user feedback and analytics.

To qualify for this role, you should have a Bachelor's degree in Information Systems, Data Science, Business Informatics, or a related field, along with 1-3 years of experience in data governance, data management, or analytics roles. Familiarity with the DAMA-DMBOK2 framework and data governance tools is required, as well as strong communication and collaboration skills to work across business and technical teams. Being proactive, solution-oriented, and eager to learn is important for this role, and autonomy and the ability to manage ambiguity are competitive advantages. Preference will be given to candidates with CDMP certifications.

Joining Hitachi Energy offers a purpose-driven role in a global energy leader committed to sustainability and digital transformation. You can expect mentorship and development opportunities within a diverse and inclusive team, working with cutting-edge technologies and a culture that values integrity, curiosity, and collaboration, in line with Hitachi Energy's Leadership Pillars. Individuals with disabilities requiring accessibility assistance or accommodations in the job application process can request reasonable accommodations through the Hitachi Energy career site to support them during the application process.
Posted 2 weeks ago
15.0 - 19.0 years
0 Lacs
pune, maharashtra
On-site
As the Director of Business Applications at our leading ERP software and solutions provider, you will lead the strategy, delivery, and optimization of enterprise systems across various functions including Sales, Marketing, Finance, HR, and Operations. Your primary responsibility will be to ensure system effectiveness, scalability, and user satisfaction, while driving the adoption of AI and automation to enhance decision-making and operational efficiency. Your role will involve partnering closely with business leaders to ensure that application and AI initiatives align with strategic goals through robust governance, vendor management, and continuous improvement efforts. You will be based in Pune (Hybrid) and will play a key role in defining and executing the enterprise applications strategy in alignment with business objectives.

Key Responsibilities:
- **Strategic Leadership & Governance:** Define and execute the enterprise applications strategy aligned with business goals; establish governance processes for application lifecycle management, AI/automation adoption, and data integrity.
- **Application Portfolio Management:** Oversee the implementation, optimization, and support of core systems including NetSuite, D365, OpenAir, Gong, CPQ, and Zone Billing. Lead application rationalization, upgrades, and system integration initiatives.
- **AI-Driven Innovation:** Champion the ethical adoption of AI/ML capabilities within enterprise applications, ensuring alignment with data privacy, fairness, and transparency standards.
- **Integration & Automation Architecture:** Lead the design and execution of system integrations and process automations, ensuring seamless data flow and high data quality.
- **Team Leadership & Vendor Management:** Build and mentor high-performing teams; manage relationships with software vendors and partners.
- **Stakeholder Collaboration:** Act as a trusted partner to business leaders, translating their needs into scalable application solutions and ensuring end-user engagement.
- **Operational Excellence & Continuous Improvement:** Define and track KPIs for application performance; foster a culture of continuous improvement.

Qualifications:
- **Education:** Bachelor's degree in Information Systems, Computer Science, Business, or a related field. MBA or relevant master's degree preferred.
- **Experience:** Minimum 15 years in enterprise/business applications, with at least 5 years in leadership roles; experience with NetSuite, D365, OpenAir, Gong, CPQ, and Zone Billing.
- **Skills:** Hands-on experience with integration and workflow automation platforms and AI and automation adoption; deep understanding of business processes, SaaS architecture, and data governance.
- **Leadership:** Proven track record of building and mentoring teams; ability to translate business needs into technical roadmaps and lead change management initiatives.

Certifications such as NetSuite Administrator, Microsoft Dynamics 365 Functional Consultant, Workato Automation Pro, PMP, ITIL, or Agile are preferred but not required. Join us at our beautiful Cary, NC headquarters to be part of a team that enables transformation and drives market innovation. Be a part of VitalEdge and contribute to equipping the world to keep running.
Posted 2 weeks ago
15.0 - 19.0 years
0 Lacs
pune, maharashtra
On-site
As the Director of Data Governance and Operations at SailPoint, you will play a crucial role in leading data governance and data quality initiatives within the organization. Reporting directly to the VP of IT, you will be instrumental in driving data-driven decision-making across the Go-To-Market (GTM) operations and Finance departments. We are looking for a dynamic individual with a deep passion for data governance, a proven track record in developing and executing successful data quality strategies, and a strong understanding of how data drives AI innovation. Your responsibilities will include developing and championing a comprehensive data strategy aligned with SailPoint's overall business objectives, focusing particularly on GTM and Finance. You will establish and uphold a robust data governance framework encompassing policies, standards, and procedures to ensure data quality, accuracy, and compliance. Collaborating closely with GTM and Finance leadership, you will identify data needs, devise data solutions, and promote the adoption of data-driven insights to enhance performance. In addition, you will take the lead in monitoring and enhancing data quality across key systems and data sources by implementing data cleansing and validation processes. Your role will also involve ensuring data readiness for AI initiatives by collaborating with the Enterprise Applications and Data Engineering teams to prepare and structure data for AI model development and deployment. Leveraging your expertise in Salesforce, you will optimize data management, reporting, and analytics within the platform. Furthermore, you will be responsible for building, mentoring, and managing a high-performing team of data professionals, fostering a culture of collaboration, innovation, and continuous improvement. Effective collaboration with stakeholders across IT, GTM, Finance, and other departments will be essential to ensure alignment on data strategy and priorities. 
It will also be crucial to stay informed about emerging trends in data management, AI, and analytics, identifying opportunities to leverage new technologies to enhance SailPoint's data capabilities. To be successful in this role, you should have at least 15 years of experience in data management, data governance, or data strategy roles, with a progressive increase in responsibilities. A proven track record in developing and executing successful data strategies for GTM operations and Finance is required, along with strong experience in Master Data Management (MDM) and a deep understanding of data governance principles, practices, tools, and technologies. In-depth knowledge of Salesforce data model, reporting, and analytics capabilities is essential, as well as experience with Salesforce administration. A solid grasp of AI and machine learning concepts and their impact on data readiness for AI is also necessary. Excellent leadership, communication, and interpersonal skills are vital for this role, along with the ability to influence and collaborate effectively with stakeholders at all levels of the organization. Experience in building and managing high-performing teams, strong analytical and problem-solving skills, and Salesforce certifications would be highly desirable. A Bachelor's degree in a relevant field such as Computer Science, Data Science, or Business Analytics is also required. In your first 30 days, you will focus on establishing foundational knowledge, integrating into the team, and contributing to problem statements and data flow diagrams. Over the next 60 days, you will develop expertise in SailPoint's business data, define strategies for data governance and quality, and engage with key stakeholders. Finally, in the following 90-180 days, you will lead the execution of the established roadmap, focusing on data quality, data governance, MDM initiatives, and AI projects. SailPoint is an equal opportunity employer. 
Alternative methods of applying for employment are available to individuals unable to submit an application through the website due to a disability. For further assistance, please contact hr@sailpoint.com or write to 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
maharashtra
On-site
Are you ready to make it happen at Mondelēz International? Join our mission to lead the future of snacking and make it with pride. Together with analytics team leaders, you will support our business by providing excellent data models to uncover trends that can drive long-term business results.

Your role will involve:
- Working closely with the business leadership team to execute the analytics agenda
- Identifying and incubating best-in-class external partners for strategic projects
- Developing custom models/algorithms to uncover signals, patterns, and trends for long-term business performance
- Executing the business analytics program agenda using a methodical approach that communicates the deliverables to stakeholders effectively

To excel in this position, you should possess:
- Experience in using data analysis to make recommendations to senior leaders
- Technical expertise in best-in-class analytics practices
- Experience in deploying new analytical approaches in a complex organization
- Proficiency in utilizing analytics techniques to create business impacts

The Data COE Software Engineering Capability Tech Lead will be part of the Data Engineering and Ingestion team, responsible for defining and implementing software engineering best practices, frameworks, and tools to support scalable data ingestion and engineering processes.

Key responsibilities include:
- Leading the development of reusable software components, libraries, and frameworks for data ingestion, transformation, and orchestration
- Designing and implementing intuitive user interfaces using React.js and modern frontend technologies
- Developing backend APIs and services to support data engineering tools and platforms
- Defining and enforcing software engineering standards and practices for developing and maintaining data products
- Collaborating with data engineers, platform engineers, and other COE leads to build fit-for-purpose engineering tools
- Integrating observability and monitoring features into data pipeline tooling
- Mentoring and supporting engineering teams in using the frameworks and tools developed

Qualifications required:
- Bachelor's or master's degree in computer science, engineering, or a related discipline
- 12+ years of full-stack software engineering experience, with at least 3 years in data engineering, platform, or infrastructure roles
- Strong expertise in front-end development with React.js and component-based architecture
- Backend development experience in Python with exposure to microservices architecture, FastAPI, and RESTful APIs
- Experience working with data engineering tools such as Apache Airflow, Kafka, Spark, Delta Lake, and DBT
- Familiarity with GCP cloud platforms, containerization (Docker, Kubernetes), and DevOps practices
- Strong understanding of CI/CD pipelines, testing frameworks, and software observability
- Ability to work cross-functionally and influence without direct authority

Preferred skills include:
- Experience with building internal developer platforms or self-service portals
- Familiarity with data catalogue, metadata, and lineage tools (e.g., Collibra)
- Understanding of data governance and data mesh concepts
- Agile delivery mindset with a focus on automation and reusability

In this role, you will play a strategic part in developing the engineering backbone for a next-generation enterprise Data COE. You will work with cutting-edge data and software technologies in a highly collaborative and innovative environment, driving meaningful change and enabling data-driven decision-making across the business. Join us at Mondelēz International to be part of our purpose to empower people to snack right, offering a broad range of delicious, high-quality snacks made with sustainable ingredients and packaging. With a rich portfolio of globally recognized brands, we are proud to lead in biscuits, chocolate, and candy globally, and we have a diverse community of makers and bakers across the world who are energized for growth and committed to living our purpose and values. This is a regular job opportunity in the field of Analytics & Modelling.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Quality & Governance Lead at Kenvue based in Bengaluru, India, you will be responsible for ensuring the quality, consistency, and integrity of the organization's data. Your role will involve developing and implementing data governance frameworks, policies, and procedures to maintain data standards across the enterprise. You will collaborate closely with various stakeholders, including IT, business units, and senior management, to promote a culture of data excellence. Your key responsibilities will include driving the data quality function of the Global Supply Chain, which involves data profiling, data cleansing, and data enrichment to support program goals. You will maintain and define data quality and metadata processes, develop data quality rules, and work on enhancing the in-house data quality framework. Additionally, you will deliver data quality solutions and capabilities to data stewards, assist in the implementation of data governance controls, and ensure compliance with relevant data protection regulations. To excel in this role, you should have a minimum of 5-8 years of relevant work experience, advanced skills in data analysis, business requirement gathering, and dashboarding/reporting. Experience with Microsoft Azure, SQL, and SAP ERP Systems is preferred, along with project management experience. A CDMA certification would be a plus. You will collaborate with global and regional data owners, stewards, and subject matter experts to standardize data, align data quality rules with evolving landscapes, and design data quality policies and methodologies. Your ability to identify and resolve data quality issues, work in complex environments, and champion the role of data quality will be crucial to success in this role. 
If you are results-driven, detail-oriented, and possess strong organizational and communication skills, along with the ability to work autonomously and in a team, we encourage you to apply for this position at Kenvue in Bengaluru, India.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
thane, maharashtra
On-site
About Welspun
Welspun World is one of India's fastest growing global conglomerates with businesses in Home Textiles, Flooring Solutions, Advanced Textiles, DI Pipes, Pig Iron, TMT bars, Stainless Steel, Alloy, Line Pipes, Infrastructure & Warehousing. At Welspun, we strongly believe in our purpose to delight customers through innovation and technology, achieve inclusive & sustainable growth to remain eminent in all our businesses. From Homes to Highways, Hi-tech to Heavy metals, we lead tomorrow together to create a smarter & more sustainable world.

Job Purpose/Summary
As a Solution Architect at Welspun, you will be responsible for analyzing business requirements and translating them into analytics solutions. You will architect, design, and implement scalable data analytics solutions using Databricks and other Azure cloud services. Additionally, you will lead data engineering initiatives, develop and optimize data models, and ensure efficient deployment of machine learning models. Collaboration with cross-functional teams, mentoring junior team members, and driving data-driven decision-making are also key responsibilities of this role.

Responsibilities
- Analyze business requirements and translate them into analytics solutions.
- Architect, design, and implement scalable data analytics solutions using Databricks and other Azure cloud services.
- Lead data engineering initiatives, including data pipelines, data lakes, and data warehousing solutions.
- Develop and optimize data models for analytical and operational use cases.
- Implement and drive MLOps best practices for efficient deployment, monitoring, and management of machine learning models.
- Collaborate with business stakeholders to understand data requirements and translate them into effective analytics solutions.
- Enable data visualization and business intelligence using tools such as Power BI.
- Ensure data security, compliance, and governance within Azure and associated technologies.
- Provide technical leadership and mentorship to the data engineering and analytics teams.
- Stay up to date with the latest advancements in data engineering, cloud analytics, and AI/ML technologies.
- Drive data modeling efforts and ensure optimal database structures for analytics and reporting.
- Collaborate with Data Science teams to integrate AI/ML/GenAI solutions into the data ecosystem.
- Ensure data quality, integrity, and reliability throughout the data lifecycle.
- Engage with cross-functional teams to understand business needs and communicate complex data insights to non-technical stakeholders.
- Design cost-efficient data architectures and optimize cloud costs.
- Identify opportunities for process improvements and implement best practices in data analytics.
- Stay abreast of industry trends and advancements in data analytics.
- Promote a culture of continuous learning and development within the team.

Requirements
- Bachelor's degree in Business Administration, Information Technology, Data Science, or a related field. Master's degree is a plus.
- 10-14 years of experience in Data Engineering, Analytics, Visualization, AI/ML.
- Hands-on expertise in Databricks and the Microsoft Azure ecosystem.
- Strong knowledge of MLOps frameworks and best practices.
- Proficiency in Python, SQL, and Spark for data processing and analysis.
- Deep understanding of data pipelines, ETL processes, and cloud-based data lake solutions.
- Experience in developing and deploying AI/ML models.
- Expertise in data governance, security, and compliance within cloud environments.
- Experience with Power BI and other visualization tools.
- Excellent communication and stakeholder management skills.
- Domain experience in Manufacturing is preferred.
- Strong analytical and problem-solving skills with attention to detail.

Preferred Skills
- Experience in the manufacturing industry.
- Familiarity with machine learning and advanced analytics techniques.
- Familiarity with Python for data analysis & automation is an advantage.

Job Title: SBAPL_Solution Architect
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As an Assistant Team Lead - Master Data Maintenance Specialist (SAP MDG) at ArcelorMittal, you will play a crucial role in ensuring the accuracy, consistency, and governance of master data across key domains such as Customer, Vendor, Material, Finance, and Organizational data. Your responsibilities will include collaborating with business stakeholders, data stewards, and IT teams to maintain high-quality master data records in SAP MDG, adhering to established data governance policies.

Key Responsibilities:
- Lead and mentor a cross-functional team, driving team planning, resource allocation, and workload prioritization to meet project and business goals.
- Promote accountability and continuous improvement through effective goal-setting and coaching.
- Collaborate with senior leadership on workforce planning, organizational design, and succession planning.
- Maintain and manage master data records in SAP MDG, executing data creation, change, and deletion processes for key domains.
- Validate incoming requests for completeness, accuracy, and compliance with data standards and business rules.
- Monitor data quality, resolve data-related issues, and implement continuous improvements in data accuracy and workflows.
- Support integration and synchronization of master data between SAP MDG and downstream systems.
- Generate and analyze data quality reports, identify anomalies, and participate in testing and deployment activities for SAP MDG upgrades and enhancements.

Required Qualifications:
- Bachelor's degree in Information Systems, Business Administration, or a related field.
- 6 to 8 years of experience working with SAP MDG in a master data maintenance or governance role.
- Strong knowledge of SAP master data objects such as Material Master, Vendor, Customer, Finance Master, etc.
- Familiarity with SAP MDG workflows, validations, derivations, and data models.
- Experience working with data governance policies, standards, and stewardship practices.
- High attention to detail, commitment to data accuracy, and strong communication and organizational skills.

Preferred Qualifications:
- Experience with SAP S/4HANA and integration with SAP MDG.
- Exposure to data migration or data cleansing projects.
- Understanding of data quality tools and reporting, such as SAP Information Steward and MDG analytics.
- Certification in SAP MDG or relevant data management programs.

What We Offer:
- Opportunity to work in a key role supporting enterprise-wide data quality.
- Collaborative and supportive team environment.
- Training and development in SAP MDG and enterprise data management practices.
- Competitive salary, benefits, and career growth opportunities.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
thane, maharashtra
On-site
The BI Data Engineer role is a crucial position within the Enterprise Data team, seeking an expert Azure data engineer with profound experience in data engineering, ADF integration, and database development. This role offers a unique opportunity to contribute to delivering cutting-edge business analytics using advanced BI tools, including cloud-based databases, self-service analytics, and leading visualization tools, in line with the company's vision to become a fully digital organization.

As a BI Data Engineer, your key responsibilities will include building enterprise data engineering and integration solutions utilizing the latest Azure platform components such as Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric. You will be involved in the development of enterprise ETL and integration routines using ADF, as well as evaluating emerging data engineering technologies, standards, and capabilities. Collaborating with business stakeholders, product managers, and data scientists to comprehend business objectives and translate them into technical solutions will be a crucial aspect of this role. Additionally, working closely with DevOps, engineering, and operations teams to implement CI/CD pipelines and ensure seamless deployment of data engineering solutions will be part of your responsibilities.

In terms of required skills and experience, the ideal candidate should possess technical expertise in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric. Exposure to Databricks and lakehouse architecture and technologies is desirable, along with extensive knowledge of data modeling, ETL processes, and data warehouse design principles. Experience in machine learning and AI services in Azure will be beneficial.

The successful candidate should have a minimum of 5 years of experience in database development using SQL, 5+ years of integration and data engineering experience, and 5+ years of experience using Azure SQL DB, ADF, and Azure Synapse. Additionally, 2+ years of experience using Power BI, a comprehensive understanding of data modeling, and relevant certifications in data engineering, machine learning, and AI are expected.

Key competencies for this role include expertise in data engineering and database development, familiarity with Microsoft Fabric technologies, a strong understanding of data governance, compliance, and security frameworks, and the ability to drive innovation in data strategy and cloud solutions. Proficiency in business intelligence workflows, strong database design skills, and experience with cloud-based data integration tools like Azure Data Factory are essential. Experience with Azure DevOps or JIRA, working with finance data, and familiarity with agile development techniques and objectives are advantageous.

This full-time permanent position is located at DGS India - Mumbai - Thane Ashar IT Park under the brand Dentsu.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
noida, uttar pradesh
On-site
We are searching for a highly skilled and seasoned Senior ETL & Data Streaming Engineer with over 10 years of experience to take on a crucial role in the design, development, and maintenance of our robust data pipelines. The ideal candidate will possess in-depth expertise in batch ETL processes as well as real-time data streaming technologies, along with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is a must.

Your responsibilities will include designing, developing, and implementing highly scalable, fault-tolerant, and performant ETL processes using leading ETL tools to extract, transform, and load data from diverse source systems into our Data Lake and Data Warehouse. You will also be tasked with architecting and constructing batch and real-time data streaming solutions using technologies such as Talend, Informatica, Apache Kafka, or AWS Kinesis to facilitate immediate data ingestion and processing requirements. Furthermore, you will need to leverage and optimize various AWS data services such as AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others to develop and manage data pipelines.

Collaboration with data architects, data scientists, and business stakeholders to comprehend data requirements and translate them into efficient data pipeline solutions is a key aspect of the role. It will also be essential for you to ensure data quality, integrity, and security across all data pipelines and storage solutions, as well as monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Additionally, you will be responsible for developing and maintaining comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs, and implementing data governance policies and best practices within the Data Lake and Data Warehouse environments.

As a mentor to junior engineers, you will contribute to fostering a culture of technical excellence and continuous improvement. Staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming will also be expected.

Required Qualifications:
- 10+ years of progressive experience in data engineering, focusing on ETL, ELT, and data pipeline development.
- Extensive hands-on experience with commercial or open-source ETL tools (e.g., Talend).
- Proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Proficiency with AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, and potentially AWS EMR.
- Strong background in traditional data warehousing concepts, dimensional modeling, and DWH design principles.
- Proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Strong understanding of relational databases and NoSQL databases.
- Experience with version control systems (e.g., Git).
- Excellent analytical and problem-solving skills with attention to detail.
- Strong verbal and written communication skills for conveying complex technical concepts to diverse audiences.

Preferred Qualifications:
- Certifications in AWS Data Analytics or related areas.
Posted 2 weeks ago
5.0 - 10.0 years
0 - 0 Lacs
karnataka
On-site
The Data Science Manager - AI Governance & Innovation role at Tesco involves leading a highly skilled team of Data Scientists to gain a better understanding of the business and optimize processes using available data. The Data Science team at Tesco focuses on modeling complex business problems and deploying data products across various areas such as physical stores, online platforms, supply chain, marketing, and Clubcard. There is an emphasis on rotation among Data Scientists to gain expertise in different subjects, with opportunities for learning and personal development. The team works on domains including online, pricing, security, fulfillment, distribution, property, IoT, and computer vision. Collaboration with academic institutions enriches the team's expertise, while regular knowledge sharing events foster a culture of continuous learning. The work environment at Tesco promotes work-life balance, team-building activities, and a relaxed yet engaging culture.

As a Data Science Manager, your responsibilities include owning the innovation process across business and technical domains, implementing AI governance frameworks to ensure ethical and responsible practices, and ensuring compliance with relevant regulations and standards. You will lead the team in framing and scoping Data Science problems, line manage Data Scientists, support and mentor team members, and define the strategic direction for the overall team. Collaborating with legal and technical teams, conducting risk assessments, and educating employees on AI policies are also part of the role. You will partner with Engineering to design complex software systems, promote data science within Tesco, and engage with the external Data Science community.

Skills required for the role include a strong numerical higher degree in a relevant discipline, expertise in Python and Machine Learning, experience in Software Engineering best practices, familiarity with big-data technologies and cloud platforms, and knowledge of AI governance principles. Strong leadership skills, experience in managing high-performing Data Science teams, driving innovation projects, and collaborating with academic institutions are also essential. A background in the retail sector, logistics, or ecommerce is advantageous but not mandatory.

The benefits offered at Tesco include 30 days of leave, retirement benefits, health insurance for colleagues and their families, mental health support, financial wellbeing programs, physical wellbeing facilities on campus, and opportunities to become Tesco shareholders through savings plans. The work environment in Tesco Bengaluru aims to create a sustainable competitive advantage by standardizing processes, delivering cost savings, and empowering colleagues to serve customers better. The Technology team at Tesco comprises over 5,000 experts globally, with dedicated teams in Engineering, Product, Programme, Service Desk, Operations, Systems Engineering, Security & Capability, Data Science, and other roles.
Posted 2 weeks ago