
4996 Data Governance Jobs - Page 38

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 8.0 years

9 - 10 Lacs

Chennai

Work from Office

As a Data Scientist at Demos Project, you will transform data into actionable insights that directly influence project strategies and outcomes. This role involves working with complex datasets, deploying machine learning models, and creating data pipelines that ensure the accuracy and usability of insights. We are seeking immediate hires for this role, as data analytics is a cornerstone of our ongoing projects.

Responsibilities:
- Data Analysis: Extract and clean data from various sources, ensuring integrity and readiness for analysis.
- Model Development: Build machine learning models to predict trends, assess risks, and optimize operations.
- Dashboard Creation: Design interactive dashboards to visualize key metrics and trends for stakeholders.
- Collaboration: Work closely with project managers and research associates to align data insights with strategic goals.
- Data Governance: Establish protocols to maintain data quality and security.

Preferred Qualifications:
- Master's degree in Data Science, Computer Science, or related disciplines.
- Proficiency in Python, R, SQL, and visualization tools such as Tableau or Power BI.
- Prior experience in predictive analytics and data storytelling.

Minimum Qualifications:
- Bachelor's degree from a Tier-1 institution.
- 3+ years of experience in a data-driven role.
- Foundational knowledge of ML algorithms and data engineering.
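As a rough illustration of the "extract and clean" responsibility above, here is a minimal sketch of a deduplication-and-normalization pass; the record layout and field names are invented for the example, not taken from the posting:

```python
from datetime import datetime

# Hypothetical raw records pulled from two sources; field names are illustrative.
raw_records = [
    {"id": "A1", "revenue": "1,200.50", "date": "2024-03-01"},
    {"id": "A1", "revenue": "1,200.50", "date": "2024-03-01"},  # duplicate row
    {"id": "A2", "revenue": None, "date": "2024-03-02"},        # missing value
]

def clean(records):
    """Deduplicate on (id, date), drop rows missing revenue, normalize types."""
    seen, out = set(), []
    for r in records:
        key = (r["id"], r["date"])
        if key in seen or r["revenue"] is None:
            continue
        seen.add(key)
        out.append({
            "id": r["id"],
            "revenue": float(r["revenue"].replace(",", "")),
            "date": datetime.strptime(r["date"], "%Y-%m-%d").date(),
        })
    return out

cleaned = clean(raw_records)
print(cleaned)  # one valid, deduplicated row survives
```

In practice this kind of logic would live in a pandas or Spark pipeline, but the shape of the work (dedupe, null handling, type normalization) is the same.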

Posted 2 weeks ago

Apply

10.0 - 15.0 years

60 - 100 Lacs

Bengaluru

Hybrid

Role: Senior Engineering Manager, Data Platforms
Reference Code: HR1175636213662955
Experience: 10-15 years
Salary: Confidential (based on experience)
Opportunity Type: Hybrid (Bengaluru)
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients.) Candidates for this position are preferred to be based in Bangalore, India, and will be expected to comply with their team's hybrid work schedule requirements.

About the Role: Our client runs the largest custom e-commerce large-parcel network in the United States, with approximately 1.6 million square meters of logistics space. The network is inherently a highly variable ecosystem that requires flexible, reliable, and resilient systems to operate efficiently. The Data Services & Data Enablement team is looking for smart, passionate, and curious people who are excited to help scale, support, and engineer our database, distributed analytics, and streaming infrastructure. Given the broad reach of the technologies in use, you will have the opportunity to grow your network and skills through exposure to new people and ideas across a diverse set of cutting-edge technologies. If you are fascinated by engineering extremely large and diverse data systems, and passionate about troubleshooting challenging technical problems in a rapidly innovating cloud environment, you could be a great fit.

What You'll Do:
- Play a key role in developing and driving a multi-year technology strategy for a complex platform.
- Lead multiple software development teams, architecting solutions at scale to empower the business and owning all aspects of the SDLC: design, build, deliver, and maintain.
- Directly and indirectly manage several software engineers, providing coaching, guidance, and mentorship to grow the team as well as individuals.
- Inspire, coach, mentor, and support your team members in their day-to-day work and their long-term professional growth.
- Attract, onboard, develop, and retain diverse top talent while fostering an inclusive and collaborative team culture.
- Lead your team and peers by example; as a senior member of the team, your methodologies, technical and operational excellence practices, and system designs will help continuously improve our domain.
- Identify, propose, and drive initiatives to advance the technical skills, standards, practices, architecture, and documentation of our engineering teams.
- Facilitate technical debate and decision-making with an appreciation for trade-offs.
- Continuously rethink and push the status quo, even when it challenges established ideas.

What You'll Need:
- A results-oriented, collaborative, pragmatic, and continuous-improvement mindset.
- 10+ years of engineering experience, of which at least 5-6 years were spent leading high-performing teams.
- Experience developing new applications using technologies such as Python, Java, or Go.
- Experience making architectural and design decisions for large-scale platforms, understanding the trade-offs between time-to-market and flexibility.
- Significant experience in, and a vocation for, managing and enabling people's growth and performance.
- Practical experience hiring and developing engineering teams and culture, and leading interdisciplinary teams in a fast-paced agile environment.
- The ability to communicate and collaborate across the wider organization, influencing decisions with and without direct authority, always with inclusive, adaptable, and persuasive communication.
- Analytical and decision-making skills that integrate technical and business requirements.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Summary: We're looking for a motivated and detail-oriented Senior Snowflake Developer with strong SQL querying skills and a willingness to learn and grow with our team. As a Senior Snowflake Developer, you will play a key role in developing and maintaining our Snowflake data platform, working closely with our data engineering and analytics teams.

Responsibilities:
- Write optimized, performant ingestion pipelines
- Manage a team of junior software developers
- Write efficient and scalable SQL queries to support data analytics and reporting
- Collaborate with data engineers, architects, and analysts to design and implement data pipelines and workflows
- Troubleshoot and resolve data-related issues and errors
- Conduct code reviews and contribute to the improvement of our Snowflake development standards
- Stay up to date with the latest Snowflake features and best practices

Requirements:
- 4+ years of experience with Snowflake
- Strong SQL querying skills, including data modeling, data warehousing, and ETL/ELT design
- Advanced understanding of data engineering principles and practices
- Familiarity with Informatica Intelligent Cloud Services (IICS) or similar data integration tools is a plus
- Excellent problem-solving skills, attention to detail, and an analytical mindset
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams

Nice to Have:
- Experience using Snowflake Streamlit and Cortex
- Knowledge of data governance, data quality, and data security best practices
- Familiarity with Agile development methodologies and version control systems like Git
- Certification in Snowflake or a related data platform is a plus
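To give a flavor of the SQL work this role describes, here is a small sketch that generates an idempotent MERGE (upsert) statement of the kind commonly used in Snowflake ingestion pipelines. The table and column names are hypothetical, chosen only for illustration:

```python
def build_merge_sql(target: str, staging: str, key: str, cols: list) -> str:
    """Build a MERGE statement that upserts staged rows into a target table.

    Re-running the same MERGE is idempotent: matched rows are updated,
    unmatched rows are inserted, and nothing is duplicated.
    """
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    val_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )

sql = build_merge_sql("analytics.orders", "staging.orders",
                      "order_id", ["status", "amount"])
print(sql)
```

In a real pipeline the generated statement would be executed through the Snowflake connector against properly quoted, validated identifiers; this sketch only shows the shape of the upsert pattern.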

Posted 2 weeks ago

Apply

4.0 - 6.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are looking for an experienced Ataccama Developer who can:
- Design and implement Ataccama MDM workflows, match/merge, and survivorship rules
- Drive data profiling, cleansing, and enrichment
- Collaborate with stakeholders to deliver trusted master data across the enterprise

What we need:
- 4+ years of IT experience
- Strong knowledge of MDM, Data Governance, and Data Quality
- SQL and data integration expertise (ETL, APIs, Cloud)

Location: Bangalore

If you're ready to take on a high-impact data management role and work with a team that values innovation and quality, we'd love to connect! Apply here: surajit.k@kaygen.com

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us for a role in "CCO Functions" at Barclays, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As an AVP Controls Assurance at Barclays, you'll need at least 7 years of experience in Controls Assurance/Testing, along with knowledge of applying data analytics techniques. You should also possess knowledge of principal risks such as data governance, data lineage, data quality, records management, people risk, supplier risk, premises, etc. A minimum educational qualification of Graduate or equivalent is required for this role.

Your responsibilities will include developing detailed test plans, identifying weaknesses in internal controls, and communicating key findings to relevant stakeholders. You will collaborate across the bank to maintain a robust and efficient control environment and provide advice on improvements to enhance the bank's internal controls framework.

In addition to the essential requirements, some highly valued skills for this role may include relevant professional certifications (CA, CIA, CS, MBA), knowledge of process re-engineering methodologies such as LEAN/DMAIC/Value Mapping, experience in the financial services industry, and proficiency in project management and change management. You may be assessed on key critical skills relevant for success in the role, including risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.

As an AVP Controls Assurance, you will play a crucial role in advising and influencing decision-making, contributing to policy development, and ensuring operational effectiveness. You will collaborate closely with other functions and business divisions and lead a team to deliver work that impacts the entire business function. Moreover, you will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive in your daily interactions and decision-making processes.

Location: Pune

Purpose of the role: To partner with the bank in providing independent assurance on control processes and advising on improvements to enhance the efficiency and effectiveness of the bank's internal controls framework.

Key Responsibilities:
- Collaborate across the bank to maintain a satisfactory, robust, and efficient control environment.
- Develop detailed test plans and procedures to identify weaknesses in internal controls.
- Communicate key findings and observations to relevant stakeholders.
- Develop a knowledge center containing detailed documentation of control assessments and testing.

Assistant Vice President Expectations:
- Advise and influence decision-making and contribute to policy development.
- Lead a team performing complex tasks and set objectives for employees.
- Demonstrate leadership behaviors to create an environment in which colleagues can thrive.
- Engage in complex analysis of data to solve problems effectively.
- Demonstrate the Barclays Values and Mindset in everyday actions.

In summary, as an AVP Controls Assurance at Barclays, you will have the opportunity to drive innovation, enhance customer experiences, and play a key role in maintaining a robust control environment across the bank.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

You will be responsible for leading the end-to-end implementation of a data cataloging solution within AWS, with a preference for tools like AWS Glue Data Catalog or third-party options such as Apache Atlas, Alation, or Collibra. Your role will involve establishing and managing metadata frameworks for both structured and unstructured data assets in data lake and data warehouse environments. It will also require you to integrate the data catalog with various AWS-based storage solutions like S3, Redshift, Athena, Glue, and EMR.

Collaboration with data governance/BPRG/IT project teams is essential to define metadata standards, data classifications, and stewardship processes. You will need to develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using tools like Python, Lambda, PySpark, or custom jobs in Glue/EMR. Working closely with data engineers, data architects, and analysts will be necessary to ensure that metadata is accurate, relevant, and up to date. Implementing role-based access controls and ensuring compliance with data privacy and regulatory standards will also be part of your responsibilities. Furthermore, you will be expected to create detailed documentation and conduct training and workshops for internal stakeholders on how to use the data catalog effectively.

Requirements:
- 7+ years of experience in data engineering or metadata management roles
- Proven expertise in implementing and managing data catalog solutions within AWS environments
- Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation
- Hands-on experience with metadata ingestion, data lineage, and classification processes
- Proficiency in Python, SQL, and automation scripting for metadata pipelines
- Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines)
- Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus
- Strong communication, problem-solving, and stakeholder management skills

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect)
- Experience with data catalog tools like Alation, Collibra, or Informatica EDC, or hands-on experience with open-source tools
- Exposure to data quality frameworks and stewardship practices
- Knowledge of data migration with data catalogs and data marts is a plus

This is a full-time, day-shift, in-person position.
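To illustrate the catalog and lineage concepts this listing describes, here is a toy in-memory metadata catalog in plain Python. Real implementations would use the AWS Glue Data Catalog or a tool like Collibra; the asset names and fields below are invented for the example:

```python
# Toy metadata catalog: assets carry a classification and owner, and lineage
# edges record which upstream assets each downstream asset is derived from.
class Catalog:
    def __init__(self):
        self.assets = {}   # asset name -> metadata dict
        self.lineage = {}  # asset name -> set of direct upstream asset names

    def register(self, name, classification, owner):
        self.assets[name] = {"classification": classification, "owner": owner}
        self.lineage.setdefault(name, set())

    def add_lineage(self, downstream, upstream):
        self.lineage[downstream].add(upstream)

    def upstream_of(self, name):
        """Return all transitive upstream dependencies of an asset."""
        seen, stack = set(), list(self.lineage.get(name, ()))
        while stack:
            u = stack.pop()
            if u not in seen:
                seen.add(u)
                stack.extend(self.lineage.get(u, ()))
        return seen

cat = Catalog()
cat.register("s3://lake/raw/orders", "internal", "data-eng")
cat.register("redshift.analytics.orders", "confidential", "data-eng")
cat.add_lineage("redshift.analytics.orders", "s3://lake/raw/orders")
print(cat.upstream_of("redshift.analytics.orders"))
```

The transitive `upstream_of` walk is the core of impact analysis: given an asset, find every source it depends on so that quality issues and classification requirements can be traced back.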

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Lead Data Engineer / Data Integration Lead at NotionMindz Technology LLP, you will be at the forefront of our data initiatives, responsible for designing, developing, and leading the implementation of robust data integration solutions, data warehousing operations, and reporting frameworks. Your expertise in SQL, RDBMS (especially SQL Server), and modern ETL/ELT tools like Azure Data Factory and Databricks is crucial to the success of our projects. Your role will involve translating complex business requirements into tangible data solutions, guiding a team of data developers and testers, and engaging with clients to ensure project success. Your deep understanding of project dynamics, data architecture, and reporting tools will be essential in driving insights and business value for our clients.

Your key responsibilities will include:

Data Architecture & Development:
- Demonstrating expert-level knowledge of RDBMS, particularly SQL Server, and proficiency in complex SQL query writing, object management, and performance optimization.
- Applying a strong understanding of transactional and dimensional data modeling concepts, including star schemas, facts, dimensions, and their relationships.
- Designing, building, and optimizing scalable data pipelines using Azure Data Factory and Azure Databricks, orchestrating complex data workflows.
- Leveraging Spark-based data transformations within Databricks for large-scale data integrations.

Reporting & Data Analysis:
- Possessing expert-level knowledge of Microsoft reporting/visualization tools like Power BI and Azure Analysis Services.
- Performing in-depth data analysis to troubleshoot missing data, identify data quality issues, and suggest value-added metrics for enhanced reporting insights.

Project Leadership & Client Engagement:
- Fully comprehending project context and developing a clear vision for project execution.
- Acting as the primary interface with clients, managing expectations, and ensuring clear communication.
- Providing consultative advice to clients on best practices for data visualization and reporting.

Team Leadership & Quality Assurance:
- Leading a team of data developers and testers, providing guidance, mentorship, and support.
- Performing design and code reviews to ensure adherence to standards and best practices.
- Writing and reviewing comprehensive test cases to ensure accuracy and quality.

SDLC & DevOps Practices:
- Understanding Software Development Life Cycle practices, source control, and version management using Azure DevOps.
- Familiarity with Continuous Integration/Continuous Deployment practices for automated deployment of data pipelines.

Required Qualifications & Skills:
- 8-12+ years of experience in Data Engineering, Data Warehousing, or Data Integration roles.
- Expertise in RDBMS (SQL Server), SQL query writing, object management, and performance optimization.
- Strong understanding of ETL concepts and experience with SSIS, Azure Data Factory, Azure Databricks, or Airflow.
- In-depth knowledge of Azure Data Factory, Azure Databricks, and Databricks Unity Catalog.
- Proficiency in Power BI and/or Azure Analysis Services.
- Proven experience in end-to-end ETL or reporting projects.
- Strong analytical and leadership skills with excellent communication abilities.

Preferred Skills:
- Knowledge of the Python programming language.
- Hands-on experience with Azure DevOps for source control, CI/CD, and project management.

Join us at NotionMindz Technology LLP and be a part of our dynamic and innovative team dedicated to delivering cutting-edge data solutions to our clients.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As the Director of Software Engineering at Ethoca, a Mastercard company, you will be an integral part of a global, collaboration-based technology services provider focused on connecting card issuers and merchants to combat fraud, enhance customer experience, and prevent disputes. Your role will involve leading a highly agile team in building exciting and innovative products delivered at scale to global markets.

Your primary responsibilities will include overseeing and facilitating the growth of a high-performing team of software engineers. You will be involved in hiring, mentoring, coaching, conducting performance evaluations, motivating and retaining staff, and providing status updates to senior management. Additionally, you will provide hands-on technical leadership and mentorship to the engineering team, working closely with product teams to understand requirements, propose solutions, and offer technical thought leadership.

You will actively contribute to the agile process and decision-making within the team, driving prioritization decisions and trade-offs in collaboration with product partners. Your role will involve bringing multiple engineers and teams together to achieve the overall objectives of proposed solutions. Furthermore, you will engage engineers across the Technology organization to promote standard software patterns and the reuse of common libraries and services.

As the Director of Software Engineering, you will ensure adherence to Mastercard's corporate standards, including coding, security, and data governance standards. Your responsibilities will also include fostering a creative atmosphere, challenging norms to inspire innovation, and implementing effective metrics, controls, and reporting to measure the progress, productivity, and quality of the team's output. Additionally, you will research, evaluate, and implement new technologies and tools in partnership with enterprise architects to align with existing products, platforms, emerging business needs, and engineering best practices.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, with over 8 years of experience in software solutions delivery. You should have managed one or more teams of 10+ members and possess strong communication skills, including experience preparing and delivering executive-level presentations. Your ability to interact across multiple organizational levels and set direction for large or complex projects will be crucial.

Moreover, you should have a good understanding of data governance and regulations impacting data storage and processing solutions, along with experience building highly resilient and scalable distributed systems. Proficiency in various database, messaging, and caching technologies, as well as knowledge of cryptography functions and secure coding practices, will be essential. Experience with CI/CD, automation, virtualization, and containerization tools, as well as familiarity with web security vulnerabilities, will also be beneficial.

This hybrid position may require you to work both remotely and in the office, with specific expectations to be confirmed by your hiring manager. If you are passionate about technology, innovation, and leading high-performing teams to deliver impactful products, this role offers a dynamic and challenging opportunity for you to thrive.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

The Senior Engineer for Data Management in Private Bank role in Pune, India, involves working with the Data Governance and Architecture team to drive data management initiatives in collaboration with the Divisional Data Office for Private Bank. Responsibilities include assigning data roles, managing data flow documentation, aligning data requirements between consumers and producers, reporting data quality, and coordinating data delivery. The role also entails supporting colleagues in optimizing Deutsche Bank's Data Policy and associated processes, from project planning to senior management reporting. Regulatory compliance and leveraging data for business benefits are key aspects of the role.

Key Responsibilities:
- Establish and maintain Private Bank's contribution to Deutsche Bank's Enterprise Logical and Physical Data models
- Understand and translate requirements from risk, finance, treasury, and regulatory reporting functions into data models
- Co-own Private Bank's relevant data models within Deutsche Bank's framework
- Support Private Bank experts in delivering relevant data
- Optimize requirements management and modeling processes in collaboration with the group Chief Data Office and Private Bank stakeholders
- Align tasks with team and Private Bank Data Council priorities

Skills and Experience:
- In-depth understanding of how data quality impacts processes in the retail banking sector
- Hands-on experience with data modeling in the financial industry
- Extensive knowledge of data architecture and challenges in data provisioning
- Project and stakeholder management skills
- Ability to collaborate effectively with diverse teams globally
- Fluency in English

Benefits:
- Best-in-class leave policy
- Gender-neutral parental leave
- Childcare assistance benefit reimbursement
- Sponsorship for industry certifications and education
- Employee Assistance Program
- Comprehensive insurance coverage
- Health screening for employees above 35 years

Support:
- Training and development opportunities
- Coaching from team experts
- A culture of continuous learning
- Flexible benefits tailored to individual needs

Deutsche Bank promotes a culture of empowerment, responsibility, commercial thinking, and collaboration. The organization values inclusivity and diversity in the workplace. For more information about Deutsche Bank, visit https://www.db.com/company/company.htm.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

In the Global Assessments & People Analytics & Insights group at Citigroup, you will work on solving business problems using workforce and business data. Your role involves conducting analyses to enable evidence-based decision-making, utilizing statistical models to predict future outcomes, and providing actionable insights. The mission of the team is to equip decision-makers with data-driven insights through analytics to make informed decisions about individuals, teams, and organizations to enhance business performance.

As the Lead Data Product Manager, you will bridge the gap between technical data science teams, business stakeholders, and enterprise data consumers. Your primary responsibility will be to drive impactful decision-making through scalable, user-centric data products. This includes owning the lifecycle of internal data tools, regulatory dashboards, and advanced analytics products supporting areas such as workforce planning, talent insights, and organizational health monitoring.

Your key responsibilities will include:
- Defining and driving the data product vision, strategy, and roadmap aligned with business priorities across HR, compliance, and leadership teams.
- Managing a portfolio of internal data and AI products from concept through launch and iteration, focusing on usability, scalability, and adoption.
- Collaborating with cross-functional teams to translate business needs into high-impact solutions.
- Developing and maintaining product documentation, use cases, and KPIs to measure effectiveness and ensure product success.
- Promoting a product mindset within the data function, balancing technical feasibility, business value, and regulatory compliance.
- Leading and mentoring a team of product managers and analysts, fostering a culture of innovation, ownership, and customer focus.
- Managing stakeholder communications, demos, and training to drive smooth adoption and value realization across the organization.
- Staying updated on emerging trends in data, AI, and workforce technology to continuously enhance product offerings.

Qualifications required for this role:
- 10+ years of experience in product management, data analytics, or technology strategy, with at least 3 years in a leadership role.
- Proven track record of launching successful internal or B2B data products, preferably in enterprise or regulated environments.
- Strong understanding of data governance, user experience, and the AI/ML product development lifecycle.
- Exceptional stakeholder management and communication skills, with the ability to influence and align across all levels of the organization.
- Experience leading cross-functional teams, including data scientists, engineers, and business partners.
- Familiarity with people data, HR systems (e.g., Workday, SuccessFactors), and regulatory frameworks is a plus.
- Bachelor's degree in a quantitative field or business; MBA or equivalent advanced degree preferred.

Should you require a reasonable accommodation due to a disability to use our search tools and/or apply for a career opportunity, please review Accessibility at Citi. For further information, you can view Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

ANSR's client, ArcelorMittal, is a renowned global steel and mining company with a presence in over 60 countries and a workforce of 158,000+ dedicated individuals. At ArcelorMittal, innovation and sustainability are at the core of our operations as we strive to lead transformative change in the steel manufacturing industry.

As an Analyst - Master Data Maintenance Specialist specializing in SAP MDG within the D&IT Data department, you will play a crucial role in ensuring the accuracy and integrity of master data across domains such as Customer, Vendor, Material, Finance, and Organizational data. Working closely with business stakeholders, data stewards, and IT teams, you will be responsible for maintaining high-quality master data in alignment with governance standards and policies.

Key responsibilities include:
- Managing master data records in SAP MDG and executing data creation and modification processes
- Validating data requests for accuracy and compliance
- Monitoring data quality and collaborating with data owners on continuous improvements
- Supporting data integration with downstream systems
- Generating data quality reports and analyzing anomalies
- Participating in testing and deployment activities for SAP MDG upgrades and enhancements

To qualify for this role, you should possess a Bachelor's degree in Information Systems, Business Administration, or a related field, along with 2 to 4 years of experience in SAP MDG. Strong knowledge of SAP master data objects, familiarity with SAP MDG workflows and data models, and experience with data governance practices are essential. Attention to detail, communication skills, and organizational abilities are key attributes for success in this role. Preferred qualifications include experience with SAP S/4HANA, exposure to data migration projects, and certification in SAP MDG or relevant data management programs.

In return, we offer the opportunity to work in a pivotal role supporting enterprise-wide data quality, a collaborative team environment, training in SAP MDG and data management practices, a competitive salary and benefits, and prospects for career growth. Join us at ArcelorMittal and be a part of our journey towards building a better world with smarter low-carbon steel.
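The "validating data requests for accuracy and compliance" duty above can be pictured as a rule-based check that runs before a master data record is created. The sketch below is purely illustrative: the fields and rules are hypothetical, not SAP MDG's actual data model:

```python
# Hypothetical governance rules per master data domain.
RULES = {
    "vendor": {
        "required": ["name", "country", "tax_id"],
        "country_codes": {"IN", "US", "DE", "FR"},
    }
}

def validate(domain, record):
    """Return a list of validation errors; an empty list means the record passes."""
    rules = RULES[domain]
    errors = [f"missing field: {f}" for f in rules["required"] if not record.get(f)]
    if record.get("country") and record["country"] not in rules["country_codes"]:
        errors.append(f"unknown country code: {record['country']}")
    return errors

errs = validate("vendor", {"name": "Acme Steel", "country": "IN"})
print(errs)  # the tax_id field is missing
```

In SAP MDG itself, checks like these are configured as validation rules inside change-request workflows rather than written as free-standing code; the sketch only conveys the idea.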

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an SAP BW Consultant with CGI, you will be responsible for designing and developing data models, implementing ETL processes, and optimizing reporting and analytics. Your role will involve ensuring seamless data integration with SAP and non-SAP systems, enhancing performance, and working with SAP BW on HANA/S/4HANA. Additionally, you will manage data governance, conduct testing, provide support, and create documentation and training materials for business users.

Your main tasks will include:
- Designing and developing data models using SAP BW
- Extracting, transforming, and loading data from SAP and non-SAP sources into SAP BW
- Developing reports and dashboards, optimizing query performance, and enhancing data-loading processes for efficient reporting
- Working on SAP BW on HANA/S/4HANA, ensuring seamless data integration with various SAP modules
- Maintaining data governance and security, and conducting testing to validate data accuracy
- Providing ongoing support, troubleshooting, and enhancements for BW solutions
- Creating technical documentation and user guides, and conducting training for business users

In this role, experience with SAP BW on HANA and S/4HANA, SAP BusinessObjects (BO) and SAC, HANA native modeling, BW/4HANA migration, and an Agile way of working will be beneficial. You will work in a collaborative environment that values ownership, teamwork, respect, and belonging. At CGI, you will be encouraged to turn meaningful insights into action and contribute to the company's success as a CGI Partner. You will have the support of leaders who prioritize your health and well-being and provide opportunities for skills development and career growth. If you are looking to join one of the largest IT and business consulting services firms globally, CGI offers a dynamic and rewarding environment where you can reach your full potential and shape your career while making a meaningful impact.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Solution Architect at Services, a leading provider of cash management, treasury, trade, clearing, depository receipts, and commercial cards services with a global network spanning over 100 countries, you will play a crucial role in the Core Banking Technology group. This group is currently undergoing a multi-year large-scale transformation to enhance Core Accounts Technology by addressing eCommerce-scale growth programs and retiring legacy infrastructure while building a next-generation set of capabilities to meet evolving client, market, and regulatory needs globally. In this role, you will be responsible for re-engineering the interaction of data flows within the Core Accounts DDA platform, Reference Data platforms, Data Warehouse, Data Lake, and local reporting systems. You will drive the data architecture and roadmap to eliminate non-strategic connections, define canonical data models, assess opportunities for simplification, provide technical guidance to Data Engineers, and formulate strategies for rationalizing and migrating reports. 
Key Responsibilities:
- Re-engineer data flows from the Core Accounts DDA platform to various systems
- Drive the data architecture and roadmap to eliminate non-strategic connections
- Define canonical data models for key entities and events
- Provide technical guidance to Data Engineers for designing data stores
- Rationalize and simplify existing database schemas
- Collaborate with Product Owners to translate User Stories into technical requirements
- Act as a Subject Matter Expert (SME) for senior stakeholders
- Monitor and control all phases of the development process
- Provide user and operational support on applications to business users

Qualifications:
- Experience in data modeling, data lineage analysis, and operational reporting
- Proven architecture experience in scalable and resilient data platforms
- Proficiency in message queuing, stream processing, and big data stores
- Strong SQL knowledge and experience with relational databases
- Experience with data integration patterns and real-time streaming
- Background in Data Management, Data Governance, and Transformation initiatives
- Familiarity with industry Core Banking Platforms such as Flexcube, PISMO
- 6+ years of relevant industry experience

If you are looking for an opportunity to be part of a transformative journey in global banking technology and possess the required qualifications and experience, we encourage you to apply for this role at Services.
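To make the "canonical data model" responsibility concrete, here is a minimal Python sketch of mapping a legacy record into one agreed canonical shape before it flows downstream. The entity, field names, and defaults are illustrative assumptions, not the platform's actual schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical canonical model for a core "Account" entity: every upstream
# system maps its local representation into this single agreed shape before
# data flows on to the warehouse, lake, or reporting systems.
@dataclass(frozen=True)
class CanonicalAccount:
    account_id: str          # globally unique key (assumed to come from the DDA platform)
    branch_code: str
    currency: str            # ISO 4217 code, normalized to uppercase
    opened_on: date
    status: str = "ACTIVE"

def from_legacy(record: dict) -> CanonicalAccount:
    """Map one legacy-system record into the canonical form."""
    return CanonicalAccount(
        account_id=record["acct_no"].strip(),
        branch_code=record["br"],
        currency=record.get("ccy", "USD").upper(),
        opened_on=date.fromisoformat(record["open_dt"]),
        status=record.get("sts", "ACTIVE"),
    )

legacy = {"acct_no": " 0012345 ", "br": "NYC01", "ccy": "usd", "open_dt": "2019-04-01"}
print(from_legacy(legacy).currency)  # normalized to "USD"
```

The value of a canonical model is exactly this normalization step: every consumer sees one cleaned, typed representation instead of each source system's quirks.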

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

We are seeking an experienced and innovative leader to oversee our Business Intelligence (BI) team as the Associate Director of BI. In this role, you will play a crucial part in delivering cutting-edge data-driven solutions and fostering growth and excellence within our practice. We are looking for a visionary leader who can effectively blend technical expertise with strategic execution to drive significant impact across the organization, especially in supporting the go-to-market functions of Wolters Kluwer's FCC Division.

Your responsibilities will include providing strategic leadership and guidance to BI managers and senior professionals, covering BI, data governance and stewardship, and data engineering practices. You will oversee the development and implementation of advanced analytical models, data analysis, reporting solutions, and data pipelines, and create visualizations with insightful findings to facilitate informed decision-making.

As the Associate Director of BI, you will oversee the end-to-end lifecycle of BI programs to ensure they align with the sales center of excellence and go-to-market initiatives. You will drive innovation in data analysis techniques and BI methodologies to enhance sales strategies and market positioning, provide expert advice on functional and technical design decisions, and manage staffing, budgeting, planning, and day-to-day execution of initiatives across BI and data management areas.

Collaboration with various functional and technical teams within the FCC division will be a key aspect of your role. You will act as the main liaison for BI programs delivered from our Pune location, ensuring they meet the requirements of the sales center of excellence. In terms of talent development, you will focus on fostering a positive and inclusive workplace culture within the BI team.
You will implement talent management strategies to maintain a strong talent pipeline, and initiate coaching programs and leadership development initiatives to enhance the team's capabilities. Working closely with functional leaders, you will set clear goals and expectations for teams and individuals, monitor team performance, and address any challenges promptly and constructively.

Qualifications:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience: 10+ years of program delivery experience within a technology function or data and analytics practice area, with a proven track record of leadership in technology services delivery; knowledge of Business Intelligence and Data Analytics operations, especially in supporting sales and marketing functions.
- Skills: Outstanding communication, interpersonal, and leadership skills; the ability to inspire and motivate diverse teams; expertise in talent acquisition and workforce development; strategic problem-solving and decision-making capabilities; proficiency in BI applications such as Power BI, Salesforce CRMA, and Tableau; knowledge of modern data warehousing techniques; and experience with data platforms like Databricks and cloud service integrations (Azure, AWS).

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The ideal candidate for this role should have a minimum of 5 years of relevant experience, with strong technical expertise in building PySpark data streaming jobs on Azure Databricks, plus at least 4 years of experience in Data Governance and Data Modeling. Excellent communication skills are mandatory, and candidates should detail relevant project experience in their profiles. This is a contractual/temporary position with a contract length of 12 months, working a day shift. The required experience includes 5 years of total work experience, of which at least 3 years are specifically in PySpark. The work location is in person.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

As a Talend ETL Developer at Team Geek Solutions, you will be responsible for designing, developing, and maintaining ETL processes using Talend. Your role will involve implementing data integration solutions, collaborating with business stakeholders to understand data requirements, and optimizing SQL queries for data extraction and manipulation. You will be tasked with ensuring data accuracy and quality through data profiling and analysis, as well as monitoring and troubleshooting ETL jobs to ensure smooth data flow. Additionally, you will be required to maintain documentation for ETL processes and data model designs, work with team members to design and enhance data warehouses, and develop data transformation logic to meet business needs. To excel in this role, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have proven experience as an ETL Developer with expertise in Talend. Your strong understanding of ETL frameworks, data integration principles, and proficiency in writing and troubleshooting SQL queries will be essential. Experience in data modeling, database design, and familiarity with data quality assessment methodologies are also required. Your ability to analyze complex data sets, provide actionable insights, and demonstrate strong problem-solving and analytical skills will be crucial. Excellent communication and interpersonal skills are necessary for collaborating effectively in a team-oriented environment. Knowledge of data warehousing concepts, best practices, and experience with Agile development methodologies will be valuable assets. Your willingness to learn new technologies and methodologies, attention to detail, commitment to delivering high-quality solutions, and ability to manage multiple tasks and deadlines effectively are key attributes for success in this role. Experience with performance tuning and optimization of ETL jobs is a plus. 
If you are passionate about data warehousing, troubleshooting, ETL processes, workflow management, data modeling, SQL, data profiling and analysis, data governance, data integration, and Agile methodology, and possess the required skills and qualifications, we invite you to join our innovative and collaborative team at Team Geek Solutions in Mumbai or Pune.
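The data profiling and quality checks this role calls for can be sketched in a few lines. The example below is a stand-alone illustration using an in-memory SQLite table, not Talend's actual components; in practice a Talend job would push equivalent SQL to the warehouse.

```python
import sqlite3

# Stand-alone sketch of common profiling checks: row count, null rate
# on a column, and duplicate keys (table and data are illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "a@x.com", "IN"), (2, None, "IN"), (3, "c@x.com", None), (1, "a@x.com", "IN")],
)

rows = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()[0]
dupes = conn.execute(
    "SELECT id, COUNT(*) AS c FROM customers GROUP BY id HAVING c > 1"
).fetchall()

print(rows, null_emails, dupes)  # 4 1 [(1, 2)]
```

Checks like these are typically run before and after each ETL job so data-quality regressions surface immediately rather than in downstream reports.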

Posted 2 weeks ago

Apply

11.0 - 16.0 years

0 Lacs

karnataka

On-site

It is exciting to be part of a company where individuals genuinely believe in the purpose of their work. The commitment to infuse passion and customer-centricity into the business is unwavering. Fractal stands out as a key player in the field of Artificial Intelligence. The core mission of Fractal is to power every human decision within the enterprise by integrating AI, engineering, and design to support the most esteemed Fortune 500 companies globally. Recognized as one of India's top workplaces by the Great Place to Work Institute, Fractal is at the forefront of innovation in Cloud, Data, and AI technologies, fueling digital transformation across enterprises at an unprecedented pace.

At Fractal, we empower enterprises to leverage the potential of data on the cloud through a spectrum of services such as architecture consulting, data platforms, business-tech platforms, marketplaces, data governance, MDM, DataOps, and AI/MLOps, and we apply AI engineering methodologies to enhance each of these offerings.

As part of the team, your responsibilities will include evaluating the existing technological landscape and formulating a progressive short-term and long-term technology strategic vision. You will actively contribute to the creation and dissemination of best practices, technical content, and innovative reference architectures. Collaborating with data engineers and data scientists, you will play a pivotal role in architecting solutions and frameworks, and you will help ensure the seamless delivery of services, products, and solutions to our clientele.

To excel in this role, you should possess a wealth of experience as an Architect with a strong background in Google Cloud Platform and a genuine interest in leveraging cutting-edge technologies to address business challenges.
The ideal candidate will have 11 to 16 years of experience in Data Engineering and cloud-native technologies, particularly Google Cloud Platform, encompassing big data, analytics, and AI/ML domains. Proficiency in tools such as BigQuery, Cloud Composer, Dataflow, Cloud Storage, AI Platform/Vertex AI, Dataproc, and GCP IaaS is essential. A solid understanding of Data Engineering, Data Management, and Data Governance is required, along with experience leading end-to-end Data Engineering and/or Analytics projects. Knowledge of programming languages like Python and Java, coupled with a grasp of technology best practices and development lifecycles such as agile, CI/CD, DevOps, and MLOps, is highly valued.

Demonstrable expertise in technical architecture leadership, secure platform development, and the creation of future-proof global solutions using GCP services is crucial, as are excellent communication and influencing skills that adapt to diverse audiences. Desirable skills include experience with container technology such as Docker and Kubernetes, DevOps on GCP, and a Professional Cloud Architect certification from Google Cloud.

If you are drawn to dynamic growth opportunities and enjoy collaborating with energetic, high-achieving individuals, your career journey with us promises to be fulfilling and rewarding.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As the GTM Systems Specialist focused on Customer Operations at Narvar, you will be instrumental in managing and optimizing the technology stack that supports our Customer Success and Customer Operations teams. Your responsibilities will include overseeing the administration of Salesforce and its integrations with key tools such as ChurnZero, Gong, and Delighted. By ensuring seamless data flow, scalable processes, and actionable insights, you will contribute to enhancing customer satisfaction, retention, and growth.

Collaboration across teams is a key aspect of this role, as you will work closely with Customer Success, Sales, Marketing, and Business Operations teams. Your objective will be to configure systems to meet their requirements while proactively identifying opportunities to enhance processes, data quality, and user experience. Of particular focus will be managing the implementation and ongoing administration of ChurnZero to ensure effective integration with Salesforce for customer lifecycle management and business value delivery.

In your day-to-day activities, you will be responsible for the implementation, management, and optimization of customer-facing tools such as ChurnZero, Gong, and Delighted, ensuring their integration with Salesforce aligns with business objectives. Serving as a Salesforce administrator, you will manage its integrations with external tools and configure workflows, field mappings, and data sync processes for seamless system functionality. Additionally, you will lead the ChurnZero implementation, optimize its use to support the Customer Success team, and identify opportunities to enhance system usability and processes for internal users. Maintaining accurate and reliable customer data across all platforms, addressing data gaps, inefficiencies, and inconsistencies, and recommending data quality improvements will be part of your role in data architecture and governance.
You will utilize system capabilities to streamline and automate key processes such as customer lifecycle management, health score tracking, and renewal reporting, while collaborating with various teams to ensure system configurations align with organizational goals and enhance user experience.

The ideal candidate will have at least 3 years of experience as a Salesforce administrator or in a similar role managing system integrations, preferably in a high-growth SaaS environment. Required technical expertise includes Salesforce Admin Certification, proficiency in managing system integrations, and a strong understanding of data architecture best practices. A user-centric mindset, experience with process automation and data governance, proactive problem-solving skills, and excellent communication and collaboration abilities are also essential.

At Narvar, we are dedicated to simplifying the everyday lives of consumers and driving customer loyalty through post-purchase experiences. With a focus on innovation and teamwork, we offer a dynamic work environment where professional achievements are celebrated alongside personal milestones. Join our team and be a part of pioneering the post-purchase movement with a company that values creativity, collaboration, and a commitment to excellence.
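Health score tracking of the kind mentioned above usually reduces to a weighted blend of engagement signals. The sketch below is a deliberately simplified, hypothetical model; the weights and signal names are assumptions for illustration, not ChurnZero's actual scoring logic.

```python
# Hypothetical, simplified customer health score: a weighted average of
# 0-100 engagement signals. Signal names and weights are illustrative only.
WEIGHTS = {"product_usage": 0.4, "support_sentiment": 0.3, "renewal_engagement": 0.3}

def health_score(signals: dict) -> float:
    """Weighted average of 0-100 signals; missing signals count as 0."""
    return round(sum(WEIGHTS[k] * signals.get(k, 0) for k in WEIGHTS), 1)

account = {"product_usage": 80, "support_sentiment": 60, "renewal_engagement": 90}
print(health_score(account))  # 77.0
```

A Customer Success platform automates exactly this kind of calculation across every account, then triggers playbooks when a score crosses a threshold.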

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As a Solution Architect with 10 to 14 years of experience, you will collaborate with sales, presales, and COE teams to provide technical expertise and support throughout the new business acquisition process. Your role is crucial in understanding customer requirements, presenting solutions, and demonstrating product value. You excel in high-pressure environments, maintaining a positive outlook and recognizing that career growth requires strategic choices.

Strong written and verbal communication skills allow you to convey complex technical concepts clearly. As a team player who is customer-focused, self-motivated, and responsible, you can work under pressure with a positive attitude. Experience in managing RFPs/RFIs, client demos, and presentations, and in converting opportunities into winning bids, is essential. Your work ethic, positive attitude, and enthusiasm for new challenges, along with the ability to multitask and prioritize, are key. You can work independently with minimal supervision, demonstrating a process-oriented, quality-first approach. Your performance as a Solution Architect will be measured by your ability to convert clients' business challenges into winning proposals through excellent technical solutions.

In this role, you will:
- Develop high-level architecture designs for scalable, secure, and robust solutions.
- Select appropriate technologies, frameworks, and platforms for business needs.
- Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP.
- Ensure seamless integration between enterprise applications, APIs, and third-party services.
- Design and develop scalable, secure, and performant data architectures on cloud platforms.
- Translate business needs into technical solutions by designing secure, scalable, and performant data architectures.
- Recommend and implement data models, data services, and data governance practices.
- Design and implement data pipelines for efficient data extraction, transformation, and loading processes.

Requirements:
- 10+ years of experience in data analytics and AI technologies.
- Certifications in data engineering, analytics, cloud, or AI are advantageous.
- Bachelor's in engineering/technology or an MCA from a reputed college is required.
- Prior experience as a solution architect during the presales cycle is beneficial.

Location: Hyderabad, Ahmedabad, Indore
Experience: 10 to 14 years
Joining Time: Maximum 30 days
Work Schedule: All days, work from office
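The extract-transform-load responsibility above can be sketched end to end in a few lines. This is a minimal stand-alone illustration using an in-memory SQLite store; a real pipeline would read from source systems and land in a cloud warehouse, but the three stages are the same.

```python
import sqlite3

# Minimal ETL sketch: extract -> transform -> load (data is illustrative).
def extract():
    # Stand-in for reading from a source API, file, or database.
    return [{"name": " Alice ", "amount": "120.50"}, {"name": "bob", "amount": "80"}]

def transform(rows):
    # Clean names and cast string amounts to floats.
    return [(r["name"].strip().title(), float(r["amount"])) for r in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0])  # 200.5
```

Keeping the three stages as separate functions is what makes a pipeline testable: each stage can be validated in isolation before the whole flow runs on production data.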

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

As an experienced SAP professional with a specific focus on data migration projects, you will lead end-to-end data migration processes within the SAP environment. Your role will involve configuring and using data migration tools such as SAP LSMW, the SAP Migration Cockpit, SAP Data Services (BODS), or third-party ETL tools to ensure successful data migration from legacy systems to SAP systems.

You will need a strong understanding of SAP data structures and tables across various SAP modules such as MM, SD, FI/CO, HR, and PP. Expertise in data quality management, data governance principles, master data management (MDM), and transactional data migration will also be essential. Collaboration with both business and technical teams is key to understanding data migration requirements, source data structures, and target SAP systems (ECC or S/4HANA). You will be expected to develop and execute detailed data migration plans, covering extraction, transformation, validation, and loading (ETL) of data.

Your proficiency in ABAP for writing custom programs for data extraction, validation, and loading, as well as SQL for data analysis, extraction, and transformation, will be crucial. Strong problem-solving skills will be required to troubleshoot complex data issues during the migration process, and excellent communication and stakeholder management skills are essential for leading cross-functional teams through it. Any SAP certification in data migration, SAP S/4HANA, or SAP BODS will be considered a plus.
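The validation step in a migration plan like the one above often takes the form of post-load reconciliation: comparing counts and per-key checksums between the legacy extract and the loaded target. The sketch below illustrates the idea in plain Python; the field names (`matnr`, `qty`) are borrowed from SAP material-master conventions purely for flavor, and the data is invented.

```python
import hashlib

# Post-load reconciliation sketch: fingerprint each record on both sides,
# then report keys whose fingerprints differ (data and fields illustrative).
def row_hash(row: dict) -> str:
    # Deterministic fingerprint over sorted key/value pairs.
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode()).hexdigest()

source = [{"matnr": "M001", "qty": 10}, {"matnr": "M002", "qty": 5}]
target = [{"matnr": "M001", "qty": 10}, {"matnr": "M002", "qty": 7}]  # qty drifted

src = {r["matnr"]: row_hash(r) for r in source}
tgt = {r["matnr"]: row_hash(r) for r in target}

mismatches = [k for k in src if src[k] != tgt.get(k)]
print(mismatches)  # ['M002']
```

In a real migration the same comparison would run in the database over millions of rows, but the principle is identical: any key whose fingerprint differs gets flagged for investigation before sign-off.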

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Data Management Business Analyst at Insight Global's premium client in the financial services space, you will be responsible for bridging the gap between business needs and technical solutions, with a strong emphasis on data governance and data management. Your primary focus will be to ensure effective governance, security, and alignment of the company's data assets with business objectives, particularly emphasizing the capture of data lineage across the technology estate. You will serve as the liaison for internal stakeholders, facilitating the understanding of requirements and may also be involved in data manipulation. You must have at least 5 years of experience in a Business Analyst and/or Data Analyst role, with a specific focus on data governance, data management, or data quality. A strong technical understanding of data systems, databases (e.g., SQL), data modeling, and data integration tools is essential. Proficiency in data analysis tools and techniques (e.g., Python, R, or Excel) is required. Experience in developing and implementing data governance frameworks, policies, or standards is a must. Excellent communication and stakeholder management skills are vital, along with the ability to translate complex technical concepts into simplified business language. You should have experience in creating business requirement documentation (BRD) and a solid understanding of regulatory compliance requirements related to data, such as GDPR, DORA, or industry-specific regulations. A Bachelor's degree in a relevant field like Computer Science, Information Systems, Data Science, Business Administration, or equivalent is necessary. 
In addition to the must-haves, the following skills and experiences would be considered a plus:
- Hands-on experience with data governance tools like Collibra, Informatica, or Solidatus
- Familiarity with cloud-based data platforms such as Azure, AWS, or Google Cloud
- Knowledge of modern data platforms like Snowflake, Databricks, or Azure Data Lake
- Familiarity with data visualization tools for presenting insights, such as Tableau or Power BI
- Experience in writing user stories
- Experience working in an Agile environment, using tools such as Jira
- Experience in financial services or other regulated industries
- Understanding of machine learning or advanced analytics concepts
- An advanced degree in Data Science, Business Analytics, or related fields
- Professional certifications in business analysis (e.g., CBAP, CCBA), data analysis, or data governance (e.g., DAMA CDMP, CISA) would be highly desirable

This role offers a 12-month initial contract with the possibility of extensions and is based in Bengaluru.
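At its core, the data lineage capture this role emphasizes is a graph problem: each dataset or field records its direct upstream sources, and tracing follows those edges transitively. The toy sketch below shows the idea; dedicated tools such as Solidatus or Collibra do this at enterprise scale, and the node names here are hypothetical.

```python
# Toy lineage graph: each node lists its direct upstream sources
# (node names are hypothetical, for illustration only).
LINEAGE = {
    "report.exposure": ["warehouse.positions"],
    "warehouse.positions": ["trading.trades", "refdata.instruments"],
    "trading.trades": [],
    "refdata.instruments": [],
}

def upstream(node: str) -> set:
    """All transitive upstream sources of a dataset or field."""
    sources = set()
    for parent in LINEAGE.get(node, []):
        sources.add(parent)
        sources |= upstream(parent)
    return sources

print(sorted(upstream("report.exposure")))
# ['refdata.instruments', 'trading.trades', 'warehouse.positions']
```

Being able to answer "where does this report field come from?" with a query like this is exactly what regulations such as BCBS-style data requirements and internal audits demand of a governance function.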

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Security & Privacy Architect within the Data & AI Architecture team at BT, you will play a crucial role in building security patterns and frameworks, analyzing current incidents, and integrating security and data privacy measures into the design of solutions and core frameworks. Your responsibilities will include developing and maintaining security and data privacy patterns and frameworks that can be applied across various platforms, proactively incorporating security and privacy considerations early in the design and development process, collaborating with other architecture chapters to design and implement security and privacy patterns, and staying informed about the latest security threats and trends in order to eliminate potential risks.

You will be expected to provide technical expertise to ensure that security and privacy measures are effectively integrated into technical solutions, and to ensure that security and privacy standards and practices are adhered to across all projects and solutions. You will offer sound technical advice on security- and privacy-related matters to various teams and stakeholders, continuously identify opportunities to enhance security and privacy standards and practices within the organization, and address complex security and privacy issues that span multiple domains, providing effective resolutions.

The ideal candidate will have proven experience designing and implementing security architectures for cloud-based data platforms; a strong understanding of security principles including encryption, identity and access management, and data protection; experience with security frameworks and compliance standards such as GDPR, HIPAA, and ISO 27001; and hands-on experience with security tools and technologies.
Additionally, experience with Google security tools such as Google Security Command Center, Cloud Armor, and Identity and Access Management (IAM) is required, along with the ability to conduct security assessments and audits to identify and mitigate risks, and experience integrating data privacy measures into security patterns and frameworks.

Qualifications and skills include a strong architectural background with experience in infrastructure, engineering, or security; adaptability and a willingness to learn and adapt to change; experience defining, designing, and managing complex technical solutions across multiple domains; proficiency in defining, creating, documenting, and managing solutions; and an understanding of the technologies supporting data security and privacy. Certifications in GCP, AWS, Azure, or another cloud platform are preferred.

In this role, you will need to demonstrate exceptional skills across policy analysis, system design, regulatory compliance, information management, data architecture, data asset management, data privacy, data science, data governance, technology strategy, data analysis, data management, data design, enterprise architecture, decision making, standards design, and inclusive leadership. Your ability to lead inclusively and safely, deliver for the customer, demonstrate a growth mindset, and build for the future will be essential in contributing to BT's ongoing transformation and commitment to improving connectivity for millions.
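The IAM patterns referenced above generally follow a role-based model: permissions attach to roles, roles attach to principals, and every access check is deny-by-default. Here is a minimal sketch of that pattern in plain Python; the role, user, and permission names are hypothetical, and a real platform (e.g. GCP IAM) adds hierarchy, conditions, and audit logging on top.

```python
# Minimal role-based access control (RBAC) sketch: deny-by-default checks
# against role-to-permission and user-to-role mappings (names illustrative).
ROLE_PERMISSIONS = {
    "data_analyst": {"dataset:read"},
    "data_engineer": {"dataset:read", "dataset:write"},
}
USER_ROLES = {"asha": ["data_analyst"], "ravi": ["data_engineer"]}

def is_allowed(user: str, permission: str) -> bool:
    """Grant only if some role held by the user carries the permission."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, [])
    )

print(is_allowed("asha", "dataset:write"))  # False
print(is_allowed("ravi", "dataset:write"))  # True
```

The deny-by-default stance is the key design choice: an unknown user or an unmapped role yields no permissions at all rather than an error-prone fallback.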

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

maharashtra

On-site

Whether you're at the start of your career or looking to discover your next adventure, your story begins here. At Citi, you'll have the opportunity to expand your skills and make a difference at one of the world's most global banks. We're fully committed to supporting your growth and development from the start with extensive on-the-job training and exposure to senior leaders, as well as more traditional learning. You'll also have the chance to give back and make a positive impact where we live and work through volunteerism. Citi Finance is responsible for the firm's financial management and related controls. We manage and partner on key Citi initiatives and deliverables, such as our quarterly earnings process and ensuring Citi's compliance with financial rules and regulations. The team comprises chief financial officers who partner with each of our businesses and disciplines including controllers, financial planning and analysis, strategy, investor relations, tax and treasury. The Controller System and Data Operations (CSDO) function within Finance Controllers Team is responsible for partnering with the Global Process Owners in Finance to transform the end-to-end Global Operating Model for all Financial Books and Records processes, platforms and results. This role is within the CSDO Controllers Adjustment Reduction, EUC and Data Initiatives organization responsible for delivering data ownership, system ownership, data quality rules and data lineage maintenance, and adherence to data policies and standards for Controllers-owned data. As a part of our Controllers Systems, Data & Operations (CSDO) function within Data Quality & Reporting Team, we are looking for a high caliber professional to join our team as Controllers Systems, Data & Operations Primary Business Information Central Office (PBIO) Lead. 
In this role, you're expected to:
- Apply a comprehensive understanding of concepts and procedures within your own area, and basic knowledge of other areas, to resolve issues that have an impact beyond your own area.
- Ensure the creation and sign-off of the program plan and charter, benefits management plan, stakeholder management plan, acceptance plan, and central program issue log; also ensure all program documentation is accurate and understood.
- Monitor vendor performance and ensure actions are taken if performance warrants, where necessary, within the Finance applications assigned.
- Exercise responsibility for budget, policy formulation, and planning for the Finance Controllers-owned applications assigned.
- Ensure that the system impact of any related solutions is considered in the change management process prior to implementation.
- Participate in implementation management for strategic initiatives, technology portfolio budget management in coordination with Technology Services, project support, and communications and policy management across Finance and Risk.
- Contribute to the implementation of common technology, data, and data standards, and common processes to comply with internal policy procedures and external regulatory demands.
- Apply an in-depth understanding of the business impact of technical contributions.
- Develop broad-based business reporting solutions to issues that have complex/multiple variables and the potential to cause substantial impact if left unresolved.
- Lead reengineering efforts in the business's methodology, strategy, and organizational goals.
- Provide in-depth and sophisticated analysis with interpretive thinking to define problems and develop innovative solutions.
- Proactively communicate meaningful updates and insights with a variety of stakeholders, including executive stakeholders and oversight partners, clearly and precisely, to help management understand progress and risks.
- Apply knowledge and understanding of the businesses to solve a great variety of problems by working directly with senior business leaders.
- Develop strategies to reduce costs, manage risk, and enhance services.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.
- Communicate effectively; develop and deliver multi-mode communications that convey a clear understanding of the unique needs of different audiences; be able to drive consensus and influence relationships at all levels.
- Collaborate effectively by building partnerships and working well with others to meet shared objectives.

As a successful candidate, you'd ideally have the following skills and exposure:
- 10+ years of experience; Banking or Finance industry preferred.
- Understanding of defining and implementing Data Quality programs.
- Experience/certifications in Agile methodology preferred but not necessary.
- Understanding of managing Data Quality on an ongoing basis.
- Ability to gain the confidence and trust of others through honesty, integrity, and authenticity.
- Strong negotiation, influencing, and stakeholder management skills across a variety of stakeholders at different levels.
- Organizational savvy: understands systems and management processes, and knows where to go for information and how to interpret it.
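The "data quality rules" responsibility mentioned for this function typically means declarative checks evaluated against every record, with failures routed to data stewards. A minimal sketch of that pattern, with invented rule and field names:

```python
# Declarative data-quality rules sketch: each rule is a predicate over a
# record; failures are collected per rule for stewardship follow-up.
# Rule names and fields are illustrative, not Citi's actual rule set.
RULES = {
    "account_id_present": lambda r: bool(r.get("account_id")),
    "balance_non_negative": lambda r: r.get("balance", 0) >= 0,
}

def run_rules(records):
    """Return, for each rule, the indices of records that fail it."""
    failures = {name: [] for name in RULES}
    for i, rec in enumerate(records):
        for name, check in RULES.items():
            if not check(rec):
                failures[name].append(i)
    return failures

records = [{"account_id": "A1", "balance": 100}, {"account_id": "", "balance": -5}]
print(run_rules(records))  # {'account_id_present': [1], 'balance_non_negative': [1]}
```

Running rules continuously, rather than one-off, is what "managing Data Quality on an ongoing basis" amounts to in practice: the same rule set scores each day's books-and-records feed.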

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

This role is for one of our clients in the Technology, Information and Media industry. As an Associate-level Data Visualization Analyst based in Mumbai, you will draw on your 3 years of experience to transform complex datasets into clear, actionable narratives. Your passion for crafting impactful dashboards and proficiency with cutting-edge cloud data tools will be instrumental in thriving in fast-paced environments. You will collaborate across functions with business, engineering, and analytics teams to design intuitive, scalable, and aesthetically rich dashboards and reports that drive better decisions across the company.

Your responsibilities will include:

Visualization Design & Development
- Creating and managing interactive dashboards and data visualizations using tools like Power BI, Tableau, or Looker.
- Developing custom visuals and reports tailored to stakeholder needs.

Cloud-Based Data Access & Transformation
- Extracting, processing, and modeling large-scale data from Azure Data Lake and Databricks to ensure performance and accuracy.
- Collaborating with data engineers to prepare, clean, and transform datasets for reporting and visualization.

Stakeholder Collaboration
- Translating business questions into clear analytical visual narratives and performance dashboards.
- Acting as a visualization consultant to product, marketing, and operations teams.

Data Quality & Governance
- Performing data profiling, validation, and cleansing to ensure data integrity.
- Maintaining documentation and consistency across reports, visuals, and metric definitions.

Continuous Improvement & Innovation
- Staying ahead of trends in dashboarding, self-serve analytics, and BI UX best practices.
- Optimizing existing dashboards to enhance performance, usability, and storytelling quality.

You should bring:

Core Skills & Experience
- 3 years of professional experience in data visualization, business intelligence, or analytics.
- Strong hands-on knowledge of Azure Data Lake, Databricks, and cloud-native data platforms. - Advanced proficiency in one or more visualization tools: Power BI, Tableau, Looker, or similar. - Solid SQL experience for writing complex queries and transforming datasets. - Understanding of data modeling concepts, including star/snowflake schemas and OLAP cubes. Nice-to-Have Skills - Familiarity with Azure Synapse, Data Factory, or Azure SQL Database. - Experience using Python or PySpark for data prep or analytics automation. - Exposure to data governance, role-based access control, or data lineage tools. Soft Skills & Traits - Strong visual design sense and attention to detail. - Ability to explain complex technical topics in simple, business-friendly language. - Proactive mindset, keen to take ownership of dashboards from concept to delivery. - Comfortable working in agile teams and managing multiple projects simultaneously. Preferred Qualifications - Bachelors degree in Computer Science, Statistics, Data Analytics, or a related field. - Certifications in Azure, Databricks, Power BI, or Tableau are a plus.,
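To illustrate the kind of SQL and star-schema modeling this role calls for, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names (`fact_sales`, `dim_product`) are hypothetical, not taken from the listing; the shape of the query — joining a fact table to a dimension and aggregating by a dimension attribute — is the pattern behind most dashboard metrics.

```python
import sqlite3

# Hypothetical star schema: one fact table of sales joined to a product dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Furniture'), (2, 'Lighting');
INSERT INTO fact_sales VALUES (101, 1, 250.0), (102, 1, 150.0), (103, 2, 90.0);
""")

# Aggregate fact rows by a dimension attribute -- the core shape of a dashboard query.
cur.execute("""
SELECT p.category, SUM(f.amount) AS total_amount
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
GROUP BY p.category
ORDER BY total_amount DESC
""")
result = dict(cur.fetchall())
print(result)  # {'Furniture': 400.0, 'Lighting': 90.0}
```

The same query shape ports directly to Databricks SQL or an Azure SQL Database; a snowflake schema simply adds further joins from the dimension tables outward.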

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

delhi

On-site

You will be a valuable addition to our fast-growing data engineering team as a Data Engineer with expertise in Informatica MDM. Depending on your level of experience, you will join us as a Software Engineer or Senior Software Engineer. You will contribute to the design, development, and maintenance of enterprise data management solutions that align with our business objectives: building reliable data pipelines, collaborating on master data management, and ensuring data quality, governance, and integration across systems. You will work closely with data architects, business analysts, and stakeholders to gather data requirements and perform tasks such as data profiling, quality assessments, and master data matching/merging.

**Responsibilities:**
- Design, develop, and implement data pipelines using ETL tools like Informatica PowerCenter and IICS, and MDM solutions using Informatica MDM.
- Develop and maintain batch and real-time data integration workflows.
- Collaborate with data architects, business analysts, and stakeholders to understand data requirements.
- Perform data profiling, data quality assessments, and master data matching/merging.
- Implement governance, stewardship, and metadata management practices.
- Optimize the performance of Informatica MDM Hub, IDD, and associated components.
- Write complex SQL queries and stored procedures as needed.

**Senior Software Engineer Additional Responsibilities:**
- Lead design discussions and code reviews; mentor junior engineers.
- Architect scalable data integration solutions using Informatica and complementary tools.
- Drive adoption of best practices in data modeling, governance, and engineering.
- Work closely with cross-functional teams to shape the data strategy.

**Required Qualifications:**

Software Engineer:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 2-4 years of experience with Informatica MDM (Customer 360, Business Entity Services, Match/Merge rules).
- Strong SQL and data modeling skills.
- Familiarity with ETL concepts, REST APIs, and data integration tools.
- Understanding of data governance and quality frameworks.

Senior Software Engineer:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 4+ years of experience with Informatica MDM, including at least 2 years in a lead role.
- Proven track record of designing scalable MDM solutions in large-scale environments.
- Strong leadership, communication, and stakeholder management skills.
- Hands-on experience with data lakes, cloud platforms (AWS, Azure, or GCP), and big data tools is a plus.

**Preferred Skills (Nice to Have):**
- Experience with other Informatica products (IDQ, PowerCenter).
- Exposure to cloud MDM platforms or cloud data integration tools.
- Agile/Scrum development experience.
- Knowledge of industry-standard data security and compliance practices.
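The match/merge work described above can be sketched at toy scale in plain Python. This is not Informatica MDM's actual engine or API — real match rules are fuzzy, weighted, and configurable — but it shows the two core ideas: grouping candidate duplicates on a normalized match key, then applying a survivorship rule to pick a "golden" record per group. All record fields and the completeness-based survivorship rule are illustrative assumptions.

```python
# Toy match/merge: normalize customer names, group exact matches on the
# normalized key, and keep the most complete record per group (survivorship).
records = [
    {"id": 1, "name": "ACME Corp.", "phone": None},
    {"id": 2, "name": "acme corp",  "phone": "555-0100"},
    {"id": 3, "name": "Globex",     "phone": "555-0199"},
]

def match_key(rec):
    # Crude normalization; production match rules use fuzzy, weighted comparisons.
    return "".join(ch for ch in rec["name"].lower() if ch.isalnum())

groups = {}
for rec in records:
    groups.setdefault(match_key(rec), []).append(rec)

def completeness(rec):
    # Survivorship rule: prefer the record with the fewest missing fields.
    return sum(v is not None for v in rec.values())

golden = [max(grp, key=completeness) for grp in groups.values()]
print(sorted(r["id"] for r in golden))  # [2, 3]
```

Here "ACME Corp." and "acme corp" collapse to one golden record (the one carrying a phone number), while "Globex" survives untouched.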

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies