3.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
Role Overview: As an IDMC Architect at our company, you will play a crucial role in leading the design, implementation, and optimization of data integration and management solutions using Informatica's Intelligent Data Management Cloud (IDMC). You will drive our enterprise-wide data strategy and ensure scalability, performance, and governance across cloud and hybrid environments.

Key Responsibilities:
- Design end-to-end data integration architectures leveraging IDMC capabilities such as Data Integration, Data Quality, Data Governance, and API Management.
- Define and implement best practices for IDMC deployment, scalability, and performance tuning across multi-cloud environments.
- Collaborate closely with business analysts, data engineers, and enterprise architects to translate business requirements into technical solutions.
- Ensure compliance with data governance, privacy, and security standards across all IDMC implementations.
- Mentor development teams, review code and configurations, and guide troubleshooting efforts.
- Continuously evaluate new IDMC features and recommend enhancements to improve data workflows and reduce latency.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 8+ years of experience in data architecture, with at least 3 years in IDMC or Informatica Cloud.
- Strong expertise in cloud platforms (AWS, Azure, GCP) and hybrid data ecosystems.
- Proficiency in REST APIs, SQL, ETL/ELT pipelines, and metadata management.
- Experience with Informatica Axon, EDC, and Data Quality tools is a plus.
- Excellent communication and documentation skills.

Company Details: Our company offers competitive compensation and benefits, opportunities for professional growth and certification, and fosters a collaborative and innovation-driven culture.
Posted 4 days ago
6.0 - 9.0 years
20 - 27 Lacs
Hyderabad, Bengaluru
Hybrid
Job Description

Role & Responsibilities:
- Manage and administer the Informatica IDMC environment, including CDI, CDQ, CDGC, Marketplace, and Metadata Command Center.
- Perform user creation, folder creation, DB connection configuration, and other day-to-day administration tasks.
- Monitor and troubleshoot end-to-end data integration workflows, including data sources, transformations, operating systems, and app servers.
- Work with the Informatica Support team on ticket resolution.
- Actively monitor platform usage and forecast scaling requirements.
- Mentor peers and application developers, explaining technical issues to non-technical stakeholders.
- Collaborate effectively with infrastructure teams to ensure a stable operating environment.

Must-Have Skills:
- 4-6 years of hands-on Informatica IDMC administration.
- Modules: CDGC, CDMP, CDQ, CDI, Metadata Command Center.
- Proficiency with Linux commands / basic shell scripting.
- Knowledge of MS SQL Server.
- Experience in Informatica MDM administration.
- Understanding of AWS Cloud architecture.

Nice-to-Have Skills:
- Prior experience with Informatica Axon and Informatica EDC.
- Knowledge of the Informatica security model and IDMC architecture.
- AD authentication models and LDAP integration.
- SAML/SSO configuration at the Org/Sub-Org level.
- Familiarity with the IDMC IPU license model.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a GCP Data Engineer specialized in Data Migration & Transformation, you will design and build robust, scalable data pipelines and architectures on Google Cloud Platform (GCP), with a particular focus on BigQuery. Your primary tasks will involve migrating and transforming large-scale data systems and datasets to GCP while emphasizing performance, scalability, and reliability. You will automate data lineage extraction and ensure data integrity across various systems and platforms. Collaborating closely with architects and stakeholders, you will implement GCP-native and third-party tools for data ingestion, integration, and transformation.

Additionally, your role will include developing and optimizing complex SQL queries in BigQuery for data analysis and transformation. You will operationalize data pipelines using tools such as Apache Airflow (Cloud Composer), Dataflow, and Pub/Sub (a minimal sketch of such a pipeline follows this posting), enabling machine learning capabilities through well-structured, ML-friendly data pipelines. Participation in Agile processes and contributing to technical design discussions, code reviews, and documentation will be integral parts of your responsibilities.

Your background should include at least 5 years of experience in Data Warehousing, Data Engineering, or similar roles, with a minimum of 2 years of hands-on experience with GCP BigQuery. Proficiency in Python, SQL, Apache Airflow, and GCP services including BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Functions is essential. You should have experience in data pipeline automation, data modeling, and building reusable data products. A solid understanding of data lineage, metadata integration, and data cataloging, preferably using tools such as GCP Data Catalog and Informatica EDC, will be beneficial. Demonstrated ability to analyze complex datasets, derive actionable insights, and build and deploy analytics platforms on cloud environments, preferably GCP, is required.

Preferred skills for this role include strong analytical and problem-solving capabilities, exposure to machine learning pipeline architecture and model deployment workflows, excellent communication skills, and the ability to collaborate effectively with cross-functional teams. Familiarity with Agile methodologies and DevOps best practices, a self-driven and innovative mindset, and experience in documenting complex data engineering systems and developing test plans will be advantageous.
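To give a flavor of the Cloud Composer / BigQuery stack named above, here is a minimal, illustrative Airflow DAG sketch; the project, dataset, and table names are hypothetical placeholders rather than details of this role.

```python
# Minimal illustrative sketch: a Cloud Composer (Airflow) DAG that runs a
# BigQuery transformation job. All project/dataset/table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_aggregation",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT customer_id, SUM(amount) AS total_amount "
                    "FROM `example-project.raw.orders` "   # hypothetical source table
                    "GROUP BY customer_id"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "curated",
                    "tableId": "customer_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In practice a DAG like this would be deployed to the Composer environment's DAGs bucket and parameterized per environment; it is shown here only as a sketch of the stack, not as project code.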
Posted 2 weeks ago
8.0 - 13.0 years
15 - 22 Lacs
Chennai, Bengaluru, Mumbai (all areas)
Work from Office
Title: Informatica Metadata Lead
Duration: Full Time
Experience: 9+ Years

Minimum Qualifications: Degree educated with a minimum of 3 years of direct experience and 5+ years of overall industry experience.

Minimum Experience:
- Minimum 3 years of direct experience in data-related programs, ideally in financial services.
- Good operational knowledge of metadata management processes.
- Experience collating, designing, and presenting top-quality management information to senior leadership stakeholders.

Knowledge, Skills, and Attributes:

Knowledge and Skills
- Good knowledge of Data Governance and Metadata Management frameworks, policies, and procedures.
- Experience working on strategic programs, synthesizing business requirements and priorities with existing organisational capabilities.
- Good knowledge of Metadata Management.
- Technical knowledge of Informatica EDC / CDGC is a must.
- Good SQL knowledge and analytical skills.

Attributes
- A reliable and trustworthy person, able to anticipate and deal with the varying needs and concerns of numerous stakeholders, adapting personal style accordingly.
- Adaptable and knowledgeable, able to discuss a variety of data management situations with both IT and business stakeholders.

Interested? Share your resume to Mahadeen@culminantexes.com
Posted 3 weeks ago
5.0 - 10.0 years
17 - 32 Lacs
Bengaluru, Delhi / NCR, Mumbai (all areas)
Hybrid
Location: PAN India
Experience: 4 to 14 years
Employment Type: Full Time

Key Responsibilities:
- Design, develop, and maintain metadata management solutions using Informatica EDC.
- Configure and manage data cataloging, lineage, and profiling features.
- Collaborate with data stewards, architects, and business users to ensure metadata accuracy and usability.
- Integrate EDC with various data sources (RDBMS, Big Data, Cloud platforms).
- Develop custom metadata connectors and automate metadata ingestion (a sketch of this kind of catalog API call follows this listing).
- Monitor and optimize EDC performance and scalability.

Required Skills:
- Strong hands-on experience with Informatica EDC and Axon.
- Proficiency in metadata management, data lineage, and data profiling.
- Experience with REST APIs, Java/Python scripting, and SQL.
- Familiarity with data governance frameworks and data quality tools.
- Knowledge of cloud platforms (AWS, Azure, GCP) is a plus.
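As a rough illustration of the "REST APIs and Python scripting" skills this posting lists, the sketch below queries a catalog search endpoint to pull object metadata. The host, credentials, and the endpoint path are assumptions modeled on an EDC 10.x-style catalog API and should be verified against the target instance's API documentation.

```python
# Illustrative sketch only: pulling catalog objects over REST with Python.
# Host, credentials, and the endpoint path below are assumptions (in the style
# of an EDC 10.x catalog API); verify against your instance before relying on it.
import requests
from requests.auth import HTTPBasicAuth

EDC_BASE_URL = "https://edc.example.com:9085"   # hypothetical EDC host
SEARCH_TERM = "customer"

response = requests.get(
    f"{EDC_BASE_URL}/access/2/catalog/data/objects",    # assumed catalog search endpoint
    params={"q": SEARCH_TERM, "offset": 0, "pageSize": 20},
    auth=HTTPBasicAuth("edc_user", "edc_password"),      # hypothetical credentials
    timeout=30,
)
response.raise_for_status()

for item in response.json().get("items", []):
    # Each returned item typically carries an object id plus attribute facts
    # (name, class type, lineage links); here we only print the id.
    print(item.get("id"))
```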
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Governance Policy Compliance Manager, you will be responsible for conducting gap assessments and developing a compliance monitoring and testing framework. Your key role will involve collaborating with internal audit, GRC, risk, and technology teams to align and define controls and metrics for the adoption of policies and standards within the Banking and Financial Services industry.

To excel in this role, you should have a strong background in policy governance within Data Management & Governance. Your excellent communication and interpersonal skills will be pivotal in identifying and assessing gaps, developing compliance monitoring and testing frameworks, and acting as a focal point for collaboration with internal audit, GRC, risk, and technology teams to ensure alignment and definition of controls and metrics for the adoption of Data Governance policies and procedures.

Key Requirements:
- Bachelor's degree in Technology, Computer Science, Finance, or a related field; Master's degree preferred.
- Minimum of 12 years of experience in Data Management & Governance within compliance and regulatory reporting in the Banking & Financial Services industry.
- Proficiency in English.
- Strong understanding of banking processes, systems, and regulatory requirements, with a background in Internal Audit and Technology Governance.
- Proficiency in Data Governance tools and technologies, preferably Informatica DEQ, EDC, AXON, etc.
- Experience collaborating with internal audit, risk, and technology teams to understand requirements and execute a testing framework.
- Strong experience in conducting gap assessments, impact analysis, solution design, and implementation within the Data Management framework.
- Successful execution of a minimum of 2-3 large engagements with good project oversight and decision-making skills.
- Hands-on experience in implementing a Data Strategy roadmap across the organization.
- Proficiency in Data Management and Data Governance concepts, particularly within the Banking industry.
- Strong analytical and problem-solving abilities.
- Self-starter with the ability to take initiative and maintain strong relationships with stakeholders.

In summary, this role requires a highly skilled individual with extensive experience in Data Governance policy compliance, strong technical knowledge, and the ability to communicate effectively with various stakeholders in the Banking and Financial Services industry.
Posted 1 month ago
7.0 - 12.0 years
22 - 30 Lacs
Mumbai
Hybrid
Type of Candidate They Want:
- Strong experience with data governance tools (especially Informatica)
- Experience building data policies and quality frameworks
- Knows privacy laws and regulatory standards
- Has worked with cloud platforms such as AWS, Azure, or GCP
- Can manage cross-functional teams, conduct meetings, and influence business leaders
Posted 2 months ago