
15 Data Catalog Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

8 - 16 Lacs

Gandhinagar, Ahmedabad

Work from Office

Source: Naukri

About TELUS Digital
TELUS Digital (NYSE and TSX: TIXT) designs, builds, and delivers next-generation digital solutions to enhance the customer experience (CX) for global and disruptive brands. The company's services support the full lifecycle of its clients' digital transformation journeys and enable them to more quickly embrace next-generation digital technologies to deliver better business outcomes. TELUS Digital's integrated solutions and capabilities span digital strategy, innovation, consulting and design, digital transformation and IT lifecycle solutions, data annotation, intelligent automation, and omnichannel CX solutions that include content moderation, trust and safety solutions, and other managed solutions. Fueling all stages of company growth, TELUS Digital partners with brands across high-growth industry verticals, including tech and games, communications and media, eCommerce and fintech, healthcare, and travel and hospitality. Learn more at: telusinternational.com.

Position Overview: We have an ambitious Enterprise Data Office and are building a class-leading data team that works to solve complex business challenges and provide insights to improve our business and customer experience. To enhance the team, we are looking for an innovative and enterprising Data Governance Engineer to play a critical role in shaping and implementing our enterprise-wide data governance and data management roadmap and strategy. The Data Governance Engineer will champion 'data is an asset' thinking across the enterprise, integrating the data governance tool implementations of Collibra Data Intelligence Platform, Collibra Data Quality, Google Cloud data-related tools, and Informatica MDM/RDM. This person will work with internal partners and developers to brainstorm and evaluate technical solutions, product integration opportunities, and demonstrations.
This role requires creative thinking, a deep curiosity and understanding of data models and usage, as well as empathy for partner/client challenges and pain points. Essential Responsibilities:

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT THE ROLE
Role Description: We are seeking a highly skilled and experienced hands-on Test Automation Engineering Manager with deep expertise in Data Quality (DQ), Data Integration (DIF), and Data Governance. In this role, you will design and implement automated frameworks that ensure data accuracy, metadata consistency, and compliance throughout the data pipeline, leveraging technologies like Databricks, AWS, and cloud-native tools. You will have a major focus on Data Cataloguing and Governance, ensuring that data assets are well-documented, auditable, and secure across the enterprise. You will be responsible for the end-to-end design and development of a test automation framework, working collaboratively with the team. As the delivery owner for test automation, your primary focus will be on building and automating comprehensive validation frameworks for data cataloging, data classification, and metadata tracking, while ensuring alignment with internal governance standards. You will also work closely with data engineers, product teams, and data governance leads to enforce data quality and governance policies. Your efforts will play a key role in driving data integrity, consistency, and trust across the organization. The role is highly technical and hands-on, with a strong focus on automation, metadata validation, and ensuring data governance practices are seamlessly integrated into development pipelines.

Roles & Responsibilities:
Data Quality & Integration Frameworks: Design and implement Data Quality (DQ) frameworks that validate schema compliance, transformations, completeness, null checks, duplicates, threshold rules, and referential integrity. Build Data Integration Frameworks (DIF) that validate end-to-end data pipelines across ingestion, processing, storage, and consumption layers. Automate data validations in Databricks/Spark pipelines, integrated with AWS services like S3, Glue, Athena, and Lake Formation.
Develop modular, reusable validation components using PySpark, SQL, Python, and orchestration via CI/CD pipelines.
Data Cataloging & Governance: Integrate automated validations with AWS Glue Data Catalog to ensure metadata consistency, schema versioning, and lineage tracking. Implement checks to verify that data assets are properly cataloged, discoverable, and compliant with internal governance standards. Validate and enforce data classification, tagging, and access controls, ensuring alignment with data governance frameworks (e.g., PII/PHI tagging, role-based access policies). Collaborate with governance teams to automate policy enforcement and compliance checks for audit and regulatory needs.
Visualization & UI Testing: Automate validation of data visualizations in tools like Tableau, Power BI, Looker, or custom React dashboards. Ensure charts, KPIs, filters, and dynamic views correctly reflect backend data using UI automation (Selenium with Python) and backend validation logic. Conduct API testing (via Postman or Python test suites) to ensure accurate data delivery to visualization layers.
Technical Skills and Tools: Hands-on experience with data automation tools like Databricks and AWS is essential, as the manager will be instrumental in building and managing data pipelines. Leverage automated testing frameworks and containerization tools to streamline processes and improve efficiency. Experience in UI and API functional validation using tools such as Selenium with Python and Postman, ensuring comprehensive testing coverage.
Technical Leadership, Strategy & Team Collaboration: Define and drive the overall QA and testing strategy for UI and search-related components with a focus on scalability, reliability, and performance, while establishing alerting and reporting mechanisms for test failures, data anomalies, and governance violations.
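The DQ rules named above (null checks, duplicates, referential integrity) reduce to simple set logic. A minimal, framework-agnostic sketch in plain Python — in a real pipeline these checks would run as PySpark or SQL over Databricks tables, and the row and column names here are purely illustrative:

```python
# Minimal sketch of three common data-quality rules: null checks,
# duplicate detection, and referential integrity. Row/column names
# are hypothetical; real DQ frameworks run the same logic in PySpark/SQL.

def null_check(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def duplicate_check(rows, key_columns):
    """Return key tuples that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        key = tuple(r.get(c) for c in key_columns)
        if key in seen:
            dupes.add(key)
        seen.add(key)
    return dupes

def referential_check(child_rows, fk_column, parent_keys):
    """Return foreign-key values with no matching parent key."""
    return {r[fk_column] for r in child_rows} - set(parent_keys)

orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},
]
print(null_check(orders, "customer_id"))             # rows failing the null rule
print(duplicate_check(orders, ["order_id"]))         # duplicated keys
print(referential_check(orders, "customer_id", [10]))  # orphaned foreign keys
```

Each function returns the offending rows or keys rather than a pass/fail flag, which is what lets a framework report threshold-rule violations and feed alerting.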
Contribute to system architecture and design discussions, bringing a strong quality and testability lens early into the development lifecycle. Lead test automation initiatives by implementing best practices and scalable frameworks, embedding test suites into CI/CD pipelines to enable automated, continuous validation of data workflows, catalog changes, and visualization updates. Mentor and guide QA engineers, fostering a collaborative, growth-oriented culture focused on continuous learning and technical excellence. Collaborate cross-functionally with product managers, developers, and DevOps to align quality efforts with business goals and release timelines. Conduct code reviews, test plan reviews, and pair-testing sessions to ensure team-level consistency and high-quality standards.
Good-to-Have Skills: Experience with data governance tools such as Apache Atlas, Collibra, or Alation. Understanding of DataOps methodologies and practices. Familiarity with monitoring/observability tools such as Datadog, Prometheus, or CloudWatch. Experience building or maintaining test data generators. Contributions to internal quality dashboards or data observability systems. Awareness of metadata-driven testing approaches and lineage-based validations. Experience working with agile testing methodologies such as Scaled Agile. Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest.
Must-Have Skills: Strong hands-on experience with Data Quality (DQ) framework design and automation. Expertise in PySpark, Python, and SQL for data validations. Solid understanding of ETL/ELT pipeline testing in Databricks or Apache Spark environments. Experience validating structured and semi-structured data formats (e.g., Parquet, JSON, Avro). Deep familiarity with AWS data services: S3, Glue, Athena, Lake Formation, Data Catalog. Integration of test automation with AWS Glue Data Catalog or similar catalog tools. UI automation using Selenium with Python for dashboard and web interface validation. API testing using Postman, Python, or custom API test scripts. Hands-on testing of BI tools such as Tableau, Power BI, Looker, or custom visualization layers. CI/CD test integration with tools like Jenkins, GitHub Actions, or GitLab CI. Familiarity with containerized environments (e.g., Docker, AWS ECS/EKS). Knowledge of data classification, access control validation, and PII/PHI tagging. Understanding of data governance standards (e.g., GDPR, HIPAA, CCPA). Understanding of data structures: knowledge of various data structures and their applications, and the ability to analyze data and identify inconsistencies. Proven hands-on experience in test automation and data automation using Databricks and AWS. Strong knowledge of Data Integrity Frameworks (DIF) and Data Quality (DQ) principles. Strong understanding of data transformation techniques and logic.
Education and Professional Certifications: Bachelor's degree in Computer Science and Engineering preferred; other engineering fields considered. Master's degree and 6+ years of experience, or Bachelor's degree and 8+ years.
Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Posted 1 week ago

Apply

7.0 - 12.0 years

13 - 22 Lacs

Chennai, Bengaluru

Work from Office

Source: Naukri

Talend Developer (Data Warehouse/BI): Data Warehouse implementation, unit testing, troubleshooting, ETL (Talend, DataStage), Data Catalog, cloud databases (Snowflake), developing Data Marts, data warehousing, Operational Data Store, DWH concepts, performance tuning of queries.

Posted 1 week ago

Apply

5.0 - 10.0 years

18 - 30 Lacs

Hyderabad

Hybrid

Source: Naukri

We are seeking a skilled and experienced Collibra Developer to support and enhance our data governance and metadata management capabilities. The ideal candidate will be responsible for designing, developing, implementing, and maintaining Collibra solutions, integrating with various enterprise data systems, and ensuring alignment with data governance standards and business requirements.
Key Responsibilities: Design and configure Collibra Data Intelligence Cloud solutions (Data Catalog, Data Governance, Lineage, Privacy). Develop and maintain workflows using Collibra Workflow Designer (BPMN). Integrate Collibra with enterprise systems (ETL tools, BI tools, data lakes/warehouses) via APIs, JDBC, or other connectors. Define and maintain data domains, data dictionaries, business glossaries, and data stewardship roles.
Required Qualifications: Bachelor's degree in Computer Science, Information Systems, or related field. 5-7+ years of experience working with the Collibra Data Intelligence Platform. Hands-on experience in Collibra administration, workflow development (BPMN), and DGC configuration.
Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
Candidate's name
Email and alternate email ID
Contact and alternate contact no.
Total experience
Relevant experience
Current organization
Notice period
CCTC
ECTC
Current location
Preferred location
PAN card no.
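The API-based integration this role describes typically means pulling assets out of Collibra over its REST interface. A hedged sketch using only the standard library — the endpoint path, query parameters, and response shape below are assumptions modeled on typical REST catalog APIs, so verify them against your Collibra version's API reference:

```python
# Hypothetical sketch of querying assets from a Collibra instance over REST.
# The "/rest/2.0/assets" path, "domainId"/"limit" parameters, and the
# {"results": [{"name": ...}]} response shape are assumptions, not a
# documented contract -- check the Collibra API reference for your release.
import json
from urllib.parse import urlencode
from urllib.request import Request

def build_asset_request(base_url, token, domain_id, limit=100):
    """Assemble an authenticated GET request for the assets in one domain."""
    query = urlencode({"domainId": domain_id, "limit": limit})
    return Request(
        f"{base_url}/rest/2.0/assets?{query}",
        headers={"Authorization": f"Bearer {token}"},
    )

def asset_names(response_body):
    """Extract asset names from a JSON response body (assumed shape)."""
    return [a["name"] for a in json.loads(response_body).get("results", [])]

req = build_asset_request("https://collibra.example.com", "TOKEN", "dom-1")
print(req.full_url)
sample = '{"results": [{"name": "customer_table"}, {"name": "orders_view"}]}'
print(asset_names(sample))
```

Splitting request construction from response parsing keeps both halves unit-testable without a live Collibra instance, which is how catalog integrations are usually validated in CI.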

Posted 1 week ago

Apply

3.0 - 8.0 years

0 - 1 Lacs

Thane, Hyderabad, Navi Mumbai

Work from Office

Source: Naukri

Role & responsibilities: Lead the end-to-end implementation of a data cataloging solution within AWS (preferably AWS Glue Data Catalog, or third-party tools like Apache Atlas, Alation, Collibra, etc.). Establish and manage metadata frameworks for structured and unstructured data assets in the data lake and data warehouse environments. Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. Collaborate with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes. Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, PySpark, or custom Glue/EMR jobs. Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date. Implement role-based access controls and ensure compliance with data privacy and regulatory standards. Create detailed documentation and deliver training/workshops for internal stakeholders on using the data catalog.
Preferred candidate profile: AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect). Experience with data catalog tools like Alation, Collibra, or Informatica EDC, or hands-on experience with open-source tools. Exposure to data quality frameworks and stewardship practices. Knowledge of data migration with data catalog and data marts is a plus. 4 to 8+ years of experience in data engineering or metadata management roles. Proven expertise in implementing and managing data catalog solutions within AWS environments. Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation. Hands-on experience with metadata ingestion, data lineage, and classification processes. Proficiency in Python, SQL, and automation scripting for metadata pipelines. Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines).
Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus. Strong communication, problem-solving, and stakeholder management skills.
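The metadata-standards enforcement described in this role often amounts to checking each Glue table entry for required governance tags and column documentation. A sketch of that check over the dict shape returned by Glue's `get_table` API — the required tag names ("classification", "data_owner") are illustrative governance conventions, not Glue requirements:

```python
# Sketch of a catalog-compliance check over an AWS Glue `get_table`
# response. The required parameter names below are hypothetical
# governance conventions. In a real job the dict would come from boto3:
#   table = boto3.client("glue").get_table(DatabaseName=db, Name=name)["Table"]

REQUIRED_TABLE_PARAMS = ("classification", "data_owner")

def catalog_violations(table):
    """Return a list of governance violations for one Glue table entry."""
    problems = []
    params = table.get("Parameters", {})
    for key in REQUIRED_TABLE_PARAMS:
        if key not in params:
            problems.append(f"missing table parameter: {key}")
    # Every column should carry a description for discoverability.
    for col in table.get("StorageDescriptor", {}).get("Columns", []):
        if not col.get("Comment"):
            problems.append(f"undocumented column: {col['Name']}")
    return problems

table = {
    "Name": "customers",
    "Parameters": {"classification": "pii"},
    "StorageDescriptor": {"Columns": [
        {"Name": "id", "Type": "bigint", "Comment": "primary key"},
        {"Name": "email", "Type": "string"},
    ]},
}
print(catalog_violations(table))
```

Running such a check across all databases in the catalog, and failing the CI/CD pipeline on violations, is one common way to keep assets "properly cataloged, discoverable, and compliant".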

Posted 1 week ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

We are hiring a Data Governance Analyst Level 3 for a US-based IT company in Hyderabad. Candidates with experience in Data Governance can apply.
Job Title: Data Governance Analyst Level 3
Location: Hyderabad
Experience: 7+ Years
CTC: 18 LPA - 20 LPA
Working shift: Day shift
Description: We are seeking a seasoned and detail-focused Senior Data Governance Analyst (Level 3) to support and drive enterprise-wide data governance initiatives. This position will play a crucial role in implementing and managing data governance frameworks, ensuring data quality, and supporting regulatory compliance efforts across business units. The ideal candidate will bring deep expertise in data governance best practices, data quality management, metadata, and compliance standards within the financial services industry. As a senior team member, the analyst will collaborate closely with data stewards, business stakeholders, and technical teams to ensure consistent, accurate, and trusted use of enterprise data.
Key Responsibilities: Implement and enhance data governance policies, standards, and processes across the organization. Partner with business and technical teams to define and manage data ownership, stewardship, and accountability models. Maintain and improve metadata and data lineage documentation using tools such as Collibra, Alation, or similar platforms. Monitor key data quality metrics, conduct root cause analysis, and lead issue resolution efforts. Ensure compliance with regulatory data requirements (e.g., BCBS 239, GDPR, CCPA). Facilitate and lead data governance meetings, working groups, and stakeholder communications. Support the creation and deployment of data literacy initiatives across the enterprise. Document governance practices and develop reports for audits and executive leadership. Serve as a subject matter expert in data governance and promote data management best practices across departments.
Required Skills & Qualifications: 5+ years of experience in Data Governance, Data Quality, or Data Management roles. Proven experience in developing and managing data governance frameworks in complex organizational environments. Strong understanding of data quality principles, data standards, and issue management workflows. Experience with metadata management, data cataloging, and lineage tracking. Proficiency with governance tools like Collibra, Alation, or similar platforms. Solid grasp of data compliance and regulatory standards in the financial services sector. Excellent communication, stakeholder engagement, and documentation skills. Strong analytical thinking and problem-solving capabilities.
Preferred Qualifications: Experience in banking or financial services environments. Understanding of enterprise data architecture, Master Data Management (MDM), and BI/reporting systems. Knowledge of data privacy regulations such as GDPR, CCPA, etc.
Experience working within Agile project methodologies. For further assistance, contact/WhatsApp 9354909517 or write to hema@gist.org.in.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: Foundit

Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
Your role and responsibilities
Role Overview: We are looking for an experienced Denodo SME to design, implement, and optimize data virtualization solutions using Denodo as the enterprise semantic and access layer over a Cloudera-based data lakehouse. The ideal candidate will lead the integration of structured and semi-structured data across systems, enabling unified access for analytics, BI, and operational use cases.
Key Responsibilities: Design and deploy the Denodo Platform for data virtualization over Cloudera, RDBMS, APIs, and external data sources. Define logical data models, derived views, and metadata mappings across layers (integration, business, presentation). Connect to Cloudera Hive, Impala, Apache Iceberg, Oracle, and other on-prem/cloud sources. Publish REST/SOAP APIs and JDBC/ODBC endpoints for downstream analytics and applications. Tune virtual views, caching strategies, and federation techniques to meet performance SLAs for high-volume data access.
Implement Denodo smart query acceleration, usage monitoring, and access governance. Configure role-based access control (RBAC) and row/column-level security, and integrate with enterprise identity providers (LDAP, Kerberos, SSO). Work with data governance teams to align Denodo with enterprise metadata catalogs (e.g., Apache Atlas, Talend).
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise: 8-12 years in data engineering, with 4+ years of hands-on experience in the Denodo Platform. Strong experience integrating RDBMS (Oracle, SQL Server), Cloudera CDP (Hive, Iceberg), and REST/SOAP APIs. Denodo Admin Tool, VQL, Scheduler, Data Catalog. SQL, shell scripting, basic Python (preferred). Deep understanding of query optimization, caching, memory management, and federation principles. Experience implementing data security, masking, and user access control in Denodo.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Hybrid

Source: Naukri

Location - Chennai
Responsibilities
Direct Responsibilities: Work closely with the WM IT DCIO team to execute Data Governance implementation across data initiatives, e.g., RAMI (Data Retention), Data Privacy by Design, Data Quality, etc. Create and test proof-of-concept solutions to support the strategic evolution of the software applications. Act as the Data Governance SME within Wealth Management, operationally working with the Data Custodian IT Officer (DCIO), DPO (Data Protection Officer), and CDO (Chief Data Officer) teams. Hands-on with the development, testing, configuration, and deployment of software systems in the Data Transversal organization. Operationalize data policies/frameworks including business glossaries, data dictionaries, data profiling, etc.
Technical & Behavioral Competencies: Minimum 7+ years of experience with data expertise in at least 2 of the following: Data Governance, Data Quality, Data Privacy & Protection, Data Management. Bachelor's degree in Engineering (Computer Science or Electronics & Communications).
Qualifications: Hands-on experience working with data (data profiling, scorecards/BI). Previously worked in Data Governance and Data Security. Financial services products and applications knowledge. Working knowledge across Excel, SQL, Python, Collibra, PowerBI, and cloud. Plus: Collibra Developer, Ranger Certified, or a similar certification is preferred.
Skills required: Knowledge of data and the compliance/regulatory environment (global and local data regulations). Demonstrates flexibility and willingness to accept assignments and challenges in a rapidly changing environment. Understands how data is used (e.g., Analytics, Business Intelligence, etc.)
Working knowledge of the data lifecycle and data transformations/data lineage. At least 2 of the following: Data Quality, Data Architecture, Database Management, Data Privacy & Protection, Security of Data. Ability to define relevant key performance indicators (KPIs). Problem solving and team collaboration. Self-motivated and results-driven. Project management and business analysis. Agile thinking.
Transversal skills: Proficient in designing new processes and adapting Group IT processes to Wealth Management IT. Strong communication to collaborate across stakeholders and support change. Minimum 7 years of experience in Data/Tech.
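The hands-on data profiling this role asks for usually starts with per-column null rates and distinct counts. A minimal standard-library sketch of those two metrics — the dataset and column names are illustrative, and in practice the same numbers come from SQL or a tool like Collibra DQ:

```python
# Minimal per-column profiling: null rate and distinct count over a
# list-of-dicts dataset. Column names are hypothetical; production
# profiling runs the same arithmetic via SQL or a DQ platform.

def profile(rows):
    """Return {column: {"null_rate": float, "distinct": int}}."""
    columns = {c for r in rows for c in r}
    stats = {}
    for col in sorted(columns):
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "null_rate": 1 - len(non_null) / len(values),
            "distinct": len(set(non_null)),
        }
    return stats

clients = [
    {"client_id": 1, "country": "IN"},
    {"client_id": 2, "country": "IN"},
    {"client_id": 3, "country": None},
]
print(profile(clients))
```

These two metrics alone surface most stewardship issues: a spiking null rate flags a broken feed, and a distinct count of 1 on a key column flags a join or load defect.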

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.
Inviting applications for the role of Senior Principal Consultant - Lead Solution Architect, Google Cloud Platform Pre-Sales - Data & AI
About Genpact: Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our vibrant team in India to shape the future of business through intelligent operations and drive meaningful impact.
The Opportunity: Genpact India is expanding its Google Cloud Platform (GCP) capabilities and seeking a highly experienced and technically astute Senior Principal Consultant / Lead Solution Architect specializing in Data and Artificial Intelligence. This critical role will be at the forefront of Genpact's growth in the GCP ecosystem in India and globally, leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering strong client relationships. You will operate as a trusted advisor, translating intricate client challenges into compelling, implementable solutions on Google Cloud.
Responsibilities:
Solution Architecture & Design Leadership: Lead the technical pre-sales process for complex data and AI opportunities on Google Cloud Platform / AWS / Azure (any 2) from initial discovery through to proposal and Statement of Work (SOW) development. Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures leveraging a wide array of GCP / AWS / Azure services (any 2).
Client Engagement & Advisory: Engage deeply with prospective clients' senior IT and business stakeholders, including CXOs, to understand their strategic objectives, critical business challenges, and existing data landscape. Position Genpact's GCP / AWS / Azure Data & AI offerings as key enablers for their digital transformation journey.
Technical Demonstrations & Workshops: Conduct impactful technical presentations, deep-dive workshops, and product demonstrations (including proof-of-concepts where required) tailored to specific client needs, showcasing the advanced capabilities of GCP / AWS / Azure Data & AI services and Genpact's differentiated value.
Proposal Development & Commercial Support: Take ownership of the technical sections of proposals, RFPs, and SOWs, ensuring accuracy, technical feasibility, clear value articulation, and alignment with Genpact's delivery capabilities.
Provide robust technical estimation and sizing for proposed solutions.
Sales & Delivery Collaboration: Partner closely with Genpact's sales teams to drive deal progression, providing technical guidance, competitive intelligence, and effective solution positioning. Collaborate with delivery teams to ensure proposed solutions are executable, scalable, and aligned with Genpact's operational excellence standards.
Technology & Market Expertise: Maintain expert-level knowledge of the latest trends, services, and product roadmaps in Google Cloud Platform, AWS / Azure (any two), Data Engineering, Machine Learning, Artificial Intelligence (including Generative AI), and relevant industry best practices.
Thought Leadership & IP Contribution: Contribute to Genpact's intellectual property by developing reusable assets and solution accelerators, and by participating in internal/external knowledge sharing, including whitepapers, blogs, and industry events.
Mentorship & Capability Building: Mentor and guide junior architects and data engineers within the team, fostering a culture of technical excellence and continuous learning in Google Cloud Data & AI.
Qualifications we seek in you!
Minimum Qualifications: Progressive experience in technical roles within data analytics, data warehousing, business intelligence, machine learning, and artificial intelligence, with a significant portion in a client-facing pre-sales, solution architecture, or consulting capacity. Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform. Deep and demonstrable expertise across the Google Cloud Data & AI stack:
Core Data Services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex.
AI/ML Services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendation AI.
BI & Visualization: Looker, Data Studio. In addition to the Google Cloud stack above, exposure to one other cloud data stack (either AWS or Azure) is required.
Proven experience in translating complex business challenges into viable, scalable technical solutions on GCP, articulated with clear business value. Exceptional communication, presentation, and interpersonal skills, with the ability to engage, influence, and build rapport with diverse audiences from technical teams to senior business executives. Strong problem-solving, analytical, and strategic thinking abilities, with a commercial mindset. Experience in leading and contributing to large, complex deal pursuits in a competitive environment. Bachelor's degree in Computer Science, Engineering, or a related technical field; Master's degree preferred. Google Cloud Professional Certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer). Ability to travel to client sites within India and potentially internationally as required.
Why join Genpact? Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities. Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up.
Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Noida

Hybrid

Source: Naukri

Data Engineer (L3) || GCP Certified
Employment Type: Full-Time
Work Mode: In-office / Hybrid
Notice: Immediate joiners
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies. Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations.
Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have). Keywords: data pipelines, agile development, scrums, GCP Data Technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture. Qualifications: Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. 4+ years of data engineering experience. 2 years of data solution architecture and design experience. GCP Certified Data Engineer (preferred). Job Type: Full-time
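The responsibilities above center on automating data workflows as DAGs in tools like Apache Airflow or Prefect. The underlying idea, running tasks in dependency order, can be sketched in plain Python with the standard library (the task names are invented for illustration; Airflow would express the same dependencies with operators and its `>>` syntax):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
tasks = {
    "extract": set(),                # pull raw data from a source system
    "transform": {"extract"},        # clean and reshape the extracted data
    "validate": {"transform"},       # run data-quality checks
    "load": {"validate"},            # write to the warehouse
}

def run_order(dag):
    """Return an execution order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(tasks))  # ['extract', 'transform', 'validate', 'load']
```

An orchestrator like Airflow adds scheduling, retries, and monitoring on top of exactly this ordering guarantee.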

Posted 3 weeks ago

Apply

4 - 9 years

10 - 20 Lacs

Pune, Mumbai (All Areas)

Hybrid

Manager- Enterprise Data- 4-8 years- Mumbai (Hybrid) LOCATION - Mumbai Future Employer- This global financial services organization operates as a strategic capability center supporting a leading investment and savings business headquartered in the UK. With a focus on delivering technology, data, operations, and customer service solutions, it plays a critical role in enabling business growth and operational efficiency across global markets. Primary Key Responsibilities: Collaborate with Data Owners, Data Stewards, and business/technology teams to populate the enterprise data catalogue. Work with stakeholders to define and create data quality rules within the chosen data quality tooling. Support the roll-out and adoption of data management tooling and marketplace across various business areas. Assist in the implementation of data management policies, standards, and procedures. Contribute to other data management activities as required to enhance the organization's data capabilities. Utilize SQL to query and analyze datasets for data profiling and validation. Effectively manage time, prioritize tasks, and organize work to ensure deadlines are met. Demonstrate a proactive approach to learning and a passion for making a tangible difference in data management practices. Requirements: Practical experience of working in a data governance/management team at a significant organization for 4+ years. Working experience in one or more data management tools such as Informatica Axon/IDQ/EDC, Collibra, Datactics, or Microsoft Purview. Good knowledge of SQL for querying and analyzing datasets. Effective time management, prioritization, and organizational skills. Demonstrated passion and enthusiasm to learn and contribute to data management initiatives. Bachelor's or Master's degree in IT. Experience working in the financial services / insurance industry (Desirable). Data Management certifications (a plus). What's in it for you? 
Opportunity to be part of a significant transformation program within a leading financial services organization. Exposure to cutting-edge data management tools and technologies. Collaborative work environment with opportunities to interact with various business and technology stakeholders. Be a key contributor in building and enhancing the organization's data governance and management capabilities. Potential for professional growth and development within the Enterprise Data team and M&G plc. Be part of an inclusive employer that values diversity and fosters a culture where difference is celebrated. Reach us: If you feel this opportunity is well aligned with your career progression plans, please feel free to reach me with your updated profile at priya.bhatia@crescendogroup.in. Disclaimer: Crescendo Global specializes in Senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Note: We receive a lot of applications on a daily basis, so it becomes difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted if you don't hear back from us within 1 week. Your patience is highly appreciated.
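The posting above asks for SQL skills applied to data profiling and validation. A minimal, self-contained sketch of that kind of check using an in-memory SQLite table (the table, columns, and rows are invented for illustration):

```python
import sqlite3

# Build a tiny dataset to profile.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "a@example.com", "IN"), (2, None, "IN"), (3, "c@example.com", None)],
)

# A typical completeness rule: total rows vs. rows missing an email.
row_count, null_emails = conn.execute(
    "SELECT COUNT(*), SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)"
    " FROM customers"
).fetchone()
print(row_count, null_emails)  # 3 1
```

Data quality tools such as Informatica IDQ or Collibra DQ let stewards register queries like this as named rules with thresholds and alerting.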

Posted 1 month ago

Apply

10 - 20 years

20 - 30 Lacs

Hyderabad

Remote

Looking for Full-time, Consulting, or Freelance Experts with Alation Experience. Note: Looking for immediate joiners only. Summarized Purpose: We are seeking an exceptionally skilled and motivated Solution Architect/Data Governance Lead to play a pivotal role in data analytics and business intelligence, driving impactful solutions that empower organizations to harness the full potential of their data, enabling them to make informed decisions that help achieve business objectives and foster a data-driven culture. Essential Functions: Design and architect end-to-end data governance solutions, focusing on the implementation of the Alation tool, to meet the organization's data governance objectives and requirements. Collaborate with business stakeholders, data stewards, and IT teams to understand data governance needs and translate them into technical requirements, leveraging the capabilities of the Alation tool. Develop data governance strategies, policies, and standards that align with industry best practices and regulatory requirements, leveraging the Alation tool's features and functionalities. Implement and configure the Alation tool to support data governance initiatives, including data cataloging, data lineage, data quality monitoring, and metadata management. Define and implement data governance workflows and processes within the Alation tool, ensuring efficient data governance operations across the organization. Collaborate with cross-functional teams to integrate the Alation tool with existing data systems and infrastructure, ensuring seamless data governance processes. Conduct data profiling, data quality assessments, and data lineage analysis using the Alation tool to identify data issues and develop remediation strategies. Provide guidance and support to business stakeholders, data stewards, and data owners on the effective use of the Alation tool for data governance activities.
Stay updated on the latest trends, emerging technologies, and best practices in data governance and the Alation tool, and proactively recommend enhancements and improvements. Collaborate with IT teams to ensure the successful implementation, maintenance, and scalability of the Alation tool, including upgrades, patches, and configurations. Knowledge, Skills and Abilities: Bachelor's degree in computer science, information systems, or a related field; a master's degree is preferred. Must have a minimum of 15 years of experience in IT, including at least 3 years of experience with Alation and 10+ years of experience in SQL, EDW, or other Data Engineering/Data Science capabilities. Proven experience as a Data Governance Solution Architect, Data Architect, or similar role, with hands-on experience implementing data governance solutions using the Alation tool. (Must Have) Strong expertise in designing and implementing end-to-end data governance frameworks and solutions, leveraging the Alation tool. (Must Have) In-depth knowledge of data governance principles, data management best practices, and regulatory requirements (e.g., GDPR, CCPA). Proficiency in configuring and customizing the Alation tool, including data cataloging, data lineage, data quality, and metadata management features. (Must Have) Experience in data profiling, data quality assessment, and data lineage analysis using the Alation tool or similar data governance platforms. (Must Have) Familiarity with data integration, data modeling, and data architecture concepts. Excellent analytical, problem-solving, and decision-making skills with keen attention to detail. Strong communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and influence stakeholders. Proven ability to manage multiple projects simultaneously, prioritize tasks, and meet deadlines.
Professional certifications in data governance, such as CDMP (Certified Data Management Professional), DAMA-CDMP (Data Management Association Certified Data Management Professional), or similar certifications, are a plus. (Nice to Have) Good understanding of Master Data Management, data integration, and SQL skills. Exposure to dimensional data modeling, Data Vault modeling, ETL, ELT, and data warehousing methodologies. Ability to effectively communicate in writing and orally with a wide range of audiences and maintain interpersonal relationships. Ability to work within time constraints and manage multiple tasks against critical deadlines. Ability to perform problem solving and apply critical thinking, deductive reasoning, and inductive reasoning to identify solutions. Certifications from Alation. (Must Have)
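The data lineage analysis described above boils down to impact analysis over a graph of dataset dependencies. A minimal sketch of that traversal in plain Python (the dataset names and edges are invented; a catalog tool such as Alation answers the same "what breaks downstream?" question over real harvested metadata):

```python
# Toy lineage graph: upstream dataset -> datasets built from it.
lineage = {
    "raw.orders": ["staging.orders"],
    "staging.orders": ["mart.daily_sales", "mart.customer_ltv"],
}

def downstream(asset, graph):
    """Every asset affected if `asset` changes (breadth-first walk)."""
    seen, queue = set(), [asset]
    while queue:
        for nxt in graph.get(queue.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(downstream("raw.orders", lineage)))
# ['mart.customer_ltv', 'mart.daily_sales', 'staging.orders']
```

Running the same walk from a leaf node such as "mart.daily_sales" returns an empty set, which is how a steward confirms a change is safe to make in isolation.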

Posted 1 month ago

Apply

5 - 8 years

2 - 5 Lacs

Pune, Mumbai, Bengaluru

Work from Office

Data governance, data catalog, and data stewardship; configuration and maintenance of catalog tools such as Alation, Collibra, Informatica, or similar; experience with programming languages such as SQL, HTML, Java, Python. Required Candidate profile: Notice Period: 0-30 days. Education: BE, B.Tech, ME, M.Tech. Location: Bangalore, Pune, Mumbai, Hyderabad, Chennai, Gurgaon, Noida

Posted 2 months ago

Apply

8 - 13 years

25 - 30 Lacs

Hyderabad

Work from Office

Data Engineering Team. As a Lead Data Engineer for India, you will be accountable for leading the technical aspects of product engineering by being hands-on, working on the enhancement, maintenance, and support of the product on which your team is working, within your technology area. You will be responsible for your own hands-on coding, provide design thinking and design solutions, ensure the quality of your team's output, represent your team in product-level technical forums, and ensure your team provides technical input to and aligns with the overall product road-map. How will you make an impact? You will work with Engineers in other technology areas to define the overall technical direction for the product in alignment with the Group's technology roadmap, standards, and frameworks; with product owners and business stakeholders to shape the product's delivery roadmap; and with support teams to ensure its smooth operation. You will be accountable for the overall technical quality of the work produced by India, in line with the expectations of stakeholders, clients, and the Group. You will also be responsible for line management of your team of Engineers, ensuring that they perform to the expected levels and that their career development is fully supported.
Key responsibilities (spanning Produce Quality Code, Produce Quality Technical Design, Squad Collaboration, People Management, and operating at a high level of productivity):
o Code follows team standards, is structured to ensure readability and maintainability, and goes through review smoothly, even for complex changes
o Designs respect best practices and are favourably reviewed by peers
o Critical paths through code are covered by appropriate tests
o High-level designs/architectures align to the wider technical strategy, presenting reusable APIs where possible and minimizing system dependencies
o Data updates are monitored and complete within SLA
o Technical designs follow team and group standards and frameworks, are structured to ensure reusability, extensibility, and maintainability, and go through review smoothly, even for complex changes
o Estimates are consistently challenging, but realistic
o Most tasks are delivered within estimate
o Complex or larger tasks are delivered autonomously
o Sprint goals are consistently achieved
o Demonstrate commitment to continuous improvement of squad activities
o The product backlog is consistently well-groomed, with a responsible balance of new features and technical debt mitigation
o Other Engineers in the squad feel supported in their development
o Direct reports have meaningful objectives recorded in Quantium's Performance Portal, and understand how those objectives relate to business strategy
o Direct reports' career aspirations are understood and documented, with action plans in place to move towards those goals
o Direct reports have regular catch-ups to discuss performance, career development, and their ongoing happiness/engagement in their role
o Any performance issues are identified, documented, and agreed, with realistic remedial plans in place

Key activities:
o Build technical product/application engineering capability in the team in line with the Group's technical roadmap, standards, and frameworks
o Write polished code, aligned to team standards, including appropriate unit/integration tests
o Review code and test cases produced by others, to ensure changes satisfy the associated business requirement, follow best practices, and integrate with the existing code-base
o Provide constructive feedback to other team members on the quality of code and test cases
o Collaborate with other Lead/Senior Engineers to produce high-level designs for larger pieces of work
o Validate technical designs and estimates produced by other team members
o Merge reviewed code into release branches, resolving any conflicts that arise, and periodically deploy updates to production and non-production environments
o Troubleshoot production problems and raise/prioritize bug tickets to resolve any issues
o Proactively monitor system health and act to report/resolve any issues
o Provide out-of-hours support for periodic ETL processes, ensuring SLAs are met
o Work with business stakeholders and other leads to define and estimate new epics
o Contribute to backlog refinement sessions, helping to break down each epic into a collection of smaller user stories that will deliver the overall feature
o Work closely with Product Owners to ensure the product backlog is prioritized to maximize business value and manage technical debt
o Lead work breakdown sessions to define the technical tasks required to implement each user story
o Contribute to sprint planning sessions, ensuring the team takes a realistic but challenging amount of work into each sprint and each team member will be productively occupied
o Contribute to the team's daily stand-up, highlighting any delays or impediments to progress and proposing mitigations for those issues
o Contribute to sprint review and sprint retro sessions, to maintain a culture of continuous improvement within the team
o Coach/mentor more junior Engineers to support their continuing development
o Set and periodically review delivery and development objectives for direct reports
o Identify each direct report's longer-term career objectives and, as far as possible, factor these into work assignments
o Hold fortnightly catch-ups with direct reports to review progress against objectives, assess engagement, and give them the opportunity to raise concerns about the product or team
o Work through the annual performance review process for all team members
o Conduct technical interviews as necessary to recruit new Engineers

The superpowers you'll be bringing to the team:
1. 8+ years of experience designing, developing, and implementing end-to-end data solutions (storage, integration, processing, access) in Google Cloud Platform (GCP) or similar cloud platforms
2. Strong experience with SQL
3. Values delivering high-quality, peer-reviewed, well-tested code
4. Create ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time
5. Knowledge of DevOps functions and the ability to contribute to CI/CD pipelines
6. Strong knowledge of data warehousing and data modelling, including techniques like dimensional modelling
7. Strong hands-on experience with BigQuery/Snowflake, Airflow/Argo, Dataflow, Data Catalog, Vertex AI, Pub/Sub, etc., or equivalent products in other cloud platforms
8. Solid grip on programming languages like Python or Scala
9. Hands-on experience manipulating Spark at scale, with true in-depth knowledge of the Spark API
10. Experience working with stakeholders; mentoring experience for juniors in the team is good to have
11. Recognized as a go-to person for high-level designs and estimations
12. Experience working with source control tools (Git preferred), with a good understanding of branching/merging strategies
13. Experience in Kubernetes and Azure will be an advantage
14. Understanding of GNU/Linux systems and Bash/scripting
15. Bachelor's degree in Computer Science, Information Technology, or a related discipline
16. Comfortable working in a fast-moving, agile development environment
17. Excellent problem-solving/analytical skills
18. Good written/verbal communication skills
19. Commercially aware, with the ability to work with a diverse range of stakeholders
20. Enthusiasm for coaching and mentoring junior engineers
21. Experience in leading teams, including line management responsibilities
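The requirements above mention dimensional modelling. A minimal star-schema sketch in plain Python (the field names and figures are invented): raw sales records are split into a product dimension with surrogate keys and a fact table that references it:

```python
# Raw records as they might arrive from an operational system.
raw = [
    {"product": "widget", "price": 10, "qty": 2},
    {"product": "gadget", "price": 25, "qty": 1},
    {"product": "widget", "price": 10, "qty": 5},
]

dim_product = {}  # product name -> surrogate key (the dimension table)
facts = []        # fact rows holding only keys and measures
for row in raw:
    key = dim_product.setdefault(row["product"], len(dim_product) + 1)
    facts.append({"product_key": key, "revenue": row["price"] * row["qty"]})

print(dim_product)                       # {'widget': 1, 'gadget': 2}
print(sum(f["revenue"] for f in facts))  # 95
```

In a warehouse such as BigQuery or Snowflake the same split lets large fact tables store compact keys and measures while descriptive attributes live once in the dimension.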

Posted 2 months ago

Apply

6 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced IDMC CDGC (Informatica Data Management Cloud - Cloud Data Governance and Catalog) Consultant to design and implement data governance, metadata management, and data catalog solutions. The ideal candidate should have expertise in data governance frameworks, data lineage, metadata management, data cataloging, and Informatica IDMC CDGC. Key Responsibilities: Design and implement Informatica IDMC CDGC (Cloud Data Governance and Catalog) solutions to enable enterprise-wide data governance. Develop and maintain data catalog, metadata repositories, and business glossaries. Configure data lineage, impact analysis, and data classification to improve data visibility and compliance. Define and enforce data governance policies, data ownership, and stewardship models. Work with data quality, data profiling, and compliance frameworks (GDPR, CCPA, HIPAA, etc.). Collaborate with business and IT teams to establish data governance best practices and workflows. Integrate CDGC with various data platforms (Snowflake, AWS, Azure, GCP, Databricks, etc.). Develop custom rules, policies, and workflows for data governance automation. Ensure role-based access control (RBAC) and security best practices for data access. Provide training and support to data stewards, analysts, and business users on IDMC CDGC features. Required Skills & Qualifications: 6+ years of experience in data governance, data cataloging, and metadata management. Hands-on experience with Informatica IDMC CDGC (Cloud Data Governance and Catalog). Strong knowledge of data lineage, data profiling, data quality, and business glossaries. Proficiency in SQL, metadata modeling, and integration of governance tools. Experience with data governance frameworks (DCAM, DAMA-DMBOK, etc.). Strong understanding of data privacy, security, and compliance requirements. Familiarity with ETL tools, cloud data platforms, and API-based integrations. 
Excellent communication and documentation skills to support data governance initiatives.
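The responsibilities above include enforcing role-based access control (RBAC) for data access. The core idea can be sketched in a few lines of Python (the role and permission names are invented; a platform like IDMC CDGC configures equivalent policies through its own administration interface rather than code):

```python
# Map each role to the set of actions it is allowed to perform.
ROLE_PERMISSIONS = {
    "data_steward": {"read", "annotate", "approve"},
    "analyst": {"read"},
}

def can(role: str, action: str) -> bool:
    """Return True if `role` grants `action`; unknown roles get nothing."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(can("analyst", "read"))     # True
print(can("analyst", "approve"))  # False
```

Keeping permissions attached to roles rather than individual users is what makes the model auditable: granting access becomes a role assignment, not a one-off exception.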

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies