
4985 Data Governance Jobs - Page 34

Set up a job alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

delhi

On-site

As a Technical Consultant fluent in English and either Japanese or Mandarin, you will leverage your expertise in deploying and configuring SAS technologies to deliver high-quality outputs to our customers. Your role involves understanding and applying best practices, supporting sales opportunities, contributing to technical documentation, and maintaining thorough knowledge of SAS technologies. You will collaborate with others on consulting projects, conduct requirements-gathering sessions, and lead knowledge-transfer sessions as needed.

We are seeking Technical Consultants with skills in Data Integration, Solutioning, or Decisioning. The ideal candidate has strong analytical and problem-solving skills in these areas and ensures high levels of customer satisfaction throughout consulting and implementation services.

Required qualifications include fluency in English and either Japanese or Mandarin, and a Bachelor's degree, preferably in Business, Computer Science, or a related field. Depending on your specialization, you will be expected to have specific knowledge and experience:

Data Integration: designing and developing ETL/ELT pipelines, data modeling, SQL, creating interactive dashboards, data quality, data governance, master data management, and integration with various data sources.

Solutioning: knowledge and experience with SAS solutions, solution implementation, and the ability to map customer requirements to SAS solutions effectively.

Decisioning: experience modeling in SAS or other analytical tools; knowledge of the SAS Intelligent Decisioning tool, automation of operational decisions, real-time decision processing, integration with REST APIs and event streams, what-if analysis, real-time monitoring, anomaly detection, and advanced statistical analysis concepts.

At our organization, diversity and inclusion are integral to our culture. We value the unique talents that each individual brings and strive to create software that reflects the diversity of our users and customers. We believe that diversity drives innovation and are committed to fostering an inclusive environment where everyone feels welcome and valued.

Please note that SAS only sends emails from verified sas.com email addresses and never asks for sensitive personal information or money. If you have any doubts about the authenticity of communication from SAS, please contact Recruitingsupport@sas.com.
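The decisioning work mentioned above — automating operational decisions over incoming records — can be sketched in plain Python. This is an illustrative rule engine only, not the SAS Intelligent Decisioning product; the rule names, fields, and thresholds are invented for the example:

```python
# Illustrative sketch of automated operational decisioning:
# each rule maps a predicate over an input record to a decision.
# Rule names, fields, and thresholds are invented for the example.

RULES = [
    ("high_risk_amount", lambda r: r["amount"] > 10_000, "refer_to_analyst"),
    ("blocked_country",  lambda r: r["country"] in {"XX", "YY"}, "decline"),
]

def decide(record, default="approve"):
    """Return the decision of the first matching rule, else the default."""
    for name, predicate, decision in RULES:
        if predicate(record):
            return {"decision": decision, "rule": name}
    return {"decision": default, "rule": None}

print(decide({"amount": 25_000, "country": "DE"}))  # first rule fires
print(decide({"amount": 100, "country": "DE"}))     # no rule fires -> default
```

Real decisioning platforms externalize these rules so business users can change them without redeploying code; the first-match-wins loop is the simplest possible evaluation strategy.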

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

As an AWS Data Architect, you will be responsible for analyzing, architecting, designing, and actively developing cloud data warehouses, data lakes, and other cloud-based data solutions within the AWS environment. Your expertise will lie in working with modern data integration frameworks, big data, DevOps, and data programming languages such as Python and SQL.

Your key responsibilities will include designing and developing scalable data ingestion frameworks to handle a variety of datasets, utilizing tools like EMR, Redshift, SageMaker, and other Platform as a Service (PaaS) offerings from AWS. You will innovate cloud-native data solutions, focusing on database structures, security, backup, recovery specifications, and functional capabilities. In addition, you will build and maintain data integration utilities, data scheduling, monitoring capabilities, source-to-target mappings, and data lineage trees. You will also implement and manage production processes related to data ingestion, transformation, coding utilities, storage, reporting, and other data integration points. Leading technical teams to deliver projects and meet both business and IT objectives is a crucial aspect of the job.

To qualify for this role, you should possess at least 8 years of experience in data engineering or data warehousing, including a minimum of 4 years of hands-on experience building AWS data solutions. Your background should encompass expertise in data architecture, data modeling, data engineering, and data governance across various types of databases. Familiarity with agile development practices, including DevOps concepts, and experience with AWS SDKs and programmatic access services are essential. Proficiency in a relevant data/development language for cloud platforms, such as R, Python, Java, C#, or Unix shell, along with strong SQL skills, is required. Cloud certifications would be considered advantageous.

This position offers the opportunity to lead technical teams, drive project delivery, and achieve organizational goals through effective data architecture and development within the AWS ecosystem.
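The source-to-target mappings mentioned above can be illustrated with a minimal Python sketch. The column names and transforms are invented for the example; production frameworks typically drive this from stored metadata rather than inline dictionaries:

```python
# Sketch of a source-to-target mapping as used in data ingestion
# frameworks: each target column names its source field and an
# optional transform. Column names are invented for illustration.

MAPPING = {
    "customer_id": ("CUST_NO",   int),        # cast text id to integer
    "full_name":   ("CUST_NAME", str.title),  # normalise capitalisation
    "signup_date": ("REG_DT",    None),       # passthrough, no transform
}

def apply_mapping(source_row, mapping=MAPPING):
    """Build a target row from a source row using the mapping."""
    target = {}
    for target_col, (source_col, transform) in mapping.items():
        value = source_row[source_col]
        target[target_col] = transform(value) if transform else value
    return target

row = {"CUST_NO": "1042", "CUST_NAME": "jane doe", "REG_DT": "2024-01-15"}
print(apply_mapping(row))
```

Keeping the mapping as data (rather than hard-coded logic) is what makes the lineage trees the role mentions derivable: each target column records exactly which source field it came from.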

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services, leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated consulting services to financial institutions and other capital markets participants. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.

The opportunity: We're looking for a candidate with strong expertise in the financial services domain and hands-on data visualization development experience.

Your key responsibilities: Work both as a good team player and an individual contributor throughout design, development, and delivery phases, focusing on quality deliverables. Work directly with clients to understand requirements and provide inputs to build optimum solutions. Develop new capabilities for clients in the form of visualization dashboards in tools like Power BI, Spotfire, and Tableau. Provide support for organization-level initiatives and operational activities. Ensure continual knowledge management and participate in all internal L&D team trainings.

Skills and attributes for success: Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates. Strong communication, presentation, and team-building skills, with experience producing high-quality reports, papers, and presentations. Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. Exposure to tools like Power BI, AAS, DWH, and SQL.

To qualify for the role, you must have a BE/BTech/MCA/MBA with 2-6 years of industry experience. Proven experience in any of the reporting tools: Power BI (preferred), Tableau, etc. Experience designing and building dashboard automation processes and organizing analysis findings into logical presentations. Strong basic understanding of and hands-on experience in SQL; relational database experience such as DB2, Oracle, SQL Server, or Teradata. Exposure to any ETL tool. Very strong data modeling skills.

Power BI: connecting to data sources, importing data, and transforming data for business intelligence. Excellent analytical thinking for translating data into informative visuals and reports. Able to implement row-level security on data and understand application security layer models in Power BI. Able to connect and configure gateways and implement roles/permissions. Proficient in writing DAX queries in Power BI Desktop. Expertise in using advanced-level calculations on the dataset.

Tableau: understanding the requirement for using data extract files; deciding between TWBX and TWB files; knowledge of joining tables inside Tableau; understanding of Tableau Server configurations and of publishing dashboards on the server; knowledge of embedding a published dashboard in an iFrame.

Ideally, you'll also have a good understanding of data management concepts and data strategy, experience with data preparation tools like Alteryx, knowledge of data concepts such as data warehouses, data marts, data extraction and preparation processes, and data modeling, an understanding of the importance of data governance and data security, and experience in the Banking and Capital Markets domains.

What we look for: A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What working at EY offers: At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching, and feedback from engaging colleagues; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people, and society, and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Microsoft Purview Data Governance Consultant / Sr. Consultant with 5-8 years of experience, you will be responsible for developing and deploying solutions with Microsoft Purview or similar data governance platforms. You must have a deep understanding of data governance principles, including metadata management, data cataloging, lineage tracking, and compliance frameworks. Proficiency in Microsoft Azure, especially with services like Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Blob Storage, will be crucial for this role. Experience in data integration, ETL pipelines, and data modeling, as well as knowledge of security and compliance standards, is nice to have. Problem-solving skills and the ability to collaborate in cross-functional team environments are essential, and strong communication and documentation skills are required for effective collaboration with technical and non-technical stakeholders.

Your responsibilities will include leading the program using Agile methodology and providing guidance on data governance, Microsoft Purview, and Azure data management. You will design and deploy Microsoft Purview solutions for data governance and compliance, aligning implementations with business and regulatory requirements. You will oversee data integration efforts, ensure smooth lineage and metadata management, and configure data classification and sensitivity policies to meet compliance standards. You will also collaborate with data teams to define and enforce governance practices, ensure Purview services meet business, security, and compliance needs, lead data discovery efforts, monitor and troubleshoot Purview services, document best practices and governance workflows, and train users while mentoring consultants in documentation and training efforts.
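The data classification work described above can be illustrated with a small Python sketch. This is not the Purview API — it only shows the pattern-matching idea behind classification rules; the labels and regexes are invented stand-ins for the policies a governance tool manages:

```python
import re

# Illustrative data-classification sketch: tag a column whose sample
# values all match a sensitive-data pattern. Labels and patterns are
# invented stand-ins for managed classification policies.

PATTERNS = {
    "email":       re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "card_number": re.compile(r"^\d{16}$"),
}

def classify_column(values):
    """Return the first sensitivity label matched by every sample value."""
    for label, pattern in PATTERNS.items():
        if values and all(pattern.match(v) for v in values):
            return label
    return "unclassified"

print(classify_column(["a@example.com", "b@example.org"]))  # email
print(classify_column(["hello", "world"]))                  # unclassified
```

Real classifiers sample a fraction of each column and apply confidence thresholds rather than requiring every value to match, but the rule structure is the same.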

Posted 2 weeks ago

Apply

5.0 - 12.0 years

0 Lacs

karnataka

On-site

You are an experienced Senior Stibo MDM Specialist with a strong background in Master Data Management, particularly with Stibo Systems' MDM platform. Your primary responsibility will be to design, implement, and maintain complex Stibo MDM solutions. This involves developing and managing data models, workflows, and business rules to ensure data quality, integrity, and compliance with organizational standards. Collaboration with cross-functional teams is essential to identify and prioritize project requirements. You will also troubleshoot and resolve technical issues related to Stibo MDM, staying updated on the latest product releases from Stibo Systems. Additionally, mentoring junior team members and providing technical guidance is an important aspect of this role.

To qualify for this position, you should have 5-12 years of experience in Master Data Management, preferably with Stibo Systems' MDM platform. A strong understanding of data modeling, data governance, and data quality is required, along with proficiency in the platform's data management, workflow management, and business rule management capabilities. Strong analytical and problem-solving skills and excellent communication and interpersonal skills round out the profile.
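A core MDM business rule is survivorship: merging duplicate records into a single "golden record". A minimal Python sketch of one common policy — prefer the most recently updated non-empty value per field — is shown below (field names and the policy choice are invented for illustration; platforms like Stibo let you configure such rules per attribute):

```python
# Sketch of an MDM survivorship rule: merge duplicates into a golden
# record, letting the most recently updated non-empty value win.
# Field names and the policy are invented for illustration.

def golden_record(records):
    """Merge records (each with an 'updated' timestamp) into one dict."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value not in (None, ""):
                merged[field] = value  # later non-empty values overwrite
    return merged

dupes = [
    {"name": "ACME Corp",        "phone": "",         "updated": "2023-01-01"},
    {"name": "Acme Corporation", "phone": "555-0100", "updated": "2024-06-01"},
]
print(golden_record(dupes))
```

Note the empty phone on the newer record does not erase the older value — "most recent non-empty" is a different (and usually safer) rule than "most recent".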

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

You should have at least 8 years of experience in data engineering, including a minimum of 5 years in a leadership position. The role requires expertise in ETL/ELT processes and ETL tools such as Talend or Informatica. Proficiency in cloud-based data platforms like Snowflake or similar (e.g., Redshift, BigQuery) is essential. Strong SQL skills are required, along with experience in database tuning, data modeling, and schema design. Knowledge of programming languages like Python or Java for data processing is a plus, and a good understanding of data governance and compliance standards is expected. Excellent communication and project management skills are crucial, as you will be expected to prioritize and manage multiple projects simultaneously.

This role is based in Gurgaon, with the possibility of working from the office 3-4 days a week. Meals and transport will be provided. To qualify for this position, you must hold a Bachelor's or Master's degree in IT or an equivalent field. Excellent verbal and written communication skills are a must for effective collaboration within the team and with stakeholders.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for leading testing efforts for data conversion, migration, and ETL projects, ensuring test quality across all project phases. Your key responsibilities will include leading testing activities across multiple source and target systems, analyzing data mapping and transformation rules, defining test strategies and plans for data migration validation, collaborating with different teams, validating data processes, developing SQL queries for data validation, coordinating with cross-functional teams, tracking defects, and mentoring junior team members.

To be successful in this role, you should have proven experience as a Test Lead for data migration, conversion, and ETL testing projects; hands-on experience with ETL tools like Informatica, Talend, or DataStage; strong SQL skills; experience handling large volumes of data; familiarity with data warehousing concepts; proficiency in test management tools like JIRA; strong analytical and problem-solving abilities; and excellent communication and coordination skills.

Nice-to-have skills include experience with cloud-based data platforms, exposure to automation frameworks for data validation, knowledge of industry-specific data models, and testing certifications like ISTQB. Ideally, you should hold a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
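The SQL-based data validation this role describes often boils down to reconciling row counts and column aggregates between source and target after migration. A minimal, self-contained sketch using SQLite (table and column names invented for the example):

```python
import sqlite3

# Sketch of post-migration SQL validation: compare row counts and a
# per-column checksum (here SUM) between source and target tables.
# Table and column names are invented for the example.

def reconcile(conn, source, target, column):
    """Return whether row counts and column sums match across tables."""
    q = "SELECT COUNT(*), COALESCE(SUM({col}), 0) FROM {tbl}"
    src = conn.execute(q.format(col=column, tbl=source)).fetchone()
    tgt = conn.execute(q.format(col=column, tbl=target)).fetchone()
    return {"rows_match": src[0] == tgt[0], "sums_match": src[1] == tgt[1]}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (amount INT)")
conn.execute("CREATE TABLE tgt (amount INT)")
conn.executemany("INSERT INTO src VALUES (?)", [(10,), (20,)])
conn.executemany("INSERT INTO tgt VALUES (?)", [(10,), (20,)])
print(reconcile(conn, "src", "tgt", "amount"))  # both checks pass
```

In practice this is extended with per-key comparisons and hash checksums, since matching counts and sums can still hide offsetting errors.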

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

The Data Product Engineering Specialist will be responsible for designing, building, and optimizing strategic data assets that enable advanced analytics, reporting, and operational efficiencies. This role sits at the intersection of data design, product management, and business strategy, ensuring that data assets are structured, governed, and made accessible in a scalable and reusable manner.

You will design, build, and maintain scalable data products to support analytics, AI/ML, and operational business needs. You will develop high-quality data pipelines and reusable, configurable frameworks, ensuring robust, efficient, and scalable data processing, and implement data transformation and enrichment processes to make raw data useful for pricing, risk modeling, claims, and other business functions. You will ensure adherence to data modeling standards, reference data alignment, and data product governance frameworks, and work closely with product managers, designers, and domain SMEs to embed best practices and standards.

You will leverage cloud-based technologies and modern data platforms, focusing on Azure data services and Databricks, and ensure solutions align with data security, privacy, and governance policies. You will engage with platform teams to ensure efficient deployment and performance optimization of data solutions, develop automated test routines, and conduct thorough testing to ensure quality deliverables. You will integrate solutions with data management tools such as Purview to automate data quality rules, metadata management, and tagging, and develop clean, precise documentation including low-level designs, release notes, and how-to guides. You will support data product launch and adoption activities, keep your skills current by developing technology skills in other areas of the platform, and stay updated on emerging data technologies to continuously enhance data products for efficiency and business value. You will optimize data processing performance and cost-efficiency by leveraging automation and modern engineering best practices, and identify opportunities for AI/ML integration and automation within data pipelines and business processes.

You should have an advanced understanding of building and deploying production-grade solutions in complex projects, with strong experience in data engineering, data modeling, and pipeline development. Required skills include proficiency in SQL, Python, Spark, Databricks, and cloud-based data engineering tools (Azure preferred; AWS and GCP also relevant); strong experience in software engineering practices; experience with big data processing, streaming architectures (Kafka, Event Hub), and real-time data solutions; an understanding of data governance, metadata management, and data lineage frameworks; knowledge of CI/CD practices, DevOps, and Infrastructure as Code (IaC) for data solutions; and the ability to use advanced tools and techniques to identify and address data quality issues. Strong stakeholder management with the ability to influence and drive alignment across cross-functional teams, excellent problem-solving and analytical skills with a product mindset, and the ability to work in agile environments, contributing to product roadmaps and iterative development cycles, complete the profile.
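The "reusable configurable frameworks" idea mentioned above can be sketched very simply: pipeline steps are named in configuration and resolved against a registry of transforms, so new pipelines are assembled from config rather than code. The step names and transforms here are invented for illustration:

```python
# Sketch of a configuration-driven pipeline framework: steps are
# named in config and resolved to registered transforms. Step names
# and transforms are invented for illustration.

REGISTRY = {
    "drop_nulls": lambda rows: [
        r for r in rows if all(v is not None for v in r.values())
    ],
    "add_flag": lambda rows: [{**r, "valid": True} for r in rows],
}

def run_pipeline(rows, config):
    """Apply each configured step, in order, to the rows."""
    for step in config["steps"]:
        rows = REGISTRY[step](rows)
    return rows

config = {"steps": ["drop_nulls", "add_flag"]}
data = [{"id": 1, "amount": 5}, {"id": 2, "amount": None}]
print(run_pipeline(data, config))
```

The same shape scales up: in a Spark or Databricks setting the registry entries would be DataFrame transformations and the config would live in version-controlled YAML or a metadata store.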

Posted 2 weeks ago

Apply

7.0 - 10.0 years

12 - 16 Lacs

noida

Work from Office

">Salesforce Data Migration Lead 7-10 Years Noida salesforce Data Migration Key Responsibilities: Execute end-to-end data migration activities for Salesforce implementations and integrations. Analyse legacy systems, map data structures, and implement transformation logic. Prepare and maintain data migration plans, including timelines, testing, and validation procedures. Perform ETL processes using tools such as Salesforce Data Loader, Workbench, Informatica Cloud, Talend, or MuleSoft. Collaborate with functional consultants and developers to ensure accurate data mapping and transformation. Carry out data cleansing, deduplication, and validation to ensure data quality and integrity. Troubleshoot and resolve migration issues while documenting processes and lessons learned. Ensure compliance with data governance and security standards. Provide post-migration support to address data-related issues and optimize system performance. Required Skills and Experience: 7-10 years of overall IT experience, with at least 2+ years of Salesforce data migration experience . Strong understanding of Salesforce data architecture, objects, relationships, and security model. Proficiency with Salesforce Data Loader and exposure to ETL tools such as Workbench, Informatica Cloud, MuleSoft, or Talend. Knowledge of data modelling, cleansing, and transformation techniques. Strong problem-solving and communication skills. Salesforce certifications such as Data Architecture & Management Designer or Platform App Builder are an advantage.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

noida

Work from Office

">Salesforce Data Migration 5-7 Years Noida salesforce Data Migration Key Responsibilities: Execute end-to-end data migration activities for Salesforce implementations and integrations. Analyse legacy systems, map data structures, and implement transformation logic. Prepare and maintain data migration plans, including timelines, testing, and validation procedures. Perform ETL processes using tools such as Salesforce Data Loader, Workbench, Informatica Cloud, Talend, or MuleSoft. Collaborate with functional consultants and developers to ensure accurate data mapping and transformation. Carry out data cleansing, deduplication, and validation to ensure data quality and integrity. Troubleshoot and resolve migration issues while documenting processes and lessons learned. Ensure compliance with data governance and security standards. Provide post-migration support to address data-related issues and optimize system performance. Required Skills and Experience: 5 7 years of overall IT experience, with at least 2+ years of Salesforce data migration experience . Strong understanding of Salesforce data architecture, objects, relationships, and security model. Proficiency with Salesforce Data Loader and exposure to ETL tools such as Workbench, Informatica Cloud, MuleSoft, or Talend. Knowledge of data modelling, cleansing, and transformation techniques. Strong problem-solving and communication skills. Salesforce certifications such as Data Architecture & Management Designer or Platform App Builder are an advantage.

Posted 2 weeks ago

Apply

10.0 - 18.0 years

7 - 8 Lacs

noida

Work from Office

Senior Technical Team Leader - Business Intelligence, Data Governance & Reporting

What you will do -- Lead the development and execution of BI strategies, tools, and reporting solutions in alignment with business objectives. Serve as a subject matter expert for BI within the organization, supporting internal initiatives and mentoring team members on best practices. Design, implement, and maintain scalable data models, analytical layers, and interactive dashboards using modern BI tools (primarily Power BI). Continuously optimize BI architecture to ensure scalability, performance, and adaptability to evolving business needs. Apply performance optimization techniques to improve data processing, dashboard responsiveness, and user experience. Ensure high standards of data quality, consistency, and governance across all BI solutions. Collaborate closely with cross-functional teams, including data engineers, data scientists, and business stakeholders, to define and meet BI requirements. Utilize advanced Power BI features (DAX, Power Query, Power BI Service) to build robust, automated reporting and analytical solutions. Host workshops and office hours to guide business units on Power BI usage, self-service BI strategies, and technical troubleshooting. Stay abreast of emerging BI tools, trends, and methodologies to drive continuous innovation and improvement. Foster a collaborative team culture and encourage continuous learning.

Required Skills & Qualifications -- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, Mathematics, or a related field. 10+ years of experience in Business Intelligence, including data warehousing, ETL pipelines, and reporting. Expert-level proficiency in BI tools, particularly Power BI. Certified Power BI Data Analyst Associate (PL-300) and Certified Data Management Professional (CDMP) - DAMA. Strong command of DAX, Power Query, and SQL for data modeling and integration, and Python for analysis. Proficient in Agile/Scrum or traditional project management methodologies.

Total Experience Expected: 14-18 years

Posted 2 weeks ago

Apply

8.0 - 14.0 years

25 - 30 Lacs

valsad

Work from Office

Position Title: Assistant Manager - ROW

Position Summary: To monitor ROW (Right of Way) issues and resolve them along with concerned stakeholders.

Key Accountabilities / Responsibilities: Overall monitoring of ROW issues based on severity, and their resolution. Support for developing and meeting National and Multilateral requirements. Maintain cost efficiency, budget planning, and asset management. Drive monitoring of ROW performance metrics, report metrics to the project management and senior management teams, identify risk-mitigation steps, and develop initiatives to set industry benchmarks. Secure ROW clearances to ensure project delivery on time and on budget.

Position Demands: Frequent travel to project sites.

Competencies -- Behavioural: Achievement Orientation, Information Seeking, Initiative, Innovative Thinking. Functional: Financial, Operational, People, Strategic.

About Us: Resonia is India's leading integrated power transmission developer and solutions provider, focused on addressing complex challenges in the sector by tackling the key constraints of time, space, and capital. We believe that electricity access transforms societies and delivers long-lasting social impact. Resonia is uniquely positioned to solve the toughest challenges of energy delivery. We are guided by our core purpose of empowering humanity by addressing the toughest challenges of energy delivery. Our four core values form the pillars of our organisation:
Respect: Everyone counts
Social Impact: We work to improve lives
Fun: "Thank God it's Monday!"
Innovation: A new way today

Resonia is a leading global developer of power transmission infrastructure with projects of over 10,000 circuit km and 15,000 MVA in India and Brazil. With an industry-leading portfolio of power conductors, EHV cables, and OPGW, Resonia also offers solutions for upgrading, uprating, and strengthening existing networks. The company has set new benchmarks in the industry through the use of cutting-edge technologies and innovative financing. Resonia is also the sponsor of IndiGrid, India's first power sector Infrastructure Investment Trust ("InvIT"), listed on the BSE and NSE. For more details, visit: sterlitepower

Posted 2 weeks ago

Apply

6.0 - 10.0 years

14 - 18 Lacs

noida

Work from Office

Roles And Responsibilities

1+ years of DevOps Engineering and Service Delivery: proven expertise in CI/CD and containerization, with hands-on experience in cloud environments such as AWS, Microsoft Azure, and others.
Automation and Cloud Integration: skilled in designing and delivering automated CI/CD solutions in cloud environments, utilizing tools like Jenkins, Maven, Puppet, Chef, and UrbanCode.
Cloud Services Proficiency: in-depth knowledge of cloud platforms including AWS, GCP, Azure, and DigitalOcean, with experience in cloud deployments and toolchains like GitHub, Bitbucket, and Jenkins.
Testing and Quality Assurance: hands-on experience in testing cloud and web applications, ensuring robust and secure deployments.
CI/CD Pipeline Design: expertise in designing DevOps pipelines, including the necessary cloud and tool infrastructure to support continuous integration, continuous testing, and continuous delivery in Agile/Scrum teams.
Monitoring and Scaling: ability to develop and suggest monitoring tools that ensure automatic scaling and performance optimization.
Scripting and Automation: proficient in scripting languages such as Shell, Ruby, or Python, with experience developing scripts to automate tasks and improve efficiency.
Database and SQL Knowledge: strong understanding of both SQL (Postgres, MySQL) and NoSQL (MongoDB, Cassandra) databases.
Interpersonal Skills: excellent communication and collaboration skills, with the ability to work effectively with diverse and distributed teams in an agile environment.

Tools
Git / Bitbucket: version control and source code management
Jenkins / GitLab / Bitbucket / CircleCI / UrbanCode: server automation and CI/CD pipelines
Docker / Kubernetes / OpenShift: containerization and container orchestration
Puppet / Ansible / Chef: configuration management
Selenium: automated testing
ELK / Prometheus / Zabbix / Nagios / Splunk: continuous monitoring
Nginx / Apache / Tomcat / IIS / HAProxy: web and proxy servers
Postgres / MongoDB / MySQL / Cassandra: SQL and NoSQL databases
Gradle / Maven: Java build tools
AWS / Azure / GCP: public cloud platforms
DNS, DHCP, Email, Linux: administration
Python / Shell / Ruby: scripting languages

Posted 2 weeks ago

Apply

7.0 - 10.0 years

15 - 19 Lacs

pune

Work from Office

Key Result Areas and Activities:
- Design and implement Lakehouse architectures using Databricks, Delta Lake, and Apache Spark
- Lead the development of data pipelines, ETL/ELT processes, and data integration strategies
- Collaborate with business and technical teams to define data architecture standards, governance, and security models
- Optimize performance and cost-efficiency of Databricks clusters and jobs
- Provide technical leadership and mentorship to data engineers and developers
- Integrate Databricks with cloud platforms (Azure, AWS, or GCP) and enterprise systems
- Evaluate and recommend tools and technologies to enhance the data ecosystem
- Ensure compliance with data privacy and regulatory requirements
- Contribute to proposal and presales activities

Work and Technical Experience
Must-have Skills:
- Expertise in data engineering, data architecture, or analytics
- Hands-on experience with Databricks and Apache Spark
- Hands-on experience with Snowflake
- Strong proficiency in Python, SQL, and PySpark
- Deep understanding of Delta Lake, Lakehouse architecture, and data mesh principles
- Deep understanding of Data Governance and Unity Catalog
- Experience with cloud platforms (Azure preferred; AWS or GCP acceptable)
Good-to-have Skills:
- Good understanding of CI/CD pipelines
- Working experience with GitHub
- Experience in providing data engineering solutions while balancing architecture requirements, required effort, and customer-specific needs in other tools

Qualification:
- Bachelor's degree in computer science, engineering, or a related field
- Demonstrated continued learning through one or more technical certifications or related methods
- 10+ years of relevant experience in ETL tools
- Relevant experience in the Retail domain

Qualities:
- Proven problem-solving and troubleshooting abilities, with a high degree of adaptability; well-versed in the latest trends in the data engineering field
- Ability to handle multiple tasks effectively, maintain a professional attitude, and work well in a team
- Excellent interpersonal and communication skills, with a customer-focused approach and keen attention to detail
- Ability to translate technical information into clear, business-friendly language
- Able to work with teams and clients in different time zones and in rotational shifts
- Research-focused mindset
- Champion efforts to ensure appropriate development best practices and assist with strategy and roadmap development
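A headline Delta Lake capability in pipelines like these is the ACID MERGE (upsert). The following is a minimal plain-Python sketch of the semantics only, using dicts as a stand-in table; on Databricks this would be a single `MERGE INTO` statement or `DeltaTable.merge` call rather than hand-written code.

```python
# Illustrative sketch of Delta-style MERGE (upsert) semantics.
# The list-of-dicts "table" is only a stand-in for demonstration;
# Delta Lake performs this as an atomic MERGE on real table files.

def merge_upsert(target, updates, key):
    """Upsert rows from `updates` into `target`, matching on `key`."""
    merged = {row[key]: row for row in target}                   # index existing rows
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}  # update or insert
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
result = merge_upsert(target, updates, "id")  # id 2 updated, id 3 inserted
```

The equivalent Databricks SQL would match on the key column with `WHEN MATCHED THEN UPDATE` and `WHEN NOT MATCHED THEN INSERT` clauses.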

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

pune

Work from Office

About Company:
Comprinno (https://comprinno.net/) is a NASSCOM-incubated company. We are headquartered in Bangalore, with offices in Pune, Coimbatore, and the United States. We specialize in cloud transformation, DevOps, and infrastructure automation, enabling organizations to build scalable, secure, and high-performing cloud environments on AWS. Comprinno is among the top AWS partners in the consulting and technology space.

Our flagship SaaS platform, Tevico (https://tevi.com/), offers intelligent cloud governance and observability for AWS workloads. It helps enterprises improve uptime, reduce costs, and ensure compliance by proactively detecting anomalies and triggering automated remediation workflows.

As a trusted partner to both startups and enterprises, we combine deep cloud expertise with a customer-first mindset. Our services span cloud migration, modernization, security, GenAI solutions, and managed operations, driving innovation and measurable impact.

About The Role:
We are looking for a visionary and execution-focused Data Analytics & AI Practice Head to establish, grow, and lead a new business vertical at Comprinno. This is a senior leadership role with end-to-end ownership including strategy, team building, solution delivery, partner alignment, and P&L responsibility. The ideal candidate will have strong technical credibility, a business-building mindset, and experience delivering high-impact data and AI solutions for enterprise clients.

Key Responsibilities:
- Define the vision, strategy, and roadmap for Comprinno's Data & AI practice.
- Build and lead a high-performing team of data engineers, AI/ML engineers, and solution consultants.
- Develop service offerings in data analytics, data engineering, GenAI, and applied AI.
- Engage with strategic customers to understand data-driven opportunities and propose impactful solutions.
- Own full P&L responsibility for the practice, including revenue targets, cost management, and profitability.
- Establish delivery methodologies, reusable assets, and quality frameworks for scalable engagements.
- Collaborate with the sales and marketing teams to create go-to-market strategies and win enterprise deals.
- Represent Comprinno in external forums, partner events, and customer executive briefings.
- Build partnerships with hyperscalers (AWS, etc.), ISVs, and data/AI ecosystem partners.

Required Qualifications & Skills:
- 10+ years of experience in data engineering, analytics, or AI/ML, with at least 3 years in a leadership role.
- Proven experience in setting up or scaling a consulting practice or business unit.
- Strong understanding of cloud-native data architectures, modern data platforms, and AI/ML services (especially on AWS).
- Experience with technologies such as Python, Spark, Snowflake, Redshift, SageMaker, Bedrock, and data lake/lakehouse architectures.
- Strategic mindset with experience in customer consulting, delivery management, and business growth.
- Excellent communication and leadership skills, with the ability to influence internal and external stakeholders.
- Prior experience managing P&L or business KPIs is highly preferred.

Desired Qualifications & Skills:
- AWS specialty certifications in Data Analytics, Machine Learning, or GenAI.
- Prior experience working with AWS Partner programs such as Data-Driven Everything (D2E), the GenAI Innovation Center, or AI/ML CoE initiatives.
- Exposure to open-source data and ML ecosystems (e.g., dbt, Airflow, MLflow, LangChain, Hugging Face).
- Experience with verticalized solutions in finance, retail, manufacturing, or healthcare.
- Strong network within the AI/ML and data engineering community (speaking at conferences, writing blogs, etc.).
- Understanding of data governance, compliance (e.g., GDPR), and responsible AI practices.
- Ability to evangelize innovation while balancing execution and delivery commitments.
- Prior success in building high-margin, IP-led offerings in services or SaaS.
- MBA or equivalent business qualification (preferred, not required).

Why Join Comprinno:
- Work at the forefront of cloud innovation with one of the leading AWS Partners in India and globally.
- Be part of real transformation projects involving cloud migration, modernization, DevSecOps, and AI/GenAI.
- Accelerate your growth through continuous learning, AWS certifications, and mentorship from cloud experts.
- Contribute to Tevico, our flagship SaaS platform for cloud governance, used by enterprises globally.
- Engage with diverse clients, from digital-native startups to large-scale enterprises, across industries.
- Build thought leadership by participating in webinars, writing blogs, and engaging with the cloud community.
- Join a collaborative, high-performance culture that values curiosity, ownership, and innovation.
- Enjoy flexibility and autonomy while solving meaningful problems with a driven and talented team.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 18 Lacs

bengaluru

Work from Office

SUMMARY
Job Role: Data Governance with Collibra Professionals
Experience: 5+ years
Location: PAN INDIA

Key Responsibilities:
- 3 to 5 years of experience in Data Governance, Data Catalog, and Business Glossary using Collibra
- Define the Data Stewardship Framework
- Define Data Standards and Key Performance Indicators (KPIs) for governance
- Implement the Data Dictionary, Data Catalog, and metadata solution within the selected DG tool
- Design and configure the operating model for consumption
- Develop integrations between Collibra and other systems, and execute processes to ingest metadata into the catalog and enable data lineage
- Develop and publish catalog metric and usage queries for use in metrics dashboards
- Develop workflows to configure different types of asset management and issue management processes
- Test and perform application upgrades
- Configure Collibra as per business requirements
- Migrate code and configuration between environments
- Outline and apply software development best practices
- Provide daily operational support of the application, including job monitoring, production monitoring, and health management
- Handle upgrades and patches
- Handle user and policy management

Skills:
Mandatory Skills: Collibra, Dimensional Data Modeling, MDM Conceptual Requirements

Requirements:
- 3+ years of relevant experience in Data Governance with Collibra
- Strong understanding of Data Governance principles and best practices
- Experience in developing and implementing Data Governance solutions
- Proficiency in Collibra and other relevant tools
- Excellent communication and problem-solving skills
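Integration work of the kind described (ingesting metadata into the catalog and enabling lineage) typically revolves around constructing asset payloads for Collibra's REST API. The sketch below only builds the JSON body; the field names (`name`, `domainId`, `typeId`) are assumptions modeled on common catalog APIs, so check the Collibra REST API reference for the exact contract of your version before use.

```python
# Hypothetical sketch of building a catalog asset payload.
# Field names are assumptions, not a verified Collibra contract;
# the actual POST call (omitted) would target your Collibra instance.

import json

def build_asset_payload(name, domain_id, type_id, attributes=None):
    """Construct the JSON body for registering one asset in the catalog."""
    payload = {
        "name": name,
        "domainId": domain_id,   # the governance domain the asset lives in
        "typeId": type_id,       # e.g. a 'Table' or 'Column' asset type id
    }
    if attributes:
        payload["attributes"] = attributes  # optional descriptive metadata
    return json.dumps(payload)

body = build_asset_payload("customers", "dom-123", "type-table",
                           {"description": "Customer master table"})
```

Keeping payload construction in a small pure function like this makes the ingestion process easy to unit-test without hitting a live catalog.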

Posted 2 weeks ago

Apply

11.0 - 15.0 years

30 - 45 Lacs

hyderabad

Hybrid

Role Summary:
We're seeking a strategic and data-savvy Senior Manager to lead our Master Data Management (MDM) and Data Quality initiatives. This role is ideal for an experienced professional who can translate complex data concepts into business value, with a focus on leveraging Informatica's MDM ecosystem while managing a data governance team. You'll provide leadership and direction to ensure our organization's data is accurate, consistent, and trusted, directly impacting our decision-making capabilities and operational efficiency.

Key Responsibilities:
- Provide strategic oversight for the development and implementation of enterprise-wide MDM and data quality strategies, ensuring alignment with business objectives.
- Lead the creation and execution of a comprehensive MDM and Data Quality roadmap, coordinating efforts across multiple departments and systems.
- Oversee the operations and governance of Informatica MDM platforms, ensuring they meet organizational needs and integrate effectively with existing enterprise applications.
- Guide the development of data models, taxonomies, and hierarchies to support enterprise-wide data consistency, delegating technical tasks as appropriate.
- Establish and manage a framework for creating and maintaining data quality rules and automated workflows, focusing on proactive issue identification and resolution.
- Direct the implementation of data profiling techniques and statistical analysis processes to drive continuous improvement in data quality.
- Coordinate cross-functional collaboration between IT teams and business units for data integration projects, ensuring smooth data synchronization across systems.
- Oversee the design and implementation of data exchange mechanisms, including APIs, between MDM and other enterprise systems.
- Manage the development of custom Informatica workflows and mappings to address unique business requirements, delegating tasks to team members as needed.
- Lead talent development initiatives within the team, fostering skills in advanced MDM concepts, Informatica best practices, and emerging data technologies.
- Provide leadership and direction to a team of data professionals, setting performance goals, conducting evaluations, and promoting a culture of innovation.
- Serve as the primary liaison between the MDM/Data Quality team and senior stakeholders, translating technical concepts into business value.
- Present strategic updates, including data quality metrics, KPIs, and risk assessments, to the Enterprise Data Council and executive management.
- Ensure compliance with data governance policies and regulatory requirements across all MDM and data quality initiatives.
- Drive change management and adoption of MDM and data quality best practices across the organization through education and stakeholder engagement.

Technical Requirements:
- Bachelor's degree in Computer Science, Data Science, Technology, Mathematics, or a related field.
- 10+ years of experience with Informatica, including design, implementation, and optimization.
- 5+ years of direct management experience in data operations, data governance, or data quality.
- Deep understanding of data modeling, ETL processes, and data integration techniques.
- Proficiency in SQL and familiarity with Python and other scripting languages.
- Strong understanding of data governance frameworks and metadata management.
- Familiarity with large language models and understanding of prompt engineering for data quality improvement.

Preferred Qualifications:
- Informatica MDM Certified Professional or Practitioner
- Experience in implementing data quality initiatives in multi-system, complex environments
- Strong analytical and problem-solving skills with a data-driven approach to decision making
- Experience with cloud-based MDM solutions and data integration platforms (e.g., AWS, Azure)
- Knowledge of regulatory requirements (e.g., GDPR, CCPA) and their implications for data management

Key Competencies:
- Strategic thinking and the ability to translate data strategies into actionable plans
- Excellent communication skills, able to articulate complex technical concepts to both technical and non-technical audiences
- Strong leadership and team management skills, including hiring, performance measurement, and task management
- Passion for data quality and its impact on business outcomes
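The data quality rules and profiling techniques mentioned above often reduce to simple per-column metrics such as completeness and uniqueness. A minimal, tool-agnostic sketch of the idea (in Informatica these would be configured as DQ rules and profiles rather than hand-written code):

```python
def profile_column(rows, column):
    """Compute basic completeness and uniqueness metrics for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    total = len(values)
    return {
        # share of rows where the value is populated
        "completeness": len(non_null) / total if total else 0.0,
        # share of rows carrying a distinct non-null value
        "uniqueness": len(set(non_null)) / total if total else 0.0,
    }

rows = [
    {"email": "a@x.com"},
    {"email": "a@x.com"},
    {"email": None},
    {"email": "b@x.com"},
]
metrics = profile_column(rows, "email")  # completeness 0.75, uniqueness 0.5
```

Thresholds on such metrics (e.g., completeness below 0.95 raises an issue) are what turn a profile into an automated data quality rule.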

Posted 2 weeks ago

Apply

6.0 - 11.0 years

12 - 16 Lacs

bengaluru

Work from Office

Infrastructure & Architecture:
- Design and scale distributed systems for real-time speech processing.
- Own data pipelines from ingestion to model serving.
- Ensure security, privacy, and compliance for user speech data.

Performance & Reliability:
- Implement robust monitoring, logging, and alerting for 24x7 availability.
- Optimise system performance and scalability.

Team Leadership & Collaboration:
- Mentor junior engineers and work closely with AI/ML teams.
- Contribute to hiring efforts as we expand.

Must-Have Requirements:
- Experience: 6+ years in backend or platform engineering (focus on distributed systems).
- Cloud & Containers: Proficiency with Docker, Kubernetes, and AWS (or similar cloud platforms).
- Real-Time Streaming: Familiarity with audio/video streaming in high-availability, auto-scaling environments.
- Databases: Strong SQL skills (PostgreSQL preferred), including query optimization and schema design; caching layers (Redis, Memcached).
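The caching-layer requirement usually means the cache-aside pattern: check the cache, fall back to the database on a miss, then populate the cache. A sketch with a plain dict standing in for Redis; in practice `cache` would be a `redis.Redis` client (with `get`/`set` and a TTL) and the lookup a parameterized SQL query against PostgreSQL.

```python
# Cache-aside sketch: a plain dict stands in for Redis here.
# Production code would also set a TTL and handle serialization.

cache = {}

def fetch_user(user_id, db_lookup):
    key = f"user:{user_id}"
    if key in cache:                 # cache hit: skip the database entirely
        return cache[key]
    row = db_lookup(user_id)         # cache miss: query the source of truth
    cache[key] = row                 # populate the cache for the next caller
    return row

calls = []
def fake_db(uid):
    calls.append(uid)                # record how often the "database" is hit
    return {"id": uid, "name": "Asha"}

first = fetch_user(7, fake_db)
second = fetch_user(7, fake_db)      # served from cache; fake_db not called again
```

The design choice worth noting is that stale data is tolerated until the key expires or is invalidated; cache invalidation on writes is the harder half of this pattern.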

Posted 2 weeks ago

Apply

2.0 - 7.0 years

15 - 19 Lacs

pune

Work from Office

Solution Development:
- Build and deploy AI/ML models into production environments.
- Develop APIs and services to integrate AI models with business applications.
- Optimize model performance and ensure scalability and reliability.

Collaboration & Integration:
- Work with cross-functional teams to understand business requirements and translate them into technical solutions.
- Collaborate with data engineers to ensure robust data pipelines and model training workflows.

MLOps & Deployment:
- Implement CI/CD pipelines for ML models.
- Monitor model performance and manage model versioning and retraining.
- Ensure compliance with data governance and security standards.

Innovation & Experimentation:
- Prototype new AI features and evaluate emerging technologies.
- Contribute to the development of reusable AI components and frameworks.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-6 years of experience in software development, with at least 2 years in AI/ML engineering.
- Proficiency in Python and ML libraries (e.g., TensorFlow, PyTorch, Scikit-learn).
- Experience with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes).
- Strong knowledge of autonomous intelligence platforms such as Coginiti AI, Sisense, or AthenaIntel.
- Familiarity with RESTful APIs, microservices, and data engineering tools (e.g., Airflow, Spark).
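Model versioning, one of the MLOps duties listed, is usually backed by a registry (MLflow's model registry is a common concrete choice). A toy sketch of the core idea, versioned artifacts with one promoted "production" alias, so rollouts and rollbacks are just alias moves:

```python
class ModelRegistry:
    """Minimal model registry: versioned artifacts, one 'production' alias.
    Real registries (e.g. MLflow) add stages, metadata, and persistence."""

    def __init__(self):
        self.versions = {}       # version number -> model artifact
        self.production = None   # version currently serving traffic

    def register(self, model):
        version = len(self.versions) + 1
        self.versions[version] = model
        return version

    def promote(self, version):
        if version not in self.versions:
            raise KeyError(f"unknown version {version}")
        self.production = version    # atomic alias move = rollout/rollback

    def serve(self, x):
        return self.versions[self.production](x)

reg = ModelRegistry()
v1 = reg.register(lambda x: x * 2)   # stand-in for a trained model
v2 = reg.register(lambda x: x * 3)   # stand-in for the retrained model
reg.promote(v1)
before = reg.serve(10)
reg.promote(v2)                      # rollout of the retrained model
after = reg.serve(10)
```

Because serving always goes through the alias, retraining never requires touching the inference API, which is what makes automated retraining pipelines safe to run.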

Posted 2 weeks ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

ahmedabad, chennai, bengaluru

Work from Office

Roles and Responsibilities:
- Collaborate with cross-functional teams to design, develop, test, deploy, and maintain Collibra DQ solutions.
- Ensure seamless integration of Collibra DQ with other systems using APIs.
- Provide technical guidance on data governance best practices to stakeholders.
- Troubleshoot issues related to Collibra DQ implementation and provide timely resolutions.
- Participate in agile development methodologies such as Scrum.

Desired Candidate Profile:
- 4-9 years of experience in Collibra Data Quality (DQ) development or similar roles.
- Strong understanding of SQL queries for data extraction and manipulation.
- Experience working with API integrations for system connectivity.
- Bachelor's degree in any specialization (BCA or B.Sc).
- Proficiency in Agilent tools for testing purposes.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

bengaluru

Hybrid

Must haves: Proficiency in MDM tools and technologies such as Informatica MDM or similar platforms In-depth knowledge of data modelling, data governance, data quality, and data integration principles Relevant experience: 8-12 Years

Posted 2 weeks ago

Apply

10.0 - 15.0 years

12 - 16 Lacs

hyderabad

Hybrid

The Forward Engineer is a pivotal role responsible for reimagining critical TR business functions by strategically applying AI capabilities. This role will work closely with senior functional leaders and will lead squads of SMEs and engineers to facilitate end-to-end optimization of business processes and create the future AI-enabled operating model for the function. This requires thinking beyond process automation and looking at end-to-end transformation.

About the Role:
- Process Analysis & Opportunity Identification: Conduct in-depth analysis of existing business processes to identify pain points, inefficiencies, and strategic opportunities for leveraging AI/ML.
- AI Solution Design: Design innovative, AI-driven solutions that directly address identified business challenges and define new AI-driven business processes.
- Rapid Prototyping & Development: Lead the rapid development of proof-of-concepts (POCs) and minimum viable products (MVPs) using programming languages and AI frameworks to quickly validate and demonstrate AI solutions.
- Implement Scaled Solutions with ROI: Based on successful POCs, design approaches to implement the solution at scale with suitable ROI.
- Technical Implementation & Integration: Oversee the implementation and integration of AI solutions, ensuring sound data architecture, leveraging cloud platforms (AWS, Azure, GCP AI services), and designing effective API integrations with existing enterprise systems.
- Change Management & Stakeholder Engagement: Drive the successful adoption and embedding of new AI-powered processes and solutions within the function. Effectively manage stakeholders through executive communication and influence, guiding organizational change and transformation initiatives.
- Problem Solving & Innovation: Continuously identify complex, ambiguous business problems and apply analytical thinking to develop creative, AI-centric solutions. Challenge existing norms and proactively reimagine existing solutions.
- Cross-Functional Collaboration & Leadership: Work collaboratively within small, agile teams (3-4 members), adapting quickly to new challenges and business contexts.

About You:
You are a strong candidate for this role if you have 10+ years of relevant work experience, with 5+ years of proven experience in a transformation role within a large, complex, multinational organisation.

Technical Skills:
- AI/ML: Deep understanding of machine learning, natural language processing, RAG implementations, and multi-agent systems
- Programming: Python, R, SQL, and familiarity with AI frameworks (TensorFlow, PyTorch, Scikit-learn)
- Rapid Prototyping: Ability to build POCs and MVPs quickly
- Data Architecture: Proficiency with data lakes, warehouses, ETL processes, and data governance
- Cloud Platforms: Proficiency in AWS, Azure, or GCP AI services
- API Integration: Understanding of system integrations and API architectures

Business & Consulting Skills:
- Management Consulting: Strategy development, process optimization, and business case creation
- Financial Modeling: ROI analysis, cost-benefit analysis, and business value quantification
- Change Management: Leading organizational change and transformation initiatives
- Stakeholder Management: Executive communication and influence management
- Process Mapping: Business process analysis and redesign

Soft Skills:
- Problem Solving: Analytical thinking and creative solution development
- Communication: Executive storytelling and technical concept simplification
- Leadership: Team building and cross-functional collaboration
- Innovation: Identifying breakthrough opportunities and challenging the status quo
- Adaptability: Thriving in ambiguous and rapidly changing environments
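The RAG implementations the role asks for boil down to retrieve-then-generate: rank documents against the query, then feed the top hits to the model as context. A toy retrieval step using token overlap as a stand-in for real embeddings (production systems use an embedding model and a vector store):

```python
def retrieve(query, documents, k=1):
    """Rank documents by token overlap with the query (embedding stand-in)."""
    q_tokens = set(query.lower().split())
    scored = []
    for doc in documents:
        overlap = len(q_tokens & set(doc.lower().split()))  # crude relevance score
        scored.append((overlap, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]

docs = [
    "invoices are processed nightly by the billing batch job",
    "the legal team reviews contracts every quarter",
]
context = retrieve("when are invoices processed", docs)
# `context` would then be prepended to the LLM prompt for grounded generation
```

The retrieval quality, not the generation step, is typically what a POC like this has to prove, which is why swapping the overlap score for real embeddings is the first upgrade.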

Posted 2 weeks ago

Apply

5.0 - 9.0 years

14 - 19 Lacs

gurugram, coimbatore, bengaluru

Work from Office

Lead data engineers at Thoughtworks develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. They might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On projects, they will be leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. Alongside hands-on coding, they are leading the team to implement the solution.

Job responsibilities:
- You will lead and manage data engineering projects from inception to completion, including goal-setting, scope definition, and ensuring on-time delivery through cross-team collaboration.
- You will collaborate with stakeholders to understand their strategic objectives and identify opportunities to leverage data and data quality.
- You will design, develop, and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions.
- You will be responsible for creating, designing, and developing intricate data processing pipelines, addressing clients' most challenging problems.
- You will collaborate with data scientists to design scalable implementations of their models.
- You will write clean, iterative code based on TDD and leverage various continuous delivery practices to deploy, support, and operate data pipelines.
- You will lead and advise clients on how to use different distributed storage and computing technologies from the plethora of options available.
- You will develop data models by selecting from a variety of modeling techniques and implementing the chosen data model using the appropriate technology stack.
- You will be responsible for data governance, data security, and data privacy to support business and compliance requirements.
- You will define the strategy for, and incorporate, data quality in your day-to-day work.

Job qualifications
Technical Skills:
- You have experience in leading the system design and implementation of technical solutions.
- Working with data excites you: you have created Big Data architecture, can build and operate data pipelines, and maintain data storage, all within distributed systems.
- You have a deep understanding of data modeling and experience with modern data engineering tools and platforms.
- You have experience in writing clean, high-quality code using the preferred programming language.
- You have built and deployed large-scale data pipelines and data-centric applications using distributed storage and distributed processing platforms in a production setting.
- You have experience with data visualization techniques and can communicate insights appropriately for the audience.
- You have experience with data-driven approaches and can apply data security and privacy strategy to solve business problems.
- You have experience with different types of databases (i.e., SQL, NoSQL, data lakes, data schemas, etc.).

Professional Skills:
- You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in and gaining trust along the way.
- You are resilient in ambiguous situations and can adapt your role to approach challenges from multiple perspectives.
- You don't shy away from risks or conflicts; instead, you take them on and skillfully manage them.
- You coach, mentor, and motivate others, and you aspire to influence teammates to take positive action and accountability for their work.
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed.
- You are a proven leader with a track record of encouraging teammates in their professional development and relationships.
- Cultivating strong partnerships comes naturally to you: you understand the importance of relationship building and how it can bring new opportunities to our business.
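Clean, TDD-based pipeline code of the kind described usually means the transform is a pure function whose test is written against its contract before (or alongside) the implementation. A sketch with a hypothetical deduplication step:

```python
# TDD-style sketch: the assertions below this block are the tests
# written against the transform's contract; the function is the
# implementation that makes them pass.

def deduplicate_events(events):
    """Keep the latest record per event id (input assumed sorted by time)."""
    latest = {}
    for event in events:
        latest[event["id"]] = event   # later records overwrite earlier ones
    return list(latest.values())

events = [
    {"id": "a", "status": "pending"},
    {"id": "b", "status": "pending"},
    {"id": "a", "status": "done"},    # later record for "a" should win
]
deduped = deduplicate_events(events)
```

Because the transform takes plain records and returns plain records, the same test runs unchanged whether the function is later wrapped in Spark, Beam, or a plain batch job.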

Posted 2 weeks ago

Apply

4.0 - 8.0 years

12 - 17 Lacs

gurugram, coimbatore, bengaluru

Work from Office

Senior data engineers at Thoughtworks are engineers who build, maintain, and test the software architecture and infrastructure for managing data applications. They are involved in developing core capabilities, including technical and functional data platforms. They are the anchor for functional streams of work and are accountable for timely delivery. They work on the latest big data tools, frameworks, and offerings (data mesh, etc.), while also being involved in enabling credible and collaborative problem solving to execute on a strategy.

Job responsibilities:
- You will develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
- You will develop intricate data processing pipelines, addressing clients' most challenging problems
- You will collaborate with data scientists to design scalable implementations of their models
- You will write clean, iterative code using TDD and leverage various continuous delivery practices to deploy, support, and operate data pipelines
- You will use different distributed storage and computing technologies from the plethora of options available
- You will develop data models by selecting from a variety of modeling techniques and implementing the chosen data model using the appropriate technology stack
- You will collaborate with the team on the areas of data governance, data security, and data privacy
- You will incorporate data quality into your day-to-day work

Job qualifications
Technical Skills:
- Working with data excites you: you can build and operate data pipelines and maintain data storage, all within distributed systems
- You have hands-on experience with data modeling and modern data engineering tools and platforms
- You have experience in writing clean, high-quality code using the preferred programming language
- You have built and deployed large-scale data pipelines and data-centric applications using distributed storage and distributed processing platforms in a production setting
- You have experience with data visualization techniques and can communicate insights appropriately for the audience
- You have experience with data-driven approaches and can apply data security and privacy strategy to solve business problems
- You have experience with different types of databases (i.e., SQL, NoSQL, data lakes, data schemas, etc.)
- Advanced English level

Professional Skills:
- You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in and gaining trust along the way
- You are resilient in ambiguous situations and can adapt your role to approach challenges from multiple perspectives
- You don't shy away from risks or conflicts; instead, you take them on and skillfully manage them
- You are eager to coach, mentor, and motivate others, and you aspire to influence teammates to take positive action and accountability for their work
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed

Posted 2 weeks ago

Apply

5.0 - 9.0 years

14 - 19 Lacs

gurugram, coimbatore, bengaluru

Work from Office

Lead data engineers at Thoughtworks develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. They might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On projects, they will be leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. Alongside hands-on coding, they are leading the team to implement the solution.

Job responsibilities:
- You will lead and manage data engineering projects from inception to completion, including goal-setting, scope definition, and ensuring on-time delivery through cross-team collaboration.
- You will collaborate with stakeholders to understand their strategic objectives and identify opportunities to leverage data and data quality.
- You will design, develop, and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions.
- You will be responsible for creating, designing, and developing intricate data processing pipelines, addressing clients' most challenging problems.
- You will collaborate with data scientists to design scalable implementations of their models.
- You will write clean, iterative code based on TDD and leverage various continuous delivery practices to deploy, support, and operate data pipelines.
- You will lead and advise clients on how to use different distributed storage and computing technologies from the plethora of options available.
- You will develop data models by selecting from a variety of modeling techniques and implementing the chosen data model using the appropriate technology stack.
- You will be responsible for data governance, data security, and data privacy to support business and compliance requirements.
- You will define the strategy for, and incorporate, data quality in your day-to-day work.

We value flexibility and collaboration, and therefore work in a hybrid model.

Job qualifications
Technical Skills:
- You have experience in leading the system design and implementation of technical solutions (e.g., Kafka, GraphQL, RabbitMQ).
- Working with data excites you: you have created Big Data architecture, can build and operate data pipelines, and maintain data storage, all within distributed systems.
- You have built and deployed large-scale data pipelines and data-centric applications in a production setting using Snowflake (must have), Amazon S3, Azure Data Lake, Spark, and/or Flink.
- You have a deep understanding of data modeling and experience with modern data engineering tools and platforms such as Airflow or Databricks.
- You have experience in writing clean, high-quality code using the preferred programming language.
- You have experience with data visualization techniques and can communicate insights appropriately for the audience.
- You have experience with data-driven approaches and can apply data security and privacy strategy to solve business problems.
- You have experience with different types of databases (i.e., SQL, NoSQL, data lakes, data schemas, etc.).

Professional Skills:
- You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in and gaining trust along the way.
- You are resilient in ambiguous situations and can adapt your role to approach challenges from multiple perspectives.
- You don't shy away from risks or conflicts; instead, you take them on and skillfully manage them.
- You coach, mentor, and motivate others, and you aspire to influence teammates to take positive action and accountability for their work.
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed.
- You are a proven leader with a track record of encouraging teammates in their professional development and relationships.
- Cultivating strong partnerships comes naturally to you: you understand the importance of relationship building and how it can bring new opportunities to our business.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies