
4985 Data Governance Jobs - Page 22

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 - 20.0 years

13 - 18 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during development. Your role will be pivotal in establishing a robust data framework that supports the organization's data strategy and enhances data accessibility and usability across teams.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of continuous improvement.
- Develop and maintain documentation related to data architecture, ensuring clarity and accessibility for all stakeholders.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling techniques and best practices.
- Experience with ETL processes and data integration tools.
- Familiarity with cloud data platforms and services.
- Ability to design and implement data governance frameworks.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About you (experience, education, skills, and accomplishments):
- Bachelor's degree is required.
- 3-5 years of direct experience preferred in a role managing customer and/or product data.
- Working knowledge of CRM (preferably Salesforce) and of order management and billing systems such as NetSuite or Zuora.
- Experience with D&B, ZoomInfo, and other third-party data providers is preferred.
- Must possess knowledge of data governance concepts.
- Solid business acumen regarding business transactions and end-to-end sales processes.
- Experience with quality management practices, including Lean Sigma, is helpful.
- Detail oriented, with experience in audit and data cleansing.

What will you be doing in this role:

Daily Data Management
- Maintain data sets to the designated level of quality and standards.
- Review and monitor the quality of both new and recurring data sets.
- Monitor operational dashboards for anomalies and patterns indicating a broader existing or potential issue.
- Support sales, contract, and order processing teams in processing or correcting data directly affecting customer transactions.
- Support reporting and planning teams in rectifying data quality problems directly affecting operational reports.
- Collaborate with key global functions, including sales, sales operations, customer support, fulfillment, order management, and billing, to reduce re-work.
- Ensure all business activities follow the governance rules and corporate compliance standards.
- Manage workflow via cases to support root cause analysis and overall data health measurements.

Special Project Support
- Support data cleansing, mapping, or improvement projects initiated anywhere across the organization.
- Review Salesforce.com, NetSuite, and other data sources for data accuracy.
- Support testing of new or enhanced data acquisition and maintenance tooling and processes.
- Collaborate with global teams in local time zones to discuss and troubleshoot issues.

Talent Development & Management
- Be part of a high-performance culture, working towards SMART objectives that measure individual performance.
- Manage and take personal responsibility for one's professional development plan.
- Act as a strong contributor to an operating culture that enables collaboration, open communication, and a focus on talent development.
- Work with management to ensure clear role definitions, processes, ownership, and expectations.
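The audit and data-cleansing duties listed above boil down to normalizing records and flagging likely duplicates. A minimal stdlib-only sketch of that idea, with hypothetical field names not tied to any particular CRM:

```python
import re

def normalize_company(name: str) -> str:
    """Strip punctuation noise, collapse whitespace, and title-case a company name."""
    cleaned = re.sub(r"[^\w\s&]", "", name).strip()
    return re.sub(r"\s+", " ", cleaned).title()

def find_duplicates(records):
    """Group record IDs whose normalized names collide, keeping only real collisions."""
    seen = {}
    for rec in records:
        key = normalize_company(rec["name"])
        seen.setdefault(key, []).append(rec["id"])
    return {k: ids for k, ids in seen.items() if len(ids) > 1}

records = [
    {"id": 1, "name": "Acme  Corp."},
    {"id": 2, "name": "acme corp"},
    {"id": 3, "name": "Globex"},
]
print(find_duplicates(records))  # {'Acme Corp': [1, 2]}
```

In practice the collision key would be richer (address, DUNS number from a provider like D&B), but the normalize-then-group shape stays the same.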

Posted 1 week ago

Apply

3.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Job Requirements
- Data layer architecting in the Data Architecture domain: responsible for the design and creation of standard, benchmark data models using Layered Scalable Architecture / Medallion Architecture principles to deliver the Data Marts across all businesses of Titan with the highest data quality, performance, and cost efficiency.
- Design optimization to handle high volumes spanning several decades of business data, external data, and complex structures.
- Managing and maintaining the data dictionary / glossary / ERD.
- Creation of Conceptual, Logical, and Physical Data Models.
- Project management.

Work Experience
- Strong technical knowledge of SQL (advanced SQL, stored procedures, performance tuning), RDBMS, data warehousing, and modelling, preferably in the AWS stack.
- ER modelling – forward and reverse engineering; should have worked on modelling tools, with extensive knowledge of Conceptual, Logical, and Physical Models.
- Should have independently managed customers and projects of 4 members or more.
- Minimum 3 project implementations in Retail / Manufacturing / HR / Finance domains.
- Good knowledge of ETL tools (minimum 1 tool), preferably SAP Data Services, IBM DataStage, or AWS Glue.
- Good process knowledge of at least one domain.
- Experience leading projects in an agile framework: at least 1 medium-to-complex project.
- Experience implementing Salesforce Customer Data Cloud would be an added advantage.
- Good to have: statistical data modelling / predictive modelling skills.
- Hands-on experience in Salesforce Customer Data Cloud (creation of customer segmentations for campaigns) would be an added advantage.
- Should have a good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
- Should have exposure to Data Governance.
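The Medallion (Layered Scalable) approach this listing names is conventionally described as bronze (raw as ingested), silver (cleansed and typed), and gold (business-ready marts). A minimal sketch of that layering with hypothetical order records, not any vendor's implementation:

```python
# Bronze layer: data exactly as ingested, including a malformed row.
raw_orders = [
    {"order_id": "1", "amount": "250.0", "region": "south"},
    {"order_id": "2", "amount": "bad",   "region": "south"},
    {"order_id": "3", "amount": "100.0", "region": "north"},
]

def to_silver(rows):
    """Cleanse: cast types, standardize casing, drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "region": r["region"].title()})
        except ValueError:
            continue  # a real pipeline would quarantine the row for audit
    return out

def to_gold(rows):
    """Aggregate into a mart: total revenue per region."""
    mart = {}
    for r in rows:
        mart[r["region"]] = mart.get(r["region"], 0.0) + r["amount"]
    return mart

print(to_gold(to_silver(raw_orders)))  # {'South': 250.0, 'North': 100.0}
```

Each layer is persisted separately in a real warehouse so downstream marts can be rebuilt from the cleansed layer without re-ingesting decades of history.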

Posted 1 week ago

Apply

15.0 - 20.0 years

15 - 20 Lacs

Bengaluru

Work from Office

About The Role
Project Role: IoT Architect
Project Role Description: Design end-to-end IoT platform architecture solutions, including data ingestion, data processing, and analytics across different vendor platforms for highly interconnected device workloads at scale.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum experience: 12 years
Educational Qualification: 15 years full time education

Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record of providing functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Well versed in OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.

Roles & Responsibilities:
1. The Industrial Data Architect will be responsible for developing and overseeing the industrial data architecture strategies that support advanced data analytics, business intelligence, and machine learning initiatives. This role involves collaborating with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
2. Focused on designing, building, and managing the data architecture of industrial systems.
3. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
4. Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, and data virtualization.
5. Create scalable and secure data structures, integrate with existing systems, and ensure efficient data flow.

Professional & Technical Skills:
1. Must have: domain knowledge in Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science.
2. Data modeling and architecture: proficiency in data modeling techniques (conceptual, logical, and physical models).
3. Knowledge of database design principles and normalization.
4. Experience with data architecture frameworks and methodologies (e.g., TOGAF).
5. Relational databases: expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
6. NoSQL databases: experience with at least one NoSQL database (e.g., MongoDB, Cassandra, Couchbase) for handling unstructured data.
7. Graph databases: proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB; understanding of graph data models, including property graphs and RDF (Resource Description Framework).
8. Query languages: experience with at least one of Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop); familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language); exposure to semantic web technologies and standards.
9. Data integration and ETL (extract, transform, load): proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi).
10. Experience with data integration tools and techniques to consolidate data from various sources.
11. IoT and industrial data systems: familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA).
12. Experience with IoT data platforms such as AWS IoT, Azure IoT Hub, and Google Cloud IoT Core.
13. Experience with one or more streaming data platforms such as Apache Kafka, Amazon Kinesis, or Apache Flink.
14. Ability to design and implement real-time data pipelines; familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow.
15. Understanding of event-driven design patterns and practices; experience with message brokers like RabbitMQ or ActiveMQ.
16. Exposure to edge computing platforms such as AWS IoT Greengrass or Azure IoT Edge.
17. AI/ML and GenAI: experience preparing data for consumption by AI/ML/GenAI applications.
18. Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
19. Cloud platforms: experience with cloud data services from at least one provider, e.g. AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
20. Data warehousing and BI tools: expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
21. Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
22. Data governance and security: understanding of data governance principles, data quality management, and metadata management.
23. Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
24. Big data technologies: experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka.
25. Understanding of distributed computing and data processing frameworks.

Additional Information:
1. A minimum of 15-18 years of progressive information technology experience is required.
2. This position is based at the Bengaluru location.
3. 15 years of full-time education is required.
4. An AWS Certified Data Engineer - Associate, Microsoft Certified: Azure Data Engineer Associate, or Google Cloud Professional Data Engineer certification is mandatory.
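The property-graph model the skills list refers to (nodes with properties, typed edges, queried via Cypher or Gremlin) can be sketched without any graph database at all. A stdlib-only illustration with a hypothetical plant/machine schema:

```python
# Tiny in-memory property graph: nodes carry key/value properties,
# edges carry a relationship type, as in Neo4j-style property graphs.
nodes = {
    "p1": {"label": "Plant", "city": "Pune"},
    "m1": {"label": "Machine", "status": "running"},
    "m2": {"label": "Machine", "status": "down"},
}
edges = [("p1", "HAS_MACHINE", "m1"), ("p1", "HAS_MACHINE", "m2")]

def neighbors(node_id, edge_type):
    """Return properties of nodes reachable from node_id via edges of edge_type."""
    return [nodes[dst] for src, etype, dst in edges
            if src == node_id and etype == edge_type]

# Analogous to Cypher: MATCH (:Plant {id:'p1'})-[:HAS_MACHINE]->(m {status:'down'})
down = [n for n in neighbors("p1", "HAS_MACHINE") if n["status"] == "down"]
print(down)  # [{'label': 'Machine', 'status': 'down'}]
```

An RDF store would represent the same data as subject-predicate-object triples; the edge list above is essentially that triple form.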

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to optimize performance and efficiency.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Kolkata

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions that enhance data accessibility and usability.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have: Proficiency in Informatica MDM.
- Good to have: experience with data warehousing concepts and practices.
- Strong understanding of data modeling techniques.
- Familiarity with SQL and database management systems.
- Experience in implementing data governance and data quality frameworks.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing.
- Implement data security and privacy measures to protect sensitive information.
- Optimize data storage and retrieval processes for improved performance.
- Conduct regular data platform performance monitoring and troubleshooting.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data platforms.
- Experience with data modeling and database design.
- Hands-on experience with ETL processes and tools.
- Knowledge of data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Ahmedabad office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BTP Datasphere
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while keeping abreast of the latest technologies and methodologies in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BTP Datasphere.
- Good to have: experience with cloud integration services.
- Strong understanding of application development methodologies.
- Familiarity with data modeling and data governance principles.
- Experience in performance tuning and optimization of applications.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BTP Datasphere.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions that enhance data accessibility and usability.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have: Proficiency in Informatica MDM.
- Good to have: experience with data warehousing concepts and practices.
- Strong understanding of data modeling techniques.
- Familiarity with SQL and database management systems.
- Experience in implementing data governance and data quality frameworks.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Master Data Governance (MDG) Tool
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have: Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles and practices.
- Experience with application design and configuration.
- Ability to lead cross-functional teams effectively.
- Familiarity with project management methodologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the SAP Master Data Governance (MDG) Tool.
- This position is based in Hyderabad.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BTP Datasphere
Good-to-have skills: SAP BW/4HANA Data Modeling & Development
Minimum experience: 3 years
Educational Qualification: 15 years full time education

We are seeking a skilled SAP Datasphere Developer to design, develop, and implement data integration, modeling, and analytics solutions using SAP Datasphere. The ideal candidate will have experience in data modeling, ETL processes, and integrating SAP and non-SAP systems. You will collaborate with business and technical teams to enable data-driven decision-making through scalable and efficient data solutions.

Key Responsibilities:
- Design and develop data models using SAP Datasphere to support business intelligence and analytics needs.
- Implement ELT processes to extract, transform, and load data from SAP and non-SAP sources.
- Integrate SAP Datasphere with SAP S/4HANA and other third-party systems.
- Develop and optimize views, calculations, and transformations using SAP Datasphere's graphical and SQL-based data modeling tools.
- Work with SAP Analytics Cloud (SAC) to visualize and analyze data from SAP Datasphere.
- Ensure data quality, security, and governance by implementing appropriate policies and best practices.
- Collaborate with business stakeholders to understand reporting and analytics requirements.
- Monitor, troubleshoot, and optimize performance issues related to SAP Datasphere.
- Stay up to date with SAP Datasphere updates, best practices, and emerging trends in data management and analytics.

Required Skills & Qualifications:
- 5+ years of experience in data modeling, ETL development, or data integration.
- Hands-on experience with SAP Datasphere.
- Strong knowledge of SQL, HANA calculation views, and data modeling concepts.
- Experience integrating SAP Datasphere with SAP S/4HANA and non-SAP data sources.
- Familiarity with SAP Analytics Cloud (SAC) for reporting and visualization.
- Understanding of data governance, security, and access control within SAP Datasphere.
- Proficiency in cloud data warehousing concepts and integration techniques.
- Experience working with SAP BTP services is a plus.
- Strong problem-solving, analytical, and communication skills.

Preferred Qualifications:
- SAP Datasphere certification is a plus.
- Experience with SAP BW/4HANA or SAP HANA native modeling.
- Experience with scripting languages such as Python for data transformation.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Mumbai

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to optimize performance.

Professional & Technical Skills:
- Must-have: Proficiency in Microsoft Azure Databricks.
- Good to have: experience with data warehousing solutions.
- Strong understanding of data modeling and database design principles.
- Familiarity with data governance and data quality frameworks.
- Experience in programming languages such as Python or Scala for data processing.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Coimbatore

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 25.0 years

13 - 18 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: basis
Minimum experience: 15 years
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various teams to ensure that the data architecture is robust, scalable, and meets the needs of the organization.

Roles & Responsibilities:
- Expected to be a Subject Matter Expert with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.
- Mentor junior professionals in best practices for data architecture and analytics.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data modeling techniques and best practices.
- Experience with cloud data storage solutions and data integration tools.
- Familiarity with data governance and compliance standards.
- Ability to design and implement data pipelines for analytics and reporting.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

pune

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to optimize performance and efficiency.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
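The ETL responsibilities this listing describes (extract, transform, and load data across systems, with de-duplication and cleansing along the way) can be illustrated with a minimal, generic sketch. This is plain Python with an in-memory SQLite table standing in for a warehouse target such as Snowflake; the table, columns, and sample records are hypothetical:

```python
import sqlite3

# Extract: source records, e.g. pulled from an upstream system or a file export.
source_rows = [
    {"order_id": 1, "amount": "120.50", "region": " north "},
    {"order_id": 2, "amount": "80.00", "region": "South"},
    {"order_id": 2, "amount": "80.00", "region": "South"},  # duplicate to drop
]

def transform(rows):
    """Clean and standardize records before loading: de-duplicate on the key,
    cast amounts to numbers, and normalize the region text."""
    seen, cleaned = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue  # skip duplicate business keys
        seen.add(r["order_id"])
        cleaned.append((r["order_id"], float(r["amount"]), r["region"].strip().lower()))
    return cleaned

# Load: write the transformed records into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transform(source_rows))
conn.commit()

print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 200.5)
```

In a real Snowflake pipeline the load step would typically go through a connector or a staged bulk `COPY INTO`, but the extract-transform-load shape is the same.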

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

hyderabad

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to optimize performance.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Databricks.
- Good-to-Have Skills: Experience with data warehousing solutions.
- Strong understanding of data modeling and database design principles.
- Familiarity with data governance and data quality frameworks.
- Experience in programming languages such as Python or Scala for data processing.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

chennai

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to optimize performance.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Databricks.
- Good-to-Have Skills: Experience with data warehousing solutions.
- Strong understanding of data modeling and database design principles.
- Familiarity with data governance and data quality frameworks.
- Experience in programming languages such as Python or Scala for data processing.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

bengaluru

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to optimize performance.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Azure Databricks.
- Good-to-Have Skills: Experience with data warehousing solutions.
- Strong understanding of data modeling and database design principles.
- Familiarity with data governance and data quality frameworks.
- Experience in programming languages such as Python or Scala for data processing.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education
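The "ensure data quality" part of these data-engineering roles usually comes down to rule-based record validation inside a pipeline stage. A minimal, framework-agnostic sketch in plain Python (the rules, field names, and sample batch are illustrative, not from any specific listing; in Databricks the same idea would run over DataFrames):

```python
def check_record(record, required=("id", "amount"), non_negative=("amount",)):
    """Return a list of data-quality violations for a single record."""
    issues = []
    for field in required:
        if record.get(field) is None:
            issues.append(f"missing:{field}")
    for field in non_negative:
        value = record.get(field)
        if isinstance(value, (int, float)) and value < 0:
            issues.append(f"negative:{field}")
    return issues

def split_valid_invalid(records):
    """Route clean rows onward and quarantine bad ones, as a pipeline stage might."""
    valid, quarantined = [], []
    for rec in records:
        issues = check_record(rec)
        (quarantined if issues else valid).append((rec, issues))
    return valid, quarantined

batch = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails the required-field rule
    {"id": 3, "amount": -2.0},     # fails the non-negative rule
]
valid, bad = split_valid_invalid(batch)
print(len(valid), len(bad))  # 1 2
```

Quarantining failed rows rather than dropping them silently keeps the violations auditable, which is the usual data-governance expectation these listings mention.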

Posted 1 week ago

Apply

7.0 - 12.0 years

35 - 60 Lacs

hyderabad, chennai

Hybrid

Roles and Responsibilities:
- Design and implement data solutions using Data Architecture principles, including Data Models, Data Warehouses, and Data Lakes.
- Develop cloud-based data pipelines on AWS/GCP platforms to integrate various data sources into a centralized repository.
- Ensure effective Data Governance through implementation of policies, procedures, and standards for data management.
- Collaborate with cross-functional teams to identify business requirements and develop technical roadmaps for data engineering projects.

Desired Candidate Profile:
- 7-12 years of experience in Solution Architecting, with expertise in Data Architecture, Data Modeling, Data Warehousing, Data Integration, Data Lakes, Data Governance, Data Engineering, and Data Architecture principles.
- Strong understanding of AWS/GCP cloud platforms and their applications in building scalable data architectures.
- Experience working with large datasets from multiple sources; ability to design efficient ETL processes for migration into target systems.

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

gurugram

Work from Office

We are currently looking for a Team Lead - MDM to join us at our facility in Gurugram, Haryana.

Technical Competencies:
- Hands-on experience in master data creation and maintenance (Material/Vendor/Pricing/Customer/PIRs/Source List/BOM data, etc.)
- Hands-on experience with SAP toolsets in the data space, including data extraction programs from SAP, SQVIs, ETL processes, load programs, LSMW, data quality maintenance, and cleansing.
- Knowledge of request management tools, e.g., SNOW, Remedy.
- Knowledge of key database concepts, data models, and relationships between different types of data.
- An understanding of the end-to-end set-up and business impact of master data key fields.
- Knowledge of SAP, S/4 HANA, SAP-MDG, Ariba, SFDC, MW (Informatica, etc.) or additional ERP platforms, IT tools, and technologies is desirable.
- Experience in data management processes (data profiling & cleansing, workflows, data quality, governance processes, relationships & dependencies with IT teams, etc.), or functional knowledge of SAP MM/PP or OTC modules, will be an added advantage.
- Prior experience of handling a team.

Primary Responsibilities / Role Expectations:
As an MDM Team Lead, the role involves:
- Getting adept with the MDM process and gaining knowledge by initially working on daily business requests for master data objects: creation/update/obsolete/reactivate.
- Holding key business stakeholder interactions for feedback and business requirements, and maintaining data governance and data quality.
- Testing master data creations/updates across tools and interfaces.
- Getting the team and the allocated region ready for additional tasks/requirements/projects as and when needed.
- Maintaining data governance and data quality, and carrying out data cleansing activities.
- Mentoring team members on topics of expertise.
- Strong ownership focus and the drive to excel and deliver.
- Flexibility to work in shifts.

Other Skills & Experience:
- Professional experience of 8+ years.
- Good communication skills, stakeholder alignment, and experience interacting with end clients/international colleagues across geographies.
- Ability to resolve conflicts, share, collaborate, and work as a leader for the allocated team.
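The duplicate detection and cleansing work this MDM role describes can be sketched in plain Python; the vendor fields and normalization rules below are hypothetical illustrations, not SAP-specific logic (real MDG match rules are configured in the platform):

```python
def normalize(vendor):
    """Build a matching key for a vendor master record: case-fold the name
    and strip separators from the tax ID so formatting noise doesn't hide duplicates."""
    return (
        vendor["name"].strip().upper(),
        "".join(ch for ch in vendor.get("tax_id", "") if ch.isalnum()).upper(),
    )

def dedupe_vendors(vendors):
    """Keep the first record per normalized key; report the rest as duplicates."""
    seen, unique, duplicates = {}, [], []
    for v in vendors:
        key = normalize(v)
        if key in seen:
            duplicates.append(v)
        else:
            seen[key] = v
            unique.append(v)
    return unique, duplicates

vendors = [
    {"name": "Acme Corp ", "tax_id": "DE-123 456"},
    {"name": "acme corp", "tax_id": "de123456"},   # same vendor, messy formatting
    {"name": "Globex", "tax_id": "GB-999"},
]
unique, dups = dedupe_vendors(vendors)
print(len(unique), len(dups))  # 2 1
```

Normalizing before matching is the key design choice here: comparing raw strings would let trivial formatting differences create duplicate master records.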

Posted 1 week ago

Apply

3.0 - 7.0 years

7 - 11 Lacs

gurugram

Work from Office

The Remote Monitoring & Diagnostics Service Engineering Data Services (ES RM&DS EDS) team within Siemens Energy is looking for a data-minded resource with power plant knowledge to process, validate, and maintain gas and steam turbine operating & outage data and serialized parts tracking data. Fleet operating & outage data, statistics derived from that data, and serialized parts tracking data are a foundational data set leveraged by the organization for many reasons, such as ensuring Siemens Energy is providing the right products and services to continuously improve the in-market availability of our customers' assets. The ES RM&DS EDS organization provides a link between Siemens and its customers by collecting, validating, analyzing, and reporting unit and fleet operating statistics. The high quality of our data is an essential enabler to becoming a successful data-driven company. The candidate selected for this position will be responsible for data integrity related to operational, outage, and serialized parts tracking data. This individual will be required to analyze, interpret, and process customer-reported unit data in a timely manner. They will need to understand and use various engineering applications to complete their tasks, and will need to adhere to the IEEE 762 standard for availability reporting and the associated Siemens Energy requirements for collecting and documenting the necessary data. The selected candidate will be expected to learn these aspects of fleet data processing in a relatively short period of time. The candidate will promote strong communication between Siemens Energy colleagues all over the world, ensuring database updates and additions are processed, validated, and entered to the highest global data standard. The candidate will also help to support and improve the data stewardship and data governance processes for power plant fleet data.

The candidate will be working closely with the teams in Orlando, as well as supporting the teams in Berlin and Mulheim. This position is based in Gurgaon (India) and may involve working morning or night shifts.

We don't need superheroes, just super minds:
- Understanding of steam turbine, gas turbine, and generator components and operation is strongly desired.
- Ability to interpret/validate power generation, customer-supplied, operating, and outage data.
- Familiarity with IEEE Standard 762 and/or Service Bulletins 36803 and 51009 is a plus but not required.
- Familiarity with data governance, data quality, and process management methods is a plus.

Job Skills / Proficiencies Include:
- Excellent attention to detail with a strong focus on data accuracy.
- Process-oriented, with the ability to plan and organize tasks.
- Demonstrated focus on quality and continuous improvement.
- Good time management, communication, interpersonal, and organizational skills (e.g., validating data from multiple sources).
- Follows standard operational procedures and tools obtained through extensive work experience.
- Effective verbal & written communication in English.
- Familiarity with Microsoft Office applications (Excel, Word, PowerPoint, Outlook, etc.).
- Strong customer focus; ability to develop good working relationships (internal & external).
- Desire to learn, improve, and be part of a very dynamic team environment.
- Open-minded global collaboration skills and the willingness to work across borders and projects.
- Experience using FDM, USI, and SAP (KSP / OPP).

This role is based at site (Gurgaon). You'll also get to visit other locations in India and beyond, so you'll need to go where this journey takes you. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come.
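The availability reporting this role performs under IEEE 762 reduces to hour-accounting formulas. A hedged sketch assuming the commonly used definitions (availability factor as available hours over period hours, and forced outage rate as forced outage hours over service plus forced outage hours); the unit-year figures are invented for illustration:

```python
def availability_factor(available_hours, period_hours):
    """AF = AH / PH: the fraction of the reporting period the unit was available."""
    return available_hours / period_hours

def forced_outage_rate(forced_outage_hours, service_hours):
    """FOR = FOH / (SH + FOH), per the usual IEEE 762-style definition."""
    return forced_outage_hours / (service_hours + forced_outage_hours)

# Illustrative unit-year: 8760 period hours, 8400 available, 8000 in service,
# and 100 hours of forced outage (remaining downtime assumed planned).
af = availability_factor(8400, 8760)
fo = forced_outage_rate(100, 8000)
print(round(af, 4), round(fo, 4))  # 0.9589 0.0123
```

The standard defines many more states and derived indices; the point of the sketch is only that validated operating and outage hours feed directly into these fleet statistics, which is why the data integrity this role owns matters.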

Posted 1 week ago

Apply

8.0 - 13.0 years

16 - 20 Lacs

hyderabad

Work from Office

Overview:
The PGT MDG Solution Architect will be responsible for maintaining the techno-functional integrity of the PGT MDG (Master Data Governance) solution. This role will report to the PGT MDG Design Lead and will partner with the PGT Data Solutions Design team to ensure that the MDG platform supports the business requirements in a scalable and effective manner. The objective of this role is to ensure that master data requirements are accurately designed, thoroughly documented, and aligned with the PepsiCo Global Template solution, in close collaboration with the Value Stream/process teams, Global Process Leads (GPLs), and project deployment teams, and in support of Cloud and AI solutions.

Responsibilities:
- Architect and implement end-to-end SAP MDG solutions integrated with cloud platforms; design SAP MDG processes across business units, analyze current-state data practices, and contribute to the definition of future-state solutions.
- Support the Development and Sector DSO teams in MDG configuration and development of Material/Business Partner/Customer/Vendor/Financial objects as required, and advise on best practices in the SAP MDG solution stack.
- Design, develop, and deploy SAP MDG S/4 solutions in cloud environments (e.g., SAP BTP, Azure), and drive cloud-based data migration and integration strategies.
- Design AI-enabled data quality frameworks, including machine learning models for data validation, classification, enrichment, and anomaly detection.
- Provide best-practice and development expertise in replication frameworks with data transfer, SOA, ALE/IDOC, key & value mapping, and CMP.
- Understand the impacts of PGT data requirements for BP to OTC/PTP and MM to MTD on existing PepsiCo global/sector solutions, with a particular focus on Customer/Supplier & Material master data.
- Participate in PGT Data Governance processes to ensure impacted stakeholders across PepsiCo are aware of new or changed data standards.
Qualifications:
- Bachelor's degree in computer science, information technology, or a related field preferred.
- 8+ years of experience in a technology role supporting SAP deployment projects (S/4HANA experience preferred).
- 6+ years of experience in Master Data Governance.
- Strong experience in implementation and integration of SAP MDG technical solutions for data domains such as Customer/Vendor and Material master.
- SAP master data design principles and key concepts of master data governance and MDG hub architecture in a global/multi-national organization.
- Strong knowledge of MDG Cloud & AI methodologies.
- Strong knowledge of key mapping, value mapping, process model configurations, replication model configurations, replication filters, country-specific checks, partner profiles, and workflow configurations.
- Strong domain knowledge of core SAP data models: Material, Business Partner (including Customer and Vendor), and Finance domains. Enterprise Asset Management data domains from Utopia a plus.
- Stays current on trends and advancements in MDG, cloud data platforms, and AI/ML technologies.
- Hands-on experience in ABAP development and debugging code to troubleshoot problems.
- Ability to understand complex functional and IT requirements and to identify and offer multiple solution options to facilitate the best outcome.
- Excellent oral and written skills to effectively communicate complex ideas and concepts, and to provide clear documentation to management and executive leadership across multiple geographies.
- Ability to quickly adapt to changes in timelines and sequences, deal with ambiguity, and succeed in a high-pressure environment.
- Trusted and respected as a thought leader who can influence and persuade business and IT leaders.
- Highly collaborative and supportive of the business and of its ideals and strategies.
- Highly innovative, with an aptitude for foresight, systems thinking, and design thinking.
- Vendor- and technology-neutral: more interested in achieving targeted business outcomes.
- Practical in approach to problem solving and decision making.

Posted 1 week ago

Apply

12.0 - 17.0 years

13 - 18 Lacs

pune

Work from Office

Role Description:
This role will help design, improve, and effectively implement DB's Data Management framework across the bank. The role involves delivering value in key areas such as data governance, data quality and controls, data lineage, and compliance metrics and KPIs by leveraging innovative thinking, data governance technologies, and automation of workflows.

Your key responsibilities:
- Design standards, improve processes, and refine the monitoring approach to strengthen DB's data governance frameworks and to facilitate their efficient implementation by divisions across the bank.
- Develop the operating model and standards for implementing the bank's data governance framework, data lineage, data quality and controls, and data sourcing.
- Design and implement a methodology for closing data governance gaps identified during regulatory and internal audits.
- Design and implement a methodology to capture additional metadata associated with authoritative sources, to identify strategic vs. non-strategic sources utilized for sourcing data.

Your skills and experience:
- 12+ years of proven experience working with organizations' data governance teams, developing data governance policy, procedures, and standards.
- Experience developing standards and operating models for critical data identification and end-to-end data lineage depiction.
- Experience with data governance tools (e.g., Collibra) as well as data lineage tools (e.g., Solidatus).
- Strong problem-solving and analytical skills.
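The end-to-end data lineage work described in this role is, at its core, graph traversal over dataset dependencies. A minimal sketch in plain Python (the dataset names are invented; real lineage tools such as Solidatus model far richer metadata, but the upstream-dependency question is the same):

```python
from collections import deque

# Map each downstream dataset to the upstream sources it is derived from.
lineage = {
    "regulatory_report": ["risk_mart"],
    "risk_mart": ["trades_cleansed", "ref_data"],
    "trades_cleansed": ["trades_raw"],
}

def upstream_sources(dataset, graph):
    """Breadth-first walk of the lineage graph: collect every transitive
    upstream dependency of the given dataset."""
    found, queue = set(), deque([dataset])
    while queue:
        for parent in graph.get(queue.popleft(), []):
            if parent not in found:
                found.add(parent)
                queue.append(parent)
    return found

print(sorted(upstream_sources("regulatory_report", lineage)))
# ['ref_data', 'risk_mart', 'trades_cleansed', 'trades_raw']
```

A traversal like this is what makes the governance questions in the listing answerable: which authoritative sources feed a regulatory metric, and which reports are impacted when a source's data quality control fails.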

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As a Power BI developer, you will require the following skills: - Advanced skills and experience with Power BI Report Development (Pro License) and Proficient in Power BI Service Management. - Strong understanding of Data Analysis Expressions (DAX). - Advanced knowledge of ETL processes with Power Query. - Experience in building complex Power BI dashboards and expert in visualization. - Design and create SQL queries, interpret advanced SQL queries, and extract or manipulate data for reporting and analysis purposes. - Advanced proficiency in MS Excel and Excel 365 complex functions. - Knowledge in Power Platform tools such as Power Apps, Power Automate, and Power Pages. Good to have Skills: - Experience in managing a similar reporting and analysis function. - Experience with Alteryx and Azure ML / Python. - Command on Mathematical concepts especially statistics and probability. - Drive, design, develop, deploy, and maintain a BI-driven transformation of business and financial reporting using Power BI and SQL. Your role and responsibilities will include: - Designing, developing, deploying, and maintaining BI interfaces, including data visualizations, dashboards, and reports using Power BI. - Developing and executing database queries and conducting analyses to optimize the performance of Power BI reports. - Collaborating with cross-functional teams to gather reporting requirements and translate them into actionable insights. - Analyzing complex data sets and providing business insights and recommendations to support decision-making processes. 
- Ensuring data accuracy and integrity by conducting regular reviews, data validation, troubleshooting, and documentation. - Participating in the design and implementation of data governance policies and procedures. - Enhancing efficiency through Lean methodologies, automation, and digital integration to improve processes and foster business growth. - Staying up-to-date with industry trends and advancements in reporting and analytics tools and techniques. Requirements: - Passionate about data analysis and reporting, with a strong command of Power Platform tools, along with SQL Server Management Studio skills to handle the backend data. - Proven experience as a Reporting and Analytics Specialist, Business Intelligence Analyst, or similar role. - Excellent analytical and problem-solving skills with the ability to turn complex data into actionable insights. - Strong communication and interpersonal skills, with the ability to effectively communicate technical concepts to non-technical stakeholders. - Attention to detail and strong organizational skills to handle multiple tasks and meet deadlines. - Continuous learning mindset and the ability to adapt to evolving reporting and analytics technologies. At EY, we exist to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
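The "design and create SQL queries ... for reporting and analysis purposes" requirement can be illustrated with a small self-contained example. Python's built-in sqlite3 stands in for SQL Server here, and the schema and figures are made up; the window-function pattern shown is similar in spirit to what a DAX measure would compute inside Power BI:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, fiscal_year INTEGER, revenue REAL);
INSERT INTO sales VALUES
  ('North', 2023, 100.0), ('North', 2024, 130.0),
  ('South', 2023, 80.0),  ('South', 2024, 72.0);
""")

# A typical report-feeding query: yearly revenue per region with the
# change versus the prior year, computed with a LAG window function.
query = """
SELECT region, fiscal_year, revenue,
       revenue - LAG(revenue) OVER (PARTITION BY region ORDER BY fiscal_year) AS yoy_change
FROM sales
ORDER BY region, fiscal_year;
"""
for row in conn.execute(query):
    print(row)
# ('North', 2023, 100.0, None)
# ('North', 2024, 130.0, 30.0)
# ('South', 2023, 80.0, None)
# ('South', 2024, 72.0, -8.0)
```

Pre-shaping data in SQL like this keeps the Power BI model lean; the dashboard then only needs lightweight measures on top of an already-aggregated feed.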

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

We are seeking a Power BI Lead Developer with over 8 years of experience in developing business intelligence solutions using Microsoft Power BI, alongside at least 5 years of experience with SQL/Database systems. As a Power BI Lead Developer, you will be responsible for designing, developing, and maintaining interactive and visually compelling Power BI dashboards and reports. Additionally, you will develop robust data models, optimize DAX queries, make changes to SQL tables, and fine-tune Power BI performance. Collaboration with stakeholders is a key aspect of this role, as you will work closely with business users, analysts, and data engineers to understand reporting needs and translate them into effective Power BI solutions. Moreover, you will implement security measures such as row-level security and access control mechanisms to ensure data privacy and governance. In addition to your technical skills, we are looking for someone with excellent communication skills to effectively interact with both business stakeholders and technical teams. Preferred qualifications include being Microsoft Certified as a Power BI Data Analyst Associate or holding an equivalent certification. While not mandatory, knowledge of Azure Data Services and experience with Agile methodologies and version control tools like Git would be beneficial for this role. Furthermore, familiarity with AI/ML-powered analytics in Power BI and integrating Power BI with Python/R for advanced analytics would be considered advantageous. This position offers flexibility in terms of work mode, allowing for remote, hybrid, or onsite work as per company policy. At Yavda, we prioritize a happy and productive team environment, offering opportunities for career growth, mentorship from experienced data leaders, and 22 paid annual leave days. 
If you are ready to contribute your creativity, passion, and expertise to a company that is truly making a difference, we encourage you to apply today by emailing "recruit@yavda.in" with the subject line "read the full post for Lead Power BI Developer." Share with us why you believe you are the perfect fit for Yavda.

Posted 1 week ago

Apply