12 - 22 years
35 - 60 Lacs
Chennai
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 Yrs. Location: Pan India. Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Build dimensional data models applying best practices and providing business insights. Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis. Identify business needs and translate business requirements into conceptual, logical, physical and semantic, multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage. Create and maintain the source-to-target data mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc. Develop best practices for standard naming conventions and coding practices to ensure consistency of data models. Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information. Work with the development team to implement data strategies, build data flows and develop conceptual data models. Create logical and physical data models using best practices to ensure high data quality and reduced redundancy. Optimize and update logical and physical data models to support new and existing projects. Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design. Data design and performance optimization for large data warehouse solutions. Understanding data: profiling and analysis of metadata (formats, definitions, valid values, boundaries) and relationships/usage. Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse and BI systems. Strong verbal and written communication skills required. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With Regards, Sankar G, Sr. Executive - IT Recruitment
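For illustration of the dimensional (star-schema) modelling this posting describes, below is a minimal sketch in generic ANSI SQL: one additive fact table joined to two conformed dimensions via surrogate keys. All table and column names (dim_customer, dim_date, fact_sales, etc.) are hypothetical examples, not taken from the posting.

```sql
-- Hypothetical star schema for a sales mart: conformed dimensions plus one fact table.
CREATE TABLE dim_customer (
    customer_key   INTEGER      PRIMARY KEY,  -- surrogate key
    customer_id    VARCHAR(20)  NOT NULL,     -- natural/business key from the source system
    customer_name  VARCHAR(100),
    region         VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key       INTEGER      PRIMARY KEY,  -- e.g. 20250131
    calendar_date  DATE         NOT NULL,
    month_name     VARCHAR(10),
    fiscal_year    INTEGER
);

CREATE TABLE fact_sales (
    sales_key      INTEGER      PRIMARY KEY,
    customer_key   INTEGER      NOT NULL REFERENCES dim_customer (customer_key),
    date_key       INTEGER      NOT NULL REFERENCES dim_date (date_key),
    quantity       INTEGER,
    net_amount     DECIMAL(18,2)              -- additive measure, summable across dimensions
);
```

A snowflake or Data Vault 2.0 variant of the same content would split dim_customer further (for example a separate region table, or hubs/links/satellites); the star form above trades some normalization for simpler, faster analytical joins.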
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Kolkata
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 Yrs. Location: Pan India. Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Build dimensional data models applying best practices and providing business insights. Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis. Identify business needs and translate business requirements into conceptual, logical, physical and semantic, multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage. Create and maintain the source-to-target data mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc. Develop best practices for standard naming conventions and coding practices to ensure consistency of data models. Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information. Work with the development team to implement data strategies, build data flows and develop conceptual data models. Create logical and physical data models using best practices to ensure high data quality and reduced redundancy. Optimize and update logical and physical data models to support new and existing projects. Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design. Data design and performance optimization for large data warehouse solutions. Understanding data: profiling and analysis of metadata (formats, definitions, valid values, boundaries) and relationships/usage. Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse and BI systems. Strong verbal and written communication skills required. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With Regards, Sankar G, Sr. Executive - IT Recruitment
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Noida
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested. Relevant Experience: 8 - 24 Yrs. Location: Pan India. Job Description: The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms. Build dimensional data models applying best practices and providing business insights. Build data warehouses and data marts (on cloud) while performing data profiling and quality analysis. Identify business needs and translate business requirements into conceptual, logical, physical and semantic, multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage. Create and maintain the source-to-target data mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc. Develop best practices for standard naming conventions and coding practices to ensure consistency of data models. Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information. Work with the development team to implement data strategies, build data flows and develop conceptual data models. Create logical and physical data models using best practices to ensure high data quality and reduced redundancy. Optimize and update logical and physical data models to support new and existing projects. Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design. Data design and performance optimization for large data warehouse solutions. Understanding data: profiling and analysis of metadata (formats, definitions, valid values, boundaries) and relationships/usage. Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse and BI systems. Strong verbal and written communication skills required. If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in. With Regards, Sankar G, Sr. Executive - IT Recruitment
Posted 1 month ago
8 - 13 years
0 - 0 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Title: Data Architect with Snowflake & Data Modeling Expertise. Location: Chennai / Remote. Experience: 8+ years. Employment Type: Freelancing/Consultant. Job Summary: We are seeking a highly skilled and experienced Data Architect with strong expertise in data modeling and Snowflake to design, develop, and optimize enterprise data architecture. The ideal candidate will play a critical role in shaping data strategy, building scalable models, and ensuring efficient data integration and governance. Key Responsibilities: Design and implement end-to-end data architecture using Snowflake. Develop and maintain conceptual, logical, and physical data models. Define and enforce data architecture standards, best practices, and policies. Collaborate with data engineers, analysts, and business stakeholders to gather requirements and design data solutions. Optimize Snowflake performance, including data partitioning, caching, and query tuning. Create and manage data dictionaries, metadata, and lineage documentation. Ensure data quality, consistency, and security across all data platforms. Support data integration from various sources (cloud/on-premises) into Snowflake. Required Skills and Experience: 8+ years of experience in data architecture, data modeling, or similar roles. Hands-on expertise with Snowflake, including Snowpipe, Streams, Tasks, and Secure Data Sharing. Strong experience with data modeling tools (e.g., Erwin, ER/Studio, dbt). Proficiency in SQL, ETL/ELT pipelines, and data warehousing concepts. Experience working with structured, semi-structured (JSON, XML), and unstructured data. Solid understanding of data governance, data cataloging, and security frameworks. Excellent analytical, communication, and stakeholder management skills. Preferred Qualifications: Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with data lakehouse architecture and real-time data processing. Snowflake certification(s) or relevant cloud certifications. Knowledge of Python or scripting for data automation is a plus. Please share the below details: Total Experience, Relevant Experience, Freelancing (Y/N), How many hours can you support us, Expected cost per month. Share your profile to samaravadi@osidigital.com
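As a pointer to the Snowflake Streams and Tasks this posting lists, here is a minimal, hedged sketch of an incremental load: a stream captures changes on a staging table and a scheduled task merges them into a target table. Object names (stg_customer, dim_customer, transform_wh) are assumptions for illustration only.

```sql
-- Capture inserts/updates/deletes landing in the staging table.
CREATE OR REPLACE STREAM stg_customer_stream ON TABLE stg_customer;

-- Run every 15 minutes, but only when the stream actually has new rows.
CREATE OR REPLACE TASK load_dim_customer
  WAREHOUSE = transform_wh                -- assumed virtual warehouse name
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('STG_CUSTOMER_STREAM')
AS
  MERGE INTO dim_customer AS d
  USING stg_customer_stream AS s
    ON d.customer_id = s.customer_id
  WHEN MATCHED THEN
    UPDATE SET d.customer_name = s.customer_name
  WHEN NOT MATCHED THEN
    INSERT (customer_id, customer_name) VALUES (s.customer_id, s.customer_name);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_dim_customer RESUME;
```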
Posted 1 month ago
10 - 20 years
17 - 32 Lacs
Bengaluru
Remote
Location: Remote. Data Architect Job Description: Responsibilities: Extensive Experience: Over 12 to 15 years in data architecture, data modelling, and database engineering, with expertise in OLAP/OLTP design, data warehouse solutions, and ELT/ETL processes. Proficient in Tools and Technologies: 5+ years of experience architecting, designing and developing Microsoft Azure solutions (Data Factory, Synapse, SQL DB, Cosmos DB, etc.), Azure Databricks, Snowflake features (Streams, Data Sharing), and enterprise modelling tools (Erwin, PowerDesigner). Experience in financial services, insurance, or banking industries is a plus. Data Architecture & Modelling: Design and implement enterprise-level data architecture solutions across OLTP, OLAP and Snowflake. Design and implement effective data models using Snowflake, including Star Schema, Snowflake Schema, and Data Vault methodologies. Create and maintain logical and physical data models that align with business requirements and Snowflake's best practices. Knowledge of Slowly Changing Dimensions (SCD Type I & II) in data warehouse projects. Data Migration & Transformation: Lead end-to-end data migration projects from legacy systems to cloud-based environments (Microsoft Azure, Snowflake). Design staging environments and data integration frameworks. Work closely with ETL developers to ensure that the data model is seamlessly integrated with data pipelines, facilitating accurate and efficient data flow. Exposure to dbt ETL tooling. Technology & Performance Optimization: Optimize database performance by implementing indexing strategies, partitioning, query tuning, and workload balancing. Utilize cloud-based data platforms (Azure) and data cataloguing solutions. Monitor and optimize Snowflake performance, including query optimization, resource management, and cost control. Stakeholder Engagement & Leadership: Collaborate with data engineers, business analysts, and data scientists to ensure data models meet reporting and analytical needs. Drive technical roadmap initiatives and ensure alignment with organizational goals. Mentor junior architects and engineers in data modelling, database optimization, and governance best practices. Required Skills & Experience: 12+ years of experience in enterprise-level data architecture, data modelling, and database engineering. Expertise in OLAP & OLTP design, data warehouse solutions, and ELT/ETL processes. Strong verbal and written communication skills for collaborating with both technical teams and business stakeholders. Proficiency in data modelling concepts and practices such as normalization, denormalization, and dimensional modelling (Star Schema, Snowflake Schema, Data Vault, Medallion Data Lake). Experience with Snowflake-specific features, including clustering, partitioning, and schema design best practices. Proficiency in enterprise modelling tools: Erwin, PowerDesigner, IBM InfoSphere, etc. Strong experience in Microsoft Azure data pipelines (Data Factory, Synapse, SQL DB, Cosmos DB, Databricks). Familiarity with Snowflake's native tools and services, including Snowflake Data Sharing, Snowflake Streams & Tasks, and Snowflake Secure Data Sharing. Strong knowledge of SQL performance tuning, query optimization, and indexing strategies. Working knowledge of BIAN, ACORD, and ESG risk data integration. Experience in financial services, insurance, or banking industries is a plus.
Preferred Certifications: Microsoft Azure Data Architect certification; Snowflake cloud database certifications; TOGAF or equivalent enterprise architecture certification.
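For the Slowly Changing Dimension handling (SCD Type I & II) mentioned in the posting above, a minimal Type 2 sketch in Snowflake/PostgreSQL-style SQL follows: expire the current row when a tracked attribute changes, then insert the new version. Table and column names (dim_customer, stg_customer, region, valid_from/valid_to, is_current) are illustrative assumptions, not from the posting.

```sql
-- Step 1: close out current dimension rows whose tracked attribute changed in staging.
UPDATE dim_customer d
SET    is_current = FALSE,
       valid_to   = CURRENT_DATE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = TRUE
  AND  d.region     <> s.region;          -- the attribute we track history for

-- Step 2: insert a new current version for changed customers and for brand-new ones.
INSERT INTO dim_customer (customer_id, customer_name, region, valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, s.region, CURRENT_DATE, NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.is_current  = TRUE
WHERE  d.customer_id IS NULL;             -- no current row: either new, or just expired above
```

An SCD Type 1 load would instead overwrite the attribute in place, with no valid_from/valid_to history columns.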
Posted 2 months ago
7 - 10 years
40 - 47 Lacs
Pune
Work from Office
Job Title: Data Architect. Location: Pune, Maharashtra, India. Experience: 7 to 10 years. As a Data Architect at JCI, you will play a pivotal role in designing and implementing robust data solutions that support our analytics and business intelligence initiatives. This role requires extensive experience in data modeling, data warehousing, and familiarity with cloud technologies. How you will do it: Design and implement data architecture solutions that meet business requirements and align with the overall data strategy. Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and ensure data availability and quality. Create and maintain data models, ensuring they are optimized for performance and scalability. Establish data governance practices to maintain data integrity and security across the organization. Lead the design and implementation of data integration processes, including ETL workflows and data pipelines. Evaluate and recommend new tools and technologies to improve data management capabilities. Provide technical leadership and mentorship to other team members in best practices for data architecture. Stay current with industry trends and advancements in data technologies and methodologies. What we look for: Bachelor's degree in Computer Science, Information Technology, or a related field. 7 to 10 years of experience in data architecture or a similar role. Strong proficiency in SQL and experience with data modeling and database design. Experience with cloud data solutions, such as AWS, Azure, or Google Cloud Platform. Familiarity with data warehousing concepts and tools. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. Join JCI and leverage your expertise to create impactful data solutions that drive our business forward!
Posted 2 months ago
10 - 15 years
12 - 17 Lacs
Indore, Ahmedabad, Hyderabad
Work from Office
Job Title: Technical Architect / Solution Architect / Data Architect (Data Analytics). Notice Period: Immediate to 15 days. Experience: 9+ years. Job Summary: We are looking for a highly technical and experienced Data Architect / Solution Architect / Technical Architect with expertise in data analytics. The candidate should have strong hands-on experience in solutioning, architecture, and cloud technologies to drive data-driven decisions. Key Responsibilities: Design, develop, and implement end-to-end data architecture solutions. Provide technical leadership in Azure, Databricks, Snowflake, and Microsoft Fabric. Architect scalable, secure, and high-performing data solutions. Work on data strategy, governance, and optimization. Implement and optimize Power BI dashboards and SQL-based analytics. Collaborate with cross-functional teams to deliver robust data solutions. Primary Skills Required: Data Architecture & Solutioning; Azure Cloud (Data Services, Storage, Synapse, etc.); Databricks & Snowflake (Data Engineering & Warehousing); Power BI (Visualization & Reporting); Microsoft Fabric (Data & AI Integration); SQL (Advanced Querying & Optimization). Looking for immediate to 15-day joiners!
Posted 2 months ago
8 - 13 years
16 - 20 Lacs
Bengaluru
Work from Office
BNI is looking for an experienced Data Architect to lead and support our data ecosystem, ensuring that our data infrastructure is scalable, secure, and meets the diverse needs of our organization. The ideal candidate will have extensive experience in building and managing Data Lakes, Data Warehouses, and Data Visualization platforms while leading a high-performing data team. This role requires a strategic thinker who can develop a unified data strategy that supports multiple functions such as data reporting, data visualization, data analytics, data science, data engineering, machine learning, and artificial intelligence. The candidate must be local to Bangalore, India and have hands-on expertise in modern data architecture frameworks and technologies. Roles and Responsibilities Data Architecture & Strategy Design, implement, and manage scalable Data Lakes and Data Warehouses that support business intelligence, analytics, and AI-driven decision-making. Define and execute a unified data strategy that aligns with organizational goals and supports multiple data use cases. Ensure data governance, security, and compliance best practices are embedded across all data platforms and processes. Establish data architecture frameworks and principles to optimize data flows and integrations. Technology & Platform Management Lead the adoption of best-in-class Data Visualization platforms such as Tableau, Power BI, Looker, or Qlik to support business intelligence needs. Implement and maintain ETL/ELT pipelines, data integration, and real-time data streaming solutions. Work with cloud-based data platforms (AWS, Azure, Snowflake, Databricks, BigQuery, or Redshift) to ensure efficient data storage, processing, and accessibility. Optimize the performance and scalability of database management systems (SQL, NoSQL, Graph DBs, etc.). Team Leadership & Collaboration Manage, mentor, and grow a team of Data Engineers, Database Administrators, and others to drive excellence in data solutions. Foster collaboration between data teams and business stakeholders to translate business needs into technical solutions. Work closely with AI/ML teams to support model training, feature engineering, and operationalization of AI solutions. Security & Compliance Establish robust data security policies to protect sensitive data while ensuring seamless access control for authorized users. Ensure data compliance with local and international regulations (e.g., GDPR, CCPA). Implement data quality frameworks and monitoring systems to ensure accuracy, consistency, and reliability of data. Qualifications Required: Technical Expertise Proven experience in designing and implementing Data Lakes and Data Warehouses. Expertise in modern data visualization tools (Tableau, Power BI, Looker, Qlik, etc.). Hands-on experience with cloud-based data platforms (AWS, Azure, GCP, Snowflake, Databricks, BigQuery, or Redshift). Strong knowledge of ETL/ELT processes, data modeling, and database management (SQL, NoSQL, Graph Databases). Experience with big data technologies (Hadoop, Spark, Kafka, Flink, etc.). Familiarity with AI/ML workflows and MLOps is a plus. Leadership & Collaboration Strong experience in leading and mentoring data teams. Ability to bridge the gap between business needs and technical solutions. Excellent communication and stakeholder management skills. Preferred: Must be based in Bangalore, India. 8+ years of experience in data architecture, data engineering, or related roles. 
Bachelor's Degree or Equivalent Work Experience. Why Join BNI Tech: Work in a dynamic and innovative technology team that drives impactful data solutions. Opportunity to lead and influence data-driven decision-making in a global organization. Competitive salary and benefits package. Collaborative and growth-oriented work culture.
Posted 2 months ago
3 - 6 years
8 - 14 Lacs
Visakhapatnam
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS Blob Storage, data lakes, data warehouses, etc.) - Ensure security standards are followed and implemented for all data pipeline, Data Science and BI projects conducted by the team - Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms. - Design, develop, test, customize and troubleshoot database security systems and solutions, such as Database Activity Monitoring, Data Obfuscation, and Data Segregation/Segmentation. - Outline access, encryption, and logging requirements in data stores and work with data solutions and data delivery teams to implement them. - Build systematic, technical controls for data discovery, classification and tagging of sensitive information in structured and unstructured data stores. - Provide security expertise and consulting to data solution and delivery teams. - Work alongside the Worley Security team to help remediate security events/incidents. - Collaborate with the Worley Security team to ensure successful completion of our roadmaps and initiatives. - Integrate security testing and controls into different phases of Data Delivery development lifecycles. - Experience working with cloud data platforms - Experience in Information Security - Experience in database administration and database management - Experience in cloud technology built on Azure/AWS and/or Snowflake - Knowledge of data architecture and database technologies - Experience with data science and machine learning anomaly detection - Experience working with vendors and developing security requirements and recommendations based on evaluation of technology.
Posted 2 months ago
3 - 6 years
8 - 14 Lacs
Bareilly
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS Blob Storage, data lakes, data warehouses, etc.) - Ensure security standards are followed and implemented for all data pipeline, Data Science and BI projects conducted by the team - Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms. - Design, develop, test, customize and troubleshoot database security systems and solutions, such as Database Activity Monitoring, Data Obfuscation, and Data Segregation/Segmentation. - Outline access, encryption, and logging requirements in data stores and work with data solutions and data delivery teams to implement them. - Build systematic, technical controls for data discovery, classification and tagging of sensitive information in structured and unstructured data stores. - Provide security expertise and consulting to data solution and delivery teams. - Work alongside the Worley Security team to help remediate security events/incidents. - Collaborate with the Worley Security team to ensure successful completion of our roadmaps and initiatives. - Integrate security testing and controls into different phases of Data Delivery development lifecycles. - Experience working with cloud data platforms - Experience in Information Security - Experience in database administration and database management - Experience in cloud technology built on Azure/AWS and/or Snowflake - Knowledge of data architecture and database technologies - Experience with data science and machine learning anomaly detection - Experience working with vendors and developing security requirements and recommendations based on evaluation of technology.
Posted 2 months ago
9 - 14 years
15 - 22 Lacs
Gurgaon
Work from Office
- Experience in architecting with AWS or Azure cloud data platforms - Successfully implemented large-scale data warehouse / data lake solutions in Snowflake or AWS Redshift - Proficient in data modelling and data architecture design; experienced in reviewing Third Normal Form (3NF) and dimensional models - Experience in implementing master data management, process design and implementation - Experience in implementing data quality solutions, including processes - Experience in IoT design using AWS or Azure cloud platforms - Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation - Experience working with structured and unstructured data, including geo-spatial data - Experience in technologies like Python, SQL, NoSQL, Kafka, Elasticsearch - Hands-on experience using Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake and Azure Search.
Posted 2 months ago
3 - 6 years
8 - 14 Lacs
Mumbai
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS Blob Storage, data lakes, data warehouses, etc.) - Ensure security standards are followed and implemented for all data pipeline, Data Science and BI projects conducted by the team - Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms. - Design, develop, test, customize and troubleshoot database security systems and solutions, such as Database Activity Monitoring, Data Obfuscation, and Data Segregation/Segmentation. - Outline access, encryption, and logging requirements in data stores and work with data solutions and data delivery teams to implement them. - Build systematic, technical controls for data discovery, classification and tagging of sensitive information in structured and unstructured data stores. - Provide security expertise and consulting to data solution and delivery teams. - Work alongside the Worley Security team to help remediate security events/incidents. - Collaborate with the Worley Security team to ensure successful completion of our roadmaps and initiatives. - Integrate security testing and controls into different phases of Data Delivery development lifecycles. - Experience working with cloud data platforms - Experience in Information Security - Experience in database administration and database management - Experience in cloud technology built on Azure/AWS and/or Snowflake - Knowledge of data architecture and database technologies - Experience with data science and machine learning anomaly detection - Experience working with vendors and developing security requirements and recommendations based on evaluation of technology.
Posted 2 months ago
3 - 6 years
8 - 14 Lacs
Amritsar
Work from Office
- Ensure security standards are followed for all structured and unstructured data platforms (e.g., Azure/AWS Blob Storage, data lakes, data warehouses, etc.) - Ensure security standards are followed and implemented for all data pipeline, Data Science and BI projects conducted by the team - Identify and drive implementation of database protection tools to detect and prevent unauthorized access to Worley's data platforms. - Design, develop, test, customize and troubleshoot database security systems and solutions, such as Database Activity Monitoring, Data Obfuscation, and Data Segregation/Segmentation. - Outline access, encryption, and logging requirements in data stores and work with data solutions and data delivery teams to implement them. - Build systematic, technical controls for data discovery, classification and tagging of sensitive information in structured and unstructured data stores. - Provide security expertise and consulting to data solution and delivery teams. - Work alongside the Worley Security team to help remediate security events/incidents. - Collaborate with the Worley Security team to ensure successful completion of our roadmaps and initiatives. - Integrate security testing and controls into different phases of Data Delivery development lifecycles. - Experience working with cloud data platforms - Experience in Information Security - Experience in database administration and database management - Experience in cloud technology built on Azure/AWS and/or Snowflake - Knowledge of data architecture and database technologies - Experience with data science and machine learning anomaly detection - Experience working with vendors and developing security requirements and recommendations based on evaluation of technology.
Posted 2 months ago
9 - 14 years
15 - 22 Lacs
Coimbatore
Work from Office
- Experience in architecting with AWS or Azure cloud data platforms - Successfully implemented large-scale data warehouse / data lake solutions in Snowflake or AWS Redshift - Proficient in data modelling and data architecture design; experienced in reviewing Third Normal Form (3NF) and dimensional models - Experience in implementing master data management, process design and implementation - Experience in implementing data quality solutions, including processes - Experience in IoT design using AWS or Azure cloud platforms - Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation - Experience working with structured and unstructured data, including geo-spatial data - Experience in technologies like Python, SQL, NoSQL, Kafka, Elasticsearch - Hands-on experience using Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake and Azure Search.
Posted 2 months ago
9 - 14 years
15 - 22 Lacs
Vadodara
Work from Office
- Experience in architecting with AWS or Azure cloud data platforms - Successfully implemented large-scale data warehouse / data lake solutions in Snowflake or AWS Redshift - Proficient in data modelling and data architecture design; experienced in reviewing Third Normal Form (3NF) and dimensional models - Experience in implementing master data management, process design and implementation - Experience in implementing data quality solutions, including processes - Experience in IoT design using AWS or Azure cloud platforms - Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation - Experience working with structured and unstructured data, including geo-spatial data - Experience in technologies like Python, SQL, NoSQL, Kafka, Elasticsearch - Hands-on experience using Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake and Azure Search.
Posted 2 months ago
10 - 12 years
25 - 30 Lacs
Pune
Work from Office
The Role: We are seeking an experienced Data Architect with expertise in Workday Reporting and data automation. The ideal candidate will have 10-12 years of experience, with a strong background in data architecture, reporting, and process automation. Key Responsibilities: 1. Workday Reporting Expertise: Design and develop complex Workday reports (Advanced, Composite, and Matrix reports). Deliver data-driven insights using Workday's reporting tools. Ensure the integrity and alignment of reporting solutions with organizational goals. 2. Data Architecture: Create and implement robust data architecture frameworks. Manage seamless end-to-end data flows and system integrations. Optimize data storage, retrieval, and transformation processes for performance and scalability. 3. Automation and Process Optimization: Develop automation strategies for repetitive tasks using tools and scripts. Innovate data automation solutions to minimize manual effort. Maintain quality, consistency, and timeliness in automated processes. 4. Stakeholder Collaboration: Partner with HR, IT, and business teams to understand reporting and data needs. Serve as a subject matter expert in Workday Reporting and data automation. Lead workshops and training sessions to enhance team understanding of reporting tools and processes. 5. Continuous Improvement: Identify and implement opportunities to improve reporting and data processes. Stay updated on emerging trends in data architecture and Workday technologies.
Posted 2 months ago
6 - 11 years
40 - 45 Lacs
Hyderabad
Hybrid
Proven experience in designing and implementing enterprise data architectures. Strong understanding of data modelling concepts and methodologies. Experience with various data technologies (e.g., relational databases, NoSQL databases, data warehouses, data lakes). Established best practices within an Azure platform. Ability to meet and adhere to internal and external deliverables whilst maintaining an exceptionally high standard. Ability to apply design principles and best practices and share knowledge across teams. Familiarity with industry models such as BIAN and DMBOK. Experience in architecture methodologies and notations such as TOGAF and ArchiMate.
Posted 2 months ago
6 - 10 years
8 - 12 Lacs
Pune
Work from Office
Tata Tele Business Services is looking for a Lead - Data Architect to join our dynamic team and embark on a rewarding career journey. The Lead - Data Architect is responsible for overseeing and optimizing data architecture operations. This role involves strategic planning, team coordination, and execution of tasks to ensure efficiency and productivity. The incumbent will collaborate with stakeholders to align operations with business goals. Duties include monitoring performance, ensuring compliance with policies, and implementing best practices. Additionally, they will manage resources, resolve operational challenges, and contribute to continuous improvement initiatives. Strong analytical skills, leadership abilities, and industry knowledge are essential for success in this role.
Posted 2 months ago
4 - 5 years
9 - 13 Lacs
Mumbai
Work from Office
Responsibilities & Key Deliverables: Analyze functional requirements and suggest/recommend/advise on technical solutions and approach. Co-create technical design documents; review and approve tech designs. Lead development activities and report status, risks and issues periodically. Lead 3 to 4 technical developers. Problem/issue analysis and debugging skills. Experience: Minimum 4 to 5 years of Service and Sales Cloud technical experience (Service Cloud). Preferred Qualifications: B.Tech. General Requirements: Strong hands-on experience in Sales/Service/Experience Cloud (must). Technically well-versed in application customization, configuration and APIs. Performed code review and code optimization on Apex/Aura/LWC. Very good Apex/Aura/LWC and Flows skills. Must have either an Integration or Data Architect certification or equivalent experience.
Posted 2 months ago
8 - 12 years
13 - 20 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Type: Contract (8-12 Months) Role Overview: We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions. Responsibilities: Data Architecture Design and Development: - Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies. - Develop and maintain conceptual, logical, and physical data models. - Define and enforce data standards, policies, and procedures. - Evaluate and select appropriate data technologies and tools. - Ensure scalability, performance, and security of data architectures. MS Dynamics and Data Lake Integration: - Lead the integration of MS Dynamics with data lake environments. - Design and implement data pipelines for efficient data movement between systems. - Troubleshoot and resolve integration issues. - Optimize data flow and performance within the integrated environment. ETL and Data Integration: - Design, develop, and implement ETL processes for data extraction, transformation, and loading. - Ensure data quality and consistency throughout the integration process. - Develop and maintain data integration documentation. - Implement data validation and error handling mechanisms. Data Modeling and Data Governance: - Develop and maintain data models that align with business requirements. - Implement and enforce data governance policies and procedures. - Ensure data security and compliance with relevant regulations. - Establish and maintain data dictionaries and metadata repositories. Issue Resolution and Troubleshooting: - Proactively identify and resolve architectural issues. - Conduct root cause analysis and implement corrective actions. - Provide technical guidance and support to development teams. - Communicate issues and risks proactively. Collaboration and Communication: - Collaborate with stakeholders to understand data requirements and translate them into technical solutions. - Communicate effectively with technical and non-technical audiences. - Participate in design reviews and code reviews. - Work well as an individual contributor and as a team player. Qualifications: Experience: - 8-12 years of hands-on experience in data architecture and related fields. - Minimum 4 years of experience in architectural design and integration. - Experience working with cloud-based data solutions. Technical Skills: - Strong expertise in MS Dynamics and data lake architecture. - Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS). - Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling). - Strong understanding of data warehousing concepts and best practices. - Proficiency in SQL and other data query languages. - Experience with data quality assurance and data governance. - Experience with cloud platforms such as Azure or AWS. Soft Skills: - Strong analytical and problem-solving skills. - Excellent communication and interpersonal skills. - Ability to work independently and as part of a team. - Flexible and adaptable to changing priorities. - Proactive and self-motivated.
- Ability to deal with ambiguity. - Open to continuous learning. - Self-confident and humble. - Intelligent, rigorous thinker who can operate successfully amongst bright people. Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 2 months ago
3 - 5 years
5 - 7 Lacs
Bengaluru
Work from Office
We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us. The Team: You will be joining the Catalog team within the Business Apps department. Our mission is to build end-to-end solutions on the Celonis platform, including data models and end-user applications, to accelerate time to value for our customers and partners. The Catalog team within Business Apps specializes in three aspects: defining the data ontology of the most common business processes, building prebuilt transformations for such ontologies for major source systems like SAP, Oracle etc., and lastly, collaborating with various teams in both the Product and Go-to-market organizations to drive adoption at scale. The Role: As a Senior Data Architect, you will own and focus primarily on two aspects. On the one hand, refining prebuilt transformations for existing ontologies (for processes like Order to Cash, Procure to Pay, Inventory Management) for SAP, Oracle etc. and validating them across early adopters in our customer base. On the other, defining and extending the existing ontologies with additional processes and extending them to additional systems. In addition, you will also be responsible for maintaining the quality of content that we produce, and writing documentation on the ontology definitions. This will ensure both internal and external application developers will be able to leverage the data foundation to develop their solutions. The work you'll do: Build data models for the defined ontologies and mappings using object-centric process mining methodologies with performant SQL transformations. Research and design. Facilitate cross-functional interactions with product managers, domain experts, engineers, and consultants. Test and validate the models in development environments and customer environments to gather early feedback. Document the data model governing principles and development. The qualifications you need: You have that rare combination of strong technical expertise and business acumen. You'll use this to build a system-agnostic data model for various business processes. 3-5+ years of experience working in the data field as a Data Engineer, Data Analyst or similar. Must-have: Experience working with data from at least one of the following system types: Strong solution designing skills with a solid understanding of business processes (supply chain, financial, CRM or IT-related processes) and the data beneath the IT systems that run these processes. Experience with databases and data modeling, and hands-on experience with SQL. Ability to work independently and own a part of the team's goals. Very good knowledge of spoken and written English. Ability to communicate effectively and build a good rapport with team members. What Celonis Can Offer You: The unique opportunity to work with industry-leading process mining technology. Investment in your personal growth and skill development (clear career paths, internal mobility opportunities, L&D platform, mentorships, and more). Great compensation and benefits packages (equity (restricted stock units), life insurance, time off, generous leave for new parents from day one, and more). For intern and working student benefits, click here.
Physical and mental well-being support (subsidized gym membership, access to counseling, virtual events on well-being topics, and more). A global and growing team of Celonauts from diverse backgrounds to learn from and work with. An open-minded culture with innovative, autonomous teams. Business Resource Groups to help you feel connected, valued and seen (Black@Celonis, Women@Celonis, Parents@Celonis, Pride@Celonis, Resilience@Celonis, and more). A clear set of company values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future. About Us: Celonis helps some of the world's largest and most esteemed brands make processes work for people, companies and the planet. With over 5,000 enterprise customer deployments across nearly every industry, the Celonis Process Intelligence Platform uses process mining and AI to give you a living digital twin of your business operation. It's system-agnostic and without bias, and empowers companies to reduce waste, create value and benefit people across the top, bottom, and green lines. Since 2011, the Celonis platform has enabled its customers to identify more than $18 billion in value. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide. Get familiar with the Celonis Process Intelligence Platform by watching this video. Data Privacy, Equal Opportunity, and Accessibility Information: Different makes us better. Any information you submit to Celonis as part of your application will be processed in accordance with Celonis' Statements on Data Privacy, Equal Opportunity and Accessibility. Please be aware of common job offer scams, impersonators and frauds. Learn more here. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process.
Posted 2 months ago
4 - 11 years
9 - 10 Lacs
Chennai
Work from Office
Responsibilities: Exchange with the requestor to understand their needs and identify useful information. Ensure information shared with the requestor is valid and accurate. Query, analyse and manipulate large datasets using Google BigQuery and SQL. Design and optimise queries to improve performance and reduce processing costs. Develop and maintain dashboards and reports in Looker Studio (or other BI tools). Automate data pipelines and workflows for efficient reporting. Ensure data security, privacy and compliance with organizational policies. Implement support programs/scripts/algorithms so raw data are more useful to the enterprise. Interpret data and analyse results using statistical techniques. Acquire data from primary or secondary data sources confirmed by the Data Architect. Exchange with the requestor to understand and identify useful information: Ensure a mature business requirement as of analytics needs (clear benefits, identified users). Specify the business requirement to allow the feasibility study (data needed and flow identified). Check the needs and feasibility in accordance with the RD DSIS Data Catalogs definition. Ensure information shared with the requestor is valid and accurate: Ensure data accuracy and flexibility. Use any programming language or tools to support this. Identify ways to improve data reliability, efficiency and quality. Identify and propose RD DSIS Data Catalogs updates. Implement queries and access rights: Define and implement an access-rights policy per report/dashboard, in line with the functional specifications and the rules provided by the Data Architect. Manage the report/dashboard access-rights test phase. Produce the related documentation. Ensure a technical report exists to monitor data access and data usage. Implement support programs/scripts/algorithms so raw data are more useful to the enterprise: Understand the competitive marketplace, business issues, and data challenges in order to deliver actionable insights, recommendations and business processes. Work with Digital Business Partners / Excellence Managers / Digital Process Managers to prioritize business and information needs. In collaboration with the Data Scientist/Developer, turn the volumes of big data into valuable and actionable insights. In collaboration with the Data Scientist/Developer, build, test and maintain frameworks (such as the RD Data Catalogs). In collaboration with the Data Scientist/Developer, deploy new modeling programs, scripts, languages, packages and solutions to enhance the productivity of the team or grow new businesses. Ensure that any data is properly received, transformed, stored, and made accessible to other users (such as the Data Scientist). Ensure data views or tables in the data lake can be used for reporting or any data mining needs.
Interpret data and analyse results using statistical techniques: In charge of providing global dashboards and KPIs to inform stakeholders on value realization. Monitor value and propose levers of improvement. Interpret data, analyse results using statistical techniques and provide ongoing reports. Acquire data from primary or secondary data sources confirmed by the Data Architect: Acquire first-hand data and/or secondary data collected by the Data Developer. In collaboration with the Data Scientist/Developer, use advanced computerised models to extract the data needed. In collaboration with the Data Developer, integrate external or new datasets into existing data pipelines, in line with the Data Architect's design. Ensure RD Data Catalogs for the relevant COE data scope are up to date and inform the Data Architect. Job: RD IS Engineer. Organization: RD DSIS Applications. Schedule: Full time. Employee Status: Regular. Job Type: Permanent contract. Job Posting Date: 2025-03-21. Join Us! Being part of our team, you will join: one of the largest global innovative companies, with more than 20,000 engineers working in Research & Development; a multi-cultural environment that values diversity and international collaboration; more than 100,000 colleagues in 31 countries, which makes for many opportunities for career growth; a business highly committed to limiting the environmental impact of its activities and ranked by Corporate Knights as the number one company in the automotive sector in terms of sustainable development. More information on Valeo: https://www.valeo.com
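To ground the BigQuery responsibilities above (querying large datasets while controlling processing cost), here is a small illustrative sketch in BigQuery SQL: a date-partitioned, clustered table and a query whose partition filter limits the bytes scanned and billed. The project, dataset and column names are hypothetical, not taken from the posting.

```sql
-- Hypothetical partitioned + clustered events table.
CREATE TABLE IF NOT EXISTS `my_project.analytics.events` (
  event_date  DATE,
  user_id     STRING,
  event_name  STRING,
  value       NUMERIC
)
PARTITION BY event_date        -- lets the engine prune whole days at query time
CLUSTER BY event_name;         -- co-locates rows for cheaper filtering/aggregation

-- The date predicate restricts the scan to January's partitions only,
-- which is what keeps on-demand processing costs down.
SELECT
  event_name,
  COUNT(DISTINCT user_id) AS users,
  SUM(value)              AS total_value
FROM `my_project.analytics.events`
WHERE event_date BETWEEN DATE '2025-01-01' AND DATE '2025-01-31'
GROUP BY event_name
ORDER BY total_value DESC;
```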
Posted 2 months ago
10 - 15 years
17 - 32 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
The Data Architect / Data Modeler will be expected to leverage their knowledge of data modeling best practices along with cross-industry data expertise and data modeling tool expertise. Looking for early joiners. Demonstrable experience in developing, validating, publishing, and maintaining LOGICAL data models, with exposure to or experience in developing, validating, publishing, and maintaining PHYSICAL data models. Demonstrable experience using data modeling tools, e.g., Erwin. Evaluate existing data models and physical databases for variances and discrepancies. Experience with managing metadata for data models. Demonstrable experience in developing, publishing, and maintaining all documentation for data models.
Posted 2 months ago
5 - 7 years
13 - 17 Lacs
Kolhapur
Work from Office
PradeepIT is looking for a Senior PBI Data Architect to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture: designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security; developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
Posted 2 months ago
16 - 20 years
50 - 65 Lacs
Hyderabad
Work from Office
DAZN Group is looking for a Data Architect to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture: designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security; developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
Posted 2 months ago