10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
As a Data Modeler, your primary responsibility will be to design and develop conceptual, logical, and physical data models supporting enterprise data initiatives. You will work with modern storage formats such as Parquet and ORC, and build and optimize data models within Databricks Unity Catalog. Collaborating with data engineers, architects, analysts, and stakeholders, you will ensure alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into robust data architecture, following best practices in data warehousing and Lakehouse design. Your role will also involve maintaining metadata artifacts, enforcing data governance, quality, and security protocols, and continuously improving modeling processes.

You should have over 10 years of hands-on experience in data modeling within Big Data environments. Your expertise should include OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices. Proficiency in modeling methodologies such as Kimball, Inmon, and Data Vault is essential. Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Experience in Databricks with Unity Catalog and Delta Lake is required, along with a strong command of SQL and Apache Spark for querying and transformation. Familiarity with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are necessary, as is the ability to work in cross-functional agile environments.

A Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field is required. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are a plus. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks such as GDPR and HIPAA are advantageous.
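The dimensional-modeling experience this posting asks for centers on Kimball-style star schemas. The sketch below, using SQLite and entirely made-up table names and data, shows the basic shape: a fact table of measurements joined to dimension tables through surrogate keys.

```python
import sqlite3

# Minimal star-schema sketch (hypothetical tables/data) illustrating
# Kimball-style dimensional modeling: one fact table keyed to surrogate
# keys in its dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT,                  -- natural/business key
    region       TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,       -- e.g. 20240115
    year     INTEGER,
    month    INTEGER
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'C-100', 'North'), (2, 'C-200', 'South');
INSERT INTO dim_date VALUES (20240115, 2024, 1), (20240220, 2024, 2);
INSERT INTO fact_sales VALUES (1, 20240115, 120.0), (1, 20240220, 80.0),
                              (2, 20240115, 50.0);

-- A typical analytical query: slice the fact table by a dimension attribute.
""".replace("-- A typical", "-- "))

# A typical analytical query: slice the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('North', 200.0), ('South', 50.0)]
```

The same pattern carries over to Delta tables in Unity Catalog; only the DDL dialect and storage layer change.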
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
You should be familiar with modern storage formats such as Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, you will maintain comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will also be part of your role, and you will continuously evaluate and improve modeling processes.

The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required, as is proficiency in modeling methodologies including Kimball, Inmon, and Data Vault. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial, and exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments.

Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Cloud Storage Administrator will manage and support cloud-based storage platforms in AWS and/or Azure. This role involves configuring, monitoring, and optimizing object, block, and file storage solutions to ensure high availability, performance, and data protection across our cloud infrastructure.

Required Skills
- Administer and support cloud storage services such as Amazon S3, EBS, EFS, and Glacier, and Azure Blob, File, and Archive Storage.
- Disaster mitigation design and implementation experience, with a focus on architecture for cross-region replication, backup management, RTO and RPO planning, and chaos-engineering recovery. Demonstrated use of AWS Elastic Disaster Recovery or Azure Site Recovery.
- Certification and privacy standards associated with PII, data protection, and compliance gap expectations. Ability to identify and tag PII, applying encryption and masking techniques; knowledge and experience in compliance certification (SOC 2, ISO 27001, GDPR, etc.); demonstrated use of Amazon Macie or Azure Purview.
- Monitoring and cost-optimization practices to proactively alert on performance, usage, and anomalies. Demonstrated use of AWS CloudWatch or Azure Monitor, and AWS Cost Explorer or Azure Cost Management.
- Embrace IaC and automation practices for backups, lifecycles, and archival policies. Demonstrated expertise with AWS CloudFormation or Azure DevOps, and a history of using Terraform modules for cloud storage.
- Manage backup and recovery processes using native cloud tools and third-party solutions.
- Implement storage policies including lifecycle rules, replication, and access controls.
- Perform capacity planning and forecasting for storage growth and utilization.
- Collaborate with infrastructure and application teams to meet storage and data access requirements.
- Ensure storage systems comply with data protection, retention, and security standards.
- Document configurations, procedures, and best practices for storage management.
- Respond to incidents and service requests related to storage systems.
- Participate in change and incident management processes aligned with ITSM standards.

Required Experience
- 3+ years of experience in storage administration with cloud platforms (AWS, Azure, or both).
- Hands-on experience with cloud-native storage services and understanding of storage protocols.
- Experience with AWS CloudWatch and Azure Monitor, and the ability to set up proactive alerting on storage performance, usage, and anomalies.
- Strong troubleshooting and performance-tuning skills related to storage.
- Familiarity with backup and disaster recovery solutions in cloud environments.
- Understanding of identity and access management as it pertains to storage services.
- Knowledge of ITSM processes such as incident, change, and problem management.
- Experience with storage cost monitoring tools such as AWS Cost Explorer or Azure Cost Management.
- Knowledge of IaC tools (Terraform, CloudFormation) for provisioning storage resources and automating backup, lifecycle, and archival policies.
- Producing technical documentation.
- Exposure to enterprise backup solutions.
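Lifecycle rules like those mentioned above are usually expressed as a policy document. This sketch builds one in plain Python (the rule ID, prefix, tiers, and day counts are illustrative, and no cloud API is called), in the shape that boto3's `put_bucket_lifecycle_configuration` accepts as `LifecycleConfiguration`, plus the kind of sanity check an administrator might automate.

```python
# Illustrative lifecycle policy for object storage: transition objects to
# colder tiers over time, then expire them. All names/values are made up;
# this dict is the structure AWS S3 expects, but nothing is sent to AWS.
lifecycle_policy = {
    "Rules": [
        {
            "ID": "archive-then-expire-logs",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 365},
        }
    ]
}

def check_rule(rule):
    """Sanity checks worth automating: transitions must be in
    chronological order and must finish before the expiration date."""
    days = [t["Days"] for t in rule["Transitions"]]
    assert days == sorted(days), "transitions out of order"
    assert days[-1] < rule["Expiration"]["Days"], "transition after expiry"
    return True

ok = all(check_rule(r) for r in lifecycle_policy["Rules"])
print(ok)  # True
```

Keeping the policy in code (or Terraform) rather than clicking it together in a console is exactly the IaC practice the posting asks for.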
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Lead Data Engineer specializing in Data Governance with expertise in Azure Purview, your key responsibilities will include designing and implementing data governance policies, standards, and procedures to guarantee data quality, consistency, and security. You will identify, analyze, and resolve data quality issues across various data sources and platforms. Collaboration with cross-functional teams to enforce data governance structures and ensure adherence to policies and standards will be a crucial aspect of your role.

Your role will also involve implementing and maintaining monitoring systems to track data quality, compliance, and security. Proficiency in data modeling, data warehousing, ETL processes, and data quality tools is essential, and familiarity with data governance tools such as Azure Purview will help you execute your duties effectively. Ensuring that data is safeguarded and complies with privacy regulations through appropriate access controls and security measures will be a top priority. You will also facilitate data stewardship activities and provide guidance to data stewards on best practices in data governance.

Leveraging Azure OneLake and Azure Synapse Analytics, you will design and implement scalable data storage and analytics solutions that support big data processing and analysis. Your expertise in these areas will be instrumental in meeting the organization's data processing and analysis requirements.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Microsoft Fabric Professional at YASH Technologies, you will work with cutting-edge technologies to bring about real positive change in an increasingly virtual world. You will have the opportunity to contribute to business transformation by leveraging your experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure Storage Services, Azure SQL, ETL, Azure Cosmos DB, Event Hub, Azure Data Catalog, Azure Functions, and Azure Purview.

With 5-8 years of experience in Microsoft Cloud solutions, you will create pipelines, datasets, dataflows, and integration runtimes, and monitor pipelines. Your role will also entail extracting, transforming, and loading data from source systems using Azure Databricks, as well as preparing DB design documents based on client requirements. Collaborating with the development team, you will create database structures, queries, and triggers while working on SQL scripts and Synapse pipelines for data migration to Azure SQL. Your responsibilities will include building data migration pipelines that move on-prem SQL Server data to the Azure cloud, migrating databases from on-prem SQL Server to the Azure dev environment, and implementing data governance in Azure. You will also work with the Azure Data Catalog and draw on experience in Big Data batch processing, interactive processing, and real-time processing solutions. Mandatory certifications are required for this role.

At YASH Technologies, you will have the opportunity to create a career path tailored to your aspirations within an inclusive team environment. Our Hyperlearning workplace is built on principles of flexible work arrangements, free spirit, emotional positivity, agile self-determination, trust, transparency, open collaboration, support for realizing business goals, stable employment, and an ethical corporate culture. Join us to embark on a journey of continuous learning, unlearning, and relearning in a dynamic and evolving technology landscape.
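At the core of the data-migration work this posting describes is a batched copy from source to target. A toy sketch, with two in-memory SQLite databases standing in for the on-prem SQL Server and Azure SQL (the table name, row data, and batch size are made up):

```python
import sqlite3

# Batched table copy: the essential loop inside a source -> cloud
# data-migration pipeline. Real pipelines (ADF, Synapse) add retries,
# watermarking, and type mapping; the chunked read/write is the same idea.
src = sqlite3.connect(":memory:")   # stand-in for on-prem SQL Server
dst = sqlite3.connect(":memory:")   # stand-in for Azure SQL

src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, i * 10.0) for i in range(1, 11)])
dst.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")

BATCH = 4
cur = src.execute("SELECT id, total FROM orders ORDER BY id")
while True:
    batch = cur.fetchmany(BATCH)    # read in chunks, never all at once
    if not batch:
        break
    dst.executemany("INSERT INTO orders VALUES (?, ?)", batch)
    dst.commit()                    # commit per batch so a retry can resume

count = dst.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 10
```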
Posted 2 weeks ago
3.0 - 7.0 years
12 - 20 Lacs
Hyderabad, Bengaluru
Hybrid
Skills: OCI Data Catalog, Oracle Cloud Infrastructure (OCI), Metadata Management, Data Lineage, Data Classification, Data Stewardship, Data Governance, Cloud Data Integration, Oracle Autonomous Database, Oracle Object Storage, PoC (Proof of Concept) Delivery, Cloud Data Management, SQL, ETL Tools (e.g., Oracle Data Integrator, Informatica), Data Catalog Tools (e.g., Collibra, Alation, Azure Purview for comparative experience), On-premise to Cloud Integration, API Integration (for metadata harvesting)
Posted 2 weeks ago
10.0 - 18.0 years
30 - 45 Lacs
Bengaluru
Remote
Key responsibilities:
- Lead the architecture and design of data governance solutions using Microsoft Purview.
- Collaborate with data engineering, data science, and business teams to define data governance policies and standards.
- Implement and manage data classification, lineage, and cataloging processes in Microsoft Purview.
- Develop strategies for data quality, data security, and compliance with regulations such as GDPR and CCPA.
- Conduct training sessions and workshops to educate teams on data governance best practices and the use of Microsoft Purview tools.
- Monitor and optimize the performance of data governance solutions, ensuring they meet the organization's needs.
- Provide technical leadership and mentorship to junior team members.

Skills and Tools Required:
- Extensive experience with Microsoft Azure and Microsoft Purview.
- Strong knowledge of data governance principles, frameworks, and best practices.
- Proficiency in SQL and data modeling concepts.
- Experience with data visualization tools such as Power BI or Tableau.
- Familiarity with data privacy regulations and compliance requirements.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills to work effectively with stakeholders at all levels.
- Knowledge of cloud architecture and data architecture principles.
- Experience with ETL tools and data transformation processes is a plus.
- Ability to work in a fast-paced, collaborative environment.

Note: please do not apply if you don't have experience in Microsoft Purview.
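Microsoft Purview ships built-in classifiers for the data-classification work this posting describes; as a purely illustrative stand-in (not Purview's API), this toy scanner tags a column when all of its sampled values match a pattern. The column names, sample data, and rule set are invented.

```python
import re

# Toy column classifier: sample values per column, tag the column when a
# sensitive-data pattern matches every sample. Real catalogs use many more
# rules, confidence thresholds, and partial-match scoring.
RULES = {
    "EMAIL": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "US_PHONE": re.compile(r"^\d{3}-\d{3}-\d{4}$"),
}

def classify(column_samples):
    tags = {}
    for column, samples in column_samples.items():
        for tag, pattern in RULES.items():
            if samples and all(pattern.match(s) for s in samples):
                tags[column] = tag
    return tags

samples = {
    "contact": ["alice@example.com", "bob@example.org"],
    "phone": ["555-123-4567"],
    "notes": ["free text"],
}
print(classify(samples))  # {'contact': 'EMAIL', 'phone': 'US_PHONE'}
```

The tags produced this way are what downstream governance policies (masking, access control, retention) key on.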
Posted 3 weeks ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
- Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles.
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation.
- Contribute to reusable components / assets / accelerators to support capability development.
- Participate in customer presentations as a platform architect / subject matter expert on Big Data, Azure Cloud, and related technologies.
- Participate in customer PoCs to deliver the outcomes.
- Participate in delivery reviews / product reviews and quality assurance, and act as design authority.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Experience in designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems.
- Experience in data engineering and architecting data platforms.
- Experience in architecting and implementing data platforms on the Azure Cloud Platform; Azure cloud experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
- Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks.

Preferred technical and professional experience
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-prem.
- Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 2 months ago
5.0 - 10.0 years
3 - 12 Lacs
Indore, Madhya Pradesh, India
On-site
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Microsoft Azure Modern Data Platform
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Job Description:
- Develop, customize, and manage integration tools, databases, warehouses, and analytical systems using data-related instruments/instances.
- Create and run complex queries and automation scripts for operational data processing.
- Test the reliability and performance of each part of a system and cooperate with the testing team.
- Deploy data models into production environments. This entails providing the model with data stored in a warehouse or coming directly from sources, configuring data attributes, managing computing resources, setting up monitoring tools, etc.
- Set up tools to view data, generate reports, and create visuals.
- Monitor the overall performance and stability of the system; adjust and adapt the automated pipeline as data, models, and requirements change.
- Mentor and train colleagues where necessary, helping them learn and improve their skills, as well as innovate and iterate on best practices.
- Solve complex issues with minimal supervision.
- Make improvement and process recommendations that have an impact on the business.

Technology Stack Used & Required Knowledge:
- Mastery of Azure Data Services (minimum of 4 years), such as Azure Data Factory, Azure Synapse, Azure Databricks, Azure SQL Database, Azure Cosmos DB, Azure Blob Storage / Data Lake Storage, and Azure Stream Analytics.
- 5+ years of experience using Python/PySpark for data engineering; understanding of data types and handling of different data models.
- Excellent understanding of the ETL cycle.
- Expertise in designing scalable and efficient data architectures on Azure is a plus.
- Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, and machine learning algorithms is a plus.
- Good scripting and programming skills.
- Expertise in implementing data governance frameworks and ensuring compliance using Azure Purview.
- Familiarity with Azure Data Catalog for managing and discovering data assets.
- Understanding of Azure Key Vault for secure key management.
- Azure scripting with PowerShell / Azure CLI.

Others:
- Flexibility in using different technologies/platforms.
- Ability to shift gears quickly and cope with change.
- Analytical and detail oriented.
- Confident, proactive, and self-motivated.
- Disciplined, with a team-first approach.
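The ETL cycle the stack section refers to reduces to three small functions: extract from a source, transform, load into a warehouse table. A self-contained sketch with invented source rows and an in-memory SQLite table standing in for the warehouse:

```python
import sqlite3

# Minimal extract-transform-load cycle. Source rows, table schema, and
# cleanup rules are made up for illustration; a real pipeline would read
# from a landing zone and write to a warehouse such as Azure SQL/Synapse.
def extract():
    # stand-in for reading from a source system
    return [
        {"id": 1, "city": " pune ", "amount": "100"},
        {"id": 2, "city": "Indore", "amount": "250"},
    ]

def transform(rows):
    # typical cleanup: trim/normalize strings, cast types
    return [
        (r["id"], r["city"].strip().title(), int(r["amount"]))
        for r in rows
    ]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INT, city TEXT, amount INT)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 350
```

In PySpark the shape is identical; `extract`/`transform`/`load` become DataFrame reads, column expressions, and a `write` to the sink.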
Posted 2 months ago
9.0 - 14.0 years
9 - 12 Lacs
Pune, Maharashtra, India
On-site
Role & responsibilities

Job Description: As a Senior Azure Purview Expert, you will lead the implementation, configuration, and optimization of Azure Purview within our organization. You will collaborate closely with cross-functional teams to establish and enforce data governance policies, ensure data quality and lineage, and maximize the value of our data assets. The ideal candidate will possess deep expertise in Azure Purview, strong analytical skills, and a proven track record of driving successful data governance initiatives.

Responsibilities:
- Lead the design, implementation, and configuration of Azure Purview to establish comprehensive data governance frameworks and practices.
- Collaborate with stakeholders to define and enforce data governance policies, standards, and best practices.
- Configure and optimize data discovery, classification, and lineage tracking processes within Azure Purview.
- Develop and implement data quality management strategies to ensure the accuracy, completeness, and reliability of our data assets.
- Collaborate with data engineering and architecture teams to integrate Azure Purview with existing data platforms and tools.
- Stay current with industry trends, emerging technologies, and best practices in data governance, analytics, and cloud computing.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree preferred.
- 10-12 years of experience in data management, data governance, and analytics, with a focus on cloud-based platforms.
- At least 3 years of hands-on experience with Azure Purview, including implementation, configuration, and optimization.
- Strong understanding of data governance principles, policies, and best practices, with experience implementing data governance frameworks in enterprise environments.
- Proficiency in data quality management, metadata management, and data lineage tracking.
- Experience working with diverse data sources and platforms, including structured and unstructured data, relational databases, data lakes, and cloud-based storage solutions.
- Excellent communication skills, with the ability to collaborate effectively with cross-functional teams and communicate complex technical concepts to non-technical stakeholders.
- Strong analytical and problem-solving skills, with a proactive and results-oriented approach to addressing technical challenges and driving continuous improvement.

Preferred Qualifications:
- Relevant Azure certification (e.g., DP-203: Data Engineering on Microsoft Azure).
- Experience with other Azure services, such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure SQL Database.
- Familiarity with data governance tools and platforms, such as Collibra, Informatica Axon, Alation, or Erwin Data Intelligence Suite.
- Experience working in regulated industries, such as the healthcare and life sciences domains.
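Lineage tracking, one of the responsibilities above, boils down to maintaining and walking a graph of assets and their upstream sources. Purview builds this graph automatically from scanned pipelines; this toy illustration with hypothetical asset names only shows the underlying structure and the "where did this table come from?" query.

```python
# Hypothetical lineage graph: each asset maps to its direct upstream
# sources. Real catalogs store this per-column and per-pipeline-run.
LINEAGE = {
    "report.sales_dashboard": ["warehouse.fact_sales"],
    "warehouse.fact_sales": ["staging.orders", "staging.customers"],
    "staging.orders": ["source.crm"],
    "staging.customers": ["source.crm"],
}

def upstream(asset, graph):
    """All transitive upstream assets of `asset` (depth-first walk)."""
    seen = set()
    stack = list(graph.get(asset, []))
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

result = sorted(upstream("report.sales_dashboard", LINEAGE))
print(result)
# ['source.crm', 'staging.customers', 'staging.orders', 'warehouse.fact_sales']
```

The same walk run in the other direction (downstream) is what powers impact analysis: which reports break if a source column changes.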
Posted 2 months ago