
Modeling Methodologies Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

Experience: 10.0 - 14.0 years
Salary: 0 Lacs
Location: Dehradun, Uttarakhand
Work mode: On-site

As a Data Modeler, your primary responsibility will be to design and develop conceptual, logical, and physical data models supporting enterprise data initiatives. You will work with modern storage formats such as Parquet and ORC, and build and optimize data models within Databricks Unity Catalog. Collaborating with data engineers, architects, analysts, and stakeholders, you will ensure alignment with ingestion pipelines and business goals. Translating business and reporting requirements into robust data architecture, you will follow best practices in data warehousing and Lakehouse design. Your role will also involve maintaining metadata artifacts, enforcing data governance, quality, and security protocols, and continuously improving modeling processes.

You should have over 10 years of hands-on experience in data modeling within Big Data environments. Your expertise should include OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices. Proficiency in modeling methodologies such as Kimball, Inmon, and Data Vault is essential, and hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Experience in Databricks with Unity Catalog and Delta Lake is required, along with a strong command of SQL and Apache Spark for querying and transformation. Familiarity with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial, and exposure to Azure Purview or similar data cataloging tools is a plus.

Strong communication and documentation skills are necessary for this role, as is the ability to work in cross-functional agile environments. A Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field is required. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are a plus, and experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks such as GDPR and HIPAA are advantageous.
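To illustrate the kind of dimensional modeling the role describes, here is a minimal sketch of a Kimball-style star schema defined as Delta tables under Unity Catalog's three-level namespace, followed by a PySpark load from raw Parquet. All catalog, schema, table, and path names are invented for illustration and are not part of the posting.

```python
# Hypothetical sketch of a star schema in Databricks (Delta Lake + Unity Catalog).
# Names like main.sales.dim_customer and /mnt/raw/orders/ are made up for this example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Dimension table: one row per customer, keyed by a surrogate identity column.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.dim_customer (
        customer_sk   BIGINT GENERATED ALWAYS AS IDENTITY,
        customer_id   STRING,
        customer_name STRING,
        region        STRING
    ) USING DELTA
""")

# Fact table: one row per order line, referencing the dimension by surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.fact_orders (
        order_id    STRING,
        customer_sk BIGINT,
        order_date  DATE,
        amount      DECIMAL(18, 2)
    ) USING DELTA
    PARTITIONED BY (order_date)
""")

# Simple load: read raw Parquet (assuming surrogate keys were resolved upstream)
# and append the selected columns into the fact table.
raw = spark.read.parquet("/mnt/raw/orders/")  # hypothetical path
(raw.select("order_id", "customer_sk", "order_date", "amount")
    .write.format("delta")
    .mode("append")
    .saveAsTable("main.sales.fact_orders"))
```

In a fuller pipeline the fact load would join against the dimension to look up surrogate keys rather than assume they arrive in the raw data; the sketch only shows the table shapes and the append step.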

Posted 1 day ago
