
3 Kimball Methodology Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Data Modeller/Data Modeler, you will lead data architecture efforts across enterprise domains such as Sales, Procurement, Finance, Logistics, R&D, and Advanced Planning Systems (SAP/Oracle). You will design scalable, reusable data models, build data lake foundations, and collaborate with cross-functional teams to deliver robust end-to-end data solutions.

Key Responsibilities:
- Work closely with business and product teams to understand processes and translate them into technical specifications.
- Design logical and physical data models using methodologies such as Medallion Architecture, EDW, or Kimball.
- Source data at the correct grain from authentic source systems or existing DWHs, and create intermediary data models and physical views for reporting and consumption.
- Implement Data Governance, Data Quality, and Data Observability practices.
- Develop business process maps, user journey maps, and data flow/integration diagrams.
- Design integration workflows using APIs, FTP/SFTP, web services, and other tools to support large-scale implementation programs spanning multiple projects.

Required Technical Skills:
- 5+ years of experience on data-focused projects.
- Strong expertise in data modelling: logical, physical, dimensional, and vault modeling.
- Familiarity with enterprise data domains such as Sales, Finance, Procurement, Supply Chain, Logistics, and R&D.
- Proficiency in Erwin or similar data modeling tools; understanding of OLTP and OLAP systems.
- Knowledge of the Kimball methodology, Medallion architecture, and modern Data Lakehouse patterns, including Bronze, Silver, and Gold layer architecture on cloud platforms.
- Ability to read existing data dictionaries and table structures and to normalize data tables effectively.
- Familiarity with cloud data platforms (AWS, Azure, GCP), DevOps/DataOps best practices, Agile methodologies, and end-to-end integration needs and methods.

Preferred Experience:
- Background in the Retail, CPG, or Supply Chain domains.
- Experience with data governance frameworks, quality tools, and metadata management platforms.

Skills: FTP/SFTP, physical data models, DevOps, data observability, cloud platforms, APIs, data lakehouse, vault modeling, dimensional modeling.

In summary, as a Data Modeller/Data Modeler you will be a key player in designing and implementing data solutions that drive business success across domains, collaborating with diverse teams to achieve strategic objectives.
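To make the Kimball and Medallion terminology above concrete, here is a minimal, hypothetical sketch of a star schema at a declared grain, of the kind such a role might design for a Gold layer. All table and column names (fact_sales, dim_customer, and so on) are illustrative assumptions, not details from the posting.

```python
# Hypothetical Kimball-style star schema for a Sales domain.
# Grain declaration: one row in fact_sales = one order line item.

DIM_CUSTOMER = """
CREATE TABLE dim_customer (
    customer_key   INTEGER PRIMARY KEY,  -- surrogate key
    customer_id    VARCHAR(20),          -- natural key from the source OLTP system
    customer_name  VARCHAR(100),
    segment        VARCHAR(50),
    valid_from     DATE,                 -- Type-2 SCD tracking columns
    valid_to       DATE,
    is_current     BOOLEAN
);
"""

FACT_SALES = """
CREATE TABLE fact_sales (
    order_line_key  INTEGER PRIMARY KEY,
    date_key        INTEGER REFERENCES dim_date (date_key),
    customer_key    INTEGER REFERENCES dim_customer (customer_key),
    product_key     INTEGER REFERENCES dim_product (product_key),
    quantity        INTEGER,         -- additive measure
    net_amount      DECIMAL(12, 2)   -- additive measure
);
"""

if __name__ == "__main__":
    # In a real pipeline these statements would be applied to the Gold
    # layer of the warehouse; here we simply print them for review.
    print(DIM_CUSTOMER)
    print(FACT_SALES)
```

Declaring the grain first and conforming dimensions to it is the core Kimball discipline the posting alludes to.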

Posted 1 week ago


5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You will be joining Beyond Key, a Microsoft Gold Partner and a Great Place to Work-certified company that prioritizes the happiness of both team members and clients. Established in 2005, Beyond Key is an international IT consulting and software services firm delivering cutting-edge services and products to clients across the United States, Canada, Europe, Australia, the Middle East, and India. With a team of 350+ skilled software professionals, Beyond Key designs and builds IT solutions tailored to its clients' requirements. For more information, visit https://www.beyondkey.com/about.

As a Snowflake DevOps Engineer within the BI TEC team, your primary responsibility will be to support and enhance a multi-region Snowflake data warehouse infrastructure. The role involves developing and maintaining robust CI/CD pipelines using tools such as GitHub, Git Actions, Python, TeamCity, and SDA. Proficiency in Control-M for batch scheduling and a solid background in data warehousing are crucial. Collaboration with cross-functional technical teams and a proactive delivery approach are essential. While experience in the Broker Dealer domain is advantageous, a proven track record of managing large-scale data warehouse projects will also be highly valued.

Key Responsibilities:
- Develop and maintain CI/CD pipelines for Snowflake.
- Collaborate with different teams to improve deployment and automation processes.
- Manage batch scheduling using Control-M.
- Ensure quality and security compliance, including conducting Veracode scan reviews.
- Contribute to data warehouse design following Kimball methodologies.
- Translate technical concepts into easily understandable language for business purposes.
- Provide support for production reporting and be available for on-call support when necessary.

Required Skills & Experience:
- Minimum 5 years of experience in Snowflake CI/CD.
- Minimum 5 years of Python development experience.
- Proficiency in GitHub, Git Actions, TeamCity, and SDA.
- Strong understanding of data warehousing and the Kimball methodology.
- Experience with Control-M for batch processing and job scheduling.
- Familiarity with Veracode or similar security scanning tools.
- Experience working in large-scale database development teams.
- Knowledge of the Capital Markets or Broker Dealer domain (preferred).
- Oracle PL/SQL experience is a plus.

If you are seeking a role where you can contribute to innovative data solutions and collaborate with a dynamic team, this opportunity at Beyond Key may be perfect for you. Explore all our job openings and share this opportunity with someone exceptional.
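By way of illustration, here is a minimal sketch of one deployment step such a Snowflake CI/CD pipeline might run, using the snowflake-connector-python package. The environment variable names and the migrations/ directory layout are assumptions for the example, not details from the posting.

```python
"""Apply versioned SQL migration scripts to Snowflake.

A sketch of a step a CI job (e.g. TeamCity or Git Actions) might
invoke; variable names and directory layout are illustrative."""
import os
from pathlib import Path

import snowflake.connector  # pip install snowflake-connector-python


def apply_migrations(migrations_dir: str = "migrations") -> None:
    # Credentials are injected by the CI system, never hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "DEPLOY_WH"),
        database=os.environ.get("SNOWFLAKE_DATABASE", "ANALYTICS"),
    )
    try:
        # Apply versioned scripts in lexical order (001_..., 002_...).
        for script in sorted(Path(migrations_dir).glob("*.sql")):
            print(f"Applying {script.name}")
            # execute_string runs scripts containing multiple statements.
            conn.execute_string(script.read_text())
    finally:
        conn.close()


if __name__ == "__main__":
    apply_migrations()
```

A Control-M job or CI stage would typically wrap a script like this, with the Veracode scan and code review gates running before it is allowed to touch production.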

Posted 1 month ago


10.0 - 14.0 years

0 Lacs

Karnataka

On-site

The Lead Data Modeler develops high-performance, scalable enterprise data models on a cloud platform. You must have strong SQL skills, excellent data modeling expertise, and a solid grounding in the Kimball methodology. You will participate in activities throughout the systems development lifecycle, including support activities, POCs, and presenting outcomes effectively. You will also analyze, architect, design, program, and debug both existing and new products, and mentor team members. Taking ownership and demonstrating high professional and technical ethics, with a consistent focus on emerging technologies that benefit the organization, is essential. Over 10 years of experience in data modeling or engineering is required.

Key Responsibilities:
- Define, design, and implement enterprise data models.
- Build Kimball-compliant data models in the Analytic layer of the data warehouse, and 3rd-normal-form-compliant data models in the hub layer.
- Translate tactical and strategic requirements into effective solutions that align with business needs.
- Participate in complex initiatives, seek help when necessary, review specifications, coach team members, and research improvements to coding standards.

Technical Skills:
- Hands-on experience with SQL, query optimization, and RDBMS.
- Data warehouse experience covering ER and dimensional modeling, including modeling data into star schemas using the Kimball methodology.
- Agile methodology, CI/CD frameworks, and DevOps practices.
- Experience working in an onsite-offshore model.

Soft Skills:
- Leadership, analytical thinking, problem-solving, communication, and presentation skills.
- Ability to work with a diverse team, make decisions, guide team members through complex problems, and communicate effectively with leadership and business teams.

Education:
- Bachelor's degree in Computer Science, Information Systems, or a related technical area; B.E. in Computer Science/Information Technology preferred.

Nice-to-Have Skills:
- Apache Spark (Python) and graph databases.
- Data identification, ingestion, transformation, and consumption; data visualization.
- Familiarity with SAP Enterprise S/4 HANA.
- Programming skills in Python, NodeJs, or Unix scripting.
- Experience in the GCP Cloud Ecosystem.
- Software engineering experience across all deliverables: defining, architecting, building, testing, and deploying.

The role does not offer relocation assistance and does not specify a particular work shift.
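To illustrate the hub-layer versus Analytic-layer distinction this posting draws, here is a hedged sketch contrasting a normalized 3NF model with the denormalized Kimball dimension derived from it. All table and column names (hub_customer, dim_customer, and so on) are hypothetical.

```python
# Hypothetical contrast between a 3NF hub-layer model and the
# denormalized Kimball dimension built from it in the Analytic layer.

# Hub layer: 3rd normal form -- each fact stored once, no redundancy.
HUB_LAYER = """
CREATE TABLE hub_customer (
    customer_id   VARCHAR(20) PRIMARY KEY,
    customer_name VARCHAR(100)
);

CREATE TABLE hub_customer_address (
    address_id    INTEGER PRIMARY KEY,
    customer_id   VARCHAR(20) REFERENCES hub_customer (customer_id),
    city          VARCHAR(60),
    country       VARCHAR(60)
);
"""

# Analytic layer: one wide dimension built by joining the hub tables,
# keyed by a surrogate key so fact tables can join on a single column.
DIM_BUILD = """
CREATE TABLE dim_customer AS
SELECT
    ROW_NUMBER() OVER (ORDER BY c.customer_id) AS customer_key,
    c.customer_id,
    c.customer_name,
    a.city,
    a.country
FROM hub_customer AS c
LEFT JOIN hub_customer_address AS a
       ON a.customer_id = c.customer_id;
"""

if __name__ == "__main__":
    print(HUB_LAYER)
    print(DIM_BUILD)
```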

Posted 1 month ago
