
8 Snowflake Schemas Jobs

Set up a Job Alert

JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

8.0 - 13.0 years

0 Lacs

Karnataka

On-site

You will be joining MRI Software as a Data Engineering Leader responsible for designing, building, and managing data integration solutions. Your expertise in Azure Data Factory, Azure Synapse Analytics, and data warehousing will be crucial for leading technical implementations, mentoring junior developers, collaborating with global teams, and engaging with customers and stakeholders to ensure seamless, scalable data integration.

Your key responsibilities will include leading and mentoring a team of data engineers, designing and implementing Azure Synapse Analytics solutions, optimizing ETL pipelines and Synapse Spark workloads, and ensuring data quality, security, and governance best practices. You will also collaborate with business stakeholders to develop data-driven solutions.

To excel in this role, you should have 8-13 years of experience in Data Engineering, BI, or Cloud Analytics, with expertise in Azure Synapse, Data Factory, SQL, and ETL processes. Strong leadership, problem-solving, and stakeholder management skills are essential, and knowledge of Power BI, Python, or Spark would be advantageous. Deep knowledge of data modelling techniques, ETL pipeline development, Azure resource cost management, and data governance practices will also be key to your success. You should be proficient in writing complex SQL queries, implementing security best practices for Azure components, and have experience in master data and metadata management. Your ability to manage a complex business environment, lead and support team members, and advocate for Agile practices will be highly valued, as will experience in change management, data warehouse architecture, dimensional modelling, and data integrity validation.

Collaboration with Product Owners and data engineers to translate business requirements into effective dimensional models, strong SQL skills, and the ability to extract, clean, and transform raw data for dimensional modelling are essential aspects of this role. Desired skills include Python, real-time data streaming frameworks, and AI and machine learning data pipelines. A degree in Computer Science, Software Engineering, or a related field is required.

In return, you can look forward to learning leading technical and industry standards, hybrid working arrangements, an annual performance-related bonus, and other benefits that foster an engaging, fun, and inclusive culture at MRI Software. MRI Software is a global PropTech leader dedicated to empowering real estate companies with innovative applications and hosted solutions. With a flexible technology platform and a connected ecosystem, MRI Software caters to the unique needs of real estate businesses worldwide. Operating across multiple regions, MRI Software boasts nearly five decades of expertise, a diverse team of over 4,000 professionals, and a commitment to Equal Employment Opportunity.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kolkata, West Bengal

On-site

As a Data Modeler specializing in hybrid data environments, you will play a crucial role in designing, developing, and optimizing data models that support enterprise-level analytics, insight generation, and operational reporting. You will collaborate with business analysts and stakeholders to understand business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models.

Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity, and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process.

Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and an understanding of Azure cloud services, Databricks, and big data technologies are essential. Your ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role.

In summary, as a Data Modeler for hybrid data environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs and leverage new data modeling techniques and cloud capabilities.
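The data profiling mentioned in this listing (counting rows, nulls, and distinct values per column to understand sources before modeling) can be sketched in plain Python. The column and table names here are purely illustrative, not taken from the role:

```python
from collections import Counter

def profile(rows, columns):
    """Return per-column row count, null count, and distinct-value count."""
    stats = {c: {"rows": 0, "nulls": 0, "values": Counter()} for c in columns}
    for row in rows:
        for col in columns:
            s = stats[col]
            s["rows"] += 1
            value = row.get(col)
            if value is None:
                s["nulls"] += 1
            else:
                s["values"][value] += 1
    # Collapse the value counters into a distinct count for the report.
    return {c: {"rows": s["rows"], "nulls": s["nulls"],
                "distinct": len(s["values"])} for c, s in stats.items()}

# Hypothetical source rows being profiled before modeling.
customers = [
    {"id": 1, "region": "South", "segment": "SMB"},
    {"id": 2, "region": "South", "segment": None},
    {"id": 3, "region": "North", "segment": "Enterprise"},
]
report = profile(customers, ["id", "region", "segment"])
# e.g. report["segment"] shows one null, guiding how the model handles it
```

A real profiling pass would also capture min/max values and type distributions, but the shape of the output (one summary per column) is the same.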

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Database / ETL / Power BI Developer, you will leverage your 4 to 6 years of total IT experience, including a minimum of 4 years of relevant experience, to excel in this role. Your strong interpersonal skills and ability to manage multiple tasks with enthusiasm will be key assets as you interact with clients to understand their requirements. Staying up to date with the latest best practices and advancements in Power BI is essential, and an analytical, problem-solving mindset will be crucial in tackling the challenges that come your way.

In this role, your responsibilities will include:

- Designing and deploying database schemas using data modeling techniques.
- Demonstrating strong SQL skills and proficiency in SQL performance tuning, including creating T-SQL objects, scripts, views, and stored procedures.
- Developing and maintaining ETL platforms/applications (SSIS).
- Diagnosing and resolving database access and performance issues through query optimization and indexing.
- Understanding business requirements in a BI context and designing data models to transform raw data into actionable insights.
- Utilizing your expertise in Power BI (Power BI Service, Power BI Report Server) to provide data modeling recommendations.
- Working with DirectQuery and Import mode, implementing static and dynamic row-level security, and integrating Power BI reports into external web applications.
- Setting up data gateways and data preparation, and creating dashboards and interactive visual reports using Power BI.
- Utilizing third-party custom visuals such as Zebra BI, demonstrating proficiency in various DAX functions, and handling star and snowflake schemas in the DWH.
- Creating and maintaining technical documentation.

It would be advantageous to have experience in Power Automate, knowledge of C# and .NET, familiarity with Tabular Models, and experience in JSON data handling. Your commitment to continuous learning and staying abreast of new technologies will be beneficial in this dynamic role.
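The star schema this listing asks for (a central fact table joined to dimension tables) can be sketched with Python's built-in sqlite3 module. The table and column names are invented for illustration; a production DWH would use surrogate-key pipelines and indexes on the foreign keys:

```python
import sqlite3

# Minimal star schema: fact_sales references two dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE dim_date    (date_key    INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date    VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_sales  VALUES
    (1, 20240101, 100.0), (1, 20250101, 150.0), (2, 20250101, 75.0);
""")

# Typical BI query shape: aggregate the fact, group by dimension attributes.
rows = con.execute("""
    SELECT d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year
    ORDER BY d.year
""").fetchall()
```

A snowflake schema differs only in that a dimension (say, product) is itself normalized into further lookup tables, trading wider joins for less redundancy.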

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer at our company, you will be responsible for designing and implementing Azure Synapse Analytics solutions for data processing and reporting. Your role will involve optimizing ETL pipelines, SQL pools, and Synapse Spark workloads to ensure efficient data processing. It will also be crucial for you to uphold data quality, security, and governance best practices while collaborating with business stakeholders to develop data-driven solutions. Additionally, part of your responsibilities will include mentoring a team of data engineers.

To excel in this role, you should have 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes is essential; experience with Fabric is strongly desirable, and strong leadership, problem-solving, and stakeholder management skills are crucial. Knowledge of Power BI, Python, or Spark would be a plus. You should also have deep knowledge of data modelling techniques, design and development of ETL pipelines, Azure resource cost management, and proficiency in writing complex SQL queries.

Furthermore, you are expected to have knowledge and experience in master data and metadata management, including data governance, data quality, data catalogs, and data security. Your ability to manage a complex and rapidly evolving business and to actively lead, develop, and support team members will be key. As an Agile practitioner and advocate, you must be highly dynamic in your approach, adapting to constant changes in risks and forecasts. Your role will involve ensuring data integrity within the dimensional model by validating data and identifying inconsistencies, and you will work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

This position offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and receive competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. Ideally, you should hold a Bachelor's or Master's degree in Software Engineering, Computer Science, or a related area.

Our company offers a range of benefits, including hybrid working arrangements, an annual performance-related bonus, Flexi any days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture. MRI Software is dedicated to delivering innovative applications and hosted solutions that empower real estate companies to elevate their business. With a strong focus on meeting the unique needs of real estate businesses globally, we have grown to include offices across various countries, with over 4,000 team members supporting our clients. MRI is proud to be an Equal Employment Opportunity employer.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Punjab

On-site

You are an experienced and results-driven ETL & DWH Engineer / Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role calls for deep technical knowledge of ETL tools, strong data modeling skills, and the capability to lead intricate data engineering projects from inception to implementation.

Your key skills include:

- More than 4 years with ETL tools such as SSIS, Informatica, DataStage, or Talend.
- Proficiency in relational databases like SQL Server and MySQL.
- Comprehensive understanding of Data Mart/EDW methodologies.
- Designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Familiarity with reporting and analytics tools like Tableau and Power BI.
- Proficiency in scripting and programming using Python.
- Knowledge of cloud platforms like AWS or Azure.
- Leading recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Working with cross-functional and geographically distributed teams.
- Translating complex data issues into actionable insights.
- Strong communication and client management abilities.
- An initiative-driven, collaborative approach with a problem-solving mindset.

Your roles and responsibilities will include:

- Creating high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components while ensuring compliance with standards and best practices.
- Ensuring delivery quality and timeliness for one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical guidance and support to junior team members for problem-solving.
- Leading QA processes for deliverables and validating progress against project timelines.
- Managing issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing for data engineering efforts.
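The data reconciliation duty above (checking that what was loaded matches what was extracted) usually reduces to comparing row counts, key sets, and summed measures between source and target. A minimal sketch, with invented field names:

```python
def reconcile(source_rows, target_rows, key, measure):
    """Compare row counts, a summed measure, and key coverage between
    source and target; return a list of human-readable discrepancies."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_sum = sum(r[measure] for r in source_rows)
    tgt_sum = sum(r[measure] for r in target_rows)
    if abs(src_sum - tgt_sum) > 1e-9:
        issues.append(f"{measure} total mismatch: {src_sum} vs {tgt_sum}")
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        issues.append(f"keys missing in target: {sorted(missing)}")
    return issues

# Hypothetical run where the load dropped one row.
source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
target = [{"id": 1, "amount": 10.0}]
issues = reconcile(source, target, key="id", measure="amount")
```

In practice the same checks are run as SQL against both systems; the structure (counts, totals, key anti-join) is identical.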

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

The responsibilities of the role involve designing and implementing Azure Synapse Analytics solutions for data processing and reporting. You will be required to optimize ETL pipelines, SQL pools, and Synapse Spark workloads while ensuring data quality, security, and governance best practices are followed. Collaborating with business stakeholders to develop data-driven solutions and mentoring a team of data engineers are key aspects of this position.

To excel in this role, you should possess 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes is essential, and experience with Fabric is strongly desirable. Strong leadership, problem-solving, and stakeholder management skills are required, and knowledge of Power BI, Python, or Spark is a plus. A deep understanding of data modelling techniques, design and development of ETL pipelines, Azure resource cost management, and writing complex SQL queries are important competencies, as is familiarity with authorization and security best practices for Azure components, master data/metadata management, and data governance.

Being able to manage a complex and rapidly evolving business and to actively lead, develop, and support team members is vital, as are an Agile mindset and the ability to adapt to constant changes in risks and forecasts. Thorough knowledge of data warehouse architecture, principles, and best practices is necessary, including expertise in designing star and snowflake schemas, identifying facts and dimensions, and selecting appropriate granularity levels. Ensuring data integrity within the dimensional model by validating data and identifying inconsistencies is part of the role, and you will work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

Joining MRI Software offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and access competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. The ideal candidate holds a Bachelor's or Master's degree in Software Engineering, Computer Science, or a related area. The benefits of this position include hybrid working arrangements, an annual performance-related bonus, 6x Flexi any days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture at MRI Software.

MRI Software delivers innovative applications and hosted solutions that empower real estate companies to enhance their business. With a flexible technology platform and an open and connected ecosystem, we cater to the unique needs of real estate businesses globally. With offices across various countries and a diverse team, we provide expertise and insight to support our clients effectively. MRI Software is proud to be an Equal Employment Opportunity employer.
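"Ensuring data integrity within the dimensional model" typically means checking that every foreign key in the fact table resolves to a row in its dimension; unmatched ("orphan") fact rows indicate a broken load. A minimal sketch with illustrative key names:

```python
def find_orphans(fact_rows, dim_keys_by_column):
    """Return (row, column) pairs where a fact row's foreign key
    has no matching key in the corresponding dimension."""
    orphans = []
    for row in fact_rows:
        for fk_col, dim_keys in dim_keys_by_column.items():
            if row[fk_col] not in dim_keys:
                orphans.append((row, fk_col))
    return orphans

# Hypothetical dimension key sets and fact rows.
dim_customer_keys = {1, 2, 3}
dim_product_keys = {10, 11}
facts = [
    {"customer_key": 1, "product_key": 10, "amount": 5.0},
    {"customer_key": 4, "product_key": 11, "amount": 7.0},  # unknown customer
]
orphans = find_orphans(facts, {"customer_key": dim_customer_keys,
                               "product_key": dim_product_keys})
```

In a warehouse this is usually an anti-join (`LEFT JOIN ... WHERE dim.key IS NULL`); the Python version shows the same logic at a glance.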

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:

- Design, develop, test, and deploy ETL workflows and mappings using Informatica PowerCenter.
- Perform data extraction, transformation, and loading from various sources (RDBMS, flat files, XML, cloud sources).
- Optimize ETL performance through pushdown optimization, partitioning, and tuning transformations.
- Collaborate with data architects, analysts, and business users to understand requirements and translate them into ETL solutions.
- Implement error handling, logging, and data validation processes.
- Maintain documentation for ETL design, mappings, workflows, and data dictionaries.
- Troubleshoot data issues, resolve production problems, and support data reconciliation.
- Integrate with relational databases such as Oracle, SQL Server, and PostgreSQL, as well as cloud databases.
- Work with job schedulers like Control-M, Tivoli, or Autosys to manage ETL job execution.

Qualifications & Skills:

Essential:

- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- 3+ years of hands-on experience with Informatica PowerCenter development.
- Proficiency in ETL design principles, data cleansing, and transformation logic.
- Strong SQL skills for querying, data analysis, and performance tuning.
- Experience with data warehousing concepts, star/snowflake schemas, and dimensional modeling.

Desirable:

- Exposure to Informatica Cloud, IDQ (Data Quality), or MDM.
- Experience with Unix/Linux shell scripting and automation.
- Familiarity with data governance, data lineage, and metadata management.
- Knowledge of cloud platforms (AWS, Azure, GCP) and cloud data sources (e.g., S3, Redshift, BigQuery).
- Basic understanding of Agile methodologies and tools like JIRA.
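The "error handling, logging, and data validation" responsibility maps in any ETL tool to the same pattern: parse each source row, route valid rows onward, and divert bad rows to a reject stream with a logged reason. A language-neutral sketch in Python (the column names and sample data are invented):

```python
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Hypothetical flat-file source; row 1002 has a bad numeric field.
RAW = """order_id,amount
1001,250.50
1002,not-a-number
1003,99.00
"""

def extract(text):
    """Extract: read delimited text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast fields, routing failures to a reject stream."""
    good, rejects = [], []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
            good.append(row)
        except ValueError:
            rejects.append(row)
            log.warning("rejected row %s: bad amount", row["order_id"])
    return good, rejects

good, rejects = transform(extract(RAW))
# good rows would then be loaded; rejects go to an error table for review
```

In PowerCenter the same split is expressed with reusable error-handling transformations and session logs rather than hand-written code, but the flow is identical.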

Posted 4 weeks ago

Apply

9.0 - 14.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & Responsibilities:

- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models in databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity-relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives, including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.

We are looking for a Data Modeler/Architect to design conceptual, logical, and physical data models and translate business needs into scalable models for analytics and operational systems. Candidates should be strong in normalization, denormalization, ERDs, and data governance practices; experience with star/snowflake schemas and medallion architecture is preferred. The role requires close collaboration with architects, engineers, and analysts.

Key skills: data modelling, normalization, denormalization, star and snowflake schemas, medallion architecture, ERDs, logical, physical, and conceptual data models.
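The normalization guidance this role provides boils down to removing repeated attribute groups from a flat table by factoring them into an entity table referenced by key. A small sketch with invented sample data:

```python
# Hypothetical denormalized rows: customer attributes repeat on every order.
orders_flat = [
    {"order_id": 1, "customer": "Acme", "customer_city": "Pune",   "total": 100},
    {"order_id": 2, "customer": "Acme", "customer_city": "Pune",   "total": 250},
    {"order_id": 3, "customer": "Zen",  "customer_city": "Mumbai", "total": 75},
]

def normalize(rows):
    """Factor repeated customer attributes into an entity table and
    rewrite orders to reference customers by surrogate key (3NF-style)."""
    customers, orders = {}, []
    for row in rows:
        ident = (row["customer"], row["customer_city"])
        # setdefault assigns the next surrogate key on first sight.
        key = customers.setdefault(ident, len(customers) + 1)
        orders.append({"order_id": row["order_id"],
                       "customer_key": key,
                       "total": row["total"]})
    customer_table = [{"customer_key": k, "name": n, "city": c}
                      for (n, c), k in customers.items()]
    return customer_table, orders

customer_table, order_table = normalize(orders_flat)
```

Denormalization for a star schema is the reverse trade: the modeler deliberately folds such attributes back into a wide dimension so analytical queries need fewer joins.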

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
