86 Snowflake Schema Jobs - Page 3

Set up a Job Alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 7.0 years

0 Lacs

India

On-site

Job Summary: We are seeking a skilled Power BI Developer with strong expertise in Oracle E-Business Suite (EBS) to design, develop, and deliver insightful dashboards and reports. The ideal candidate will have a deep understanding of Oracle EBS modules (e.g., Financials, SCM, HCM), data structures, and experience integrating EBS data into Power BI for business reporting and analytics. Key Responsibilities: Develop, publish, and maintain interactive Power BI dashboards and reports. Extract and transform data from Oracle EBS for business intelligence purposes. Collaborate with business stakeholders to gather requirements and translate them into data models and visualizations. Optimize DAX queri...

Posted 2 months ago

AI Match Score
Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title - Data Modeler. Proficiency in SQL and experience with relational databases (e.g., SQL Server) and their modeling techniques. Minimum 3 years of experience with data modeling to support business processes. Experience with data modeling tools such as DbSchema, Erwin, etc. Strong understanding of data warehousing principles (including star and snowflake schemas) and ETL/ELT processes. Familiarity with data governance and data quality principles. Strong communication and collaboration skills.

Posted 2 months ago

AI Match Score
Apply

6.0 - 11.0 years

25 - 27 Lacs

Indore, Hyderabad, Chennai

Work from Office

Hiring for Discovery Use Case Specialist (Hyderabad/Chennai/Indore – Hybrid) with strong expertise in Snowflake, Power BI, WhereScape RED & 3D, Data Vault 2.0, and data modeling. Translate business needs into scalable data solutions. Required Candidate profile Experienced data professional with 5+ yrs in Snowflake, Power BI, WhereScape RED/3D, and Data Vault 2.0 modeling, skilled in data architecture, SQL, and cross-functional collaboration.

Posted 2 months ago

AI Match Score
Apply

0.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Responsibilities ETL Development: Build and maintain ETL/ELT pipelines using Databricks & Apache Spark (PySpark/Scala). Data Warehousing: Design and implement Star Schema, Snowflake Schema, and Data Vault models. Databricks Optimization: Optimize Delta Lake storage, caching, and job execution to improve query performance. Data Ingestion & Processing: Develop scalable ingestion pipelines using Structured Streaming, Auto Loader, and Databricks Workflows. Data Governance & Security: Implement Unity Catalog, role-based access controls (RBAC), and compliance standards. Performance Tuning...

Posted 2 months ago

AI Match Score
Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Role Overview: As a Business Intelligence Developer at Liebherr CMCtec India Private Limited, your primary responsibility will be to collaborate with cross-functional teams to design and implement robust data solutions and architectures. You will be instrumental in implementing and optimizing Star and Snowflake Schema designs in Data Warehousing for effective data modeling. Your role will also involve developing and optimizing complex SQL queries, implementing stored procedures within SQL Server, administering and maintaining Microsoft SQL Server databases, and designing, creating, and managing Views for simplified data access and enhanced data security. Additionally, you will be responsible...

Posted 2 months ago

AI Match Score
Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data- and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Cu...

Posted 2 months ago

AI Match Score
Apply

4.0 - 7.0 years

10 - 14 Lacs

Gurugram, Bengaluru

Hybrid

Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines and data models using Snowflake and Python. Write efficient SQL queries, stored procedures, and scripts for data extraction, transformation, and loading. Develop Python-based applications or scripts for automation, data ingestion, and integration with Snowflake. Implement data quality checks, performance tuning, and cost optimization within the Snowflake environment. Integrate Snowflake with cloud platforms (AWS, Azure, or GCP) and third-party data sources. Collaborate with data engineers, analysts, and business teams to understand data requirements and deliver solutions. Monitor, troubleshoot, and optimize data pipelines ...

Posted 2 months ago

AI Match Score
Apply

10.0 - 14.0 years

35 - 45 Lacs

Bengaluru

Remote

Role & responsibilities: Implementing data management solutions. Certified Snowflake cloud data warehouse Architect. Deep understanding of star and snowflake schemas and dimensional modelling. Experience in the design and implementation of data pipelines and ETL processes using Snowflake. Optimize data models for performance and scalability. Collaborate with various technical and business stakeholders to define data requirements. Ensure data quality and governance best practices are followed. Experience with data security and data access controls in Snowflake. Expertise in complex SQL, Python scripting, and performance tuning. Expertise in Snowflake concepts like setting up Resource monitors, RBA...

Posted 2 months ago

AI Match Score
Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

The OBIEE developer/engineer is in charge of developing or modifying OBIEE reports according to the technical specifications in order to bring the expected added value to the Business. Support activities for the Production and UAT environments are also part of the position (job monitoring, issue solving, etc.). Responsibilities Direct Responsibilities - Participate in all the Agile ceremonies of the squad (dailies, sprint plannings, backlog refinements, reviews, etc.) - Communicate as soon as possible on blocking points - Estimate and develop, according to company standards, the functionalities described in the Jira requirements (changes, bug fixes, etc.) - Do the unit tests of the code developed in order to delive...

Posted 2 months ago

AI Match Score
Apply

6.0 - 8.0 years

0 Lacs

India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive change in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking to hire SQL Professionals in the following areas. Job description: Skill Set: SQL, Snowflake, SnapLogic ETL Tool. JD: 6+ years of IT experience in A...

Posted 2 months ago

AI Match Score
Apply

3.0 - 5.0 years

5 - 8 Lacs

Noida

Work from Office

Primary Skills SQL (Advanced Level) SSAS (SQL Server Analysis Services) Multidimensional and/or Tabular Model MDX / DAX (strong querying capabilities) Data Modeling (Star Schema, Snowflake Schema) Secondary Skills ETL processes (SSIS or similar tools) Power BI / Reporting tools Azure Data Services (optional but a plus) Role & Responsibilities Design, develop, and deploy SSAS models (both tabular and multidimensional). Write and optimize MDX/DAX queries for complex business logic. Work closely with business analysts and stakeholders to translate requirements into robust data models. Design and implement ETL pipelines for data integration. Build reporting datasets and support BI teams in devel...

Posted 3 months ago

AI Match Score
Apply

4.0 - 9.0 years

7 - 16 Lacs

Mumbai

Work from Office

Job Title: BI Developer Location: Mumbai, India Department: Data and AI Reports To: SVP-Data and AI About NeuIQ NeuIQ is a new-age technology services firm specializing in solving enterprise business transformation and experience challenges through cutting-edge, AI-powered data and technology solutions. At the core of our vision is building a scalable and profitable technology implementation business with data engineering as its foundation. Our expertise lies in implementing enterprise SaaS platforms such as Qualtrics, ServiceNow, Snowflake, and Databricks, enabling organizations to unlock actionable insights and maximize the value of their AI and technology investments. With innovation, int...

Posted 3 months ago

AI Match Score
Apply

5.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Hybrid

Position: Senior Software Developer-Analytics *** JOB DESCRIPTION *** Overview: The Senior Software Developer will work closely with product manager, Implementation Consultants (ICs) and clients to gather requirements to meet the data analysis need of a company or a client. They must have good collaboration skills. The Senior Software Developer will provide direction on analytics aspects to the team on various analytics related activities. Key Tasks & Responsibilities: Experienced in Qlik Sense Architecture design and good knowledge on load script implementation and best practices. Hands on experience in Qlik Sense development, dashboarding, data-modelling and reporting techniques. Experienc...

Posted 3 months ago

AI Match Score
Apply

10.0 - 15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Introduction SymphonyAI is a global leader in AI-driven enterprise applications, transforming industries with cutting-edge artificial intelligence and machine learning solutions. We empower organizations across retail, CPG, financial services, manufacturing, media, enterprise IT and the public sector by delivering data-driven insights that drive business value. Headquartered in Palo Alto, California, SymphonyAI has a wide range of products and a strong global presence, with operations in North America, Southeast Asia, the Middle East, and India. The company is dedicated to fostering a high-performance culture and maintaining its position as one of the largest and fastest-growing AI portfolio...

Posted 3 months ago

AI Match Score
Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job description Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to...

Posted 3 months ago

AI Match Score
Apply

4.0 - 7.0 years

5 - 7 Lacs

Chennai

Work from Office

Job Title: Developer. Work Location: Chennai. Skills Required: SAP Business Objects Data Services. Experience Range Required: 6-8 Years. Interested candidates, share your CV to Pravallika.m@systechcorp.in, and kindly mention your current and expected CTC and your notice period in the mail along with your CV. Job Description: SAP BODS Developer, SAP Business Objects Data Services. Strong SAP BODS development experience. Strong understanding of dimensional modelling and Star/Snowflake schemas. Implementation experience of various SCD types using BODS flows. Works on building Change Data Capture or computation of delta by various methods. Works on building Change Data Capture/Delta logic using Hash MD5 Al...

Posted 3 months ago

AI Match Score
Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow - BI Tools: Tableau, Power BI, Looker - Languages: Python, SQL, Java, Scala - ETL Tools: Apache Nifi, Talend, Informatica, Dataform - Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage) - Data Modeling: Kimball, Star Schema, Snowflake Schema - Version Control: Git, GitLab

Posted 3 months ago

AI Match Score
Apply

6.0 - 9.0 years

8 - 11 Lacs

Navi Mumbai

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow - BI Tools: Tableau, Power BI, Looker - Languages: Python, SQL, Java, Scala - ETL Tools: Apache Nifi, Talend, Informatica, Dataform - Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage) - Data Modeling: Kimball, Star Schema, Snowflake Schema - Version Control: Git, GitLab

Posted 3 months ago

AI Match Score
Apply

6.0 - 9.0 years

8 - 11 Lacs

Mumbai

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow - BI Tools: Tableau, Power BI, Looker - Languages: Python, SQL, Java, Scala - ETL Tools: Apache Nifi, Talend, Informatica, Dataform - Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage) - Data Modeling: Kimball, Star Schema, Snowflake Schema - Version Control: Git, GitLab

Posted 3 months ago

AI Match Score
Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad, Gachibowli

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow - BI Tools: Tableau, Power BI, Looker - Languages: Python, SQL, Java, Scala - ETL Tools: Apache Nifi, Talend, Informatica, Dataform - Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage) - Data Modeling: Kimball, Star Schema, Snowflake Schema - Version Control: Git, GitLab

Posted 3 months ago

AI Match Score
Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad, Hitech City

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow - BI Tools: Tableau, Power BI, Looker - Languages: Python, SQL, Java, Scala - ETL Tools: Apache Nifi, Talend, Informatica, Dataform - Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage) - Data Modeling: Kimball, Star Schema, Snowflake Schema - Version Control: Git, GitLab

Posted 3 months ago

AI Match Score
Apply

6.0 - 9.0 years

8 - 11 Lacs

Mumbai Suburban

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow - BI Tools: Tableau, Power BI, Looker - Languages: Python, SQL, Java, Scala - ETL Tools: Apache Nifi, Talend, Informatica, Dataform - Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage) - Data Modeling: Kimball, Star Schema, Snowflake Schema - Version Control: Git, GitLab

Posted 3 months ago

AI Match Score
Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Engineer (Snowflake) at our new-age, AI-first Digital & Cloud Engineering Services company, you will have the opportunity to play a crucial role in building and scaling our data infrastructure. Your primary responsibility will be to design, develop, and maintain efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes. By collaborating with stakeholders, you will translate data requirements into efficient data models and pipelines to facilitate data-driven decision-making across the organization. Your key responsibilities will include designing, developing, and maintaining robust and scalable data pipelines fo...

Posted 3 months ago

AI Match Score
Apply

8.0 - 10.0 years

0 Lacs

Chandigarh, India

On-site

JOB DESCRIPTION Job Summary: If you are looking for an opportunity in Technology Solutions and Development, Emerson has this exciting role for you! The Senior ETL Developer - Oracle will be part of a team whose responsibility is to develop ETL programs and to improve the performance of poorly written or poorly performing application code in the Oracle Data Integrator tool. This includes existing code and new code that has not yet been promoted to production. This team delivers technology solutions for strategic business needs, drives adoption of these services and support processes, and boosts value by enhancing our customers' experience. This role works alongside a hardworking and de...

Posted 3 months ago

AI Match Score
Apply

5.0 - 6.0 years

8 - 10 Lacs

Bengaluru

Work from Office

We seek a professional to develop ETL pipelines with PySpark, Airflow, and Python, work with large datasets, write Oracle SQL queries, manage schemas, optimize performance, and maintain data warehouses, while guiding the team on scalable solutions.

Posted 4 months ago

AI Match Score
Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
