8 Snowflake Schema Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve working extensively with PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle. In this position, it is essential to have experience with data modeling and warehousing concepts such as star schema, snowflake schema, OLAP, OLTP, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and big data concepts will be beneficial. If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
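
For readers new to the modeling terms this role leans on, here is a minimal, hypothetical sketch of a snowflake schema, written in Python with the standard-library sqlite3 module standing in for Oracle. The table and column names (fact_sales, dim_product, dim_category) are illustrative only and not taken from the posting; the point is that the product dimension is normalized into a separate category table, so reports need one extra join compared with a star schema.

```python
# Minimal snowflake-schema sketch (illustrative names, sqlite3 instead of Oracle).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    -- Snowflaked dimension: product points to a separate, normalized category table
    CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, category_name TEXT);
    CREATE TABLE dim_product  (product_id INTEGER PRIMARY KEY, product_name TEXT,
                               category_id INTEGER REFERENCES dim_category(category_id));
    -- Fact table holds the measurements plus foreign keys to dimensions
    CREATE TABLE fact_sales   (sale_id INTEGER PRIMARY KEY, product_id INTEGER,
                               quantity INTEGER, amount REAL);

    INSERT INTO dim_category VALUES (1, 'Beverages');
    INSERT INTO dim_product  VALUES (10, 'Green Tea', 1);
    INSERT INTO fact_sales   VALUES (100, 10, 3, 29.97), (101, 10, 1, 9.99);
""")

# Reporting query: two joins are needed because the dimension is snowflaked;
# in a star schema, category_name would sit directly on dim_product.
cur.execute("""
    SELECT c.category_name, SUM(f.amount) AS total_amount
    FROM fact_sales f
    JOIN dim_product  p ON p.product_id  = f.product_id
    JOIN dim_category c ON c.category_id = p.category_id
    GROUP BY c.category_name
""")
print(cur.fetchall())   # e.g. [('Beverages', 39.96)]
conn.close()
```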

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

We are inviting applications for the role of AM, SQL Developer. Responsibilities: Writing T-SQL queries using joins and subqueries in MS SQL Server. T-SQL development skills to write complex queries involving multiple tables and joins. Experience in extracting, transforming, and loading (ETL) data from an MS Access database to MS SQL Server. Creating and modifying tables, fields, and constraints in MS SQL Server. Creating and modifying stored procedures in MS SQL Server. Scheduling and modifying jobs for automation in MS SQL Server. Knowledge of creating views, triggers, and functions. Good or basic knowledge of handling MS Access databases. Knowledge of SQL Server Analysis Services (SSAS), Integration Services (SSIS), and Reporting Services (SSRS).

Qualifications we seek in you! Minimum qualifications: B.Tech/MBA/MSc/MCA. Design and develop QlikView dashboards. Knowledge of designing data models (star/snowflake schema). KPI design as per business requirements. Preferred qualifications: Data fetching from QVDs and creation of QVDs. Good knowledge of handling variables in QlikView. Basic knowledge of Section Access.

Why join Genpact? Lead AI-first transformation: build and scale AI solutions that redefine industries. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills. Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace. Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
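
As an aside, the kind of T-SQL this role describes (joins plus a correlated subquery) could be driven from Python roughly as sketched below. This is an assumption-laden illustration, not part of the posting: the pyodbc package, the ODBC driver string, the connection details, and all table and column names are placeholders.

```python
# Hedged sketch: running a T-SQL join + correlated subquery against MS SQL Server
# via pyodbc. Driver, server, database, and all object names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Join orders to customers and keep only orders above each customer's own average.
cur.execute("""
    SELECT c.CustomerName, o.OrderId, o.OrderTotal
    FROM dbo.Orders AS o
    JOIN dbo.Customers AS c ON c.CustomerId = o.CustomerId
    WHERE o.OrderTotal > (SELECT AVG(o2.OrderTotal)
                          FROM dbo.Orders AS o2
                          WHERE o2.CustomerId = o.CustomerId)
    ORDER BY c.CustomerName;
""")
for row in cur.fetchall():
    print(row.CustomerName, row.OrderId, row.OrderTotal)
conn.close()
```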

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 6 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibilities: Design, develop, and implement ETL processes using Informatica Power Center to extract, transform, and load data from various sources. Analyze business requirements and translate them into technical solutions. Optimize existing ETL jobs for performance, scalability, and reliability. Perform unit testing, integration testing, and support UAT phases. Ensure data accuracy, integrity, and consistency across all data sources and destinations. Create and maintain technical documentation and data flow diagrams. Collaborate with data architects, DBAs, business analysts, and QA teams. Troubleshoot production issues and provide timely resolutions. Key Skills Required: Proficiency in Informatica Power Center (v9.x or higher) Strong understanding of ETL best practices, data profiling, and performance tuning Experience working with relational databases (Oracle, SQL Server, Teradata, etc.) Solid SQL and PL/SQL skills for data validation and transformation logic Familiarity with data warehousing concepts (star/snowflake schema, SCDs, fact/dimension tables) Experience with workflow scheduling tools like Control-M, Autosys, or TWS is a plus Knowledge of Informatica Administration and version control is an advantage
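
The slowly changing dimensions (SCDs) mentioned above are normally built inside Informatica mappings, but the underlying Type 2 logic can be sketched in a few lines of pandas. Everything below (column names such as customer_id, city, is_current) is a hypothetical illustration of the pattern, not the team's actual implementation.

```python
# Hedged pandas sketch of SCD Type 2: expire changed rows, append new versions.
import pandas as pd

dim = pd.DataFrame({
    "customer_id": [1], "city": ["Mumbai"],
    "valid_from": [pd.Timestamp("2023-01-01")], "valid_to": [pd.NaT],
    "is_current": [True],
})
incoming = pd.DataFrame({"customer_id": [1, 2], "city": ["Pune", "Nagpur"]})
load_date = pd.Timestamp("2024-06-01")

# Compare incoming rows with the current dimension versions.
merged = incoming.merge(
    dim[dim["is_current"]][["customer_id", "city"]],
    on="customer_id", how="left", suffixes=("", "_old"), indicator=True,
)
changed = merged[(merged["_merge"] == "both") & (merged["city"] != merged["city_old"])]
new_keys = merged[merged["_merge"] == "left_only"]

# Expire the current versions whose attributes changed.
expire_mask = dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"]
dim.loc[expire_mask, "valid_to"] = load_date
dim.loc[expire_mask, "is_current"] = False

# Append fresh versions for changed keys and brand-new keys.
additions = pd.concat([changed, new_keys])[["customer_id", "city"]].assign(
    valid_from=load_date, valid_to=pd.NaT, is_current=True,
)
dim = pd.concat([dim, additions], ignore_index=True)
print(dim)
```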

Posted 2 weeks ago

Apply

8.0 - 13.0 years

35 - 45 Lacs

Noida, Pune, Bengaluru

Hybrid

Role & responsibilities Implementing data management solutions. Certified Snowflake cloud data warehouse Architect. Deep understanding of star and snowflake schemas and dimensional modelling. Experience in the design and implementation of data pipelines and ETL processes using Snowflake. Optimize data models for performance and scalability. Collaborate with various technical and business stakeholders to define data requirements. Ensure data quality and governance best practices are followed. Experience with data security and data access controls in Snowflake. Expertise in complex SQL, Python scripting, and performance tuning. Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy clone, time travel, and automating them. Experience in handling semi-structured data (JSON, XML) and columnar Parquet using the VARIANT attribute in Snowflake. Experience in re-clustering data in Snowflake with a good understanding of micro-partitions. Experience in migration processes to Snowflake from an on-premises database environment. Experience in designing and building manual or auto-ingestion data pipelines using Snowpipe. SnowSQL experience in developing stored procedures and writing queries to analyze and transform data. Must-have skills: Certified Snowflake Architect, Snowflake architecture, Snowpipe, SnowSQL, SQL, CI/CD, and Python. Perks and benefits Competitive compensation package. Opportunity to work with industry leaders. Collaborative and innovative work environment. Professional growth and development opportunities.
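
A rough, hedged illustration of a few Snowflake features named above (zero-copy clone, time travel, and querying JSON in a VARIANT column) using the snowflake-connector-python package. The account, credentials, warehouse, and table names are placeholders; only the SQL feature syntax reflects standard Snowflake usage.

```python
# Hedged sketch: zero-copy clone, time travel, and VARIANT path queries in Snowflake.
# All connection parameters and object names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: instant, storage-free copy of a table for testing.
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# Time travel: read the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone())

# Semi-structured data: pull fields out of a VARIANT column with path notation.
cur.execute("""
    SELECT payload:customer.id::STRING AS customer_id,
           payload:items[0].sku::STRING AS first_sku
    FROM raw_events
    LIMIT 10
""")
for customer_id, first_sku in cur.fetchall():
    print(customer_id, first_sku)

cur.close()
conn.close()
```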

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a PySpark Data Engineer, you must have a minimum of 2 years of experience in PySpark. Strong programming skills in Python, PySpark, and Scala are preferred. It is essential to have experience in designing and implementing CI/CD, build management, and development strategies. Additionally, familiarity with SQL and SQL analytical functions is required, along with participation in key business, architectural, and technical decisions. There is an opportunity for training in AWS cloud technology.

In the role of a Python Developer, a minimum of 2 years of experience in Python/PySpark is necessary. Strong programming skills in Python, PySpark, and Scala are preferred. Experience in designing and implementing CI/CD, build management, and development strategies is essential. Familiarity with SQL and SQL analytical functions and participation in key business, architectural, and technical decisions are also required. There is a potential for training in AWS cloud technology.

As a Senior Software Engineer at Capgemini, you should have over 3 years of experience in Scala with a strong project track record. Hands-on experience in Scala/Spark development and SQL writing skills on RDBMS (DB2) databases are crucial. Experience in working with different file formats such as JSON, Parquet, AVRO, ORC, and XML is preferred. Previous involvement in an HDFS platform development project is necessary. Proficiency in data analysis, data profiling, and data lineage, along with strong oral and written communication skills, is required. Experience in Agile projects is a plus.

For the position of Data Modeler, expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling is essential. Knowledge of data warehousing concepts such as star schema, snowflake schema, or data vault for data marts or data warehousing is required. Proficiency in using data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models is necessary. Hands-on knowledge and experience with tools such as PL/SQL, PySpark, Hive, Impala, and other scripting tools are preferred. Experience with the software development lifecycle using the Agile methodology is essential. Strong communication and stakeholder management skills are crucial for this role.

In this role, you will design, develop, and optimize PL/SQL procedures, functions, triggers, and packages. You will also write efficient SQL queries, joins, and subqueries for data retrieval and manipulation. Additionally, you will develop and maintain database objects such as tables, views, indexes, and sequences. Optimizing query performance and troubleshooting database issues to improve efficiency are key responsibilities. Collaboration with application developers, business analysts, and system architects to understand database requirements is essential. Ensuring data integrity, consistency, and security within Oracle databases is also a crucial aspect of the role. Developing ETL processes and scripts for data migration and integration is part of the responsibilities. Documenting database structures, stored procedures, and coding best practices is required. Staying up to date with Oracle database technologies, best practices, and industry trends is essential for success in this role.
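
The "SQL analytical functions" called out above map directly onto window functions in PySpark. Below is a small, self-contained sketch; the dataset and column names are invented for illustration and do not come from any of these roles.

```python
# Hedged sketch: analytical (window) functions in PySpark with made-up data.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window_demo").getOrCreate()

sales = spark.createDataFrame(
    [("Nagpur", "2024-01", 120.0), ("Nagpur", "2024-02", 90.0),
     ("Pune", "2024-01", 200.0), ("Pune", "2024-02", 250.0)],
    ["city", "month", "amount"],
)

# Rank each month's sales within its city and add a running total per city.
rank_window = Window.partitionBy("city").orderBy(F.col("amount").desc())
running_window = (Window.partitionBy("city").orderBy("month")
                  .rowsBetween(Window.unboundedPreceding, Window.currentRow))

(sales
 .withColumn("rank_in_city", F.rank().over(rank_window))
 .withColumn("running_total", F.sum("amount").over(running_window))
 .show())

spark.stop()
```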

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

The OBIEE developer/engineer is in charge of developing or modifying OBIEE reports according to the technical specifications in order to deliver the expected added value to the business. Supporting the Production and UAT environments (job monitoring, issue resolution, etc.) is also part of the position. Direct responsibilities: Participate in all the Agile ceremonies of the squad (dailies, sprint plannings, backlog refinements, reviews, etc.). Communicate blocking points as soon as possible. Estimate and develop, according to company standards, the functionalities described in the Jira requirements (changes, bug fixes, etc.). Unit-test the developed code in order to deliver it for user acceptance testing. Contributing responsibilities: Accountable for delivering the Jira tickets assigned to them during a sprint. Accountable for the daily support of the inbound flows (jobs) and user interface within the perimeter of the Risk & Billing or Securitization squad. Accountable for contributing to the Engineer Chapter events and the OBIEE Tribe community (guild with the Engineer Techlead). Technical and behavioral competencies: Be autonomous in development (senior to expert level) and motivated in daily tasks. Be an active member of the squad. Communicate properly with the business (in English) and be willing to provide user support. Behavioural skills: attention to detail/rigor, adaptability, creativity and innovation/problem solving, ability to deliver/results driven. Transversal skills: analytical ability, ability to develop and adapt a process, ability to anticipate business/strategic evolution. Education level: Bachelor's degree or equivalent. Experience level: at least 5 years. BI knowledge: OBIEE platform.

Posted 1 month ago

Apply

12.0 - 14.0 years

12 - 14 Lacs

Hyderabad, Bengaluru

Hybrid

Bachelor's or master's degree in computer science, engineering, or a related field. 10+ years of overall experience and 8+ years of relevant experience in Databricks, DLT, PySpark, and data modelling concepts, i.e. dimensional data modelling (Star Schema, Snowflake Schema). Proficiency in programming languages such as Python, PySpark, Scala, and SQL. Proficiency in DLT. Proficiency in SQL. Proficiency in data modelling concepts: dimensional data modelling (Star Schema, Snowflake Schema). Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark. Experience with cloud platforms such as AWS, Azure, or GCP and their associated data services. Proven track record of delivering scalable and reliable data solutions in a fast-paced environment. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills with the ability to work effectively in cross-functional teams. Good to have: experience with containerization technologies such as Docker and Kubernetes, and knowledge of DevOps practices for automated deployment and monitoring of data pipelines.
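
Since the role centres on Databricks Delta Live Tables (DLT) with PySpark, here is a hedged sketch of what a simple bronze-to-silver DLT step can look like. The dlt module is only importable inside a Databricks DLT pipeline, and the storage path, table names, and columns below are placeholders, not details from the posting.

```python
# Hedged DLT sketch: runs only inside a Databricks Delta Live Tables pipeline,
# where `dlt` and `spark` are provided by the runtime. Names/paths are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from cloud storage (bronze).")
def orders_bronze():
    return (
        spark.read.format("json")        # `spark` is supplied by the DLT runtime
             .load("/mnt/raw/orders/")   # placeholder path
    )

@dlt.table(comment="Cleaned orders conformed to the dimensional model (silver).")
@dlt.expect_or_drop("valid_amount", "amount >= 0")   # simple data-quality expectation
def orders_silver():
    return (
        dlt.read("orders_bronze")
           .withColumn("order_date", F.to_date("order_ts"))
           .select("order_id", "customer_id", "order_date", "amount")
    )
```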

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Noida

Work from Office

Primary Skills: SQL (advanced level); SSAS (SQL Server Analysis Services), multidimensional and/or tabular model; MDX/DAX (strong querying capabilities); data modeling (star schema, snowflake schema). Secondary Skills: ETL processes (SSIS or similar tools); Power BI/reporting tools; Azure Data Services (optional but a plus). Role & Responsibilities Design, develop, and deploy SSAS models (both tabular and multidimensional). Write and optimize MDX/DAX queries for complex business logic. Work closely with business analysts and stakeholders to translate requirements into robust data models. Design and implement ETL pipelines for data integration. Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred). Optimize existing cubes and data models for performance and scalability. Ensure data quality, consistency, and governance standards. Top Skill Set: SSAS (tabular + multidimensional modeling); strong MDX and/or DAX query writing; advanced SQL for data extraction and transformations; data modeling concepts (fact/dimension, slowly changing dimensions, etc.); ETL tools (SSIS preferred); Power BI or similar BI tools; understanding of OLAP and OLTP concepts; performance tuning (SSAS/SQL). Skills: analytical skills, ETL processes (SSIS or similar tools), collaboration, Multidimensional Expressions (MDX), Power BI/reporting tools, SQL (advanced level), SQL proficiency, DAX, SSAS (multidimensional and tabular model), ETL, data modeling (star schema, snowflake schema), communication, Azure Data Services, MDX, data modeling, SSAS, data visualization

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies