
13 Dimension Tables Jobs

JobPe aggregates results for easy access; applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will have the opportunity to work at Capgemini, a company that empowers you to shape your career according to your preferences. You will be part of a collaborative community of colleagues worldwide, where you can reimagine what is achievable and contribute to unlocking the value of technology for leading organizations to build a more sustainable and inclusive world.

Your Role:
- A very good understanding of the current work, tools, and technologies in use.
- Comprehensive knowledge of BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience with fact and dimension tables and SCD (slowly changing dimensions).
- A minimum of 3 years of experience in GCP data engineering.
- Proficiency in Java, Python, or Spark on GCP, with programming experience in Python, Java, or PySpark, plus SQL.
- Hands-on experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery.
- Ability to handle big data efficiently.

Your Profile:
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Experience in pipeline development using Dataflow or Dataproc (Apache Beam, etc.).
- Familiarity with other GCP services and databases such as Datastore, Bigtable, Spanner, Cloud Run, and Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- Shape your career with a range of career paths and internal opportunities within the Capgemini group.
- Comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- Learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
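The fact/dimension and SCD experience the role asks for can be illustrated with a minimal Type 2 slowly changing dimension sketch in plain Python. This is not part of the posting; column names (`customer_id`, `city`, `start_date`, `end_date`) are hypothetical, and a warehouse like BigQuery would typically express the same logic as a SQL MERGE.

```python
from datetime import date

# Type 2 SCD: when a tracked attribute changes, close out the current
# row and append a new one, preserving full history.
HIGH_DATE = date(9999, 12, 31)  # sentinel "open-ended" end date

def apply_scd2(dim_rows, incoming, key="customer_id", tracked="city", as_of=None):
    """Apply one incoming record to a list of dimension rows (dicts)."""
    as_of = as_of or date.today()
    for row in dim_rows:
        if row[key] == incoming[key] and row["end_date"] == HIGH_DATE:
            if row[tracked] == incoming[tracked]:
                return dim_rows  # no change: keep the current version
            row["end_date"] = as_of  # close the current version
            break
    dim_rows.append({
        key: incoming[key],
        tracked: incoming[tracked],
        "start_date": as_of,
        "end_date": HIGH_DATE,  # new current row
    })
    return dim_rows

dim = []
apply_scd2(dim, {"customer_id": 1, "city": "Bengaluru"}, as_of=date(2024, 1, 1))
apply_scd2(dim, {"customer_id": 1, "city": "Pune"}, as_of=date(2024, 6, 1))
# dim now holds two versions: the Bengaluru row closed on 2024-06-01,
# and an open-ended Pune row.
```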
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini leverages its over 55-year heritage to unlock the value of technology for clients across the entire breadth of their business needs. The company delivers end-to-end services and solutions, combining strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, along with deep industry expertise and a strong partner ecosystem.

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Panchkula, Haryana

On-site

You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of experience in data integration, warehousing, and analytics. You possess deep technical expertise in ETL tools, strong data modeling knowledge, and the ability to lead complex data engineering projects from design to deployment.

Key skills:
- 4+ years of hands-on experience with ETL tools such as SSIS, Informatica, DataStage, or Talend.
- Proficiency in relational databases such as SQL Server and MySQL.
- Strong understanding of Data Mart/EDW methodologies.
- Experience designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Knowledge of reporting and analytics tools such as Tableau and Power BI.
- Scripting and programming proficiency in Python.
- Familiarity with cloud platforms such as AWS or Azure.
- Ability to lead recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Experience with cross-functional and geographically distributed teams.
- Ability to translate complex data problems into actionable insights.
- Strong communication and client management skills.
- Self-starter with a collaborative attitude and problem-solving mindset.

You will be responsible for:
- Delivering high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components, ensuring adherence to standards and best practices.
- Owning delivery quality and timeliness across one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical problem-solving guidance and support to junior team members.
- Driving QA for deliverables and validating progress against project timelines.
- Leading issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing across data engineering efforts.
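The star-schema design work the role describes boils down to swapping natural keys in source rows for surrogate keys in dimension tables when loading facts. A minimal sketch in plain Python (table and column names are illustrative only, not from the posting):

```python
# Star-schema load: facts reference dimensions by surrogate key, not by
# the source system's natural key.

def get_surrogate_key(dimension, natural_key, attrs):
    """Return the surrogate key for a natural key, inserting a new dimension row if unseen."""
    for sk, row in dimension.items():
        if row["natural_key"] == natural_key:
            return sk
    sk = len(dimension) + 1  # simple monotonically increasing surrogate key
    dimension[sk] = {"natural_key": natural_key, **attrs}
    return sk

def load_fact(sales, product_dim):
    """Build fact rows, resolving each sale's product to its surrogate key."""
    fact_rows = []
    for s in sales:
        sk = get_surrogate_key(product_dim, s["product_code"],
                               {"name": s["product_name"]})
        fact_rows.append({"product_sk": sk, "amount": s["amount"]})
    return fact_rows

product_dim = {}
facts = load_fact(
    [{"product_code": "P1", "product_name": "Widget", "amount": 100},
     {"product_code": "P1", "product_name": "Widget", "amount": 250}],
    product_dim,
)
# Both sales of P1 share one dimension row and one surrogate key.
```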
Please note the contact details: Email: careers@grazitti.com Address: Grazitti Interactive LLP (SEZ Unit), 2nd Floor, Quark City SEZ, A-40A, Phase VIII Extn., Mohali, SAS Nagar, Punjab, 160059, India.

Posted 3 days ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Noida

Work from Office

Role: UFT Tester
Skills: UFT, VBScript, STLC
Good to have: ALM/QC, Jenkins
Experience: 4-6 years
Location: Noida only

Key Responsibilities:
- Design, develop, and maintain automated test scripts using UFT.
- Strong experience with data-driven testing using Excel (e.g., DataTable operations, parameterization).
- Perform database testing by writing and executing SQL queries for validation.
- Create and maintain object repositories and function libraries in UFT.
- Analyze test results, log defects, and work closely with the development team on resolution.
- Optimize and refactor test scripts for performance and reusability.
- Contribute to continuous improvement of test processes and automation strategy.
- Familiarity with descriptive programming in UFT.
- Ability to debug and troubleshoot automation script failures.

Mandatory Skills:
- Hands-on experience with UFT/QTP automation scripting.
- Strong knowledge of VBScript (UFT's scripting language).
- Experience in data-driven testing using Excel/DataTables.
- Understanding of object repositories, function libraries, and descriptive programming.
- Familiarity with defect tracking and test management tools (e.g., ALM/QC).
- Good understanding of the software testing life cycle (STLC) and test methodologies.

Good to Have Skills:
- Knowledge of version control tools such as Git.
- Exposure to CI/CD tools (e.g., Jenkins).
- Basic knowledge of API testing.
- Ability to build or enhance UFT automation frameworks.
- Familiarity with integrating UFT with ALM.
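The data-driven testing the role emphasizes means the same test logic runs once per data row; UFT drives this from a DataTable sheet in Excel. The idea can be sketched in plain Python, with a list of dicts standing in for the DataTable (the login scenario and field names are invented for illustration):

```python
# Data-driven testing: separate test logic from test data, iterating
# one test run per data row (like a UFT DataTable loop).

test_data = [
    {"username": "alice", "password": "secret1", "expect_login": True},
    {"username": "alice", "password": "wrong",   "expect_login": False},
]

def try_login(username, password):
    """Stand-in for the application under test."""
    return (username, password) == ("alice", "secret1")

def run_data_driven(rows):
    results = []
    for row in rows:  # one iteration per data row
        actual = try_login(row["username"], row["password"])
        results.append("PASS" if actual == row["expect_login"] else "FAIL")
    return results

results = run_data_driven(test_data)
# Both rows pass: the first expects a successful login, the second
# expects a rejected one.
```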

Posted 3 weeks ago

Apply

10.0 - 14.0 years

30 - 37 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Curious about the role? What would your typical day look like?

As an Architect, you will work to solve some of the most complex and captivating data management problems, enabling clients to become data-driven organizations. You will seamlessly switch between the roles of individual contributor, team member, and data modeling architect as each project demands, to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage with clients and understand the business requirements to translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Use a data modeling tool to create appropriate data models.
- Create and maintain the source-to-target data mapping document, covering all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide the teams in building automations and accelerators.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Collaborate with cross-functional stakeholders to design and develop the next-gen data platform.
- Work with the client to define, establish, and implement the right modeling approach for the requirement.
- Help define standards and best practices.
- Monitor project progress to keep leadership informed of milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

Job Requirement. What do we expect?
- 10+ years of experience in the data space.
- Solid SQL knowledge.
- Ability to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
- Real-world experience working with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modeling, plus any ETL tool, data governance, and data quality.
- An eye for analyzing data, and comfort with agile methodology.
- Good understanding of any of the cloud services (Azure, AWS, or GCP) is preferred.
- Enthusiasm for coaching team members, collaborating with stakeholders across the organization, and taking complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Good communication skills and experience leading a team.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Nagpur, Maharashtra

On-site

As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve working extensively with PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle.

In this position, it is essential to have experience with data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and big data concepts will be beneficial.

If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Punjab

On-site

You are an experienced and results-driven ETL & DWH Engineer/Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role requires deep technical knowledge of ETL tools, strong data modeling skills, and the capability to lead intricate data engineering projects from inception to implementation.

Your key skills include:
- More than 4 years of experience with ETL tools such as SSIS, Informatica, DataStage, or Talend.
- Proficiency in relational databases such as SQL Server and MySQL.
- Comprehensive understanding of Data Mart/EDW methodologies.
- Designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Familiarity with reporting and analytics tools such as Tableau and Power BI.
- Proficiency in scripting and programming using Python.
- Knowledge of cloud platforms such as AWS or Azure.
- Leading recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Working with cross-functional and geographically distributed teams.
- Translating complex data issues into actionable insights.
- Strong communication and client management abilities.
- Initiative-driven, with a collaborative approach and problem-solving mindset.

Your roles & responsibilities will include:
- Creating high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components while ensuring compliance with standards and best practices.
- Ensuring delivery quality and timeliness for one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical guidance and support to junior team members for problem-solving.
- Leading QA processes for deliverables and validating progress against project timelines.
- Managing issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing for data engineering efforts.

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Hybrid

POWER BI/SQL EXPERT - Relevant experience: 4 to 6 years

Key Responsibilities & Must-Have Skills:

Power BI Expertise:
- Strong hands-on experience with Power BI using Import mode and DirectQuery.
- Skilled in DAX, report/dashboard creation, data modeling, and building dimension and fact tables.
- Proven ability to optimize and manage large datasets for performance in Power BI.
- Overall 7-8 years of experience, with 4-5 years specifically in Power BI.

SQL & Database Skills:
- Proficient in SQL, with secondary expertise in Oracle SQL.
- Ability to manipulate data, apply filters, and extract meaningful insights.

Communication & Stakeholder Management:
- Strong communication skills to coordinate directly with the Product Owner.
- Comfortable working in client-facing roles within the Accounts domain.

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 3 Lacs

Bengaluru

Remote

If you are passionate about Snowflake, data warehousing, and cloud-based analytics, we'd love to hear from you! Apply now to be a part of our growing team.

Perks and benefits: Interested candidates can apply directly and complete the first round of technical discussion via the link below:
https://app.hyrgpt.com/candidate-job-details?jobId=67ecc88dda1154001cc8b88f

Job Summary: We are looking for a skilled Snowflake Engineer with 3-10 years of experience in designing and implementing cloud-based data warehousing solutions. The ideal candidate will have hands-on expertise in Snowflake architecture, SQL, ETL pipeline development, and performance optimization. This role requires proficiency in handling structured and semi-structured data, data modeling, and query optimization to support business intelligence and analytics initiatives. The candidate will work on a project for one of our key Big 4 consulting customers and will have immense learning opportunities.

Key Responsibilities:
- Design, develop, and manage high-performance data pipelines for ingestion, transformation, and storage in Snowflake.
- Optimize Snowflake workloads, ensuring efficient query execution and cost management.
- Develop and maintain ETL processes using SQL, Python, and orchestration tools.
- Implement data governance, security, and access control best practices within Snowflake.
- Work with structured and semi-structured data formats such as JSON, Parquet, Avro, and XML.
- Design and maintain fact and dimension tables, ensuring efficient data warehousing and reporting.
- Collaborate with data analysts and business teams to support reporting, analytics, and business intelligence needs.
- Troubleshoot and resolve data pipeline issues, ensuring high availability and reliability.
- Monitor and optimize Snowflake storage and compute usage to improve efficiency and performance.

Required Skills & Qualifications:
- 3-10 years of experience in Snowflake, SQL, and data engineering.
- Strong hands-on expertise in Snowflake development, including data sharing, cloning, and Time Travel.
- Proficiency in SQL scripting for query optimization and performance tuning.
- Experience with ETL tools and frameworks (e.g., dbt, Airflow, Matillion, Talend).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake.
- Strong understanding of data warehousing concepts, including fact and dimension modeling.
- Ability to work with semi-structured data formats such as JSON, Avro, Parquet, and XML.
- Knowledge of data security, governance, and access control within Snowflake.
- Excellent problem-solving and troubleshooting skills.

Preferred Qualifications:
- Experience in Python for data engineering tasks.
- Familiarity with CI/CD pipelines for Snowflake development and deployment.
- Exposure to streaming data ingestion and real-time processing.
- Experience with BI tools such as Tableau, Looker, or Power BI.
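A core part of the semi-structured data handling this posting lists is flattening nested JSON records into flat columns before loading them into fact and dimension tables (Snowflake does this natively with FLATTEN and the VARIANT type). The general idea can be sketched in plain Python with invented field names:

```python
import json

# Flatten nested JSON into dotted column names, the usual first step
# before mapping semi-structured records onto warehouse tables.

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into a single-level dict."""
    out = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))  # recurse into nested objects
        else:
            out[name] = value
    return out

raw = json.loads('{"order_id": 7, "customer": {"id": 3, "region": "APAC"}}')
row = flatten(raw)
# row maps "order_id", "customer.id", and "customer.region" to scalars,
# ready to load as columns.
```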

Posted 2 months ago

Apply

7.0 - 8.0 years

9 - 10 Lacs

Mumbai

Work from Office

This Role Includes:
- Leading the RPA development team, with a mandate for finding new automation opportunities, requirement analysis, and shaping the solution approach for business process transformation using RPA.
- Leading the design, development, and deployment of RPA bots for different clients.
- Supporting different teams with solution life cycle management: ongoing operational support, process change activities, etc.
- Assisting and driving the team by providing oversight and mentoring.

Requirements:
- Hands-on experience working with the RE Framework.
- Hands-on experience working with data tables, arguments, and variables.
- Hands-on experience working with selectors.
- Hands-on experience in PDF automation.
- Hands-on experience working with and creating libraries.
- Hands-on experience with debugging, breakpoints, and watch points.
- Understanding of Orchestrator and the deployment process.
- Hands-on experience with error and exception handling.
- Analysis of business requirements and effort estimation.
- UiPath Developer Certification.
- Understanding of ABBYY integration.
- Experience in a .NET language.
- Understanding of machine learning with Python programming.
- Strong working knowledge of SQL and relational databases.
- Experience in Citrix automation.
- Experience using regex.
- Understanding of integration with APIs.
- Experience in image automation.
- Experience in document understanding.
- Understanding of machine learning models and their capabilities in UiPath.

Experience/Skills Required:
- Overall 7-8 years of experience, with a minimum of 4-5 years in RPA (preferably using UiPath).

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture.
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas).
- Ensure data quality, security, and performance.
- Work with BI teams to support analytics and reporting needs.

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.).
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.).
- Strong understanding of dimensional modeling and OLAP.
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow).

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies

Posted 3 months ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Genesys Cloud CX
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: proficiency in Genesys Cloud CX.
- Good-to-have skills: experience with cloud-based application development.
- Must have Genesys Architect and Developer experience with data actions and integrations.
- Must have call-flow build experience with data tables, call routing, and data actions in Genesys.
- Must have Genesys IVR development experience with Google Dialogflow or Genesys flow-engine bots.
- Must be familiar with Genesys APIs and data.
- Must be familiar with Genesys L3 support cases and escalations.
- Strong understanding of application lifecycle management.
- Familiarity with integration techniques for various business applications.
- Experience in troubleshooting and optimizing application performance.

Additional Information:
- Must have 6+ years of working experience in Genesys, with a minimum of 1-2 years in Genesys Cloud.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

9 - 11 years

37 - 40 Lacs

Ahmedabad, Bengaluru, Mumbai (All Areas)

Work from Office

Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt.
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake).
- Automate data quality checks and monitoring.
- Collaborate with analysts, data scientists, and backend teams.
- Optimize data flows for performance and cost.

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark).
- Experience with cloud data platforms (AWS, GCP, or Azure).
- Strong understanding of data modeling and warehousing principles.
- Bonus: experience with Kafka, Parquet/Avro, or real-time streaming.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies
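The pipeline and data-quality responsibilities this posting pairs together can be sketched as small composable steps; in production these would be Airflow or dbt tasks, but the shape is the same. The source data and invariants below are invented for illustration:

```python
# Minimal ETL pipeline with an automated data-quality gate between
# transform and load, failing fast on bad data.

def extract():
    """Stand-in for reading from a source system."""
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "4.5"}]

def transform(rows):
    """Cast string amounts to floats."""
    return [{**r, "amount": float(r["amount"])} for r in rows]

def quality_check(rows):
    """Enforce basic invariants before anything reaches the warehouse."""
    assert all(r["amount"] >= 0 for r in rows), "negative amount"
    assert len({r["id"] for r in rows}) == len(rows), "duplicate ids"
    return rows

def load(rows, warehouse):
    """Stand-in for writing to the warehouse."""
    warehouse.extend(rows)
    return warehouse

warehouse = []
load(quality_check(transform(extract())), warehouse)
# warehouse now holds two cleaned rows; a failed check would have
# stopped the pipeline before load.
```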

Posted 3 months ago

Apply

5 - 8 years

9 - 13 Lacs

Mumbai

Work from Office

This Role Includes:
- Leading the RPA development team, with a mandate for finding new automation opportunities, requirement analysis, and shaping the solution approach for business process transformation using RPA.
- Leading the design, development, and deployment of RPA bots for different clients.
- Supporting different teams with solution life cycle management: ongoing operational support, process change activities, etc.
- Assisting and driving the team by providing oversight and mentoring.

Requirements:
- Hands-on experience working with the RE Framework.
- Hands-on experience working with data tables, arguments, and variables.
- Hands-on experience working with selectors.
- Hands-on experience in PDF automation.
- Hands-on experience working with and creating libraries.
- Hands-on experience with debugging, breakpoints, and watch points.
- Understanding of Orchestrator and the deployment process.
- Hands-on experience with error and exception handling.
- Analysis of business requirements and effort estimation.
- UiPath Developer Certification.
- Understanding of ABBYY integration.
- Experience in a .NET language.
- Understanding of machine learning with Python programming.
- Strong working knowledge of SQL and relational databases.
- Experience in Citrix automation.
- Experience using regex.
- Understanding of integration with APIs.
- Experience in image automation.
- Experience in document understanding.
- Understanding of machine learning models and their capabilities in UiPath.

Experience/Skills Required:
- Overall 7-8 years of experience, with a minimum of 4-5 years in RPA (preferably using UiPath).

Posted 4 months ago

Apply