
89 Star Schema Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior SQL Developer at our company, you will play a crucial role in our BI & Analytics team by expanding and optimizing our data and data queries. Your responsibilities will include optimizing data flow and collection for consumption by our BI & Analytics platform. You should be an experienced query builder and data wrangler with a passion for optimizing data systems from the ground up. Collaborating with software developers, database architects, data analysts, and data scientists, you will support data and product initiatives and ensure consistent, optimal data delivery architecture across ongoing projects. A self-directed approach will be essential in supporting the data needs of multiple systems and products. If you are excited about enhancing our company's data architecture to support upcoming products and data initiatives, this role is for you.

Your essential functions will involve creating and maintaining optimal SQL queries, views, tables, and stored procedures (the sketch below illustrates the kind of objects involved). By working closely with business units such as BI, Product, and Reporting, you will contribute to the data warehouse platform vision, strategy, and roadmap. Understanding physical and logical data models and ensuring high-performance access to diverse data sources will be key aspects of the role. You will also encourage adoption of organizational frameworks through documentation, sample code, and developer support, and communicate the progress and effectiveness of developed frameworks to department heads and managers.

To be successful in this role, you should hold a Bachelor's or Master's degree, or an equivalent combination of education and experience, in a relevant field. Proficiency in T-SQL, data warehouses, star schema, data modeling, OLAP, SQL, and ETL is required, as is experience creating tables, views, and stored procedures. Familiarity with BI and reporting platforms, industry trends, and multiple database platforms such as SQL Server and MySQL is necessary, along with proficiency in source control and project management tools such as Azure DevOps, Git, and JIRA. Experience with SonarQube for clean T-SQL coding practices and DevOps best practices is advantageous. Applicants must have exceptional written and spoken communication skills and strong team-building abilities to contribute to strategic decisions and advise senior management on technical matters. With at least 5 years of experience in a data warehousing position, including work as a SQL Developer, and experience with the system development lifecycle, you should also have a proven track record in data integration, consolidation, enrichment, and aggregation. Strong analytical skills, attention to detail, organizational skills, and the ability to mentor junior colleagues are crucial.

This full-time position requires flexibility to support time zones between 12 PM and 9 PM IST, Monday through Friday. You will work in hybrid mode, spending at least 2 days per week in the Hyderabad office. Occasional evening and weekend work may be expected based on client needs or job-related emergencies. This job description may not cover all responsibilities and duties, which may change with or without notice.
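
A minimal T-SQL sketch of the star-schema objects such a role maintains: one fact table joined to two dimensions, plus a reporting view. All table, column, and view names here are illustrative, not taken from the posting.

    -- Two dimension tables and one fact table (illustrative names).
    CREATE TABLE dbo.DimDate (
        DateKey   INT PRIMARY KEY,       -- e.g. 20240131
        FullDate  DATE NOT NULL,
        [Month]   TINYINT NOT NULL,
        [Year]    SMALLINT NOT NULL
    );

    CREATE TABLE dbo.DimProduct (
        ProductKey  INT IDENTITY PRIMARY KEY,
        ProductName NVARCHAR(100) NOT NULL,
        Category    NVARCHAR(50)  NOT NULL
    );

    CREATE TABLE dbo.FactSales (
        DateKey     INT NOT NULL REFERENCES dbo.DimDate(DateKey),
        ProductKey  INT NOT NULL REFERENCES dbo.DimProduct(ProductKey),
        Quantity    INT NOT NULL,
        SalesAmount DECIMAL(18,2) NOT NULL
    );
    GO

    -- A reporting view of the kind the role would own:
    -- monthly sales by category, aggregated at the fact table's grain.
    CREATE VIEW dbo.vMonthlySalesByCategory AS
    SELECT d.[Year], d.[Month], p.Category,
           SUM(f.SalesAmount) AS TotalSales
    FROM dbo.FactSales f
    JOIN dbo.DimDate    d ON d.DateKey    = f.DateKey
    JOIN dbo.DimProduct p ON p.ProductKey = f.ProductKey
    GROUP BY d.[Year], d.[Month], p.Category;

Downstream reports then query the view by dimension attributes (year, month, category) without touching the fact table directly.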

Posted 18 hours ago

Apply

4.0 - 8.0 years

0 Lacs

nagpur, maharashtra

On-site

As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve working extensively with PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle (a sketch of such a procedure follows below).

In this position, it is essential to have experience with data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and big data concepts will be beneficial. If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
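
A hedged PL/SQL sketch of the kind of loading procedure with explicit exception handling the posting describes. All object names (stg_sales, dim_product, fact_sales) are illustrative assumptions, not the client's schema.

    -- Load one batch of staged rows into a fact table, with exception handling.
    CREATE OR REPLACE PROCEDURE load_fact_sales (p_batch_date IN DATE) AS
    BEGIN
        INSERT INTO fact_sales (date_key, product_key, quantity, sales_amount)
        SELECT TO_NUMBER(TO_CHAR(s.sale_date, 'YYYYMMDD')),  -- surrogate date key
               p.product_key,
               s.quantity,
               s.sales_amount
        FROM   stg_sales s
        JOIN   dim_product p ON p.product_code = s.product_code
        WHERE  s.sale_date = p_batch_date;

        COMMIT;
    EXCEPTION
        WHEN DUP_VAL_ON_INDEX THEN
            -- Batch already loaded: roll back and report, rather than double-count.
            ROLLBACK;
            RAISE_APPLICATION_ERROR(-20001,
                'Duplicate rows for batch ' || TO_CHAR(p_batch_date, 'YYYY-MM-DD'));
        WHEN OTHERS THEN
            ROLLBACK;
            RAISE;
    END load_fact_sales;
    /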

Posted 3 days ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Senior Power BI Developer

About Moss Adams (now Baker Tilly)

Moss Adams, part of the Baker Tilly family of companies and a top-6 US accounting and tax advisory firm, is a fully integrated professional services firm dedicated to assisting clients with growing, managing, and protecting prosperity. With more than 11,000 professionals across more than 140 countries, we work with many of the world's most innovative companies and leaders. Our strength in the middle market enables us to advise clients at all intervals of development, from start-up to rapid growth and expansion to transition. Moss Adams also serves international customers: companies headquartered in the US with an international presence. We leverage connections in over 140 countries, including India, to serve such customers. As a midsized firm with the same depth of experience as the Big 4, Moss Adams is a unique combination of experienced yet nimble, with a flat structure, a highly employee-oriented culture, and a focus on delivering higher value to customers. In June 2021, Moss Adams established an office in India, expanding our existing and future workforce by building a global talent network to increase our capabilities, productivity, and innovation. We recently celebrated the milestone of around 400+ team members, having grown from 20 to 400 in just a couple of years, and we are looking ahead as we continue building a global talent network. At Moss Adams India, we believe in the power of possible to empower our clients and people to pursue success however they define it.

General Summary

As a Senior Power BI Developer, you will design and develop interactive dashboards by transforming business needs into data-driven visual stories. You'll work with clients and internal teams to gather requirements, perform data analysis, and ensure accurate, optimized, and insightful reporting using Power BI.

Responsibilities:
• Collaborate with clients and internal teams to understand and translate business needs, processes, and challenges into effective data visualizations and dashboards.
• Design, develop, and deploy interactive Power BI reports and dashboards that combine data and insights to tell compelling stories visually.
• Gather and analyze requirements, ensuring alignment with client goals and business objectives.
• Extract, transform, and load (ETL) data from various sources to create cohesive and accurate datasets for analysis.
• Perform data analysis to identify trends, insights, and actionable recommendations for clients.
• Maintain and optimize existing Power BI reports, ensuring data accuracy and performance.
• Provide training and support to clients and team members on Power BI tools and best practices.
• Collaborate with data engineers and architects to ensure data models are optimized for reporting and analysis.
• Stay updated with the latest Power BI features and industry trends to enhance the quality of deliverables.

Qualifications:
• Bachelor's degree required.
• Minimum of 3 years of experience developing Power BI reports and dashboards in a consulting or client-facing environment.
• Strong understanding of star-schema dimensional modeling, including fact and dimension tables, to design efficient and effective datasets in Power BI.
• Certifications in Power BI or related tools and technologies.
• Strong knowledge of DAX (Data Analysis Expressions) and Power Query.
• Proficiency in SQL and working with relational databases.
• Familiarity with C# and/or Python to support data integration, scripting, or custom visualization needs.
• Knowledge of cloud platforms such as Azure, AWS, or Snowflake.
• Familiarity with key external tools (DAX Studio, Tabular Editor, and Power BI Report Builder) which provide additional functionality to Power BI, improving productivity and enhancing the Power BI experience for clients.
• Familiarity with Agile project management methodologies.
• Experience with advanced analytics techniques, including predictive modeling and machine learning.
• Demonstrated ability to interpret and communicate complex data insights in a clear and concise manner.
• Excellent problem-solving skills and attention to detail.
• Strong interpersonal and communication skills, with the ability to collaborate effectively with diverse stakeholders.

Here, you'll be challenged and rewarded for leadership, technical excellence, and inspired perspectives. That's why we offer opportunities to build your skills and explore your career in a supportive environment. At Moss Adams India, where you take your career is up to you. Moss Adams, part of the Baker Tilly family of companies, is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation, gender identity, or any other characteristic protected by law.

Posted 4 days ago

Apply

5.0 - 7.0 years

18 - 20 Lacs

Pune

Work from Office

Critical Skills to Possess:
• 5+ years of experience in data engineering or ETL development.
• 5+ years of hands-on experience with Informatica.
• Experience in production support, handling tickets, and monitoring ETL systems.
• Strong SQL skills with experience in querying large datasets.
• Familiarity with data warehousing concepts and design (e.g., star schema, snowflake schema).
• Experience with relational databases such as Oracle, SQL Server, or PostgreSQL.
• Knowledge of cloud platforms such as AWS, Azure, or GCP is a plus.

Preferred Qualifications:
• BS degree in Computer Science or Engineering, or equivalent experience.

Roles and Responsibilities:
• Design, develop, and maintain robust ETL pipelines using Informatica.
• Work with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
• Integrate data from various sources including relational databases, flat files, APIs, and cloud-based systems.
• Optimize and troubleshoot existing Informatica workflows for performance and reliability.
• Monitor ETL workflows and proactively address failures, performance issues, and data anomalies.
• Respond to and resolve support tickets related to data loads, ETL job failures, and data discrepancies.
• Provide support for production data pipelines and jobs.
• Ensure data quality and consistency across different systems and pipelines.
• Implement data validation, error handling, and auditing mechanisms within ETL processes (see the reconciliation sketch after this list).
• Collaborate with data analysts, data scientists, and other engineers to ensure a consistent and accurate data platform.
• Maintain documentation of ETL processes, data flows, and technical designs.
• Monitor daily data loads and resolve any ETL failures or data quality issues.
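
Data validation within an ETL process is often implemented as a source-to-target reconciliation query. Below is a minimal, hedged sketch in ANSI-style SQL; stg_orders and dw_orders are hypothetical table names, and the date function may need adjusting per platform (Oracle, SQL Server, PostgreSQL).

    -- Compare staged row counts against what today's load landed in the warehouse.
    WITH src AS (SELECT COUNT(*) AS cnt FROM stg_orders),
         tgt AS (SELECT COUNT(*) AS cnt FROM dw_orders
                 WHERE load_date = CURRENT_DATE)
    SELECT src.cnt AS source_rows,
           tgt.cnt AS loaded_rows,
           CASE WHEN src.cnt = tgt.cnt THEN 'PASS' ELSE 'FAIL' END AS reconciliation
    FROM src CROSS JOIN tgt;

A support process would log this result to an audit table and raise a ticket on FAIL.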

Posted 4 days ago

Apply

5.0 - 6.0 years

20 - 25 Lacs

Chennai

Work from Office

Mandatory requirements:
• A minimum of 5 years of hands-on Snowflake experience (6+ years overall experience).
• Proven expertise in query and performance optimization.
• Strong background in medallion architecture and star schema design.
• Demonstrated experience building scalable data warehouses (not limited to ingesting data from flat files).

Good to have:
• SnowPro Core Certification.
• SnowPro Advanced certifications in Data Engineering, Data Analysis, or Architecture are highly desirable.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

You should bring good knowledge of data structures, algorithms, calculus, linear algebra, machine learning, and modeling. Experience with data warehousing concepts, including star schema, snowflake schema, or data vault for data marts or data warehousing, is required, along with experience using data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models. Knowledge of enterprise databases such as DB2, Oracle, PostgreSQL, MySQL, or SQL Server is expected, as is hands-on experience with tools and techniques for analysis, data manipulation, and presentation (e.g., PL/SQL, PySpark, Hive, Impala, and other scripting tools). You should have experience with the software development lifecycle using Agile methodology and knowledge of agile methods (SAFe, Scrum, Kanban) and tools (Jira or Confluence). Expertise in conceptual modeling, the ability to see the big picture and envision possible solutions, experience working in a challenging, fast-paced environment, and excellent communication and stakeholder management skills round out the profile.

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band, and participate in internal sports events, yoga challenges, or marathons.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.

Posted 1 week ago

Apply

6.0 - 9.0 years

18 - 27 Lacs

Bangalore Rural, Gurugram, Bengaluru

Work from Office

Data Modelling: Star/Snowflake schema, normalisation/denormalisation. Snowflake: schema design, performance tuning, Time Travel, Streams & Tasks, secure and materialised views (see the sketch below). SQL & Scripting: advanced SQL (CTEs, window functions), automation and optimisation.
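
As a concrete picture of the Streams & Tasks item above, here is a minimal Snowflake SQL sketch: a stream captures changes on a staging table, and a scheduled task merges them into a dimension. The stream, task, warehouse, and table names are illustrative assumptions.

    -- Capture inserts/updates arriving on the staging table.
    CREATE OR REPLACE STREAM stg_customer_stream ON TABLE stg_customer;

    -- Every 5 minutes, merge any captured changes into the dimension,
    -- but only run when the stream actually has data.
    CREATE OR REPLACE TASK merge_dim_customer
      WAREHOUSE = transform_wh
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('STG_CUSTOMER_STREAM')
    AS
      MERGE INTO dim_customer d
      USING stg_customer_stream s ON d.customer_id = s.customer_id
      WHEN MATCHED THEN UPDATE SET d.customer_name = s.customer_name
      WHEN NOT MATCHED THEN INSERT (customer_id, customer_name)
                            VALUES (s.customer_id, s.customer_name);

    -- Tasks are created suspended; resume to start the schedule.
    ALTER TASK merge_dim_customer RESUME;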

Posted 1 week ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Your Role

As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
• Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
• Leverage your expertise in data warehousing concepts such as star schema, snowflake schema, and data vault to architect and optimize data marts and enterprise data warehouses.
• Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
• Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
• Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
• Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile
• Good knowledge of and expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
• Experience in data warehousing concepts including star schema, snowflake schema, or data vault for data marts or data warehousing.
• Experience using data modeling software like Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
• Experience working in a challenging, fast-paced environment.
• Expertise in conceptual modelling; ability to see the big picture and envision possible solutions.
• Excellent communication and stakeholder management skills.

What you'll love about working here

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As an Oracle Data Integrator (ODI) Professional at YASH Technologies, you will play a key role in providing prompt and effective support, maintenance, and development on an OBIA-based analytics data warehouse using ODI as the underlying ETL tool. Your responsibilities will include implementation, development, and maintenance of the ODI environment, data warehouse design, dimensional modeling, ETL development and support, and ETL performance tuning.

You will be responsible for solution design, implementation, migration, and support in the Oracle BI tool stack, particularly ODI and SQL. Your tasks will involve ODI development in an OBIA environment, along with enhancements, support, and performance tuning of SQL programs. You will also be involved in data warehouse design, development, and maintenance using star schema (dimensional modeling). The role encompasses production support of daily running ETL loads, monitoring, troubleshooting failures, bug fixing across environments, and working with various data sources including Oracle, CRM, cloud, flat files, SharePoint, and other non-Oracle systems. Experience in performance tuning of mappings in ODI and SQL query tuning is essential.

To succeed in this role, you should have 5-7+ years of relevant experience working in OBIA with ODI as the ETL tool in a BIAPPS environment. Strong written and oral communication skills, the ability to work in a demanding user environment, and knowledge of tools like Serena Business Manager and ServiceNow are crucial. A B.Tech / MCA qualification is required, and competencies such as being tech-savvy, effective communication, optimizing work processes, and cultivating innovation are essential.

At YASH, you will have the opportunity to create a career path in an inclusive team environment that supports continuous learning and development. Our Hyperlearning workplace is grounded on principles such as flexible work arrangements, agile self-determination, trust, transparency, and stable employment with an ethical corporate culture. Join us at YASH Technologies and be part of a team that fosters positive changes in an ever-evolving virtual world.

Posted 1 week ago

Apply

4.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Project description

Luxoft, a DXC Technology company, is an established firm focused on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology, and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing, and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment, we are searching for a Tableau and SQL Developer.

Responsibilities

You will work on CISO multi-functional Tableau reports that interface with several applications, reporting to a senior developer who has been working on the project for a few years. Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs is required for this role (a sketch follows below), as are strong Tableau Desktop development skills.

Skills

Must have:
• Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs.
• Expertise in developing SQL stored procedures, functions, and views.
• Experience in developing ETL processes to extract and transform data using SQL (aggregations, filtering, data cleansing) and load data into a database.
• Familiarity working in Microsoft SQL Server.
• Familiarity with data modeling concepts (star schema).

Tableau Desktop development skills:
• Expertise in developing complex, interactive dashboards in Tableau incorporating filters, parameters, and actions.
• Experience in building user-friendly dashboards with menus, tooltips, drill-down, and drill-through capabilities.
• Ability to create calculated fields, custom aggregations, table calculations, and LOD expressions.
• Knowledge of optimizing Tableau dashboards for performance.
• Understanding of user access and user group creation and management in Tableau.

Nice to have:
• Insurance domain knowledge.
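
To make the SQL bar concrete, here is a hedged example combining a CTE, a window function, and an outer join, of the kind that might feed a Tableau data source. The insurance-flavored schema (Policies, Claims) is purely illustrative.

    -- Latest claim per policy: rank claims within each policy by date,
    -- then left-join so policies with no claims still appear.
    WITH claims_ranked AS (
        SELECT c.policy_id,
               c.claim_id,
               c.claim_amount,
               ROW_NUMBER() OVER (PARTITION BY c.policy_id
                                  ORDER BY c.claim_date DESC) AS rn
        FROM dbo.Claims c
    )
    SELECT p.policy_id,
           p.holder_name,
           cr.claim_amount AS latest_claim_amount
    FROM dbo.Policies p
    LEFT JOIN claims_ranked cr
           ON cr.policy_id = p.policy_id AND cr.rn = 1;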

Posted 1 week ago

Apply

8.0 - 12.0 years

10 - 20 Lacs

Gurugram

Work from Office

Job Summary: We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
• Design, develop, and implement scalable Snowflake-based data architectures.
• Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts.
• Optimize Snowflake performance through clustering, partitioning, and caching strategies (see the sketch after this section).
• Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
• Ensure data quality, governance, integrity, and security across all platforms.
• Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
• Automate data workflows and support CI/CD deployment practices.
• Implement data modeling techniques including dimensional modeling, star/snowflake schema, and normalization/denormalization.
• Support and promote metadata management and data governance best practices.

Technical Skills:
• Expertise in Snowflake: architecture design, performance tuning, cost optimization.
• Strong proficiency in SQL, Python, and scripting for data engineering tasks.
• Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
• Proficiency in data modeling (dimensional, relational, star/snowflake schema).
• Good knowledge of cloud platforms: AWS, Azure, or GCP.
• Familiarity with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
• Experience with CI/CD tools and version control systems (e.g., Git).
• Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
• Snowflake SnowPro Core Certification: required or highly preferred.
• SnowPro Advanced Architect Certification: preferred.
• Cloud certifications (e.g., AWS Certified Data Analytics - Specialty, Azure Data Engineer Associate): preferred.
• ETL tool certifications (e.g., Talend, Matillion): optional but a plus.

Soft Skills:
• Strong analytical and problem-solving capabilities.
• Excellent communication and collaboration skills.
• Ability to translate technical concepts into business-friendly language.
• Proactive, detail-oriented, and highly organized.
• Capable of multitasking in a fast-paced, dynamic environment.
• Passionate about continuous learning and adopting new technologies.

Why Join Us?
• Work on cutting-edge data platforms and cloud technologies.
• Collaborate with industry leaders in analytics and digital transformation.
• Be part of a data-first organization focused on innovation and impact.
• Enjoy a flexible, inclusive, and collaborative work culture.
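
The clustering and performance-tuning responsibilities above can be pictured with a short Snowflake SQL sketch. Table and column names are assumptions, not the employer's schema.

    -- Define a clustering key so micro-partition pruning works
    -- on the column most queries filter by.
    ALTER TABLE fact_sales CLUSTER BY (sale_date);

    -- Check how well the table is clustered on that key.
    SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date)');

    -- Time Travel: inspect the table as it was an hour ago,
    -- e.g. to compare row counts before and after a suspect load.
    SELECT COUNT(*) FROM fact_sales AT (OFFSET => -3600);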

Posted 1 week ago

Apply

7.0 - 10.0 years

15 - 20 Lacs

Hyderabad, Pune, Chennai

Work from Office

Role: Data Engineer
Work Type: C2H (Contract to Hire)
Location: Remote (overlap 3-5 hours CST)
Experience: 7+ years
Notice Period: Immediate to 15 days

Skills Required:
• Expert in Azure Data Factory.
• Proven experience in data modelling for manufacturing data sources.
• Proficient SQL design.
• 7+ years of experience in Data Engineering roles.
• Proven experience in Power BI: dashboarding, DAX calculations, star schema development, and semantic model building.
• Manufacturing knowledge.
• Experience with GE PPA as a data source is desirable. (GE PPA is GE Plant Applications, which the client uses to run their plant machines; ADF experience will be just fine instead of PPA.)
• API development knowledge.
• Python skills.

Please share your updated CV to sravani.n@skilviu.com or contact +91 70754 98530.

Posted 1 week ago

Apply

8.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Remote

US shifts (night shift). 8+ years in Data Engineering; expert in ADF, SQL, and Power BI (DAX, star schema, dashboards, semantic models). Strong in data modeling (manufacturing), Python, and API development. GE PPA experience a plus.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Remote

The client requires 5+ years of experience in data modeling and a minimum of 3 years of proficiency using ER Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

• DBT: designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data; ensure data quality in a big data environment.
• Very strong PL/SQL: queries, procedures, JOINs.
• Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations.
• Talend knowledge and hands-on experience is good to have; candidates who have worked in production support would be preferred.
• Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
• Perform data analysis, troubleshoot data issues, and provide technical support to end users.
• Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
• Complex problem-solving capability and a continuous-improvement approach.
• Talend / Snowflake certification is desirable.
• Excellent SQL coding, communication, and documentation skills.
• Familiar with the Agile delivery process.
• Must be analytical, creative, and self-motivated, and work effectively within a global team environment.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Pune

Work from Office

Job Title: Senior Engineer - Data SQL Engineer
Corporate Title: AVP
Location: Pune, India

Role Description

As a SQL Engineer, you would be responsible for the design, development, and optimization of complex database systems. You would write efficient SQL queries and stored procedures, and possess expertise in data modeling, performance optimization, and working with large-scale relational databases.

What we'll offer you
• 100% reimbursement under the childcare assistance benefit (gender neutral).
• Sponsorship for industry-relevant certifications and education.
• Accident and term life insurance.

Your key responsibilities
• Design, develop, and optimize complex SQL queries, stored procedures, views, and functions.
• Work with large datasets to perform data extraction, transformation, and loading (ETL).
• Develop and maintain scalable database schemas and models.
• Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns.
• Maintain data security and compliance with data governance policy.

Your skills and experience
• 10+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL.
• Strong working experience with PL/SQL and T-SQL.
• Strong understanding of data modelling, normalization, and relational DB design.

Desirable skills that will help you excel
• Ability to write high-performance, heavily resilient queries in Oracle / PostgreSQL / MSSQL.
• Working knowledge of database modelling techniques like star schema, fact-dimension models, and data vault.
• Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles, etc.
• Hands-on experience with ETL tools: Pentaho, Informatica, StreamSets.
• Good experience in performance tuning, query optimization, and indexing.
• Hands-on experience with object storage and scheduling tools.
• Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms.
• Experience in GCP, cloud database migration experience, hands-on with Postgres.

About us and our teams

Please visit our company website for further information: https://www.db.com/company/company.htm

We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a PySpark Data Engineer, you must have a minimum of 2 years of experience in PySpark. Strong programming skills in Python, PySpark, and Scala are preferred. It is essential to have experience in designing and implementing CI/CD, build management, and development strategies. Additionally, familiarity with SQL and SQL analytical functions is required, along with participation in key business, architectural, and technical decisions. There is an opportunity for training in AWS cloud technology.

In the role of a Python Developer, a minimum of 2 years of experience in Python/PySpark is necessary. Strong programming skills in Python, PySpark, and Scala are preferred, along with experience in designing and implementing CI/CD, build management, and development strategies, familiarity with SQL and SQL analytical functions, and participation in key business, architectural, and technical decisions. There is a potential for training in AWS cloud technology.

As a Senior Software Engineer at Capgemini, you should have over 3 years of experience in Scala with a strong project track record. Hands-on experience in Scala/Spark development and SQL writing skills on RDBMS (DB2) databases are crucial. Experience working with different file formats like JSON, Parquet, AVRO, ORC, and XML is preferred, as is previous involvement in an HDFS platform development project. Proficiency in data analysis, data profiling, and data lineage, along with strong oral and written communication skills, is required. Experience in Agile projects is a plus.

For the position of Data Modeler, expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling is essential, as is knowledge of data warehousing concepts such as star schema, snowflake schema, or data vault for data marts or data warehousing. Proficiency in using data modeling software like Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models is necessary. Hands-on knowledge of and experience with tools like PL/SQL, PySpark, Hive, Impala, and other scripting tools is preferred, along with experience with the software development lifecycle using Agile methodology and strong communication and stakeholder management skills.

In this role, you will design, develop, and optimize PL/SQL procedures, functions, triggers, and packages. You will write efficient SQL queries, joins, and subqueries for data retrieval and manipulation, and develop and maintain database objects such as tables, views, indexes, and sequences. Optimizing query performance and troubleshooting database issues to improve efficiency are key responsibilities, along with collaborating with application developers, business analysts, and system architects to understand database requirements. You will ensure data integrity, consistency, and security within Oracle databases, develop ETL processes and scripts for data migration and integration, document database structures, stored procedures, and coding best practices, and stay up to date with Oracle database technologies, best practices, and industry trends.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

27 - 35 Lacs

Chennai

Work from Office

• Experience in data modeling for OLTP, OLAP, star schema, and snowflake schema.
• SAP HANA Studio, SAP XSC/XSA, SAP SLT, SAP Data Services.
• SAP Business Objects Data Services, SSIS, Power BI, Business Objects.
• Design and develop data models using SAP XSC/XSA.

Posted 4 weeks ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Work from Office

Must have: data modeling techniques (star schema), data analysis, and requirements-gathering experience.

Posted 4 weeks ago

Apply

8.0 - 13.0 years

16 - 22 Lacs

Pune

Hybrid

Data Architect / Modeler

About VOIS

VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain, and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

VOIS India

In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore, and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations, and more.

Location: Pune
Working Persona: Hybrid
Experience: 9 to 14 years

Must-Have Skills:
• Proficiency in Data Vault 2.0 and dimensional modeling.
• Strong experience with data profiling and source-to-target mapping.
• Expertise in working with GCP BigQuery environments.
• Solid understanding of data governance principles.
• Strong hands-on SQL.

Good-to-Have Skills:
• Experience with the telecom domain and systems: CRM, billing, mediation, provisioning.
• Performance tuning in BigQuery: partitioning, clustering, cost optimization.
• Telecom analytics use cases: churn prediction, fraud detection, network optimization.
• Working knowledge of PowerDesigner / Erwin (or similar tools) for data model creation and maintenance.

Role purpose:

The Data & Analytics (D&A) team leads the design and delivery of the overall UK Data Strategy on behalf of Vodafone UK. The team is the driving force in maximising incremental value from data through the design and delivery of 'Valuable', 'Accessible', and 'Trusted' data. Now is a really exciting time to join the team as we begin our data transformation journey to implement this agreed data strategy working across all of Vodafone UK.

DATA MODELLING & ARCHITECTURE:

Working as part of a team, you will help to define, develop, and maintain the data architecture blueprints for implementing TRUSTED DATA. You will be responsible for defining, developing, and maintaining data models for the GCP single physical layer, data models, semantic layer, and data products. Working with multiple IT partners and domain architects, you will understand data in Vodafone IT systems and design, map, and model GCP data integration, enabling value through business-leading GCP capabilities, fit for the business, focusing on a "create once, use many times" TRUSTED capability approach. Working with the Data Innovation Lead, you will ensure all data products have the required capabilities, and that all attributes currently produced in multiple locations across disparate reports are created within the data models/semantic layer as a core entity capability, reusable for all downstream capability.

Data Architects, Data Governance, and Product Owner teams will work together to document tech debt and historical workarounds which require re-engineering in the data warehouse or source systems to enable better, trusted data in the data models to support data products, which will accordingly influence the data improvements roadmap.

Core competencies, knowledge and experience:
1. Excellent technical, data, and architectural understanding, and the ability to work with a wide variety of disciplines/platforms: a strong working knowledge of data warehousing concepts, and the ability to technically understand core IT transactional and CRM platforms and interpret source-system-generated data into actionable warehouse data through to business data in data products.
2. Demonstrable knowledge and experience in creating and maintaining data models in data and analytics environments.
3. Ability to engage and work with stakeholders across locations and business teams to shape the delivery of valuable outcomes.
4. Experience of working in Agile and creating value for the business through cloud-based data architecture solutions.

Key accountabilities and decision ownership:
1. Data modelling: defining, developing, maintaining, and socialising well-documented and accessible data models, information models, and semantic models which are fit for purpose to enable data capabilities for TRUSTED DATA.
2. Supporting excellence across D&A by championing the implementation of the Data Strategy TRUSTED blueprints, policies, and frameworks.
3. Supporting the implementation and ongoing maintenance of roadmaps and opportunities for new data or data improvement.
4. Supporting Data by Design: providing input on data inadequacies, modelling constraints, and missing IT attributes/capabilities affecting TRUSTED DATA.

VOIS Equal Opportunity Employer Commitment

VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills!

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Noida

Work from Office

Must have:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related discipline.
• 3-5+ years of experience in SQL development and data engineering.
• Strong hands-on skills in T-SQL, including complex joins, indexing strategies, and query optimization.
• Proven experience in Power BI development, including building dashboards, writing DAX expressions, and using Power Query.

Should have:
• At least 1+ year of hands-on experience with one or more components of the Azure Data Platform: Azure Data Factory (ADF), Azure Databricks, Azure SQL Database, Azure Synapse Analytics.
• Solid understanding of data warehouse architecture, including star and snowflake schemas, and data lake design principles.
• Familiarity with Data Lake and Delta Lake concepts, Lakehouse architecture, and data governance, data lineage, and security controls within Azure.

Posted 1 month ago

Apply

3.0 - 6.0 years

8 - 12 Lacs

Noida, Hyderabad, Gurugram

Work from Office

Mandatory Skills:
• Power BI Desktop & Service (report development, publishing, workspace management)
• Advanced DAX (complex calculations and measures)
• Power Query (M language) for data transformation and cleansing
• Data modeling (star schema, snowflake schema, relationships)
• Row-Level Security (RLS) and role-based access control
• Integration with multiple data sources (SQL, Azure Data Lake, Excel, APIs)
• Performance optimization (query folding, refresh schedules, rendering efficiency)
• ETL collaboration (working with data engineering teams and pipelines)
• Governance & lifecycle management (version control, audit trails)
• Migration of legacy reports to Power BI
• Requirement gathering & stakeholder collaboration
• Data quality & governance alignment

Detailed JD:
• Lead the design, development, and implementation of Power BI dashboards and reports to provide actionable insights for business stakeholders.
• Collaborate with business users, data analysts, and other stakeholders to gather reporting requirements and translate them into comprehensive data visualization solutions.
• Develop and maintain complex DAX calculations, measures, and Power BI data models to support efficient data analysis and reporting.
• Ensure data accuracy and consistency by designing robust data models, integrating multiple data sources such as SQL databases, Azure Data Lake, Excel, and other APIs.
• Optimize Power BI performance, including query efficiency, data refresh schedules, and report rendering, ensuring minimal delays in accessing real-time data.
• Implement Power BI security roles, row-level security (RLS), and access controls to ensure secure and role-based data access for users.
• Guide the deployment of Power BI reports and dashboards using Power BI Service, integrating with data pipelines and automating workflows as necessary.
• Provide leadership and mentorship to junior Power BI developers, offering technical guidance and promoting best practices in report design, data modeling, and performance optimization.
• Work closely with data engineering teams to integrate Power BI with ETL processes, data warehouses, and cloud environments such as Azure or AWS.
• Implement Power BI governance, including version control, report lifecycle management, and audit trails, ensuring consistency and compliance across reporting environments.
• Lead the migration of legacy reporting solutions to Power BI, ensuring a smooth transition with minimal business disruption.
• Collaborate with data governance teams to ensure that Power BI reports align with data quality standards, business glossaries, and metadata management practices.
• Utilize Power Query to transform and clean data before loading it into Power BI, ensuring readiness for analysis and visualization.
• Lead data storytelling efforts by creating compelling visuals, including interactive dashboards, KPIs, and drill-down capabilities that help stakeholders make informed decisions.
• Stay updated on the latest Power BI features, updates, and best practices, incorporating new functionalities into existing solutions to enhance reporting capabilities.
• Provide end-user training and support to ensure stakeholders can effectively use Power BI reports and dashboards for self-service analytics.
• Oversee the integration of Power BI with other tools like Power Automate and Power Apps to create a seamless data and reporting ecosystem.

Drop your resume at Aarushi.Shukla@coforge.com

Posted 1 month ago

Apply

6.0 - 8.0 years

4 - 8 Lacs

Noida

Work from Office

We are looking for a skilled Senior Data Warehouse Analyst to join our team at Apptad Technologies Pvt Ltd. The ideal candidate will have 6 to 8 years of experience in data analysis and management, with expertise in working with large datasets.

Roles and Responsibilities:
• Design, develop, and implement data warehouse solutions to meet business requirements.
• Analyze complex data sets to identify trends, patterns, and insights.
• Develop and maintain databases, data models, and ETL processes.
• Collaborate with cross-functional teams to integrate data from various sources.
• Ensure data quality, integrity, and security.
• Optimize database performance and troubleshoot issues.

Job Requirements:
• Strong knowledge of data warehousing concepts, including star schema design.
• Experience with relational databases such as Oracle or SQL Server.
• Proficiency in programming languages like Python or R.
• Excellent analytical and problem-solving skills.
• Ability to work independently and collaboratively as part of a team.
• Strong communication and interpersonal skills.

Ref: 6566417

Posted 1 month ago

Apply

7.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Role Description

The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
• Planning and developing entire engineering solutions to accomplish business goals.
• Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle.
• Ensuring maintainability and reusability of engineering solutions.
• Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow.
• Reviewing engineering plans and quality to drive re-use and improve engineering capability.
• Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the bank.

Your Role - What You'll Do

As a SQL Engineer, you would be responsible for the design, development, and optimization of complex database systems. You would write efficient SQL queries and stored procedures, and possess expertise in data modeling, performance optimization, and working with large-scale relational databases.

Key Responsibilities:
• Design, develop, and optimize complex SQL queries, stored procedures, views, and functions.
• Work with large datasets to perform data extraction, transformation, and loading (ETL).
• Develop and maintain scalable database schemas and models.
• Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns.
• Maintain data security and compliance with data governance policy.

Skills You'll Need

Must have:
• 8+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL.
• Strong working experience with PL/SQL and T-SQL.
• Strong understanding of data modelling, normalization, and relational DB design.

Desirable skills that will help you excel:
• Ability to write high-performance, heavily resilient queries in Oracle / PostgreSQL / MSSQL.
• Working knowledge of database modelling techniques like star schema, fact-dimension models, and data vault.
• Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles, etc. (a partitioning/indexing sketch follows below).
• Hands-on experience with ETL tools: Pentaho, Informatica, StreamSets.
• Good experience in performance tuning, query optimization, and indexing.
• Hands-on experience with object storage and scheduling tools.
• Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms.

Educational Qualifications
• Bachelor's degree in Computer Science/Engineering or a relevant technology and science field.
• Technology certifications from any industry-leading cloud provider.
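
The partitioning and indexing methods listed under desirable skills can be sketched in PostgreSQL syntax as follows; all table and column names are illustrative.

    -- Range-partition a large table by date so old data stays out of hot scans.
    CREATE TABLE trade_events (
        trade_id   BIGINT,
        trade_date DATE NOT NULL,
        amount     NUMERIC(18,2)
    ) PARTITION BY RANGE (trade_date);

    CREATE TABLE trade_events_2024 PARTITION OF trade_events
        FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

    -- Index the partition on the usual lookup column...
    CREATE INDEX idx_trade_events_2024_id ON trade_events_2024 (trade_id);

    -- ...and verify the planner prunes partitions and uses the index.
    EXPLAIN SELECT * FROM trade_events
    WHERE trade_date >= DATE '2024-03-01' AND trade_id = 42;

The same idea carries over to Oracle and SQL Server with their own partitioning syntax; AWR reports (Oracle) are then used to confirm the tuning actually reduced load.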

Posted 1 month ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #AZURE, #Datafactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities:
• Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
• Build and maintain data integration workflows from various data sources to Snowflake.
• Write efficient and optimized SQL queries for data extraction and transformation.
• Work with stakeholders to understand business requirements and translate them into technical solutions.
• Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
• Maintain and enforce data quality, governance, and documentation standards.
• Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
• Strong experience with Azure cloud platform services.
• Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
• Proficiency in SQL for data analysis and transformation.
• Hands-on experience with Snowflake and SnowSQL for data warehousing.
• Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
• Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
• Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
• Familiarity with Python or PySpark for custom data transformations.
• Understanding of CI/CD pipelines and DevOps for data workflows.
• Exposure to data governance, metadata management, or data catalog tools.
• Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
• 5+ years of experience in data engineering roles using Azure and Snowflake.
• Strong problem-solving, communication, and collaboration skills.

Posted 1 month ago

Apply