
148 Star Schema Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 9.0 years

8 - 11 Lacs

Navi Mumbai

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab
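The Kimball-style star schema this listing asks for can be sketched in a few lines of SQL: one fact table keyed to dimension tables by surrogate keys, queried with a join-and-aggregate. The tables and data below are hypothetical, and `sqlite3` stands in for BigQuery:

```python
import sqlite3

# Minimal star schema: a sales fact table surrounded by product and date dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units       INTEGER,
    revenue     REAL
);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, 20240101, 3, 30.0), (2, 20240101, 1, 25.0), (1, 20240102, 2, 20.0)])

# The typical star-schema query: aggregate facts, slicing by dimension attributes.
rows = conn.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key = f.date_key
    GROUP BY p.category, d.year
""").fetchall()
print(rows)  # [('Hardware', 2024, 75.0)]
```

The fact table stays narrow (keys and measures only); descriptive attributes live in the dimensions, which is what keeps this shape fast to aggregate.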

Posted 2 weeks ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad, Gachibowli

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Mumbai

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad, Hitech City

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Mumbai Suburban

Work from Office

- Data Processing: BigQuery, Apache Spark, Hadoop, Dataflow
- BI Tools: Tableau, Power BI, Looker
- Languages: Python, SQL, Java, Scala
- ETL Tools: Apache NiFi, Talend, Informatica, Dataform
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Data Modeling: Kimball, Star Schema, Snowflake Schema
- Version Control: Git, GitLab

Posted 2 weeks ago

Apply

11.0 - 20.0 years

30 - 45 Lacs

Pune, Chennai, Bengaluru

Work from Office

Data architecture experience in Data Warehouse and Snowflake (Snowflake + DBT; Snowflake advanced certification). Oversees and designs the information architecture for the data warehouse, including all information structures (staging area, data warehouse, data marts, operational data stores); oversees standardization of data definitions and the development of physical and logical modelling. Deep understanding of Data Warehousing, Enterprise Architectures, Dimensional Modelling, Star and Snowflake schema design, Reference DW Architectures, ETL architecture, ETL (Extract, Transform, Load), Data Analysis, Data Conversion and Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support.
Significant experience working as a Data Architect with depth in data integration and data architecture for Enterprise Data Warehouse implementations (conceptual, logical, physical, and dimensional models). Maintains in-depth and current knowledge of cloud architecture, data lakes, data warehouses, BI platforms, analytics platforms, and ETL tools. Please share your resume at parul@mounttalent.com

Posted 2 weeks ago

Apply

10.0 - 15.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About the Role
We are looking for a Data Warehouse Engineer with strong expertise across the Azure Data Platform to design, build, and maintain modern data warehouse and analytics solutions. This role requires hands-on experience with Azure Synapse Analytics, Data Factory, Data Lake, Azure Analysis Services, and Power BI. The ideal candidate will ensure seamless data ingestion, storage, transformation, analysis, and visualization, enabling the business to make data-driven decisions.

Key Responsibilities
- Data Ingestion & Orchestration: 10-15 years of experience in designing and building scalable ingestion pipelines using Azure Data Factory. Integrate data from multiple sources (SQL Server, relational databases, Azure SQL DB, Cosmos DB, Table Storage). Manage batch and real-time ingestion into Azure Data Lake Storage.
- Data Storage & Modelling: Develop and optimize data warehouse solutions in Azure Synapse Analytics. Implement robust ETL/ELT processes to ensure data quality and consistency. Create data models for analytical and reporting needs.
- Data Analysis & Security: Build semantic data models using Azure Analysis Services for enterprise reporting. Collaborate with BI teams to deliver well-structured datasets for reporting in Power BI. Implement Azure Active Directory for authentication, access control, and security best practices.
- Visualization & Business Support: Support business teams in building insightful Power BI dashboards and reports. Translate business requirements into scalable and optimized BI solutions. Provide data-driven insights in a clear, business-friendly manner.
- Optimization & Governance: Monitor system performance and optimize pipelines for efficiency and cost control. Establish standards for data governance, data quality, and metadata management.

Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- Proven experience as a Data Warehouse Engineer / Data Engineer with strong expertise in Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage, Azure Analysis Services, Azure SQL Database / SQL Server, and Power BI (reporting & dashboarding)
- Strong proficiency in SQL and data modelling (star schema, snowflake schema, dimensional modelling)
- Knowledge of Azure Active Directory for authentication & role-based access control
- Excellent problem-solving skills and the ability to optimize large-scale data solutions
- Strong communication skills to collaborate effectively with both technical and business stakeholders
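The "ETL/ELT processes to ensure data quality and consistency" responsibility above usually boils down to validation rules plus de-duplication before load. A minimal sketch, with hypothetical field names and rules:

```python
# Hypothetical rows pulled from a source system; names and rules are illustrative.
rows = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 1, "amount": 100.0},   # duplicate delivery of the same record
    {"order_id": 2, "amount": None},    # fails the completeness rule
    {"order_id": 3, "amount": 40.0},
]

def validate_and_dedupe(rows):
    """Reject rows failing basic quality rules, then de-duplicate on the business key."""
    seen, clean, rejects = set(), [], []
    for row in rows:
        if row["amount"] is None:          # quality rule: amount must be present
            rejects.append(row)
        elif row["order_id"] not in seen:  # de-duplicate on the business key
            seen.add(row["order_id"])
            clean.append(row)
    return clean, rejects

clean, rejects = validate_and_dedupe(rows)
print(len(clean), len(rejects))  # 2 1
```

In an Azure Data Factory or Synapse pipeline the same pattern would run as a data-flow or SQL step, with rejects landed in a quarantine table for review rather than silently dropped.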

Posted 2 weeks ago

Apply

4.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Project description
Luxoft, a DXC Technology Company, is an established company focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment we are searching for a Tableau and SQL Developer.

Responsibilities
You will work on CISO multi-functional Tableau reports that interface with several applications, reporting to a senior developer who has been working on the project for a few years. Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs is required for this role, as are strong Tableau Desktop development skills.

Skills (must have)
- Proficiency in writing complex SQL queries involving multiple joins (inner, outer, cross), subqueries, and CTEs
- Expertise in developing SQL stored procedures, functions & views
- Experience in developing ETL processes to extract and transform data using SQL (aggregations, filtering, data cleansing) and load data into a database
- Familiarity with Microsoft SQL Server
- Familiarity with data modeling concepts (star schema)
- Tableau Desktop development skills: expertise in developing complex, interactive dashboards incorporating filters, parameters, and actions; experience building user-friendly dashboards with menus, tooltips, drill-down and drill-through capabilities; ability to create calculated fields, custom aggregations, table calculations and LOD expressions; knowledge of optimizing Tableau dashboards for performance; understanding of user access and user group creation and management in Tableau

Nice to have
- Insurance domain experience
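The "complex SQL with joins, subqueries, and CTEs" requirement can be illustrated in a few lines. The tables and data are hypothetical, and `sqlite3` stands in for Microsoft SQL Server; the CTE pre-aggregates claims per policy, and the LEFT OUTER JOIN keeps policies that have no claims:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims   (claim_id INTEGER, policy_id INTEGER, amount REAL);
CREATE TABLE policies (policy_id INTEGER, region TEXT);
INSERT INTO claims VALUES (1, 10, 500.0), (2, 10, 300.0), (3, 11, 900.0);
INSERT INTO policies VALUES (10, 'East'), (11, 'West'), (12, 'East');
""")

# CTE aggregates claims per policy; LEFT JOIN preserves claim-free policies.
rows = conn.execute("""
    WITH claim_totals AS (
        SELECT policy_id, SUM(amount) AS total
        FROM claims
        GROUP BY policy_id
    )
    SELECT p.policy_id, p.region, COALESCE(c.total, 0.0) AS total
    FROM policies p
    LEFT JOIN claim_totals c ON c.policy_id = p.policy_id
    ORDER BY p.policy_id
""").fetchall()
print(rows)  # [(10, 'East', 800.0), (11, 'West', 900.0), (12, 'East', 0.0)]
```

The same query shape, published as a custom SQL data source, is a common feed for a Tableau dashboard.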

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role
As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
- Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
- Leverage your expertise in data warehousing concepts such as Star Schema, Snowflake Schema, and Data Vault to architect and optimize data marts and enterprise data warehouses.
- Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
- Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
- Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
- Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile
- Good knowledge of and expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience in data warehousing concepts, including Star schema, Snowflake schema, or Data Vault, for data marts or data warehousing.
- Experience using data modeling software like Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
- Experience working in a challenging, fast-paced environment.
- Expertise in conceptual modelling; ability to see the big picture and envision possible solutions.
- Excellent communication and stakeholder management skills.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, and partner coverage or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
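Data Vault, named alongside star and snowflake schemas above, models data as hubs (business keys), links (relationships), and satellites (history-keeping attributes). A toy sketch of a hub plus satellite, with hypothetical table and column names and `sqlite3` as the engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    customer_hk TEXT PRIMARY KEY,   -- hash key derived from the business key
    customer_id TEXT,               -- the business key itself
    load_date   TEXT,
    record_src  TEXT
);
CREATE TABLE sat_customer_details (
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    load_date   TEXT,
    name        TEXT,
    city        TEXT,
    PRIMARY KEY (customer_hk, load_date)  -- full history kept, one row per load
);
""")
conn.execute("INSERT INTO hub_customer VALUES ('hk1', 'C-001', '2024-01-01', 'CRM')")
conn.execute("INSERT INTO sat_customer_details VALUES ('hk1', '2024-01-01', 'Asha', 'Pune')")
conn.execute("INSERT INTO sat_customer_details VALUES ('hk1', '2024-02-01', 'Asha', 'Mumbai')")

# Current view: the latest satellite row per hub key.
row = conn.execute("""
    SELECT name, city FROM sat_customer_details
    WHERE customer_hk = 'hk1' ORDER BY load_date DESC LIMIT 1
""").fetchone()
print(row)  # ('Asha', 'Mumbai')
```

Unlike a star schema, nothing is overwritten: every load appends satellite rows, which is why Data Vault is favored for auditable, multi-source warehouses.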

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description:
As a Senior Software Engineer - BI & Visualization - Power BI at Incedo, you will be responsible for designing and developing business intelligence (BI) dashboards and visualizations to support business decision-making. You will work with business analysts and data architects to understand business requirements and translate them into technical solutions. You will be skilled in BI tools such as Tableau or Power BI and have experience in database management systems such as Oracle or SQL Server.

Roles & Responsibilities:
- Designing and developing business intelligence (BI) and visualization solutions using tools like Power BI
- Creating and maintaining data pipelines and ETL processes
- Collaborating with other teams to ensure the consistency and integrity of data
- Providing guidance and mentorship to junior software engineers
- Troubleshooting and resolving BI and visualization platform issues

Technical Skills Requirements:
- Proficiency in data visualization tools such as Power BI
- Knowledge of database technologies such as SQL Server, Oracle, or MySQL
- Understanding of data modeling and data warehouse concepts such as star schema, snowflake schema, or data vault
- Familiarity with ETL tools and techniques such as Talend, Informatica, or SSIS
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision
- Ability to provide leadership, guidance, and support to team members, ensuring successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 10 Lacs

Gurugram

Work from Office

Role Description
As a Software Engineer - BI & Visualization - Power BI at Incedo, you will be responsible for designing and developing business intelligence (BI) dashboards and visualizations to support business decision-making. You will work with business analysts and data architects to understand business requirements and translate them into technical solutions. You will be skilled in BI tools such as Tableau or Power BI and have experience in database management systems such as Oracle or SQL Server.

Roles & Responsibilities:
- Designing and developing business intelligence (BI) and visualization solutions using tools like Power BI
- Creating and maintaining data pipelines and ETL processes
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving BI and visualization platform issues

Technical Skills Requirements:
- Proficiency in data visualization tools such as Power BI
- Knowledge of database technologies such as SQL Server, Oracle, or MySQL
- Understanding of data modeling and data warehouse concepts such as star schema, snowflake schema, or data vault
- Familiarity with ETL tools and techniques such as Talend, Informatica, or SSIS
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision

Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 3 weeks ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Bengaluru

Remote

Role Overview:
We are looking for a skilled Snowflake Developer with strong experience in cloud data warehousing, data modeling, and performance optimization. The ideal candidate will have hands-on expertise in Snowflake, SQL, and ETL/ELT pipelines, and will work closely with data engineers, analysts, and business stakeholders to deliver scalable data solutions.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake
- Implement data models, schemas, and transformations for analytics and reporting
- Optimize Snowflake performance through clustering, partitioning, and query tuning
- Integrate Snowflake with various data sources using ETL/ELT tools (e.g., Informatica, Talend, dbt, Apache Airflow)
- Ensure data quality, security, and governance across the platform
- Collaborate with cross-functional teams to understand data requirements and deliver solutions
- Develop and maintain stored procedures, views, and user-defined functions in Snowflake
- Monitor and troubleshoot data pipeline issues and performance bottlenecks

Required Skills:
- 5+ years of experience in data engineering or BI development, with at least 3 years in Snowflake
- Strong proficiency in SQL, Snowflake scripting, and data modeling
- Experience with cloud platforms (AWS, Azure, GCP) and data integration tools
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines
- Understanding of data governance, security, and compliance best practices
- Excellent problem-solving and communication skills

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- Snowflake certifications (e.g., SnowPro Core or Advanced)
- Experience with Python, Spark, or Kafka is a plus
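A core habit behind the "maintain scalable data pipelines" responsibility is making loads idempotent, so a re-delivered batch never creates duplicates. In Snowflake that is typically a `MERGE`; the sketch below uses hypothetical data and sqlite's `ON CONFLICT` upsert as a stand-in:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

def load_batch(conn, batch):
    """Upsert a batch: insert new keys, update existing ones (MERGE analogue)."""
    conn.executemany("""
        INSERT INTO customers (id, name, city) VALUES (?, ?, ?)
        ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city
    """, batch)

load_batch(conn, [(1, "Asha", "Pune"), (2, "Ravi", "Delhi")])
# The batch is re-delivered with one changed row: no duplicates, row 2 updated.
load_batch(conn, [(1, "Asha", "Pune"), (2, "Ravi", "Mumbai")])
rows = conn.execute("SELECT * FROM customers ORDER BY id").fetchall()
print(rows)  # [(1, 'Asha', 'Pune'), (2, 'Ravi', 'Mumbai')]
```

Because running the load twice yields the same final state, retries after a pipeline failure are safe by construction.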

Posted 3 weeks ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

Pune

Work from Office

Mandatory skills:
• Min 7+ years of hands-on experience in creating T-SQL functions, stored procedures, views, triggers, complex querying techniques, query optimization/performance tuning techniques, indexes, user-defined types, and constructing dynamic queries (all are mandatory)
• Min 3+ years of experience in ETL concepts and any one ETL tool
• Flexibility to work in any RDBMS technology, such as Microsoft or Oracle
• Should be well versed in RDBMS constructs and the CAP theorem
• Should be able to explain the redo and rollback functionalities in an RDBMS
• Should be able to explain performance-by-design constructs for different types of data stores, such as OLTP and OLAP
• Should be aware of the Bill Inmon vs. Ralph Kimball schools of thought in data warehouse design and be able to articulate the pros and cons of each design principle
• Should be able to understand and articulate multiple data modelling design principles, such as the canonical data model, ER model, star schema, data lake, etc.
Good to Have:
• Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, and PostgreSQL, is highly desirable
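"Constructing dynamic queries" safely means identifiers come from a whitelist and values always go through bind parameters, never string interpolation. A minimal sketch with hypothetical table and column names, using `sqlite3` in place of SQL Server:

```python
import sqlite3

ALLOWED_SORT = {"name", "salary"}  # whitelist of sortable columns (illustrative)

def fetch_employees(conn, min_salary, sort_by="name"):
    """Build a query with a dynamic ORDER BY column, safely."""
    if sort_by not in ALLOWED_SORT:          # never interpolate unchecked identifiers
        raise ValueError(f"bad sort column: {sort_by}")
    sql = f"SELECT name, salary FROM employees WHERE salary >= ? ORDER BY {sort_by}"
    return conn.execute(sql, (min_salary,)).fetchall()  # values go via bind parameters

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary REAL)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("Asha", 90000), ("Ravi", 60000), ("Meena", 75000)])
print(fetch_employees(conn, 70000, "salary"))  # [('Meena', 75000.0), ('Asha', 90000.0)]
```

In T-SQL the same discipline applies: dynamic SQL via `sp_executesql` with parameters, with any dynamic identifier validated or quoted with `QUOTENAME`.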

Posted 3 weeks ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

Coimbatore

Work from Office

Mandatory skills:
• Min 7+ years of hands-on experience in creating T-SQL functions, stored procedures, views, triggers, complex querying techniques, query optimization/performance tuning techniques, indexes, user-defined types, and constructing dynamic queries (all are mandatory)
• Min 3+ years of experience in ETL concepts and any one ETL tool
• Flexibility to work in any RDBMS technology, such as Microsoft or Oracle
• Should be well versed in RDBMS constructs and the CAP theorem
• Should be able to explain the redo and rollback functionalities in an RDBMS
• Should be able to explain performance-by-design constructs for different types of data stores, such as OLTP and OLAP
• Should be aware of the Bill Inmon vs. Ralph Kimball schools of thought in data warehouse design and be able to articulate the pros and cons of each design principle
• Should be able to understand and articulate multiple data modelling design principles, such as the canonical data model, ER model, star schema, data lake, etc.
Good to Have:
• Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, and PostgreSQL, is highly desirable

Posted 3 weeks ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Mandatory skills:
• Min 7+ years of hands-on experience in creating T-SQL functions, stored procedures, views, triggers, complex querying techniques, query optimization/performance tuning techniques, indexes, user-defined types, and constructing dynamic queries (all are mandatory)
• Min 3+ years of experience in ETL concepts and any one ETL tool
• Flexibility to work in any RDBMS technology, such as Microsoft or Oracle
• Should be well versed in RDBMS constructs and the CAP theorem
• Should be able to explain the redo and rollback functionalities in an RDBMS
• Should be able to explain performance-by-design constructs for different types of data stores, such as OLTP and OLAP
• Should be aware of the Bill Inmon vs. Ralph Kimball schools of thought in data warehouse design and be able to articulate the pros and cons of each design principle
• Should be able to understand and articulate multiple data modelling design principles, such as the canonical data model, ER model, star schema, data lake, etc.
Good to Have:
• Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, and PostgreSQL, is highly desirable

Posted 3 weeks ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Chennai, Ambattur

Work from Office

We are looking for a Developer / Senior Developer - BI.

Requirements:
We are seeking an experienced and highly skilled Microsoft data professional with a strong track record in the design, development, implementation, and maintenance of data warehouses and data marts, including robust ETL processes. The candidate should have hands-on expertise in Azure Data Factory, Power BI, and Microsoft cubes, along with a deep understanding of data modeling and performance optimization.

Responsibilities:
- Lead end-to-end implementation of ETL processes using Azure Data Factory
- Design and optimize complex ETL workflows, including performance tuning and error handling
- Collaborate with business stakeholders to gather requirements and translate them into technical designs, test plans, and project documentation
- Develop and maintain Power BI reports and Microsoft cubes for data visualization and analytics
- Apply strong expertise in data modeling, including Star Schema and Snowflake Schema designs

Required Skills:
- Proficient in Azure Data Factory, Power BI, and Microsoft cube development
- Hands-on experience in database modeling and data warehouse architecture
- Strong understanding of ETL best practices, data integration, and transformation logic

Nice-to-Have Skills:
- Experience with Automic for job scheduling and automation
- Familiarity with Microsoft Fabric is a plus

Other:
- Ability to work in a fast-changing environment
- Ability to keep a technology watch on cutting-edge tools
- Proven ability to work in a team environment
- Strong organizational and communication skills
- Language skills: good knowledge of spoken and written English

Posted 3 weeks ago

Apply

10.0 - 14.0 years

25 - 37 Lacs

Pune, Bengaluru, Mumbai (all areas)

Hybrid

Role & responsibilities

Job Description Summary
The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role will be involved in design, specifications, troubleshooting, and issue resolution. The ability to communicate with both technical and non-technical audiences is key.

Business Title: Associate Technical Architect
Years of Experience: 10+ years

Must-have skills:
1. Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.)
2. Data warehouse (one or more of BigQuery, Snowflake, etc.)
3. ETL tooling (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.)
4. Experience in cloud platforms - GCP
5. Python, PySpark, project & resource management
6. SVN, JIRA, automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli, or similar)

Good-to-have skills:
1. UNIX shell scripting, Snowflake, Redshift; familiarity with NoSQL such as MongoDB, etc.
2. ETL tooling (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory (ADF) / DBT / Talend / Informatica / IICS (Informatica Cloud))
3. Experience in cloud platforms - AWS / Azure
4. Client-facing skills

Job Description
The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing day-to-day delivery. The role focuses on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice, and involves interactions with internal stakeholders and clients to explain technology solutions and to build a clear understanding of clients' business requirements through which to guide optimal design to meet their needs.

Key responsibilities:
• Design simple to medium data solutions for clients using cloud architecture on GCP
• Strong understanding of DW, data marts, data modelling, data structures, databases, and data ingestion and transformation
• Working knowledge of ETL processes as well as database skills
• Strong understanding of relational and non-relational databases and when to use them
• Leadership and communication skills to collaborate with local leadership as well as our global teams
• Translate technical requirements into ETL/SQL application code
• Document project architecture, explain detailed design to the team, and create low-level to high-level designs
• Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages
• Engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions
• Perform mid- to complex-level tasks independently
• Support clients, Data Scientists, and Analytical Consultants working on marketing solutions
• Work with cross-functional internal teams and external clients
• Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members
• Code management, including code review and deployment
• Work closely with the QA/Testing team to help identify and implement defect-reduction initiatives
• Work closely with the Architecture team to make sure architecture standards and principles are followed during development
• Perform proofs of concept on new platforms and validate proposed solutions
• Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures
• Must understand software development methodologies, including waterfall and agile
• Distribute and manage SQL development work across the team
• The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs

Education Qualification: Bachelor's or Master's degree in Computer Science
Certification (must): Snowflake Associate / Core, or a minimum basic-level certification in Azure
Shift timing: GMT (UK shift) - 2 PM to 11 PM

Posted 3 weeks ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Pune

Work from Office

We are seeking a highly skilled and experienced Tableau Developer with 7+ years of hands-on experience in designing, developing, and deploying advanced BI solutions. The candidate will be responsible for transforming complex business requirements into insightful and interactive Tableau dashboards, ensuring data accuracy, performance optimization, and best practices in visualization. The role also requires mentoring junior team members, collaborating with cross-functional stakeholders, and contributing to BI solution architecture.

Key Responsibilities
- Design, develop, and deploy interactive Tableau dashboards, reports, and storyboards to support business decision-making
- Work closely with business stakeholders to gather requirements and translate them into effective BI solutions
- Implement advanced Tableau features including LOD expressions, parameters, sets, and table calculations
- Develop optimized SQL queries and custom SQL for Tableau extracts and live connections
- Ensure dashboard performance optimization (efficient queries, extract tuning, reduced load times)
- Implement and manage Row-Level Security (RLS) and role-based access controls in Tableau Server/Online
- Perform Tableau Server administration tasks: publishing workbooks, scheduling extracts, managing users, monitoring performance
- Collaborate with data engineers and ETL teams to ensure data availability, accuracy, and integrity
- Integrate Tableau with external systems via APIs, TabPy, R, and extensions when required
- Conduct peer reviews, mentoring, and knowledge sharing with junior developers
- Contribute to BI strategy and solution architecture, ensuring scalability and reusability of dashboards
- Document technical solutions, processes, and standards for BI delivery

Required Skills & Qualifications
- 7+ years of experience as a Tableau Developer, with proven expertise in dashboard development and performance tuning
- Strong proficiency in SQL (complex queries, joins, CTEs, window functions, performance optimization)
- Experience in data modeling (star schema, snowflake schema, fact-dimension design)
- Expertise in Tableau Desktop and Tableau Server/Online (installation, configuration, administration)
- Strong understanding of LOD expressions, table calculations, and advanced charting techniques
- Experience with data sources such as Hadoop/Impala
- Knowledge of automation and scripting (Python, Tableau API, REST API, TabCmd)
- Strong analytical skills with the ability to interpret business needs and translate them into BI solutions
- Excellent communication, problem-solving, and stakeholder management skills
- Experience working in Agile/Scrum environments

Good to Have
- Exposure to other BI tools (Power BI, Qlik, Looker)
- Knowledge of cloud data platforms (AWS, Azure, GCP)
- Experience with DevOps CI/CD pipelines for BI deployments
- Familiarity with data governance and security best practices
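The SQL window functions and Tableau LOD expressions listed above solve the same problem: comparing each row to an aggregate computed at a different level of detail. A window-function sketch with a hypothetical table, using `sqlite3`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (category TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("A", 10.0), ("A", 30.0), ("B", 50.0)])

# Each row keeps its own amount alongside its category's average, the SQL
# analogue of a Tableau FIXED LOD expression like {FIXED [Category]: AVG([Amount])}.
rows = conn.execute("""
    SELECT category, amount,
           AVG(amount) OVER (PARTITION BY category) AS category_avg
    FROM sales
    ORDER BY category, amount
""").fetchall()
print(rows)  # [('A', 10.0, 20.0), ('A', 30.0, 20.0), ('B', 50.0, 50.0)]
```

Unlike `GROUP BY`, the window aggregate does not collapse rows, which is exactly what a dashboard needs when plotting row-level marks against a group-level reference line.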

Posted 3 weeks ago

Apply

6.0 - 10.0 years

22 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Curious about the role? What would your typical day look like? As a Senior Data Engineer, you will work to solve some of the organization's data management problems, enabling it to become a data-driven organization. You will seamlessly switch between the roles of Individual Contributor, team member, and Data Modeling lead as each project demands to define, design, and deliver actionable insights. On a typical day, you might:
• Engage with clients and understand the business requirements to translate them into data models.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Contribute to Data Modeling accelerators.
• Create and maintain the Source to Target Data Mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish Data Dictionaries.
• Maintain data models, capture data models from existing databases, and record descriptive information.
• Use a data modeling tool to create appropriate data models.
• Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with the data engineers to strategize ingestion logic and consumption patterns.

Job Requirement

Expertise and Qualifications

What do we expect?
• 6+ years of experience in the Data space.
• Decent SQL skills.
• Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
• Real-time experience working with OLAP and OLTP database models (dimensional models).
• A good understanding of Star schema, Snowflake schema, and Data Vault modeling, as well as of any ETL tool, Data Governance, and Data Quality.
• An eye for analyzing data and comfort with following agile methodology.
• A good understanding of any of the cloud services (Azure, AWS, GCP) is preferred.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
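The star-schema modeling that this role centers on can be sketched in a few lines. The table and column names below (dim_customer, dim_date, fact_sales) are hypothetical, chosen only for illustration, and SQLite stands in for whatever warehouse platform a given project uses:

```python
import sqlite3

# A minimal star schema: one fact table referencing two dimension tables.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Dimension tables hold descriptive attributes (the "points" of the star).
cur.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER)")

# The fact table holds the measures plus a foreign key into each dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id     INTEGER REFERENCES dim_date(date_id),
        amount      REAL
    )""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "West"), (2, "Globex", "East")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(10, 2024, 1), (11, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(100, 1, 10, 250.0), (101, 1, 11, 100.0), (102, 2, 10, 75.0)])

# A typical BI query: aggregate the fact, sliced by dimension attributes.
rows = cur.execute("""
    SELECT c.region, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    JOIN dim_date d     ON d.date_id = f.date_id
    GROUP BY c.region, d.year
    ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 2024, 75.0), ('West', 2024, 350.0)]
```

The shape of the final query is the point: every report joins the central fact table out to whichever dimensions it needs to slice by, which is why the LDM/PDM and source-to-target mapping work described above matters so much.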

Posted 3 weeks ago

Apply

10.0 - 14.0 years

30 - 37 Lacs

hyderabad, chennai, bengaluru

Hybrid

Curious about the role? What would your typical day look like? As an Architect, you will work to solve some of the most complex and captivating data management problems, enabling clients to become data-driven organizations. You will seamlessly switch between the roles of Individual Contributor, team member, and Data Modeling Architect as each project demands to define, design, and deliver actionable insights. On a typical day, you might:
• Engage with clients and understand the business requirements to translate them into data models.
• Analyze customer problems, propose solutions from a data structural perspective, and estimate and deliver the proposed solutions.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Use a data modeling tool to create appropriate data models.
• Create and maintain the Source to Target Data Mapping document, which includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish Data Dictionaries.
• Ideate, design, and guide the teams in building automations and accelerators.
• Maintain data models, capture data models from existing databases, and record descriptive information.
• Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with the data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
• Ideate on the design and development of the next-gen data platform by collaborating with cross-functional stakeholders.
• Work with the client to define, establish, and implement the right modeling approach as per the requirement.
• Help define standards and best practices.
• Monitor project progress to keep leadership teams informed on milestones, impediments, etc.
• Coach team members and review code artifacts.
• Contribute to proposals and RFPs.

Job Requirement

What do we expect?
• 10+ years of experience in the Data space.
• Decent SQL knowledge.
• Ability to suggest modeling approaches for a given problem.
• Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
• Real-time experience working with OLAP and OLTP database models (dimensional models).
• A comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, as well as of any ETL tool, Data Governance, and Data Quality.
• An eye for analyzing data and comfort with following agile methodology.
• A good understanding of any of the cloud services (Azure, AWS, GCP) is preferred.
• Enthusiasm for coaching team members, collaborating with various stakeholders across the organization, and taking complete ownership of deliverables.
• Experience in contributing to proposals and RFPs.
• Good experience in stakeholder management.
• Decent communication skills and experience in leading a team.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
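The snowflake schema named in the requirements above differs from a star schema in one respect: dimensions are normalized into sub-dimensions, at the cost of an extra join per normalization level. A small sketch, using invented table names (dim_product, dim_category) and SQLite purely for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# In a snowflake schema a dimension is normalized into sub-dimensions:
# dim_product stores only a key into dim_category, not the category name itself.
cur.execute("CREATE TABLE dim_category (category_id INTEGER PRIMARY KEY, category TEXT)")
cur.execute("""
    CREATE TABLE dim_product (
        product_id  INTEGER PRIMARY KEY,
        name        TEXT,
        category_id INTEGER REFERENCES dim_category(category_id)
    )""")
cur.execute("CREATE TABLE fact_orders (order_id INTEGER PRIMARY KEY, product_id INTEGER, qty INTEGER)")

cur.executemany("INSERT INTO dim_category VALUES (?, ?)",
                [(1, "Hardware"), (2, "Software")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(10, "Laptop", 1), (11, "IDE licence", 2), (12, "Monitor", 1)])
cur.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)",
                [(100, 10, 2), (101, 11, 5), (102, 12, 1)])

# Queries pay one extra join per normalization level compared with a star schema.
rows = cur.execute("""
    SELECT c.category, SUM(f.qty)
    FROM fact_orders f
    JOIN dim_product p  ON p.product_id = f.product_id
    JOIN dim_category c ON c.category_id = p.category_id
    GROUP BY c.category
    ORDER BY c.category
""").fetchall()
print(rows)  # [('Hardware', 3), ('Software', 5)]
```

The trade-off an architect weighs here is storage and update consistency (snowflake) against query simplicity and join count (star), which is exactly the "right modeling approach as per the requirement" decision listed above.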

Posted 3 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

chennai, tamil nadu

On-site

As a fresher with basic knowledge of Microsoft SQL Server (version 2008 or later), you will be responsible for creating and maintaining complex T-SQL queries, views, and stored procedures. You should also be able to monitor performance and enhance it by optimizing code and creating indexes. Proficiency in Microsoft Access and Microsoft Excel is required for this role. Additionally, you should possess knowledge of descriptive statistical modeling methodologies such as classification, regression, and association to support statistical analysis of various healthcare data. Your strong written, verbal, and customer service skills will be essential in this position. You should be proficient in compiling data, creating reports, and presenting information using tools such as queries, MS Excel, SSRS, Tableau, Power BI, etc. Furthermore, familiarity with various data forms, including star and snowflake schemas, is preferred. Your role will involve translating business needs into practical applications and working in a fast-paced environment. The ability to work effectively in a team and flexibility in taking on various projects are crucial. Previous experience in a similar fast-paced environment supporting multiple concurrent projects will be beneficial.
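The performance-tuning duty mentioned above, creating indexes to speed up queries, follows the same pattern on any SQL engine: check the query plan, add an index on the filtered column, and confirm the plan switches from a full scan to an index lookup. A minimal sketch, with SQLite standing in for SQL Server and an invented claims table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, patient_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO claims (patient_id, amount) VALUES (?, ?)",
                [(i % 500, i * 1.0) for i in range(5000)])

# Without an index, filtering on patient_id forces a scan of the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE patient_id = 42").fetchall()

# With an index, the engine can seek directly to the matching rows.
cur.execute("CREATE INDEX idx_claims_patient ON claims (patient_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE patient_id = 42").fetchall()

print(plan_before[0][-1])  # a table scan, e.g. "SCAN claims"
print(plan_after[0][-1])   # an index search, e.g. "SEARCH claims USING INDEX idx_claims_patient ..."
```

On SQL Server the equivalent check uses the execution plan (or SET STATISTICS), but the reasoning, scan versus seek, is the same.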

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

noida, uttar pradesh

On-site

You will be responsible for performing complex analysis, design, programming, test planning, testing, debugging, modification, and support for QlikView and Qlik Sense. Additionally, you will provide advanced technical troubleshooting for Qlik Sense and QlikView. Your role will involve applying advanced administration, management, and performance-tuning techniques. You will be expected to research solutions to various business and technical problems, evaluate alternatives, scope appropriately, and present recommendations with justification to management. Moreover, you will be involved in data visualization mockups, report design, and data modeling. To qualify for this position, you should have a minimum of 2 years of experience in implementing end-to-end business intelligence using QlikView and Qlik Sense. You should also possess experience in designing and developing Qlik Sense and QlikView dashboards, reports, and solutions, as well as in configuring server components such as QMC tasks, setting up the server, and server administration. Proficiency in data modeling using dimensional modeling, star schema, and snowflake schema is required. Real-time, hands-on experience with at least 2 QlikView/Qlik Sense projects involving NPrinting implementation is essential. Your responsibilities will also include developing reports, dashboards, and scorecards; unit testing and integration testing the developed components; and gathering business requirements, eliciting technical requirements, and preparing report specifications.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions. The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements for meeting data analysis needs. This position requires good collaboration skills to provide guidance on analytics aspects to the team in various analytics-related activities. You should be experienced in Qlik Sense architecture design and proficient in load-script implementation and best practices, with hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting techniques. You should also be skilled in data integration through ETL processes from various sources, adept at data transformation including the creation of QVD files and set analysis, and capable of data modeling using dimensional modeling, star schema, and snowflake schema. The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using the Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements; configuration, migration, and support of Qlik Sense applications; implementation of best practices; and staying updated on new technologies. Candidates for this role should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or have equivalent work experience. Effective verbal and written communication skills are essential.
Additionally, candidates are required to have a minimum of 3-5 years of experience in implementing end-to-end business intelligence using Qlik Sense, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. An understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial. Technical expertise in Qlik Sense, SQL Server, and data modeling, as well as experience with clinical trial data and SDTM standards, is beneficial. This position offers the opportunity to accelerate skills and career growth within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior SQL Developer at our company, you will play a crucial role on our BI & Analytics team by expanding and optimizing our data and data queries. Your responsibilities will include optimizing data flow and collection for consumption by our BI & Analytics platform. You should be an experienced query builder and data wrangler with a passion for optimizing data systems from the ground up. Collaborating with software developers, database architects, data analysts, and data scientists, you will support data and product initiatives and ensure a consistent, optimal data delivery architecture across ongoing projects. A self-directed approach will be essential in supporting the data needs of multiple systems and products. If you are excited about enhancing our company's data architecture to support our upcoming products and data initiatives, this role is perfect for you. Your essential functions will involve creating and maintaining optimal SQL queries, views, tables, and stored procedures. By working closely with various business units such as BI, Product, and Reporting, you will contribute to developing the data warehouse platform vision, strategy, and roadmap. Understanding physical and logical data models and ensuring high-performance access to diverse data sources will be key aspects of your role. Encouraging the adoption of organizational frameworks through documentation, sample code, and developer support will also be part of your responsibilities, as will effectively communicating the progress and effectiveness of developed frameworks to department heads and managers. To be successful in this role, you should possess a Bachelor's or Master's degree, or an equivalent combination of education and experience, in a relevant field. Proficiency in T-SQL, data warehouses, star schema, data modeling, OLAP, SQL, and ETL is required, and experience in creating tables, views, and stored procedures is crucial.
Familiarity with BI and reporting platforms and industry trends, and knowledge of multiple database platforms such as SQL Server and MySQL, are necessary. Proficiency in source control and project management tools such as Azure DevOps, Git, and JIRA is expected. Experience with SonarQube for clean T-SQL coding practices and with DevOps best practices will be advantageous. Applicants must have exceptional written and spoken communication skills and strong team-building abilities in order to contribute to strategic decisions and advise senior management on technical matters. With at least 5 years of experience in a data warehousing position, including working as a SQL Developer, and experience with the system development lifecycle, you should also have a proven track record in data integration, consolidation, enrichment, and aggregation. Strong analytical skills, attention to detail, organizational skills, and the ability to mentor junior colleagues will be crucial for success in this role. This full-time position requires flexibility to support different time zones between 12 PM IST and 9 PM IST, Monday through Friday. You will work in hybrid mode, spending at least 2 days per week working from the office in Hyderabad. Occasional evening and weekend work may be expected based on client needs or job-related emergencies. This job description may not cover all responsibilities and duties, which may change with or without notice.
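The "views" responsibility that recurs in this posting follows a common warehouse pattern: encapsulate a reusable, pre-filtered query as a view so that reporting consumers hit a stable interface while the underlying query stays free to be tuned. A minimal sketch, with SQLite standing in for SQL Server and an invented orders table:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, status TEXT, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "open", 10.0), (2, "closed", 20.0), (3, "open", 5.0)])

# The view gives BI/reporting consumers a stable, pre-filtered interface;
# its underlying query can later be optimized without touching the consumers.
cur.execute("""
    CREATE VIEW v_open_orders AS
    SELECT order_id, total FROM orders WHERE status = 'open'
""")

rows = cur.execute("SELECT order_id, total FROM v_open_orders ORDER BY order_id").fetchall()
print(rows)  # [(1, 10.0), (3, 5.0)]
```

On SQL Server, stored procedures extend the same idea to parameterized, multi-statement logic; SQLite has no stored procedures, so this sketch covers only the view half of the pattern.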

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

nagpur, maharashtra

On-site

As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve working extensively with PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle. For this position, it is essential to have experience with data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and Big Data concepts will be beneficial. If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
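The extract-transform-load cycle this role revolves around can be sketched in a few lines. Tools like Talend wrap the same three stages in a visual designer; here the stages are written out with the Python standard library, and the CSV content, column names, and sales table are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: read raw records from a source (a CSV string stands in for a file or DB).
raw = "id,amount,currency\n1, 100 ,usd\n2, 250 ,USD\n3,,usd\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: trim whitespace, normalize the currency code,
# and drop rows missing the measure (row 3 here).
cleaned = [
    {"id": int(r["id"]), "amount": float(r["amount"]), "currency": r["currency"].strip().upper()}
    for r in records
    if r["amount"].strip()
]

# Load: write the cleaned rows into the target table.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
cur.executemany("INSERT INTO sales VALUES (:id, :amount, :currency)", cleaned)

total = cur.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (2, 350.0)
```

In a production pipeline each stage would also log row counts and rejects, which is what the "checking and migrating complex data sets" duty above refers to.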

Posted 1 month ago

Apply