Home
Jobs

104 Dimensional Modeling Jobs - Page 3

JobPe aggregates results for easy access, but you apply directly on the original job portal.

3 - 8 years

12 - 22 Lacs

Pune

Hybrid


About _VOIS
_VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation and a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

_VOIS India
In 2009, _VOIS started operating in India and has now established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Primary Skills: SQL (Data Analysis and Development)
Location: Pune
Working Persona: Hybrid
Experience: 5 to 12 years

Core competencies, knowledge and experience (Essential):
- Strong SQL experience at an advanced level
- Excellent data interpretation skills
- Good knowledge of ETL and business intelligence; good understanding of a range of data manipulation and analysis techniques
- Working knowledge of large information technology development projects using methodologies and standards
- Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with business teams and share ideas
- Strong analytical, problem-solving and decision-making skills; able to plan and organise work to deliver as agreed
- Ability to work under pressure to tight deadlines
- Hands-on experience working with large datasets
- Able to manage different stakeholders

Good to Have / Alternate Skills:
- Strong coding experience in Python

Experience:
- In-depth working experience in ETL
- Fixing problems in cooperation with internal and external partners (e.g. service owner, tech support team, IT Ops)
- Designing and implementing changes to the existing components of the data flow
- Developing and maintaining end-to-end data flows
- Maintaining data quality, resolving data consistency issues, and supporting essential business-critical processes
- Conducting preventative maintenance of the systems
- Driving system optimisation and simplification
- Responsible for the performance of the data flow and optimisation of data preparation in conjunction with the other technical teams

_VOIS Equal Opportunity Employer Commitment
_VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society.
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies that put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills!

Posted 1 month ago

Apply

8 - 13 years

25 - 30 Lacs

Bengaluru

Hybrid


- Overall 8+ years of solid experience in data projects.
- Design, develop, and maintain robust ETL/ELT pipelines for data ingestion, transformation, and storage.
- Proficient in SQL; must have worked on complex joins, subqueries, functions, and procedures, and be able to perform SQL tuning and query optimization without support.
- Design, develop, and maintain ETL pipelines using Databricks and PySpark to extract, transform, and load data from various sources.
- Must have good working experience with Delta tables, deduplication, and merging on terabyte-scale datasets (see the sketch below).
- Optimize and fine-tune existing ETL workflows for performance and scalability.
- Excellent knowledge of dimensional modelling and data warehousing; must have experience working with large datasets.
- Experience working with batch and real-time data processing (good to have).
- Implement data validation and quality checks, and ensure adherence to security and compliance standards.
- Ability to develop reliable, secure, compliant data processing systems.
- Work closely with cross-functional teams to support data analytics, reporting, and business intelligence initiatives.
- Should be self-driven and able to work independently without support.
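For context on the Delta-table deduplication and merge skills this listing asks for, here is a minimal PySpark sketch; the table, path, and column names are hypothetical, assuming a Databricks environment with Delta Lake available:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# Hypothetical incremental batch; keep only the latest record per business key.
updates = spark.read.format("parquet").load("/mnt/raw/orders_increment")
latest = (
    updates.withColumn(
        "rn",
        F.row_number().over(
            Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
        ),
    )
    .filter("rn = 1")
    .drop("rn")
)

# Upsert the deduplicated batch into the target Delta table.
target = DeltaTable.forName(spark, "analytics.orders")
(
    target.alias("t")
    .merge(latest.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

Deduplicating before the MERGE matters: Delta rejects a merge when more than one source row matches the same target row.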

Posted 1 month ago

Apply

12 - 16 years

30 - 40 Lacs

Kolkata, Pune, Bengaluru

Hybrid


Data Architect (Snowflake, AWS Big Data & DW)

Primary Skill: Architecting and solutioning in both Snowflake and AWS Big Data. Experience in the Snowflake, Big Data and Data Warehousing space, architecting big data solutions on any cloud platform (AWS preferred).

Technical Skills Required:
- Experience architecting and solutioning in Snowflake, AWS Big Data and DW; GDPR awareness
- Understanding of Snowflake Cortex for Generative AI implementation
- Experience with frameworks that help reduce estimation effort and cost in repeating work, i.e. ingestion or extraction of data
- Open to understanding newly introduced Snowflake features; looks for new approaches and solutions based on customer standards
- Understanding of secure coding standards, secure data sharing, data clean rooms, etc.
- Experience on Snowflake and Spark preferred (PySpark, other Big Data projects)
- Experience working on cloud implementations in the AWS ecosystem (S3, EC2)
- Good knowledge of Hadoop ecosystem tools (Hive, HBase, Pig)
- Knowledge of Airflow, NiFi, Kafka, Pandas, Lambda, Snowflake preferable

Responsibilities:
- A Technical Architect defines and owns the technical architecture of systems to deliver business objectives while ensuring quality standards
- Responsible for high-level requirement gathering, consulting, design, development, and definition of the technical architecture as per business requirements
- Enable the creation of designs and frameworks for models and validate their adherence to the overall architecture
- Technically lead software projects through all stages of the life cycle, including responsibility for requirements capture, design, development, and acceptance testing
- Lead and direct the technical team in POCs to take critical architectural decisions
- Work with the client architecture team to complete the architecture

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the following details: candidate's name, email and alternate email ID, contact and alternate contact number, total experience, relevant experience, current organisation, notice period, CCTC, ECTC, current location, preferred location, PAN card number.
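As a taste of the hands-on Snowflake work implied here, a minimal sketch using the snowflake-connector-python package; the account, credentials, stage, and table names are placeholders, not from the listing:

```python
import snowflake.connector

# Placeholder credentials; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_SVC",
    password="...",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="SALES",
)

try:
    cur = conn.cursor()
    # Load staged S3 files into a table, then sanity-check the row count.
    cur.execute(
        "COPY INTO orders FROM @s3_stage/orders/ "
        "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
    )
    cur.execute("SELECT COUNT(*) FROM orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```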

Posted 1 month ago

Apply

2 - 4 years

7 - 10 Lacs

Mumbai

Work from Office


Responsibilities
You will be responsible for defining and delivering high-quality, scalable BI solutions to meet business objectives.
- Be a key point of expertise and consultation on Data Warehouse dimensional modelling
- Ensure development aligns to agreed standards and is of a high quality
- Evaluate and adopt new techniques to advance the Azure Data Platform
- Provide expert guidance on BI skills and the latest Microsoft BI tools and technologies
- Maintain accurate and complete solution design documents
- Collaborate with BI specialists, engineers, DBAs, BI developers, analysts and third-party BI suppliers for successful development of BI reporting and analysis solutions
- Write clean, effective, and reusable code, with notes for others throughout
- Work with multiple stakeholders and Knight Frank teams on wider projects
- Advise other parties on best practice and on data, BI tools and concepts

Experience
- Building enterprise BI solutions
- Strong experience with Microsoft SQL or T-SQL for querying data (see the sketch after this listing)
- Solid experience with data modelling, data warehouse design, and data lake concepts and practices
- Good knowledge of Microsoft reporting tools (Power BI and MS SSRS)
- Proactive in remaining up to date with the latest Microsoft technologies and techniques
- Demonstrated experience working effectively as part of a team, showing strong collaboration and resourcefulness

Desirable but not essential
- Agile delivery methodology for database development
- Experience in the property sector
- Working with Azure Synapse or Databricks
- Power BI

Data Platform tools to have in-depth knowledge of:
- SQL Azure DB or DW
- MS SQL Server
- SSAS cube development

Particular Aptitudes/Skills Required
- Highly organised, systematic and adaptable, with excellent attention to detail and the ability to recognise the relative importance of software issues and prioritise work effectively
- Ability to communicate clearly and deal with others at all levels in a polite, professional, friendly and helpful manner, face to face, by email and on the telephone, and to maintain good working relationships at all times
- Demonstrated ability to work on multiple projects simultaneously
- Ability to work with all members of the team in a professional yet dynamic and creative environment, fostering an open, friendly and constructive working relationship with all members of the Knight Frank team
- Ability to work within a high-pressure environment, balance priorities and remain calm under pressure
- The successful candidate will be flexible, self-motivated, organised and proactive, with excellent computing and administration skills and the ability to adapt to a wide range of tasks; they will also have a hands-on attitude and possess the necessary skills, manner and experience to provide an effective support service to the department/office
- Desire to learn new technologies and continuously develop new skills and expertise
- Ability to work out of hours to deliver application upgrades in agreed maintenance windows
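To illustrate the kind of T-SQL dimensional querying this role centres on, a small sketch against a hypothetical star schema; the connection string, tables, and columns are invented for illustration:

```python
import pyodbc

# Hypothetical Azure SQL connection; real deployments would use managed identity or a vault.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=dw;UID=report_user;PWD=..."
)

# Classic star-schema aggregation: a fact table joined to conformed dimensions.
sql = """
SELECT d.calendar_year, p.property_type, SUM(f.sale_amount) AS total_sales
FROM fact_sales AS f
JOIN dim_date AS d ON f.date_key = d.date_key
JOIN dim_property AS p ON f.property_key = p.property_key
GROUP BY d.calendar_year, p.property_type
ORDER BY d.calendar_year, total_sales DESC;
"""

for year, ptype, total in conn.cursor().execute(sql):
    print(year, ptype, total)
```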

Posted 1 month ago

Apply

5 - 10 years

16 - 20 Lacs

Bengaluru

Hybrid


ETL tools: ODI 11g, SSIS, Informatica
Data modeling: Dimensional modeling
Databases: MS SQL Server 2005/2008, Oracle 8i/9i/10g/11g
Languages: SQL, PL/SQL
Tools: TOAD, SQL Developer, SQL*Plus

Interested? Send your CV to recruiter4@archancehrs.com

Posted 2 months ago

Apply

6 - 10 years

20 - 32 Lacs

Pune, Bengaluru

Hybrid


Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.

Technical skills, qualification and experience required:
- Proficient in data modelling, with 6-8 years of experience in data modelling
- Experience in data modelling tools (Erwin), building ER diagrams; hands-on experience with the Erwin / Visio tools
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools like Erwin/Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience in SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills to coordinate between business stakeholders and engineers
- Strong results orientation and time management
- True team player who is comfortable working in a global, multidisciplinary team within a fast-paced environment
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and capability for innovation

* Immediate joiners will be preferred

Posted 2 months ago

Apply

7 - 11 years

15 - 19 Lacs

Hyderabad

Work from Office


ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

What you will do
Role Description: We are seeking a Data Solutions Architect to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies.

Roles & Responsibilities:
- Design and implement scalable, modular, and future-proof data architectures that support enterprise data lakes, data warehouses, and real-time analytics.
- Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains.
- Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms.
- Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms.
- Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities.
- Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms.
- Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions.
- Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency (see the sketch after this listing).
- Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities.
- Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals.
- Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

What we expect of you
Must-Have Skills:
- Experience in data architecture, enterprise data management, and cloud-based analytics solutions.
- Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks.
- Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization.
- Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions.
- Deep understanding of data governance, security, metadata management, and access control frameworks.
- Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC).
- Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives.
- Strong problem-solving, strategic thinking, and technical leadership skills.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Good-to-Have Skills:
- Deep expertise in the Biotech & Pharma industries.
- Experience with Data Mesh architectures and federated data governance models.
- Certification in cloud data platforms or enterprise architecture frameworks.
- Knowledge of AI/ML pipeline integration within enterprise data architectures.
- Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications
- Doctorate degree with 6-8+ years of experience in Computer Science, IT or a related field; OR Master’s degree with 8-10+ years of experience; OR Bachelor’s degree with 10-12+ years of experience.
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly, be organized and detail oriented.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
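A flavour of the DataOps-style automated checks mentioned above, as a minimal pytest sketch; the dataset path and column names are hypothetical:

```python
import pandas as pd
import pytest

@pytest.fixture(scope="module")
def orders() -> pd.DataFrame:
    # Hypothetical curated extract; in a real pipeline this would read the pipeline's output.
    return pd.read_parquet("/data/curated/orders.parquet")

def test_primary_key_is_unique(orders):
    assert orders["order_id"].is_unique

def test_no_nulls_in_required_columns(orders):
    required = ["order_id", "customer_id", "order_date"]
    assert orders[required].notna().all().all()

def test_amounts_are_non_negative(orders):
    assert (orders["sale_amount"] >= 0).all()
```

Wired into a CI job, checks like these gate a pipeline's output before downstream consumers see it.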

Posted 2 months ago

Apply

9 - 14 years

35 - 40 Lacs

Bengaluru

Work from Office


About The Role
Job Title: Senior GCP Data & BI SME
Corporate Title: AVP
Location: Bangalore, India

Role Description
Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation and Corporate Sustainability. As climate change throws up new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good, leveraging their technology skillset in multiple areas, predominantly in cloud / hybrid architecture. As part of this role, we are seeking a highly experienced GCP Data & BI Subject Matter Expert (SME) to join our growing team. In this senior role, you will be a trusted advisor, providing technical expertise and strategic direction across all things data and BI on GCP.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your key responsibilities
Technical Expertise
- In-depth knowledge of GCP data services (BigQuery, Cloud Storage, Dataflow, etc.)
- Design and optimize complex data pipelines for efficient data ingestion, transformation, and analysis (see the sketch after this listing)
- Partner with the product management group and other business stakeholders to gather requirements, translate them into technical specifications, and design effective BI solutions (Tableau, Looker)
- Design and develop complex data models, leveraging expertise in relational and dimensional modeling techniques
- Advocate for best practices in data governance, security, and compliance on GCP

Collaboration & Mentorship
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and drive data-driven decision-making
- Mentor and guide junior team members on GCP technologies and BI best practices
- Foster a culture of innovation and continuous improvement within the data and BI domain

Staying Current
- Track emerging trends and innovations in GCP, BI tools, and data analytics methodologies
- Proactively research and recommend new technologies and solutions to enhance our data and BI capabilities

Your skills and experience
- 9+ years of experience in data warehousing, data management, and business intelligence
- Proven expertise in Google Cloud Platform (GCP) and its data services (BigQuery, Cloud Storage, Dataflow, etc.)
- Strong understanding of data governance, security, and compliance principles on GCP
- Experience designing and implementing complex data pipelines
- In-depth knowledge of relational and dimensional modeling techniques for BI
- Experience with T-SQL, PL/SQL, or ANSI SQL
- Experience with leading BI tools and platforms (Tableau, Looker)
- Excellent communication, collaboration, and problem-solving skills
- Ability to translate technical concepts into clear, actionable insights for business stakeholders
- Strong leadership presence and ability to influence and inspire others
- Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting will be a plus
- Knowledge of cloud infrastructure and data governance best practices will be a plus
- Knowledge of Terraform will be a plus

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
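A minimal BigQuery sketch in Python for the pipeline work described above, using the google-cloud-bigquery client; the project, dataset, and table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-sustainability-project")  # placeholder project

# Summarise a (hypothetical) ESG exposure table into a reporting mart.
sql = """
CREATE OR REPLACE TABLE reporting.esg_exposure_by_sector AS
SELECT sector, reporting_date, SUM(exposure_eur) AS total_exposure_eur
FROM risk.esg_exposures
GROUP BY sector, reporting_date
"""

job = client.query(sql)   # starts the query job
job.result()              # blocks until the job finishes
print(f"Job {job.job_id} finished.")
```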

Posted 2 months ago

Apply

8 - 10 years

12 - 17 Lacs

Hyderabad

Work from Office


What you will do
Let’s do this. Let’s change the world. In this vital role you will be a Senior Manager – BI & Visualization, leading and driving enterprise-wide business intelligence (BI) and data visualization initiatives. This role will be responsible for strategic planning, governance, and execution of BI and analytics solutions, ensuring that business leaders have access to actionable insights through advanced reporting and visualization platforms. The ideal candidate will have a deep understanding of BI tools, data visualization best practices, self-service BI enablement, and enterprise analytics strategy, working closely with business executives, data teams, and technology leaders to foster a data-driven culture.

- Develop and execute a strategic BI & visualization roadmap, aligning with business goals, analytics objectives, and digital transformation strategies.
- Lead and mentor BI, analytics, and visualization teams, fostering a culture of innovation, collaboration, and continuous learning.
- Own the end-to-end BI lifecycle, including data modeling, dashboard development, analytics governance, and self-service BI adoption.
- Oversee the implementation of modern BI solutions, leveraging tools like Power BI, Tableau, Looker, Qlik Sense, or similar to deliver high-impact visual insights.
- Define and enforce data visualization best practices, ensuring dashboards are intuitive, user-friendly, and business-focused.
- Drive self-service BI enablement, empowering business users to explore, analyze, and act on data independently while maintaining data security and governance.
- Collaborate with business leaders, data scientists, and engineering teams to identify and prioritize high-value analytics use cases.
- Optimize BI infrastructure and reporting architecture, ensuring scalability, performance, and cost efficiency.
- Establish BI governance frameworks, defining data access controls, security policies, KPI standardization, and metadata management.
- Champion the use of AI/ML-powered BI solutions, enabling predictive analytics, anomaly detection, and natural-language-driven insights.
- Monitor BI performance metrics, ensuring reporting solutions meet business SLAs, operational efficiency, and data accuracy.
- Stay ahead of emerging trends in BI, data visualization, and analytics automation, ensuring the company remains competitive in its data strategy.

What we expect of you
Master’s degree and 8 to 10 years of experience in Computer Science, IT or a related field; OR Bachelor’s degree and 10 to 14 years of experience; OR Diploma and 14 to 18 years of experience. Certification in Power BI or other visualization tools.

Basic Qualifications:
- 10-14+ years of experience in BI, analytics, and data visualization, with at least 5 years in a leadership role.
- Expertise in BI tools, including Power BI, Tableau, Looker, Qlik Sense, or similar enterprise BI platforms.
- Strong proficiency in data visualization principles, storytelling with data, and dashboard usability best practices.
- Experience in leading large-scale BI transformation initiatives, driving self-service analytics adoption across an enterprise.
- Strong knowledge of data modeling, dimensional modeling (star/snowflake schema), and data warehousing concepts.
- Hands-on experience with SQL, DAX, Power Query (M), or other analytics scripting languages.
- Strong background in BI governance, data security, compliance, and metadata management.
- Ability to influence senior leadership, communicate insights effectively, and drive business impact through BI.
- Excellent problem-solving skills, with a track record of driving efficiency, automation, and data-driven decision-making.

Preferred Qualifications:
- Experience in the biotechnology or pharma industry is a big plus.
- Experience with Data Mesh, Data Fabric, or federated data governance models.
- Experience with AI/ML-driven BI solutions, predictive analytics, and NLP-based BI capabilities.
- Knowledge of infrastructure and deployment automation for visualization platforms.
- Experience integrating BI with ERP, CRM, and operational systems (SAP, Salesforce, Oracle, Workday, etc.).
- Familiarity with Agile methodologies and the Scaled Agile Framework (SAFe) for BI project delivery.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly, be organized and detail oriented.
- Strong presentation and public speaking skills.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 2 months ago

Apply

16 - 22 years

40 - 55 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


Role & responsibilities
- Minimum 15 years of experience
- Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies
- Experience in data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases
- Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server)
- Good understanding of Relational, Dimensional, and Data Vault modelling
- Experience in implementing 2 or more data models in a database with data security and access controls
- Good experience in OLTP and OLAP systems
- Excellent data analysis skills with demonstrable knowledge of standard datasets and sources
- Good experience with one or more cloud DWs (e.g. Snowflake, Redshift, Synapse)
- Experience on one or more cloud platforms (e.g. AWS, Azure, GCP)
- Understanding of DevOps processes
- Hands-on experience in one or more data modelling tools
- Good understanding of one or more ETL tools and data ingestion frameworks
- Understanding of data quality and data governance
- Good understanding of NoSQL databases and modelling techniques
- Good understanding of one or more business domains
- Understanding of the Big Data ecosystem
- Understanding of industry data models
- Hands-on experience in Python
- Experience in leading large and complex teams
- Good understanding of agile methodology
- Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization
- Excellent communication skills
- Understand the business requirements and translate them into conceptual, logical, and physical data models
- Work as a principal advisor on data architecture across various data requirements: aggregation, data lake data models, data warehouse, etc.
- Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling
- Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects
- Suggest the best modelling approach to the client based on their requirements and target architecture
- Analyze and understand the datasets, and guide the team in creating source-to-target mappings and data dictionaries, capturing all relevant details
- Profile the datasets to generate relevant insights (a profiling sketch follows this listing)
- Optimize the data models and work with the data engineers to define the ingestion logic, ingestion frequency, and data consumption patterns
- Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance
- Drive automation in modelling activities
- Collaborate with business stakeholders, data owners, business analysts, and architects to design and develop the next-generation data platform
- Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
- Guide and mentor team members, and review artifacts
- Contribute to the overall data strategy and roadmaps
- Propose and execute technical assessments and proofs of concept to promote innovation in the data space
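As an illustration of the dataset profiling step called out above, a small pandas sketch; the file path and column choices are hypothetical:

```python
import pandas as pd

# Hypothetical source extract to be profiled before modelling.
df = pd.read_csv("/data/source/customers.csv")

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "non_null": df.notna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})
# A column whose distinct count equals the row count is a candidate natural key.
profile["candidate_key"] = profile["distinct"] == len(df)

print(profile.sort_values("null_pct", ascending=False))
```

Output like this feeds directly into source-to-target mappings and data dictionaries.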

Posted 2 months ago

Apply

7 - 12 years

20 - 30 Lacs

Pune, Bengaluru, Hyderabad

Hybrid


Job Description: (Senior) Data Modeler

The data modeler designs, implements, and documents data architectures and data models for solutions, which include the use of application, relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, operations, machine learning, data science, and other business interests.

The successful candidate will:
1. Be responsible for the development of the conceptual, logical, and physical data models, and for oversight of the implementation of RDBMS, operational data stores (ODS), application databases, data marts, and data lakes on target platforms (SQL/NoSQL/cloud/on-prem).
2. Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively.

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (application, relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, operations, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualisations.
- Hands-on modelling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills
- Strong communication skills; good spoken English.
- 3+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with MongoDB would be an advantage (see the sketch after this listing).
- Good knowledge of metadata management, data modelling, and related tools (Erwin, Visual Paradigm, or others) required.
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-centre contexts is desired.
- Experience in team management, communication, and presentation.
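Since the listing flags MongoDB experience, a minimal pymongo sketch of a document-oriented model; the connection URI, database, and fields are invented for illustration:

```python
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017/")  # placeholder URI
db = client["retail"]

# Document model: the order embeds its line items, denormalized
# in contrast to the separate order/line tables of a relational design.
db.orders.insert_one({
    "order_id": "SO-1001",
    "customer": {"id": "C-42", "segment": "enterprise"},
    "lines": [
        {"sku": "WIDGET-9", "qty": 3, "unit_price": 19.99},
        {"sku": "GADGET-2", "qty": 1, "unit_price": 149.00},
    ],
})

# Index the business key to keep lookups and upserts fast.
db.orders.create_index([("order_id", ASCENDING)], unique=True)
```

Choosing between embedding and referencing is the core NoSQL modelling decision the role alludes to.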

Posted 2 months ago

Apply

5 - 10 years

20 - 25 Lacs

Pune, Hyderabad

Hybrid


Job Title: Databricks Data Modeler
Location: Pune / Hyderabad

Job Summary
We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.

Key Responsibilities
- Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data.
- SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
- Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
- Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing.
- Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability (see the sketch after this listing).
- Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
- Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
- Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation.

Skills & Qualifications
- Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
- Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models.
- SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes.
- Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
- SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
- ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes.
- Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM).
- Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred Qualifications
- Experience with data modeling tools such as Erwin, Lucidchart, or DBT.
- Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
- Familiarity with Kafka/Event Hub for real-time data streaming.
- Exposure to Power BI/Tableau for data visualization and reporting.
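To make the partitioning and tuning point concrete, a minimal Databricks/PySpark sketch; the table names, columns, and paths are hypothetical, and OPTIMIZE ... ZORDER assumes a Databricks runtime:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical SAP ECC sales extract written as a Delta table,
# partitioned by a low-cardinality column that queries commonly filter on.
sales = spark.read.format("parquet").load("/mnt/raw/sap_ecc/vbak")
(
    sales.write.format("delta")
    .mode("overwrite")
    .partitionBy("sales_org")
    .saveAsTable("auto.sales_orders")
)

# Co-locate data for the high-cardinality keys used in point lookups.
spark.sql("OPTIMIZE auto.sales_orders ZORDER BY (customer_id, vehicle_id)")
```

Partitioning prunes whole directories at read time; Z-ordering improves file skipping within partitions.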

Posted 2 months ago

Apply

4 - 9 years

5 - 15 Lacs

Pune

Hybrid


PBI Model Developer

About Cloudaeon: Cloudaeon is a global technology consulting and services company. We support companies in managing cloud infrastructure and solutions with the help of big data, DevOps and analytics. We offer first-class solutions and services that use big data and always exceed customer expectations. Our deep vertical knowledge, combined with expertise in several enterprise-class big data platforms, helps develop targeted solutions to meet our customers' business needs. Our global team consists of experienced professionals with experience in various tech stacks. Every member of our team is very active and committed to helping our customers achieve their goals.

Job Role: PBI Model Developer
Experience: 4+ years

Job Description
We are looking for an experienced Power BI Model Developer to join our team and support semantic model development in Power BI. The ideal candidate should have strong expertise in Power BI data modeling, DAX, and SQL to design and optimize efficient, scalable, and business-ready data models.

Responsibilities
- Develop and optimize Power BI semantic models to support business reporting and analytics.
- Design and implement star and snowflake schemas for efficient data modeling.
- Write and optimize DAX calculations to enhance analytical capabilities.
- Develop SQL queries and stored procedures to extract, transform, and load (ETL) data efficiently.
- Ensure data integrity, governance, and performance tuning within Power BI datasets.
- Collaborate with business stakeholders to gather requirements and translate them into effective data models.
- Implement security roles and row-level security (RLS) in Power BI models.
- Work closely with data engineers to ensure smooth data integration from various sources.

Requirements
- 4+ years of experience in Power BI data modeling and development.
- Strong expertise in DAX functions and advanced calculations.
- Proficiency in SQL (writing complex queries, performance tuning, stored procedures).
- Experience with data warehousing concepts, ETL processes, and dimensional modeling.
- Familiarity with Azure Synapse, Data Factory, or other cloud data solutions (preferred).
- Understanding of best practices in Power BI governance, security, and performance tuning.
- Excellent communication skills to work with both technical and non-technical stakeholders.
- Experience working in an onshore environment with direct stakeholder interactions.
- Knowledge of Power BI Premium, Composite Models, and DirectQuery.
- Experience with Python or Power Automate for advanced reporting automation (a sketch follows this listing).
- Customer-centric, passionate about delivering great digital products and services.
- Demonstrates a true engineering-craftsmanship mindset.
- Passionate about continuous improvement, collaboration, and great teams.
- Strong problem-solving skills coupled with good communication skills.
- Open-minded, inquisitive, life-long learner.
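For the reporting-automation angle, a hedged sketch of triggering a Power BI dataset refresh from Python via the Power BI REST API; the workspace/dataset IDs and token acquisition are placeholders, not from the listing:

```python
import requests

# Placeholders: in practice the token comes from Azure AD (e.g. MSAL client credentials).
ACCESS_TOKEN = "..."
GROUP_ID = "00000000-0000-0000-0000-000000000000"    # workspace id
DATASET_ID = "11111111-1111-1111-1111-111111111111"  # semantic model id

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```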

Posted 2 months ago

Apply

3 - 5 years

12 - 16 Lacs

Noida

Work from Office


Please apply using the link below (this will require uploading your CV): https://crowe.wd12.myworkdayjobs.com/External_Careers/job/Noida-Uttar-Pradesh-India/Senior-Data-Engineer-1_R-47179 — "Do not apply on Naukri."

Your Journey at Crowe Starts Here
At Crowe, you can build a meaningful and rewarding career. With real flexibility to balance work with life moments, you’re trusted to deliver results and make an impact. We embrace you for who you are, care for your well-being, and nurture your career. Everyone has equitable access to opportunities for career growth and leadership. Over our 80-year history, delivering excellent service through innovation has been a core part of our DNA across our audit, tax, and consulting groups. That’s why we continuously invest in innovative ideas, such as AI-enabled insights and technology-powered solutions, to enhance our services. Join us at Crowe and embark on a career where you can help shape the future of our industry.

Job Description
- Experience in developing, optimizing, and maintaining T-SQL stored procedures, views, and functions within SQL.
- Develop, optimize, and maintain ETL workflows using SSIS and other tools to integrate data from various sources, including on-prem SQL Server, Excel, cloud services, and other BI tools.
- Leverage Power BI to build dashboards, reports, and semantic models, applying best practices in DAX, Power Query, and SSAS Tabular modeling.
- Intermediate or advanced proficiency and knowledge of databases (on-prem or cloud), querying data using T-SQL or other comparable languages.
- Collaborate with other developers, senior leadership, and business SMEs to discover platform solution requirements, while performing data discovery and independently executing complex data analysis across multiple platforms (e.g., SQL, Fabric, Power BI, and Excel).
- Work with Microsoft Fabric to build and manage a modern data infrastructure platform in a medallion architecture (Bronze, Silver, Gold), leveraging PySpark and other notebook-based approaches (a sketch follows this listing).
- Demonstrate platform capabilities and promote application of tools as appropriate to address business requirements.
- Collaborate with peers on best practices for security, governance, and performance optimization.
- Understand and apply intermediate to advanced Azure DevOps methodologies to drive an exemplary customer experience; apply elementary project management techniques to ensure communicable progress.
- Document processes, data flows, and architecture to ensure system transparency and maintainability.
- Effectively communicate across the organization in a manner easily understood by individuals with diverse technical and business backgrounds.
- Hands-on experience with Azure Data Services, such as Azure SQL Database and Azure Data Factory, while maintaining expertise in on-premises SQL Server deployments.
- Substantial experience with data modeling and ETL processes, as well as an advanced understanding of dimensional modelling.
- Experience with Postman, SSAS Tabular, DAX Studio, ALM Toolkit, and/or Tabular Editor (2 or 3) a plus.
- Familiarity with Power BI semantic modelling, machine learning, and advanced analytics a plus.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
- A good understanding of the agile development cycle and familiarity with Smartsheet or Azure DevOps preferred.
- Stay up to speed on new features, techniques, findings, and best practices emerging for core platforms.

Qualifications
- Bachelor’s degree in Computer Science, Data Analytics, Data/Information Science, Information Systems, Mathematics, or another related field.
- 3+ years of experience with SQL and data warehousing concepts.
- 3+ years of experience with Microsoft Power BI, including DAX, Power Query, and the M language.
- 3+ years of experience supporting business intelligence, data analysis, and reporting.
- 2+ years of experience with Python (or other programming languages).
- 2+ years of experience managing projects from inception to execution.
- 1+ years of experience working with Delta Lake or Apache Spark (Fabric or Databricks).

We expect the candidate to uphold Crowe’s values of Care, Trust, Courage, and Stewardship. These values define who we are. We expect all of our people to act ethically and with integrity at all times.

Our Benefits: At Crowe, we know that great people are what makes a great firm. We value our people and offer employees a comprehensive benefits package. Learn more about what working at Crowe can mean for you!

How You Can Grow: We will nurture your talent in an inclusive culture that values diversity. You will have the chance to meet on a consistent basis with your Career Coach, who will guide you in your career goals and aspirations. Learn more about where talent can prosper!

More about Crowe: Crowe Horwath IT Services Private Ltd. is a wholly owned subsidiary of Crowe LLP (U.S.A.), a public accounting, consulting and technology firm with offices around the world. Crowe LLP is an independent member firm of Crowe Global, one of the largest global accounting networks in the world. The network consists of more than 200 independent accounting and advisory firms in more than 130 countries around the world.

Crowe does not accept unsolicited candidates, referrals or resumes from any staffing agency, recruiting service, sourcing entity or any other third-party paid service at any time. Any referrals, resumes or candidates submitted to Crowe, or any employee or owner of Crowe, without a pre-existing agreement signed by both parties covering the submission, will be considered the property of Crowe, and free of charge.
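A minimal notebook-style PySpark sketch of the medallion (Bronze → Silver) promotion mentioned above; the lakehouse paths and columns are hypothetical, and the same pattern applies in Fabric or Databricks:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw landing data, kept as-is for replayability.
bronze = spark.read.format("delta").load("Tables/bronze_invoices")

# Silver: typed, deduplicated, and lightly conformed.
silver = (
    bronze.dropDuplicates(["invoice_id"])
    .withColumn("invoice_date", F.to_date("invoice_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("invoice_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("Tables/silver_invoices")
```

Gold would then aggregate Silver into dimensional, report-ready tables.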

Posted 2 months ago

Apply

8 - 12 years

20 - 32 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid


Strong understanding of SQL and dimensional data modeling concepts. Experience with Payer domain data models. Proficiency in data modeling tools.

Posted 2 months ago

Apply

12 - 15 years

25 - 35 Lacs

Chennai, Pune, Bengaluru

Hybrid


- Good to have hands-on experience with data modelling tools such as Erwin.
- Should have good knowledge of the provider domain.
- Should have strong experience with Snowflake, and must have executed development and migration projects involving Snowflake.

Posted 2 months ago

Apply

5 - 10 years

6 - 9 Lacs

Hyderabad

Work from Office


Title - Data & Analytics Consultant - OAC/FAW + ODI
Experience - 5 to 10 years
Role - Permanent

Responsibilities:
- An individual contributor who has worked with ERP systems as sources, with sound knowledge of dimensional modeling, data warehousing, and implementation & extensions of Oracle Business Intelligence Applications / Fusion Data Intelligence (Fusion Analytics Warehouse).
- Experience in designing and developing data pipelines from a variety of source systems into a data warehouse or lakehouse using ODI, Informatica PowerCenter, or other ETL/ELT technologies.
- Hands-on experience with semantic modeling / metadata (RPD) modeling; developing, customizing, maintaining, and supporting complex analyses, data visualizations, and BI Publisher reports in Oracle Analytics Cloud or Oracle Analytics Server, as required by business users.

Posted 2 months ago

Apply

4 - 8 years

6 - 10 Lacs

Hyderabad

Work from Office


Responsibilities:
- An individual contributor who has worked with ERP systems as sources, with sound knowledge of dimensional modeling, data warehousing, and implementation & extensions of Oracle Business Intelligence Applications / Fusion Data Intelligence (Fusion Analytics Warehouse).
- Experience in designing and developing data pipelines from a variety of source systems into a data warehouse or lakehouse using ODI, Informatica PowerCenter, or other ETL/ELT technologies.
- Hands-on experience with semantic modeling / metadata (RPD) modeling; developing, customizing, maintaining, and supporting complex analyses, data visualizations, and BI Publisher reports in Oracle Analytics Cloud or Oracle Analytics Server, as required by business users.

Posted 2 months ago

Apply

9 - 14 years

15 - 22 Lacs

Gurgaon

Work from Office


- Experience in architecting with AWS or Azure cloud data platforms
- Successfully implemented large-scale data warehouse / data lake solutions in Snowflake or AWS Redshift
- Proficient in data modelling and data architecture design; experienced in reviewing 3rd Normal Form and dimensional models
- Experience in implementing master data management, process design and implementation
- Experience in implementing data quality solutions, including processes
- Experience in IoT design using AWS or Azure cloud platforms
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation
- Experience working with structured and unstructured data, including geo-spatial data
- Experience in technologies like Python, SQL, NoSQL, Kafka, Elasticsearch (a Kafka ingestion sketch follows this listing)
- Hands-on experience using Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake and Azure Search
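For the streaming-ingestion side of this stack, a minimal Kafka producer sketch using the kafka-python package; the broker address, topic, and payload are placeholders:

```python
import json
from kafka import KafkaProducer

# Placeholder broker; real deployments would configure TLS/SASL as well.
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Hypothetical IoT reading pushed onto an ingestion topic.
reading = {"device_id": "sensor-17", "ts": "2024-01-01T00:00:00Z", "temp_c": 21.4}
producer.send("iot-readings", value=reading)
producer.flush()  # block until the message is actually delivered
```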

Posted 2 months ago

Apply

3 - 8 years

9 - 13 Lacs

Hubli

Work from Office


- Design and develop data integration processes using appropriate data loading and data movement techniques
- Be able to use different tools and/or programming to stitch together data pipelines that may include streaming, IoT, or data sourced from traditional data sources like databases, file stores, or SOAP and REST data interfaces
- Be able to document the design based on a pilot and demonstrate how it delivers the functionality and manages exceptions
- Work with data models, using them for data staging and data storage purposes, with knowledge of how to load dimensional and 3rd normal form data objects
- Be able to create database procedures and tasks in the Snowflake database environment (a sketch follows this listing)
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications
- Prepare technical literature documentation for accuracy and completeness
- Experience working in a cloud data platform
- Proficient in scripting in Java, Windows and PowerShell
- Proficient in at least one programming language like Python or Scala
- Expert in using Informatica IICS Data Integration and Application Integration
- Proficient in working with databases like Snowflake, Oracle, SQL Server
- Expert in writing and implementing SQL queries
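A minimal sketch of creating a scheduled Snowflake task from Python; the warehouse, tables, and schedule are placeholders, and the SQL follows standard Snowflake task syntax:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="ETL_SVC", password="...",  # placeholders
    warehouse="ETL_WH", database="STAGING", schema="PUBLIC",
)
cur = conn.cursor()

# A task that appends newly staged rows into the target table every hour.
cur.execute("""
CREATE OR REPLACE TASK load_orders_hourly
  WAREHOUSE = ETL_WH
  SCHEDULE = '60 MINUTE'
AS
  INSERT INTO core.orders
  SELECT * FROM staging.orders_raw
  WHERE load_ts > (SELECT COALESCE(MAX(load_ts), '1970-01-01') FROM core.orders)
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK load_orders_hourly RESUME")
conn.close()
```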

Posted 2 months ago

Apply

9 - 14 years

15 - 22 Lacs

Coimbatore

Work from Office


- Experience in architecting with AWS or Azure cloud data platforms
- Successfully implemented large-scale data warehouse / data lake solutions in Snowflake or AWS Redshift
- Proficient in data modelling and data architecture design; experienced in reviewing 3rd Normal Form and dimensional models
- Experience in implementing master data management, process design and implementation
- Experience in implementing data quality solutions, including processes
- Experience in IoT design using AWS or Azure cloud platforms
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation
- Experience working with structured and unstructured data, including geo-spatial data
- Experience in technologies like Python, SQL, NoSQL, Kafka, Elasticsearch
- Hands-on experience using Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake and Azure Search

Posted 2 months ago

Apply

9 - 14 years

15 - 22 Lacs

Vadodara

Work from Office


- Experience in architecting with AWS or Azure cloud data platforms
- Successfully implemented large-scale data warehouse / data lake solutions in Snowflake or AWS Redshift
- Proficient in data modelling and data architecture design; experienced in reviewing 3rd Normal Form and dimensional models
- Experience in implementing master data management, process design and implementation
- Experience in implementing data quality solutions, including processes
- Experience in IoT design using AWS or Azure cloud platforms
- Experience designing and implementing machine learning solutions as part of high-volume data ingestion and transformation
- Experience working with structured and unstructured data, including geo-spatial data
- Experience in technologies like Python, SQL, NoSQL, Kafka, Elasticsearch
- Hands-on experience using Snowflake, Informatica, Azure Logic Apps, Azure Functions, Azure Storage, Azure Data Lake and Azure Search

Posted 2 months ago

Apply

3 - 8 years

9 - 13 Lacs

Surat

Work from Office


- Design and develop data integration processes using appropriate data loading and data movement techniques
- Be able to use different tools and/or programming to stitch together data pipelines that may include streaming, IoT, or data sourced from traditional data sources like databases, file stores, or SOAP and REST data interfaces
- Be able to document the design based on a pilot and demonstrate how it delivers the functionality and manages exceptions
- Work with data models, using them for data staging and data storage purposes, with knowledge of how to load dimensional and 3rd normal form data objects
- Be able to create database procedures and tasks in the Snowflake database environment
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications
- Prepare technical literature documentation for accuracy and completeness
- Experience working in a cloud data platform
- Proficient in scripting in Java, Windows and PowerShell
- Proficient in at least one programming language like Python or Scala
- Expert in using Informatica IICS Data Integration and Application Integration
- Proficient in working with databases like Snowflake, Oracle, SQL Server
- Expert in writing and implementing SQL queries

Posted 2 months ago

Apply

0 - 2 years

2 - 4 Lacs

Udaipur

Work from Office


Python Developer

Role: Python Developer
Industry: IT / Software
Location: Udaipur (Rajasthan)
Job Type: Full time
Experience: Freshers to 2 years
Skills: Python, database libraries, data architecture and dimension modeling, ETL/ELT frameworks
Salary: Best in the industry
Education: B.Tech (CS / IT / EC)

Description:
- Execution of data architecture and data management projects for both new and established data sources.
- Innovate and contribute to the development of the client’s data platforms using Python. Familiarity with transitioning existing datasets and databases to a new technology stack is helpful.
- Manage the end-to-end process for data ingestion and publishing.
- Perform data loads and data quality analysis to identify potential errors within the data platform.
- Work closely with operations teams to understand data flow and architecture, and gather functional requirements.
- Experience in a data production environment, with a focus on adeptly managing vast volumes of intricate data.
- Hands-on experience in SQL programming, data architecture, and dimension modeling.
- Expertise in Python programming, showcasing deep knowledge of libraries such as Beautiful Soup, Selenium, Requests, Pandas, plus data structures and algorithms (a small ingestion sketch follows this listing).
- Proficiency in crafting efficient, reusable, and modular code.
- In-depth knowledge of RDBMS, with the ability to design and optimize complex SQL queries.
- Relational database experience with MySQL, Postgres, Oracle or Snowflake is preferred.
- Expertise in mapping, standardizing, and normalizing data.
- Knowledge of ETL/ELT frameworks and writing pipelines for loading millions of records is helpful.
- Use of version control systems like Git, effectively managing code repositories.
- Strong analytical skills for addressing complex technical challenges, including proficiency in debugging and performance optimization techniques.
- A thorough understanding of the software development lifecycle, from requirements analysis to testing and deployment.
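For flavour, a small ingestion sketch with Requests and Pandas of the kind this role describes; the API URL and fields are invented for illustration:

```python
import pandas as pd
import requests

# Hypothetical public endpoint returning a JSON array of records.
resp = requests.get("https://api.example.com/v1/products", timeout=30)
resp.raise_for_status()

df = pd.DataFrame(resp.json())

# Standardize and normalize before loading downstream.
df["name"] = df["name"].str.strip().str.title()
df = df.drop_duplicates(subset="product_id")

# Stage as a file a separate loader picks up (or write via SQLAlchemy to an RDBMS).
df.to_parquet("staging/products.parquet", index=False)
print(f"staged {len(df)} rows")
```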

Posted 2 months ago

Apply

6 - 11 years

25 - 35 Lacs

Pune, Delhi NCR, India

Hybrid


Exp - 6 to 10 years
Role - Data Modeller
Position - Permanent
Job Locations - DELHI/NCR (Remote), PUNE (Remote), CHENNAI (Hybrid), HYDERABAD (Hybrid), BANGALORE (Hybrid)

Experience & Skills:
- 6+ years of experience with strong data modeling and data warehousing skills; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as of any ETL tool, data governance, and data quality.
- An eye for analyzing data; comfortable following agile methodology.
- A good understanding of any of the cloud services is preferred (Azure, AWS & GCP).

Posted 2 months ago

Apply