8.0 - 13.0 years
25 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities:
- Strong hands-on expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs, with a solid understanding of data lineage.
- Sound understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong command of ANSI SQL.
- Good understanding of data visualization and hands-on experience in data analysis.
- Working knowledge of data cataloging tools and data access policies.
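Roles like this one lean heavily on ANSI SQL for data-quality work. As an illustrative sketch only (the table and column names are invented, not from any posting), two common checks, completeness and key uniqueness, can be expressed in portable ANSI SQL and run here against Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical customer table used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER,
        email       TEXT
    );
    INSERT INTO customer VALUES (1, 'a@x.com'), (2, NULL), (2, 'b@x.com');
""")

# Completeness check: rows with a missing (NULL) email.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customer WHERE email IS NULL"
).fetchone()[0]

# Uniqueness check: keys that appear more than once.
dup_keys = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT customer_id FROM customer
        GROUP BY customer_id HAVING COUNT(*) > 1
    )
""").fetchone()[0]

print(null_emails, dup_keys)  # → 1 1
```

The same two queries run unchanged on most ANSI-compliant engines, which is the point of the "ANSI SQL" requirement in these ads.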
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Pune, Gurugram, Jaipur
Work from Office
Role & responsibilities:
- Strong hands-on expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs, with a solid understanding of data lineage.
- Sound understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong command of ANSI SQL.
- Good understanding of data visualization and hands-on experience in data analysis.
- Working knowledge of data cataloging tools and data access policies.
Posted 1 month ago
7.0 - 12.0 years
20 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities:
- Strong hands-on expertise in Erwin data modelling.
- Good knowledge of data quality and data catalogs, with a solid understanding of data lineage.
- Sound understanding of data modelling in both OLTP and OLAP systems.
- Experience with a data warehouse or big data architecture.
- Strong command of ANSI SQL.
- Good understanding of data visualization and hands-on experience in data analysis.
- Working knowledge of data cataloging tools and data access policies.
Posted 1 month ago
7.0 - 9.0 years
9 - 12 Lacs
Nagpur
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
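The dimensional-modelling requirement above boils down to a star schema: a fact table joined to dimension tables for aggregation. A minimal sketch, with invented table names (not taken from any banking system), runnable via sqlite3:

```python
import sqlite3

# Toy star schema for illustration only: one dimension, one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_branch (branch_id INTEGER PRIMARY KEY, city TEXT);
    CREATE TABLE fact_txn (txn_id INTEGER, branch_id INTEGER, amount REAL);
    INSERT INTO dim_branch VALUES (1, 'Nagpur'), (2, 'Pune');
    INSERT INTO fact_txn VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 25.0);
""")

# The typical dimensional query shape: aggregate the fact table,
# grouped by a descriptive attribute from the dimension.
rows = conn.execute("""
    SELECT d.city, SUM(f.amount)
    FROM fact_txn f
    JOIN dim_branch d ON d.branch_id = f.branch_id
    GROUP BY d.city
    ORDER BY d.city
""").fetchall()

print(rows)  # → [('Nagpur', 150.0), ('Pune', 25.0)]
```

In Snowflake the query would be the same apart from the connection; the schema shape, not the engine, is what the posting is testing for.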
Posted 1 month ago
7.0 - 9.0 years
9 - 12 Lacs
Kolkata
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Jaipur
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Visakhapatnam
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Nashik
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Pune
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Ludhiana
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Lucknow
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Use Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Posted 1 month ago
4.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2182_JOB
Date Opened: 13/04/2024
Industry: Technology
Work Experience: 4-6 years
Job Title: OBIEE Developer
City: Bangalore | Province: Karnataka | Country: India | Postal Code: 560029
Number of Positions: 5
Locations: Bangalore, Mumbai, Pune, Hyderabad, Chennai, Kolkata, Delhi

- 3-8 years of IT experience in the development and implementation of Business Intelligence and data warehousing solutions using OBIEE/OAC/OAS.
- Knowledge of analysis, design, development, customization, implementation, and maintenance of OBIEE/OAS/OAC.
- Must have experience working in OAC, including security design and implementation in OAC.
- Must have good knowledge and experience in RPD development and dimensional modelling, including but not limited to handling multiple fact tables, facts at different grain levels, MUDE environments, hierarchies, fragmentation, etc.
- Must have good knowledge of OAC DV; knowledge of Oracle DV/FAW/OBIA and BI Publisher is a plus.
- Sound knowledge of writing SQL and debugging queries.
- Strong front-end skills in developing OBIEE/OAS/OAC reports and dashboards using different views.
- Good knowledge of performance tuning of reports (indexing, caching, aggregation, SQL modification, hints, etc.).
- Excellent communication skills; organized and effective in delivering high-quality solutions using OBIEE/OAS/OAC.
Posted 1 month ago
10.0 - 15.0 years
30 - 40 Lacs
Noida, Gurugram
Work from Office
We're hiring for a Snowflake Data Architect with a leading IT services firm for Noida & Gurgaon.

Job Summary: We are seeking a Snowflake Data Architect to design, implement, and optimize scalable data solutions using Snowflake, Databricks, and the Azure ecosystem. The ideal candidate will have deep expertise in big data architecture, data engineering, and cloud technologies, enabling them to create robust, high-performance data pipelines and analytics solutions.

Key Responsibilities:
- Design and develop scalable, secure, and high-performance data architectures using Snowflake, Databricks, Delta Lake, and Apache Spark.
- Architect ETL/ELT data pipelines to process structured and unstructured data efficiently.
- Implement data governance, security, and compliance frameworks across cloud-based data platforms.
- Optimize Spark jobs for performance, cost, and reliability.
- Collaborate with data engineers, analysts, and business teams to understand requirements and design appropriate solutions.
- Develop data lakehouse architectures leveraging Databricks and ADLS.
- Implement machine learning and AI workflows using Databricks ML and integration with ML frameworks.
- Define and enforce best practices for data modeling, metadata management, and data quality.
- Monitor and troubleshoot Databricks clusters, job failures, and performance bottlenecks.
- Stay current with the latest Databricks features, Apache Spark advancements, and cloud innovations.

Required Qualifications:
- 10+ years of experience in data architecture, data engineering, or big data platforms.
- Hands-on experience with Snowflake is mandatory; experience with Databricks (including Delta Lake, Unity Catalog, DBSQL) is great to have as an addition.
- Will work in an individual-contributor role, with expertise in Apache Spark for large-scale data processing.
- Proficiency in Python, Scala, or SQL for data transformations.
- Experience with Azure and its data services (e.g., Azure Data Factory, Azure Synapse, Azure SQL Server).
- Knowledge of data lakehouse architectures, data warehousing, and ETL processes.
- Strong understanding of data security, IAM, and compliance best practices.
- Experience with CI/CD pipelines and Infrastructure as Code (Terraform, ARM templates, CloudFormation).
- Familiarity with MLflow, Feature Store, and MLOps concepts is a plus.
- Strong interpersonal and communication skills.

If interested, please share your profile at harjeet@beanhr.com
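The ETL/ELT pipeline work this architect role describes usually reduces to transform steps like the one sketched below: deduplicate, drop malformed records, and normalise types. This is a framework-agnostic sketch with invented field names; in practice the same logic would live in Snowflake SQL or a Spark job:

```python
# Sketch of a pipeline "transform" step: dedupe by id, drop malformed
# rows, and coerce amounts to float. Field names are illustrative only.

def transform(raw_rows):
    """Deduplicate by id, drop malformed rows, and normalise types."""
    seen, clean = set(), []
    for row in raw_rows:
        if "id" not in row or row.get("amount") is None:
            continue  # a real pipeline would quarantine these rows
        if row["id"] in seen:
            continue  # keep the first occurrence only
        seen.add(row["id"])
        clean.append({"id": row["id"], "amount": float(row["amount"])})
    return clean

raw = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "10.5"},   # duplicate
    {"id": 2, "amount": None},     # malformed
    {"id": 3, "amount": 7},
]
print(transform(raw))  # → [{'id': 1, 'amount': 10.5}, {'id': 3, 'amount': 7.0}]
```

The design choice worth noting is that cleansing happens after landing the raw data (ELT), so the unmodified source rows remain available for replay and audit.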
Posted 1 month ago
6.0 - 9.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Overview: We are looking for an experienced GCP BigQuery Lead to architect, develop, and optimize data solutions on Google Cloud Platform, with a strong focus on BigQuery. The role involves leading warehouse setup initiatives, collaborating with stakeholders, and ensuring scalable, secure, and high-performance data infrastructure.

Responsibilities:
- Lead the design and implementation of data pipelines using BigQuery, Datorama, Dataflow, and other GCP services.
- Architect and optimize data models and schemas to support analytics and reporting use cases.
- Implement best practices for performance tuning, partitioning, and cost optimization in BigQuery.
- Collaborate with business stakeholders to translate requirements into scalable data solutions.
- Ensure data quality, governance, and security across all BigQuery data assets.
- Automate workflows using orchestration tools.
- Mentor junior resources and lead script reviews, documentation, and knowledge sharing.

Qualifications:
- 6+ years of experience in data analytics, with 3+ years on GCP and BigQuery.
- Strong proficiency in SQL, with experience writing complex queries and optimizing performance.
- Hands-on experience with ETL/ELT tools and frameworks.
- Deep understanding of data warehousing, dimensional modeling, and data lake architectures.
- Good exposure to data governance, lineage, and metadata management.
- GCP Data Engineer certification is a plus.
- Experience with BI tools (e.g., Looker, Power BI).
- Good communication and team-lead skills.
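The partitioning and cost-optimization requirement has a simple mechanical core: BigQuery bills by bytes scanned, and a filter on the partitioning column lets the engine skip whole partitions. A conceptual sketch with invented data (not BigQuery's actual API, just the pruning idea):

```python
from datetime import date

# Each date-keyed bucket stands in for one BigQuery date partition.
partitions = {
    date(2024, 1, 1): [("evt1",), ("evt2",)],
    date(2024, 1, 2): [("evt3",)],
    date(2024, 1, 3): [("evt4",), ("evt5",)],
}

def scan(partitions, start, end):
    """Partition pruning: only partitions inside the filter range are read."""
    touched = [d for d in partitions if start <= d <= end]
    rows = [r for d in touched for r in partitions[d]]
    return len(touched), rows

touched, rows = scan(partitions, date(2024, 1, 2), date(2024, 1, 3))
print(touched, len(rows))  # → 2 3
```

The practical rule this illustrates: a query whose WHERE clause constrains the partition column scans (and bills for) only the matching partitions, which is why partition design is listed next to cost optimization.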
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Translate business needs into technical specifications.
- Design, build, and deploy BI solutions (e.g., reporting tools).
- Develop and execute database queries and conduct analyses.
- Create visualizations and reports for requested projects.
- Conduct unit testing and troubleshooting.
- Collaborate with teams to integrate systems within an Agile delivery process.
- Maintain and support data platforms.
- Evaluate and improve existing BI systems.
- Proven ability to take initiative and be innovative.

Required Knowledge & Skills:
- BS/MS in Computer Science or Information Systems, along with work experience in a related field.
- 6+ years of experience building visualizations, reports, and dashboards (Microsoft Power BI).
- Knowledge of SQL queries, SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS), PostgreSQL, and Snowflake.
- Background in data warehouse design (e.g., dimensional modelling) and data mining.
- Ability to implement row-level security on data.
- Experience in production-support activities is an added advantage.
- Certification is preferred.
- Experience with Python and Tableau is an added advantage.
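The row-level security requirement above means filtering data per viewer, which Power BI implements with RLS roles and DAX filters. As a language-neutral sketch of the idea only (the user-to-region mapping and sales rows are invented):

```python
# Row-level security sketch: each user sees only rows for their region,
# analogous to a Power BI RLS role filtering a table by the signed-in user.
USER_REGION = {"asha": "South", "dmitri": "North"}  # hypothetical mapping

SALES = [
    {"region": "South", "amount": 120},
    {"region": "North", "amount": 80},
    {"region": "South", "amount": 40},
]

def rows_for(user):
    """Return only the rows the given user is entitled to see."""
    region = USER_REGION[user]
    return [r for r in SALES if r["region"] == region]

print(sum(r["amount"] for r in rows_for("asha")))  # → 160
```

The key property, regardless of tool, is that the filter is applied by the platform for every query the user issues, not left to the report author to remember.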
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture.
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas).
- Ensure data quality, security, and performance.
- Work with BI teams to support analytics and reporting needs.

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.).
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.).
- Strong understanding of dimensional modeling and OLAP.
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow).

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies
Posted 1 month ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a top MNC for a Data Modeler position (long-term contract, 2+ years). The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities:
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
- Implement data models for relational, dimensional, and data lake environments on target platforms.
- Collaborate with the offshore Data Engineer and onsite Data Modernization Architect to ensure model alignment.
- Define and govern data modeling standards, tools, and best practices.
- Optimize data structures for query performance and scalability.
- Provide updates on modeling progress and dependencies to the offshore Project Manager.

Skills:
- Bachelor's or master's degree in computer science, data science, or a related field.
- 5+ years of data modeling experience with relational and NoSQL platforms.
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
- Experience with Microsoft Fabric, data lakes, and BI data structures.
- Strong analytical and communication skills for team collaboration.
- Attention to detail with a focus on performance and consistency.
- Strong management, communication, and presentation skills.
Posted 1 month ago
6.0 - 11.0 years
30 - 40 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- Bachelor's degree preferred, or an equivalent combination of education, training, and experience.
- 5+ years of professional experience with SQL, ETL, data modeling, and at least one programming language (e.g., Python, C++, C#, Scala).
- 3+ years of professional experience with enterprise domains such as HR, Finance, and Supply Chain.
- 6+ years of professional experience with more than one SQL/relational database, including expertise in Presto, Spark, and MySQL.
- Professional experience designing and implementing real-time pipelines (Apache Kafka or similar technologies).
- 5+ years of professional experience in custom ETL design, implementation, and maintenance.
- 3+ years of professional experience with data modeling, including expertise in data warehouse design and dimensional modeling.
- 5+ years of professional experience working with a cloud or on-premises big data/MPP analytics platform (Teradata, AWS Redshift, Google BigQuery, Azure Synapse Analytics, or similar).
- Experience with data quality and validation (using Apache Airflow).
- Experience with anomaly/outlier detection.
- Experience with data science workflows (Jupyter Notebooks, Bento, or similar tools).
- Experience with Airflow or similar workflow management systems.
- Experience querying massive datasets using Spark, Presto, Hive, or similar.
- Experience building systems integrations and tooling interfaces, and implementing integrations for ERP systems (Oracle, SAP, Salesforce, etc.).
- Experience in data visualization using Power BI and Tableau.
- Proficiency in the Python programming language and Python libraries, with a focus on data engineering and data science applications.
- Professional fluency in English required.
Posted 2 months ago
5.0 - 10.0 years
4 - 8 Lacs
Chennai, Guindy
Work from Office
Data Modeller | Chennai - Guindy, India | Information Technology | 17074

Overview: A Data Modeller is responsible for designing, implementing, and managing data models that support the strategic and operational needs of an organization. This role involves translating business requirements into data structures, ensuring consistency, accuracy, and efficiency in data storage and retrieval processes.

Responsibilities:
- Develop and maintain conceptual, logical, and physical data models.
- Collaborate with business analysts, data architects, and stakeholders to gather data requirements.
- Translate business needs into efficient database designs.
- Optimize and refine existing data models to support analytics and reporting.
- Ensure data models support data governance, quality, and security standards.
- Work closely with database developers and administrators on implementation.
- Document data models, metadata, and data flows.

Requirements:
- Bachelor's or master's degree in Computer Science, Information Systems, Data Science, or a related field.
- Data modeling tools: ER/Studio, ERwin, SQL Developer Data Modeler, or similar.
- Database technologies: proficiency in SQL and familiarity with databases such as Oracle, SQL Server, MySQL, and PostgreSQL.
- Data warehousing: experience with dimensional modeling and star and snowflake schemas.
- ETL processes: knowledge of extract, transform, load processes and tools.
- Cloud platforms: familiarity with cloud data services (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
- Metadata management and data governance: understanding of data cataloging and governance principles.
- Strong analytical and problem-solving skills.
- Excellent communication skills to work with business stakeholders and technical teams.
- Ability to document models clearly and explain complex data relationships.
- 5+ years in data modeling, data architecture, or related roles.
- Experience working in Agile or DevOps environments is often preferred.
- Understanding of normalization/denormalization.
- Experience with business intelligence and reporting tools.
- Familiarity with master data management (MDM) principles.
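The normalization/denormalization requirement is easy to show concretely. A minimal sketch with invented data: the normalized form keeps each product name once in a dimension; the denormalized (reporting) form copies it into every order row so report queries need no join:

```python
# Normalized: product names stored once, referenced by id.
products = {1: "Widget", 2: "Gadget"}               # dimension
orders = [(101, 1, 2), (102, 2, 1), (103, 1, 5)]    # (order_id, product_id, qty)

# Denormalized for reporting: the name is duplicated into each row,
# trading storage and update cost for join-free reads.
denormalized = [
    {"order_id": o, "product": products[p], "qty": q}
    for (o, p, q) in orders
]
print(denormalized[0])  # → {'order_id': 101, 'product': 'Widget', 'qty': 2}
```

The trade-off a modeller weighs: normalized structures avoid update anomalies (rename "Widget" in one place), while denormalized structures make analytical scans cheap, which is why warehouses lean denormalized and OLTP systems lean normalized.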
Posted 2 months ago
5.0 - 9.0 years
14 - 19 Lacs
Chennai
Work from Office
Project description: We are seeking a highly skilled Senior Power BI Developer with strong expertise in Power BI, SQL Server, and data modeling to join our Business Intelligence team. In this role, you will lead the design and development of interactive dashboards, robust data models, and data pipelines that empower business stakeholders to make informed decisions. You will work collaboratively with cross-functional teams and drive the standardization and optimization of our BI architecture.

Responsibilities:

Power BI Dashboard Development (UI Dashboards)
- Design, develop, and maintain visually compelling, interactive Power BI dashboards aligned with business needs.
- Collaborate with business stakeholders to gather requirements, develop mockups, and refine dashboard UX.
- Implement advanced Power BI features such as bookmarks, drill-throughs, dynamic tooltips, and DAX calculations.
- Conduct regular UX/UI audits and performance tuning on reports.

Data Modeling in SQL Server & Dataverse
- Build and manage scalable, efficient data models in Power BI, Dataverse, and SQL Server.
- Apply best practices in dimensional modeling (star/snowflake schema) to support analytical use cases.
- Ensure data consistency, accuracy, and alignment across multiple sources and business areas.
- Optimize models and queries for performance and load times.

Power BI Dataflows & ETL Pipelines
- Develop and maintain reusable Power BI Dataflows for centralized data transformations.
- Create ETL processes using Power Query, integrating data from diverse sources including SQL Server, Excel, APIs, and Dataverse.
- Automate data refresh schedules and monitor dependencies across datasets and reports.
- Ensure efficient data pipeline architecture for reuse, scalability, and maintenance.

Skills (must have):
- Experience: 6+ years in Business Intelligence or Data Analytics with a strong focus on Power BI and SQL Server.
- Technical skills: expert-level Power BI development, including DAX, custom visuals, and report optimization; strong knowledge of SQL (T-SQL) and relational database design; experience with Dataverse and Power Platform integration; proficiency in Power Query, Dataflows, and ETL development.
- Modeling: proven experience in dimensional modeling, star/snowflake schemas, and performance tuning.
- Data integration: skilled in connecting and transforming data from various sources, including APIs, Excel, and cloud data services.
- Collaboration: ability to work with stakeholders to define KPIs, business logic, and dashboard UX.

Nice to have: N/A
Other: Languages: English C1 (Advanced). Seniority: Senior.
Posted 2 months ago
5.0 - 10.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

The ETL Developer is responsible for the design, development and maintenance of various ETL processes. This includes designing and developing processes for various types of data, potentially large datasets, and disparate data sources that require transformation and cleansing to become a usable data set. The candidate should be able to find creative solutions to complex and diverse business requirements, and should have a solid working knowledge of programming languages, data analysis, design, and ETL tool sets, along with a solid background in data engineering technologies. The candidate must possess excellent written and verbal communication skills, with the ability to collaborate effectively with business and technical experts on the team.

Primary Responsibility: Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Graduate degree or equivalent experience
- 6+ years of development, administration and migration experience in Azure Databricks and Snowflake
- 6+ years of experience with data design patterns: data warehousing, dimensional modeling, and Lakehouse Medallion Architecture
- 5+ years of experience working with Azure Data Factory
- 5+ years of experience setting up, maintaining and using Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, etc.
- 5+ years of experience working with Python and PySpark
- 3+ years of experience with Kafka
- Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
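The Lakehouse Medallion Architecture named in the qualifications above (bronze → silver → gold layers) can be illustrated with a minimal pure-Python sketch. In practice this would be PySpark on Databricks over Delta tables; the record fields and quality rule here are made up for illustration.

```python
# Hypothetical medallion-style pipeline: raw -> cleansed -> aggregated.

# Bronze layer: raw ingested records, possibly malformed.
bronze = [
    {"member_id": "A1", "claim_amount": "125.50"},
    {"member_id": "",   "claim_amount": "99.00"},   # fails quality rule
    {"member_id": "A2", "claim_amount": "200.00"},
]

# Silver layer: cleansed and typed; drop rows failing a basic quality rule.
silver = [
    {"member_id": r["member_id"], "claim_amount": float(r["claim_amount"])}
    for r in bronze
    if r["member_id"]  # illustrative rule: member_id must be present
]

# Gold layer: analytics-ready aggregate, total claim amount per member.
gold = {}
for r in silver:
    gold[r["member_id"]] = gold.get(r["member_id"], 0.0) + r["claim_amount"]

print(gold)
```

The design point is the layering itself: each layer is a materialized, progressively more trusted copy of the data, so downstream consumers read gold without re-running cleansing logic.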
Posted 2 months ago
4.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Serve as a subject matter expert (SME) in one or more healthcare domains
- Analyze and document the client's business requirements and processes, and communicate these requirements by constructing conceptual data and process models, including data dictionaries and functional design documents
- Collaborate with data teams, departments, other IT groups, and technology vendors to define data needs and facilitate analytic solutions
- Provide input into developing and modifying data warehouse and analytic systems to meet client needs, and develop business specifications to support these modifications
- Communicate complex technical and functional concepts verbally and in writing
- Lead socialization and consensus-building efforts for innovative data and analytic solutions
- Identify opportunities for reuse of data across the enterprise; profile and validate data sources
- Create test scenarios and develop test plans used in testing the data products, to verify that client requirements are incorporated into the system design
- Assist in analyzing testing results throughout the project
- Participate in architecture and technical reviews to verify that the 'intent of change' is carried out through the entire project
- Perform root cause analysis and application resolution
- Assign work and develop less experienced team members
- Ensure proper documentation and on-time delivery of all functional artifacts and deliverables
- Document and communicate architectural vision, technical strategies, and trade-offs to gain broad buy-in
- Reduce inefficiencies through rationalization and standards adherence
- Identify, update, and curate data standardization and data quality rules
- Lead and provide direction for data management, data profiling, and source-to-target mapping
- Optimize and troubleshoot data engineering processes
- Work independently while partnering effectively with a broader team to design and develop enterprise data solutions
- Creatively take on new challenges and work outside your comfort zone
- Communicate comfortably with clinicians about information options and solutions
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment)
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
- Revenue Cycle Management domain experience
- 7+ years in a healthcare data warehouse setting, with experience profiling and analyzing disparate healthcare datasets (financial, clinical quality, value-based care, population health, revenue cycle analytics, health system operations, etc.) and the ability to convert this data into insights
- 7+ years working with healthcare datasets, with the ability to convert business requirements into functional designs that are scalable and maintainable
- 7+ years of experience with Oracle/SQL Server databases, including T-SQL, PL/SQL, indexing, partitioning, and performance tuning
- 7+ years of experience creating source-to-target mappings and ETL designs (using SSIS / Informatica / DataStage) for integrating new or modified data streams into the data warehouse/data marts
- 5+ years of experience designing and implementing data models to support analytics, with solid knowledge of dimensional modeling concepts
- Experience with Epic Clarity and/or Caboodle data models
- Healthcare domain knowledge

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
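The source-to-target mapping work described in this listing can be sketched in a few lines of Python. This is a hedged illustration only: real mappings of this kind are implemented in SSIS, Informatica, or DataStage, and the column names and casts below are hypothetical.

```python
# Hypothetical source-to-target mapping: each source column maps to a
# target warehouse column plus a type cast applied during the move.
MAPPING = {
    "pat_id":  ("patient_key", int),
    "svc_dt":  ("service_date", str),
    "chg_amt": ("charge_amount", float),
}

def map_record(source_row):
    """Apply the column mapping, producing a row in the target schema."""
    return {
        target: cast(source_row[src])
        for src, (target, cast) in MAPPING.items()
    }

row = {"pat_id": "1001", "svc_dt": "2024-01-15", "chg_amt": "350.75"}
print(map_record(row))
```

Keeping the mapping as data rather than code is what makes such designs maintainable: adding a new data stream means extending the mapping table, not rewriting the load logic.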
Posted 2 months ago
2.0 - 5.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Primary Responsibility: Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Graduate degree or equivalent experience
- 8+ years of development, administration and migration experience in Azure Databricks and Snowflake
- 8+ years of experience with data design patterns: data warehousing, dimensional modeling, and Lakehouse Medallion Architecture
- 5+ years of experience working with Azure Data Factory
- 5+ years of experience setting up, maintaining and using Azure services such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure SQL Database, etc.
- 5+ years of experience working with Python and PySpark
- 3+ years of experience with Kafka
- Excellent communication skills to effectively convey technical concepts to both technical and non-technical stakeholders
Posted 2 months ago