Data Modeling & Architecture:
- Design and implement conceptual, logical, and physical data models, including Entity-Relationship (ER) models, Star Schema, Snowflake Schema, Data Vault Modeling, and Dimensional Modeling.
- Lead the design of normalized and denormalized structures to meet business requirements and ensure optimal performance of the Data Warehouse and Data Marts.
- Collaborate with business and technical teams to map business requirements to data models, ensuring that Master Data Management (MDM) processes and Golden Record concepts are well-defined.
- Build and maintain a comprehensive Business Glossary and Data Dictionary to standardize definitions and ensure consistency across the organization.
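The dimensional modeling responsibilities above can be illustrated with a minimal star-schema sketch: one fact table joined to two dimension tables via surrogate keys. All table, column, and value names here are hypothetical, and SQLite stands in for a cloud warehouse such as Snowflake purely so the example is self-contained.

```python
import sqlite3

# Minimal star-schema sketch (illustrative names only): a fact table
# with foreign keys to a date dimension and a customer dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,     -- surrogate key, e.g. 20240115
    full_date TEXT NOT NULL,
    year INTEGER, month INTEGER, day INTEGER
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY, -- surrogate key
    customer_id TEXT NOT NULL,        -- natural/business key
    name TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL NOT NULL
);
""")

cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1, 15)")
cur.execute("INSERT INTO dim_customer VALUES (1, 'C-001', 'Acme')")
cur.execute("INSERT INTO fact_sales VALUES (20240115, 1, 99.50)")

# A typical dimensional query: join the fact to its dimensions and aggregate.
row = cur.execute("""
    SELECT d.year, c.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY d.year, c.name
""").fetchone()
print(row)  # (2024, 'Acme', 99.5)
```

The surrogate keys in the dimensions (as opposed to natural business keys like `customer_id`) are what make slowly changing dimensions and Golden Record resolution tractable in practice.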
Data Lineage & Mapping:
- Ensure that data lineage is accurately defined, visualized, and documented across the Data Warehouse environment.
- Oversee the data mapping process to track the flow of data from source to destination, ensuring consistency, integrity, and transparency of data throughout its lifecycle.
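Source-to-destination mapping of the kind described above is often represented as a directed graph. The sketch below, with hypothetical node names, shows how transitive upstream lineage can be derived from such a mapping:

```python
# Minimal lineage sketch: each destination maps to the sources that
# feed it. Node names are illustrative only.
lineage = {
    "stg_orders": ["src_erp.orders"],
    "dim_customer": ["src_crm.customers"],
    "fact_sales": ["stg_orders", "dim_customer"],
}

def upstream(node, graph):
    """Return every source feeding `node`, transitively."""
    seen = set()
    stack = [node]
    while stack:
        for parent in graph.get(stack.pop(), []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

# fact_sales traces back through staging to both raw source systems.
print(sorted(upstream("fact_sales", lineage)))
```

Dedicated catalog tools (e.g., Atlan, mentioned later in this posting) maintain this graph automatically; the point of the sketch is only the shape of the data.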
Data Governance & Quality:
- Implement Data Governance processes to manage data access, quality, security, and compliance.
- Define and enforce Data Quality standards and practices, including Data Cleansing, to ensure data integrity and accuracy within the data warehouse environment.
- Work with stakeholders to establish governance frameworks for Data Lineage, ensuring data traceability and transparency across the platform.
- Work with data architects and IT leadership to establish guidelines for data access, data security, and lifecycle management.
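Data Quality standards like those described above typically compile down to rule-based row validation. A minimal sketch, with hypothetical rules and field names:

```python
# Minimal data-quality check sketch: validate rows against simple rules
# (non-null key, non-negative amount). Rules and fields are illustrative.
rows = [
    {"customer_id": "C-001", "amount": 99.5},
    {"customer_id": "C-001", "amount": -5.0},  # fails range rule
    {"customer_id": None, "amount": 10.0},     # fails null rule
]

def quality_report(rows):
    """Return (row_index, violation) pairs for every failed rule."""
    errors = []
    for i, r in enumerate(rows):
        if r["customer_id"] is None:
            errors.append((i, "null customer_id"))
        if r["amount"] < 0:
            errors.append((i, "negative amount"))
    return errors

print(quality_report(rows))
```

In a warehouse setting the same rules would usually live as declarative tests (e.g., DBT tests) rather than imperative code, but the pass/fail semantics are identical.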
Real-Time Data Ingestion & Change Data Capture (CDC):
- Design and implement real-time data ingestion pipelines using Kafka, AWS Kinesis, or Snowpipe to enable streaming data integration into the data warehouse.
- Implement Change Data Capture (CDC) mechanisms to efficiently capture and propagate data changes from operational systems using tools such as Fivetran or AWS Lambda.
- Ensure low-latency processing, incremental updates, and data availability for real-time analytics and reporting.
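Production CDC tools read the database transaction log, but the categories of change they emit can be sketched by diffing two keyed snapshots of a source table. All keys and values below are illustrative:

```python
# Minimal CDC sketch: diff two snapshots of a source table (dicts keyed
# by primary key) into insert/update/delete events. Real CDC tools tail
# the database log instead; this only illustrates the event types.
def cdc_diff(old, new):
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("insert", key, row))
        elif old[key] != row:
            events.append(("update", key, row))
    for key in old:
        if key not in new:
            events.append(("delete", key, old[key]))
    return events

old = {1: {"status": "open"}, 2: {"status": "open"}}
new = {1: {"status": "closed"}, 3: {"status": "open"}}
print(cdc_diff(old, new))
```

Applying such events incrementally, rather than reloading full snapshots, is what makes the low-latency, incremental updates mentioned above feasible.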
Quality Assurance & Continuous Improvement:
- Ensure high standards for data quality through rigorous testing, data validation, and performance optimization.
- Continuously evaluate and improve data modeling processes, tools, and methodologies.
Automation & Process Improvement:
- Work with Data Engineers and development teams to improve data platform automation and enhance the data modeling lifecycle.
- Continuously monitor, test, and optimize data models and pipelines to ensure scalability, flexibility, and performance of the Data Warehouse.
Documentation & Reporting:
- Maintain clear and up-to-date documentation for data models, data lineage, data mappings, and architectural decisions.
- Create and present technical diagrams, such as Entity-Relationship Diagrams (ERDs), to stakeholders and ensure alignment with business objectives.
Platform Design & Deployment:
- Develop data architecture for the analytics platform on Snowflake and integrate with other AWS tools for robust data management.
- Work closely with data engineers to automate data pipeline deployments and updates using Fivetran, DBT, and cloud-based solutions.
Stakeholder Collaboration:
- Partner with Product Managers and other technical teams to define requirements and deliver optimal data architecture solutions.
- Conduct regular meetings to communicate technical decisions and ensure alignment with business goals and strategy.
- Contribute to proposal creation and RFP submissions, ensuring technical feasibility and best practices.
Documentation & Reporting:
- Document all design decisions and data models, adhering to existing guidelines and ensuring clear communication across teams.
- Create presentations and visual data architecture diagrams for internal and external stakeholders.
Perform other duties that support the overall objective of the position.
Education Required:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Master's degree or certifications in Data Architecture, Cloud Technologies, or related areas are a plus.
- Or, any combination of education and experience which would provide the required qualifications for the position.
Experience Required:
- 6 to 10 years of hands-on experience in data modeling, data architecture, or information architecture, with a focus on large-scale data warehouses.
- 6+ years of experience with dimensional models and relational database management systems (SQL Server, Oracle, DB2, etc.).
- 5+ years of experience with cloud technologies, especially AWS services and tools.
- Experience with ETL tools and automation (e.g., Fivetran, DBT).
- Experience with data governance, data quality frameworks, and metadata management.
Preferred:
- Experience in healthcare data modeling and data warehousing.
- Expertise in AWS environments.
- Hands-on experience with data integration and cloud automation tools.
- Familiarity with business intelligence tools (e.g., Sigma Computing).
- Understanding of healthcare-specific data governance, regulatory frameworks, and security compliance (e.g., HIPAA).
Knowledge, Skills & Abilities:
- Knowledge of: Data Vault, Star Schema, Snowflake Schema, and dimensional modeling; proficiency in SQL and experience with cloud-based data warehouse solutions such as Snowflake; familiarity with AWS cloud services, Sigma Computing, Atlan, and Erwin ER diagrams.
- Skill in: Communicating effectively with both technical and non-technical stakeholders; applying strong analytical and problem-solving skills to design scalable, efficient data models.
- Ability to: Take ownership of deliverables, manage multiple tasks, and work effectively within an Agile methodology; coach and mentor junior team members.