Job Description
Project description

This project is part of a strategic initiative to migrate legacy on-premises data systems to a modern, scalable cloud-based data platform using Snowflake on Azure. The goal is to enhance data accessibility, performance, and governance while enabling advanced analytics and automation. The resource will join an ongoing effort to accelerate the migration and modernization process, contributing to data pipeline development, orchestration, and optimization using tools such as Airflow, DBT, and Docker, with a strong emphasis on Python scripting and database engineering.

Responsibilities

Data Pipeline Development: Build and optimize ETL/ELT pipelines using Python, Airflow, and DBT to ingest, transform, and load data into Snowflake (a minimal illustrative sketch follows this description).

Cloud Migration Support: Assist in migrating existing on-prem data workflows and databases to Snowflake on Azure, ensuring minimal disruption and high data fidelity.

Data Modeling & Transformation: Design and implement robust data models and transformation logic using DBT and SQL, aligned with business requirements.

Containerization: Use Docker to containerize data services and workflows for consistent deployment across environments.

Database Engineering: Apply strong SQL and Snowflake expertise to optimize queries, manage schemas, and ensure data quality and performance.

Version Control & Collaboration: Use Git/GitHub for source control, code reviews, and CI/CD integration.

Monitoring & Troubleshooting: Monitor pipeline performance, troubleshoot issues, and implement logging and alerting mechanisms.

Security & Governance: Ensure data security, compliance, and governance standards are met throughout the migration process.

Skills

Must have: Python, Airflow/DBT, Docker, strong database skills (Snowflake), Git/GitHub, Azure

Nice to have: Certifications

Scope

The resource will help expedite work already in progress as part of the cloud migration of our on-prem data systems to Snowflake on Azure.
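For orientation only, here is a minimal sketch of the kind of pipeline the Data Pipeline Development responsibility describes: an Airflow DAG that runs a DBT project against Snowflake. The DAG id, schedule, and project path are hypothetical, and the sketch assumes Airflow 2.4+ with DBT's Snowflake connection configured in profiles.yml; it is not a prescribed implementation.

# Illustrative only: a minimal Airflow DAG of the sort this role would build.
# The dag_id, schedule, and DBT project path are placeholders, not details
# taken from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_to_snowflake",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # requires Airflow 2.4+
    catchup=False,
) as dag:
    # Transform data already landed in Snowflake by running the DBT project;
    # DBT reads its Snowflake credentials from profiles.yml.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",  # assumed path
    )

    # Validate the transformed models with DBT tests before downstream use.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )

    dbt_run >> dbt_test

In practice a DAG like this would typically run inside the Dockerized environment the Containerization responsibility mentions; the specifics above are assumptions, not requirements of the project.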