Job Description

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Data Engineering, Snowflake Data Warehouse
Good to have skills: Machine Learning
Minimum experience required: 7.5 year(s)
Educational Qualification: 15 years of full-time education
Summary: Lead the architecture, strategy, and delivery of enterprise-scale data platforms on Snowflake. Own technical direction for data modeling, ingestion, performance, security, governance, and developer experience. Mentor engineers and collaborate with stakeholders to deliver robust, scalable, and secure data solutions for analytics, AI/ML, and operational use cases.

Roles and Responsibilities:

Architecture & Strategy
- Define the target data platform architecture (multi-environment, multi-account, integration patterns).
- Establish technical standards for ELT, dbt/Snowpark, data contracts, and lifecycle management.
- Lead make/buy/standardize decisions for connectors, CDC/integration tools, orchestration, catalog, and observability.
- Design and implement CI/CD pipelines for data engineering workflows.
- Design and implement streaming data architectures and microservice-based architectures.
- Design and implement reliable ingestion and transformation pipelines (batch/streaming) using Snowflake features (Snowpipe, Streams & Tasks, external stages) and orchestration tools (Airflow/ADF).
- Drive data modeling best practices (dimensional, data vault, wide tables) with versioning, testing, and documentation.
- Enable CI/CD, environment promotion, automated data quality checks (dbt tests, Great Expectations), and blue-green deployments.

Performance, Reliability & Cost
- Lead workload management, query optimization, clustering, and caching strategies.
- Implement resource monitors, auto-suspend policies, and cost observability dashboards.
- Own SLAs/SLOs for key data products, incident runbooks, lineage-aware impact analysis, and post-incident reviews.

Security, Compliance & Sharing
- Define and enforce RBAC/ABAC, row/column-level security, masking policies, and secrets management.
- Partner with Security/GRC on auditing, data retention, PII/PHI protection, and regulatory compliance (GDPR/CCPA).
- Operationalize secure data sharing and clean-room patterns for internal and external collaboration.

Leadership & Influence
- Serve as the principal technical authority for Snowflake; mentor engineers and elevate team capability.
- Collaborate with Product, Analytics, and ML teams to deliver scalable, reusable data products.
- Author architecture decision records (ADRs), roadmaps, and executive-level technical narratives.
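For context on the ingestion responsibility above, a minimal sketch of the Streams & Tasks pattern is shown below. All table, stage, and warehouse names are hypothetical, and this is an illustrative outline rather than a production pipeline.

```
-- Hypothetical names throughout; a minimal sketch of incremental
-- ingestion with Streams & Tasks, assuming Snowpipe already lands
-- data in raw.orders.

-- Capture row-level changes on the raw landing table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders;

-- A scheduled task merges new rows into the curated table,
-- running only when the stream actually contains data.
CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  MERGE INTO analytics.orders AS tgt
  USING raw_orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET tgt.status = src.status
  WHEN NOT MATCHED THEN
    INSERT (order_id, status) VALUES (src.order_id, src.status);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders RESUME;
```

The WHEN clause keeps the warehouse suspended on empty intervals, which ties the pattern back to the cost-optimization responsibilities listed above.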
Technical Experience & Professional Attributes:
- 12+ years in data engineering/architecture; 6+ years designing and operating Snowflake at scale (multi-TB warehouses, thousands of daily jobs).
- Expert SQL and deep knowledge of Snowflake internals: virtual warehouses, micro-partitions, clustering, Query Profile, caching, Time Travel, Fail-safe.
- Hands-on with Snowflake workloads and features: Snowpipe, Streams & Tasks, stages/external tables, materialized views, Secure Data Sharing.
- Experience with at least one transformation framework (dbt strongly preferred) and programming language (Python, Snowpark, UDFs).
- Experience with orchestration (Airflow/ADF/Argo), CI/CD (Jenkins/GitHub Actions/GitLab/Azure DevOps), and IaC (Terraform).
- Fluency with cloud platforms (AWS/Azure/GCP), object storage (S3/ADLS/GCS), and networking (VPC/VNet, PrivateLink/Private Endpoints).
- Solid grounding in data governance (catalog/lineage tools), privacy and security controls, and observability (logs/metrics/traces for pipelines).
- Experience with Fivetran and with streaming tools such as Kafka, Pulsar, or equivalents.
- Performance tuning at scale: warehouse sizing, task concurrency, pruning/clustering, cost optimization.
- Excellent communication; able to present architecture trade-offs to engineers and executives.

Preferred Skills:
- SnowPro Advanced: Architect or SnowPro Advanced: Data Engineer certification.
- Data quality frameworks, data contracts, and contract-testing patterns.
Additional Information: You will be working with a trusted Tax Technology leader committed to delivering reliable and innovative solutions.