Virtusa
Posted: 6 hours ago
On-site | Full Time
Key Responsibilities

- Lead and architect end-to-end data migrations from on-premises and legacy systems to Snowflake, ensuring optimal performance, scalability, and cost-efficiency.
- Design and develop reusable data ingestion and transformation frameworks using Python.
- Build and optimize real-time ingestion pipelines using Kafka, Snowpipe, and the COPY command (see Sketch 1 below).
- Use SnowConvert to migrate and optimize legacy ETL and SQL logic for Snowflake.
- Design and implement high-performance Snowflake data models, including materialized views, clustering keys, and result-caching strategies.
- Monitor resource usage and implement auto-suspend/auto-resume, query profiling, and cost-control measures to manage compute and storage effectively (see Sketch 2 below).
- Drive cost-governance initiatives, providing insight into credit usage and optimizing workload distribution.
- Integrate Snowflake with AWS services such as S3, Lambda, Glue, and Step Functions to ensure a robust data ecosystem.
- Mentor junior engineers, enforce best practices in development and code quality, and champion agile data engineering practices.

________________________________________

Required Skills and Experience

- 10+ years of experience in data engineering, with a focus on enterprise ETL and cloud data platforms.
- 4+ years of hands-on experience in Snowflake development and architecture.
- Expertise in advanced Snowflake features such as Snowpark, Streams & Tasks, Secure Data Sharing, Data Masking, and Time Travel (see Sketch 3 below).
- Proven ability to architect enterprise-grade Snowflake solutions optimized for performance, governance, and scalability.
- Proficiency in Python for building orchestration tools, automation, and reusable data pipelines.
- Solid knowledge of AWS services, including S3, IAM, Lambda, Glue, and Step Functions.
- Hands-on experience with SnowConvert or similar tools for legacy code conversion.
- Familiarity with real-time data streaming technologies such as Kafka, Kinesis, or other event-based systems.
- Strong SQL skills, with proven experience in query tuning, profiling, and performance optimization.
- Deep understanding of legacy ETL tools; experience with Ab Initio preferred.
- Exposure to CI/CD pipelines, version control systems (e.g., Git), and automated deployment practices.

________________________________________

Preferred Qualifications

- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience migrating on-premises or mainframe data warehouses to Snowflake.
- Familiarity with BI/analytics tools such as Tableau, Power BI, or Looker.
- Knowledge of data security and compliance best practices, including data masking, RBAC, and OAuth integration.
- Snowflake certifications (Developer, Architect) are a strong plus.
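Sketch 1: a minimal Python illustration of the COPY-based batch loading named in the ingestion responsibility, assuming the snowflake-connector-python package. The stage (@RAW_STAGE), table (RAW_EVENTS), file path, and connection parameters are hypothetical placeholders, not details from this posting.

import snowflake.connector

def load_file(conn, local_path: str) -> None:
    """PUT a local file onto an internal stage, then COPY it into a table."""
    with conn.cursor() as cur:
        # Upload to the internal stage; the connector compresses the file.
        cur.execute(f"PUT file://{local_path} @RAW_STAGE AUTO_COMPRESS=TRUE")
        # Bulk-load the staged files; skip bad rows instead of aborting.
        cur.execute(
            "COPY INTO RAW_EVENTS FROM @RAW_STAGE "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
            "ON_ERROR = 'CONTINUE'"
        )

if __name__ == "__main__":
    # Placeholder credentials; a real pipeline would use a secrets manager.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        load_file(conn, "/data/events.csv")
    finally:
        conn.close()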
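Sketch 2: a sketch of the auto-suspend/auto-resume and credit-cap controls the cost-management responsibility refers to, as Snowflake SQL run from Python over a connection opened as in Sketch 1. The warehouse and monitor names (XFORM_WH, MONTHLY_CREDITS) and the 100-credit quota are illustrative assumptions; creating resource monitors requires ACCOUNTADMIN privileges.

# Warehouse- and account-level cost controls; all names and quotas are examples.
COST_CONTROL_STATEMENTS = [
    # Suspend after 60 idle seconds; resume automatically on the next query.
    """CREATE WAREHOUSE IF NOT EXISTS XFORM_WH
         WAREHOUSE_SIZE = 'SMALL'
         AUTO_SUSPEND = 60
         AUTO_RESUME = TRUE""",
    # Cap monthly spend: notify at 90% of the quota, suspend at 100%.
    """CREATE OR REPLACE RESOURCE MONITOR MONTHLY_CREDITS
         WITH CREDIT_QUOTA = 100
         TRIGGERS ON 90 PERCENT DO NOTIFY
                  ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE XFORM_WH SET RESOURCE_MONITOR = MONTHLY_CREDITS",
]

def apply_cost_controls(conn) -> None:
    """Apply each control statement in order on an open connection."""
    with conn.cursor() as cur:
        for stmt in COST_CONTROL_STATEMENTS:
            cur.execute(stmt)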
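Sketch 3: an illustrative Streams & Tasks pattern for the incremental-processing expertise listed in the requirements: a stream captures changes on a raw table and a scheduled task moves new rows forward. All object and column names (RAW_EVENTS, CURATED_EVENTS, event_id, payload) are hypothetical, and the connection is assumed open as in Sketch 1.

# Change capture with a stream plus a scheduled task; names are examples.
STREAM_AND_TASK_STATEMENTS = [
    # Record inserts, updates, and deletes on the raw table.
    "CREATE STREAM IF NOT EXISTS RAW_EVENTS_STREAM ON TABLE RAW_EVENTS",
    # Every 5 minutes, copy new rows forward, but only when the stream has data.
    """CREATE TASK IF NOT EXISTS MERGE_EVENTS_TASK
         WAREHOUSE = XFORM_WH
         SCHEDULE = '5 MINUTE'
         WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
       AS
         INSERT INTO CURATED_EVENTS (event_id, payload)
         SELECT event_id, payload
         FROM RAW_EVENTS_STREAM
         WHERE METADATA$ACTION = 'INSERT'""",
    "ALTER TASK MERGE_EVENTS_TASK RESUME",  # tasks are created suspended
]

def deploy_stream_and_task(conn) -> None:
    """Create the stream and task, then resume the task so it runs."""
    with conn.cursor() as cur:
        for stmt in STREAM_AND_TASK_STATEMENTS:
            cur.execute(stmt)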