Posted: 2 hours ago
Work from Office
Full Time
Key Responsibilities:
Define and own the end-to-end architecture of data platforms built on Snowflake, including ingestion, transformation, storage, and consumption layers.
Design and implement data lakehouse architectures using Snowflake’s native features such as Streams, Tasks, Materialized Views, and External Tables.
Establish best practices for data governance, security, and access control using Role-Based Access Control (RBAC), Secure Views, and Row-Level Security.
Architect scalable ETL/ELT pipelines using Snowflake, DBT, and Python, integrating with orchestration tools such as Airflow.
Lead integration of Snowflake with enterprise systems such as data catalogs, data quality frameworks, monitoring tools, and BI platforms.
Apply working knowledge of AI integrations with Snowflake, including Snowflake Cortex and LLM-powered analytics, to enable intelligent data applications.
Guide teams in implementing CI/CD pipelines for Snowflake deployments using tools like GitHub Actions, Azure DevOps, or GitLab CI.
Monitor and optimize Snowflake performance using Query Profile, Resource Monitors, Clustering Keys, and Auto-Suspend/Resume features.
Collaborate with data analysts, engineers, and business stakeholders to ensure platform reliability and data accessibility.
Conduct architectural reviews, code audits, and mentoring sessions to ensure adherence to standards and scalability.
Stay current with Snowflake innovations and advocate for adoption of new features and capabilities.
Skills / Qualifications:
Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field required.
At least 10 years of experience in data development and solutions in highly complex data environments with large data volumes.
At least 10 years of SQL/PL-SQL experience, with the ability to write ad-hoc and complex queries for data analysis.
At least 2 years of experience developing data pipelines and data warehousing solutions using Python and Snowpark.
Familiarity with Snowflake’s architecture, including virtual warehouses, data storage layers, and compute scaling.
Hands-on experience with DBT preferred.
Experience with performance tuning of SQL queries, Spark jobs, and stored procedures.
An understanding of E-R data models (conceptual, logical, and physical).
Understanding of advanced data warehouse concepts is required.
Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
Strong verbal and written communication skills. Able to collaborate effectively across a variety of IT and business groups, regions, and roles, and to interact effectively with all levels of the organization.
Strong problem-solving skills. Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.
Tata Consultancy Services
Salary: Not disclosed