Experience: 7 - 12 years

Salary: 9 - 14 Lacs

Posted: 19 hours ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Req ID: 330199
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Snowflake Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties: As a Senior Snowflake Engineer, you will be a member of the C3 Data Warehouse team, focused on building our next-generation data platform, which sources and stores data from technology systems across the firm into a centralized data platform that empowers reporting and analytics solutions for the Technology Risk functions within Morgan Stanley.

In this role, you will be primarily responsible for contributing to the development of our Cloud Data Warehouse using Snowflake and Python-based tooling. You will also design and develop our data warehouse using Snowflake capabilities such as data sharing, Time Travel, Snowpark, workload optimization across analytics and AI use cases, and the ingestion and storage of structured and unstructured data. In addition, you will integrate our Snowflake data warehouse with existing internal platforms for data quality, data cataloging, data discovery, incident logging, and metric generation.

You will work closely with data warehousing leads, data analysts, ETL developers, infrastructure engineers, and data analytics teams to facilitate the implementation of this data platform and data pipeline framework.
KEY RESPONSIBILITIES:
- Design, develop, and manage our Snowflake data warehouse.
- Contribute to establishing best practices for optimal and efficient use of Snowflake with tooling such as Airflow, dbt, and Spark.
- Assist with the testing and deployment of our data pipeline framework using standard testing frameworks and CI/CD tooling.
- Monitor the performance of queries and data loads, and perform tuning as necessary.
- Provide assistance and guidance during the QA and UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.
Minimum Skills Required:
- Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field.
- At least 7 years of experience in data development and solutions in highly complex data environments with large data volumes.
- At least 5 years of experience developing data solutions on Snowflake.
- At least 5 years of SQL/PL/SQL experience, with the ability to write ad-hoc and complex queries for data analysis.
- Preferred: 3 years of experience building data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark.
- Preferred: 3 years of experience developing solutions in a hybrid data environment (on-prem and cloud).
- Hands-on experience with Python is preferred.
