
Experience

5 - 10 years

Salary

7 - 12 Lacs

Posted: 19 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Req ID: 330195
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking a Python Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Job Duties: As a Senior Python Engineer, you will be a member of the C3 Data Warehouse team, which is focused on building our next-gen data platform for sourcing and storing data from different technology systems across the firm into a centralized data platform that powers various reporting and analytics solutions for the Technology Risk functions within Morgan Stanley.

In this role you will be primarily responsible for contributing to the development of a unified data pipeline framework written in Python, utilizing technologies such as Airflow, DBT, Spark, and Snowflake. You will also contribute to the integration of this framework with existing internal platforms for data quality, data cataloging, data discovery, incident logging, and metric generation.

You will work closely with data warehousing leads, data analysts, ETL developers, infrastructure engineers, and data analytics teams to facilitate the implementation of this data platform and data pipeline framework.
- To develop various components in Python of our unified data pipeline framework.
- To contribute towards the establishment of best practices for the optimal and efficient usage of Snowflake.
- To assist with the testing and deployment of our data pipeline framework, utilizing standard testing frameworks and CI/CD tooling.
- To monitor the performance of queries and data loads and perform tuning as necessary.
- To provide assistance and guidance during the QA & UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.
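The duties above center on a dependency-ordered data pipeline framework. As a purely illustrative sketch (not NTT DATA's or Morgan Stanley's actual framework; task names and data are hypothetical), the core idea — running extract/transform/load steps in topological order, the way an Airflow DAG schedules its operators — can be shown with the Python standard library alone:

```python
from graphlib import TopologicalSorter

# Hypothetical tasks. In a real framework these would be Airflow
# operators, DBT models, or Spark jobs rather than plain functions.
def extract():
    # Stand-in for pulling rows from a source system.
    return [{"id": 1, "risk": "high"}, {"id": 2, "risk": "low"}]

def transform(rows):
    # Stand-in for a DBT/Spark transformation step.
    return [r for r in rows if r["risk"] == "high"]

def load(rows):
    # Stand-in for a Snowflake COPY/INSERT; returns rows "loaded".
    return len(rows)

# Declare the DAG as {task: set of upstream tasks}.
dag = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
funcs = {"extract": extract, "transform": transform, "load": load}

def run(dag, funcs):
    """Execute tasks in dependency order, feeding each task the
    results of its upstream tasks."""
    results = {}
    for name in TopologicalSorter(dag).static_order():
        upstream = [results[u] for u in dag[name]]
        results[name] = funcs[name](*upstream)
    return results

results = run(dag, funcs)
print(results["load"])  # → 1 (one high-risk row survives the filter)
```

Production frameworks add retries, scheduling, incident logging, and metric emission around exactly this kind of ordered execution; the sketch only shows the dependency-resolution core.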
Minimum Skills Required:
- At least 5 years of experience in data development and solutions in highly complex data environments with large data volumes.
- At least 5 years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark.
- At least 3 years of experience developing solutions in a hybrid data environment (on-prem and cloud).
- Exposure to Power BI / Snowflake.
