Data Engineer (PySpark)

Experience: 0 years

Salary: 0 Lacs

Posted: 2 weeks ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Join our digital revolution in NatWest Digital X

In everything we do, we work to one aim: to make digital experiences which are effortless and secure. So we organise ourselves around three principles: engineer, protect, and operate. We engineer simple solutions, we protect our customers, and we operate smarter.

Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive. This role is based in India, and as such all normal working days must be carried out in India.

Join us as a Data Engineer
  • This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital first customer experiences
  • You’ll be simplifying the bank by developing innovative data driven solutions, using insight to be commercially successful, and keeping our customers’ and the bank’s data safe and secure
  • Participating actively in the data engineering community, you’ll deliver opportunities to support the bank’s strategic direction while building your network across the bank
  • We're offering this role at associate level
What you'll do
As a Data Engineer, you’ll play a key role in driving value for our customers by building data solutions. You’ll be carrying out data engineering tasks to build, maintain, test and optimise a scalable data architecture, as well as carrying out data extractions, transforming data to make it usable to data analysts and scientists, and loading data into data platforms. You’ll also be:
  • Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development
  • Practicing DevOps adoption in the delivery of data engineering, proactively performing root cause analysis and resolving issues
  • Collaborating closely with core technology and architecture teams in the bank to build data knowledge and data solutions
  • Developing a clear understanding of data platform cost levers to build cost effective and strategic solutions
  • Sourcing new data using the most appropriate tooling and integrating it into the overall solution to deliver for our customers
The skills you'll need
To be successful in this role, you’ll need a good understanding of data usage and dependencies with wider teams and the end customer, as well as experience of extracting value and features from large-scale data. You’ll also demonstrate:
  • Experience of ETL technical design, including data quality testing, cleansing and monitoring, and data warehousing and data modelling capabilities
  • At least five years of experience in Oracle PL/SQL, PySpark, AWS S3, AWS Glue, and Airflow
  • Experience of using programming languages alongside knowledge of data and software engineering fundamentals
  • Good knowledge of modern code development practices
  • Strong communication skills with the ability to proactively engage with a wide range of stakeholders
