Posted: 1 day ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Data Engineer

Location: Bangalore

Experience: 10+ Years

About The Role

As a Data Engineer in the WLO Program, you will be responsible for designing, building, and managing robust data pipelines to support WLO data loads and downstream reporting requirements. You will play a crucial role in sourcing, curating, and transforming data for reporting, model monitoring, and other downstream consumption needs.

Working as part of a cross-functional squad with SMEs, Business Analysts, Data Analysts, and Engineers, you'll collaborate across teams to ensure seamless integration between business objectives and technical solutions. You'll also partner with teams in Data Portfolio, Assurance, Risk, Technology, and Operations to ensure successful project delivery aligned with ANZ's data standards and governance policies.

Aligned with ANZ's shift toward a Data Mesh and Data Product architecture on cloud platforms, you'll help establish scalable, compliant, and reusable data frameworks to support the next phase of data transformation.

Key Responsibilities

  • Design and develop data structures and processes for ingestion, integration, and analytics layers using AWS (Lambda) and Google Cloud Platform (GCP) technologies such as:
    • Cloud Run, Cloud SQL, Google Cloud Storage, BigQuery, Vertex AI
    • dbt (Data Build Tool) and Python (see the pipeline sketch after this list)
  • Collaborate in a cross-skilled squad, integrating efforts across business and technology to deliver end-to-end data engineering capabilities.
  • Contribute to data architecture design patterns for both on-premises and cloud-based solutions.
  • Optimize and maintain data flows, building robust, fault-tolerant pipelines that clean, transform, and aggregate unstructured or messy data into well-organized datasets for business use.
  • Implement automated controls to ensure data quality, process validation, auditability, and reconciliation.
  • Support data sourcing, transformation, and validation activities to ensure alignment with business requirements and downstream consumption needs.
  • Contribute to the continuous improvement of data engineering standards, documentation, and operational processes.
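
To make the pipeline responsibilities above concrete, here is a minimal Python sketch of the kind of clean-transform-load step the role describes, using pandas and the google-cloud-bigquery client. The file path, column names, and target table are hypothetical placeholders for illustration, not details taken from this role.

```python
import pandas as pd
from google.cloud import bigquery

# Hypothetical source file and target table -- the posting does not name
# actual datasets, so these identifiers are illustrative only.
SOURCE_PATH = "raw_events.csv"
TARGET_TABLE = "my-project.analytics.curated_events"

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Normalise messy raw data into a well-organised dataset."""
    df = df.drop_duplicates()
    # Standardise column names for downstream consumption.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Coerce timestamps; rows that fail to parse become NaT and are dropped.
    df["event_ts"] = pd.to_datetime(df["event_ts"], errors="coerce")
    return df.dropna(subset=["event_ts"])

def load_to_bigquery(df: pd.DataFrame, table_id: str) -> None:
    """Replace the target table with the curated dataframe."""
    client = bigquery.Client()  # uses Application Default Credentials
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
    # load_table_from_dataframe requires pyarrow to be installed.
    client.load_table_from_dataframe(df, table_id, job_config=job_config).result()

if __name__ == "__main__":
    load_to_bigquery(clean(pd.read_csv(SOURCE_PATH)), TARGET_TABLE)
```

In practice a step like this would run inside an orchestrated pipeline (for example Cloud Run or an AWS Lambda trigger) rather than as a standalone script, with the transform logic typically living in dbt models once the data lands in BigQuery.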

Key Skills

  • Strong hands-on experience in data pipeline design, ETL/ELT development, and cloud data engineering.
  • Practical experience with AWS and GCP cloud services (Lambda, BigQuery, Cloud Run, Vertex AI, etc.).
  • Proficiency in Python and dbt, with a focus on automation, reusability, and scalability.
  • Understanding of data governance, data quality, and audit/reconciliation controls (see the reconciliation sketch after this list).
  • Experience working in Agile / cross-functional teams.
  • Strong analytical, problem-solving, and communication skills.
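
As an illustration of the audit/reconciliation controls mentioned above, below is a minimal sketch of a row-count reconciliation between a staging and a curated BigQuery table. The project, dataset, and table names are invented for the example; a production control would typically also compare checksums or key-level aggregates, not just counts.

```python
from google.cloud import bigquery

# Placeholder table names; real pipelines would parameterise these.
SOURCE = "my-project.staging.wlo_loads"
TARGET = "my-project.analytics.wlo_curated"

def row_count(client: bigquery.Client, table_id: str) -> int:
    """Return the row count of a table via a COUNT(*) query."""
    query = f"SELECT COUNT(*) AS n FROM `{table_id}`"
    return next(iter(client.query(query).result())).n

def reconcile(source: str, target: str) -> None:
    """Fail loudly if source and target row counts diverge."""
    client = bigquery.Client()
    src, tgt = row_count(client, source), row_count(client, target)
    if src != tgt:
        raise ValueError(f"Reconciliation failed: source={src}, target={tgt}")
    print(f"Reconciliation OK: {src} rows in both tables")

if __name__ == "__main__":
    reconcile(SOURCE, TARGET)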
