Auxo AI - Data Architect - Python Programming

10 - 12 years

Posted: 3 weeks ago | Platform: Foundit

Work Mode: On-site

Job Type: Full Time

Job Description

AuxoAI is hiring a Data Architect (GCP) to lead enterprise data platform design, architecture modernization, and solution delivery across global client engagements. In this client-facing role, you will architect scalable data platforms using GCP-native services, guide onshore/offshore data engineering teams, and define best practices across the ingestion, transformation, governance, and consumption layers.

Role

This role is ideal for someone who combines deep GCP platform expertise with leadership experience and is confident working with both engineering teams and executive stakeholders. Responsibilities:
  • Design and implement enterprise-scale data architectures using GCP services, with BigQuery as the central analytics platform
  • Lead end-to-end implementation of medallion architecture (Raw → Processed → Curated) patterns
  • Oversee data ingestion pipelines using Cloud Composer, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage
  • Implement scalable ELT workflows using Dataform and modular SQLX transformations
  • Optimize BigQuery workloads through advanced partitioning, clustering, and materialized views
  • Lead architectural reviews, platform standardization, and stakeholder engagements across engineering and business teams
  • Implement data governance frameworks leveraging tools like Atlan, Collibra, and Dataplex
  • Collaborate with ML teams to support Vertex AI-based pipeline design and model deployment
  • Enable downstream consumption through Power BI, Looker, and optimized data marts
  • Drive adoption of Infrastructure-as-Code (Terraform) and promote reusable architecture templates
  • Manage a distributed team of data engineers; set standards, review code, and ensure platform stability
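To make the medallion (Raw → Processed → Curated) pattern above concrete, here is a minimal sketch in plain Python. The record fields, layer functions, and deduplication key are illustrative assumptions for this example only; in practice the Raw layer would land in Cloud Storage or BigQuery and the transformations would run via Dataform or Dataflow, not in-process dicts.

```python
# Sketch of a Raw -> Processed -> Curated flow using plain Python
# structures in place of BigQuery tables. All names here
# (to_processed, to_curated, order_id, amount, ts) are illustrative
# assumptions, not part of any GCP API.
import json
from datetime import date

# Raw layer: records land as-is, e.g. from Pub/Sub or Cloud Storage.
raw = [
    '{"order_id": "A1", "amount": "19.99", "ts": "2024-05-01"}',
    '{"order_id": "A2", "amount": "5.00", "ts": "2024-05-01"}',
    '{"order_id": "A1", "amount": "19.99", "ts": "2024-05-01"}',  # duplicate
]

def to_processed(lines):
    """Processed layer: parse, type-cast, and deduplicate on order_id."""
    seen, out = set(), []
    for line in lines:
        rec = json.loads(line)
        if rec["order_id"] in seen:
            continue  # drop late or replayed duplicates
        seen.add(rec["order_id"])
        rec["amount"] = float(rec["amount"])
        rec["ts"] = date.fromisoformat(rec["ts"])
        out.append(rec)
    return out

def to_curated(records):
    """Curated layer: aggregate into a consumption-ready daily mart."""
    totals = {}
    for rec in records:
        totals[rec["ts"]] = totals.get(rec["ts"], 0.0) + rec["amount"]
    return totals

processed = to_processed(raw)
curated = to_curated(processed)
print(curated)
```

In a real GCP deployment each layer would be a BigQuery dataset, with the Processed → Curated step expressed as SQLX transformations in Dataform and the Curated tables partitioned and clustered for downstream Power BI or Looker consumption.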

Requirements

  • 10+ years of experience in data architecture and engineering
  • 4+ years of hands-on GCP experience, including BigQuery, Dataflow, Cloud Composer, Dataform, and Cloud Storage
  • Deep understanding of streaming + batch data patterns, event-driven ingestion, and modern warehouse design
  • Proven leadership of cross-functional, distributed teams in client-facing roles
  • Strong programming skills in Python and SQL
  • Experience working with data catalog tools (Atlan, Collibra), Dataplex, and enterprise source connectors
  • Excellent communication and stakeholder management skills

Preferred Qualifications

  • GCP Professional Data Engineer or Cloud Architect certification
  • Experience with Vertex AI Model Registry, Feature Store, or ML pipeline integration
  • Familiarity with AlloyDB, Cloud Spanner, Firestore, and enterprise integration tools (e.g., Salesforce, SAP, Oracle)
  • Background in legacy platform migration (Oracle, Azure, SQL Server)
(ref: hirist.tech)
