Senior Data Architect (12+ Yr)

Experience: 12 - 15 years

Salary: 0 Lacs

Posted: 1 week ago | Platform: Foundit


Work Mode: Remote

Job Type: Full Time

Job Description

Job Title: Data Architect

Experience: 12-15 years

Location: Remote

Engagement: Full Time

Job Overview

An experienced Data Architect is needed to define the enterprise data architecture and to design and implement Data Lakes, Data Warehouses, and Data Lakehouse platforms supporting analytics, BI reporting, and AI/ML initiatives.

Key Responsibilities

  • Define and drive the enterprise data architecture strategy supporting analytics, BI reporting, and AI/ML initiatives.
  • Architect, design, and implement Data Lakes, Data Warehouses, and Data Lakehouse platforms on cloud environments.
  • Lead data ecosystem modernization with a strong emphasis on Snowflake, supported by Databricks, Spark, Azure Data Factory, and Python.
  • Design and optimize batch and real-time streaming data pipelines for high-volume, enterprise-scale workloads (see the sketch after this list).
  • Establish robust data ingestion, transformation, orchestration, and automation frameworks.
  • Define and enforce data governance, data quality, metadata management, and lineage standards.
  • Collaborate with business stakeholders, data engineers, analytics teams, and leadership to align data architecture with business goals.
  • Provide technical leadership and mentorship to engineering teams, promoting best practices and architectural standards.
  • Continuously evaluate and integrate emerging technologies to enhance performance, scalability, security, and cost efficiency.
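
To illustrate the streaming-pipeline responsibility above, here is a minimal, hypothetical sketch of the Databricks/Spark pattern the posting refers to: Spark Structured Streaming reads events from Kafka and appends them to a Delta table. The broker address, topic, schema, checkpoint path, and table name are placeholders, not details from the role.

```python
# Illustrative only: Kafka -> Spark Structured Streaming -> Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical event schema; real schemas come from the source systems.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream (placeholder broker and topic).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .load())

# Parse the JSON payload into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Append to a Delta table; the checkpoint makes the sink restartable.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/orders")
         .outputMode("append")
         .toTable("bronze.orders"))

query.awaitTermination()
```

The same DataFrame logic can be reused for batch backfills by swapping readStream/writeStream for read/write, which is one reason the Spark-based stack is called out for both batch and streaming work.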

Required Skills & Experience

  • 12-15 years of overall IT experience, with 8+ years in a Data Architect or Senior Data Architecture role.
  • Strong hands-on experience designing and implementing enterprise data platforms from scratch.
  • Core technical expertise in:
      • Snowflake (mandatory / primary focus)
      • Databricks and Apache Spark (batch & streaming)
      • Azure Data Factory (ADF) for orchestration and data pipelines
      • Python for data engineering, automation, and custom processing
  • Solid understanding of ETL/ELT patterns, data modeling, schema design, and query performance optimization (a short Snowflake example follows this list).
  • Experience with real-time data streaming technologies (e.g., Kafka, Spark Streaming, Azure Event Hubs).
  • Working knowledge of data governance, cataloging, lineage, compliance, and security frameworks.
  • Strong understanding of cloud-native services across Azure, AWS, or GCP.
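
For the Snowflake and ETL/ELT items above, a minimal sketch of an ELT-style load using the official snowflake-connector-python package: files are copied from a stage into a landing table and then transformed with SQL inside Snowflake. The account, credentials, warehouse, stage, and table names are hypothetical placeholders.

```python
# Illustrative only: ELT into Snowflake (load raw data, transform in-database).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder connection details
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load raw Parquet files from a pre-configured external stage (the "E" and "L").
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @ext_stage/orders/
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    # Transform inside Snowflake (the "T" of ELT) into a curated table.
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.DAILY_ORDER_TOTALS AS
        SELECT CAST(event_time AS DATE) AS order_date,
               SUM(amount) AS total_amount
        FROM RAW.ORDERS
        GROUP BY 1
    """)
finally:
    conn.close()
```

In practice the same pattern would typically be scheduled and parameterized from Azure Data Factory or another orchestrator rather than run as a standalone script.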
