Data Architect

12 - 16 years

0 Lacs

Posted: 1 week ago | Platform: Foundit

Work Mode

On-site

Job Type

Full Time

Job Description

Role Overview

The Data Architect will design and govern scalable big data and analytics platforms on any leading cloud (AWS/Azure/GCP), working closely with sales and delivery teams to shape solutions for prospective and existing clients. The role combines hands-on architecture, technical leadership, and presales responsibilities, with a strong focus on SQL, Python, and modern data engineering practices.

Key Responsibilities

  • Design and own end-to-end data architectures including data lakes, data warehouses, and streaming pipelines using big data technologies (e.g., Spark, Kafka, Hive) on public cloud platforms.
  • Define canonical data models, integration patterns, and governance standards to ensure data quality, security, and compliance across the organization.
  • Lead presales activities: assess client requirements, run discovery workshops, define solution blueprints, size and estimate effort, and contribute to RFP/RFI responses and proposals.
  • Build and review PoCs/accelerators using SQL and Python (e.g., PySpark, notebooks) to demonstrate feasibility, performance, and business value to customers.
  • Collaborate with data engineers, BI/ML teams, and application architects to ensure the designed architecture is implemented as intended and is cost-efficient, scalable, and reliable.
  • Establish best practices for data security, access control, and lifecycle management in alignment with regulatory and enterprise policies.
  • Monitor and continuously optimize data platforms for performance, reliability, and cost, leveraging cloud-native services and observability tools.
  • Provide architectural guidance and mentoring to engineering teams; review designs and code for critical data components.
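The PoC-building responsibility above pairs SQL with Python for quick feasibility demos. As a minimal, hypothetical sketch (plain Python standing in for PySpark, since the posting names the framework but a PoC transform is the same logic at small scale), here is a tiny ETL step that applies a data-quality gate and aggregates event records — all names and data are illustrative:

```python
from collections import defaultdict

def aggregate_events(records):
    """Sum amounts per user, dropping malformed rows.

    A toy stand-in for the kind of validate-then-aggregate transform
    a Spark-based PoC would run at scale.
    """
    totals = defaultdict(float)
    for rec in records:
        # Data-quality gate: require a user id and a numeric amount.
        user = rec.get("user_id")
        amount = rec.get("amount")
        if user is None or not isinstance(amount, (int, float)):
            continue
        totals[user] += amount
    return dict(totals)

events = [
    {"user_id": "u1", "amount": 10.0},
    {"user_id": "u2", "amount": 5.5},
    {"user_id": "u1", "amount": 2.5},
    {"user_id": None, "amount": 99.0},  # dropped by the quality gate
]
print(aggregate_events(events))  # {'u1': 12.5, 'u2': 5.5}
```

In a real PoC the same gate-and-aggregate shape would be expressed as a PySpark DataFrame job so the demo scales from notebook samples to full datasets unchanged.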

Required Skills & Experience

  • 12-16 years of overall experience in data engineering/analytics, with 4+ years as a Data/Big Data Architect.
  • Strong expertise in SQL (analytical queries, performance tuning) and Python for data processing and automation.
  • Hands-on experience with big data frameworks and tools such as Spark, Kafka, the Hadoop ecosystem, distributed file systems, and modern ETL/ELT pipelines.
  • Practical experience on at least one major cloud platform (AWS, Azure, or GCP) with services such as data lakes, warehouse services (Redshift/Snowflake/BigQuery/Synapse), and orchestration tools.
  • Proven presales exposure: client workshops, solution design, RFP/RFI responses, effort estimation, and building PoCs or demos.
  • Strong understanding of data modeling (OLTP, OLAP, dimensional modeling), data governance, security, and compliance.
  • Ability to communicate complex data solutions clearly to both technical and business stakeholders.
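The "analytical queries" skill above typically means window functions and per-group ranking. A small self-contained sketch using Python's built-in sqlite3 as a hypothetical stand-in for a cloud warehouse (table and column names are invented for illustration):

```python
import sqlite3

# In-memory mini-warehouse with a single orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 120.0),
        ('acme', '2024-02-10', 80.0),
        ('beta', '2024-01-20', 200.0);
""")

# Rank each customer's orders newest-first with a window function,
# then keep only the most recent order per customer.
rows = conn.execute("""
    SELECT customer, order_date, amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY order_date DESC
           ) AS rn
    FROM orders
""").fetchall()
latest = [r for r in rows if r[3] == 1]
print(latest)
```

The same PARTITION BY / ROW_NUMBER pattern carries over directly to Redshift, Snowflake, BigQuery, or Synapse, which is what makes it a useful interview-level litmus test for analytical SQL.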

Grid Dynamics

Information Technology and Services

Los Altos

Hyderabad, Chennai, Bengaluru