Data Architect

8 - 13 years

25 - 30 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

About Exponentia.ai
Exponentia.ai is a fast-growing AI-first technology services company, partnering with enterprises to shape and accelerate their journey to AI maturity. With a presence across the US, UK, UAE, India, and Singapore, we bring together deep domain knowledge, cloud-scale engineering, and cutting-edge artificial intelligence to help our clients transform into agile, insight-driven organizations.
We are proud partners with global technology leaders such as Databricks, Microsoft, AWS, and Qlik, and have been consistently recognized for innovation, delivery excellence, and trusted advisory services.
Awards & Recognitions:
  • Innovation Partner of the Year - Databricks
  • Digital Impact Award, UK - (TMT Sector)
  • Rising Star - APJ Databricks Partner Awards
  • Qlik's Most Enabled Partner - APAC
With a team of 450+ AI engineers, data scientists, and consultants, we are on a mission to redefine how work is done by combining human intelligence with AI agents to deliver exponential outcomes.
About the Role:
We are looking for a highly skilled Data Architect with hands-on experience in modern cloud-based data platforms and strong working knowledge of Databricks. The candidate will architect scalable data ecosystems, design end-to-end data pipelines, and establish data standards to support advanced analytics, BI, and AI initiatives.
Key Responsibilities
Data Architecture & Platform Design
  • Design and implement scalable enterprise data architectures across cloud environments.
  • Develop conceptual, logical, and physical data models for analytical and operational use cases.
  • Define data ingestion, transformation, and integration patterns using Databricks, Delta Lake, and related frameworks.
  • Architect ELT/ETL pipelines leveraging Databricks Workflows, Delta Live Tables, or orchestration tools.
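The orchestration responsibility above boils down to sequencing dependent tasks. As a hedged illustration only (real deployments would use Databricks Workflows, Delta Live Tables, or Airflow, not hand-rolled code), here is a toy Python scheduler showing the dependency-ordering idea; the task names are hypothetical:

```python
# Toy sketch of ELT task orchestration via dependency ordering
# (a topological traversal). Illustrative only -- production
# pipelines would use Databricks Workflows or Airflow.

def run_pipeline(tasks, deps):
    """Run each task after its upstream dependencies; return the order."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)          # ensure inputs exist first
        tasks[name]()              # execute the task body
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

tasks = {
    "ingest":    lambda: None,  # land raw files
    "transform": lambda: None,  # clean and conform
    "publish":   lambda: None,  # load serving tables
}
deps = {"transform": ["ingest"], "publish": ["transform"]}
print(run_pipeline(tasks, deps))  # ['ingest', 'transform', 'publish']
```

Real orchestrators add what this sketch omits: retries, scheduling, cycle detection, and parallel execution of independent branches.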
Databricks & Lakehouse Responsibilities
  • Develop and optimize data pipelines on Databricks (SQL, Python, PySpark).
  • Implement Lakehouse architecture principles using Delta Lake, Unity Catalog, and Databricks compute clusters.
  • Optimize Spark jobs, cluster configurations, and cost/performance strategies.
  • Work with Databricks features such as feature store, MLflow, Delta Sharing, and workspace governance.
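The Lakehouse pipelines described above typically follow the bronze/silver/gold (medallion) layering pattern. The sketch below shows that shape in plain Python with made-up fields; in an actual Databricks Lakehouse these stages would be PySpark transformations over Delta tables:

```python
# Minimal medallion (bronze -> silver -> gold) sketch in plain Python.
# Field names and data are hypothetical; real pipelines would use
# PySpark DataFrames over Delta Lake.

def to_silver(bronze_rows):
    """Clean raw (bronze) rows: drop records missing an id and
    normalize the amount field to a float."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # drop/quarantine malformed records
        silver.append({
            "order_id": row["order_id"],
            "customer": row.get("customer", "unknown"),
            "amount": float(row.get("amount", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a per-customer revenue summary,
    the kind of table a BI dashboard would read."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "customer": "acme", "amount": "120.50"},
    {"order_id": None, "customer": "bad", "amount": "1"},   # malformed
    {"order_id": 2, "customer": "acme", "amount": "30"},
    {"order_id": 3, "customer": "globex", "amount": "99.99"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'acme': 150.5, 'globex': 99.99}
```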
Data Governance & Quality
  • Define data quality rules, lineage, metadata standards, and governance frameworks.
  • Collaborate with security teams to ensure compliance with data privacy and security requirements.
  • Implement governance structures using Unity Catalog, RBAC, and data access policies.
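Data-quality rules of the kind listed above are usually declared once and applied to every batch (Delta Live Tables calls these "expectations"). A minimal plain-Python sketch of that pattern, with hypothetical rule names and fields:

```python
# Hypothetical declarative data-quality rules, applied per row.
# Rule names and fields are illustrative, not from any real schema.

RULES = {
    "order_id_present": lambda r: r.get("order_id") is not None,
    "amount_non_negative": lambda r: float(r.get("amount", 0)) >= 0,
}

def apply_rules(rows, rules):
    """Split rows into passing and failing sets, recording which
    rules each failing row violated (a simple audit trail)."""
    passed, failed = [], []
    for row in rows:
        violations = [name for name, check in rules.items() if not check(row)]
        (failed if violations else passed).append((row, violations))
    return passed, failed

rows = [
    {"order_id": 1, "amount": "10"},
    {"order_id": None, "amount": "5"},
    {"order_id": 2, "amount": "-3"},
]
ok, bad = apply_rules(rows, RULES)
print(len(ok), len(bad))  # 1 2
```

Keeping the violated-rule names alongside each failing row is what makes quarantined records auditable later, which is the governance point.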
Cross-functional Collaboration
  • Partner with data engineers, analysts, AI/ML teams, and business stakeholders to deliver data-driven solutions.
  • Translate business needs into scalable, secure, and efficient data architectures.
  • Provide architectural guidance and best practices around Databricks and cloud data systems.
Strategy & Innovation
  • Evaluate data technologies and recommend tooling aligned with modernization and scalability goals.
  • Drive cloud migration and transformation initiatives, including legacy system modernization.
  • Contribute to the long-term enterprise data architecture roadmap.
Ideal Candidate Profile
  • Bachelor's/Master's degree in Computer Science, Information Systems, Engineering, or a related field.
  • 8+ years of experience in data architecture, data engineering, or advanced analytics platforms.
  • Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake).
  • Experience building lakehouse architectures on AWS/Azure/GCP.
  • Proficiency with ETL/ELT orchestration tools (Airflow, ADF, Databricks Workflows, DBT, etc.).
  • Deep understanding of distributed computing concepts and Spark optimization.
  • Strong experience with data warehousing (Snowflake, Redshift, BigQuery, Synapse) and data lakes (S3, ADLS, GCS).
  • Expertise in SQL, data modeling, and designing scalable data ecosystems.
  • Solid understanding of API integration, streaming platforms (Kafka, Kinesis), and REST architectures.
  • Experience with data governance and security frameworks.
Preferred Qualifications
  • Databricks certification (Data Engineer Associate/Professional, Data Architect).
  • Experience with MLflow, Feature Store, AutoML, or AI/ML pipelines on Databricks.
  • Knowledge of Iceberg/Hudi/Delta Lake and Lakehouse best practices.
  • Experience with CI/CD pipelines for data (Git, Jenkins, Azure DevOps, GitHub Actions).
Why Join Exponentia.ai?
  • Innovate with Purpose: Opportunity to create pioneering AI solutions in partnership with leading cloud and data platforms
  • Shape the Practice: Build a marquee capability from the ground up with full ownership
  • Work with the Best: Collaborate with top-tier talent and learn from industry leaders in AI
  • Global Exposure: Be part of a high-growth firm operating across US, UK, UAE, India, and Singapore
  • Continuous Growth: Access to certifications, tech events, and partner-led innovation labs
  • Inclusive Culture: A supportive and diverse workplace that values learning, initiative, and ownership
Ready to build the future of AI with us?
Apply now and become a part of a next-gen tech company that's setting benchmarks in enterprise AI solutions.

Exponentia Team

Business Consulting and Services

Mumbai, Maharashtra
