Databricks Architect

Experience: 14 - 24 years

Salary: 35 - 50 Lacs

Posted: 1 week ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Key Responsibilities:

  • Platform Architecture Design: Lead the design and architecture of the digital platform, ensuring that the data infrastructure is scalable, secure, and reliable. Focus on utilizing AWS services (e.g., S3, Redshift, Glue, Lambda, Kinesis) and Databricks to build a robust, cloud-based data architecture.
  • Data Integration & ETL Pipelines: Architect and implement ETL/ELT pipelines to integrate data from multiple sources (e.g., transactional databases, third-party services, APIs) into the platform, using AWS Glue, Databricks, and other tools for efficient data processing (a minimal pipeline sketch follows this list).
  • Cloud Strategy & Deployment: Implement cloud-native solutions, leveraging AWS tools and Databricks for data storage, real-time processing, machine learning, and analytics. Design the platform to be cost-efficient, highly available, and easily scalable.
  • Data Modelling: Develop and maintain data models for the platform that support business intelligence, reporting, and analytics. Ensure the data model design aligns with business requirements and the overall architecture of the platform.
  • Machine Learning & Analytics Enablement: Work with data scientists and analysts to ensure that the architecture supports advanced analytics and machine learning workflows, enabling faster time to insights and model deployment.
  • Data Security & Governance: Implement data governance frameworks to ensure data privacy, compliance, and security in the digital platform. Use AWS security tools and best practices to safeguard sensitive data and manage access control.
  • Platform Performance & Optimization: Monitor and optimize platform performance, including the efficiency of data processing, data retrieval, and analytics workloads. Ensure low-latency and high-throughput data pipelines.
  • Collaboration & Stakeholder Management: Collaborate closely with stakeholders across data engineering, data science, and business teams to align the platform architecture with business needs and evolving technological requirements.
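
For illustration, here is a minimal sketch of the kind of ETL pipeline described above: a PySpark job on Databricks that reads raw data landed in S3, applies basic cleansing, and appends the result to a Delta table. The bucket, paths, table, and column names are illustrative assumptions, not details taken from this role.

```python
# Minimal sketch (assumed names throughout): batch ETL from a raw S3 drop
# into a Delta table on Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

# Extract: read raw transactional data landed in S3 (hypothetical bucket/path)
raw = (spark.read
       .format("json")
       .load("s3://example-raw-bucket/orders/"))

# Transform: deduplicate, enforce types, and drop invalid rows
orders = (raw
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .filter(F.col("amount") > 0))

# Load: append to a Delta table partitioned by date for downstream analytics
(orders
 .withColumn("order_date", F.to_date("order_ts"))
 .write
 .format("delta")
 .mode("append")
 .partitionBy("order_date")
 .saveAsTable("analytics.orders"))
```

The same flow could equally be orchestrated as an AWS Glue job or a Databricks Workflows task; the sketch only shows the transform-and-load core.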

Skills & Qualifications:

Required:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 10+ years of experience in data architecture, data engineering, or a related field, with a strong background in designing scalable, cloud-based data platforms.
  • Extensive experience with AWS services such as S3, Redshift, Glue, Lambda, Kinesis, and RDS, with a deep understanding of cloud architecture patterns.
  • Strong proficiency in Databricks, including experience with Apache Spark, Delta Lake, and MLflow for building data pipelines, managing large datasets, and supporting machine learning workflows.
  • Expertise in data modelling techniques, including designing star/snowflake schemas and dimensional models, and ensuring data consistency and integrity across the platform (a brief star-schema sketch follows this list).
  • Experience with ETL/ELT processes, integrating data from a variety of sources, and optimizing data flows for performance.
  • Proficiency in programming languages such as Python and SQL for data manipulation, automation, and data pipeline development.
  • Strong knowledge of data governance and security practices, including data privacy regulations (GDPR, CCPA) and tools like AWS IAM, AWS KMS, and AWS CloudTrail.
  • Experience with CI/CD pipelines and automation tools for deployment, testing, and monitoring of data architecture and pipelines.
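
As context for the data modelling expertise listed above, the following sketch shows a simple star schema (one fact table, two dimensions) created as Delta tables with Spark SQL from Python, plus a typical reporting query. Schema, table, and column names are illustrative assumptions.

```python
# Minimal star-schema sketch (assumed schema/table/column names): Delta tables
# on Databricks created and queried via Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("CREATE SCHEMA IF NOT EXISTS dw")

# Dimension: customer attributes keyed by a surrogate key
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.dim_customer (
        customer_key BIGINT,
        customer_id  STRING,
        segment      STRING,
        country      STRING
    ) USING DELTA
""")

# Dimension: calendar attributes
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.dim_date (
        date_key  INT,
        full_date DATE,
        year      INT,
        month     INT
    ) USING DELTA
""")

# Fact: one row per sale, referencing the dimensions by their keys
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.fact_sales (
        customer_key BIGINT,        -- logical FK to dw.dim_customer
        date_key     INT,           -- logical FK to dw.dim_date
        quantity     INT,
        amount       DECIMAL(18,2)
    ) USING DELTA
""")

# A typical reporting query joins the fact table to its dimensions
revenue_by_segment = spark.sql("""
    SELECT c.segment, d.year, SUM(f.amount) AS revenue
    FROM dw.fact_sales f
    JOIN dw.dim_customer c ON f.customer_key = c.customer_key
    JOIN dw.dim_date     d ON f.date_key     = d.date_key
    GROUP BY c.segment, d.year
""")
revenue_by_segment.show()
```

A snowflake variant would simply normalize the dimensions further (for example, splitting country out into its own table).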

Preferred:

  • Experience with real-time streaming data solutions such as Apache Kafka or AWS Kinesis within the Databricks environment (see the streaming sketch after this list).
  • Experience with data lake management, particularly using AWS Lake Formation and Databricks Delta Lake for large-scale, efficient data storage and management.
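
To illustrate the streaming item above, here is a minimal Spark Structured Streaming sketch on Databricks that reads events from a Kafka topic and appends them to a Delta table; a Kinesis source would follow the same pattern with a different source format. The broker address, topic, checkpoint path, and table names are illustrative assumptions.

```python
# Minimal streaming sketch (assumed broker/topic/paths): Kafka -> Delta with
# Spark Structured Streaming on Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Source: subscribe to a Kafka topic (hypothetical broker and topic)
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1.example.com:9092")
          .option("subscribe", "clickstream")
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers key/value as binary; decode the value payload to a string column
decoded = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_ts"),
)

# Sink: continuously append into a Delta table; the checkpoint lets the stream
# recover after a restart without reprocessing (assumes the bronze schema exists)
query = (decoded.writeStream
         .format("delta")
         .option("checkpointLocation", "s3://example-bucket/checkpoints/clickstream/")
         .outputMode("append")
         .toTable("bronze.clickstream_raw"))
```

Downstream jobs can then refine this raw ("bronze") table into curated layers for analytics and machine learning.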

Soft Skills:

  • Strong communication skills, with the ability to explain complex technical concepts to business leaders and stakeholders.
  • Excellent problem-solving skills with the ability to architect complex, scalable data solutions.
  • Leadership abilities with a proven track record of mentoring and guiding data teams.
  • Collaborative mindset, capable of working effectively with cross-functional teams, including engineering, data science, and business stakeholders.
  • Attention to detail, with a focus on building high-quality, reliable, and scalable data solutions.

Company: Zensar

Industry: Information Technology and Services

Location: Mumbai
