Big Data Technical Architect

10 - 16 years

35 - 45 Lacs

Posted: 6 days ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Hello,

"Big Data Architect"

Exp:

Loc:

Work Mode:

Notice Period:

NOTE: We are looking for immediate joiners only (notice period served or currently serving).

Apply only if you are an immediate joiner (notice period served or currently serving).

Apply only if you have 10+ years of relevant experience as per the JD.

Big Data Technical Architect

Roles & Responsibilities

Key Responsibilities:

  • Architect and guide implementation of Data Lakes, Lakehouses, and Data Warehouses using tools such as Snowflake, Microsoft Fabric, and Delta Lake.
  • Design and implement scalable, secure, and high-performing Big Data architectures across AWS and Azure.
  • Develop robust ETL/ELT pipelines using modern data services like AWS Glue, Azure Data Factory, Spark, and custom scripts in Python/SQL/PLSQL (a minimal PySpark sketch follows this list).
  • Integrate structured and unstructured data sources using API integrations, event-driven pipelines, real-time data ingestion, and batch processing.
  • Lead the BI and analytics layer strategy using tools such as Tableau, Power BI, and Qlik Sense for enterprise reporting and dashboarding.
  • Design and implement data models (conceptual, logical, physical) that support both operational and analytical requirements.
  • Establish and enforce data governance, data security, and data quality standards across platforms.
  • Drive initiatives in data observability, monitoring data pipelines, identifying issues, and ensuring SLA adherence.
  • Serve as a technical SME and advisor to both internal teams and clients, translating business needs into technical solutions.
  • Lead architectural reviews and provide guidance on data best practices and cloud optimization.
  • Develop and deliver technical presentations to executive and non-technical stakeholders.
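
As a rough illustration of the ETL/ELT and lakehouse responsibilities above, the following is a minimal PySpark sketch that reads raw CSV files, applies basic cleansing, and writes a partitioned Delta Lake table. The bucket paths, column names, and table layout are illustrative assumptions only, not details of any actual client environment.

    # Minimal PySpark ETL sketch: raw CSV -> cleaned, partitioned Delta Lake table.
    # All paths, schemas, and column names below are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("orders-etl-sketch")
        # Delta Lake support; assumes the delta-spark package is available on the cluster.
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )

    # Extract: read raw files from cloud object storage (S3 path is a placeholder;
    # an ADLS Gen2 abfss:// path works the same way on Azure).
    raw = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("s3://example-bucket/raw/orders/")
    )

    # Transform: deduplicate, fix types, and derive a partition column.
    orders = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("order_date", F.to_date("order_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("amount") > 0)
    )

    # Load: write a partitioned Delta table for downstream BI and analytics.
    (
        orders.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("s3://example-bucket/lakehouse/orders/")
    )

In practice the same pattern would typically run as an AWS Glue job or a Databricks/Azure Data Factory activity, with schema enforcement and incremental MERGE loads rather than a full overwrite.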

Domain Experience (Good to Have):

  • Exposure to the BFSI domain, including understanding of risk management, regulatory compliance (Basel III, PCI DSS), fraud detection, and financial data workflows.
  • Familiarity with Retail data challenges such as supply chain analytics, customer behavior tracking, inventory management, and omni-channel reporting.
  • Experience with the Pharma and Healthcare sectors, including clinical data management, regulatory compliance (HIPAA, GDPR), patient analytics, and drug safety data.
  • Ability to adapt data architecture and BI solutions to domain-specific requirements across these industries, supporting both operational and strategic business goals.

Required Skills:

  • Bachelor of Engineering (B.E./B.Tech) degree in Computer Science, Information Technology, Electronics, or a related field.
  • Strong hands-on experience with cloud platforms and related data technologies:
    o AWS: S3, AWS Glue, Redshift, Lambda, Kinesis Data Streams & Firehose, Managed Kafka (MSK), EMR (Spark), Athena, IAM, KMS.
    o Azure: Data Lake Storage Gen2, Synapse Analytics, Data Factory, Event Hubs, Stream Analytics, Managed Kafka, Databricks, Azure Functions, Active Directory, Key Vault.
  • Proven expertise in building and optimizing ETL/ELT pipelines using AWS Glue, Azure Data Factory, Apache Spark, and scripting languages like Python, SQL, and PL/SQL.
  • Solid experience with data lake and lakehouse strategies, and hands-on experience with modern data warehouse platforms such as Snowflake and Microsoft Fabric.
  • Skilled in real-time data ingestion and streaming technologies like Apache Kafka, AWS Kinesis, Azure Event Hubs, and Spark Streaming (a minimal streaming sketch follows this list).
  • Deep understanding of data modeling concepts (conceptual, logical, physical) and best practices for both OLTP and OLAP systems.
  • Expertise in business intelligence tools such as Tableau, Power BI, and Qlik Sense for enterprise-grade dashboards and analytics.
  • Strong grasp of data governance, data security (encryption, access control), data quality frameworks, and data observability tools like Monte Carlo, DataDog, or Great Expectations.
  • Familiarity with relevant data privacy and regulatory compliance standards (GDPR, CCPA, HIPAA, PCI DSS).
  • Excellent client-facing communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
  • Proven leadership and mentoring capabilities in guiding cross-functional teams.
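
To make the real-time ingestion requirement concrete, below is a minimal Spark Structured Streaming sketch that consumes JSON events from a Kafka topic and appends them to a Delta table. The broker address, topic name, event schema, and storage paths are hypothetical, and the spark-sql-kafka and Delta Lake connectors are assumed to be available on the cluster.

    # Minimal streaming ingestion sketch: Kafka topic -> Delta table.
    # Brokers, topic, schema, and checkpoint/output paths are illustrative assumptions.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import (
        StructType, StructField, StringType, DoubleType, TimestampType,
    )

    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Expected JSON payload of each Kafka message (hypothetical event schema).
    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("event_ts", TimestampType()),
        StructField("amount", DoubleType()),
    ])

    # Read the stream from Kafka (requires the spark-sql-kafka connector).
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "payment-events")
        .option("startingOffsets", "latest")
        .load()
        # Kafka delivers bytes; decode the value and parse the JSON into columns.
        .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
        .select("e.*")
    )

    # Append to a Delta table; the checkpoint provides restart and exactly-once semantics.
    query = (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/payment-events/")
        .outputMode("append")
        .start("s3://example-bucket/lakehouse/payment_events/")
    )

    query.awaitTermination()

Azure Event Hubs exposes a Kafka-compatible endpoint, so a reader like this can usually be pointed at Event Hubs by changing the bootstrap servers and adding the appropriate SASL options.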

Qualifications:

  • Bachelor of Engineering (B.E./B.Tech) degree in Computer Science, Information Technology, Electronics, or a related field.

Srs Business Solutions India

Industry: Information Technology

Location: Bangalore
