Senior Architect – Data Platforms

Experience: 10 years


Posted: 22 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Position Overview:

The Senior Architect will lead the design and development of modern data architectures leveraging Snowflake, Kafka, Splunk, Airflow, AWS, Apache Iceberg, and Presto. This role requires deep expertise in distributed data platforms, cloud infrastructure, real-time streaming, and scalable analytics solutions. The ideal candidate will drive both technical leadership and architectural vision for enterprise-scale systems.

Key Responsibilities

•  Lead end-to-end architecture for secure, scalable, and high-performance data platforms utilizing Snowflake, Kafka, and AWS services.
•  Architect, implement, and optimize real-time and batch data pipelines using Airflow, Apache Iceberg, Presto, and Kafka Streams.
•  Drive integration of Splunk for operational analytics, logging, and monitoring across data platforms.
•  Develop and enforce architecture best practices for metadata management, governance, streaming ingestion, and cloud-native data warehousing.
•  Collaborate with data engineers and DevOps teams to ensure efficient APIs, data models, connectors, and cloud microservices.
•  Optimize cost, performance, and reliability of solutions across AWS infrastructure and underlying data services.
•  Lead technology strategies for advanced analytics, data lake implementation, self-service tools, and machine learning enablement.
•  Mentor engineers, review code and architectures, communicate technical concepts to stakeholders, and create technical documentation.

Experience and Skills

•  10+ years’ experience in data architecture, cloud platforms, and distributed systems.
•  Proven expertise in Snowflake data warehousing, including migration, data sharing, and performance optimization.
•  Extensive hands-on experience with Kafka (CDC, streams, real-time use cases), Airflow (ETL orchestration), and Splunk (monitoring/log analysis).
•  Deep knowledge of AWS data services and infrastructure (EC2, S3, Glue, Lambda, etc.).
•  Practical experience architecting solutions with Apache Iceberg and Presto for large-scale analytics.
•  Strong collaboration, problem-solving, and communication skills, with a track record of mentoring and technical leadership.
•  Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s degree preferred.

Preferred Skills

•  Experience with additional big data tools (Spark, Databricks) and modern DevOps practices.
•  Knowledge of security, compliance, and multi-geo deployment for global enterprise data platforms.
•  AWS or other relevant cloud architecture certification is a plus.
