Posted: 22 hours ago
On-site | Full Time
Position Overview:
The Senior Architect will lead the design and development of modern data architectures leveraging Snowflake, Kafka, Splunk, Airflow, AWS, Apache Iceberg, and Presto. This role requires deep expertise in distributed data platforms, cloud infrastructure, real-time streaming, and scalable analytics solutions. The ideal candidate will provide technical leadership and shape the architectural vision for enterprise-scale systems.
Key Responsibilities
• Lead end-to-end architecture for secure, scalable, and high-performance data platforms utilizing Snowflake, Kafka, and AWS services.
• Architect, implement, and optimize real-time and batch data pipelines using Airflow, Apache Iceberg, Presto, and Kafka Streams.
• Drive integration of Splunk for operational analytics, logging, and monitoring across data platforms.
• Develop and enforce architecture best practices for metadata management, governance, streaming ingestion, and cloud-native data warehousing.
• Collaborate with data engineers and DevOps teams to deliver efficient APIs, data models, connectors, and cloud microservices.
• Optimize cost, performance, and reliability of solutions across AWS infrastructure and underlying data services.
• Lead technology strategies for advanced analytics, data lake implementation, self-service tools, and machine learning enablement.
• Mentor engineers, review code and architectures, communicate technical concepts to stakeholders, and create technical documentation.
Experience and Skills
• 10+ years’ experience in data architecture, cloud platforms, and distributed systems.
• Proven expertise in Snowflake data warehousing, including migration, data sharing, and performance optimization.
• Extensive hands-on experience with Kafka (CDC, streams, real-time use cases), Airflow (ETL orchestration), and Splunk (monitoring/log analysis).
• Deep knowledge of AWS data services and infrastructure (EC2, S3, Glue, Lambda, etc.).
• Practical experience architecting solutions with Apache Iceberg and Presto for large-scale analytics.
• Strong collaboration, problem solving, and communication skills, with a track record in mentoring and technical leadership.
• Bachelor’s degree in Computer Science, Engineering, or a related field; Master’s preferred.
• Experience with additional big data tools (Spark, Databricks) and modern DevOps practices.
• Knowledge of security, compliance, and multi-geo deployment for global enterprise data platforms.
• Certification in AWS or relevant cloud architecture is valued.
GeakMinds, Inc.
Chennai, Tamil Nadu, India
Salary: Not disclosed