Data Architect - Snowflake/DBT

8–12 years

30 - 48 Lacs

Posted: 3 days ago | Platform: SimplyHired


Work Mode: On-site

Job Type: Full Time

Job Description

Title: Data Engineering Solutions Architect

Experience: 8–12 Years
Location: Hyderabad

Mandatory Skills

  • Data Architecture

  • Snowflake

  • dbt

  • Matillion

  • Python

  • AWS / Azure / GCP

  • Medallion Architecture

Required Qualifications

  • 8–12 years of experience in Data Engineering / Data Architecture with hands-on delivery experience.

  • Strong expertise in Snowflake or Databricks; hands-on with SQL, Python/PySpark, dbt, and Matillion.

  • Experience building and orchestrating pipelines with Airflow, Control-M, or similar tools.

  • Experience with cloud-native services such as AWS Glue, Redshift, Athena, GCP BigQuery, Dataflow, Pub/Sub, or equivalent.

  • Strong knowledge of data modeling, ETL/ELT design, and Medallion / layered / modular architectures.

  • Experience with real-time and API-based integrations such as Kafka, Kinesis, Pub/Sub, and REST APIs.

  • Familiarity with traditional ETL tools like Informatica, SSIS, and Oracle Data Integrator.

  • Strong understanding of data governance, performance tuning, and Data DevOps (CI/CD).

  • Excellent communication and stakeholder management skills.
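As a small illustration of the Medallion (bronze → silver → gold) layering named above, here is a minimal Python sketch of the pattern: raw records are cleaned into a silver layer, then aggregated into a reporting-ready gold layer. All table and field names here are hypothetical, and a real implementation would use Snowflake/dbt or PySpark rather than plain dictionaries.

```python
# Minimal sketch of Medallion-style layering: bronze (raw), silver
# (cleaned/conformed), gold (aggregated). All field names are hypothetical.
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean raw rows: drop records missing a key, normalize types and casing."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # skip malformed records (quarantined in a real pipeline)
        silver.append({
            "order_id": row["order_id"],
            "region": str(row.get("region", "unknown")).strip().lower(),
            "amount": float(row.get("amount", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a reporting-ready summary by region."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

bronze = [
    {"order_id": 1, "region": " APAC ", "amount": "120.5"},
    {"order_id": None, "region": "EMEA", "amount": "10"},  # dropped in silver
    {"order_id": 2, "region": "apac", "amount": 79.5},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'apac': 200.0}
```

The same shape maps directly onto dbt: each layer becomes a model, with silver models cleaning bronze sources and gold models aggregating silver.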


We are looking for an experienced Data Engineering Solutions Architect to join our growing Data Practice. The ideal candidate will have 8–12 years of hands-on experience designing, architecting, and delivering large-scale data warehousing, data lake, ETL/ELT, and reporting solutions across modern and traditional data platforms.

You will play a key role in defining scalable, secure, and cost-effective data architectures that enable advanced analytics and AI-driven insights for our clients. This role demands a strong mix of technical depth, solution leadership, and a consulting mindset.

Key Responsibilities

  • Design and architect end-to-end data solutions using Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, and cloud-native services across AWS, Azure, and GCP.

  • Build and manage data ingestion, transformation, integration, and orchestration frameworks for structured and semi-structured data.

  • Architect scalable data lakes and data warehouses with focus on performance, governance, and cost optimization.

  • Design solutions for real-time, streaming, micro-batch, and event-driven ingestion patterns.

  • Lead delivery of data visualization and reporting using Power BI, Tableau, and Streamlit.

  • Work closely with business and technical stakeholders to translate requirements into architecture blueprints.

  • Define and enforce engineering standards, reusable frameworks, and delivery best practices.

  • Mentor data engineers and support capability building on emerging technologies.

  • Provide thought leadership on data modernization, AI/ML enablement, and platform strategy.
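The orchestration responsibility above boils down to running tasks in dependency order. As a toy sketch of what Airflow or Control-M provides at scale, the following uses Python's standard-library topological sorter; the task names are hypothetical:

```python
# Toy dependency-ordered task graph illustrating pipeline orchestration.
# In practice Airflow or Control-M manages scheduling, retries, and state;
# task names here are hypothetical.
from graphlib import TopologicalSorter

# Each key lists the upstream tasks it depends on.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_sales": {"extract_orders", "extract_customers"},
    "publish_report": {"transform_sales"},
}

# static_order() yields every task after all of its dependencies.
execution_order = list(TopologicalSorter(dag).static_order())
print(execution_order)
```

An Airflow DAG expresses the same graph with operators and `>>` dependencies; the architectural concern is identical — declare dependencies, let the orchestrator schedule.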


Preferred Qualifications

  • Certifications: Snowflake SnowPro, Databricks Certified Architect, AWS Data Analytics Specialty, Google Professional Data Engineer.

  • Consulting or client-facing experience.

  • Exposure to AI/ML frameworks, data quality, or metadata management.

  • Experience in multi-cloud or hybrid architecture design.
