
Principal Data Architect

Experience: 8 - 13 years

Salary: 15 - 30 Lacs

Posted: 4 days ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Essential Responsibilities

Architecture & Design
- Define and document the overall data platform architecture in GCP, including ingestion (Pub/Sub, Dataflow), storage (BigQuery, Cloud Storage), and orchestration (Composer, Workflows).
- Establish data modeling standards (star/snowflake schemas, partitioning, clustering) to optimize performance and cost (a table-design sketch follows this description).

Platform Implementation
- Build scalable, automated ETL/ELT pipelines for IoT telemetry and events (a streaming pipeline sketch follows this description).
- Implement streaming analytics and CDC where required to support real-time dashboards and alerts.

Data Products & Exchange
- Collaborate with data scientists and product managers to package curated datasets and ML feature tables as consumable data products.
- Architect and enforce a secure, governed data exchange layer, leveraging BigQuery Authorized Views, Data Catalog, and IAM, to monetize data externally (an authorized-view sketch follows this description).

Cost Management & Optimization
- Design cost-control measures: table partitioning/clustering, query cost monitoring, budget alerts, and committed-use discounts (a query-cost guardrail sketch follows this description).
- Continuously analyze query performance and storage utilization to drive down TCO.

Governance & Security
- Define and enforce data governance policies (cataloging, lineage, access controls) using Cloud Data Catalog and Cloud IAM.
- Ensure compliance with privacy, security, and regulatory requirements for internal and external data sharing.

Stakeholder Enablement
- Partner with business stakeholders to understand data needs and translate them into platform capabilities and SLAs.
- Provide documentation, training, and self-service tooling (Data Studio templates, APIs, notebooks) to democratize data access.

Mentorship & Leadership
- Coach and mentor engineers on big data best practices, SQL optimization, and cloud-native architecture patterns.
- Lead architecture reviews, proofs of concept, and pilot projects to evaluate emerging technologies (e.g., BigQuery Omni, Vertex AI).

What You'll Bring to Our Team

Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 8+ years designing and operating large-scale data platforms, with at least 5 years of hands-on experience in GCP (BigQuery, Dataflow, Pub/Sub).
- Deep expertise in BigQuery performance tuning, data partitioning/clustering, and cost-control techniques.
- Proven track record building streaming and batch pipelines (Apache Beam, Dataflow, Spark).
- Strong SQL skills and experience with data modeling for analytics.
- Familiarity with data governance tools: Data Catalog, IAM, VPC Service Controls.
- Experience with Python or Java for ETL/ELT development.
- Excellent communication skills, with the ability to translate technical solutions for non-technical stakeholders.
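
To make the modeling standard above concrete, here is a minimal sketch of a partitioned, clustered BigQuery table created through the Python client. The dataset, table, and column names (telemetry.events, event_ts, device_id) are hypothetical placeholders, not part of the role description.

from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

# DDL for a telemetry table: daily partitions prune scans by date, and
# clustering co-locates rows on the columns most queries filter by.
ddl = """
CREATE TABLE IF NOT EXISTS telemetry.events (
  device_id  STRING,
  event_type STRING,
  payload    JSON,
  event_ts   TIMESTAMP
)
PARTITION BY DATE(event_ts)
CLUSTER BY device_id, event_type
OPTIONS (require_partition_filter = TRUE)  -- reject unfiltered full scans
"""
client.query(ddl).result()  # run the DDL and wait for completion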
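A minimal Apache Beam (Python) sketch of the streaming Pub/Sub-to-BigQuery pipeline described under Platform Implementation. The project, subscription, and table names are placeholders; a production pipeline would add windowing, schema validation, and dead-letter handling.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

opts = PipelineOptions()
opts.view_as(StandardOptions).streaming = True  # unbounded Pub/Sub source

with beam.Pipeline(options=opts) as p:
    (
        p
        # Read raw telemetry messages from a (hypothetical) subscription.
        | "ReadTelemetry" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/telemetry-sub")
        # Decode each message body from JSON into a dict matching the schema.
        | "ParseJson" >> beam.Map(json.loads)
        # Append rows to the partitioned table, which is created separately.
        | "WriteEvents" >> beam.io.WriteToBigQuery(
            "my-project:telemetry.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )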
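The authorized-view pattern named under Data Products & Exchange, sketched with the BigQuery Python client: a curated view lives in a consumer-facing dataset and is authorized against the raw dataset, so external consumers never touch the source tables. All project, dataset, and principal names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()

# Create a curated view in a consumer-facing "exchange" dataset.
view = bigquery.Table("my-project.exchange.device_health_v")
view.view_query = """
    SELECT device_id, DATE(event_ts) AS day, COUNT(*) AS events
    FROM `my-project.telemetry.events`
    GROUP BY device_id, day
"""
client.create_table(view)

# Authorize the view against the raw dataset so it can read the source
# tables on behalf of consumers who have no access to telemetry itself.
source = client.get_dataset("my-project.telemetry")
entries = list(source.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))
source.access_entries = entries
client.update_dataset(source, ["access_entries"])

# Grant an external group read access to the exchange dataset only.
client.query("""
    GRANT `roles/bigquery.dataViewer`
    ON SCHEMA `my-project.exchange`
    TO "group:partner-analysts@example.com"
""").result()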
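One of the query-cost controls mentioned under Cost Management & Optimization, as a sketch: a per-job cap on bytes billed makes a runaway query fail fast instead of scanning the whole table. Table and column names follow the earlier hypothetical schema.

from google.cloud import bigquery

client = bigquery.Client()

# Cap the job at ~10 GiB billed; BigQuery aborts the query without charge
# if it would scan more than this.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)

query = """
    SELECT device_id, COUNT(*) AS events
    FROM `my-project.telemetry.events`
    WHERE DATE(event_ts) = CURRENT_DATE()  -- partition filter keeps the scan small
    GROUP BY device_id
"""
for row in client.query(query, job_config=job_config).result():
    print(row.device_id, row.events)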
