Posted: 2 days ago
Work mode: On-site
Employment type: Full Time
We are looking for an experienced Data Architect with 6–10 years of expertise in data engineering, data management, and architecture. This role will define the strategy, design, and implementation of enterprise-scale data platforms, ensuring data availability, governance, security, and scalability across the organization. The Data Architect will collaborate with technology and business stakeholders to align data initiatives with strategic goals.
∙Define the enterprise data architecture roadmap, covering data modeling, integration, quality, and governance.
∙Architect and implement data platforms, including data warehouses, lakes, and lakehouses (e.g., Snowflake, BigQuery, Redshift, and Databricks).
∙Establish standards for data modeling, schema design, metadata management, and lineage tracking.
∙Lead the design and development of data integration frameworks, covering ETL/ELT pipelines, APIs, streaming, and real-time data delivery.
∙Ensure data governance, compliance, privacy, and security frameworks are embedded in all data platforms.
∙Partner with data engineering, analytics, and product teams to build scalable data products and pipelines.
∙Optimize enterprise-wide data storage, retrieval, and processing to balance performance and cost.
∙Collaborate with AI/ML and business intelligence teams to enable advanced analytics and AI readiness.
∙6–10 years of proven experience in data architecture, engineering, or solution design.
∙Strong expertise in data lake creation (e.g., Databricks Delta Lake, Snowflake, Azure Data Lake Storage).
∙Strong expertise in relational databases (Oracle, PostgreSQL, MySQL) and NoSQL technologies (MongoDB, Cassandra, DynamoDB).
∙Proficiency in SQL and programming (Python, Scala, or Java).
∙Deep understanding of data modeling techniques (dimensional, relational, document, graph).
∙Experience with big data frameworks (Spark, Hadoop, Hive) and cloud-native data platforms (AWS Redshift, GCP BigQuery, Azure Synapse).
∙Strong grounding in data governance, data quality, and metadata management.
∙Familiarity with data orchestration tools (Airflow, Dagster, Luigi).
∙Cloud experience in AWS, GCP, or Azure for building large-scale distributed systems.
∙Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
∙Experience with real-time streaming frameworks (Kafka, Flink, Kinesis, Pub/Sub).
∙Knowledge of data security frameworks, compliance (GDPR, HIPAA, DPDPA), and role-based access control.
∙Familiarity with containerization and orchestration (Docker, Kubernetes).
∙Exposure to AI/ML pipelines and data readiness for MLOps.
∙Contributions to data standards, open-source projects, or industry best practices.
∙Strategic thinker with the ability to translate business needs into scalable data solutions.
∙Strong development and mentoring skills to guide data teams.
∙Excellent communication and stakeholder management abilities.
∙Proactive, ownership-driven mindset with focus on data quality, security, and reliability.
∙Growth mindset; stays ahead of the evolving data and cloud technology landscape.
r3 Consultant