Data Architect

10 years


Posted: 22 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Position: Data Architect

Experience: 10+ years (including a minimum of 2 years in a Data Architect role)

Location: Chennai (WFO)

Immediate joiners preferred


Job Responsibilities:

Design and architect scalable, high-performance, and secure data platforms (Data Warehouses, Data Lakes, Lakehouse) to support analytics, AI/ML, and reporting needs.

Leverage Big Data technologies such as PySpark, Hadoop, Kafka, and Databricks for large-scale data processing and transformation.

Develop and maintain conceptual, logical, and physical data models aligned with business requirements and governance standards.

Architect data integration pipelines to enable seamless data flow across diverse systems, including legacy and modern cloud platforms (Databricks, Snowflake, AWS, Azure).

Lead modernization and migration initiatives from on-premises databases (e.g., Informix, Oracle) to cloud-native platforms (AWS, Azure, Google Cloud).

Continuously optimize data pipelines and storage for performance, scalability, cost efficiency, and SLA adherence.

Partner with business development teams to provide technical leadership during pre-sales engagements, including client workshops and architecture discussions.

Collaborate with data engineers, data scientists, business analysts, and IT teams to ensure architecture alignment with business goals.

Define and enforce data governance frameworks, quality standards, and security policies to ensure compliance with organizational and regulatory requirements.

Evaluate emerging technologies and tools to define long-term data strategy and ensure future scalability.


Key Skills

· Data modeling (conceptual, logical, physical) for high-performance platforms

· Cloud platforms: AWS, Azure, GCP; Databricks, Snowflake

· Big Data architecture and distributed processing (PySpark, Hadoop, Kafka)

· Relational databases (SQL Server, Oracle, MySQL) and NoSQL (DynamoDB, CosmosDB)

· ETL/ELT pipeline design, automation, and system integration

· Cloud migration: legacy to cloud transitions

· Leveraging cloud platform tools for data engineering, pipeline automation, and analytics (e.g., AWS Glue, Azure Data Factory, GCP Dataflow, Databricks, Snowflake, Talend)

· Data security, compliance, and governance

· Agile and cross-functional team collaboration

· Lakehouse architecture: combining data lake flexibility with warehouse performance

· AI/ML and Data Science platform integration for analytics, predictive modeling, and insights

· Programming & scripting: SQL, Python, and PySpark
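To give candidates a flavor of the ETL/ELT pipeline design work listed above, here is a minimal sketch in plain Python. All function names and data are hypothetical illustrations, not part of this posting; a production pipeline for this role would typically run on PySpark, AWS Glue, or Azure Data Factory rather than plain lists.

```python
# Minimal ETL sketch: extract raw records, transform (clean types and values),
# and load into a target. Names and data are hypothetical examples.

def extract():
    # Stand-in for reading from a source system (e.g., a legacy RDBMS or flat files).
    return [
        {"id": 1, "amount": "100.5", "region": " south "},
        {"id": 2, "amount": "250.0", "region": "NORTH"},
    ]

def transform(records):
    # Normalize types and values -- the cleansing step of the pipeline.
    return [
        {
            "id": r["id"],
            "amount": float(r["amount"]),
            "region": r["region"].strip().lower(),
        }
        for r in records
    ]

def load(records, target):
    # Stand-in for writing to a warehouse or lakehouse table.
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The same extract/transform/load structure maps directly onto distributed tooling: in PySpark, `extract` becomes a `spark.read` call, `transform` a chain of DataFrame operations, and `load` a write to a managed table.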

