Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Data Architect - Databricks


Required Technical Skill Set: AWS/Azure, Databricks, Spark/PySpark, SQL


No. of Requirements: 6


Desired Experience Range: 10-11 Years


Location of Requirement: Kochi / Bangalore / Chennai


Desired Competencies (Technical/Behavioral Competency)


Must-Have (Ideally should not be more than 3-5)

  • Strong experience in programming languages such as Python, PySpark, or Scala for data engineering tasks.
  • Experience in guiding high-performing teams to design and implement scalable data solutions leveraging Databricks and cloud-native tools.
  • Hands-on experience with cloud technologies and expertise in Databricks data engineering services for handling large-scale data processing and real-time data streams.
  • Good understanding of Databricks best practices.
  • Deep expertise in Databricks (DLT, Unity Catalog, Workflows, SQL Warehouse)
  • Experience with data security (RBAC, row/column-level security policies); a brief illustration follows this list.
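To make the Unity Catalog security expectation concrete, here is a minimal sketch of a row filter and a column mask applied from a Databricks notebook; the catalog, table, function, and group names are illustrative assumptions, not taken from this posting.

```python
# Minimal sketch: row/column-level security on Unity Catalog tables.
# Assumes a Databricks notebook where `spark` is predefined and the tables
# main.sales.orders and main.hr.employees already exist (names are illustrative).

# Row filter: members of 'data_admins' see everything, everyone else only 'US' rows.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.sales.us_region_filter(region STRING)
    RETURN IF(is_account_group_member('data_admins'), TRUE, region = 'US')
""")
spark.sql("""
    ALTER TABLE main.sales.orders
    SET ROW FILTER main.sales.us_region_filter ON (region)
""")

# Column mask: hide salary from users outside the 'hr' group.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.hr.salary_mask(salary DECIMAL(12,2))
    RETURN CASE WHEN is_account_group_member('hr') THEN salary ELSE NULL END
""")
spark.sql("""
    ALTER TABLE main.hr.employees
    ALTER COLUMN salary SET MASK main.hr.salary_mask
""")
```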


Good-to-Have (Ideally should not be more than 3-5)


  • Strong understanding of data modeling concepts and techniques to create efficient and scalable data models.
  • Experience with version control systems such as Git for code management and collaboration.
  • Knowledge of data governance, data quality standards, and data security practices to ensure compliance and protection of sensitive information.



Responsibility of / Expectations from the Role


Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading data from various sources to ensure seamless data flow and accessibility.
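As a rough sketch of such a pipeline, the PySpark example below extracts raw JSON from cloud storage, applies light transformations, and loads the result into a Delta table; the path, table, and column names are assumptions for illustration.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch (illustrative paths and columns).
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw JSON files landing in cloud storage.
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: project, cast types, derive a partition column, and de-duplicate.
orders = (
    raw.select("order_id", "customer_id", "amount", "order_ts")
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Load: append into a partitioned Delta table for downstream consumers.
(orders.write
       .format("delta")
       .mode("append")
       .partitionBy("order_date")
       .saveAsTable("main.analytics.orders"))
```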


Collaborate with cross-functional teams to integrate data from multiple disparate sources, ensuring consistency, accuracy, and reliability of data.
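For instance, integrating an additional source can be handled as an idempotent upsert into a shared Delta table; the sketch below assumes the delta-spark package and illustrative table and column names.

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Illustrative integration step: upsert CRM customer records into one shared
# Delta table so every consumer sees a single, consistent view.
spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("s3://example-bucket/raw/crm_customers/")
target = DeltaTable.forName(spark, "main.analytics.customers")

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```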


Optimize data processing workflows and storage solutions for performance, scalability, and cost-efficiency.
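One hedged example of what this can look like on Databricks: compacting small files and clustering a Delta table by a frequently filtered column, then vacuuming unreferenced files; the table and column names are illustrative.

```python
from pyspark.sql import SparkSession

# Illustrative Delta maintenance on Databricks (table/column names assumed).
spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows that are usually filtered together.
spark.sql("OPTIMIZE main.analytics.orders ZORDER BY (customer_id)")

# Remove data files no longer referenced by the table (default retention)
# to keep storage costs in check.
spark.sql("VACUUM main.analytics.orders")
```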


Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of data throughout the data lifecycle.
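A minimal validation sketch is below, assuming the orders table from the earlier example; the specific rules (non-null keys, unique keys, non-negative amounts) are illustrative.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal data-quality gate: fail the run before publishing if basic rules break.
spark = SparkSession.builder.getOrCreate()
df = spark.table("main.analytics.orders")

null_keys   = df.filter(F.col("order_id").isNull()).count()
dup_keys    = df.groupBy("order_id").count().filter(F.col("count") > 1).count()
bad_amounts = df.filter(F.col("amount") < 0).count()

if null_keys or dup_keys or bad_amounts:
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, "
        f"{dup_keys} duplicated keys, {bad_amounts} negative amounts"
    )
```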


Monitor the performance of data pipelines and infrastructure, identifying and resolving issues to maintain system stability and reliability.
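As one simple example of such monitoring, the sketch below raises an alert when the target table has gone stale; the 24-hour threshold and table name are assumptions.

```python
from datetime import datetime, timedelta
from pyspark.sql import SparkSession, functions as F

# Illustrative freshness check: alert if no new rows arrived in the last 24 hours.
# Assumes order_ts is returned in the Spark session's local timezone.
spark = SparkSession.builder.getOrCreate()

latest = (spark.table("main.analytics.orders")
               .agg(F.max("order_ts").alias("latest_ts"))
               .first()["latest_ts"])

if latest is None or datetime.now() - latest > timedelta(hours=24):
    raise RuntimeError(f"Pipeline appears stale: last record timestamp is {latest}")
```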

Tata Consultancy Services

Information Technology and Consulting

Thane
