Databricks Architect

Experience: 5 years


Posted: 1 day ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Role Description

This is a full-time hybrid role (multiple openings) for a Databricks Lead and Architect based in Chennai and Pondicherry, with the flexibility to work remotely on occasion. The Databricks Architect will be responsible for designing scalable architectures, developing software solutions, integrating Databricks with other platforms, and creating efficient architectural designs. The role also involves collaborating with cross-functional teams, managing projects to ensure timely delivery, and supporting the implementation of Databricks solutions that align with business goals.


Key Responsibilities:

  • Be the go-to person for everything related to solutioning and architecture, and lead the team.
  • Lead the migration of the Oracle Data Warehouse to Databricks, ensuring minimal downtime and data integrity.
  • Build data pipelines using Databricks best practices for performance, cost, and reliability, while guiding junior engineers.
  • Optimize Spark/PySpark jobs for performance and efficiency when processing large datasets.
  • Implement data quality checks, validation, and reconciliation processes during migration (an illustrative sketch follows this list).
  • Work with the Hadoop ecosystem (HDFS, Hive, Sqoop) for legacy data integration where needed.
  • Leverage AWS cloud services (S3, EMR, EC2, Lambda, RDS, Glue) for data storage and processing.
  • Implement CI/CD pipelines for automated deployment of Databricks workflows.
  • Ensure data governance, security, and compliance in the cloud environment.
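
For illustration only, the following is a minimal PySpark sketch of the kind of ingestion and reconciliation step described in the list above. It assumes Databricks Auto Loader landing data in a Delta table plus a row-count check against the Oracle source; the bucket paths, table names, and Oracle connection details are placeholders, not details from this posting.

```python
# Illustrative sketch only: paths, table names, and credentials are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incrementally ingest exported files with Databricks Auto Loader (cloudFiles).
bronze = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "parquet")
    .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/orders")
    .load("s3://example-bucket/oracle_export/orders/")
)

query = (
    bronze.writeStream
    .option("checkpointLocation", "s3://example-bucket/_checkpoints/orders")
    .trigger(availableNow=True)  # process all files currently available, then stop
    .toTable("main.migration.orders_bronze")
)
query.awaitTermination()

# Simple reconciliation: compare row counts between the Oracle source table
# and the Delta target after the load completes.
src_count = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1")  # placeholder
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", "********")
    .load()
    .count()
)
tgt_count = spark.table("main.migration.orders_bronze").count()
if src_count != tgt_count:
    raise ValueError(f"Reconciliation failed: source={src_count}, target={tgt_count}")
```

The availableNow trigger suits batch-style backfills during a migration: it drains whatever files have already landed and then stops, so the reconciliation check can run immediately afterwards.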


Mandatory Skills & Experience:

  • 5+ years in Data Engineering, with strong expertise in Databricks, Oracle, and Data Warehousing. 
  • Hands-on experience in migrating on-prem data warehouses (Oracle) to Databricks. 
  • Hands-on expertise with key Databricks features (Lakeflow Connect, Lakebridge, Auto Loader, Unity Catalog, Spark Declarative Pipelines, Lakehouse architecture). 
  • Proficiency in PySpark, SQL, and ETL/ELT frameworks. 
  • Strong knowledge of Hadoop ecosystem (HDFS, Hive, Sqoop). 
  • Experience with AWS cloud services (S3, EMR, Glue, Lambda, RDS). 
  • Familiarity with data modeling, CDC, and performance tuning in Databricks (see the sketch after this list). 
  • Good understanding of orchestration tools (Airflow, Control-M) and CI/CD pipelines. 
  • Proficiency in solution architecture, architectural design, and integration
  • Strong Project Management skills to lead and ensure project milestones are met
  • Ability to work with cloud-based platforms and familiarity with AI/ML and Big Data technologies
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills to work effectively in a hybrid environment
  • Experience with Databricks and end-to-end deployment of data solutions is highly desirable
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
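
As a rough illustration of the CDC pattern referenced in the list above, a Delta Lake MERGE in PySpark could upsert a batch of change records into a warehouse table. The table names, the customer_id key, and the op change-flag column are assumptions made for this sketch, not requirements from the posting.

```python
# Illustrative CDC upsert sketch: table names, the customer_id key, and the
# "op" change-flag column are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # Delta is preconfigured on Databricks

target = DeltaTable.forName(spark, "main.dw.customers")
changes = spark.table("main.staging.customer_changes")  # latest CDC batch

(
    target.alias("t")
    .merge(changes.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedDelete(condition="s.op = 'D'")           # rows deleted at the source
    .whenMatchedUpdateAll(condition="s.op <> 'D'")        # updated rows
    .whenNotMatchedInsertAll(condition="s.op <> 'D'")     # new rows
    .execute()
)
```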



Company: Decision Minds

Industry: Data Analytics / Software
