Databricks Developer

3 years

0 Lacs

Posted: 1 day ago | Platform: LinkedIn


Work Mode

Remote

Job Type

Full Time

Job Description

Job Title: Databricks Developer

Location:


About Us:

[Insert 1-2 sentences about your company's mission and the impact of its data team.]

The Role:

We are looking for a passionate Databricks Developer to architect and implement our next-generation data platform. You will be instrumental in building robust, scalable data pipelines, optimizing our cloud data warehouse, and creating powerful visualizations that drive key business decisions.


Key Responsibilities:

  • Design, develop, and maintain efficient and reliable ETL/ELT pipelines using Databricks, PySpark, and Spark SQL.
  • Build and optimize our cloud data warehouse (e.g., Snowflake, BigQuery, Synapse) for performance and scalability.
  • Develop interactive and insightful dashboards and reports using Power BI/Tableau for various business units.
  • Collaborate with data analysts and business stakeholders to translate requirements into technical solutions.
  • Implement and manage Delta Lake and Lakehouse architecture for unified data analytics.
  • Perform data modeling, performance tuning, and query optimization.
  • Ensure best practices in data governance, security, and quality across all data assets.
  • Work in an Agile environment, utilizing CI/CD and DevOps practices for data engineering.
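The responsibilities above centre on ETL/ELT pipeline work. As a minimal, runnable sketch of that extract-transform-load pattern (using Python's stdlib sqlite3 as a stand-in for the Databricks/PySpark stack the role names, so it runs anywhere; all table and column names are hypothetical):

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Illustrative ETL: extract raw rows, validate/transform, load an aggregate."""
    cur = conn.cursor()
    # Extract: a raw landing table, as data might arrive from a source system.
    cur.execute("CREATE TABLE raw_orders (order_id INT, amount TEXT, region TEXT)")
    cur.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, "19.99", "EU"), (2, "5.00", "US"), (3, "bad", "EU")],
    )
    # Transform: cast types and drop rows that fail a simple validity check.
    cur.execute("CREATE TABLE clean_orders (order_id INT, amount REAL, region TEXT)")
    cur.execute(
        """
        INSERT INTO clean_orders
        SELECT order_id, CAST(amount AS REAL), region
        FROM raw_orders
        WHERE amount GLOB '[0-9]*'
        """
    )
    # Load: aggregate into a reporting table a dashboard could read.
    cur.execute(
        """
        CREATE TABLE region_totals AS
        SELECT region, SUM(amount) AS total
        FROM clean_orders
        GROUP BY region
        """
    )
    conn.commit()
    # Return the number of rows that survived validation.
    return cur.execute("SELECT COUNT(*) FROM clean_orders").fetchone()[0]

if __name__ == "__main__":
    print(run_etl(sqlite3.connect(":memory:")))
```

In a Databricks job the same three stages would typically read from cloud storage with PySpark, transform with Spark SQL, and write Delta tables; the shape of the pipeline is the same.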


Required Qualifications (Must-Haves):

  • 3+ years of experience as a Data Engineer, BI Developer, or similar role.
  • Hands-on commercial experience with Databricks.
  • Strong proficiency in PySpark and Spark SQL.
  • Proven experience building dashboards with Power BI or Tableau.
  • Hands-on experience with a cloud data warehouse (Snowflake, Azure Synapse, Google BigQuery, or AWS Redshift).
  • Expert-level SQL skills and experience in query optimization.
  • Solid understanding of ETL/ELT concepts and data modeling techniques (e.g., Star Schema).
  • Excellent problem-solving abilities and strong communication skills.
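The Star Schema named in the data-modeling bullet above can be sketched concretely: one fact table keyed to small dimension tables, queried with a dimensional join. A minimal runnable illustration, again using stdlib sqlite3 as a stand-in for a real warehouse (all table names are hypothetical):

```python
import sqlite3

# Star schema: fact_sales at the centre, dim_product and dim_date around it.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE dim_product (product_key INT PRIMARY KEY, category TEXT);
    CREATE TABLE dim_date    (date_key INT PRIMARY KEY, year INT);
    CREATE TABLE fact_sales  (product_key INT, date_key INT, revenue REAL);

    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO dim_date    VALUES (20240101, 2024), (20250101, 2025);
    INSERT INTO fact_sales  VALUES (1, 20240101, 100.0),
                                   (1, 20250101, 150.0),
                                   (2, 20250101, 40.0);
    """
)

def revenue_by_category(year: int) -> dict:
    """Typical star-schema query: filter on dimensions, aggregate the fact table."""
    rows = conn.execute(
        """
        SELECT p.category, SUM(f.revenue)
        FROM fact_sales f
        JOIN dim_product p ON p.product_key = f.product_key
        JOIN dim_date d    ON d.date_key    = f.date_key
        WHERE d.year = ?
        GROUP BY p.category
        """,
        (year,),
    ).fetchall()
    return dict(rows)

if __name__ == "__main__":
    print(revenue_by_category(2025))
```

Keeping descriptive attributes in the dimensions and numeric measures in the fact table is what makes BI-tool queries (Power BI, Tableau) simple and fast to optimize.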


Preferred Qualifications (Good-to-Have):

  • Experience with data orchestration tools like Azure Data Factory or Apache Airflow.
  • Familiarity with DevOps practices and CI/CD pipelines for data projects.
  • Knowledge of the dbt (data build tool) framework.
  • Experience working in an Agile/Scrum development process.

Jigya Software Services

IT Services and IT Consulting

Hyderabad, Telangana
