Databricks

2 - 6 years

0 Lacs

Posted: 2 weeks ago | Platform: Shine

Work Mode

On-site

Job Type

Full Time

Job Description

Every day, your work will make an impact that matters as you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, our client offers unrivalled opportunities for you to succeed and realize your full potential. We are currently looking for highly skilled Databricks Data Engineers to join our data modernization team. In this role, you will play a pivotal part in designing, developing, and maintaining robust data solutions on the Databricks platform. Your expertise in data engineering, coupled with a deep understanding of Databricks, will be crucial in creating solutions that drive data-driven decision-making across a diverse set of customers.

You will be responsible for designing, developing, and optimizing data workflows and notebooks in Databricks to ingest, transform, and load data from various sources into the data lake. You will also build and maintain scalable, efficient data processing workflows using Spark (PySpark or Spark SQL), adhering to coding standards and best practices. Collaborating with technical and business stakeholders, you will interpret data requirements and translate them into technical solutions. In addition, you will develop data models and schemas to support reporting and analytics needs, ensuring data quality, integrity, and security through appropriate checks and controls. You will monitor and optimize data processing performance, identifying and resolving bottlenecks, while staying current with the latest advancements in data engineering and Databricks technologies.

Qualifications for this role include a Bachelor's or Master's degree in any field, along with 2-6 years of experience designing, implementing, and maintaining data solutions on Databricks. You should have experience with at least one of the major cloud platforms (Azure, AWS, or GCP) and familiarity with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes. Proficiency in data warehousing and data modeling concepts, as well as experience with Python or SQL and Delta Lake, is required. An understanding of DevOps principles and practices, excellent problem-solving and troubleshooting skills, and strong communication and teamwork abilities are also essential.

In addition to the exciting challenges this role offers, you will have the opportunity to work with one of the Big 4 firms in India. A healthy work environment and work-life balance are among the benefits you can enjoy as part of our team.
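For context on the kind of work described above, here is a minimal, illustrative sketch of an ingest-transform-load workflow on Databricks using PySpark and Delta Lake. The source path, column names, and target table (/mnt/landing/orders/, analytics.orders_curated) are hypothetical examples, not details from this posting.

```python
# Illustrative Databricks-style ingest -> transform -> load sketch (hypothetical names).
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` already exists; created here so the
# sketch is self-contained when run elsewhere.
spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest: read raw CSV files from an assumed landing zone in the data lake.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/orders/")
)

# Transform: deduplicate, apply a basic data-quality filter, and derive columns.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("total_amount", F.col("quantity") * F.col("unit_price"))
)

# Load: write the curated data to a Delta Lake table for reporting and analytics.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.orders_curated")
)
```

Writing to a Delta table rather than plain files gives downstream reporting ACID guarantees and schema enforcement, which is one common way the data quality and integrity responsibilities mentioned above are addressed.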
