Databricks

Experience: 0 years

Salary: 0 Lacs

Posted: 5 days ago | Platform: SimplyHired


Skills Required: Databricks

Work Mode: On-site

Job Type: Full Time

Job Description

Job Summary

Join our dynamic team as a Developer, where you will leverage your expertise in Spark in Scala, Databricks, and Amazon technologies to drive innovative solutions. With a hybrid work model and day shifts, you will collaborate with cross-functional teams to optimize data processes and enhance our data infrastructure. Your contributions will directly impact our company's success and societal advancement.


Responsibilities

  • Develop and implement scalable data processing solutions using Spark in Scala to enhance data efficiency and performance.
  • Collaborate with cross-functional teams to design and optimize data workflows using Databricks Workflows and Databricks SQL.
  • Administer and manage Databricks Unity Catalog to ensure data governance and security across the organization.
  • Utilize Databricks CLI to automate data pipeline deployments and streamline operational processes.
  • Integrate and manage data storage solutions using Amazon S3 and Amazon Redshift to support data analytics initiatives.
  • Write and maintain high-quality Python and PySpark code to process and analyze large datasets effectively.
  • Implement Databricks Delta Lake for reliable and efficient data storage and retrieval (see the PySpark sketch after this list).
  • Monitor and troubleshoot data workflows to ensure seamless data operations and minimize downtime.
  • Provide technical expertise and support to team members, fostering a collaborative and innovative work environment.
  • Stay updated with the latest industry trends and technologies to continuously improve data solutions.
  • Contribute to the development of best practices and standards for data engineering processes.
  • Ensure data quality and integrity through rigorous testing and validation procedures.
  • Document and communicate technical solutions and processes to stakeholders for transparency and alignment.
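
To illustrate the kind of work these responsibilities describe, here is a minimal PySpark sketch that reads raw files from Amazon S3, applies a basic data-quality filter, and writes the result as a Delta Lake table. It assumes a Databricks runtime where Delta Lake is available; the bucket path, schema, and table names are hypothetical.

    # Minimal PySpark sketch: S3 ingest -> quality filter -> Delta Lake table.
    # Assumes a Databricks runtime; paths and names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

    # Read raw CSV files landed in an S3 bucket (hypothetical path).
    raw = (
        spark.read
        .option("header", "true")
        .csv("s3://example-raw-bucket/orders/")
    )

    # Basic quality gate: keep only rows with a key and a parseable amount.
    clean = (
        raw.filter(F.col("order_id").isNotNull() & F.col("amount").isNotNull())
           .withColumn("amount", F.col("amount").cast("double"))
    )

    # Persist as a Delta table for reliable, ACID-compliant storage and retrieval.
    clean.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")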


Qualifications

  • Possess strong experience in Spark in Scala and Databricks technologies, demonstrating proficiency in data engineering.
  • Have hands-on experience with Amazon S3 and Amazon Redshift for data storage and management.
  • Demonstrate expertise in Python and PySpark for data processing and analysis.
  • Experience with Databricks SQL and Delta Lake is essential for efficient data operations.
  • Familiarity with Databricks CLI and Unity Catalog Admin is highly desirable (a Unity Catalog governance sketch follows this list).
  • Strong problem-solving skills and ability to work in a hybrid work model.
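
As a complement to the Unity Catalog Admin qualification above, the sketch below shows one way data access could be governed with Unity Catalog SQL GRANT statements issued from PySpark. It assumes Unity Catalog is enabled on the workspace; the catalog, schema, and group names are hypothetical.

    # Minimal Unity Catalog governance sketch, assuming Unity Catalog is enabled.
    # The catalog (main), schema (analytics), and group (data-analysts) are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Grant read-only access on an analytics schema to a data-analyst group.
    spark.sql("GRANT USE CATALOG ON CATALOG main TO `data-analysts`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA main.analytics TO `data-analysts`")
    spark.sql("GRANT SELECT ON SCHEMA main.analytics TO `data-analysts`")

    # Review the resulting grants for auditability.
    spark.sql("SHOW GRANTS ON SCHEMA main.analytics").show(truncate=False)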

Cognizant

IT Services and IT Consulting

Teaneck, New Jersey
