Databricks

8 - 12 years

0 Lacs

Posted: 4 days ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Job Summary

We are seeking a highly skilled Sr. Developer with 8 to 12 years of experience to join our team. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, Databricks Unity Catalog Admin, Databricks CLI, Delta Live Pipelines, Structured Streaming, Risk Management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. Experience in Property & Casualty Insurance is mandatory. The role is hybrid with day shifts.

Responsibilities

  • Lead the design and implementation of data processing pipelines using Spark in Scala and PySpark
  • Oversee the administration of Databricks Unity Catalog and manage data governance (a minimal grant sketch follows this list)
  • Provide expertise in Delta Sharing and ensure secure data sharing practices
  • Develop and maintain Databricks CLI scripts for automation and operational efficiency
  • Implement Delta Live Pipelines for real-time data processing and analytics
  • Utilize Structured Streaming to build scalable and fault-tolerant streaming applications (see the PySpark sketch after this list)
  • Manage risk by identifying and mitigating potential data-related issues
  • Integrate Apache Airflow for orchestrating complex data workflows
  • Optimize data storage and retrieval using Amazon S3 and Amazon Redshift
  • Write efficient and maintainable code in Python for various data processing tasks
  • Create and manage Databricks SQL queries for data analysis and reporting
  • Ensure data integrity and performance with Databricks Delta Lake
  • Develop and monitor Databricks Workflows to automate data tasks
  • Collaborate with cross-functional teams to understand business requirements and deliver solutions
  • Communicate effectively in English, both written and spoken, to document processes and interact with stakeholders
  • Stay updated with the latest industry trends and best practices in data engineering and Property & Casualty Insurance
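
To ground the streaming items above, here is a minimal, illustrative PySpark sketch of a Structured Streaming pipeline that lands events from Amazon S3 into a Delta Lake table. The bucket paths, the claims schema, and the table name are hypothetical placeholders, not anything specified in this posting.

```python
# Minimal sketch, assuming hypothetical S3 paths and a hypothetical target table.
from pyspark.sql import SparkSession
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("claims-ingest").getOrCreate()

# Hypothetical schema for incoming claim events.
claim_schema = StructType([
    StructField("claim_id", StringType()),
    StructField("policy_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a stream of JSON events from S3 (placeholder bucket).
events = (
    spark.readStream
    .schema(claim_schema)
    .json("s3://example-bucket/raw/claims/")
)

# Append into a Delta table; the checkpoint gives fault tolerance on restart.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/claims/")
    .outputMode("append")
    .toTable("insurance.bronze_claims")  # hypothetical target table
)
query.awaitTermination()
```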

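Similarly, much of the Unity Catalog governance work referenced above reduces to Databricks SQL grant statements. The sketch below issues a few such grants from Python via spark.sql; the catalog, schema, table, and group names are invented for illustration only.

```python
# Minimal sketch, assuming a hypothetical catalog/schema and a hypothetical group.
grants = [
    "GRANT USE CATALOG ON CATALOG insurance TO `claims-analysts`",
    "GRANT USE SCHEMA ON SCHEMA insurance.bronze TO `claims-analysts`",
    "GRANT SELECT ON TABLE insurance.bronze.claims TO `claims-analysts`",
]
for stmt in grants:
    spark.sql(stmt)  # `spark` is the session from the sketch above
```
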
Qualifications

  • Possess strong technical skills in Spark in Scala, PySpark, and the Databricks ecosystem
  • Have experience with Delta Sharing, Databricks Unity Catalog Admin, and Databricks CLI
  • Demonstrate expertise in Delta Live Pipelines and Structured Streaming
  • Show proficiency in Risk Management and Apache Airflow (an Airflow DAG sketch follows this list)
  • Be skilled in using Amazon S3 and Amazon Redshift for data storage solutions
  • Have advanced knowledge of Python for data processing
  • Be experienced in Databricks SQL, Databricks Delta Lake, and Databricks Workflows
  • Have a background in Property & Casualty Insurance
  • Be proficient in English for effective communication
  • Be able to work in a hybrid model with day shifts
  • Have a minimum of 8 years and a maximum of 12 years of relevant experience
  • Be detail-oriented and capable of managing multiple tasks simultaneously
  • Be committed to continuous learning and professional development.
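
Finally, the Apache Airflow orchestration referenced above typically means a small DAG that triggers a Databricks job. A minimal sketch follows, assuming the Databricks provider package for Airflow is installed; the DAG name, job id, connection id, and schedule are all hypothetical.

```python
# Minimal sketch, assuming apache-airflow-providers-databricks is installed
# and a Databricks connection named "databricks_default" is configured.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
)

with DAG(
    dag_id="daily_claims_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",            # hypothetical morning schedule
    catchup=False,
) as dag:
    # Trigger an existing Databricks Workflows job by id.
    run_claims_job = DatabricksRunNowOperator(
        task_id="run_claims_job",
        databricks_conn_id="databricks_default",
        job_id=12345,                # hypothetical job id
    )
```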

Cognizant

IT Services and IT Consulting

Teaneck, New Jersey
