Databricks Spark

8 - 9 years

5 - 6 Lacs

Posted: 3 days ago | Platform: GlassDoor


Work Mode

On-site

Job Type

Part Time

Job Description

Job Summary

The Sr. Developer role is crucial for driving innovation and efficiency in our hybrid work model. With a focus on Kafka, Python, Databricks SQL, Databricks Workflows, and PySpark, the candidate will enhance our data processing capabilities. Experience in the Claims and Billing domains is advantageous. This position requires 8 to 9 years of experience and offers a day shift schedule with no travel requirements.


Responsibilities

  • Develop and maintain scalable data processing solutions using Kafka, Python, and PySpark to enhance data flow and analytics capabilities.
  • Collaborate with cross-functional teams to design and implement Databricks Workflows that streamline data operations and improve efficiency.
  • Utilize Databricks SQL to perform complex data queries and generate actionable insights for business stakeholders.
  • Ensure data integrity and quality by implementing robust data validation and error-handling mechanisms.
  • Optimize existing data pipelines for performance and scalability, ensuring they meet the evolving needs of the organization.
  • Provide technical guidance and support to junior developers, fostering a culture of continuous learning and improvement.
  • Participate in code reviews and contribute to the development of best practices and coding standards.
  • Work closely with product managers and business analysts to understand requirements and translate them into technical solutions.
  • Monitor and troubleshoot data processing systems to ensure high availability and reliability.
  • Stay updated with the latest industry trends and technologies to drive innovation and maintain a competitive edge.
  • Document technical specifications and system designs to facilitate knowledge sharing and collaboration.
  • Engage in regular team meetings to discuss project progress, challenges, and opportunities for improvement.
  • Contribute to the company's purpose by developing solutions that enhance operational efficiency and deliver value to society.
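As a rough illustration of the data validation and error-handling responsibility above, a record-level check like the following might sit between a Kafka consumer and a Databricks write. This is a minimal sketch only; the field names (`claim_id`, `amount`) and validation rules are hypothetical, not taken from the posting.

```python
# Minimal sketch of batch-level record validation for a claims-style
# pipeline. Field names and rules here are illustrative assumptions.

def validate_record(record: dict) -> tuple[bool, list[str]]:
    """Return (is_valid, errors) for one incoming record."""
    errors = []
    if not record.get("claim_id"):
        errors.append("missing claim_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    return (not errors, errors)

def partition_records(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into valid rows and a dead-letter list of failures."""
    valid, dead_letter = [], []
    for rec in records:
        ok, errors = validate_record(rec)
        if ok:
            valid.append(rec)
        else:
            dead_letter.append({"record": rec, "errors": errors})
    return valid, dead_letter
```

In practice the dead-letter list would typically be routed to a separate Kafka topic or Delta table for review, rather than silently dropped.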


Qualifications

  • Possess strong expertise in Kafka, Python, Databricks SQL, Databricks Workflows, and PySpark, with a proven track record of successful implementations.
  • Demonstrate proficiency in designing and optimizing data pipelines for large-scale data processing.
  • Exhibit excellent problem-solving skills and the ability to work effectively in a hybrid work model.
  • Have experience in the Claims and Billing domains, which is considered a plus.
  • Show strong communication skills and the ability to collaborate with diverse teams.
  • Display a commitment to continuous learning and staying abreast of emerging technologies.
  • Hold a bachelor's degree in Computer Science, Information Technology, or a related field.


Certifications Required

Certified Apache Kafka Developer; Databricks Certified Data Engineer Associate

Cognizant

IT Services and IT Consulting

Teaneck, New Jersey
