Associate - Projects

5 - 8 years

0 Lacs

Posted: 2 weeks ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

Job Summary

We are seeking a skilled Developer with 5 to 8 years of experience to join our team. The ideal candidate will have expertise in Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. Experience in Property & Casualty Insurance is a plus. This is a hybrid role with day shifts and no travel required.

Responsibilities

  • Develop and maintain data pipelines using Amazon S3 and Amazon Redshift to ensure efficient data storage and retrieval.
  • Utilize Python to write clean, scalable code for data processing and analysis tasks.
  • Implement Databricks SQL for querying and analyzing large datasets to support business decisions.
  • Manage and optimize Databricks Delta Lake for reliable, high-performance data storage.
  • Design and execute Databricks Workflows to automate data processing tasks and improve operational efficiency.
  • Leverage PySpark to perform distributed data processing and enhance data transformation capabilities (see the illustrative sketch after this list).
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
  • Ensure data quality and integrity by implementing robust validation and monitoring processes.
  • Provide technical support and troubleshooting for data-related issues to maintain smooth operations.
  • Stay current with industry trends and technologies to continuously improve data solutions.
  • Contribute to the development of best practices and standards for data engineering within the team.
  • Document technical specifications and processes to ensure knowledge sharing and continuity.
  • Participate in code reviews and provide constructive feedback to peers for continuous improvement.

Qualifications

  • Strong expertise in Amazon S3 and Amazon Redshift for data storage and management.
  • Proficiency in Python for developing scalable data processing solutions.
  • Hands-on experience with Databricks SQL for data querying and analysis.
  • Experience managing Databricks Delta Lake for high-performance data storage.
  • Skill in designing Databricks Workflows for automating data processes.
  • Ability to use PySpark for distributed data processing and transformation tasks.
  • Experience in the Property & Casualty Insurance domain is a plus.
  • Strong problem-solving skills and the ability to troubleshoot data-related issues.
  • Excellent communication and collaboration skills to work effectively with cross-functional teams.
  • Ability to stay updated with the latest industry trends and technologies.
  • Strong documentation skills for maintaining technical specifications and processes.
  • Experience participating in code reviews and providing constructive feedback.
  • Commitment to maintaining data quality and integrity through robust validation processes.

Certifications Required

  • AWS Certified Solutions Architect
  • Databricks Certified Data Engineer Associate
  • Python Certification
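For context on the stack this role names, below is a minimal, illustrative PySpark sketch of one pipeline step of the kind described: reading raw records from Amazon S3, applying a simple validation, and writing them to a Databricks Delta Lake table. The bucket, prefix, column, and table names (policy_id, insurance.policies_bronze) are hypothetical, and the snippet assumes it runs on a Databricks cluster where Delta Lake and S3 access are already configured; it is a sketch, not a reference implementation for this position.

# Illustrative PySpark sketch: ingest raw policy data from Amazon S3,
# apply a basic quality check, and persist it as a Delta Lake table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy-ingest").getOrCreate()

# Read raw CSV files landed in S3 (hypothetical bucket and prefix).
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/policies/")
)

# Basic validation: drop rows missing a policy identifier and add an
# ingestion timestamp for downstream auditing.
clean = (
    raw.filter(F.col("policy_id").isNotNull())
       .withColumn("ingested_at", F.current_timestamp())
)

# Write to a Delta table so Databricks SQL queries and Workflows jobs
# can consume it reliably.
(
    clean.write
    .format("delta")
    .mode("append")
    .saveAsTable("insurance.policies_bronze")
)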

Cognizant

IT Services and IT Consulting

Teaneck, New Jersey

10,001 Employees

2579 Jobs

Key People

  • Brian Humphries, CEO
  • Gina Schaefer, CFO
