DevOps Data Engineer - Spark & Scala || 4 To 15 years || Bangalore

Experience

5 - 10 years

Salary

10 - 20 Lacs

Posted: 1 month ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

Role & responsibilities

  • Design, build, and maintain data pipelines using Scala, Spark, and SQL (see the batch pipeline sketch after this list).
  • Develop and optimize data transformations using DataFrames.
  • Manage and maintain our data infrastructure on cloud platforms (preferably AWS or GCP).
  • Implement and manage data streaming solutions using MSK (Amazon Managed Streaming for Apache Kafka) or self-managed Kafka (a streaming sketch also follows this list).
  • Design and implement data storage solutions using S3 or similar object storage.
  • Automate infrastructure provisioning, configuration management, and data pipeline deployments using tools such as Terraform, Ansible, or CloudFormation.
  • Build and maintain CI/CD pipelines for automated data pipeline deployments using tools such as Jenkins, GitLab CI, or CircleCI.
  • Monitor data pipeline performance and identify areas for optimization.
  • Troubleshoot and resolve data pipeline and infrastructure issues in a timely manner.
  • Implement data quality checks and monitoring.
  • Implement security best practices across our data infrastructure and pipelines.
  • Collaborate with data scientists and data engineers to understand their data needs and build solutions to meet those needs.
  • Participate in on-call rotations to ensure 24/7 system availability.
  • Contribute to the development of internal tools and automation scripts.
  • Stay up-to-date with the latest data engineering and DevOps technologies and trends.
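
As a rough illustration of the pipeline, DataFrame, and S3 bullets above, here is a minimal batch-pipeline sketch in Scala using the Spark DataFrame API. The bucket, paths, and column names are placeholders invented for this example, not details from the role.

  import org.apache.spark.sql.{SparkSession, functions => F}

  object OrdersDailyAggregate {
    def main(args: Array[String]): Unit = {
      // SparkSession for a batch job; cluster and S3 credentials come from the environment.
      val spark = SparkSession.builder()
        .appName("orders-daily-aggregate")
        .getOrCreate()

      // Read raw events from object storage (the S3 path is hypothetical).
      val orders = spark.read.parquet("s3a://example-data-lake/raw/orders/")

      // DataFrame transformations: drop bad rows, then aggregate revenue per day.
      val daily = orders
        .filter(F.col("amount").isNotNull && F.col("amount") > 0)
        .withColumn("order_date", F.to_date(F.col("created_at")))
        .groupBy("order_date")
        .agg(
          F.sum("amount").as("total_revenue"),
          F.count(F.lit(1)).as("order_count")
        )

      // Write the curated output back to S3, partitioned by date.
      daily.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://example-data-lake/curated/orders_daily/")

      spark.stop()
    }
  }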

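For the Kafka/MSK bullet, the following Structured Streaming sketch consumes a topic and lands it on S3 with checkpointing. The broker address, topic name, schema, and paths are assumptions for illustration, and the job needs the spark-sql-kafka-0-10 connector on its classpath.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions.{col, from_json}
  import org.apache.spark.sql.types.{DoubleType, StringType, StructType, TimestampType}

  object ClickstreamIngest {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("clickstream-ingest").getOrCreate()

      // Hypothetical event schema; replace with the real topic's contract.
      val schema = new StructType()
        .add("user_id", StringType)
        .add("event_type", StringType)
        .add("amount", DoubleType)
        .add("event_time", TimestampType)

      // Read from an MSK/Kafka topic (bootstrap servers and topic are placeholders).
      val raw = spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "b-1.example-msk:9092")
        .option("subscribe", "clickstream-events")
        .option("startingOffsets", "latest")
        .load()

      // Parse the JSON payload into typed columns.
      val events = raw
        .select(from_json(col("value").cast("string"), schema).as("e"))
        .select("e.*")

      // Land the stream on S3; the checkpoint directory makes the file sink restartable.
      val query = events.writeStream
        .format("parquet")
        .option("path", "s3a://example-data-lake/raw/clickstream/")
        .option("checkpointLocation", "s3a://example-data-lake/checkpoints/clickstream/")
        .start()

      query.awaitTermination()
    }
  }
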
Preferred candidate profile

  • Bachelor's degree in Computer Science, Data Science, or a related field (or equivalent experience).
  • 5+ years of experience in a data engineering or DevOps role.
  • Strong proficiency in Scala, Spark, and SQL (a SQL-based data quality check sketch follows this list).
  • Hands-on experience with the Spark DataFrame API.
  • Experience with cloud platforms (preferably AWS or GCP).
  • Experience with MSK (Amazon Managed Streaming for Apache Kafka) or self-managed Kafka.
  • Experience with S3 or similar object storage.
  • Experience with infrastructure-as-code tools such as Terraform or CloudFormation.
  • Experience with configuration management tools such as Ansible or Chef.
  • Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
  • Proficiency in scripting languages such as Python or Bash.
  • Strong understanding of data warehousing concepts.
  • Strong troubleshooting and problem-solving skills.
  • Excellent communication and collaboration skills.
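
As noted above, a data quality gate can be expressed as plain SQL driven from Scala. The sketch below is a minimal example with an invented table, columns, and rules; a real check would codify the team's own data contracts and feed its result into monitoring or the CI/CD pipeline.

  import org.apache.spark.sql.SparkSession

  object OrdersQualityCheck {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("orders-quality-check").getOrCreate()

      // The dataset and rules below are placeholders for illustration.
      val orders = spark.read.parquet("s3a://example-data-lake/curated/orders_daily/")
      orders.createOrReplaceTempView("orders_daily")

      // Express the checks in SQL: null keys, negative revenue, duplicate dates.
      val violations = spark.sql(
        """
          |SELECT
          |  SUM(CASE WHEN order_date IS NULL THEN 1 ELSE 0 END) AS null_dates,
          |  SUM(CASE WHEN total_revenue < 0 THEN 1 ELSE 0 END)  AS negative_revenue,
          |  COUNT(*) - COUNT(DISTINCT order_date)                AS duplicate_dates
          |FROM orders_daily
        """.stripMargin
      ).first()

      // Fail the job (and whatever deploy stage runs it) if any rule is violated.
      val failed = (0 until violations.length).exists { i =>
        !violations.isNullAt(i) && violations.getLong(i) > 0
      }
      if (failed) {
        sys.error(s"Data quality check failed: $violations")
      }

      spark.stop()
    }
  }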

Tata Consultancy Services

Information Technology and Consulting

Thane
