DataOps Engineer

Experience: 4 - 8 years

Salary: 6 - 10 Lacs

Posted: 2 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Overview:

We are seeking a skilled DataOps Engineer with a strong foundation in DevOps practices and data engineering principles. The ideal candidate will be responsible for the smooth deployment, observability, and performance optimization of data pipelines and platforms. You will work at the intersection of software engineering, DevOps, and data engineering, bridging the gaps between development, operations, and data teams.

Key Responsibilities:

  • Design, implement, and manage CI/CD pipelines using tools such as Jenkins, Git, and Terraform.
  • Manage and maintain Kubernetes (K8s) clusters for scalable and resilient data infrastructure.
  • Develop and maintain observability tools and dashboards (e.g., Prometheus, Grafana, ELK stack) for monitoring pipeline and platform health.
  • Automate infrastructure provisioning and deployments using Infrastructure as Code (IaC) tools, preferably Terraform.
  • Collaborate with data engineers to debug, optimize, and track the performance of data pipelines (e.g., Airflow, Airbyte).
  • Implement and monitor data quality, lineage, and orchestration workflows.
  • Develop custom scripts and tools in Python to enhance pipeline reliability and automation (see the sketch after this list).
  • Work closely with data teams to manage and optimize Snowflake environments, focusing on performance tuning and cost efficiency.
  • Ensure compliance with security, scalability, and operational best practices across the data platform.
  • Act as a liaison between development and operations to maintain SLAs for data availability and reliability.
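
To give these responsibilities some flavor, below is a minimal sketch of the kind of pipeline automation involved: a small Airflow DAG that chains a (stubbed) ingestion step to a Python data-quality check. It assumes Airflow 2.4+; the DAG id, task names, table, and threshold are hypothetical placeholders, not anything from this role's actual stack.

```python
# Minimal sketch, assuming Airflow 2.4+. All names and thresholds
# below are hypothetical, for illustration only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import PythonOperator


def check_row_count():
    """Fail the run if the latest batch looks suspiciously small.

    A real check would query the warehouse; the count is stubbed
    here to keep the sketch self-contained.
    """
    row_count = 42_000  # stand-in for a real warehouse query
    if row_count < 1_000:  # hypothetical minimum-volume threshold
        raise ValueError(f"Data quality check failed: {row_count} rows")


with DAG(
    dag_id="orders_ingest_with_quality_check",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    # Stand-in for an Airbyte sync or similar ingestion step.
    ingest = EmptyOperator(task_id="ingest_orders")

    quality_check = PythonOperator(
        task_id="row_count_check",
        python_callable=check_row_count,
    )

    ingest >> quality_check
```

In practice the check would query the warehouse directly, and a failed task would surface through the observability stack (Prometheus/Grafana alerts) described above.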

Required Skills & Experience:

  • 4-8 years of experience in DevOps / DataOps / Platform Engineering roles.
  • Proficient in managing Kubernetes clusters and associated tooling (Helm, Kustomize, etc.).
  • Hands-on experience with CI/CD pipelines, especially using Jenkins, GitOps, and automated testing frameworks.
  • Strong scripting and automation skills in Python.
  • Experience with workflow orchestration tools like Apache Airflow and data ingestion tools like Airbyte.
  • Solid experience with Infrastructure as Code tools, preferably Terraform.
  • Familiarity with observability and monitoring tools such as Prometheus, Grafana, Datadog, or New Relic.
  • Working knowledge of data platforms, particularly Snowflake, including query performance tuning and monitoring (see the sketch after this list).
  • Strong debugging and problem-solving skills, especially in production data pipeline scenarios.
  • Excellent communication skills and ability to collaborate across engineering, operations, and analytics teams.
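
As a concrete illustration of the Snowflake monitoring skill above, here is a minimal sketch that surfaces the slowest recent queries for tuning and cost review. It assumes the snowflake-connector-python package and read access to the SNOWFLAKE.ACCOUNT_USAGE share; the environment-variable names and the one-hour window / ten-second threshold are illustrative assumptions.

```python
# Minimal sketch, assuming snowflake-connector-python and read access
# to the ACCOUNT_USAGE share. Connection details and thresholds are
# hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

# Surface the slowest queries of the last hour so they can be tuned
# (or the warehouse resized) before they breach an SLA or the budget.
SLOW_QUERY_SQL = """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s,
           LEFT(query_text, 80)      AS query_preview
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('hour', -1, CURRENT_TIMESTAMP())
      AND total_elapsed_time > 10 * 1000  -- hypothetical 10s threshold
    ORDER BY total_elapsed_time DESC
    LIMIT 20
"""

cur = conn.cursor()
try:
    cur.execute(SLOW_QUERY_SQL)
    for query_id, warehouse, elapsed_s, preview in cur:
        print(f"[{warehouse}] {elapsed_s:.1f}s  {query_id}  {preview}")
finally:
    cur.close()
    conn.close()
```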

Preferred Qualifications:

  • Experience with cloud platforms (AWS and/or GCP) and cloud-native DevOps practices.
  • Familiarity with data cataloging and lineage tools.
  • Exposure to container security, policy management, and data governance tools.
  • A background in data modeling, SQL optimization, or data warehousing concepts is a plus.

Velotio Technologies

Software Development

Pune, Maharashtra
