Databricks Operations Professional

2 - 7 years

5 - 9 Lacs

Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Overview
Job Summary

We are seeking a highly skilled Databricks Platform Operations Engineer to join our team, responsible for daily monitoring and resolution of data load issues, platform optimization, capacity planning, and governance management. This role is pivotal in ensuring the stability, scalability, and security of our Databricks environment while acting as a technical architect for platform best practices. The ideal candidate will bring a strong operational background, potentially with earlier experience as a Linux, Hadoop, or Spark administrator, and possess deep expertise in managing cloud-based data platforms.

Role: Databricks Operations Engineer
Location: Hyderabad/Bangalore
Shift: 24x7
Work Mode: Work from Office

Key Responsibilities

Primary Responsibility: Data Load Monitoring & Issue Resolution
  • Monitor data ingestion and processing dashboards daily to identify, diagnose, and resolve data load and pipeline issues promptly.
  • Act as the primary responder to data pipeline failures, collaborating with data engineering teams for rapid troubleshooting and remediation.
  • Ensure data availability, reliability, and integrity through proactive incident management and validation.
  • Maintain detailed logs and reports on data load performance and incident resolution.
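
For day-to-day monitoring of this kind, a lightweight check against the Databricks Jobs REST API can back up the dashboards. The sketch below is one possible shape, assuming the workspace URL and token are supplied via hypothetical DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; the 24-hour window and page size are illustrative, not prescriptive.

```python
"""Sketch: list recent failed Databricks job runs via the Jobs 2.1 REST API."""
import os
import time
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token or service principal token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def failed_runs_since(hours: int = 24):
    """Return completed runs from the last `hours` whose result state is FAILED."""
    cutoff_ms = int((time.time() - hours * 3600) * 1000)
    params = {"completed_only": "true", "start_time_from": cutoff_ms, "limit": 25}
    failed = []
    while True:
        resp = requests.get(f"{HOST}/api/2.1/jobs/runs/list",
                            headers=HEADERS, params=params)
        resp.raise_for_status()
        payload = resp.json()
        for run in payload.get("runs", []):
            if run.get("state", {}).get("result_state") == "FAILED":
                failed.append((run["run_id"], run.get("run_name"),
                               run["state"].get("state_message")))
        token = payload.get("next_page_token")
        if not token:
            return failed
        params["page_token"] = token

if __name__ == "__main__":
    for run_id, name, message in failed_runs_since():
        # In practice this would feed an incident log or alerting channel.
        print(f"FAILED run {run_id} ({name}): {message}")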

Platform Optimization & Capacity Planning
  • Continuously optimize Databricks cluster configurations, job execution, and resource allocation for cost efficiency and performance.
  • Conduct capacity planning to anticipate future resource needs and scaling requirements based on workload trends.
  • Analyze platform usage patterns and recommend infrastructure enhancements to support business growth.
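
Much of the cost and performance tuning described above comes down to autoscaling ranges and auto-termination. The sketch below shows one way an operations engineer might enforce those settings on an existing cluster through the Clusters REST API; the cluster ID, worker limits, and idle timeout are placeholders, and in practice cluster policies would usually carry these defaults.

```python
"""Sketch: enforce an autoscaling range and auto-termination on an existing cluster."""
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

def enforce_cost_settings(cluster_id: str, min_workers: int = 2,
                          max_workers: int = 8, idle_minutes: int = 30) -> None:
    # clusters/edit replaces the whole spec, so start from the current definition.
    spec = requests.get(f"{HOST}/api/2.0/clusters/get",
                        headers=HEADERS, params={"cluster_id": cluster_id}).json()
    spec.pop("num_workers", None)  # a fixed size and autoscale are mutually exclusive
    spec["autoscale"] = {"min_workers": min_workers, "max_workers": max_workers}
    spec["autotermination_minutes"] = idle_minutes
    # Drop read-only fields returned by clusters/get (allow-list abbreviated for the sketch).
    allowed = {"cluster_id", "cluster_name", "spark_version", "node_type_id",
               "driver_node_type_id", "autoscale", "autotermination_minutes",
               "spark_conf", "custom_tags", "aws_attributes", "azure_attributes"}
    body = {k: v for k, v in spec.items() if k in allowed}
    requests.post(f"{HOST}/api/2.0/clusters/edit",
                  headers=HEADERS, json=body).raise_for_status()
```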

Databricks Governance & Security
  • Implement and enforce data governance policies within Databricks, including access control, data lineage, and compliance standards.
  • Manage user permissions and roles using Azure AD, AWS IAM, or equivalent systems to uphold security and governance best practices.
  • Collaborate with security and compliance teams to ensure adherence to organizational policies and regulatory requirements.
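
Permission management in the workspace typically layers on top of identities synchronized from Azure AD or AWS IAM. As a hedged illustration, the snippet below grants a group a restricted permission level on a cluster via the Databricks Permissions API; the group name, cluster ID, and permission level shown are examples only.

```python
"""Sketch: grant a workspace group restricted access to a cluster via the Permissions API."""
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

def grant_cluster_access(cluster_id: str, group_name: str,
                         level: str = "CAN_ATTACH_TO") -> None:
    """PATCH adds to the existing ACL; PUT would replace it outright."""
    body = {"access_control_list": [
        {"group_name": group_name, "permission_level": level},
    ]}
    resp = requests.patch(f"{HOST}/api/2.0/permissions/clusters/{cluster_id}",
                          headers=HEADERS, json=body)
    resp.raise_for_status()

# Example (hypothetical IDs): analysts may attach notebooks but not restart
# or reconfigure the cluster.
# grant_cluster_access("0614-123456-abcdef12", "data-analysts", "CAN_ATTACH_TO")
```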

Technical Architecture & Collaboration
  • Serve as a Databricks platform architect, providing guidance on environment setup, best practices, and integration with other data systems.
  • Work closely with data engineers, data scientists, governance teams, and business stakeholders to align platform capabilities with organizational goals.
  • Develop and maintain comprehensive documentation covering platform architecture, operational procedures, and governance frameworks.

Operational Excellence & Automation
  • Troubleshoot and resolve platform and job-related issues in collaboration with internal teams and Databricks support.
  • Automate routine administrative and monitoring tasks using scripting languages (Python, Bash, PowerShell) and infrastructure-as-code tools (Terraform, ARM templates).
  • Participate in on-call rotations and incident management processes to ensure continuous platform availability.
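
As one example of the kind of routine automation this covers, the sketch below flags job runs that have stayed active past a time budget, the sort of check that could run on a schedule and feed an alerting channel. The 4-hour budget and print-based alerting are placeholders, and pagination is omitted for brevity.

```python
"""Sketch: flag Databricks job runs still active after a time budget."""
import os
import time
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

def long_running_runs(max_hours: float = 4.0):
    """Yield (run_id, run_name, hours) for active runs older than `max_hours`."""
    now_ms = time.time() * 1000
    params = {"active_only": "true", "limit": 25}
    resp = requests.get(f"{HOST}/api/2.1/jobs/runs/list",
                        headers=HEADERS, params=params)
    resp.raise_for_status()
    for run in resp.json().get("runs", []):
        hours = (now_ms - run["start_time"]) / 3_600_000
        if hours > max_hours:
            yield run["run_id"], run.get("run_name"), round(hours, 1)

if __name__ == "__main__":
    for run_id, name, hours in long_running_runs():
        # In practice this would raise a ticket or page the on-call engineer.
        print(f"Run {run_id} ({name}) has been active for {hours}h")
```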

Requirements

Required Qualifications
  • Experience in administering Databricks or comparable cloud-based big data platforms.
  • Experience with Jenkins scripting and pipeline scripting.
  • Demonstrated experience in daily monitoring and troubleshooting of data pipelines and load processes.
  • Strong expertise in Databricks platform optimization, capacity planning, governance, and architecture.
  • Background experience as Linux Administrator, Hadoop Administrator, or Spark Administrator is highly desirable.
  • Proficiency with cloud platforms (Azure, AWS, or GCP) and their integration with Databricks.
  • Experience managing user access and permissions with Azure Active Directory, AWS IAM, or similar identity management tools.
  • Solid understanding of data governance principles, including RBAC, data lineage, security, and compliance.
  • Proficient in scripting languages such as Python, Bash, or PowerShell for automation and operational tasks.
  • Excellent troubleshooting, problem-solving, communication, and collaboration skills.

Preferred Skills

  • Experience with infrastructure-as-code tools like Terraform or ARM templates.
  • Familiarity with data catalog and governance tools such as Azure Purview.
  • Working knowledge of Apache Spark and SQL to support platform administration and governance monitoring.
  • Experience designing and implementing data lakehouse architectures.
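
Where Unity Catalog system tables are enabled, Spark and SQL can support governance monitoring directly. The sketch below is intended to run inside a Databricks notebook or job and queries the audit log for recent permission-related events; availability of system.access.audit depends on the workspace configuration, and the action names filtered on are illustrative and would need to match the actual audit schema.

```python
"""Sketch: governance monitoring over Unity Catalog system tables (if enabled)."""
from pyspark.sql import SparkSession

# In a Databricks notebook or job `spark` already exists; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

recent_permission_changes = spark.sql("""
    SELECT event_time, user_identity.email AS actor, action_name, request_params
    FROM system.access.audit
    WHERE action_name IN ('updatePermissions', 'createGrant')  -- illustrative action names
      AND event_time >= current_timestamp() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
""")
recent_permission_changes.show(truncate=False)
```
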
Prodapt Solutions

Software Development

Chennai, Tamil Nadu
