
2 Databricks CLI Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have at least 5 years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure, plus a minimum of 3 years of experience in Power BI and data warehousing, with the ability to perform root cause analysis on internal and external data and processes to answer specific business questions and identify areas for improvement.

Key requirements:
- Experience with AWS services such as S3, Athena, Glue, and Lambda (preferred).
- Deep understanding of data warehousing concepts, including dimensional (star-schema), SCD2, Data Vault, denormalized, and OBT models, for implementing highly performant data ingestion pipelines from multiple sources.
- Proficiency in Python and SQL.
- Thorough understanding of Databricks platform features such as Delta Lake, Databricks SQL, and MLflow.
- Experience with CI/CD on Databricks using tools such as Bitbucket, GitHub Actions, and the Databricks CLI.
- Integrating the end-to-end Databricks pipeline to ensure data quality and consistency while moving data from source systems to target data repositories.
- Working within an Agile delivery/DevOps methodology to deliver proof of concept and production implementations in iterative sprints.
- Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), and MLflow is highly beneficial.
- Basic knowledge of API- or stream-based data extraction processes such as the Salesforce API and Bulk API.
- Understanding of data management principles, including data quality, governance, security, privacy, life cycle management, and cataloging.
- Databricks certifications and the AWS Solutions Architect certification are a plus.
- Experience building data pipelines from business applications such as Salesforce, Marketo, NetSuite, and Workday is desirable.
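The posting above leans heavily on Delta Lake and Python for ingestion pipelines. As a rough illustration of that skill set (not part of the listing), here is a minimal PySpark sketch of an incremental upsert into a Delta table; the S3 paths, column names, and join key are hypothetical, and the merge shown is a simple overwrite-style upsert rather than full SCD2 history tracking:

```python
# Minimal sketch of an incremental load into a Delta table on Databricks.
# Paths, table names, and columns are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # already provided on Databricks

# Hypothetical source: new/changed customer rows landed as Parquet in S3.
updates = (
    spark.read.parquet("s3://example-bucket/landing/customers/")
         .withColumn("load_ts", F.current_timestamp())
)

target_path = "s3://example-bucket/curated/dim_customer/"

if DeltaTable.isDeltaTable(spark, target_path):
    target = DeltaTable.forPath(spark, target_path)
    # Upsert: update existing keys, insert new ones (simplified, not SCD2 history).
    (
        target.alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    # First load: create the Delta table.
    updates.write.format("delta").save(target_path)
```

On Databricks the Spark session is provided for you; running this elsewhere would additionally require the delta-spark package and a session configured with the Delta Lake extensions.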

Posted 2 weeks ago


4.0 - 9.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Hybrid

Hello Folks, we are looking for an Azure DevOps Engineer to work with one of the MNCs based in Bangalore/Pune/Chennai/Gurugram/Hyderabad. Please find the job description below:
- 6+ months of hands-on experience with Databricks CI/CD implementation.
- Strong proficiency with CI/CD tools: Azure DevOps, GitHub Actions, Jenkins, or similar.
- Familiarity with the Databricks CLI, Databricks REST APIs, or the Databricks Terraform provider.
- Experience with Git, GitOps practices, and version control in collaborative environments.
- Proficiency in scripting languages such as Bash, Python, or PowerShell.
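Since this role centres on Databricks CI/CD, here is a hedged sketch of the kind of step such a pipeline might script: triggering a Databricks job through the Jobs REST API (POST /api/2.1/jobs/run-now). The workspace URL, token variable, and job ID are placeholder assumptions, not details from the posting:

```python
# Minimal sketch of a CI/CD step that triggers a Databricks job via the
# Jobs REST API. Host, token variable, and job ID are placeholders.
import os
import sys
import requests

host = os.environ["DATABRICKS_HOST"]            # e.g. the workspace URL
token = os.environ["DATABRICKS_TOKEN"]          # injected as a pipeline secret
job_id = int(os.environ.get("JOB_ID", "123"))   # hypothetical job to run after deploy

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": job_id},
    timeout=30,
)

if resp.status_code != 200:
    print(f"Failed to trigger job {job_id}: {resp.status_code} {resp.text}")
    sys.exit(1)

print(f"Triggered run {resp.json()['run_id']} for job {job_id}")
```

In practice a step like this would run in Azure DevOps or GitHub Actions after the notebooks or bundle have been deployed, with DATABRICKS_HOST and DATABRICKS_TOKEN supplied as pipeline secrets.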

Posted 1 month ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies