Databricks Lead Engineer

Experience: 9 - 13 years

Salary: 30 - 40 Lacs

Posted: 4 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

9+ years of overall experience, with a considerable portion of it in data engineering.

Mandatory Skills: 
1. Very strong in SQL and PySpark.
2. Good in Databricks, DLT, Medallion Architecture, Unity Catalog, and governance (see the sketch after this list).
3. Good knowledge of and experience with Azure Data Services (e.g., Azure Data Factory, Azure Blob Storage, Azure Data Lake, Key Vault).
4. Excellent written and spoken communication skills; ability to interact with the customer and relevant stakeholders.
5. Ability to understand the provided requirements, assist in architecting solutions, and translate them into implementation, following best practices.
6. Proactive, self-driven, motivated, and result-oriented; takes ownership and has an excellent learning attitude.
7. Knowledge of / experience with the Snowflake data warehouse.
8. Good understanding of ETL and data warehousing concepts.
9. Integration experience with data sources such as REST web services, Confluent Kafka, Oracle, SAP HANA, Salesforce, etc.
10. Excellent communication, presentation, stakeholder management, tracking, and delivery skills.
11. Ability to lead delivery and produce results: requirement gathering, working with cross-functional teams in offshore/onshore setups, and delivery planning.
12. Experience working with complex client ecosystems on large projects.
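To give a flavor of the pipeline work implied by items 1-3 (PySpark, Databricks/Delta, medallion architecture), the sketch below shows a minimal bronze-to-silver transformation. It assumes a Databricks/Delta environment; the table and column names (bronze.orders_raw, silver.orders, order_id, order_ts, amount) are hypothetical and not taken from this posting.

```python
# Minimal bronze -> silver medallion step; names and columns are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-silver-orders").getOrCreate()

# Bronze layer: raw ingested records, assumed to already exist as a Delta table.
bronze = spark.read.table("bronze.orders_raw")

# Silver layer: cleaned, typed, deduplicated records.
silver = (
    bronze
    .filter(F.col("order_id").isNotNull())                        # drop rows without a key
    .withColumn("order_ts", F.to_timestamp("order_ts"))           # enforce timestamp type
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))  # enforce numeric type
    .dropDuplicates(["order_id"])                                  # deduplicate on the key
)

# Write the curated table back as Delta for downstream gold-layer aggregates.
(silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.orders"))
```

In a DLT pipeline the same step would typically be declared as a table with expectations rather than written imperatively; the plain PySpark form is shown here only because it runs standalone.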

Good to have:
1. Experience with and conceptual knowledge of big data technologies: Hadoop, HDFS, Hive, Spark.
2. Experience with ETL/orchestration tools (e.g., Matillion, DataStage, DBT, Qlik, Airflow, Control-M).
3. Experience with GCP data services.


Company: Impetus Technologies

Industry: Information Technology and Services

Location: Bengaluru
