Team Lead (ADF, Databricks, PySpark)

Experience: 0 - 2 years

Salary: 5 - 12 Lacs

Posted: 1 week ago | Platform: Indeed

Work Mode: On-site

Job Type: Full Time

Job Description

Company Name: PibyThree Consulting Pvt Ltd.

Job Title: Team Lead - Data Migration and Snowflake

Skills: Azure Data Factory, Databricks, PySpark, Snowflake & Data Migration

Location: Pune, Maharashtra

Website: PibyThree

Start Date:

About Us:

Πby3 is a cloud transformation company enabling enterprises for the future. We are a nimble, highly dynamic, and focused team with a passion for serving our clients with utmost trust and ownership. Our technology expertise, backed by years of experience, helps clients get solutions with optimized cost and reduced risk.

Job Description:

We are looking for an experienced Team Lead – Data Warehouse Migration, Data Engineering & BI to lead enterprise-level data transformation initiatives. The ideal candidate will have deep expertise in data warehouse migration, Snowflake, Power BI, and end-to-end data engineering using tools such as Azure Data Factory, Databricks, and PySpark.

Key Responsibilities:

  • Lead and manage data warehouse migration projects, including extraction, transformation, and loading (ETL/ELT) across legacy and modern platforms.
  • Architect and implement scalable Snowflake data warehousing solutions for analytics and reporting.
  • Develop and schedule robust data pipelines using Azure Data Factory and Databricks.
  • Write efficient and maintainable PySpark code for batch and real-time data processing.
  • Design and develop dashboards and reports using Power BI to support business insights.
  • Ensure data accuracy, security, and consistency throughout the project lifecycle.
  • Collaborate with stakeholders to understand data and reporting requirements.
  • Mentor and lead a team of data engineers and BI developers.
  • Manage project timelines, deliverables, and team performance effectively.

Must-Have Skills:

  • Data Migration: Hands-on experience with large-scale data migration, reconciliation, and transformation.
  • Snowflake: Data modeling, performance tuning, ELT/ETL development, role-based access control.
  • Azure Data Factory: Pipeline development, integration services, linked services.
  • Databricks: Spark SQL, notebooks, cluster management, orchestration.
  • PySpark: Advanced transformations, error handling, and optimization techniques.
  • Power BI: Data visualization, DAX, Power Query, dashboard/report publishing and maintenance.

Preferred Skills:

  • Familiarity with Agile methodologies and sprint-based development.
  • Experience in working with CI/CD for data workflows.
  • Ability to lead client discussions and manage stakeholder expectations.
  • Strong analytical and problem-solving abilities.

Job Type: Full-time

Pay: ₹500,000.00 - ₹1,200,000.00 per year

Schedule:

  • Day shift

Ability to commute/relocate:

  • Pune, Maharashtra: Reliably commute or plan to relocate before starting work (Preferred)

Education:

  • Bachelor's (Preferred)

Experience:

  • Total work: 4 years (Preferred)
  • PySpark: 2 years (Required)
  • Azure Data Factory: 2 years (Required)
  • Databricks: 2 years (Required)

Work Location: In person
