6 - 8 years
8 - 10 Lacs
Pune, Bengaluru, Noida
Hybrid
Work Mode: Hybrid (3 days WFO)
Locations: Bangalore, Noida, Pune, Mumbai, Hyderabad (candidates must be based in Accion cities to collect assets and attend in-person meetings as required)

Key Requirements:

Technical Skills:
- Databricks Expertise: 5+ years of hands-on data engineering/ETL experience using Databricks on AWS/Azure cloud infrastructure. Proficiency in Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), MLflow, and Databricks SQL. Experience with Databricks CI/CD tooling (e.g., Bitbucket, GitHub Actions, Databricks CLI).
- Data Warehousing & Engineering: Strong understanding of data warehousing concepts (Dimensional, SCD2, Data Vault, OBT, etc.). Proven ability to implement highly performant data ingestion pipelines from multiple sources. Experience integrating end-to-end Databricks pipelines to ensure data quality and consistency.
- Programming: Strong proficiency in Python and SQL. Basic working knowledge of API- or stream-based data extraction (e.g., Salesforce API, Bulk API).
- Cloud Technologies: Experience with AWS services (e.g., S3, Athena, Glue, Lambda) preferred.
- Power BI: 3+ years of experience with Power BI and data warehousing for root cause analysis and identifying business improvement opportunities.

Additional Skills:
- Working knowledge of Data Management principles (quality, governance, security, privacy, lifecycle management, cataloging).
- Nice to have: Databricks certifications and AWS Solutions Architect certification.
- Nice to have: Experience building data pipelines from business applications such as Salesforce, Marketo, NetSuite, Workday, etc.

Responsibilities:
- Develop, implement, and maintain highly efficient ETL pipelines on Databricks.
- Perform root cause analysis and identify opportunities for data-driven business improvements.
- Ensure quality, consistency, and governance of all data pipelines and repositories.
- Work in an Agile/DevOps environment to deliver iterative solutions.
- Collaborate with cross-functional teams to meet business requirements.
- Stay current with the latest Databricks and AWS features, tools, and best practices.

Work Schedule: Regular hours, 11:00 AM to 8:00 PM; flexibility is required for project-based overlap.

Interested candidates should share their resumes along with the following details:
- Current CTC
- Expected CTC
- Preferred Location (Bangalore, Noida, Pune, Mumbai, Hyderabad)
- Notice Period
- Contact Information
Posted 3 months ago