We are looking for a highly skilled Senior Databricks Developer to join our data engineering team. You will be responsible for building scalable and efficient data pipelines using Databricks, Apache Spark, Delta Lake, and cloud-native services (Azure/AWS/GCP). You will work closely with data architects, data scientists, and business stakeholders to deliver high-performance, production-grade solutions.

Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines on Databricks using PySpark, Spark SQL, and optionally Scala.
- Work with Databricks components including Workspace, Jobs, DLT (Delta Live Tables), Repos, and Unity Catalog.
- Implement and optimize Delta Lake solutions aligned with Lakehouse and Medallion architecture best practices.
- Collaborate with data architects, engineers, and business teams to understand requirements and deliver production-grade solutions.
- Integrate CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, or similar for Databricks deployments.
- Ensure data quality, consistency, governance, and security using tools like Unity Catalog or Azure Purview.
- Use orchestration tools such as Apache Airflow, Azure Data Factory, or Databricks Workflows to schedule and monitor pipelines.
- Apply strong SQL skills and data warehousing concepts in data modeling and transformation logic.
- Communicate effectively with technical and non-technical stakeholders to translate business requirements into technical solutions.

Required Skills and Qualifications:
- Hands-on data engineering experience, specifically in Databricks.
- Deep expertise in Databricks Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Strong programming skills in PySpark and Spark SQL; Scala experience is a plus.
- Proficiency with one or more cloud platforms: Azure, AWS, or GCP.
- Experience with Delta Lake, Lakehouse architecture, and medallion architecture patterns.
- Proficiency in building CI/CD pipelines for Databricks using DevOps tools.
- Familiarity with orchestration and ETL/ELT tools such as Airflow, ADF, or Databricks Workflows.
- Strong understanding of data governance, metadata management, and lineage tracking.
- Excellent analytical, communication, and stakeholder management skills.
Title: Senior Databricks Engineer
Location: Baner, Pune
Employment Type: Full-Time
Availability: Immediate Joiners Only
Work Mode: Office
Experience: 7+ Years

Job Overview:
We are seeking a highly skilled Senior Databricks Engineer to join our growing data engineering team. The ideal candidate will have deep expertise in building and optimizing scalable data pipelines on Databricks using PySpark, Spark SQL, and modern cloud-native tools. This role is crucial to delivering high-performance, production-ready solutions in collaboration with cross-functional teams including data scientists, architects, and business stakeholders.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines on Databricks using PySpark, Spark SQL, and optionally Scala.
- Work extensively with Databricks components such as Workspace, Jobs, DLT (Delta Live Tables), Repos, and Unity Catalog.
- Implement robust Delta Lake solutions adhering to Lakehouse and Medallion Architecture best practices.
- Collaborate closely with data architects, engineers, and business teams to translate requirements into technical solutions.
- Develop and manage CI/CD pipelines for Databricks deployments using tools such as Azure DevOps or GitHub Actions.
- Ensure data quality, security, and governance using tools like Unity Catalog or Azure Purview.
- Utilize Airflow, Azure Data Factory, or Databricks Workflows for data orchestration and pipeline monitoring.
- Apply strong SQL and data modeling skills to support transformation and warehousing requirements.
- Communicate effectively with both technical and non-technical stakeholders.

Required Skills & Qualifications:
- Minimum 4 years of hands-on experience in data engineering, with a focus on Databricks.
- Expertise in Databricks Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Strong programming proficiency in PySpark and Spark SQL; knowledge of Scala is a plus.
- Experience with one or more cloud platforms: Azure, AWS, or GCP.
- In-depth understanding of Delta Lake, Lakehouse architecture, and medallion architecture patterns.
- Proven ability to implement CI/CD pipelines using DevOps tools.
- Familiarity with orchestration tools like Apache Airflow, Azure Data Factory, or Databricks Workflows.
- Strong knowledge of data governance, metadata management, and lineage tracking.
- Excellent problem-solving, communication, and stakeholder management skills.

If you are interested, please share your updated resume at jobs@newscapeconsulting.com. Contact number: +91 9702096664.
Job Title: Senior Databricks Engineer / Azure Databricks Engineer
Location: Baner, Pune, Maharashtra
Experience: 5+ Years
Availability: Immediate Joiner or within 15 Days
Work Type: Full-Time, Permanent

Company Overview:
Newscape is a fast-growing digital services provider focused on transforming the healthcare ecosystem through advanced data and cloud solutions. We help clients modernize legacy systems, enabling them to stay agile in a rapidly changing digital landscape. Our specialization lies in delivering scalable, intelligent, and user-centric healthcare technology solutions.

Position Summary:
We are seeking a seasoned Senior Databricks Engineer to join our data engineering team in Pune. The ideal candidate brings deep expertise in Databricks, Spark technologies, Delta Lake, and cloud platforms (Azure/AWS/GCP), along with a passion for building highly scalable data pipelines. You will play a key role in implementing Lakehouse architecture, ensuring data quality, and integrating robust CI/CD and orchestration pipelines.

Key Responsibilities:
- Design, develop, and optimize large-scale data pipelines using Databricks, PySpark, and Spark SQL.
- Implement Lakehouse architecture using Delta Lake and follow medallion architecture principles (Bronze, Silver, Gold layers).
- Develop and manage Databricks components: Jobs, Delta Live Tables (DLT), Repos, Unity Catalog, and Workflows.
- Collaborate with data architects, data scientists, and business stakeholders to deliver scalable and maintainable data solutions.
- Ensure data governance, security, and compliance using tools like Unity Catalog and Azure Purview.
- Build CI/CD pipelines for data projects using Azure DevOps, GitHub Actions, or equivalent.
- Schedule and monitor workflows using Airflow, Azure Data Factory, or Databricks Workflows.
- Perform data modeling, transformation, and loading using strong SQL and data warehousing concepts.
- Translate complex business requirements into technical implementations with clear documentation and stakeholder alignment.
- Provide mentorship and technical guidance to junior team members.

Required Qualifications:
- 6+ years of experience in data engineering, including 4+ years on Databricks.
- Expert-level proficiency in Databricks Workspace, DLT, Jobs, Repos, and Unity Catalog.
- Strong hands-on knowledge of PySpark, Spark SQL, and optionally Scala.
- Experience with one or more major cloud platforms: Azure, AWS, or GCP (Azure preferred).
- Solid understanding and hands-on experience with Delta Lake, Lakehouse, and medallion architecture.
- Proven experience with CI/CD tools such as Azure DevOps, GitHub, Bitbucket, etc.
- Familiarity with orchestration tools like Apache Airflow, ADF, or Databricks Workflows.
- Understanding of data governance, lineage, and metadata management practices.
- Strong communication and collaboration skills, with the ability to interact effectively with technical and non-technical stakeholders.

Nice to Have:
- Experience in the healthcare domain and understanding of healthcare data standards.
- Exposure to machine learning workflows or support for data science teams.
- Certifications in Databricks, Azure, or other cloud platforms.

What We Offer:
- Opportunity to work on cutting-edge technologies and transformative healthcare projects.
- Collaborative work environment with a focus on learning and innovation.
- Competitive salary and performance-based growth.
- Work-life balance with flexible engagement for high performers.
- Exposure to global healthcare leaders and next-gen data platforms.

Thanks & regards,
Swapnil Supe
HR Executive
+91 8233829595
swapnil.supe@newscapeconsulting.com
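For candidates unfamiliar with the term, the medallion (Bronze/Silver/Gold) layering mentioned in the responsibilities can be illustrated with a minimal, dependency-free sketch. In a real Databricks pipeline each layer would be a Delta table transformed with PySpark; the plain Python structures and field names below are purely illustrative assumptions, not part of any actual pipeline.

```python
# Illustrative medallion-architecture sketch (hypothetical data, not a real pipeline).
# Bronze: raw ingested records, kept exactly as received.
raw_events = [
    {"id": "1", "amount": "100.5", "region": "west"},
    {"id": "2", "amount": "bad", "region": "west"},   # malformed record
    {"id": "3", "amount": "49.5", "region": "east"},
]

def to_silver(bronze):
    """Silver layer: cleanse and conform types, dropping unparsable rows."""
    silver = []
    for row in bronze:
        try:
            silver.append({
                "id": row["id"],
                "amount": float(row["amount"]),  # enforce numeric type
                "region": row["region"],
            })
        except ValueError:
            continue  # drop (or quarantine) records that fail validation
    return silver

def to_gold(silver):
    """Gold layer: business-level aggregate, e.g. revenue per region."""
    gold = {}
    for row in silver:
        gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]
    return gold

gold = to_gold(to_silver(raw_events))
print(gold)  # {'west': 100.5, 'east': 49.5}
```

The design point is that each layer only ever reads from the one before it: raw data stays replayable in Bronze, validation lives in Silver, and consumers query only Gold.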
If you want to kickstart or grow your career in the technology industry, this is your chance! 🌟

Join Newscape Consulting, a fast-growing IT solutions company working in AI/ML, Data, and Cloud.

📍 Location: Baner, Pune (Work from Office)
💼 Experience: Fresher to 2 Years

What You'll Do:
- Client/prospect outreach via email campaigns
- Social media marketing
- Connect with potential clients to understand their needs
- 🌍 Work closely with the senior team on developing solutions and giving demos to clients
- Perform all core Business Development Executive responsibilities
- Conduct market research & competitor analysis

We're Looking For:
- Graduates: BBA, BCA, CA, or equivalent; an understanding of technology is an added advantage
- Strong communication skills
- Eagerness to learn & grow in a fast-paced environment

Why Join Us?
✅ Real-world exposure to IT inside sales & business development
✅ Learn from experienced professionals
✅ Work on cutting-edge tech
✅ Great career growth opportunities

📩 Apply Now: Send your resume to swapnil.supe@newscapeconsulting.com
Subject Line: Inside Sales Executive – Your Name
As a Senior Databricks Developer at Newscape Consulting, you will play a crucial role in our data engineering team, focusing on building scalable and efficient data pipelines using Databricks, Apache Spark, Delta Lake, and cloud-native services (Azure/AWS/GCP). You will collaborate closely with data architects, data scientists, and business stakeholders to deliver high-performance, production-grade solutions that enhance user experience and productivity in the healthcare industry. Key skills include strong hands-on experience with Databricks, including Workspace, Jobs, DLT, Repos, and Unity Catalog. Proficiency in PySpark and Spark SQL is essential, and Scala is a plus. You should also have a solid understanding of Delta Lake, Lakehouse architecture, and medallion architecture, along with proficiency in at least one cloud platform such as Azure, AWS, or GCP. Experience in CI/CD for Databricks using tools like Azure DevOps or GitHub Actions, strong SQL skills, and familiarity with data warehousing concepts are essential for this role. Knowledge of data governance, lineage, and catalog tools like Unity Catalog or Purview will be beneficial, as will familiarity with orchestration tools like Airflow, Azure Data Factory, or Databricks Workflows. This position is based in Pune, India, and is a full-time, in-office role. Strong communication, problem-solving, and stakeholder management skills are key attributes we are looking for in the ideal candidate.