Newscape Consulting

Newscape Consulting provides specialized business consulting services focused on enhancing operational efficiencies and strategic planning for small to medium-sized enterprises.

4 Job openings at Newscape Consulting
Senior Data Developer

Pune

6 - 8 years

INR 15.0 - 19.2 Lacs P.A.

Work from Office

Full Time

Responsibilities: Design, build, and maintain scalable and efficient data pipelines using PySpark, Spark SQL, and optionally Scala. Develop and manage data solutions on the Databricks platform, utilizing Workspace, Jobs, Delta Live Tables (DLT), Repos, and Unity Catalog.

Sr. Databricks Developer

Pune

7 - 9 years

INR 15.0 - 18.0 Lacs P.A.

Work from Office

Full Time

We are looking for a highly skilled Senior Databricks Developer to join our data engineering team. You will be responsible for building scalable and efficient data pipelines using Databricks, Apache Spark, Delta Lake, and cloud-native services (Azure/AWS/GCP). You will work closely with data architects, data scientists, and business stakeholders to deliver high-performance, production-grade solutions.

Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines on Databricks using PySpark, Spark SQL, and optionally Scala.
- Work with Databricks components including Workspace, Jobs, DLT (Delta Live Tables), Repos, and Unity Catalog.
- Implement and optimize Delta Lake solutions aligned with Lakehouse and Medallion architecture best practices.
- Collaborate with data architects, engineers, and business teams to understand requirements and deliver production-grade solutions.
- Integrate CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, or similar for Databricks deployments.
- Ensure data quality, consistency, governance, and security using tools like Unity Catalog or Azure Purview.
- Use orchestration tools such as Apache Airflow, Azure Data Factory, or Databricks Workflows to schedule and monitor pipelines.
- Apply strong SQL skills and data warehousing concepts in data modeling and transformation logic.
- Communicate effectively with technical and non-technical stakeholders to translate business requirements into technical solutions.

Required Skills and Qualifications:
- Hands-on data engineering experience, specifically with Databricks.
- Deep expertise in Databricks Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Strong programming skills in PySpark and Spark SQL; Scala experience is a plus.
- Proficient in working with one or more cloud platforms: Azure, AWS, or GCP.
- Experience with Delta Lake, Lakehouse architecture, and medallion architecture patterns.
- Proficient in building CI/CD pipelines for Databricks using DevOps tools.
- Familiarity with orchestration and ETL/ELT tools such as Airflow, ADF, or Databricks Workflows.
- Strong understanding of data governance, metadata management, and lineage tracking.
- Excellent analytical, communication, and stakeholder management skills.

Senior Databricks Engineer

Pune, Maharashtra, India

7 years

Not disclosed

On-site

Full Time

Title: Senior Databricks Engineer
Location: Baner, Pune
Employment Type: Full-Time
Availability: Immediate Joiners Only
Work Mode: Office
Experience: 7+ Years

Job Overview: We are seeking a highly skilled Senior Databricks Engineer to join our growing data engineering team. The ideal candidate will have deep expertise in building and optimizing scalable data pipelines on Databricks using PySpark, Spark SQL, and modern cloud-native tools. This role is crucial to delivering high-performance, production-ready solutions in collaboration with cross-functional teams including data scientists, architects, and business stakeholders.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines on Databricks using PySpark, Spark SQL, and optionally Scala.
- Work extensively with Databricks components such as Workspace, Jobs, DLT (Delta Live Tables), Repos, and Unity Catalog.
- Implement robust Delta Lake solutions adhering to Lakehouse and Medallion Architecture best practices.
- Collaborate closely with data architects, engineers, and business teams to translate requirements into technical solutions.
- Develop and manage CI/CD pipelines using tools like Azure DevOps, GitHub Actions, etc., for Databricks deployments.
- Ensure data quality, security, and governance using tools like Unity Catalog or Azure Purview.
- Utilize Airflow, Azure Data Factory, or Databricks Workflows for data orchestration and pipeline monitoring.
- Apply strong SQL and data modeling skills to support transformation and warehousing requirements.
- Communicate effectively with both technical and non-technical stakeholders.

Required Skills & Qualifications:
- Minimum 4 years of hands-on experience in Data Engineering, with a focus on Databricks.
- Expertise in Databricks Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Strong programming proficiency in PySpark and Spark SQL; knowledge of Scala is a plus.
- Experience with one or more cloud platforms: Azure, AWS, or GCP.
- In-depth understanding of Delta Lake, Lakehouse architecture, and medallion architecture patterns.
- Proven ability to implement CI/CD pipelines using DevOps tools.
- Familiarity with orchestration tools like Apache Airflow, Azure Data Factory, or Databricks Workflows.
- Strong knowledge of data governance, metadata management, and lineage tracking.
- Excellent problem-solving, communication, and stakeholder management skills.

If you're interested, please share your updated resume at jobs@newscapeconsulting.com. Contact number: +91 9702096664

Azure Data Engineer

Pune

5 - 10 years

INR 8.0 - 18.0 Lacs P.A.

Work from Office

Full Time

Job Title: Senior Databricks Engineer / Azure Databricks Engineer
Location: Baner, Pune, Maharashtra
Experience: 5+ Years
Availability: Immediate Joiner or within 15 Days
Work Type: Full-Time, Permanent

Company Overview: Newscape is a fast-growing digital services provider focused on transforming the healthcare ecosystem through advanced data and cloud solutions. We help clients modernize legacy systems, enabling them to stay agile in a rapidly changing digital landscape. Our specialization lies in delivering scalable, intelligent, and user-centric healthcare technology solutions.

Position Summary: We are seeking a seasoned Senior Databricks Engineer to join our data engineering team in Pune. The ideal candidate will bring deep expertise in Databricks, Spark technologies, Delta Lake, and cloud platforms (Azure/AWS/GCP), along with a passion for building highly scalable data pipelines. You will play a key role in implementing Lakehouse architecture, ensuring data quality, and integrating robust CI/CD and orchestration pipelines.

Key Responsibilities:
- Design, develop, and optimize large-scale data pipelines using Databricks, PySpark, and Spark SQL.
- Implement Lakehouse architecture using Delta Lake and follow medallion architecture principles (Bronze, Silver, Gold layers).
- Develop and manage Databricks components: Jobs, Delta Live Tables (DLT), Repos, Unity Catalog, and Workflows.
- Collaborate with data architects, data scientists, and business stakeholders to deliver scalable and maintainable data solutions.
- Ensure data governance, security, and compliance using tools like Unity Catalog and Azure Purview.
- Build CI/CD pipelines for data projects using Azure DevOps, GitHub Actions, or equivalent.
- Schedule and monitor workflows using Airflow, Azure Data Factory, or Databricks Workflows.
- Perform data modeling, transformation, and loading using strong SQL and data warehousing concepts.
- Translate complex business requirements into technical implementations with clear documentation and stakeholder alignment.
- Provide mentorship and technical guidance to junior team members.

Required Qualifications:
- 6+ years of experience in data engineering, including 4+ years on Databricks.
- Expert-level proficiency in Databricks Workspace, DLT, Jobs, Repos, and Unity Catalog.
- Strong hands-on knowledge of PySpark, Spark SQL, and optionally Scala.
- Experience with one or more major cloud platforms: Azure, AWS, or GCP (Azure preferred).
- Solid understanding and hands-on experience with Delta Lake, Lakehouse, and medallion architecture.
- Proven experience with CI/CD tools such as Azure DevOps, GitHub, Bitbucket, etc.
- Familiarity with orchestration tools like Apache Airflow, ADF, or Databricks Workflows.
- Understanding of data governance, lineage, and metadata management practices.
- Strong communication and collaboration skills, with the ability to interact effectively with technical and non-technical stakeholders.

Nice to Have:
- Experience in the healthcare domain and understanding of healthcare data standards.
- Exposure to machine learning workflows or support for data science teams.
- Certifications in Databricks, Azure, or other cloud platforms.

What We Offer:
- Opportunity to work on cutting-edge technologies and transformative healthcare projects.
- Collaborative work environment with a focus on learning and innovation.
- Competitive salary and performance-based growth.
- Work-life balance with flexible engagement for high performers.
- Exposure to global healthcare leaders and next-gen data platforms.

Thanks & regards,
Swapnil Supe
HR Executive
+91 8233829595
swapnil.supe@newscapeconsulting.com
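For candidates unfamiliar with the medallion pattern referenced in the postings above, the following is a minimal illustrative sketch (not part of any job description) of a Bronze-to-Silver Delta Lake step in PySpark. The paths and column names are hypothetical, and it assumes a Spark environment with the delta-spark package available; on Databricks the session and Delta settings are already provided.

# Hypothetical Bronze -> Silver step; paths and columns are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("medallion-sketch")
    # Delta Lake support (preconfigured on Databricks clusters)
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze: land raw JSON files as-is into a Delta table
raw = spark.read.json("/mnt/raw/events/")  # hypothetical source path
raw.write.format("delta").mode("append").save("/mnt/bronze/events")

# Silver: cleanse, de-duplicate, and conform types for downstream use
bronze = spark.read.format("delta").load("/mnt/bronze/events")
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())      # drop malformed records
    .dropDuplicates(["event_id"])               # one row per event
    .withColumn("event_date", F.to_date("event_ts"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/events")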

Newscape Consulting | Consulting | Atlanta | 20 Employees | 4 Jobs
