Newscape Consulting provides specialized business consulting services focused on enhancing operational efficiencies and strategic planning for small to medium-sized enterprises.
Pune, Baner
INR 9.0 - 12.0 Lacs P.A.
Work from Office
Full Time
- Develop, test, and maintain web applications using Blazor and .NET Core.
- Work with RESTful APIs to integrate with backend services.
- Collaborate with the team in an Agile/Scrum environment.
- Implement and maintain UI components using Blazor WebAssembly and Blazor Server.
- Optimize application performance and ensure a responsive UI across devices.
- Design, develop, and debug software using C# and SQL Server.
- Follow SOLID principles and coding best practices.
- Participate in code reviews and contribute to improving software architecture.

The ideal candidate has experience with the Blazor framework, .NET Core, C#, REST APIs, and SQL Server, along with a good understanding of front-end development.
Pune
INR 15.0 - 19.2 Lacs P.A.
Work from Office
Full Time
Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines using PySpark, Spark SQL, and optionally Scala.
- Develop and manage data solutions on the Databricks platform, utilizing Workspace, Jobs, Delta Live Tables (DLT), Repos, and Unity Catalog.
Pune
INR 15.0 - 18.0 Lacs P.A.
Work from Office
Full Time
We are looking for a highly skilled Senior Databricks Developer to join our data engineering team. You will be responsible for building scalable and efficient data pipelines using Databricks, Apache Spark, Delta Lake, and cloud-native services (Azure/AWS/GCP). You will work closely with data architects, data scientists, and business stakeholders to deliver high-performance, production-grade solutions.

Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines on Databricks using PySpark, Spark SQL, and optionally Scala.
- Work with Databricks components including Workspace, Jobs, DLT (Delta Live Tables), Repos, and Unity Catalog.
- Implement and optimize Delta Lake solutions aligned with Lakehouse and Medallion architecture best practices.
- Collaborate with data architects, engineers, and business teams to understand requirements and deliver production-grade solutions.
- Integrate CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, or similar for Databricks deployments.
- Ensure data quality, consistency, governance, and security using tools like Unity Catalog or Azure Purview.
- Use orchestration tools such as Apache Airflow, Azure Data Factory, or Databricks Workflows to schedule and monitor pipelines.
- Apply strong SQL skills and data warehousing concepts in data modeling and transformation logic.
- Communicate effectively with technical and non-technical stakeholders to translate business requirements into technical solutions.

Required Skills and Qualifications:
- Hands-on data engineering experience, specifically with Databricks.
- Deep expertise in Databricks Workspace, Jobs, DLT, Repos, and Unity Catalog.
- Strong programming skills in PySpark and Spark SQL; Scala experience is a plus.
- Proficient with one or more cloud platforms: Azure, AWS, or GCP.
- Experience with Delta Lake, Lakehouse architecture, and Medallion architecture patterns.
- Proficient in building CI/CD pipelines for Databricks using DevOps tools.
- Familiarity with orchestration and ETL/ELT tools such as Airflow, ADF, or Databricks Workflows.
- Strong understanding of data governance, metadata management, and lineage tracking.
- Excellent analytical, communication, and stakeholder management skills.