Posted: 17 hours ago | Platform: SimplyHired
Work Mode

On-site

Job Description

About Tarento:
Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions.
We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you’ll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Core Responsibilities
  • Architect and deliver scalable data solutions using Azure Databricks, Azure Data Factory, Azure Synapse, and related Azure services.
  • Design and oversee Lakehouse and ETL/ELT data platform architectures, efficiently moving and transforming large-scale data.
  • Develop, optimize, and manage robust data pipelines, ensuring high performance, reliability, and cost efficiency.
  • Implement unified data governance, including fine-grained security, data masking, encryption, and access controls with tools like Unity Catalog and Azure Purview.
  • Collaborate with stakeholders, analytics, and development teams to translate requirements into scalable solutions and pipelines.
  • Integrate workflows into CI/CD environments using Azure DevOps and Git, supporting continuous delivery for data projects.
  • Lead performance tuning and troubleshooting of Spark jobs and ETL processes, refactoring code for throughput and latency.
  • Maintain documentation of architectures, models, and configurations for compliance and knowledge sharing.
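The Lakehouse ETL/ELT work described above commonly follows a medallion pattern (bronze raw layer → silver cleaned layer → gold curated layer). As an illustrative sketch only — the record shape and cleaning rules here are hypothetical, and in practice this logic would run as a Spark/Delta transformation rather than plain Python — a silver-layer step might deduplicate by key (keeping the latest ingested record) and normalize fields:

```python
def to_silver(bronze_rows):
    """Hypothetical silver-layer transform: dedupe by id (keep the most
    recently ingested record) and normalize the email field."""
    latest = {}
    for row in bronze_rows:
        rid = row["id"]
        # keep the record with the most recent ingestion timestamp
        if rid not in latest or row["ingested_at"] > latest[rid]["ingested_at"]:
            latest[rid] = row
    # normalize email casing and whitespace on the surviving records
    return [{**r, "email": r["email"].strip().lower()} for r in latest.values()]

# Example bronze input with a duplicate id and messy email values
bronze = [
    {"id": 1, "email": " Alice@Example.com ", "ingested_at": 1},
    {"id": 1, "email": "alice@example.com", "ingested_at": 2},
    {"id": 2, "email": "BOB@example.com", "ingested_at": 1},
]
silver = to_silver(bronze)
```

The same keep-latest-per-key semantics map directly onto a Delta Lake `MERGE` or a windowed `row_number()` filter in Spark SQL.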

Required Skills
  • Deep expertise in Azure Databricks, Data Factory, Delta Lake, and Apache Spark.
  • Hands-on experience with data integration, data modelling, SQL, and Python.
  • Strong understanding of cloud-based data engineering, real-time processing, Lakehouse architectures, and event-driven design.
  • CI/CD process design using Azure DevOps, Git integration, and infrastructure-as-code deployment.
  • Knowledge of data governance frameworks, security best practices, and compliance standards (e.g., GDPR, HIPAA).
  • Excellent communication, leadership, and stakeholder management abilities.
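The governance skills listed above include data masking. In Azure deployments this is usually enforced declaratively (e.g., via Unity Catalog column masks or Purview policies), but the underlying rule is simple. A minimal sketch, with the masking policy itself being an assumption for illustration:

```python
def mask_pii(value, visible=4):
    """Illustrative masking rule: hide all but the last `visible`
    characters of a sensitive string (e.g., a card or account number)."""
    if len(value) <= visible:
        return "*" * len(value)
    return "*" * (len(value) - visible) + value[-visible:]
```

In Unity Catalog, an equivalent rule would be attached to a column as a masking function so that only privileged groups see the unmasked value.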

Job Description Bullet Points
  • 10+ years of data warehousing experience.
  • Lead architecture, design, and delivery of enterprise-scale data platforms on Azure Cloud.
  • Develop and optimize ETL/ELT pipelines and Databricks workflows for complex datasets across cloud domains.
  • Manage, tune, and administer Databricks clusters for performance and cost optimization.
  • Implement data security, governance, and compliance features using Azure services and Unity Catalog.
  • Collaborate on pipeline integration with downstream analytics, AI, and reporting teams.
  • Create and maintain architecture artefacts and reusable assets to accelerate data platform delivery.
  • Mentor, guide, and support junior engineers across DevOps, DataOps, and analytics.
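The cluster tuning and Spark performance work above often starts from a rule of thumb: size shuffle partitions so each handles a target volume (~128 MB is a common starting point, not an official formula). A hypothetical helper for that back-of-the-envelope calculation:

```python
import math

def shuffle_partitions(total_bytes, target_mb=128, min_partitions=1):
    """Suggest a shuffle partition count aiming for ~target_mb per
    partition. A heuristic starting point for spark.sql.shuffle.partitions,
    to be refined against actual job metrics."""
    target_bytes = target_mb * 1024 * 1024
    return max(min_partitions, math.ceil(total_bytes / target_bytes))

# e.g., a 10 GiB shuffle at ~128 MiB per partition suggests 80 partitions
suggested = shuffle_partitions(10 * 1024**3)
```

The suggested value would then be validated against the Spark UI (task skew, spill) before being fixed in job configuration.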
