Posted: 12 hours ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Important Note (Please Read Before Applying)

Do NOT apply if:

• You have less than 5 years or more than 8 years of hands-on Azure Data Engineering experience

• You do not have hands-on experience with ADF, Databricks, and Synapse

• You are looking for remote-only roles

• Your background is not aligned with data engineering (support/testing only)

Apply ONLY if you meet ALL mandatory criteria above.


Job Title:

Location:

Experience:

Employment Type:

Notice Period:


About the Company

Our client is a trusted global innovator of IT and business services, present in 50+ countries. They specialize in digital & IT modernization, consulting, managed services, and industry-specific solutions. With a commitment to long-term success, they empower clients and society to move confidently into the digital future.



Data Architecture & Pipeline Development:

• Design, build, and maintain data pipelines using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics

• Develop and optimize ETL/ELT workflows

• Implement data models such as Star Schema, Snowflake, and Data Vault

• Build streaming and event-driven integrations using Azure Event Hubs, Stream Analytics, or Kafka


Data Management & Governance:

• Ensure data quality, consistency, and integrity across environments

• Implement metadata management, lineage, and cataloging

• Manage security and access control using Azure AD and RBAC

• Apply DevOps practices to data platform development


Cloud & DevOps Integration:

• Organize and manage data in Azure Data Lake Storage (ADLS Gen2)

• Provision and manage infrastructure using IaC tools

• Build and maintain CI/CD pipelines for data workloads

• Monitor and optimize compute and storage costs while ensuring scalability

Collaboration & Leadership:

• Work closely with data scientists, BI teams, and product owners

• Mentor junior data engineers on best practices and Azure services

• Participate in architectural reviews and platform design decisions

• Communicate technical concepts and trade-offs to non-technical stakeholders


Mandatory Skills:


Core Azure Skills:

✔ Azure Data Factory (pipelines, triggers, integration runtime)

✔ Azure Databricks (PySpark, Delta Lake, notebooks, jobs)

✔ Azure Synapse Analytics (Dedicated & Serverless SQL pools)

✔ Azure Data Lake Storage Gen2

✔ Azure Functions / Logic Apps

✔ Azure Event Hubs / Kafka / Stream Analytics


Programming & Tools:

✔ Python or Scala

✔ SQL

✔ PySpark

✔ Git, CI/CD, DevOps

✔ Power BI or other BI tool exposure is a plus


Data Architecture:

✔ Lakehouse architecture

✔ SCDs, fact & dimension tables

✔ Partitioning, indexing, and query optimization


Good to Have:

• API-based data ingestion & REST integrations

• ML data pipelines / feature stores

• Azure cost optimization & performance tuning


Soft Skills:

✔ Strong analytical & problem-solving skills

✔ Excellent communication and documentation abilities

✔ Comfortable working in Agile / fast-paced environments

✔ Proven cross-functional collaboration experience

✔ Self-driven with a strong passion for cloud & data engineering
