Senior Data Engineer / Architect

8 years

4 - 8 Lacs

Posted: 2 weeks ago | Platform: GlassDoor

Work Mode: Remote

Job Type: Part Time

Job Description

Experience Required: 8+ years
Mode of work: Remote
Skills Required: Azure Databricks, Azure Event Hubs, Kafka, Architecture, Azure Data Factory, PySpark, Python, SQL, Spark
Notice Period: Immediate joiners; permanent or contract role (must be able to join by September 15, 2025)
  • Design, develop, and maintain scalable and robust data solutions in the cloud using Apache Spark and Databricks.
  • Gather and analyse data requirements from business stakeholders and identify opportunities for data-driven insights.
  • Build and optimize data pipelines for data ingestion, processing, and integration using Spark and Databricks (an illustrative sketch follows this list).
  • Ensure data quality, integrity, and security throughout all stages of the data lifecycle.
  • Collaborate with cross-functional teams to design and implement data models, schemas, and storage solutions.
  • Optimize data processing and analytics performance by tuning Spark jobs and leveraging Databricks features.
  • Provide technical guidance and expertise to junior data engineers and developers.
  • Stay up to date with emerging trends and technologies in cloud computing, big data, and data engineering.
  • Contribute to the continuous improvement of data engineering processes, tools, and best practices.
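To make the pipeline work described above concrete, the following is a minimal, illustrative PySpark Structured Streaming sketch of the kind of ingestion job this role involves: reading events from Kafka and landing them in a Delta table on Databricks. The broker address, topic name, event schema, and storage paths are hypothetical placeholders, not details taken from this posting.

# Minimal sketch: stream events from Kafka into a Delta table.
# Requires the Kafka connector and Delta Lake (bundled in Databricks runtimes).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Assumed event payload schema (illustrative only).
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream; the value column arrives as bytes and is parsed as JSON.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

# Write to a Delta table with a checkpoint location for fault tolerance.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                               # placeholder path
)
query.awaitTermination()

Azure Event Hubs exposes a Kafka-compatible endpoint, so the same connector pattern typically applies to Event Hubs sources as well.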
Requirements:
  • Bachelor’s or master’s degree in computer science, engineering, or a related field.
  • 10+ years of experience as a Data Engineer, Software Engineer, or similar role, with a focus on building cloud-based data solutions.
  • Strong knowledge of and hands-on experience with the Azure cloud platform, Databricks, Event Hubs, architecture, Spark, Kafka, ETL pipelines, Python/PySpark, SQL, and Copilot Studio.
  • Strong experience with cloud platforms such as Azure.
  • Experience with big data systems, including Apache Spark and Kafka.
  • Experience contributing to the architecture and design of large-scale distributed systems
  • Expertise in Databricks Lakehouse Platform, its architecture, and its capabilities.
  • Experience building production pipelines using Databricks and Azure services
  • Experience with multiple programming languages, such as Python and SQL.

Enable Data Incorporated

Information Technology

Data City
