Experience: 4 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

This role is for one of Weekday's clients.

Min Experience: 4 years
Location: India
Job Type: Full-time

We are seeking an experienced Data Engineer with strong expertise in Databricks and modern data engineering practices. The ideal candidate will have 4+ years of hands-on experience developing scalable data pipelines, managing distributed data systems, and supporting end-to-end CI/CD processes. This role involves architecting and optimizing data workflows that enable seamless data-driven decision-making across the organization.

Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines for large-scale datasets using Spark, Hive, or Glue.
  • Develop and optimize data integration workflows using ETL tools such as Informatica, Talend, or SSIS.
  • Write, optimize, and maintain complex SQL queries for data transformation and analytics.
  • Collaborate with cross-functional teams including data scientists, analysts, and product stakeholders to translate requirements into technical solutions.
  • Deploy data workflows using CI/CD pipelines and ensure smooth automated releases.
  • Monitor and optimize data workflows for performance, scalability, and reliability.
  • Ensure data accuracy, governance, security, and compliance across pipelines.
  • Work with cloud-based data platforms such as Azure (ADF, Synapse, Databricks) or AWS (EMR, Glue, S3, Athena).
  • Maintain clear documentation of data systems, architectures, and processes.
  • Provide mentorship and technical guidance to junior team members.
  • Stay current with emerging data engineering tools, technologies, and best practices.
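To make the first two bullets concrete: the core of an ETL/ELT pipeline is extract-and-transform logic applied to raw records. The sketch below is illustrative only — the dataset, field names, and cleaning rules are hypothetical, and in this role the equivalent logic would run as a Spark/Databricks job rather than plain Python; stdlib modules are used here just to keep the example self-contained.

```python
import csv
import io

# Hypothetical raw extract: inconsistent whitespace and casing,
# the kind of input a transform step is expected to normalize.
RAW = """order_id,amount,currency
1001, 250.00 ,inr
1002,99.5,INR
"""

def extract(text):
    """Parse the raw CSV extract into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Normalize types, trim whitespace, and standardize currency codes."""
    out = []
    for r in rows:
        out.append({
            "order_id": int(r["order_id"]),
            "amount": round(float(r["amount"].strip()), 2),
            "currency": r["currency"].strip().upper(),
        })
    return out

records = transform(extract(RAW))
```

In a production pipeline the same pattern appears as Spark DataFrame transformations (e.g. column casts and `trim`/`upper` expressions) with the load step writing Parquet to HDFS, ADLS, or S3.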

What You'll Bring

  • Bachelor's degree in IT, Computer Science, or related field.
  • 4+ years of experience in data engineering and distributed data processing.
  • Strong hands-on experience with Databricks or equivalent technologies (Spark, EMR, Hadoop).
  • Proficiency in Python or Scala.
  • Experience with modern data warehouses (Snowflake, Redshift, Oracle).
  • Solid understanding of distributed storage systems (HDFS, ADLS, S3) and formats such as Parquet and ORC.
  • Familiarity with orchestration tools such as ADF, Airflow, or Step Functions.
  • Databricks Data Engineering Professional certification (preferred).
  • Experience in multi-cloud or migration-based projects is a plus.
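The "complex SQL queries for data transformation and analytics" expectation above typically means window functions, partitioned aggregations, and similar constructs. A minimal, self-contained sketch using `sqlite3` (the table and columns are hypothetical; on the job the same SQL would run against Databricks SQL, Snowflake, or Redshift):

```python
import sqlite3

# In-memory database with a hypothetical sales table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales(region TEXT, month TEXT, revenue REAL);
INSERT INTO sales VALUES
  ('south','2024-01',100),('south','2024-02',150),
  ('north','2024-01',80),('north','2024-02',60);
""")

# Window function: per-region running total of revenue over time.
rows = con.execute("""
SELECT region, month, revenue,
       SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM sales
ORDER BY region, month
""").fetchall()
```

`PARTITION BY` restarts the running total for each region, so the query yields one cumulative series per region — a pattern that carries over unchanged to the cloud warehouses listed above.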
