Software Engineer

0 - 4 years

0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

We are looking for a passionate and detail-oriented Software Engineer to join our Big Data / Data Warehouse (DWH) / ETL team. The ideal candidate has a strong understanding of data systems and data processing pipelines, along with an interest in large-scale data analytics. Fresh graduates with relevant academic projects or training in data engineering are also welcome to apply.

Responsibilities

  • Design, develop, and maintain ETL pipelines for efficient data extraction, transformation, and loading (a minimal pipeline sketch follows this list).
  • Work with Big Data technologies (Hadoop, Spark, Hive, etc.) to process and analyse large datasets.
  • Develop and maintain data warehouse models and ensure data integrity and consistency.
  • Optimise and tune ETL workflows for better performance and scalability.
  • Collaborate with data analysts, data scientists, and other engineering teams to deliver high-quality data solutions.
  • Monitor, debug, and troubleshoot data flow issues in production systems.
  • Write clean, maintainable, and well-documented code following best practices.
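
As referenced in the first responsibility above, much of the role involves building extract-transform-load pipelines on Spark. The snippet below is only an illustrative PySpark sketch of such a pipeline; the file paths, column names, and app name are hypothetical placeholders, not details taken from this posting.

  # Minimal PySpark ETL sketch: extract raw CSV, transform, load as partitioned Parquet.
  # All paths and column names below are illustrative only.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("orders_etl").getOrCreate()

  # Extract: read raw data from a landing zone
  raw = spark.read.option("header", True).csv("/data/landing/orders.csv")

  # Transform: fix types, drop incomplete rows, derive a partition column
  clean = (
      raw.withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("amount", F.col("amount").cast("double"))
         .dropna(subset=["order_id", "order_ts"])
         .withColumn("order_date", F.to_date("order_ts"))
  )

  # Load: write into the warehouse layer, partitioned by date
  clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/warehouse/orders")

  spark.stop()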

Requirements

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 0-4 years of experience in Data Engineering / ETL / Big Data development.
  • Familiarity with Python, Java, or Scala for data processing.
  • Understanding of cloud data platforms (AWS Redshift, Google BigQuery, Azure Synapse, Snowflake) is a plus.
  • Strong analytical, problem-solving, and debugging skills.
  • Hands-on experience with SQL and database concepts (RDBMS like MySQL, PostgreSQL, or Oracle).
  • Strong knowledge of ETL tools (Informatica, Talend, SSIS, Apache NiFi, or custom ETL pipelines).
  • Strong knowledge of Big Data technologies (Hadoop, Spark, Hive, HDFS).
  • Hands-on experience with data modelling and data warehousing concepts (Star/Snowflake schema; an illustrative schema follows this list).
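
The star-schema requirement above refers to modelling a central fact table that references surrounding dimension tables. The DDL below is a hypothetical illustration (run through sqlite3 only so the example is self-contained), not a schema used by this team.

  # Illustrative star schema: one fact table referencing two dimension tables.
  # All table and column names are hypothetical.
  import sqlite3

  ddl = """
  CREATE TABLE dim_date (
      date_key  INTEGER PRIMARY KEY,
      full_date TEXT,
      month     INTEGER,
      year      INTEGER
  );
  CREATE TABLE dim_product (
      product_key  INTEGER PRIMARY KEY,
      product_name TEXT,
      category     TEXT
  );
  CREATE TABLE fact_sales (
      sales_key   INTEGER PRIMARY KEY,
      date_key    INTEGER REFERENCES dim_date(date_key),
      product_key INTEGER REFERENCES dim_product(product_key),
      quantity    INTEGER,
      amount      REAL
  );
  """

  with sqlite3.connect(":memory:") as conn:
      conn.executescript(ddl)  # dimension tables around a fact table form the "star"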

Preferred Skills (Good To Have)

  • Experience with Airflow, Kafka, or AWS Glue (a minimal Airflow sketch follows this list).
  • Knowledge of Linux / Unix systems.
  • Familiarity with version control tools (Git, Bitbucket).
  • Understanding of CI/CD pipelines and data governance practices.
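
For the orchestration tooling mentioned above, a scheduler such as Airflow is typically used to run ETL steps on a schedule. The snippet below is a minimal sketch assuming Airflow 2.4+; the DAG id and the callable are hypothetical.

  # Minimal Airflow DAG sketch (assumes Airflow 2.4+): schedules one Python ETL task daily.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def run_etl():
      # Placeholder for the actual extract/transform/load logic
      print("running ETL step")

  with DAG(
      dag_id="daily_orders_etl",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      PythonOperator(task_id="run_etl", python_callable=run_etl)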

This job was posted by Sarthak Agrawal from Material Depot.
