Digikore Studios - Data Engineer - ETL

Experience: 3 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

About The Role

We are seeking a skilled Data Engineer (ETL) to design, develop, and maintain robust ETL pipelines that enable seamless data integration and processing across multiple data sources. You will play a critical role in transforming raw data into actionable insights by ensuring efficient, scalable, and reliable data flows that support analytics, reporting, and machine learning initiatives.

Key Responsibilities

  • Design, build, and optimize scalable ETL pipelines to extract, transform, and load data from various structured and unstructured data sources.
  • Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver reliable data solutions.
  • Develop data workflows and automation using ETL tools (e.g., Apache NiFi, Talend, Informatica) or custom scripts (Python, SQL, Shell); a minimal Python sketch follows this list.
  • Monitor and troubleshoot ETL jobs to ensure high data quality and timely delivery.
  • Implement data validation, error handling, and logging to ensure data accuracy and integrity.
  • Optimize data storage and retrieval through database tuning, partitioning, and indexing strategies.
  • Collaborate with cloud engineers and infrastructure teams to manage data pipelines in cloud environments (AWS, Azure, GCP).
  • Document ETL processes, data lineage, and architecture to maintain knowledge sharing and compliance.
  • Stay current with new data engineering technologies and best practices to improve system performance and reliability.
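
To make the pipeline responsibilities above concrete, here is a minimal extract-transform-load sketch in Python against PostgreSQL. It is an illustration only, not the company's actual stack: the connection strings, the raw_orders and fact_orders table names, and the validation rule are hypothetical placeholders.

```python
import logging

import psycopg2  # assumed driver; PostgreSQL is one of the databases the posting names

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

# Hypothetical connection settings for illustration only.
SOURCE_DSN = "dbname=source_db user=etl"
TARGET_DSN = "dbname=warehouse user=etl"


def extract(conn):
    """Pull raw order rows from the (hypothetical) source table."""
    with conn.cursor() as cur:
        cur.execute("SELECT id, amount, created_at FROM raw_orders")
        return cur.fetchall()


def transform(rows):
    """Validate rows, logging and dropping records that fail basic checks."""
    clean = []
    for row_id, amount, created_at in rows:
        if amount is None or amount < 0:
            log.warning("skipping invalid row %s", row_id)
            continue
        clean.append((row_id, round(float(amount), 2), created_at))
    return clean


def load(conn, rows):
    """Insert validated rows into the (hypothetical) warehouse table.

    ON CONFLICT assumes a unique constraint on id, making reruns idempotent.
    """
    with conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO fact_orders (id, amount, created_at) "
            "VALUES (%s, %s, %s) ON CONFLICT (id) DO NOTHING",
            rows,
        )
    conn.commit()


if __name__ == "__main__":
    src = psycopg2.connect(SOURCE_DSN)
    tgt = psycopg2.connect(TARGET_DSN)
    try:
        data = transform(extract(src))
        load(tgt, data)
        log.info("loaded %d rows", len(data))
    finally:
        src.close()
        tgt.close()
```

The validation, warning logs, and idempotent insert correspond to the data-quality, error-handling, and logging duties listed above; in a production pipeline these steps would typically be orchestrated by one of the ETL tools the role mentions.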

Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or a related field.
  • 3+ years of experience in data engineering with a focus on ETL development.
  • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL, Oracle).
  • Hands-on experience with ETL tools and frameworks (Apache Airflow, Talend, Informatica, AWS Glue, etc.); see the Airflow sketch after this section.
  • Strong programming skills in Python, Java, or Scala.
  • Familiarity with big data technologies such as Hadoop, Spark, and Kafka is a plus.
  • Experience working with cloud data platforms (AWS Redshift, Google BigQuery, Azure Synapse).
  • Knowledge of data warehousing concepts and dimensional modeling.
  • Strong analytical and problem-solving skills.
  • Ability to work collaboratively in an agile team environment.
(ref:hirist.tech)
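
As an illustration of the Airflow experience asked for above, here is a minimal DAG sketch. The DAG id, schedule, and task callables are hypothetical, and the imports assume Airflow 2.x (the schedule parameter replaced schedule_interval in Airflow 2.4).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


# Hypothetical callables standing in for real extract/transform/load logic.
def extract():
    print("extract from source")


def transform():
    print("validate and reshape")


def load():
    print("write to warehouse")


with DAG(
    dag_id="daily_orders_etl",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day
    catchup=False,                   # skip backfilling past runs
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear extract -> transform -> load dependency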
