Data Pipeline Engineer

7 years


Posted: 1 week ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Location: Kuwait (On-site)


Experience Required: 7 Years

Employment Type: Full-time

Department: Data Engineering / Analytics



Job Summary:

We are seeking a skilled and detail-oriented Data Pipeline Engineer to design, build, and manage scalable data pipelines for ingestion, integration, quality control, transformation, and dashboard integration. The ideal candidate will work closely with upstream systems like Seabed and Sahala, and ensure seamless delivery of clean, actionable data to business intelligence and operational systems.



Key Responsibilities:



1. Data Pipeline Development

Extract data from Seabed, Sahala, and other relevant upstream systems.

Design and implement batch/streaming ingestion processes into data lakes or data warehouses.

Integrate multi-source data into a centralized pipeline ensuring schema consistency and data alignment.



2. Data Quality (QC) Implementation

Define and enforce data quality checks such as:

Completeness

Accuracy

Consistency

Timeliness

Duplicate handling

Create alert systems, logs, and reports for data quality failures and anomalies.
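To give candidates a sense of the kind of work involved, the checks above might be sketched as follows. This is an illustrative example only, not part of the role's actual codebase; the field names (`well_id`, `reading`) and the `qc_report` helper are hypothetical.

```python
# Minimal sketch of two of the QC checks listed above:
# completeness and duplicate handling.
# Field names ("well_id", "reading") are illustrative, not from the posting.

def qc_report(rows, required_fields, key_field):
    """Return counts of incomplete rows and duplicate-keyed rows."""
    # completeness: every required field must be present and non-empty
    incomplete = [r for r in rows
                  if any(r.get(f) in (None, "") for f in required_fields)]
    # duplicate handling: flag rows whose key has already been seen
    seen, duplicates = set(), []
    for r in rows:
        key = r.get(key_field)
        if key in seen:
            duplicates.append(r)
        seen.add(key)
    return {"incomplete": len(incomplete), "duplicates": len(duplicates)}

rows = [
    {"well_id": "W1", "reading": 10.5},
    {"well_id": "W2", "reading": None},   # fails completeness
    {"well_id": "W1", "reading": 10.5},   # duplicate key
]
print(qc_report(rows, required_fields=["well_id", "reading"],
                key_field="well_id"))
# → {'incomplete': 1, 'duplicates': 1}
```

In practice these checks would feed the alerting and reporting layer described above rather than a simple print.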



3. Data Transformation Logic

Design and apply transformation logic to convert raw data into analysis-ready formats.

Implement business logic for filtering, aggregation, and enrichment.

Prepare data outputs suitable for:

Analytical dashboards

Operational workflows
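The filter/aggregate/enrich pattern above could look something like this minimal sketch. All names (`site`, `reading`, `region_lookup`) are hypothetical placeholders, not taken from the posting.

```python
# Illustrative transformation step: filter out invalid rows, aggregate
# readings per site, then enrich each result from a reference lookup.

def transform(rows, region_lookup):
    # filter: drop rows without a reading
    valid = [r for r in rows if r.get("reading") is not None]
    # aggregate: running (sum, count) per site, then average
    totals = {}
    for r in valid:
        s, n = totals.get(r["site"], (0.0, 0))
        totals[r["site"]] = (s + r["reading"], n + 1)
    # enrich: attach region from a reference table
    return [{"site": site, "avg_reading": s / n,
             "region": region_lookup.get(site)}
            for site, (s, n) in totals.items()]

rows = [
    {"site": "A", "reading": 1.0},
    {"site": "A", "reading": 3.0},
    {"site": "B", "reading": None},  # filtered out
]
print(transform(rows, region_lookup={"A": "North"}))
# → [{'site': 'A', 'avg_reading': 2.0, 'region': 'North'}]
```

A real pipeline would typically express this in SQL or Spark rather than plain Python, but the shape of the logic is the same.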



4. System & Dashboard Workflow Integration

Integrate processed data with BI tools (e.g., Power BI, Tableau) and automation systems (e.g., APIs, scripts).

Validate output data for compatibility with front-end dashboards and backend systems.



5. Documentation & Handover

Create clear documentation for:

Data architecture

Data flow diagrams

QC rules

Transformation logic

Provide technical/user documentation for ongoing support and future enhancements.



6. Testing & Validation

Perform end-to-end testing including:

Source-to-target validation

QC & transformation rule validation

Dashboard/system integration testing
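Source-to-target validation, the first item above, is often a matter of comparing row counts and key sets between the upstream extract and the loaded target. A hedged sketch, with a hypothetical `id` key field:

```python
# Compare source and target row sets by count and by key, reporting
# any keys dropped or unexpectedly introduced during the load.

def validate_source_to_target(source_rows, target_rows, key_field):
    src_keys = {r[key_field] for r in source_rows}
    tgt_keys = {r[key_field] for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

source = [{"id": 1}, {"id": 2}]
target = [{"id": 1}]  # row with id=2 was lost in the load
print(validate_source_to_target(source, target, key_field="id"))
# → {'count_match': False, 'missing_in_target': [2], 'unexpected_in_target': []}
```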



Required Skills & Qualifications:

Bachelor’s/Master’s degree in Computer Science, Data Engineering, or related field.

Strong experience in building ETL/ELT pipelines (batch & streaming).

Proficiency in SQL, Python, or Spark.

Experience with data lakes/data warehouses (e.g., AWS Redshift, Snowflake, Azure Data Lake).

Hands-on with BI tools like Power BI or Tableau.

Familiarity with data quality frameworks and validation tools.

Excellent documentation and communication skills.
