Posted: 1 week ago
Job description
Location: Kuwait (on-site)
Experience Required: 7 years
Employment Type: Full-time
Department: Data Engineering / Analytics
Job Summary:
We are seeking a skilled and detail-oriented Data Pipeline Engineer to design, build, and manage scalable data pipelines covering ingestion, integration, quality control, transformation, and dashboard delivery. The ideal candidate will work closely with upstream systems such as Seabed and Sahala and ensure seamless delivery of clean, actionable data to business intelligence and operational systems.
Key Responsibilities:
1. Data Pipeline Development
Extract data from Seabed, Sahala, and other relevant upstream systems.
Design and implement batch/streaming ingestion processes into data lakes or data warehouses.
Integrate multi-source data into a centralized pipeline ensuring schema consistency and data alignment.
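By way of illustration, a minimal sketch of one batch ingestion step is shown below, assuming a relational upstream source reachable through SQLAlchemy and a Parquet-based data lake; the connection string, table name, and lake path are hypothetical placeholders rather than the actual Seabed/Sahala interfaces.

import os

import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "postgresql://user:password@upstream-host/prod"  # hypothetical upstream connection
LAKE_ROOT = "/data/lake/raw"                                   # hypothetical data lake root

def ingest_table(table_name: str, partition_date: str) -> str:
    """Pull one day's rows from an upstream table and land them as Parquet in the lake."""
    engine = create_engine(SOURCE_URL)
    # Illustration only: parameterize the query properly in production code.
    query = f"SELECT * FROM {table_name} WHERE updated_at >= '{partition_date}'"
    df = pd.read_sql(query, engine)

    out_dir = f"{LAKE_ROOT}/{table_name}/dt={partition_date}"
    os.makedirs(out_dir, exist_ok=True)
    path = f"{out_dir}/part-000.parquet"
    df.to_parquet(path, index=False)  # requires pyarrow or fastparquet
    return path

A streaming variant would typically replace the query with a Kafka or CDC consumer, but the landing pattern into partitioned Parquet stays the same.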
2. Data Quality (QC) Implementation
Define and enforce data quality checks such as:
Completeness
Accuracy
Consistency
Timeliness
Duplicate handling
Create alert systems, logs, and reports for data quality failures and anomalies.
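As a rough illustration of what such checks might look like in Python with pandas (column names and thresholds are hypothetical):

import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Compute completeness, duplicate, and timeliness metrics for one batch."""
    results = {}
    # Completeness: share of nulls in each required column (columns are hypothetical).
    for col in ("well_id", "measurement_ts", "value"):
        results[f"null_ratio_{col}"] = float(df[col].isna().mean())
    # Duplicate handling: repeated business keys.
    results["duplicate_rows"] = int(df.duplicated(subset=["well_id", "measurement_ts"]).sum())
    # Timeliness: hours since the newest record arrived.
    latest = pd.to_datetime(df["measurement_ts"], utc=True).max()
    results["hours_since_latest"] = (pd.Timestamp.now(tz="UTC") - latest).total_seconds() / 3600
    return results

def quality_failures(results: dict, max_null_ratio: float = 0.01, max_lag_hours: float = 24.0) -> list:
    """Return human-readable messages for checks that breach their thresholds."""
    failures = [f"{name}={value:.3f}" for name, value in results.items()
                if name.startswith("null_ratio_") and value > max_null_ratio]
    if results["duplicate_rows"] > 0:
        failures.append(f"duplicate_rows={results['duplicate_rows']}")
    if results["hours_since_latest"] > max_lag_hours:
        failures.append(f"hours_since_latest={results['hours_since_latest']:.1f}")
    return failures

Failure messages produced this way can feed the alerting and logging side, for example written to a log table or pushed to a notification channel.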
3. Data Transformation Logic
Design and apply transformation logic to convert raw data into analysis-ready formats.
Implement business logic for filtering, aggregation, and enrichment.
Prepare data outputs suitable for:
Analytical dashboards
Operational workflows
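A small sketch of the filter / aggregate / enrich pattern described in this section, again in pandas with hypothetical column names:

import pandas as pd

def transform_daily(raw: pd.DataFrame, asset_lookup: pd.DataFrame) -> pd.DataFrame:
    """Filter invalid readings, aggregate to a daily grain, and enrich with reference data."""
    # Filter: drop null or negative readings (the business rule here is illustrative).
    clean = raw[raw["value"].notna() & (raw["value"] >= 0)].copy()

    # Aggregate: one row per well per day.
    clean["date"] = pd.to_datetime(clean["measurement_ts"]).dt.date
    daily = (clean.groupby(["well_id", "date"], as_index=False)
                  .agg(total_value=("value", "sum"),
                       reading_count=("value", "size")))

    # Enrich: attach asset attributes such as field and operator from a lookup table.
    return daily.merge(asset_lookup, on="well_id", how="left")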
4. System & Dashboard Workflow Integration
Integrate processed data with BI tools (e.g., Power BI, Tableau) and automation systems (e.g., APIs, scripts).
Validate output data for compatibility with front-end dashboards and backend systems.
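On the automation side, one way this hand-off might look is a basic compatibility check followed by a POST to a downstream REST endpoint; the URL, token, and expected columns below are hypothetical, and Power BI or Tableau would each use their own publishing mechanisms instead.

import pandas as pd
import requests

API_URL = "https://example.internal/api/v1/datasets/daily_production"  # hypothetical endpoint
API_TOKEN = "replace-me"                                                # hypothetical credential

def publish_output(df: pd.DataFrame) -> None:
    """Validate basic schema compatibility, then POST the rows as JSON downstream."""
    expected = {"well_id", "date", "total_value"}
    missing = expected - set(df.columns)
    if missing:
        raise ValueError(f"Output is not dashboard-compatible; missing columns: {missing}")

    # Dates to ISO strings so the payload is JSON-serializable.
    payload = df.assign(date=df["date"].astype(str)).to_dict(orient="records")
    resp = requests.post(API_URL, json=payload,
                         headers={"Authorization": f"Bearer {API_TOKEN}"},
                         timeout=30)
    resp.raise_for_status()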
5. Documentation & Handover
Create clear documentation for:
Data architecture
Data flow diagrams
QC rules
Transformation logic
Provide technical/user documentation for ongoing support and future enhancements.
6. Testing & Validation
Perform end-to-end testing including:
Source-to-target validation
QC & transformation rule validation
Dashboard/system integration testing
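A minimal example of the source-to-target part of such testing, comparing per-partition row counts between the upstream extract and the warehouse table (connection strings and table names are hypothetical, and checksums over key columns could be added the same way):

from sqlalchemy import create_engine, text

SOURCE_URL = "postgresql://user:password@upstream-host/prod"        # hypothetical
TARGET_URL = "postgresql://user:password@warehouse-host/analytics"  # hypothetical

def validate_row_counts(table: str, partition_date: str) -> bool:
    """Compare row counts for one partition between the source system and the warehouse."""
    query = text(f"SELECT COUNT(*) FROM {table} WHERE dt = :d")  # table name is illustrative
    with create_engine(SOURCE_URL).connect() as conn:
        n_source = conn.execute(query, {"d": partition_date}).scalar()
    with create_engine(TARGET_URL).connect() as conn:
        n_target = conn.execute(query, {"d": partition_date}).scalar()
    return n_source == n_target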
Required Skills & Qualifications:
Bachelor’s/Master’s degree in Computer Science, Data Engineering, or a related field.
Strong experience in building ETL/ELT pipelines (batch & streaming).
Proficiency in SQL, Python, or Spark.
Experience with data lakes/data warehouses (e.g., AWS Redshift, Snowflake, Azure Data Lake).
Hands-on with BI tools like Power BI or Tableau.
Familiarity with data quality frameworks and validation tools.
Excellent documentation and communication skills.
EICE Technology