Position: Data Engineer (1Data Platform)
Location: Delhi
Type: Full-time, Onsite, Individual Contributor
Objective
Data powers price intelligence, BOM optimisation, and excess-inventory trades across the global electronics supply chain. Your mission is to safeguard the truthfulness and business relevance of every dataset we provide: spotting hidden anomalies, reconciling multi-source price feeds, and ensuring every metric we publish can be trusted by our customers' procurement, finance, and compliance teams.
Key Responsibilities
End-to-End Data Validation
-  Trace each data item from customs records and supplier catalogues through ELT transforms to the customer dashboard, verifying completeness, accuracy, timeliness, and lineage.
-  Define acceptance criteria and golden datasets for complex calculations such as landed cost, tariff duty, lifecycle risk scores, and alternate-part price floors; a minimal golden-dataset sketch follows this list.
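A minimal sketch of what such a golden-dataset check can look like; the `landed_cost` formula, figures, and tolerance below are illustrative assumptions, not the platform's actual rules:

```python
# Hypothetical golden-dataset check for a landed-cost calculation.
# The formula, figures, and 0.005 USD tolerance are illustrative only.

GOLDEN = [
    # (unit_price_usd, freight_usd, duty_rate, expected_landed_cost_usd)
    (10.00, 1.50, 0.075, 12.25),  # standard duty rate
    (10.00, 1.50, 0.000, 11.50),  # duty-exempt HS code
    (0.00,  1.50, 0.075,  1.50),  # zero-price sample shipment (boundary)
]

def landed_cost(unit_price, freight, duty_rate):
    """Toy rule: duty is assessed on the unit price only."""
    return unit_price + freight + unit_price * duty_rate

for price, freight, duty, expected in GOLDEN:
    got = landed_cost(price, freight, duty)
    assert abs(got - expected) < 0.005, (price, freight, duty, got, expected)
print("all golden landed-cost cases pass")
```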
 
Domain-Focused Test Design & Execution
-  Draft and execute detailed test plans that reflect real-world supply-chain scenarios: non-standard UOM shipments, multi-currency quotes, EOL announcements, QC failures, compliance flags, etc.
-  Challenge business rules such as import-value thresholds, MOQ rounding, and tariff/tax treatments by creating boundary and negative test cases; see the example after this list.
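For instance, a boundary-and-negative test suite for an MOQ-rounding rule could look like the sketch below; `round_up_to_moq` and its contract are hypothetical stand-ins for the real business rule:

```python
import math
import pytest

def round_up_to_moq(qty: int, moq: int) -> int:
    """Hypothetical rule: order quantity rounds up to the next multiple of the MOQ."""
    if qty <= 0 or moq <= 0:
        raise ValueError("qty and moq must be positive")
    return math.ceil(qty / moq) * moq

# Boundary cases: exactly at, just below, and just above an MOQ multiple.
@pytest.mark.parametrize("qty, moq, expected", [
    (100, 100, 100),  # exactly one MOQ
    (99,  100, 100),  # just below -> rounds up
    (101, 100, 200),  # just above -> next multiple
    (1,   100, 100),  # minimum order
])
def test_moq_boundaries(qty, moq, expected):
    assert round_up_to_moq(qty, moq) == expected

# Negative cases: invalid inputs must be rejected, not silently rounded.
@pytest.mark.parametrize("qty, moq", [(0, 100), (-5, 100), (10, 0)])
def test_moq_rejects_invalid_input(qty, moq):
    with pytest.raises(ValueError):
        round_up_to_moq(qty, moq)
```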
 
Exploratory Data Analysis
-  Use SQL, MongoDB, and ClickHouse queries to profile data distributions, identify outliers, and reconcile divergent prices across regions (IN, SEA, CN, US, EU, etc.); a sample profiling query follows this list.
-  Produce investigative reports for anomalies (spikes, missing batches, unit-of-measure mismatches) and drive root-cause resolution with the Data Engineering and Product Delivery teams.
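A sketch of the kind of profiling query this involves, run here against a hypothetical ClickHouse table `price_feed` via the clickhouse-driver package (table, columns, host, and thresholds are all assumptions):

```python
from clickhouse_driver import Client  # assumes the clickhouse-driver package

# Flag parts whose regional price spread over the last 30 days looks suspiciously wide.
QUERY = """
SELECT
    mpn,
    region,
    avg(unit_price_usd)       AS avg_price,
    stddevPop(unit_price_usd) AS sd_price,
    count()                   AS quotes
FROM price_feed
WHERE quote_date >= today() - 30
GROUP BY mpn, region
HAVING quotes >= 5 AND avg_price > 0 AND sd_price > 0.5 * avg_price
ORDER BY sd_price / avg_price DESC
LIMIT 50
"""

client = Client(host="clickhouse.internal")  # placeholder host
for mpn, region, avg_price, sd_price, quotes in client.execute(QUERY):
    print(f"{mpn} [{region}]: avg={avg_price:.4f} sd={sd_price:.4f} n={quotes}")
```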
 
Quality Governance & Stakeholder Alignment
-  Maintain a living Data-Quality Specification that maps every metric to source tables, transformation logic, and business meaning.
-  Conduct UAT sessions with supply-chain analysts and sourcing managers; capture feedback, raise defects, and sign off on releases.
-  Own data-quality SLAs and report status in weekly engineering reviews.
 
Process Improvement
-  Where automation adds clear value, introduce lightweight scripted checks (Great Expectations, custom SQL assertions) and embed them in CI; see the sketch after this list.
-  Mentor developers on writing testable SQL/ELT models and documenting edge cases.
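As one concrete shape this can take, a handful of custom SQL assertions wired into a CI step might look like the sketch below (check names, table, columns, and host are hypothetical; Great Expectations expresses the same idea declaratively):

```python
import sys
from clickhouse_driver import Client  # assumes the clickhouse-driver package

# Each assertion counts violating rows; zero means the check passes.
# Table and column names are illustrative, not the platform's actual schema.
CHECKS = {
    "no_negative_prices":    "SELECT count() FROM price_feed WHERE unit_price_usd < 0",
    "no_blank_mpn":          "SELECT count() FROM price_feed WHERE mpn = ''",
    "no_future_quote_dates": "SELECT count() FROM price_feed WHERE quote_date > today()",
}

client = Client(host="clickhouse.internal")  # placeholder host
failed = False
for name, sql in CHECKS.items():
    violations = client.execute(sql)[0][0]
    status = "FAIL" if violations else "PASS"
    print(f"{status} {name}: {violations} offending rows")
    failed = failed or bool(violations)

sys.exit(1 if failed else 0)  # a non-zero exit fails the CI job
```

Exiting non-zero lets the CI runner block a release before a bad batch reaches customer dashboards.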
 
Must-Have Qualifications
-  4+ years in data-centric QA, business analysis, or data engineering roles with a heavy validation component.
-  Deep domain knowledge of electronic components: MPN nomenclature, tariff/HS codes, landed-cost structures, lifecycle statuses (Active, NRND, EOL), supply-chain risks, and commodity compliance.
-  Strong SQL/NoSQL skills for ad-hoc investigation and reconciliation across MongoDB, PostgreSQL, or ClickHouse; comfortable writing complex queries and joins.
-  Practical experience validating multi-source price feeds, currency conversions, and time-series data.
-  Demonstrated ability to translate business rules into precise test cases and to communicate findings clearly to both engineers and non-technical stakeholders.
-  Meticulous attention to detail and an instinct for questioning numbers that look right but feel wrong.
 
Nice-to-Have
-  Familiarity with Great Expectations for declarative data testing.
-  Exposure to supply-chain or ERP data (SAP, Oracle, QAD) and EDI/CSV ingestion quirks.
-  Basic scripting in Python for one-off data probes and small automation hooks.
-  ISTQB (Advanced) or equivalent certification focused on data quality.
 