Job Title: Data Engineer
Profile code: LOC_DATA_ETL_SSA_1

At bpost group, data is more than numbers: it's a strategic asset that drives critical decisions across our logistics and e-commerce operations. As we continue to evolve into a more data-driven organization, we are investing in cutting-edge infrastructure and talent to unlock deeper insights and enable smarter resource allocation.

We are seeking a Data Engineer to join our Yield and Capacity Management team. In this role, you will play a central part in designing, building, and maintaining robust data pipelines and platforms that support advanced analytics, forecasting, and optimization of our operational capacity and pricing strategies. If you're passionate about scalable data architecture, enjoy working at the intersection of business and technology, and want to make a tangible impact on performance and profitability, this is your opportunity to help shape the future of data-driven logistics at bpost group.

Role Summary:
We are looking for a seasoned Data Engineer with strong experience in designing and building modern data platforms from the ground up. This role involves close collaboration with architects, DevOps, and business teams to establish a scalable, secure, and high-performing data ecosystem. The ideal candidate is hands-on, cloud-savvy, and passionate about data infrastructure, governance, and engineering best practices.
Key Responsibilities:

Platform Design and Architecture
- Design and implement the foundational architecture for a new enterprise-grade data platform
- Work with architects to define infrastructure, storage, and processing solutions aligned with business needs
- Ensure the platform adheres to security, compliance, and scalability standards

Data Ingestion and Pipeline Development
- Build scalable, reusable data ingestion pipelines from structured and unstructured sources
- Develop batch and streaming data workflows using modern ETL/ELT frameworks
- Ensure data quality, lineage, and monitoring

Required Skills:
- Experience with modern transformation tools (e.g., dbt, sqlmesh), Python, and Spark or similar processing engines
- Experience with AWS, Azure, or GCP data services
- Familiarity with data modeling, schema evolution, and partitioning strategies
- Competency in modern data orchestration tools (e.g., Apache Airflow, dbt)
- High-level understanding of the capabilities and the role of different technical areas (cloud engineering, platform engineering, analytics engineering, ML engineering)

Soft Skills:
- Strong problem-solving and systems-thinking mindset
- Effective communication and stakeholder management abilities
- Ability to balance strategic planning with hands-on implementation

Preferred Skills:
- Experience with Kubernetes, Docker, or serverless architectures
- Exposure to data mesh or domain-oriented data platform design
- Familiarity with tools like Apache Kafka, Delta Lake, or Iceberg