How DataChef helped PostNL design an organized process for development and delivery of its data products


Who is PostNL?

PostNL, formerly TNT N.V., operates mostly in the Netherlands and Belgium. It specializes in mail, parcels, and e-commerce, and provides universal delivery in the Netherlands as well as cross-border delivery services. Following the separation of TNT Express from the rest of the company in May 2011, the remainder became PostNL.

As of 2018, the company has more than 37,000 employees and generates €2.772 billion in annual revenue, with €209 million in operating income.



Why were we there?

To stay ahead of its competitors, PostNL recognized the value of data-driven logistics strategies. DataChef helped PostNL implement data mesh in response to the limitations of traditional data platforms. Our main responsibilities were:

  1. Training and assisting an internal expert data product team.
  2. Establishing the standards required for data mesh readiness by creating sample data products.

AWS is PostNL's main cloud platform. PostNL's current infrastructure includes no streaming or event-sourcing solution; however, most data sources and applications support webhooks or can communicate with downstream services via SNS or SQS. We therefore designed their event-sourcing stack around the following AWS technologies, keeping the migration bar as low as possible:

  • SQS: reliable message delivery between services.
  • Lambda: event processing.
  • DynamoDB: operational data storage and publishing of change events.
  • S3: long-retention storage for analytical use.
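The core of this stack is a Lambda function consuming SQS batches and preparing items for DynamoDB. The sketch below is illustrative, not PostNL's actual code: the event shape follows the standard SQS-to-Lambda integration, while the field names (`shipment_id`, `event_time`, `status`) and key schema are hypothetical. The actual DynamoDB write and S3 archival steps are indicated in comments.

```python
import json

def handler(event, context=None):
    """Consume an SQS batch (standard SQS -> Lambda event shape).

    Each record body is assumed to be a JSON-encoded shipment update.
    In the real stack, each item would be written to DynamoDB
    (table.put_item) for operational access, with DynamoDB Streams
    publishing change events and the raw payload archived to S3 for
    analytical use. Here we only parse and return the items.
    """
    items = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        items.append({
            "pk": payload["shipment_id"],   # hypothetical partition key
            "sk": payload["event_time"],    # hypothetical sort key
            "status": payload["status"],
        })
    # table.put_item(Item=item) / s3.put_object(...) would go here.
    return {"items": items}
```

Because the handler is a plain function of its event, it can be unit-tested locally with a hand-built SQS-shaped event before any AWS resources exist, which keeps the development feedback loop short.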


We helped PostNL by

  • Standardizing data accessibility patterns, giving the organization the flexibility to access data according to its analytical and operational needs.
  • Developing a data product programming model with the following objectives in mind:
    • Better communications between engineers and the business in order to improve efficiency.
    • Assisting developers in focusing on business logic and providing solutions in an optimized manner while enhancing their developer experience.
    • Providing a compass (technology and architecture) for developing future data products.
  • Designing, developing, and productionizing data products that attract users from the business and logistics teams.


The Impact We Made

  • The product development time was reduced from more than 3 months to 2 weeks.
  • Data product monitoring solutions were introduced to monitor:
    • Cost
    • Codebase quality
    • Data quality
  • The introduction of an intake process replaced a non-stop stream of feature requests with three one-hour meetings per delivery.
  • Standardizing documentation reduced the time it takes to onboard new developers from weeks to hours.
