AlgoRail: Demands for more transparent data management in the Dutch public sector

For years, the Dutch government used a secret algorithm to fight social benefits fraud. In February 2020, a court ordered its immediate halt due to privacy and transparency violations. Civil society organizations demand that citizens be better informed about what their data is used for in the public sector. These insights, written by Koen Vervloesem, make up the second stop of our AlgoRail summer journey around Europe, finding out more about how algorithmic systems are deployed in our European neighborhood.

In its fight against fraud, the Dutch government has been cross-referencing citizens' personal data from various databases since 2014. This system, called SyRI (short for “system risk indication”), aims to find “unlikely citizen profiles” that warrant further investigation. Despite major objections, SyRI was implemented without any transparency for citizens about what happens to their data.

The idea is this: if certain government agencies suspect fraud involving benefits, allowances, or taxes in a specific neighborhood, they can make use of SyRI, which then decides which citizens in the neighborhood need to be investigated further.

False positives?

There is a detailed procedure for government agencies that want to use SyRI. The agency that asks for the analysis cannot simply penalize the citizens who are flagged for an unlikely combination of data. First, the Ministry of Social Affairs and Employment examines the flagged citizens for false positives. Data on citizens deemed false positives is not handed over.

No transparency

But even with these checks in place, the lack of transparency remains a big issue. Citizens are not automatically warned if SyRI flags them as a fraud risk, and they cannot access the reasons why they have been flagged, which data SyRI uses, or how. All that is known, from the official resolution that forms SyRI's legal basis, is that the system can cross-reference data about work, fines, penalties, taxes, properties, housing, education, retirement, debts, benefits, allowances, subsidies, permits and exemptions, and more. In early 2018, the Platform Bescherming Burgerrechten filed a case against the Dutch state to stop the use of SyRI.

Black box

SyRI pseudonymizes the data sources it uses with a ‘black box’ method: in every data source that is linked, each citizen's name is replaced by a unique identifier for that individual.
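How SyRI actually generates these identifiers is secret, but the general technique is well known. A minimal sketch, assuming a keyed hash (HMAC) as the pseudonymization function (the key name and example records below are purely hypothetical):

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would be held by a trusted party,
# not hard-coded. The real SyRI method is undisclosed.
SECRET_KEY = b"hypothetical-key-held-by-trusted-party"

def pseudonymize(name: str) -> str:
    """Replace a citizen's name with a stable, non-reversible identifier."""
    return hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).hexdigest()

# Because the function is deterministic, the same person receives the same
# identifier in every linked data source, so records can be cross-referenced
# without the analysts ever seeing a name.
tax_record = {"id": pseudonymize("Jan Jansen"), "source": "tax"}
benefits_record = {"id": pseudonymize("Jan Jansen"), "source": "benefits"}
assert tax_record["id"] == benefits_record["id"]
```

The point of the sketch is only to show why linkage still works after names are removed: pseudonymization hides identities from the analyst, but it does not prevent profiles from being built about the pseudonymized individual.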

It is still unclear what happens inside this ‘black box’, and the Dutch government has blocked all attempts by concerned parties to shed light on it. In 2017, the Ministry of Social Affairs decided that the risk models it used should be kept secret, reasoning among other things that potential offenders could adapt their behavior if the state disclosed them.

Primarily used in low-income neighborhoods

There’s another problem: SyRI has been used primarily in low-income neighborhoods. This exacerbates bias and discrimination: if the government only runs SyRI's risk analysis in neighborhoods that are already deemed high-risk, it is no wonder that it finds more high-risk citizens there.

SyRI is not necessary

According to Mr Huissen from the Platform Bescherming Burgerrechten, the government does not need this kind of mass surveillance to prevent fraud: “The government already has information about who owns which house, so it could check this before granting the person a rental allowance. For all the big social security fraud scandals we have seen in the past decades, it became clear afterwards that they could have been prevented with simple checks beforehand. That happens far too little. It is tempting to look for solutions in secret algorithms analyzing big data sets, but often the solution is far simpler.”

No fair balance

On 5 February 2020, the Dutch court of The Hague ordered the immediate halt of SyRI because it violates Article 8 of the European Convention on Human Rights, which protects the right to respect for private and family life. Article 8 requires that any legislation strike a “fair balance” between social interests and any intrusion into the private lives of citizens.

SyRI’s goal, or “social interest”, is to prevent and fight fraud. The Dutch state claimed that the SyRI legislation offered sufficient guarantees to do this while protecting the privacy of citizens, but the court disagreed: the legislation is insufficiently transparent and verifiable, the judges wrote, and there are not enough safeguards against privacy intrusions.

The biggest problem with SyRI is not that it aims to battle fraud, but that the system is too opaque. If the government wants to ‘fix’ this problem, it will have to add more transparency.

A new way of dealing with algorithms

Public organizations in the Netherlands have reacted to the court’s decision by reassessing their own algorithmic systems for fraud detection. Tijmen Wisman, chairman of the Platform Bescherming Burgerrechten, hopes that the government will do more: “Just adapting SyRI to be more transparent will still result in information asymmetry. Citizens no longer want to give information to their government if the latter can use this information in all possible ways against them. Data must no longer be allowed to roam freely, but must reside in an authentic data source. Each organization should keep logs of every time their data is consulted, and citizens should be able to easily access these logs. This way it becomes clear to citizens what their data is used for, and they can challenge this use.”
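Wisman's proposal can be sketched in a few lines. This is a hypothetical illustration of the scheme he describes, not an existing Dutch system; the class and field names are invented for the example:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessLog:
    """Illustrative log of every consultation of citizen data at one agency."""
    entries: list = field(default_factory=list)

    def record(self, citizen_id: str, agency: str, purpose: str) -> None:
        # Every consultation is written down: who asked, about whom, and why.
        self.entries.append({
            "citizen_id": citizen_id,
            "agency": agency,
            "purpose": purpose,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def for_citizen(self, citizen_id: str) -> list:
        # A citizen can retrieve exactly the entries that concern them,
        # which makes each use of their data visible and contestable.
        return [e for e in self.entries if e["citizen_id"] == citizen_id]

log = AccessLog()
log.record("NL-123", "Tax Office", "rental allowance check")
log.record("NL-456", "Municipality", "address verification")
my_entries = log.for_citizen("NL-123")
```

The design choice the sketch highlights is the reversal of the information asymmetry: instead of data flowing silently between agencies, each consultation leaves a trace that the data subject can inspect.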

That’s it for this second stop of our AlgoRail. Enjoy this thought-provoking insight with the great view of a Dutch water mill while we continue our journey to Belgium.


This story was shortened by Julia Gundlach. The unabridged story was published on the AlgorithmWatch website.

The blog series AlgoRail is part of the Automating Society Report 2020 by Bertelsmann Stiftung and AlgorithmWatch, which will be published this fall and is coordinated by Dr. Sarah Fischer. In addition to journalistic stories like this one, the report gives an overview of various examples of algorithmic systems as well as current debates, policy responses and key players in 15 countries. A first issue of the report was published in January 2019.


This text is licensed under a Creative Commons Attribution 4.0 International License


