Lots of room for improvement in the use of algorithms by city councils
- 5 December 2025
Today we published a study into the use of algorithms by the ten largest city councils in the Netherlands. It shows that the councils do not meet all requirements of the AVG (the Dutch implementation of the GDPR). We therefore call for registration in the Algoritmeregister (Algorithm Register), a public register in which government bodies can record the algorithms they have in use.
Low transparency
The inventories of which algorithms were in use at the councils we examined were clearly not in good shape. There was also very little experience with public accountability for the use of algorithms. Of the ten councils that received a Woo-verzoek (a request to make government information public) from Bits of Freedom, only the council of Nijmegen fulfilled the request within the statutory time limit. All the other councils failed to provide the existing documents about the processes surrounding the use of algorithms, or failed to do so within the statutory time. This is unacceptable: transparency is one of the prerequisites of the AVG, so these councils do not fulfil their legal duties.
Furthermore, the councils that did respond - the city councils of Amsterdam, Utrecht, Groningen and Nijmegen - did not properly consider why the algorithm should be used. This is worrying. Councils have a lot of data about citizens at their disposal, so it is tempting to use that data with new technologies. We find that this temptation is sometimes irresistible. Councils seem to forget that citizens should not be seen as suspects from the outset.
Vulnerable groups
The research shows that the most complex and invasive algorithms and data analyses are often applied to vulnerable groups: people who receive financial assistance, inhabitants of so-called problem neighbourhoods, and teenagers. The goal of the application is mostly law enforcement. Providing support for the community is often presented as a secondary goal. It seems that this is done to legitimise the use of the algorithm, because in reality little is actually done with the goal of providing support. Especially when vulnerable groups are involved, councils should take a good look at the effects of these invasive methods and whether they fit with their broader task, as a council, of providing support.
"Counsels seem to forget that it should no be the standard to see their citizens as suspects"
Mitigating measures
Some councils are aware of the problems surrounding the use of algorithms and have taken mitigating measures. The council of Groningen established an Ethical Commission Data & Technology to advise on the functioning of the algorithms it has in use. The council uses this advice as a basis for deciding whether to continue using an algorithm.
The city council of Utrecht has developed its own so-called 'impact assessment', which means it evaluates algorithms not only against current law, but also against ethical frameworks. We do have to point out that neither of these councils could provide us with this documentation within the statutory time period.
Our advice
We think that all councils should actively perform impact assessments, including risk analyses of algorithms. We also call for making registration of high-impact algorithms in the Algoritmeregister mandatory. This has already been discussed in the Tweede Kamer (the Dutch House of Representatives), but has not been made into legislation.
Lastly, we advise councils to look at the bigger picture before choosing to use an algorithm, taking the perspective of citizens into consideration. What exactly is the problem that you, as a council, are trying to solve with an algorithm? What are not only the direct, but also the indirect consequences of using it? Is the algorithm the most suitable method to address a given problem, or are there other ways? These questions are not considered enough. Councils have to map their use of algorithms and be able to account for it to their citizens.