Dutch police are to stop “immediately” using an algorithm that predicts whether someone will show violent behaviour, following an investigation earlier this week by the website Follow the Money.
FTM revealed that the police have been using a system, which experts say is ethically and statistically flawed, to select people deemed likely to be violent for a personally targeted approach.
Those selected for the Risicotaxatie Instrument Geweld (RTI-G) programme faced repressive measures that police hoped would stop them offending or offending again.
This could include more frequent arrests and body searches, the confiscation of money and expensive items, and the involvement of social workers to stop brothers and sisters “going the same way”, FTM said.
People selected for the programme are warned that they face extra surveillance, leading at least one subject to challenge the programme in court, and win.
The algorithm was based on factors such as age and sex, criminal history and prior contacts with the police. However, no checks had been carried out to ensure the system was free of bias and that people living in certain areas or with minority roots were not unfairly over-represented, FTM said.
An earlier version of the system, which police say they stopped using in 2017, automatically ranked people with an Antillean, Moroccan or Somalian background as higher risk than people with a Dutch background.
Erasmus University professor Marc Schuilenburg told FTM the revised system is still “based on air” and is “completely unacceptable”.
There are no examples abroad of successful predictive policing technology at an individual level, he said.
The police first told FTM they would review the algorithm but have now decided to stop using it altogether because of “doubts about its usefulness”, the investigative website said.
The Netherlands has been hit by a string of scandals in recent years involving algorithms that discriminate against certain groups, and the data protection watchdog Autoriteit Persoonsgegevens (AP) has started monitoring their use.
Among the cases to hit the headlines is the unregulated use of algorithms by the Dutch tax office to create risk profiles of potential benefit fraudsters, which led to thousands of people being wrongly ordered to pay back benefits.
Student finance body Duo was also caught up in an ethnic profiling scandal, after Investico revealed that students with ethnic minority roots are accused of student loan or grant fraud “noticeably more often” than other students. The finance body’s checks are partly based on algorithms.
In May, it emerged that the foreign affairs ministry has been using a profiling system to analyse the risk posed by people applying for short-stay visas for the Schengen area since 2015.