Probation service used error-ridden algorithms to assess risks


Yet another Dutch government department has become embroiled in a row over the use of algorithms – this time the unit which assesses whether prisoners are likely to re-offend.

On Thursday, justice ministry inspectors published a report criticising the way error-ridden algorithms were used by probation service Reclassering Nederland to assess the risk of re-offending among prisoners who had already been convicted as well as people being held in pre-trial custody.

The research into the justice ministry department is part of a wider study into the government’s use of algorithms in the wake of the childcare benefit and student loan scandals, both of which involved ethnic profiling.

The probation department’s director, Jessica Westerik, told current affairs show Nieuwsuur on Thursday evening that the findings are “very confrontational” and that much needs to be done to ensure the algorithms are used correctly.

The department uses the algorithm in some 44,000 cases a year to help decide whether someone is likely to re-offend, including people being held in jail awaiting trial. The reports are used by judges to determine sentences and to decide whether people should be released from prison early.

The inspectors’ report shows that the system used by the department gets it wrong 20% of the time and that one of the algorithms, Oxrec, fails to meet the standards for government use.

Since 2018, the inspectors said, the Oxrec formulas used to assess people with convictions and suspects had been swapped, drug use was not properly taken into account and serious psychological problems were not included at all.

In addition, the system was based on outdated data about the Swedish prison population rather than data on the Dutch prison population.

This means “in most cases the risk of re-offending was estimated to be lower than it was”, the inspectors said, particularly among drug users and people with serious psychological issues.

“AI and algorithms make it very attractive not to think, to ask fewer questions and to accept what is offered to us,” Delft University professor Cynthia Liem told Nieuwsuur. “You see this happening a lot with technological applications.”

Temporary halt

Reclassering Nederland has stopped using the systems and is now looking into the use of risk assessment programmes in its work.

Government inspectors say the most serious error made by the system is that the risk of re-offending was underestimated, which could have endangered society.

Westerik said she could not rule out that people had been released from jail too early or given longer sentences because of the system. “But I think the likelihood is very low,” she told Nieuwsuur.

In 2020, researchers warned the justice ministry about the use of the Oxrec algorithm, saying that the use of postcodes could indirectly lead to ethnic profiling.

The department had taken that warning “very seriously”, Westerik said, but added there was sufficient scientific evidence at the time for neighbourhood and income to be considered relevant factors.
