Software algorithm may have discriminated student: council


Software designed to stop students cheating while taking exams at home during the coronavirus lockdown may be using discriminatory algorithms, the Dutch human rights council has said in a preliminary ruling.

The case was taken to the council by VU master’s student Robin Pocornie who said she had faced discrimination because the proctoring software did not recognize her when she tried to log in, possibly because of her black skin.

The council has given the university 10 weeks to prove that the software was not discriminatory.

Pocornie said she had tried repeatedly on different occasions to log in to the system and several times was not allowed to answer the questions. Instead the software displayed messages such as ‘room too dark’ or ‘face not found’. The software did work when she shone a lamp directly on her face.

The council said in its preliminary ruling that Pocornie had brought forward enough evidence to indicate the algorithm was discriminatory and that the university had so far failed to produce enough verifiable information to prove the opposite.

Research carried out on behalf of the software provider, which the company says clears the algorithm, has not been made public, and the council has been unable to access it.

The council said this is the first time that someone has been able to make a case for algorithmic discrimination. ‘This might be a preliminary ruling but it is an important milestone in our history of judgements,’ council chairwoman Jacobine Geel said.

The council is urging other people who feel they may have faced discrimination because of an algorithm to come forward.
