
Police using algorithms 'a huge problem' for biases, researcher says

07:58 am on 16 July 2021

A researcher is worried the proliferation of crime-fighting computer algorithms will only make ingrained police bias against Māori worse.


Police have released a new report revealing they use about two dozen algorithms; it assesses the 10 riskiest of them.

Police said the report shows they are using a range of fairly unsophisticated algorithms - ones that lack AI power - in a considered and limited way.

But data expert Dr Karaitiana Taiuru said the algorithms pose a threat.

"It's a huge problem," he said.

"It's well documented by other scholars and researchers, that there are biases in the New Zealand police.

"So now we have all these algorithms that are using data from biased police and biased incidents."

Having read the report, Taiuru said he was now more worried there would be more bias - but also hopeful, because its release showed police wanted to be open about fixing the problem.

The catalyst for the report, police said, was a desire to make sure they are not in breach of the national Algorithm Charter, which they signed up to only last year.

But some of their algorithms have been in use for years: a youth offender screening tool since 2009, and one that produces a daily list of the five highest-risk offenders since 2017.

"Perhaps they shouldn't have signed up [to the Charter] until they basically ensured there was no bias and made sure there was true and proper Te Tiriti consultation and representation," Dr Taiuru said.

The report shows the youth risk screening algorithm has not had its predictive accuracy assessed since 2016.

Police deputy chief executive Mark Evans said they would have a look at that gap, but added the report "doesn't actually say that it's not appropriate".

"Like a lot of organisations, we acknowledge the need to put more effective governance around this work," Evans told RNZ.

However, "none of the advice we've been given gives me particular concern that, in some way, we've used them in a way that's led to unfortunate or disproportionate outcomes for people", Evans added.

The report's lack of mention of how algorithms might generate police bias against Māori specifically was not a fundamental flaw, Evans said.

"The report very clearly talks about the potential unintended or intended consequences of bias against particular communities."

Work was well underway to bring in a better safeguarding framework and better consultation that would include Māori more, he said.

An expert panel set up to advise police about technology, which reviewed the report, said the "devil lies very much in the details" of these safeguards.

"Such steps are indispensable to the safe and ethical deployment of any algorithms," it said.

But algorithms had been developed by police and other agencies for years in a vacuum of rules and leadership, said panel chair Professor Colin Gavaghan.

"The first the Police Commissioner seemed to find out about it when these things hit the headlines - that's just not adequate.

"Yes, in an ideal world it would be been in place for years but we are where we are."

"I've not seen any of the kind of bad outcomes that have certainly be seen in other countries," he said.

However, New Zealand could not just import algorithms from overseas.

"There's absolutely no way that you can just have a sort of a generic approach," Gavaghan said.

"We've been quite forthright about that ... the police have taken that on board."

The panel, in its review of the report, questioned how the consultants made the call about how risky each police algorithm is, and warned police that the ethics of trialling algorithms is very different from actually deploying them.

"Evaluative research requires the consent of participants and consideration of any potential law breaking during the trial."

The panel had this warning, too, about the report's - and police's - stress on how humans, not algorithms, ultimately make the decisions: "Nominal human oversight can offer false reassurance".