It is only a matter of time before a Māori person is wrongfully arrested because of a false match on facial recognition software, a Māori technology expert says.
Police have been working on a $23 million upgrade of their biometric imaging system, run by the US firm Dataworks Plus, hoping to make it easier to match poorer quality images.
In July, a US senator from Ohio accused Dataworks of exacerbating systemic racism through its technology, after African American man Robert Julian-Borchak Williams was wrongly identified by facial recognition software as the perpetrator of a crime he did not commit.
Karaitiana Taiuru, who has completed a PhD in indigenous ethics in data collection, said he was concerned about the risk of similar false positives in Aotearoa, because the system had not been tested on New Zealand faces.
"My concern is we're going to see an increase in false arrests with Māori ... I'm also concerned the system wouldn't have been trained on tā moko, moko kauae so we have no idea how the system will react to that."
A December study by the National Institute of Standards and Technology (NIST), a United States Department of Commerce agency, found higher rates of false positives for Asian, African American and Native groups - which included Pacific Islanders - in facial recognition systems attempting to verify a person's identity against a single photo.
It also found higher rates of false positives for African American women in facial recognition algorithms matching a single photo against many photos.
This type of facial recognition matching was often used to compare a target with a database of people of interest which meant the inaccuracy could result in a false accusation, the NIST study concluded.
Auckland University Professor of Law Jane Kelsey said these issues of accuracy should have been considered by New Zealand police.
"There have been such issues for so long about racism in New Zealand policing, and for them to say that something as controversial or sensitive as this did not require any form of disclosure or any proper evaluations that involved independent assessment against not just privacy rights, but Te Tiriti - that is a really, really bad call."
She said the unauthorised use of facial data failed to consider Māori rights to their own information and how and when it could be used.
"Data is not an abstract, commodified entity - it is whakapapa - and you can't go taking without consent, not just individual but collective whakapapa ... so it's not just who's going to be targeted, but it's about not understanding fundamental issues about Māori data and Māori data sovereignty."
Taiuru said police should have consulted with iwi and Māori technology experts before deciding to proceed with the upgrade.
In a statement, the national manager of criminal investigations, Superintendent Tom Fitzgerald, said police did not consult any outside agencies because it was only an upgrade.
The statement said anyone concerned about the software falsely identifying people could be assured that all matches would be reviewed by a qualified forensic specialist, then peer reviewed by another.
Police have never completed an audit of their biometric imaging system; the statement said its age and limitations meant it had not been routinely used in recent years.