
Blurred lines - the police and facial recognition technology

05:00 am on 24 September 2020


There is growing alarm about the use of facial recognition technology - especially when the police don't tell us what they’re doing in that sphere.


But an outright ban is not the answer - and failing to work on the development of such technology risks holding up things we find useful, such as e-gates at customs.

Today The Detail looks at the fears about facial recognition technology and the gaping hole in regulation worldwide.

Use of the tech is reaching new levels in law enforcement but New Zealand police haven't been forthcoming about it.

The Detail's Jessie Chiang speaks to Gehan Gunasekara, an associate professor at Auckland University's business school. He's also the chair of the Privacy Foundation.

"We have been trying to get to the bottom of that with the police," he says.

"I have to say it hasn't been very helpful, the police haven't been as transparent as we would like."

First, there was the Clearview AI controversy in May this year.

The police ran a trial of the US facial recognition technology without telling the government, the Privacy Commissioner or even their own top boss, Police Commissioner Andrew Coster.

In the end, the police ditched Clearview AI and Coster announced a review to make sure privacy implications are properly considered when it comes to surveillance tech.

But just last month, RNZ reported on police links with Japanese company NEC, which develops the facial recognition programme NeoFace.

The Department of Internal Affairs also has ties to NEC, including a $20 million contract with the company to upgrade the passport system.

Gunasekara shares what concerns him the most about this, and what measures are in place to protect the privacy of New Zealanders.

Gehan Gunasekara Photo: Jessie Chiang

Chiang also speaks to Zak Doffman, the founder and CEO of UK company Digital Barriers, which provides live surveillance tech such as facial recognition.

He's calling for a detailed legal framework in which the tech can operate.

"We develop facial recognition but we're not an advocate for the indiscriminate use of it at all," he says. "We think the industry has gone too far, there are no controls, there is no regulation."

Just last month, the Court of Appeal for England and Wales found that South Wales Police's use of facial recognition technology on live CCTV footage was unlawful.

"In essence what's happened is, the police have gone out and used facial recognition in a way they felt was appropriate, there has been a challenge and because there's no regulation, it's quite a subjective matter to try and interpret," says Doffman.

Ben Bradford, from University College London's Institute for Global City Policing, says the problem is that because there is no specific law governing facial recognition technology, what is and isn't allowed usually depends on whether its use infringes other areas of law, such as privacy.

He also talks to The Detail about the "chilling effect" and the importance of training frontline police to use the tech appropriately.