Facial recognition technology is "dangerously inaccurate"


Most of the matches generated by the automated facial recognition systems used by police are inaccurate, making them a waste of taxpayers' money, according to a new study.

Freedom of Information requests by the civil liberties group Big Brother Watch have shown that 91 per cent of the so-called "matches" found by South Wales Police's technology were wrong.

South Wales Police, whose system produced 2,685 "matches" between May 2017 and March 2018, 2,451 of which were false alarms, said its accuracy has improved over time.
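As a quick sanity check on that headline figure, here is a minimal sketch using only the totals reported above (not data from the study itself):

```python
# Sanity check on the reported South Wales Police figures:
# 2,685 "matches" between May 2017 and March 2018, of which
# 2,451 were false alarms.
total_matches = 2685
false_alarms = 2451

false_match_rate = false_alarms / total_matches
print(f"False match rate: {false_match_rate:.1%}")  # ~91.3 per cent
```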

The Information Commissioner, Elizabeth Denham, says she will also consider recent reports by civil society groups, including Big Brother Watch and the Electronic Frontier Foundation.

The ability, as police see it, to track suspects anywhere there is a camera offers a big leap in crime-fighting capability: finding vulnerable or missing people, hunting terrorism suspects, and keeping tabs on one-time suspects whom officers lack the resources to watch in person. South Wales Police has been granted more than £2m in government funding to test the system, yet in the force's own trials it has proved 91 per cent inaccurate.


A member of staff from the human rights organisation Liberty who observed the Met Police's operation at Notting Hill Carnival a year ago claimed the technology led to at least 35 false positives, five people being unduly stopped and one wrongful arrest.

"Facial recognition has always been feared as a feature of a future authoritarian society, with its potential to turn CCTV cameras into identity checkpoints, creating a world where citizens are intensively watched and tracked", it said.

"It is plainly disproportionate to deploy a technology by which the face of every passer-by is analysed, mapped and their identity checked".

The Biometrics Commissioner, Paul Wiles, raised concerns a year ago about a database of custody photographs that holds 19 million images, hundreds of thousands of them belonging to people who were later acquitted or never charged with a crime. Despite these concerns, the Metropolitan Police is planning seven more deployments of the technology this year.

Jennifer Lynch, senior staff attorney at the EFF, warned of the lack of oversight of biometric systems: "The adoption of technologies like these is occurring without meaningful oversight, without proper accuracy testing, and without the enactment of legal protections to prevent misuse".


Those custody images can then end up on the Police National Database and be turned into facial biometrics used to identify individuals via specialised software.

IT Pro has approached both the Met and South Wales Police for comment.

For instance, the developer of a content-filtering AI system may claim to identify a high percentage of terrorist content on the web, but only when tested on material already known to be terrorist content. That measures how many real targets the system catches, not how often its flags are wrong when it scans everything else.
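To make that statistical point concrete, here is a sketch of the underlying base-rate problem, with entirely hypothetical numbers: a system can catch almost every real target and still be wrong about most of the faces (or pages) it flags, simply because real targets are rare in the crowd.

```python
# Hypothetical illustration of the base-rate problem: even a system
# that flags 99% of real targets and only 1% of everyone else will
# produce mostly false alarms when targets are 1 in 1,000.
def alert_precision(true_positive_rate: float,
                    false_positive_rate: float,
                    prevalence: float) -> float:
    """Fraction of alerts that are genuine matches (Bayes' rule)."""
    genuine = true_positive_rate * prevalence
    spurious = false_positive_rate * (1 - prevalence)
    return genuine / (genuine + spurious)

print(f"{alert_precision(0.99, 0.01, 0.001):.1%} of alerts are genuine")
# -> 9.0%; the other ~91% are false alarms, echoing the reported rate.
```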

The Metropolitan Police said that "all alerts against the watch list are deleted after 30 days", adding that any "faces in the video stream that do not generate an alert are deleted immediately".

"It is deeply disturbing and undemocratic that police are using technology that is nearly entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms".

