Home Office Admits Facial Recognition Tech Issue with Black and Asian Subjects

Essential brief

Key facts

Facial recognition technology used by UK police shows higher false positive rates for black and Asian individuals, especially black women.
The Home Office has acknowledged the bias and is testing a new algorithm intended to reduce these disparities.
Police and crime commissioners and civil rights groups call for stronger safeguards, transparency, and public consultation before expanding use.
Concerns remain about the technology's deployment in public spaces and its potential impact on communities of colour.
Independent reviews by the police inspectorate and the forensic science regulator are underway to assess the effectiveness of bias mitigation.

The UK Home Office has acknowledged significant issues with its facial recognition technology, revealing that, at certain settings, it is more prone to incorrectly identifying black and Asian individuals than white people.

This admission follows testing by the National Physical Laboratory (NPL), which evaluated the retrospective facial recognition tools used to search the Police National Database.

The NPL found that the false positive identification rate (FPIR) was substantially higher for Asian subjects (4.0%) and black subjects (5.5%) than for white subjects (0.04%).

Notably, black women experienced an even higher false positive rate of 9.9%, compared to 0.4% for black men.
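The article does not spell out how FPIR is calculated, but it is conventionally the share of searches for people who are not on the watchlist that nonetheless return a candidate match. Below is a minimal sketch under that assumed definition; the search counts are hypothetical, chosen only to reproduce the percentages reported above, and do not reflect the NPL's actual test data.

```python
# Illustrative sketch of a per-group false positive identification rate
# (FPIR). All counts are hypothetical, not the NPL's figures.

def fpir(false_positives: int, non_mated_searches: int) -> float:
    """Share of searches for people NOT on the watchlist that
    still returned at least one candidate match."""
    return false_positives / non_mated_searches

# Hypothetical counts per 10,000 non-mated searches, picked to
# match the rates reported in the article.
groups = {
    "white subjects": fpir(4, 10_000),    # 0.04%
    "Asian subjects": fpir(400, 10_000),  # 4.0%
    "black subjects": fpir(550, 10_000),  # 5.5%
    "black women":    fpir(990, 10_000),  # 9.9%
}

for group, rate in groups.items():
    print(f"{group}: {rate:.2%}")
```

The sketch makes the disparity concrete: at the same threshold, a black woman searched against the database would be wrongly flagged roughly 250 times as often as a white subject under these assumed figures.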

These findings have raised concerns about inherent biases in the technology, prompting calls from police and crime commissioners for stronger safeguards and caution against expanding its use nationally.

The Association of Police and Crime Commissioners criticized the lack of transparency, questioning why these results were not shared earlier with affected communities and stakeholders.

The government has launched a 10-week public consultation to explore broader applications of facial recognition, including accessing passport and driving licence databases.

Civil rights groups, such as Liberty, warn that the racial bias could lead to serious consequences for people of colour, urging a halt to the technology's rollout until robust oversight and transparency measures are established.

Former cabinet minister David Davis also expressed alarm over plans to deploy the technology in public spaces such as shopping centres and transport hubs, calling for a parliamentary debate.

Officials defend the technology's use as vital for catching serious offenders and point to existing manual safeguards, which require trained officers to visually verify every match.

The Home Office stated it has procured a new algorithm that showed no statistically significant bias in testing, with further evaluations planned.

Additionally, the police inspectorate and the forensic science regulator have been asked to review law enforcement's use of facial recognition to ensure bias is being effectively mitigated.

This development highlights the ongoing challenges of balancing technological innovation in policing with protecting civil liberties and preventing discrimination.