Urgent Clarity Sought Over Racial Bias in UK Police Facial Recognition Technology
Essential brief
The UK's data protection watchdog, the Information Commissioner's Office (ICO), has requested urgent clarification from the Home Office regarding racial bias found in police facial recognition technology.
This follows testing by the National Physical Laboratory (NPL), which revealed that the technology, used to identify serious offenders through the police national database, is more likely to incorrectly match black and Asian individuals than white individuals.
Specifically, the false positive identification rate (FPIR) was 0.04% for white subjects, compared with 4.0% for Asian subjects and 5.5% for black subjects.
The bias was particularly pronounced among black women, with an FPIR of 9.9%, far higher than the 0.4% for black men.
The Home Office acknowledged these findings and admitted the technology was more prone to errors for certain demographic groups.
In response, the ICO has emphasized the importance of public confidence in the use of such technology and warned that perceptions of bias could undermine trust.
The watchdog is considering enforcement actions, which could include legally binding orders to halt the technology's use or fines, while also offering to collaborate with the Home Office and police to improve the system.
Police and crime commissioners have expressed concern over the bias and urged caution over plans to expand the technology nationally, which could see cameras deployed in public spaces such as shopping centers and transport hubs without sufficient safeguards.
The Home Office stated it is taking the issue seriously, having procured and tested a new algorithm that reportedly shows no statistically significant bias.
Additionally, the police inspectorate and forensic science regulator have been asked to review the use of facial recognition in law enforcement to assess the effectiveness of mitigation measures.
This development comes shortly after the policing minister described the technology as the "biggest breakthrough since DNA matching," highlighting the tension between technological advancement and ethical concerns.
Facial recognition technology scans faces against watchlists of known or wanted criminals, either in real time or retrospectively, to aid police investigations.
The ICO's intervention underscores the need for transparency and fairness in deploying such powerful tools, especially given their potential impact on minority communities.