Tech Beetle briefing GB

New UK Law Criminalises Non-Consensual Deepfake Images Amid Calls for Stronger Protections

Essential brief

Key facts

The UK has criminalised the creation of non-consensual AI-generated explicit images under the Data (Use and Access) Act 2025.
Victims and campaigners call for stronger protections, including civil remedies like takedown orders and better support services.
Sex workers face unique challenges, as misuse of their commercial sexual images is often treated as a copyright issue rather than abuse, limiting their access to justice.
The government plans to ban 'nudification' apps and prioritise non-consensual deepfake offences under the Online Safety Act to enhance platform responsibilities.
Despite legal progress, online abuse remains widespread, necessitating ongoing efforts to protect victims from digital exploitation.


The UK has introduced a new legal offence criminalising the creation of non-consensual AI-generated explicit images, marking a significant step in combating deepfake abuse. This amendment to the Data (Use and Access) Act 2025 makes it illegal to create intimate deepfake images without consent. Campaigners and victims, however, argue that while the law is a positive development, it does not go far enough in providing comprehensive protection and justice for victims.

Victims like Jodie, who in 2021 discovered deepfake pornography made using her likeness, have shared their difficult journeys to justice. Jodie and others testified against a perpetrator who had taken women's images from social media, manipulated them, and posted them to pornographic websites; he received a prison sentence. She emphasised that prior to this law, there was no adequate legal framework to address the harm caused by such AI-generated abuse. Although the law received royal assent in July 2025, enforcement was delayed until February 2026, a lag that campaigners say left more victims without recourse.

Beyond criminalisation, campaigners from the Stop Image-Based Abuse coalition are calling for expanded civil remedies, including takedown orders that compel platforms and devices to remove abusive content promptly. They also advocate for improved relationships and sex education to raise awareness about image-based abuse and for increased funding for specialist support services like the Revenge Porn Helpline. These measures aim to address the broader ecosystem of harm caused by intimate image abuse.

Sex workers, represented by advocates like Madelaine Thomas, highlight additional gaps in the law. Thomas, who has endured the non-consensual sharing of her commercial sexual images for years, points out that misuse of such content is often treated solely as a copyright issue rather than as abuse. This legal framing limits victims' access to justice and support, underscoring the need for laws that recognise the distinct harms faced by sex workers.

The government has responded by promising further action against the technology enabling such abuse. The Ministry of Justice confirmed that sharing intimate deepfakes was already illegal, and that creating them is now also a criminal offence. Additionally, the government plans to ban 'nudification' apps that facilitate the creation of deepfake pornography and to designate non-consensual sexual deepfakes a priority offence under the Online Safety Act, imposing stricter duties on online platforms to proactively prevent the spread of such harmful content.

Despite these advances, the persistence of online abuse remains alarming, with statistics indicating that one in three women in the UK experiences online abuse. The new law represents progress but also highlights the ongoing challenges of protecting individuals from emerging forms of digital exploitation. Continued advocacy and legislative refinement will be crucial to ensuring victims receive comprehensive protection and support in the evolving digital landscape.