
Britain said on Thursday it would amend legislation passing through parliament to create a legal duty for major platforms to take down nonconsensual intimate images within two days of being reported.
The government says the steps will improve safeguards for women and girls amid a global push to curb abuse, at a time when privately sent images can be shared online in seconds and AI-based tools can instantly create sexually explicit images.
It is already illegal in Britain to post such images online, but some victims have reported difficulty getting platforms to permanently remove them.
“The online world is the frontline of the 21st century battle against violence against women and girls,” Prime Minister Keir Starmer said in a statement.
Nonconsensual images fuelling online safety debate
A surge in nonconsensual images has fed Britain’s wider debate over online safety. Ministers are examining whether to restrict social media access for under-16s, echoing Australia’s ban.
Britain said its media regulator Ofcom was considering treating the sharing of illegal intimate images with the same severity as child sexual abuse and terrorist content.
The government said victims would only need to report material once, with platforms expected to remove the same image across services and prevent re-uploads.
Any fines for failing to do so could be calculated against a platform’s “Qualifying Worldwide Revenue”, a measure used by Ofcom that covers income generated anywhere in the world from the parts of the service it regulates.
In a separate statement, Ofcom said it would fast-track a decision on new rules requiring platforms to use “hash-matching” tools to block illegal intimate images at source. The decision would come in May, and new measures could come into effect this summer.