February 23, 2026
The Government has announced that tech companies will be required to take down, within 48 hours, intimate images shared without a victim’s consent.
The new requirement will be introduced through an amendment to the Crime and Policing Bill. According to the Government’s press release, companies will be legally required to remove non-consensual intimate images within 48 hours of the images being flagged to them.
The announcement comes only days after the Prime Minister delivered a speech outlining a range of other measures that the Government would be introducing to strengthen online safety, particularly for children. As discussed here, these include tabling an amendment to the Crime and Policing Bill to bring chatbots within the scope of the Online Safety Act 2023, and introducing legislation that would fast-track any recommendations from the consultation on protecting young people online.
The new obligation on tech companies also sits alongside measures announced at the start of this year: the coming into force of the criminal offence of creating, or requesting the creation of, non-consensual intimate images, and its designation as a priority offence under the Online Safety Act 2023 (discussed here), as well as the ban on so-called ‘nudification’ apps (discussed here).
Commenting on the new rules, the Technology Secretary, Liz Kendall, said, “the days of tech firms having a free pass are over. Because of the action we are taking, platforms must now find and remove intimate images shared without consent within a maximum of 48 hours. No woman should have to chase platform after platform, waiting days for an image to come down. Under this government, you report once and you’re protected everywhere. The internet must be a space where women and girls feel safe, respected, and able to thrive”.
To read more, click here.