Tech Firms Face 48-Hour Deadline To Remove Abusive Images

The UK government is moving to force tech platforms to remove non-consensual intimate images within 48 hours of being flagged or face fines of up to 10 percent of global turnover.

The 48-Hour Rule

On 19 February 2026, ministers confirmed an amendment to the Crime and Policing Bill that will place a strict 48-hour takedown duty on platforms hosting intimate images shared without consent.

Under the proposed law, any non-consensual intimate image reported to a platform must be removed within two days. Failure to comply could trigger fines of up to 10 percent of qualifying worldwide revenue or, in extreme cases, service blocking in the UK.

The government is also clear that victims should not have to chase individual platforms. The intention is that an image will only need to be reported once, with removal applied across multiple services and future uploads automatically blocked.

Prime Minister Sir Keir Starmer said: “The online world is the frontline of the 21st century battle against violence against women and girls. That’s why my government is taking urgent action against chatbots and ‘nudification’ tools.

“Today we are going further, putting companies on notice so that any non-consensual image is taken down in under 48 hours.”

Why The Government Is Escalating This

Ministers have highlighted intimate image abuse as part of a wider violence against women and girls strategy, which the government has labelled a national emergency.

Technology Secretary Liz Kendall said: “The days of tech firms having a free pass are over. Because of the action we are taking, platforms must now find and remove intimate images shared without consent within a maximum of 48 hours.

“No woman should have to chase platform after platform, waiting days for an image to come down. Under this government, you report once and you’re protected everywhere.”

The government has also signalled that non-consensual intimate images will become a “priority offence” under the Online Safety Act. Ofcom is expected to treat such material with the same severity as child sexual abuse content and terrorist material, including exploring digital marking techniques so that flagged images are automatically detected and blocked on re-upload.
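The "digital marking" approach described above is typically built on image hashing: once an image is confirmed as abusive, its fingerprint joins a blocklist that every new upload is checked against. The sketch below is illustrative only, using an exact cryptographic hash for simplicity; real systems such as PhotoDNA or PDQ use perceptual hashes that tolerate re-encoding and minor edits, and the class and method names here are hypothetical.

```python
import hashlib


class UploadScreen:
    """Toy blocklist matcher: exact-match hashing of known flagged images.

    Production systems use perceptual hashing so that resized or
    re-compressed copies still match; this sketch only catches
    byte-identical re-uploads.
    """

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    @staticmethod
    def _digest(image_bytes: bytes) -> str:
        # Stable fingerprint of the raw image bytes.
        return hashlib.sha256(image_bytes).hexdigest()

    def flag(self, image_bytes: bytes) -> None:
        """Record an image confirmed as non-consensual."""
        self._blocked.add(self._digest(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        """Reject any byte-identical re-upload of a flagged image."""
        return self._digest(image_bytes) not in self._blocked


screen = UploadScreen()
screen.flag(b"example-flagged-image")
print(screen.allow_upload(b"example-flagged-image"))  # False: blocked
print(screen.allow_upload(b"some-other-image"))       # True
```

Because the blocklist stores only hashes, the flagged image itself never needs to be retained or shared between services, which is one reason hash-sharing schemes are favoured for cross-platform coordination.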

Internet service providers may also receive guidance on blocking access to rogue websites that fall outside the reach of mainstream regulation but host abusive content.

What This Means For Platforms

For large social media firms, messaging services and content hosts, the message from government is unambiguous: platforms must act fast.

The 48-hour window will require robust detection systems, clear reporting mechanisms and sufficient human moderation capacity to assess complex cases. Automated tools may help, particularly where digital fingerprints are applied to known abusive material, yet borderline cases will still require judgement.
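Operationally, a fixed takedown window means every report needs a timestamp and a hard deadline that triage queues can sort and alert on. A minimal sketch, assuming the clock starts when the report is received (the law's exact trigger point would need to be confirmed against Ofcom guidance):

```python
from datetime import datetime, timedelta, timezone

# Assumed statutory window; the trigger point is an assumption here.
TAKEDOWN_WINDOW = timedelta(hours=48)


def takedown_deadline(reported_at: datetime) -> datetime:
    """Hard deadline by which the flagged image must be removed."""
    return reported_at + TAKEDOWN_WINDOW


def is_overdue(reported_at: datetime, now: datetime) -> bool:
    """True once the takedown window has elapsed without removal."""
    return now > takedown_deadline(reported_at)


reported = datetime(2026, 2, 19, 9, 0, tzinfo=timezone.utc)
print(takedown_deadline(reported))                           # 2026-02-21 09:00:00+00:00
print(is_overdue(reported, reported + timedelta(hours=47)))  # False
print(is_overdue(reported, reported + timedelta(hours=49)))  # True
```

Timezone-aware timestamps matter here: a moderation team spread across regions must compute the same deadline regardless of local clocks, so storing report times in UTC avoids off-by-hours disputes.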

The financial stakes are high. A 10 percent global revenue fine is significant for multinational platforms, and the threat of service blocking in the UK raises further commercial risk.

There are also operational challenges to consider. Images may be edited, cropped or slightly altered to evade automated detection. Smaller platforms may lack the infrastructure of larger tech companies. Critics argue that strict timelines could lead to over-removal, particularly where context is disputed.

Civil liberties groups have historically warned that rapid takedown mandates risk curbing legitimate expression if not carefully implemented. Platforms will need clear guidance from Ofcom on evidential thresholds and appeals processes.

What Does This Mean For Your Business?

The impact of this measure extends beyond consumer social media. Any UK business that operates user-generated content features, community forums, file sharing or messaging functionality will need to understand its exposure. If intimate content is hosted or shared on a corporate platform, the 48-hour rule will apply once it is flagged.

Even organisations that don’t host content directly need to pay attention. Investors, customers and partners now expect clear and proactive safeguards against online abuse, and there is far less tolerance for getting this wrong.

This law is also designed to reinforce a broader compliance trend. The Online Safety Act already imposes duties of care on platforms, and this amendment tightens expectations around response time and cross-platform coordination.

For SMEs building apps or digital services, moderation strategy can no longer be an afterthought. Clear reporting channels, defined internal processes and documented escalation routes will be essential.
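A "documented escalation route" can be as simple as a report record that tracks status transitions with an audit trail, so that every decision on a flagged item is timestamped and attributable. A minimal sketch, with hypothetical status names (real workflows would map onto whatever evidential thresholds Ofcom sets):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Status(Enum):
    RECEIVED = "received"
    UNDER_REVIEW = "under_review"
    REMOVED = "removed"
    ESCALATED = "escalated"  # borderline case routed to a senior moderator


@dataclass
class AbuseReport:
    """One flagged item, with an audit trail of every status change."""
    content_id: str
    reported_at: datetime
    status: Status = Status.RECEIVED
    history: list[tuple[Status, Status, str]] = field(default_factory=list)

    def transition(self, new_status: Status, note: str = "") -> None:
        # Record the old state, new state and reason for the audit log.
        self.history.append((self.status, new_status, note))
        self.status = new_status


report = AbuseReport("img-123", datetime.now(timezone.utc))
report.transition(Status.UNDER_REVIEW, "auto-triage passed to moderator")
report.transition(Status.REMOVED, "confirmed non-consensual")
print(report.status)         # Status.REMOVED
print(len(report.history))   # 2
```

Keeping the history alongside the record makes it straightforward to demonstrate to a regulator that each report moved through a defined process within the deadline.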

This legislation marks a significant escalation in how the UK treats online intimate image abuse. It shifts responsibility firmly onto platforms and signals that enforcement will be measured not only by policy statements, but by speed and action.