This change, brought in under the Online Safety Act, is an important shift away from a ‘report-and-react’ model and towards safety by design. Instead of expecting people - including children - to absorb the harm and then report it, platforms must take steps to stop this material appearing at all.
At the Marie Collins Foundation (MCF), we welcome measures that reduce exposure to sexual harassment and abuse online, and we will be watching closely to ensure this is implemented in ways that meaningfully protect children, not just on paper.
Why this matters
Unsolicited sexual images are not just a ‘content moderation’ issue - they are a form of sexual harassment and violation that causes real harm.
The Government’s announcement highlights the scale of the problem: one in three teenage girls has received unsolicited sexual images. For children and teenagers, this can normalise sexual boundary violations, create shame, and make it harder to seek support.
What platforms are required to do now
Under the new requirements, platforms must take proactive steps to prevent unsolicited nudes reaching users, rather than relying on users to report after the harm occurs.
Tech companies could tackle these images in several ways: for example, by using automated systems that pre-emptively detect and hide them, by strengthening moderation tools, and by applying stricter content policies.
Some services have already shown what prevention can look like. The dating app Bumble, for example, previously launched a feature that automatically detects and blurs nudity in chat images, giving the recipient control over whether to view, block, or report it.
What we’ll be looking for next: implementation, evidence, and enforcement
Today’s change designates cyberflashing as a ‘priority offence’ under the Online Safety Act, bringing it within the Act’s Illegal Harms Code.
The key question now is how consistently and effectively platforms implement these duties. From MCF’s perspective, the next phase must prioritise consistent implementation, evidence that preventative measures actually work, and robust enforcement.
The Online Safety Act includes significant sanctions for non-compliance, including fines of up to 10% of qualifying worldwide revenue, and potential service restrictions.
Need support or advice?
If you are concerned that a child may have been harmed online - including through unwanted sexual content - the Marie Collins Foundation (MCF) can provide specialist advice on trauma-aware, child-centred responses and support pathways. Call us on 01765 688827.
You can also access practical online safety advice and reporting routes via CEOP’s Safety Centre (Child Exploitation and Online Protection Command), which provides guidance for parents and carers and ways to report concerns about online child sexual abuse:
https://www.ceop.police.uk/safety-centre/