This article first appeared as part of the Daily Sensemaker newsletter – one story a day to make sense of the world. To receive it in your inbox, featuring content exclusive to the newsletter, sign up for free here.
Children in the UK have reported a record number of online sextortion attempts.
So what? Sextortion is not new, but it is a growing problem. Part of a broader “scamdemic” of phishing and deception, the crime

• blends grooming and fraud into a single, repeatable playbook;
• exposes the limits of platform-based moderation, which often intervenes too late; and
• raises questions about whether responsibility should sit with parents, platforms, or with the devices and infrastructure that enable the abuse.
How it works. The initial contact is targeted and personal, often through accounts designed to mimic the victim’s peers. Victims are drawn in gradually and persuaded to share sexual images or videos; the perpetrator then threatens to release the material unless a ransom is paid.
By the numbers. Cases of sextortion reported to Report Remove, which is part of Childline, by under-18s in the UK have risen by 66% in the past year, with more than 60% involving highly explicit material classified as child sexual abuse. The charity said it received nine reports a week in 2025.
This is dangerous. Sextortion has been linked to the deaths of several British teenagers. The case of 16-year-old Murray Dowey has led his parents to pursue legal action against Meta. They say that stronger safeguards might have prevented the chain of events that led to his suicide.
Tip of the iceberg. Kerry Smith, head of the Internet Watch Foundation, said it was hard to fathom the true impact of the crimes but that the threat was “growing”.
The fix. Campaigners are beginning to argue that intervention needs to happen at the level of the device. This could mean nudity-detection systems built into operating systems controlled by Apple and Google, capable of blocking explicit image sharing before it leaves the phone.
What about the law? It is not effective enough. The issue sits squarely within the scope of the Online Safety Act, which placed new duties on online platforms to protect users, particularly children. But that platform-centred approach misses the fact that sextortion often occurs in private messages, disappearing exchanges, or across multiple services at once.
Steady on. Expanding safeguards to devices themselves would raise new concerns about false positives, overreach and digital surveillance. Unanswered questions include who decides

• what exactly is flagged;
• what is stored as sensitive data; and
• how that data is protected from public view.
Further challenges. Tech firms operate globally, so any UK-specific legislation risks being partial or avoided altogether. Criminal networks typically adapt faster than laws can be written.
An emergency. Ministers have signalled urgency in safeguarding children. Proposals have already circulated that would require devices sold in the UK to include protections against explicit image sharing. But whether that becomes law, and how it is enforced, is uncertain.
But first… policymakers have to decide who bears the burden of responsibility. Possibilities include parents, platforms, devices, or laws that are currently not fit for purpose.