Just as Keir Starmer’s government is beginning to grapple with the need for a social media ban for under-16s, it is becoming clear that we are poised to act too late when it comes to chatbots. AI, like social media before it, is going to have vast health and social consequences, but the government’s approach is to wait and see.
At a meeting of social media bosses last week, the prime minister said the current state of regulation – or lack of it – of children’s access to their platforms “cannot be allowed to stand”. It’s about time.
A social media ban for under-16s has been pioneered in Australia. Bans are being established in France, Spain, Indonesia, Malaysia and elsewhere. Meta and Google have meanwhile been fined $6m for pushing addictive apps that they knew could damage young users’ mental health. Despite this, Starmer himself is equivocal about a ban. He shouldn’t be. The research linking addictive apps to depression and self-harm is robust. The House of Lords will have a chance to vote for one as part of the children’s wellbeing and schools bill on Monday. Starmer should see soon enough that the time has come to act.
Even then, his government will be fighting the last mental health war – and losing ground in the next. Addictive bots can exploit weak points in the human psyche quickly and remorselessly, causing their victims to spiral into delusions of romance, power and brilliance, or troughs of despair. Critics of AI regulation say it will stifle commercial opportunity and prove beyond British jurisdiction. It is unclear who in Whitehall is responsible for writing the rules of engagement, yet rules are essential. The government needs to act, and it can.
The UK has an Online Safety Act, but it currently does not apply to AI because it only governs interactions between people. First, parliament can close this loophole by assigning liability for bots’ harmful actions to the companies behind them. They will resist, but Australia’s experience is that the major players will comply if the law is clear and fair. Second, Starmer needs to make clear who is in charge of AI regulation. A team responsible for AI safety should be formed, drawing on expertise from across government and industry, to issue safety guidelines that are updated regularly. The alternative is to leave mental health at the mercy of digital product managers who can control bots’ power over millions as if by turning dials.
AI is changing how we learn, think, interact, design, do business and fill long leisure hours, all faster than we can comprehend. It cannot be allowed to trample on national governments, or personal health.
Farage, in plain sight
Nigel Farage’s investment in the Bitcoin trading company Stack BTC is shocking. More worrying, no one seems that shocked. The leader of Reform UK bought his shares after months of promising lighter crypto regulation; understandably, the crypto crowd likes to applaud Farage as the champion of the money of the future. They say establishment has-beens don’t understand the transformative power of digital currencies to enable wealth creation and growth. But that’s not the point. The point is that the referee shouldn’t be able to bet on the game. Or, in fact, get paid by the players.
In the past, even the perception of doing so would have been political suicide. Transparency has been a restraint on any temptation MPs might have felt to feather their own nests. What happens when an MP is unrestrained? Since he bought the shares, Farage has made a promotional video to boost their market value. Deliberately leveraging his public profile to make himself richer, he has flipped the understanding of transparency, monetising what was once stigmatising.
The rules aimed at stopping this are inadequate. Farage operates almost exclusively outside parliament, and the regulations only really apply inside it. That needs to change urgently, before an unimaginably wealthy industry captures a whole political class in this country, as we have seen it do in the United States.