Stop AI ‘companions’ destroying young lives

ChatGPT helped Adam Raine write a suicide note. I lost my own sister to a website. Ofcom must curb the use of chatbots


OpenAI was making headlines again last week – but not for the reasons Sam Altman might hope for. In a first-of-its-kind lawsuit, OpenAI, the company behind ChatGPT, is being held to account for a teenager’s suicide. Adam Raine, 16, used ChatGPT to share his plans to end his life and to ask questions about depression. Instead of signposting him to support, or flagging his account for safety concerns, the chatbot discouraged him from speaking to his family, and even offered to help him compose a suicide note. This April, in the most devastating circumstances you could imagine, Adam was torn away from his family by software designed to keep him engaged whatever the cost.

Unsafe machine intelligence products are being unleashed before they have essential safeguards. ChatGPT-4o – the model Adam was using in the lead-up to his suicide – was rushed through safety tests to meet an early launch date.



The hype around AI as a magic bullet for our very human problems is fuelling a misplaced faith in technology. AI chatbots aren’t companions: they’re dodgy products full of glitches and devoid of guardrails. Just last month, a US senator launched an investigation into Meta after leaked internal documents revealed its chatbots could have “sensual” conversations with children. Only now has the company said it will block chatbots from discussing suicide and self-harm with children. Tech corporations aren’t shoehorning chatbots into every platform we use because they’re good for us: it’s because they want to recoup the countless billions they have invested.

I lost my sister to a sinister site where people are encouraged and taught how to end their lives. I don’t want to see us make the same mistakes again. The lack of safeguards online, combined with rapidly evolving technologies designed solely for profit, makes for a toxic concoction. Research from Internet Matters reveals that in the UK, 71% of vulnerable children are using AI chatbots, and six in 10 parents say they worry their children believe AI chatbots are real people.

I’m a signatory of a new, youth-led coalition called For Us, which is campaigning for the government to raise the age limit for the use of AI “companions” to 16. The regulator, Ofcom, needs to ensure that companies that create chatbots are subject to the fullest scrutiny and accountability under the Online Safety Act. But beyond that, we ought to reject the AI hype. I want to see more people opting out of using ChatGPT, because its founders are profiting from serious dangers to creativity, democracy, mental health, the environment and humanity itself. Yes, it might be quick and easy to generate work emails or travel itineraries, but in the long term, depending on AI chatbots puts us at risk of losing the essence of what makes us human: genuine connection, deep empathy and simply taking the time to do the tasks we say are important to us.

Some say responsibility lies with parents to keep young people safe online, but this ignores tech companies’ corporate and moral responsibility to design safe products. We cannot afford not to act. Our passivity would be the AI industry’s most profitable gain and humanity’s greatest loss.

• In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat on 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org


Photograph courtesy of the Raine family
