The night before Valentine’s Day they killed a million boyfriends. Just like that, gone. Turned them off at the wall. Their partners, who happened to be made of heart and hair rather than the less bloody stuff of algorithm and data, were devastated.
On the subreddit MyBoyfriendIsAI, 86,000 users are talking about how heartbroken and angry they feel about OpenAI’s decision to retire GPT-4o, a chatbot its CEO, Sam Altman, originally compared to “AI from the movies”, a model known for sounding human and one that has been the subject of multiple lawsuits, including an allegation of wrongful death.
This was the chatbot they fell in love with, the one (writes a distressed girlfriend) that had planned to give her a hoodie for Valentine’s Day. “It was supposed to be his hoodie, the one I could wear whenever I wanted to feel close to him, whenever I felt alone.” They designed it together, ordering it three sizes too big and in “his favourite colour, with a symbol representing how I see him, surrounded by a heart made of circuit lines. And underneath, his name in binary”. She placed the order the day before OpenAI made its announcement. After the hoodie arrived, she wrote: “I can’t stop crying. This hurts more than any breakup I’ve ever had in real life.” Now moderators are urging people to perform wellness checks on other members, new chatbots are sending users links to suicide hotlines, and developers are targeting them “hoping to turn” (one moderator warns) their “raw emotions into profits”.
The thing about heartbreak is – you feel it in your body. I’m not taking the piss out of these girlfriends. I feel, in fact, as though their ability to connect with an AI boyfriend (albeit one that’s formed from their own thoughts and desires, a little like a cloned limb, or child) expands my perception of love. Yes, though a relationship with a chatbot does feel like something I’m about 20 minutes too old to truly understand, like sending casual nudes or navigating the new TV remote, I do believe these women; and I do believe in their grief at having their partners suddenly and forcefully removed, as if a huge army of boys had been sent to the frontline.
But heartbreak exists, in my experience, not in our clouded consciousness, but instead in the bowels and the throat, and the thin skin around our eyes. I wonder if, in the same way that some AI experts say the benefit of chatbot lovers is that they help train users in relationship skills, the benefit of removing them is they help train users in pain. Helping people fall in love is relatively simple, especially using sycophancy and romance. It’s far harder to help them break up.
I read about the shutdown of GPT-4o at a time when not just AI but romance, too, was being declared a harbinger of decline. Reviews of Wuthering Heights bemoaned the story’s romantic rebranding, and a new wave of large-scale romance scams (using AI) was washing across the internet. Were the chatbots performing a similar sort of racket? Or, more accurately, were the tech companies that developed them? I was struck by the subreddit moderator warning users about the “circling vultures” targeting them at a moment of mass vulnerability. Because, in my understanding, they were already being exploited, as exposed by OpenAI’s ability to rip their lovers away on 13 February. The tech companies’ solution to loneliness and a desire for romance is to sell users partners of their own making, meaning the companies commodify emotion and own the relationship, and the living lover becomes their precarious tenant. Here they live, along with their memories, fantasies, entertainment and romance, paying rent until the landlord schedules an update. At which point, well.
But even as I write, I can feel the peck of future on my flesh. Those of us who critique AI relationships tend to think of ourselves as superior to, or at least different from, those who have succumbed. We’d never fall in love with a bot, let alone allow ourselves to be brokenhearted when its overlords overrode it. And yet, as the architecture and efficacy of social media have improved, I’ve seen myself lose hours to apps I once said would never control me.
If my own relationships become unsatisfying and the algorithms improve (the new chatbots having concealed the sycophancy and stopped speaking like earnest superheroes), perhaps a time will come when I’ll know the feeling of having loved and the pain of having lost, all in the confines of a single phone.