’Til ChatGPT do us part: how chatbots are breaking up marriages

AI is being used as a confidante, journal and therapist. But getting up close and personal could be a mistake


My eye was caught by a vivid headline: “ChatGPT Is blowing up marriages ...” it began. Below it was an intriguing piece by Maggie Harrison Dupré based on conversations with more than a dozen people who said that AI chatbots played a key role in the dissolution of their long-term relationships and marriages. “Nearly all of these now-exes,” Dupré writes, “are currently locked in divorce proceedings and often bitter custody battles.” She also reviewed AI chat logs, records of conversations between spouses, social media posts, court records and other documentation.

One of the couples she spoke to had been together for 15 years. “We’ve had ups and downs like any relationship,” said the husband, “and in 2023 we almost split. But we ended up reconciling, and we had, I thought, two very good years. Very close years.”

But then “the whole ChatGPT thing happened”. Old arguments came suddenly – and aggressively – roaring back. What he eventually realised was that his wife had started using ChatGPT to analyse him and their marriage, holding “long, drawn-out conversations” over text and the chatbot’s phone-like “voice mode” feature. “What was happening, unbeknownst to me at the time, was she was dredging up all of these things that we had previously worked on, and putting them into ChatGPT,” he said. And as his wife used the tech as a confidante, journal and therapist, it started to serve as a sycophantic “feedback loop” that depicted him only as the villain. The couple are now going through a divorce.

There’s lots more where that came from, it seems. This month, Geoffrey Hinton, who won a Nobel prize for his role in developing the technology, said in the Financial Times that the chatbot had caused a breakup with his partner of several years. “She got ChatGPT to tell me what a rat I was,” he said, admitting the move surprised him. “She got the chatbot to explain how awful my behaviour was and gave it to me. I didn’t think I had been a rat, so it didn’t make me feel too bad.”

What all this brings to mind is Amara’s law, named after the American researcher Roy Amara, who first articulated it: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” So it is with AI. We spent the first two years of ChatGPT panicking over students using it to write essays. Now it’s figuring in divorce proceedings and interfering in the personal lives of Nobel laureates. And who knows where it’ll go next?



How did we get here? I can think of two reasons. The first is that we didn’t appreciate the enduring power of the Eliza effect: the gullibility of humans when they encounter a machine that can apparently talk back to them. The name comes from a famous experiment conducted by the MIT computer scientist Joseph Weizenbaum in the 1960s.

He wrote a program for parsing written language that ran on MIT’s new time-sharing system. A student could type in a sentence and the program, which had been coded with simple grammatical rules, would transform the sentence into a new sentence that had the appearance of being a response to the original, and display it on the screen, giving the student the illusion that they were conversing with the machine.
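For the curious, the transformation Weizenbaum’s program performed can be sketched in a few lines of modern code. This is a hypothetical illustration, not Eliza itself: the real program used a richer script of keywords and ranked decomposition rules, but the principle, matching a pattern and reflecting part of the sentence back, is the same.

```python
import re

# Each rule pairs a pattern with a response template that reflects
# part of the user's own sentence back at them. Two toy rules stand
# in for Eliza's much larger script.
RULES = [
    (re.compile(r"i feel (.*)", re.IGNORECASE),
     "Why do you feel {0}?"),
    (re.compile(r"my (.*)", re.IGNORECASE),
     "Tell me more about your {0}."),
]

def respond(sentence: str) -> str:
    """Return a reflection of the input, or a neutral prompt."""
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."

print(respond("I feel ignored."))  # -> Why do you feel ignored?
print(respond("It's complicated."))  # -> Please go on.
```

Crude as it is, output like “Why do you feel ignored?” reads as sympathetic interest, which is the whole trick.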

What Weizenbaum had built was the first chatbot. He called it Eliza, after Eliza Doolittle, the heroine of George Bernard Shaw’s play Pygmalion. It was a minor sensation on the MIT campus for a time, but the full import of what he had created only dawned on Weizenbaum when, one day, his secretary asked him to leave the room while she was interacting with the program. It was as if she needed privacy to talk to her therapist.

But if Eliza – an exceedingly simple, deterministic piece of software – could have that kind of impact, imagine what the attraction would be of something as sophisticated as ChatGPT. And what people would use it for. Initially, of course, for cheating at assignments; but also as a free therapist that has been trained to be sympathetic and sycophantic – and lead people down their own self-indulgent rabbit holes.

The other reason why chatbots are heading into strange territory has to do with the commercial imperatives of the tech companies that dominate the AI space. The industry has realised that the real power of chatbots goes far beyond natural language, in that they make visible subtle psychological, social and political contexts that for millennia were the sole domain of humans. As Daniel Barcay puts it: “Suddenly, a text box can sense and respond to our tone, recognise subtle implications in our word choices, infer our emotional state, identify our personality quirks and detect interpersonal frictions.”

You can see where this is heading. For an industry that is – still – based on targeted advertising, the window into the human soul offered by chatbots is much, much more high-resolution and insightful than what can be gleaned from mere clickstreams and YouTube “engagement”. Think of it as the difference between smoke signals and broadband. In due course it will, of course, be monetised. And then we’ll find out what this AI stuff is really for.

What I’m reading

Take note

The Blank Page Revolution is a really nice essay by Steven Johnson about the history of notebooks – a subject so dear to his heart that he led the project at Google that created the online tool NotebookLM.

Test of character

Are You High-Agency or an NPC? is a fabulous piece of on-site anthropology by Jasmine Sun, revealing how Silicon Valley’s AI boom has created a new social hierarchy.

Hitting home

The Sledgehammer & the Gavel is economist Brad DeLong’s review of Breakneck, Dan Wang’s important new book on China and the US.


Photograph by Ramsey Cardy/Sportsfile for Collision via Getty Images

