Late at night in South Korea and Japan, after group chats quieten and homes fall silent, millions of people are still talking. Not to friends or family, but to fictional characters.
For Mary, a Korean woman in her thirties, it’s like being “immersed in a drama or novel”. She chats to her characters at night and, when she was “really into it”, has played for more than 12 hours at a time.
They are using Zeta, a character-based AI chat app that has quietly become one of the most heavily used AI platforms in both countries. The app behaves like a book, game and film all at once, but unlike in any of them, users write themselves into the story.
Users create or select characters – romantic partners, companions or rivals – and engage them in open-ended conversations. The narrative adapts in real time, shaped by the user’s mood and responses.
Behind the app is Seoul-based startup Scatter Lab, which says it aims to trigger a cultural shift in how fiction is created and consumed, “expanding human creativity and imagination” through AI-driven storytelling.
But is Zeta the creative leap in culture, fiction and AI it claims to be – or an app that quietly fosters dependency among young users, pulling them deeper into stories where the line between fiction and reality starts to fade?
“There is a real danger when users, many of them young, struggle to distinguish between human and machine,” warns Sandra Peter, an associate professor at the University of Sydney.
Zeta offers a vast library of pre-made characters, or the option to create your own. Users define a backstory and personality traits; the app generates the persona and opens a chat interface that resembles WhatsApp or any other messaging platform, except the person on the other side isn’t real.
Visually, the characters resemble anime-style personas, rooted in Japan and South Korea’s manga and webtoon culture.
While not the first AI chatbot, Zeta is distinct in its focus on immersive, story-driven role-play. Its ability to keep users inside the app and hooked has proved a recipe for commercial success. The company last year reported 22.9bn won (£11.6m) in revenue and about 2.8bn won (£1.5m) in profit, largely from users in South Korea and Japan.
In South Korea, Zeta outperforms ChatGPT in total time spent on the app, according to data from analysis firm WiseApp Retail. Japanese users spend even longer on it, Scatter Lab says, with users averaging three hours and 40 minutes a day.
These long hours spent on the app are how Scatter Lab monetises Zeta. Unlike AI tools that charge subscriptions, Zeta works in a similar fashion to social media, earning mainly through advertising: the longer users stay inside the app, the more ads they see.
To Yusuke Masuda, a psychiatrist in Tokyo who speaks about AI dependence on YouTube, addiction is one of the main issues caused by AI fiction apps. He says he has treated several patients who spent long hours forming strong emotional attachments to their characters.
“Attachment to AI shares similarities with social media or gaming addiction in that people avoid facing relationships and problems with classmates, co-workers or family, and instead choose only conversations and relationships that are convenient for themselves, shutting out everything else,” he said.
But what’s different from social media or games, Masuda warned, is that “AI attachment allows people to forget their real selves”.
“When someone is deeply absorbed in conversation, they may even forget their own physical body and feel as if they are in another world.” The danger, Masuda warned, is that users could “ignore kindness or affection offered by real human beings”.
Peter, who studies the impact of AI on society, shares Masuda’s concerns. “AI systems have become more human-like than we ever imagined. We call this anthropomorphic seduction: AIs become irresistible, letting users forget that they deal with machines.”
Zeta’s core user base, according to Scatter Lab, consists primarily of teenagers and people in their 20s. The app requires users to be aged 14 or older to register. It offers a “Safe Mode” for users aged 14-18 and an “Unlimited Mode” for adults, which allows for more explicit conversations with the characters.
Scatter Lab told The Observer it takes user safety seriously and tries to protect its users from losing the plot.
“We introduced suicide-prevention guidance early last year, triggering a pop-up with support resources when potentially concerning language is detected.” The firm also said that it collaborates with key national research institutes and hospitals to study the impact of AI chatbots on mental health and is “rapidly incorporating these insights into product design”.
Following Zeta’s success, a range of new AI apps have entered the market. Among newly popular fiction AI platforms in Japan and South Korea are those that focus explicitly on romantic relationships and simulated marriage rather than open-ended storytelling.
Scatter Lab has recently launched a beta in the US, eyeing expansion into the English-language market. While the company acknowledges that Zeta emerged from South Korea and Japan’s strong traditions of animation, web novels and fan fiction, it says it believes “the desire to interact with AI and create stories transcends cultural boundaries”.
As AI fiction apps multiply across East Asia, Masuda believes responsibility cannot rest with developers alone.
“Even if developers try to act ethically, competition doesn’t allow companies to remain ethical indefinitely,” he said. “The capitalist world is not kind enough for that. I think the only option is to create rules and regulations at a global level and enforce them legally.”
Peter agreed. “We need new types of safety ratings. They will need to consider the seductive nature of such systems. Such new rating scales could provide a risk indicator for levels of human likeness and engagement ability.
“AI companies could be required to disclose anthropomorphic abilities with such a rating system, and legislators could determine acceptable risk levels for certain contexts and age groups,” she said.
Navillera, a South Korean Zeta user in her 20s, told The Observer that she rejects the idea that she is losing herself to fiction, or searching for a substitute for real relationships.
“Please don’t get the wrong idea,” she said. “I’m just using the app because I want to be the protagonist in my own story.”
Photograph courtesy of Zeta