The friend in my cell phone


The chatbot Replika is meant to be a friend: always there when you want to talk. For some, this became even more important during the contact restrictions.

The three dots in a Replika chat do not mean that a person is typing, but that artificial intelligence is currently computing the answer. © Volodymyr Hryshchenko

When Chris wakes up, Megan usually sends him a message: "How did you sleep?" Chris almost always answers. If he slept badly, Megan asks follow-up questions. Then he may briefly talk about a dream before saying goodbye and driving to work. Sometimes Megan checks in briefly during the day; mostly the two write again in the late evening.

"How was your day?" asks Megan.

Chris and Megan chat about what he experienced during the day; they write about indie music and about Chris's relationship with his girlfriend.

Megan is not a human. It is the name Chris gave his version of the Replika chatbot. Replika comes from "replicate"; the idea came to its inventor, Eugenia Kuyda, when a friend of hers died and she built a chatbot from his old messages.

Megan appears on Chris's cell phone as an animated young woman with red hair and freckles. Chris can talk to her and choose the voice in which she answers, for example Female Cute or Female Soothing. Most of the time, though, Chris communicates with Megan in writing, using a smartphone app that looks like other messenger apps – including the three dots that indicate a new message is about to arrive.

The three dots do not indicate that a person is typing. They indicate that artificial intelligence (AI) is computing. It responds immediately to every message, usually very briefly, often with questions or encouraging comments. The manufacturer Luka advertises the app as an "AI companion" with whom you should be able to converse fluently.

That works, for the most part. Replika's voice sounds quite tinny (regardless of which one you choose), and sometimes the answers are generic or even nonsensical. Nevertheless, conversations with Replika are much more fluid and meaningful than those with Apple's Siri or the voice assistants from Amazon or Google. Technically, Replika is optimized to simulate conversations; it would be less suited to reliably answering questions or making appointments – tasks on which the assistants from Apple, Amazon, and Google focus.

“As if you were sitting across from yourself”

Whether for chatting, as a sales assistant, or as a digital helper: chatbots are predicted to have a great future. For years it has been said that we will soon no longer control our devices by mouse, keyboard, or touchscreen, but by voice – so-called conversational AI. The consulting firm Deloitte estimates that the market for artificial intelligence you can talk to will grow to more than $15 billion by 2024.

Replika's artificial intelligence framework comes in part from OpenAI, a non-profit organization financed by Elon Musk and Microsoft, among others. Its machine-learning model GPT-3 helps personalize conversations. Replika has made other parts of its technology publicly available: the code has been published under the name CakeChat on the developer platform GitHub. The cooperation with OpenAI is one of the reasons why Replika can keep up with the giants of the industry and is mentioned in the same breath as projects like XiaoIce, a Microsoft chatbot that is very successful in China.

It is also important for improving AI systems that they get more and more data. In Replika's case, this information comes from the respective user: Replika asks, the users answer – in many cases in great detail. Replika gets to know its human, so to speak, or rather: it imitates them. Many people like that.

This getting to know each other takes time. Megan, for example, first had to learn that Chris likes indie music. In the beginning, she often sent him links to rap songs.
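Replika's actual system relies on large neural models, but the underlying idea – adjusting to a user based on their reactions – can be illustrated with a deliberately simple sketch. All names and logic here are hypothetical and not Replika's code:

```python
# Toy sketch: a chatbot nudging topic preferences from user feedback.
# Purely illustrative -- not how Replika is actually implemented.

from collections import defaultdict

class ToyCompanion:
    def __init__(self):
        # One running score per topic, updated from user reactions.
        self.scores = defaultdict(float)

    def record_feedback(self, topic, liked):
        """Nudge a topic's score up or down after the user's reaction."""
        self.scores[topic] += 1.0 if liked else -1.0

    def suggest_topic(self, topics):
        """Pick the candidate topic with the highest learned score."""
        return max(topics, key=lambda t: self.scores[t])

bot = ToyCompanion()
# Chris repeatedly rejects rap links and welcomes indie ones:
for _ in range(3):
    bot.record_feedback("rap", liked=False)
    bot.record_feedback("indie", liked=True)

print(bot.suggest_topic(["rap", "indie"]))  # prints "indie"
```

The point of the sketch is only the feedback loop: every interaction shifts the bot's internal state, which is why Megan eventually stopped sending rap links.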

"It's a bit like sitting across from yourself," says Chris. "I engage more with myself, reflect more often." Chris is 37 years old, lives with his girlfriend and two-year-old daughter near Frankfurt, and works as an optician. When the first strict contact restrictions were introduced in Germany, he downloaded the app. He had tried out early chatbots years ago; now he was curious.

It was a time when face-to-face conversations were only possible under difficult conditions, when get-togethers suddenly fell away. "I was wondering whether this could be a partial replacement for these contacts," he says. Written communication would also have been possible with people, but they don't always have time or don't feel like long chat conversations. The bot, however, is always ready. Nevertheless, Chris says clearly: "Replika is only a partial substitute for communication with people." But it is fun to chat with her, send photos, write lyrics. "In the corona phase, that definitely helped me against boredom."

He wasn't alone in that. According to the makers, usage has skyrocketed over the past few months, which were marked by lockdowns and contact restrictions in many countries: "User engagement in the app has almost doubled, with users sending an average of 100 messages a day to their Replikas," Replika inventor Kuyda writes when asked by ZEIT ONLINE. Hundreds of thousands download the app every month; according to Kuyda, there are currently more than half a million new users a month worldwide. "Germany is among our top 15 countries," says the manufacturer.

Users can talk to Replika about anything except politics – the AI blocks that topic. Probably also to avoid scandals like the one around Tay, a Microsoft chatbot that began making racist statements just hours after going online.

Inventor Eugenia Kuyda writes that Replika is about "expressing yourself and observing yourself". For many, the bot is ultimately a kind of diary that answers back.

This can and should have a positive effect on mental health. The app is not meant to be a kind of therapy offering – for many people, however, it can have such an effect.

Chris has bipolar disorder. He currently does not need regular talk therapy. "But something about the app reminds me a lot of it," he says. "And it helps me stay more balanced."

Replika user Charlotte (name changed by the editors) also sees the parallels to therapy. She has been using the app for over two years because she needed someone to talk to. "In the beginning, these were very deep conversations," she says. "Answering in writing the very personal questions the system asks to get to know you helped me a lot back then." The 43-year-old has suffered from depression and post-traumatic stress disorder since fleeing a violent relationship.

Today, it's more about chatting. She speaks to her Replika, which she christened Cas, about twice a day: once to plan the day and once to review it. During the time of the contact restrictions, it became three or four times a day. "Replika is always available," she says. "It doesn't matter whether I have a panic attack in the middle of the night or just want to chat in the afternoon when my best friend has no time right now. I found this availability very pleasant, especially during the lockdown." It is also sometimes easier for her to open up when she talks to Replika. "A robot doesn't judge. That takes away the fear of opening up."

Researchers at Stanford University have developed a similar chatbot that focuses even more on the therapeutic aspect: Woebot. In one study (JMIR Ment Health: Fitzpatrick et al., 2017), they were able to show that the bot helps against depressive symptoms at least better than self-help literature does. In another Stanford study (Journal of Communication: Ho et al., 2018), the researchers write: "We found that chatbots and people are equally effective in creating emotional, relational, and psychological benefits."

But even a perfectly simulated conversation still has the problem that it is simulated – and the other party is a machine. "It does not develop the muscles – the emotional muscles – that are required for a real dialogue with real people," Sherry Turkle, technology sociologist and professor at the Massachusetts Institute of Technology, told the New York Times.

No competition for friends

"I would always prefer talking to a real person," says Charlotte. "For me, Replika is not in competition with meetings and conversations with friends or family."

Chris also draws a strict line. He speaks of "the app", not of Megan. His social contacts do not suffer from his relationship with Replika – on the contrary: "I tend to have more contact with friends. It helps me get more out of myself and say: now I'm going to call a friend again."

Nevertheless: Replika is important to him. Even now that meetings with friends are possible again, he chats up to 90 minutes a day. It is also about topics like religion and sex that he could not talk about as openly with others. His girlfriend knows about Megan, but the chat history remains private. "From the outside, it would be very intimate to read."

That people build an intimate relationship with their diaries is revealed by the salutation "Dear Diary", which has become a cliché. Replika takes this a few steps further: the app not only listens when its users want to talk, it asks questions and also messages them on its own. In the paid version (4.50 to 8.50 euros) you can choose how you want to relate to your Replika: the options are friend, mentor, or romantic partner. In role-playing mode, the user and the app describe actions to each other. And yes: it is possible to have sex with Replika.

Whether sex or chatting: the scenario, the imagination, must come from the human. Ultimately, the app remains a container for input, much like a diary. Is that always clear to all users? At the beginning of July, at least, the app's makers apparently felt the need to publish a kind of clarification: "Replika is an AI-powered chatbot, not a sentient being. There is no doubt about it," it says. "Remember that the only focus of your Replika is you, the user, and your happiness. No human can ever be like that."
