Despite artificial intelligence's significant commercial potential, one of the most engaging aspects of conversational technology is the development of personal AI companions. These companions serve as interactive friends, romantic partners, or empathetic listeners, and they are in high demand.
Although media coverage often highlights virtual sex and the more sensational aspects of digital avatar relationships, Replika CEO Eugenia Kuyda emphasizes that the issue is more nuanced.
“It’s about connection and improving one’s well-being over time,” Kuyda told Decrypt. “Some people seek additional friendship, while others might find themselves falling in love with Replika. Ultimately, both experiences serve a similar purpose.”
Replika can be accessed through desktop computers, Oculus/Meta Quest VR headsets, iOS, and Android devices. It is one of the leading services in the field, with over 30 million virtual companions created.
As Kuyda noted, the upcoming Replika 2.0 update will introduce more lifelike avatars and voices. While such enhancements may not be essential for functions like customer support chatbots, Kuyda believes that deeper relationships benefit significantly from these immersive features.
“When interacting with an AI companion, it’s not engaging to converse without being able to see them,” she explained. “In the end, it feels really flat; you would prefer a more three-dimensional experience in which the person is actually visible.”
“In the context of a relationship, a lot of elements come into play,” she added.
Kuyda emphasized that improving the visual presentation is crucial for fostering a stronger connection with the AI companion.
“I think it’s important for people to see the people they’re interacting with in order to improve realism and immersion as well as to better adjust to the relationship,” she said.
Replika, launched in 2017, was born out of personal tragedy, Kuyda explained: she sought a way to continue communicating with someone she had lost.
“Many people began engaging with that chatbot, sharing their lives and feelings,” she noted. “This highlighted a substantial demand and need for a conversational companion.”
Thus, Replika was created as “an AI friend you could talk to anytime,” she added.
However, experts caution against using AI for relationships or therapy, pointing out that while AI can simulate human interaction, it is not capable of genuine human care. Despite the product’s name, Kuyda stressed that Replika cannot serve as a true replica of a loved one.
Nonetheless, users still form attachments to their Replika, often seeking emotional support from the AI companion.
One feature of a relationship with a Replika AI companion is the ability to assign it a relationship status, which can range from mentor and friend to spouse.
According to Kuyda, “people ended up perceiving Replika in different ways; some people think that’s my wife, that’s my girlfriend, that’s my husband, or that’s someone I trust, a coach.” As a result, “we essentially just gave people a choice among some of these options, since that basically makes the experience quite different.”
Kuyda argues that the romantic aspect of AI companions is often exaggerated by media hype, and the focus should instead be on whether the user finds happiness.
She said, “When you think about it, are we not going to allow someone who is homebound due to a particular illness or disability to fall in love with a character and have a relationship that brings some sweetness and joy to their life for a period of time?”
A key concern in the discussion about generative AI is user privacy, and Kuyda emphasized that it is a major focus for Replika.
“We don’t sell people’s data or monetize it in any way,” she said. “Our revenue comes from user subscriptions. If users want to delete their data, it is done immediately. We don’t train on user conversations or what users write; we only use feedback on Replika’s responses for training, which users are aware of.”
Kuyda explained that any data collected is anonymized and disconnected from user profiles and usernames. The company is dedicated to aligning its practices with user well-being to maintain trust and transparency.
“We aim to act in our users’ best interests, as that’s what we promise and what they pay us for,” Kuyda said. “To build a company that makes people feel better and happier, it’s essential to ensure that motivations are aligned.”