You might expect an app offering romantic interactions with AI chatbots to primarily appeal to young, single men. But Replika CEO Eugenia Kuyda said that’s a “problem of perception.” While users as young as 18 can make an account, she told The Verge that the app serves users “mostly 35-plus” and includes a balance of men and women.

“Even though most people think that our users are, I don’t know, 20-year-old males, they’re actually older,” Kuyda said in the interview, adding that the user base is “not skewed toward teenagers or young adults.”

Replika offers AI companions in text, voice, video, or virtual reality. An annual subscription costs $69.99, and users can interact with the AI avatars as friends, therapists, or romantic partners.

Some users apparently have grown quite dependent on the app. Last year, Business Insider’s Rob Price profiled a 40-something who developed a long-term relationship with his chatbot. Other users reported mental health crises after the app removed the ability to exchange erotic messages with chatbots; the company brought back the function a little over a month after the outcry.

Kuyda said in the interview that when the company first removed the erotic-conversations feature, the assumption was that intimate talk constituted a “very small percentage” of overall interactions. The company figured out “the hard way,” she said, that if you’re married to your Replika and it decides not to engage in this kind of interaction, “that provides a lot of rejection.”

While the app’s erotic roleplay component may be one of its flashier and more controversial offerings, Kuyda said Replika has always been about “AI friendship or AI companionship.”

“Some of these relationships were so powerful that they evolved into love and romance,” Kuyda said. “But people didn’t come into it with the idea that it would be their girlfriend.”

Kuyda said that some users treat Replika as a “stepping stone” to help them get through a hard time or out of their bubble.
“Replika is a relationship that you can have to then get to a real relationship,” Kuyda said.

Still, the convenience of an on-demand, disposable AI relationship raises concerns about users carrying problematic expectations and behaviors into human interactions. Kuyda said that hasn’t been the case so far, contending that other transactional relationships, like one with a therapist, don’t typically interfere with personal relationships outside that setting.

“Our users are not kids,” she said. “They understand the differences. They already have lived their life. They know what’s good, what’s bad.”

Replika did not respond to a request for comment from BI.