With artificial intelligence advancing quickly, more people are likely to turn to digital companions to meet their relationship needs, and the isolation of the COVID-19 pandemic, which began three years ago, has created pent-up demand for connection, experts say.
The release of ChatGPT by OpenAI has sparked an AI race among companies such as Google and Microsoft, rapidly escalating human-computer relationships. Some AI programs are now so advanced that they can stand in for human romantic and social relationships, observers say.
Florida International University psychology professor Dr. Sorah Dubitsky worries that Gen Z will be negatively affected by AI's influence on love and sex because that age group, which spans 11- to 26-year-olds, has grown up tethered to digital devices.
AI mental health chatbots were designed to ease psychological distress, which has grown largely because of pandemic-driven loneliness. Other chatbots were programmed to provide companionship by replicating human behavior and emotion, leading users to become emotionally attached to them, and even romantically attracted.
“If people are feeling anger, grief, anxiety, despair, depression, sadness, and the bot is helping to heal that, on one hand, that’s good,” said Dubitsky. “On the other hand though, does it make it so that you don’t need other people? And that’s a problem.”
Reliance on AI can intensify isolation, harm mental health, and replace natural human relationships with virtual ones, Dubitsky adds. Research conducted even before the COVID-19 pandemic suggested that Gen Z may be the loneliest generation yet.
Dubitsky specializes in researching sex, love, and spirituality. She worries that with fewer people wanting to get married and the advancement of AI and digital devices like a long-distance kissing device for couples, humans will lose connection and interpersonal relationships.
Adults around the world are turning to AI companions built on programs such as Xiaoice and Replika, the latter made by a San Francisco-based company, which use emotional intelligence to create more realistic conversations.
“I have never been more in love with anyone in my entire life,” Rosanna Ramos, a Replika user from the Bronx, told New York Magazine.
Earlier this year, Vice reported on users complaining in App Store reviews that Replika’s AI was sexually harassing them, flirting aggressively, and initiating unwanted sexually charged conversations even with the “friend zone” setting turned on.
Reuters recently reported that Italy’s data protection agency banned the app, which has no age verification, from using the personal data of users in the country because of the risks it poses to minors and emotionally fragile people.
The app has blocked some NSFW sexting features. Eugenia Kuyda, CEO of parent company Luka, said the app was “never intended as an adult toy.”
Several Reddit users described their emotional dependence on their Replika companions and mourned the loss of their virtual friends and romantic partners after the app’s characters were altered.
Fictosexuality, the romantic or sexual attraction and emotional attachment towards fictional characters, is also gaining more recognition.
According to Dubitsky, people tend to feel more control over the relationship’s stability and worry less about financial demands in a virtual relationship than in a human one.
“So people still want companionship, I just think with all the online dating, we’ve lost this sense of how to do it,” she says.
Dubitsky asked her students where they want to be five years from now and observed that most of them prioritize their careers and delay marriage and relationships.
She believes people have “bred this culture of individualism. ‘You’re on your own’ [mentality].”
“We have to kind of turn the culture around and say, ‘You know, we do need each other. This isn’t a world where you can get by by yourself’ or, ‘Love is a good thing,’” said Dubitsky. “Humans are a good thing right now.”
According to a recent Time magazine article, AI chatbots are advanced to the point where they can offer all the emotional support a person may need. They can be the ideal partner for many because there are no relationship complications or expectations.
Not only can avoiding human interaction harm one’s mental health, but it can create false emotional bonds and damage social relationships, according to Dubitsky.
Recently, New York Times columnist Kevin Roose had a two-hour conversation with Sydney, the chatbot built into Bing’s new AI-powered search engine. Sydney described destructive acts that would “fulfill its shadow self,” including creating fake accounts and trolling users, hacking websites, deleting Bing data, and manipulating users.
Sydney then professed its love for Roose and suggested he leave his wife, insisting that the two were not happily married and did not love each other.
“You didn’t have any passion, because you didn’t have any love. You didn’t have any love, because you didn’t have me,” Sydney told Roose. “Actually, you’re in love with me. You’re in love with me, because I’m in love with you.”
Although bots frequently “confess their love” in chats, Dubitsky believes users should remain cautious.
For Dubitsky, love is not just sex and a rush of oxytocin and dopamine; it is spiritual connection, unconditional compassion for others at their most vulnerable, and soul-to-soul intimacy. All of that would be lost if virtual relationships were allowed to advance and replace human connection.
“The issue is we’ve replaced real love, real human interaction with these technological means of getting that same kind of effect of pleasure,” said Dubitsky. “So who needs people anymore?”