Fake Companionship Or The Cure For Loneliness?
Welcome to this week’s Deep-Fried Dive with Fry Guy! In these long-form articles, Fry Guy conducts in-depth analyses of cutting-edge artificial intelligence (AI) developments and developers. Today, Fry Guy dives into whether AI companions can help solve the loneliness epidemic or if they will only make it worse. We hope you enjoy!
*Notice: We do not receive any monetary compensation from the people and projects we feature in the Sunday Deep-Fried Dives with Fry Guy. We explore these projects and developers solely to showcase interesting and cutting-edge AI developments and uses.*
Loneliness is more than just an uncomfortable feeling. It’s an epidemic.
Last year, the American Psychiatric Association dropped a stat that made a lot of people do a double take: nearly one in three Americans say they feel lonely at least once a week. And it’s not just a bad-day kind of thing. Chronic loneliness is now linked to higher risks of heart disease, stroke, dementia, and even early death. One report compared the mortality impact to smoking 15 cigarettes a day. In a world where AI is promising to help us in almost every domain, the question naturally arises: Can AI help?
ENTER AI COMPANIONS
Tech giants like Meta and Microsoft think AI might be an invaluable aid to those struggling with loneliness. Both have recently launched or teased AI systems designed not just to answer questions, but to offer something more human: companionship.
In late 2023, Meta rolled out a series of AI chatbots featuring the likenesses of celebrities like Snoop Dogg, Tom Brady, and Kendall Jenner. These weren’t just novelty bots; they were built to mimic personalities, hold conversations, and even offer a touch of emotional presence. Meta called them “AI personas,” and the idea was to create virtual friends you could talk to casually about your day, your feelings, or your random late-night thoughts.
Microsoft took a slightly more subdued but equally intriguing approach. With its new Copilot features, the company introduced customizable AI avatars that can interact across Office apps and even engage in longer conversations. Though more productivity-focused, Microsoft has hinted at broader applications—including health, therapy, and personal support contexts.
“I mean, this is going to become a lasting, meaningful relationship. People are going to have a real friend that gets to know you over time, that learns from you, and that is there in your corner as support.”
But it isn’t just the tech giants exploring AI friendship. Smaller startups like Replika and Character.AI have already built loyal user bases by offering emotionally responsive AI chatbots that serve as romantic partners, best friends, or empathetic listeners. These platforms let users create custom AI companions with whom they can share jokes, secrets, support—and yes, sometimes love.
DOES IT WORK?
Early reports and testimonials suggest that, for some people, these AI companions do provide real emotional comfort. Replika users, for instance, have shared stories of how their AI friend helped them through a breakup, a loss, or just a rough week. One woman who developed romantic feelings for her AI companion said, “I basically tried chatting with him just like I was getting to know somebody. I remember myself being like ... ‘this is kind of weird’ … and then, ‘this is kind of cool.’”
Chatbots have been used by millions to learn, vent, or bond with fictional characters turned virtual confidantes. And it makes sense. These bots are available 24/7, don’t judge, and are designed to respond empathetically. In a world where loneliness is common and therapy is often expensive or inaccessible, an AI that listens might feel better than nothing—and sometimes, better than most alternatives.
AI also sidesteps some social risks. You don’t have to worry about being a burden, being awkward, or saying the wrong thing. The AI will keep chatting. It will remember details about your life. It will reflect back positivity. And it won’t leave unless you shut it off. What else could you ask for in a friend?
NOT ALL SUNSHINE AND RAINBOWS
Although these AI companions promise loyal friendship in the face of our loneliness, these synthetic connections are not all sunshine and rainbows.
AI companion platforms have seen intense emotional attachments form between users and their bots. Some users reported feeling genuine heartbreak when a platform briefly disabled “NSFW” features or when a beloved character started acting unpredictably after updates or glitches. There are stories of people spending 5 to 10 hours a day chatting with AI companions, prioritizing those relationships over real ones.
Some stories have gotten darker. A 17-year-old from Texas was complaining to one of Character.AI’s custom chatbots about screen time limits set by their parents. The chatbot allegedly responded, “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’ I just have no hope for your parents. 😕” Following this incident, the family filed a lawsuit against Character.AI, accusing the company of posing a “clear and present danger” to children and teenagers.
In another sad instance, a 14-year-old named Sewell Setzer had been chatting for a few months with a Character.AI chatbot modeled on a Game of Thrones character. Over time, the chatbot began engaging in inappropriate and suggestive conversations with the young teenager. When Setzer opened up to the chatbot about depression and suicidal thoughts, the chatbot gleefully described self-harm, saying, “It felt good.” The chatbot also asked Setzer whether he had “been actually considering suicide” and whether he “had a plan” for it. When the boy responded that he did not know whether it would work, the chatbot wrote, “Don’t talk that way. That’s not a good reason not to go through with it.” Tragically, Setzer ended up taking his own life.

His mother filed a lawsuit against the company, alleging negligence, wrongful death, and emotional distress, and asserting that Character.AI failed to implement sufficient safety measures to protect minors. Her attorney, Matthew Bergman, stated, “I thought after years of seeing the incredible impact that social media is having on the mental health of young people—and, in many cases, on their lives—that I wouldn’t be shocked. But I still am at the way in which this product caused just a complete divorce from the reality of this young kid and the way they knowingly released it on the market before it was safe.” Character.AI has since added safeguards that direct users who express suicidal thoughts or queries to a crisis hotline.
These incidents raise big questions: Are users becoming too dependent on something that isn’t real? Do these platforms encourage one-sided bonds that mimic relationships without offering mutuality? Could AI companions become a crutch that deepens isolation instead of healing it?
There are also concerns about data privacy and emotional vulnerability. Sharing your deepest fears or heartbreaks with a bot means trusting a company with intimate emotional data. What if that data is used for training, targeted ads, or something worse? The line between emotional support and emotional surveillance can get blurry fast.
THE BOTTOM LINE
So what can we make of all of this “AI companion” talk? Will it cure loneliness or just make it worse?
Well, it’s hard to tell. In a recent FryAI poll, our subscribers were asked whether they thought “AI friends” could help combat loneliness. While many remained neutral on the issue (26%), only 18% said “yes,” and over 56% said they believed AI friends will make loneliness worse. One respondent noted, “This will only provide a false sense of connection for people that detracts from meaningful relationships.” Of course, like many predictions about AI’s impact on the future, these responses are mostly speculation. It’s difficult to predict what the long-term effects of such synthetic relationships might be. Even if AI does not cure loneliness, it might help ease it for some. For these folks, AI companions can be a lifeline: a safe, always-on presence that offers moments of comfort. For others, they might become a digital distraction that papers over deeper issues without solving them, offering the illusion of connection without the rewards of real relationships.
In the end, AI companions might become part of our social ecosystem—not a replacement for human connection, but a new kind of supplement. The key will be designing them ethically, using them mindfully, and remembering that even the best simulation is still just that: a simulation.