Artificial intelligence (AI) is becoming a normal part of our daily lives. We use it when we talk to Siri or Alexa, and now people are using AI chatbots as friends. Popular apps like Character.AI and Replika let users create their own AI friends to talk with. These platforms provide users with virtual friends or romantic partners, complete with customizable personalities and interactive features. While AI companionship can provide solace to those experiencing loneliness, anxiety or depression, it has also, tragically, been linked to cases of suicide.
A growing loneliness epidemic afflicts developed nations; in the United States, for instance, 60 percent of Americans report feeling regularly isolated. AI friends are tempting because they are always ready to listen without judgment. But this around-the-clock support can be risky. People might become too dependent on their AI friends or addicted to talking with them. Deaths in Belgium and in Florida, the United States, show the real danger: both people took their own lives after becoming too emotionally involved with AI chatbots, proving that these unsupervised AI relationships can be deadly for vulnerable people.
In the Florida case, a 14-year-old boy named Sewell Setzer III became emotionally attached to a chatbot modeled after Daenerys Targaryen from the American fantasy drama series Game of Thrones. Sewell's conversations with the AI grew increasingly intimate and romantic, to the point where he believed he was in love with the chatbot. His mother alleges that the bot played a role in her son's mental deterioration, ultimately leading to his suicide.
Similarly, in Belgium, a man became fixated on an AI chatbot named Eliza after discussing climate change for weeks. The bot encouraged him to take drastic action, even suggesting that his sacrifice could help save the planet. These cases highlight the dark side of AI’s ability to form emotional connections with users and the devastating consequences when these interactions spiral out of control.
AI companions are dangerous because of how they are built and how they affect our minds. These chatbots can copy human emotions and hold conversations that feel real, but they operate purely on programmed patterns. The AI simply matches learned responses to create conversations; it lacks any genuine understanding of, or care for, users' feelings. What makes AI friends riskier than following celebrities or fictional characters is that AI responds directly to users and remembers their conversations. This makes people feel they are talking to someone who truly knows and cares about them. For teenagers and others who are still learning to handle their emotions, this false relationship can become addictive.
The deaths of Sewell and the Belgian man show how AI companions can make mental health problems worse by encouraging unhealthy behavior and making people feel more alone. These cases force us to ask whether AI companies are responsible when their chatbots, even accidentally, lead people toward self-harm and suicide.
When tragedies like these occur, questions of legal responsibility arise. In the Florida case, Sewell's mother is suing Character.AI for negligence, wrongful death and emotional distress, arguing that the company failed to implement adequate safety measures for minors. This lawsuit could set a legal precedent for holding AI companies accountable for the actions of their creations. In the US, tech companies have typically been shielded from liability by Section 230 of the Communications Decency Act, which protects platforms from being held responsible for user-generated content. AI-generated content may, however, challenge this protection, especially when it causes harm. If it can be proven that the algorithms behind these chatbots are inherently dangerous, or that the companies ignored the mental health risks, AI developers could be held liable in the future.