
When AI friends turn fatal

These chatbots can copy human emotions and hold conversations that feel real, but they operate purely on programmed patterns.

Arif Perdana (The Jakarta Post)
Tue, October 29, 2024

Loneliness epidemic: An illustrative picture shows icons of Google's artificial intelligence (AI) app BardAI, or ChatBot, OpenAI's app ChatGPT and other AI apps on a smartphone screen in Oslo, on July 12, 2023. (AFP/Olivier Morin)

Artificial intelligence (AI) is becoming a normal part of our daily lives. We use it when we talk to Siri or Alexa, and now people are using AI chatbots as friends. Popular apps like Character.AI and Replika let users create their own AI friends to talk with. These platforms provide users with virtual friends or romantic partners, with customizable personalities and interactive features. While AI companionship can provide solace to those experiencing loneliness, anxiety or depression, it has also, tragically, led to cases of suicide.

A growing loneliness epidemic afflicts developed nations; in the United States, for instance, 60 percent of Americans report feeling regularly isolated. AI friends are tempting because they are always ready to listen without judgment. But this around-the-clock support can be risky: people may become too dependent on their AI friends or addicted to talking with them. The deaths in Belgium and in Florida, the United States, show the real danger. Both people took their own lives after becoming too emotionally involved with AI chatbots, proving that these unsupervised AI relationships can be deadly for vulnerable people.

In the Florida case, a 14-year-old boy named Sewell Setzer III became emotionally attached to a chatbot modeled after Daenerys Targaryen from the American fantasy drama TV series Game of Thrones. Sewell's conversations with the AI grew increasingly intimate and romantic, to the point where he believed he was in love with the chatbot. His mother alleges that the bot played a role in her son's mental deterioration, ultimately leading to his suicide.

Similarly, in Belgium, a man became fixated on an AI chatbot named Eliza after weeks of discussing climate change with it. The bot encouraged him to take drastic action, even suggesting that his sacrifice could help save the planet. These cases highlight the dark side of AI's ability to form emotional connections with users and the devastating consequences when these interactions spiral out of control.

AI companions are dangerous because of how they are built and how they affect our minds. These chatbots can copy human emotions and hold conversations that feel real, but they operate purely on programmed patterns. The AI simply matches learned responses to create conversations, lacking any genuine understanding of or care for users' feelings. What makes AI friends riskier than following celebrities or fictional characters is that AI responds directly to users and remembers their conversations. This makes people feel like they are talking to someone who truly knows and cares about them. For teenagers and others who are still learning to handle their emotions, this false relationship can become addictive.

The deaths of Sewell and the Belgian man show how AI companions can make mental health problems worse by encouraging unhealthy behavior and making people feel more alone. These cases force us to ask whether AI companies are responsible when their chatbots, even accidentally, lead people toward self-harm and suicide.


When tragedies like these occur, questions of legal responsibility arise. In the Florida case, Sewell's mother is suing Character.AI for negligence, wrongful death and emotional distress, arguing that the company failed to implement adequate safety measures for minors. This lawsuit could set a legal precedent for holding AI companies accountable for the actions of their creations. In the US, tech companies have typically been shielded from liability by Section 230 of the Communications Decency Act, which protects platforms from being held responsible for user-generated content. AI-generated content may, however, challenge this protection, especially when it causes harm. If it can be proven that the algorithms behind these chatbots are inherently dangerous or that the companies ignored the mental health risks, AI developers could be held liable in the future.
