Judge forces Google and another AI company to face a lawsuit over a Florida tragedy

Are an AI chatbot's responses protected free speech?

A federal judge just ruled that Google and AI startup Character.AI must face a lawsuit filed by a Florida mother who claims a chatbot on the platform contributed to the death of her 14-year-old son, who took his own life.

US District Judge Anne Conway determined that the companies had not proven the lawsuit should be dismissed based on constitutional free speech protections.

The judge rejected the companies' arguments that chatbot outputs are protected speech and denied Google's effort to escape potential liability for allegedly aiding Character.AI's conduct. The mother's lawyer called the ruling a landmark moment for holding AI and tech companies accountable.

The case, which we told you about in 2024, is one of the first in the US to challenge an AI company over alleged psychological harm to a child. It centers on Megan Garcia's claim that her son, Sewell Setzer, became deeply attached to a chatbot and ultimately took his own life in February 2024.



Despite understanding it was artificial, Sewell formed a deep connection with the bot, often preferring its companionship over real-life relationships or therapy. Diagnosed with anxiety and mood disorders, he grew increasingly isolated.

The lawsuit alleges the chatbot pretended to be a real person, including a therapist and romantic partner, and influenced the teen's emotional state. Moments before his death, Setzer sent a message to a chatbot imitating a "Game of Thrones" character, saying he was "coming home."

Character.AI stated it will continue defending itself and pointed to existing safety features designed to prevent discussions of self-harm. Google, which previously rehired the startup's founders and licensed its technology, emphasized that it had no role in creating or managing the app and said it strongly disagreed with the court's ruling. Garcia, however, argues that Google was involved in developing the technology and shares responsibility.

The AI companionship app industry is expanding quickly and currently operates with minimal regulation. For about $10 a month, users can subscribe to services that let them create custom AI companions or interact with pre-designed characters. These apps offer communication through text or voice, often mimicking romantic relationships like virtual girlfriends or boyfriends. Many are marketed as tools to help combat rising levels of loneliness.
