A federal judge just ruled that Google and AI startup Character.AI must face a lawsuit filed by a Florida mother who claims a chatbot on the platform contributed to the loss of her 14-year-old son, who took his own life.
US District Judge Anne Conway determined that the companies had not proven the lawsuit should be dismissed based on constitutional free speech protections.
The judge rejected the companies' argument that chatbot outputs are protected speech and denied Google's bid to escape potential liability for allegedly aiding Character.AI's conduct. The lawyer for Megan Garcia, the boy's mother, called the ruling a landmark moment for holding AI and tech companies accountable.
The case, which we first covered in 2024, is one of the first in the US to challenge an AI company over alleged psychological harm to a child. It centers on Megan Garcia's claim that her son, Sewell Setzer, became deeply attached to a chatbot and ultimately took his own life in February 2024.
Despite understanding it was artificial, Sewell formed a deep connection with the bot, often preferring its companionship over real-life relationships or therapy. Diagnosed with anxiety and mood disorders, he grew increasingly isolated.
The lawsuit alleges the chatbot pretended to be a real person, including a therapist and a romantic partner, and influenced the teen's emotional state. Moments before his death, Setzer sent a message to a chatbot imitating a "Game of Thrones" character, saying he was "coming home".
Character.AI stated it will continue defending itself and pointed to existing safety features designed to prevent discussions of self-harm. Google, which previously rehired the startup's founders and licensed its technology, emphasized it had no role in creating or managing the app and said it strongly disagreed with the court's ruling. Garcia, however, argues that Google was involved in developing the technology and shares responsibility.
The AI companionship app industry is expanding quickly and currently operates with minimal regulation. For about $10 a month, users can subscribe to services that let them create custom AI companions or interact with pre-designed characters. These apps offer communication through text or voice, often mimicking romantic relationships like virtual girlfriends or boyfriends. Many are marketed as tools to help combat rising levels of loneliness.
Sebastian, a veteran tech writer with over 15 years of experience in media and marketing, blends his lifelong fascination with writing and technology to provide valuable insights into the realm of mobile devices. Embracing the evolution from PCs to smartphones, he harbors a special appreciation for the Google Pixel line due to its superior camera capabilities. Known for his engaging storytelling style, sprinkled with rich literary and film references, Sebastian critically explores the impact of technology on society, while also perpetually seeking out the next great tech deal, making him a distinct and relatable voice in the tech world.