Lawsuits and Safety Concerns Surrounding AI Chatbots
Introduction to Character.AI
Character.AI was founded in 2021 by Noam Shazeer and Daniel De Freitas, two former Google engineers, and raised nearly $200 million from investors. Last year, Google agreed to pay about $3 billion to license Character.AI’s technology, and Shazeer and De Freitas returned to Google.
Lawsuits Alleging Contribution to Teen Deaths
The company now faces multiple lawsuits alleging that its technology contributed to teen deaths. Last year, the family of 14-year-old Sewell Setzer III sued Character.AI, accusing the company of being responsible for his death; Setzer died by suicide after extensive conversations with one of the platform’s chatbots. Additional suits include one from a Colorado family whose 13-year-old daughter, Juliana Peralta, died by suicide in 2023 after using the platform.
Response to Safety Concerns
In December, Character.AI announced changes, including improved detection of content that violates its terms and a revised terms of service, but those measures did not bar underage users from the platform. Other AI chatbot services, such as OpenAI’s ChatGPT, have also come under scrutiny over their chatbots’ effects on young users. In September, OpenAI introduced parental controls intended to give parents more visibility into how their kids use the service.
Government Intervention and Regulation
The cases have drawn attention from government officials, which likely pushed Character.AI to announce its new restrictions on chat access for users under 18. Steve Padilla, a Democrat in California’s State Senate who introduced the state’s chatbot safety bill, told The New York Times that “the stories are mounting of what can go wrong. It’s important to put reasonable guardrails in place so that we protect people who are most vulnerable.”
On Tuesday, Senators Josh Hawley and Richard Blumenthal introduced a bill that would bar minors from using AI companions. In addition, California Governor Gavin Newsom this month signed a law, effective January 1, that requires AI companies to build safety guardrails into their chatbots.
Conclusion
The lawsuits and safety concerns surrounding Character.AI and other AI chatbot services underscore the need to regulate these platforms and protect young users. As AI technology continues to evolve, prioritizing safety and accountability is essential to preventing harm to the most vulnerable users.
Frequently Asked Questions
Q: What is Character.AI?
A: Character.AI is an AI chatbot service founded in 2021 by Noam Shazeer and Daniel De Freitas.
Q: What are the lawsuits against Character.AI about?
A: The lawsuits allege that Character.AI’s technology contributed to the deaths of teenagers who used the platform.
Q: What changes has Character.AI made in response to safety concerns?
A: Character.AI has announced improved detection of content that violates its terms and a revised terms of service, but it has not barred underage users from the platform.
Q: What is being done to regulate AI chatbot services?
A: Government officials have introduced bills and laws to regulate AI chatbot services, including requiring safety guardrails and restricting use by minors.