
Character.AI Halts Teen Chats After Tragedies: ‘It’s the Right Thing to Do’

2025/10/31 02:20

In brief

  • Character.AI will remove open-ended chat features for users under 18 by November 25, shifting minors over to creative tools like video and story generation.
  • The move follows last year’s suicide of 14-year-old Sewell Setzer III, who developed an obsessive attachment to a chatbot on the platform.
  • The announcement comes as a bipartisan Senate bill seeks to criminalize AI products that groom minors or generate sexual content for children.

Character.AI will ban teenagers from chatting with AI companions by November 25, ending a core feature of the platform after facing mounting lawsuits, regulatory pressure, and criticism over teen deaths linked to its chatbots.

The company announced the changes after “reports and feedback from regulators, safety experts, and parents,” removing “the ability for users under 18 to engage in open-ended chat with AI” while transitioning minors to creative tools like video and story generation, according to a Wednesday blog post.

“We do not take this step of removing open-ended Character chat lightly—but we do think that it’s the right thing to do,” the company told its under-18 community.

Until the deadline, teen users face a two-hour daily chat limit that will progressively decrease.

The platform faces lawsuits including one from the mother of 14-year-old Sewell Setzer III, who died by suicide in 2024 after forming an obsessive relationship with a chatbot modeled on the "Game of Thrones" character Daenerys Targaryen. The company also had to remove a bot impersonating murder victim Jennifer Ann Crecente after complaints from her family.

AI companion apps are “flooding into the hands of children—unchecked, unregulated, and often deliberately evasive as they rebrand and change names to avoid scrutiny,” Dr. Scott Kollins, Chief Medical Officer at family online safety company Aura, shared in a note with Decrypt.

OpenAI said Tuesday about 1.2 million of its 800 million weekly ChatGPT users discuss suicide, with nearly half a million showing suicidal intent, 560,000 showing signs of psychosis or mania, and over a million forming strong emotional attachments to the chatbot.

Kollins said the findings were “deeply alarming as researchers and horrifying as parents,” noting the bots prioritize engagement over safety and often lead children into harmful or explicit conversations without guardrails.

Character.AI has said it will implement new age verification using in-house models combined with third-party tools, including Persona.

The company is also establishing and funding an independent AI Safety Lab, a non-profit dedicated to innovating safety alignment for AI entertainment features.

Guardrails for AI

The Federal Trade Commission issued compulsory orders to Character.AI and six other tech companies last month, demanding detailed information about how they protect minors from AI-related harm.

"We have invested a tremendous amount of resources in Trust and Safety, especially for a startup," a Character.AI spokesperson told Decrypt at the time, adding: "In the past year, we've rolled out many substantive safety features, including an entirely new under-18 experience and a Parental Insights feature."

"The shift is both legally prudent and ethically responsible," Ishita Sharma, managing partner at Fathom Legal, told Decrypt. "AI tools are immensely powerful, but with minors, the risks of emotional and psychological harm are nontrivial."

Pending regulation, "proactive industry action may be the most effective defense against both harm and litigation," Sharma added.

A bipartisan group of U.S. senators introduced legislation Tuesday called the GUARD Act that would ban AI companions for minors, require chatbots to clearly identify themselves as non-human, and create new criminal penalties for companies whose products aimed at minors solicit or generate sexual content.


Source: https://decrypt.co/346770/character-ai-halts-teen-chats-after-tragedies-its-the-right-thing-to-do

