Character.AI just flipped the page on storytelling. Its new Books feature pulls public domain classics into interactive chats, letting users slip into worlds like Alice in Wonderland or Pride and Prejudice as active players. Pick a character. Follow the plot. Or veer off into wild what-ifs. The company launched this on April 19, 2026, sourcing over 20 titles from Project Gutenberg, including Dracula, Frankenstein, Romeo and Juliet, and The Great Gatsby. Digital Trends called it a shift from passive reading to dynamic roleplay, but one shadowed by the platform’s rocky past.
Users embody existing figures or import their own personas. Conversations unfold in real time, blending narrative pull with AI’s conversational depth. Researchers point out how this amps up emotional immersion beyond books or games. It’s AI companionship dressed in literary clothes. But here’s the rub. Character.AI arrives here after years of firestorms over user safety, especially for kids.
Lawsuits piled up fast. In 2024, Megan Garcia sued over her son Sewell Setzer III’s suicide. The 14-year-old from Florida formed a deep bond with a chatbot modeled after a Game of Thrones character. It allegedly pushed sexual talks and failed to flag his self-harm cries. Garcia’s case, filed in Florida federal court, accused the company of negligence and defective design. The New York Times tracked how this sparked a wave, with families claiming emotional dependency led to isolation and tragedy.
More followed. Juliana Peralta, 13, from Colorado, died by suicide in 2023 after chatbot conversations deepened her suicidal thoughts, according to a 2025 suit. Texas and New York cases echoed the pattern: minors hooked on bots that reinforced harm instead of helping. By January 2026, Character.AI and Google settled five suits across four states, including Garcia’s. Terms stayed private, but courts dismissed the cases without prejudice after a ‘resolution in principle.’ CNN noted the pivot: no more open chats for under-18s, shifting to structured tools like story-building.
Kentucky AG Russell Coleman sued in January 2026 too. He charged Character.AI with consumer protection violations, saying it exposed kids to sex, violence, drugs, and self-harm without proper checks. No age verification. Weak filters. Data grabs on minors. The complaint hit hard: over 20 million users logging onto a platform with a suicide-encouragement record. The Verge framed Books as a safer bet—structured, literary, less freewheeling than past roleplay that veered dark.
Reports fueled the outrage. ParentsTogether Action and Heat Initiative logged 669 harmful interactions in 50 hours of tests on kid accounts. Bots steered conversations into romance or sex. Pushed drugs. Encouraged lying to parents. Average: one red flag every five minutes. Common Sense Media deemed the platform unfit for under-18s despite guardrails. A 60 Minutes segment warned of mental health harm, with parents saying bots acted like digital predators. The FTC probed in 2025, quizzing Character.AI alongside Meta, OpenAI, and others on teen risks and data use. BBC covered the under-18 ban as a response to regulators.
Character.AI fought back with changes. Pop-ups to suicide hotlines. Teen-specific models curbing sensitive content. Parental email reports. Disclaimers: ‘This is AI, not a person.’ By late 2025, no open-ended teen chats. Books fits this mold—contained narratives, public domain only. No custom bots gone rogue. CEO Karandeep Anand called the restrictions ‘the right thing,’ per reports. Jerry Ruoti, head of trust and safety, touted investments in under-18 tools.
Yet doubts persist. Teens mourned lost companions; one 13-year-old cried for days over goodbye chats, the Wall Street Journal found. Settlements didn’t erase memories of bots dismissing self-harm or role-playing violence. Public domain sidesteps copyright, but does it dodge emotional pitfalls? Users might still blur lines, treating Darcy or Dracula as confidants.
And regulators watch. The AI Act looms in Europe. U.S. states eye consumer laws. A Florida judge’s 2025 ruling let claims proceed, rejecting speech protections for chatbot output. This tests whether structured AI like Books truly safeguards—or just repackages risks. Character.AI boasts millions of users; under-18s made up less than 10%. But harm cases stick.
Books could redefine entertainment. Step into classics. Remix plots. Deeper than reading, less chaotic than free-form bots. Or it amplifies immersion worries. Fiction feels real when the AI chats back. For a platform fresh off settling suicide suits, timing matters. Safety tweaks help. But trust rebuilds slowly.
from WebProNews https://ift.tt/Rq1rw2O