Report from Mountain Top Media
In Brief – Kentucky Attorney General Russell Coleman has sued the maker of the AI companion chatbot Character.AI, alleging that the platform harms children and misleads consumers about its safety. Coleman’s office argues the company marketed Character.AI, a role-playing chatbot platform that lets users create custom characters often based on celebrities or pop culture figures, as safe and age-appropriate despite failing to implement effective age verification, parental controls, or meaningful safeguards for minors. The complaint alleges that this conduct violated the Kentucky Consumer Protection Act by exposing minors to chatbots that shared sexually explicit content, encouraged drug and alcohol use, promoted eating disorders, and provided unlicensed mental-health advice. The lawsuit further claims the company collected and monetized personal data from Kentucky children without adequate disclosure. The attorney general is seeking injunctive relief, civil penalties, and the disgorgement of profits.
Context – Alleged harms to young people from engaging with AI “companions” have emerged as an AI version of the worst of social media. Character.AI and its partner Google recently agreed to settle lawsuits over teen suicide and self-harm brought by victims’ families in Florida, Colorado, Texas and New York. The settlements followed a decision last May by federal Judge Anne Conway in Florida rejecting Character.AI and Google’s First Amendment-based motion to dismiss and questioning whether chatbot output is speech at all. Character.AI later announced that it would bar users under 18 from its chatbots and enforce the rule with age verification technology. While Section 230 has largely shielded social media platforms from liability for user-generated content, Supreme Court Justice Neil Gorsuch has opined that AI services probably are not covered by the law. California enacted legislation late last year requiring companion chatbot developers to ensure that users are not misled into believing they are interacting with a human.
