Report from Deadline
In Brief – Los Angeles Superior Court Judge Carolyn Kuhl has rejected a motion for summary judgment filed by Google, Meta, ByteDance and Snap, which asked her to rule that they are not responsible for mental health problems experienced by teen plaintiffs who heavily used their social media platforms. The teens and their families argue that the companies designed their platforms to be addictive and knew that their use could lead to harms including depression, sleep disruption, eating disorders, body dysmorphia, and anxiety. Judge Kuhl had earlier dismissed similar lawsuits filed by public school districts, which argued that social media caused students to act out and imposed significant costs on the schools; she found those claims about student behavior too remote to support liability for the companies. However, she ruled in 2023 that Section 230 of the federal Communications Decency Act does not shield the platforms from liability for direct harms to young users. In her most recent ruling, Kuhl finds that there is evidence the platforms’ features “were capable of causing the type of mental harms allegedly suffered” and that the question of whether they “were a substantial factor” in causing the alleged harms must go to a jury trial in January, which is expected to feature testimony from the company CEOs.
Context – Social media critics have pursued legal strategies to circumvent Sec. 230 for years, including arguing that platform designs encourage “addictive” use. State legislation regulating social media design features is facing generally skeptical federal judges, although not in the 5th Circuit Court of Appeals. Civil lawsuits targeting platforms for faulty and negligent design are rapidly metastasizing and having better luck getting past initial court hurdles. Despite popular and media claims that social media clearly harms teenagers, the federal judge who blocked Utah’s social media law noted in his ruling the distinct lack of evidence that teens are generally harmed by social media, finding instead evidence of mixed and individualized effects, which gives the platforms an evidence-based, if longshot, defense to pursue.
