
Florida’s Online Content Moderation Law has Problems in Court (Again)

Jul 1, 2025

Report from the Miami Times

In Brief – Federal District Judge Robert Hinkle has rejected the State of Florida’s motion to dismiss a lawsuit challenging its 2021 law regulating how social media companies engage in content moderation. Hinkle has overseen the First Amendment-based challenges brought by tech industry trade associations for nearly four years, siding with them in his original ruling, which was upheld by the 11th Circuit Court of Appeals. However, the US Supreme Court found that Hinkle’s review of the challenge to the Florida law, along with a 5th Circuit case involving a similar Texas law, had not sufficiently considered the full range of digital platforms that might be implicated, and directed both courts to dive deeper. In rejecting Florida’s latest effort to dismiss the tech groups’ complaint, Hinkle wrote, “The defendants have not attempted to explain what these provisions really mean or how they would be applied. Nor have the defendants offered any theory under which a state can preclude this kind of curating without violating the First Amendment.” He also noted that while the law may not be unconstitutional on its face in all possible applications, many of its provisions likely violate constitutional protections when applied to specific companies.

Context – Republican-led Florida and Texas were the first states to try to regulate social media content moderation, arguing that progressive Big Tech companies were stifling conservatives. New York and California followed with laws pushing platforms to better police hate speech and harassment. Although the Supreme Court’s decision in Moody v. NetChoice ordered another review of Florida’s law, a majority of the justices said that the law likely violated the First Amendment when applied to traditional social media platforms. Earlier this year, California settled a challenge to its content moderation law brought by X, agreeing that the Moody decision precluded directing content moderation by traditional social media platforms. Hinkle’s latest ruling reiterates that, once past procedural questions, there are major problems with regulating social media content moderation under current First Amendment jurisprudence.


Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.
