UK Ofcom Issues Rules Requiring Platforms to Protect Younger Users

May 5, 2025

Report from The Guardian

In Brief – Ofcom, the UK regulator charged with enforcing the Online Safety Act (OSA), has announced a set of new rules for online services likely to be used by young people, including the largest social media platforms, search engines, and gaming sites. Dubbed the “Children’s Code”, the new requirements for the “riskiest” services, such as big social media platforms, include implementing “highly effective” age checks to identify users under age 18, tailoring recommendation algorithms to filter a wide range of harmful material out of younger users’ feeds, and instituting effective procedures to report and remove dangerous content quickly. Age verification technology plays a central role in the new regime, with Ofcom’s guidance suggesting that platforms verify ages by checking with banks or mobile network operators, or by using photo-ID matching or facial age estimation software. While the UK’s digital minister called the rules a “watershed moment”, some online safety campaigners criticized Ofcom as “overly cautious”. Covered platforms must meet the new requirements by July 25.

Context – Efforts to regulate online platforms to “protect” teenagers are an increasingly global phenomenon. Online porn is consistently a top target, but social media more broadly is rapidly facing similar demands. France already requires age verification for adult sites and is working to impose age-based requirements on social media, including requiring parental approval for users under age 15. The US Supreme Court heard oral arguments in January on a Texas law requiring age checks for online porn, and its decision will further inform US courts scrutinizing the flood of US state laws regulating social media to “protect” teens, laws that would operationally depend on age checks. The EU’s Digital Services Act, called a “regulatory cousin” of the UK OSA, directs platforms on how to address a wide range of objectionable content, and Commission regulators are investigating how TikTok and Meta’s top platforms protect younger users. Countries across Asia are actively considering online age limits, and Australia has set a firm minimum age of 16 for social media platforms other than YouTube.

Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.
