
EU Investigating How Four More Large Platforms Protect Minors

Oct 1, 2025

Report from Courthouse News

In Brief – The European Commission has asked Snapchat, YouTube, Apple’s App Store and Google Play how they protect children on their platforms, including how they attempt to verify users’ ages and prevent minors from accessing harmful content and illegal products. The new Digital Services Act (DSA) probe follows EU online child safety guidelines adopted in July, which include making minors’ accounts private by default, turning off addictive features like “streaks” and autoplay, and blocking people from taking screenshots of content posted by kids to prevent sexual extortion. Snapchat is being asked to explain how it enforces its rules keeping children under 13 off the app and how it stops minors from buying illegal products. YouTube is being asked about its age verification system and how it decides which videos to show to users of different ages. The Apple and Google app stores face questions about how they rate apps for age-appropriateness and how they then stop minors from downloading apps meant only for adults.

Context – Platforms with over 45 million monthly active users in the EU are designated Very Large Online Platforms (VLOPs) under the DSA and face the strictest regulatory standards, with enforcement by the Commission itself. The regulator has already opened compliance investigations of several social media platforms, including Meta and TikTok, that probe how they protect young people, as well as of four adult sites, focusing on how they verify user ages and keep minors off their platforms. Efforts to protect teens from various online harms are accelerating globally. Other European initiatives include France imposing age checks on porn sites, a Commission-overseen five-country pilot of an age verification app, and Commission President von der Leyen saying the Commission will explore a social media age limit. Australia’s under-16 age limit for social media and its age verification mandate are being watched closely. The country’s regulator is already exploring extending the age limit to platforms with social features, such as gaming sites, and search engines are being ordered to use age verification to block minors from online porn.

Latest Blog
Dutch Regulator Opens Digital Services Act Investigation of Roblox

Report from NL Times In Brief – The Netherlands Authority for Consumers and Markets (ACM) has launched a formal Digital Services Act (DSA) investigation of Roblox over concerns that the online gaming platform may not be doing enough to protect children. The DSA...

EU Commission Moves to Stop Meta from Banning Chatbots on WhatsApp

Report from Wall Street Journal In Brief – The European Commission has informed Meta that it plans to block the company’s ban on third-party AI chatbots from operating over WhatsApp. The antitrust regulator has reached a preliminary finding that Meta’s policy could...

Department of Justice and State AGs Appeal Google Search Remedies Order

Report from Bloomberg In Brief – The US Department of Justice has announced that it notified the US Court of Appeals for the District of Columbia Circuit that it will appeal US District Judge Amit Mehta’s remedies order in the federal antitrust lawsuit that found Google...

Governor Newsom Drops Funding for Media from California State Budget

Report from SFist In Brief – The latest budget proposal from California Governor Gavin Newsom (D) has eliminated funding for the News Transformation Fund, a state initiative to pay millions of dollars to California media companies. The fund was announced in 2024 as...

Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.
