Snapchat Facing an EU Probe for Failing to Protect Minors

Apr 10, 2026

Report from Politico

In Brief – The European Commission, acting under its authority as the lead enforcer of the Digital Services Act (DSA) for the largest online platforms, has announced an investigation into concerns that Snapchat is failing to adequately protect children. The investigation will examine multiple issues: the platform’s age verification system, which relies on user self-declarations; whether the platform sufficiently prevents harmful contact with minors, such as grooming, exploitation, or criminal recruitment; and the sale of illegal or age-restricted goods. Officials are also scrutinizing Snapchat’s default settings, including friend recommendations and persistent notifications, which may expose minors to risks and encourage excessive use. The Commission says the probe is based on its analysis of Snap’s DSA risk assessment reports measured against the 2025 DSA Guidelines on the protection of minors, as well as information gathered by the Dutch Authority for Consumers and Markets, which has investigated the platform under its authority to enforce the DSA in the Netherlands.

Context – Protecting teens from online harms is a global digital policy priority. Several EU countries have put the issue at the top of their agendas and are pressuring the Commission to do more. The DSA text defines online risks only at a very high level, such as “any actual or foreseeable negative effects in relation to… minors”; the specifics get worked out platform by platform through investigations. Very Large Online Platforms (VLOPs), of which there are currently 25, face the strictest rules and direct enforcement by the Commission. Active investigations are underway against Facebook, Instagram, TikTok, X, AliExpress, Temu, Apple, Google, Booking and Microsoft, as well as four adult sites. Failing to effectively protect young users is part of the probes of each of the social media and adult content platforms. For perspective on the pace of these investigations, the TikTok probe began in February 2024 and the Facebook and Instagram probes were opened in May 2024; each is still underway.
