
EU Commission Releases DSA Guidelines on the Protection of Minors

Jul 1, 2025

Report from Euronews

In Brief – The European Commission has released its guidelines on the protection of minors under the EU Digital Services Act, the landmark legislation enacted in 2022 that directs digital platforms to reduce access to objectionable content online. The new guidelines give online platforms further direction on how to protect minors, addressing issues such as addictive design, cyberbullying, harmful content, privacy, and security. Recommendations include setting minors’ accounts to private by default, disabling features that contribute to excessive use, such as certain kinds of notifications and the tallying of streaks, and prohibiting accounts from downloading or taking screenshots of content posted by minors. Article 28 of the DSA directs platforms to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service”. Last year the Commission opened DSA investigations of Meta’s Facebook and Instagram, as well as TikTok, that included preliminary concerns that the platforms had failed to comply with Article 28 and questioned their age-assurance and verification methods. Those investigations are ongoing.

Context – Yes, the Commission’s guidance for digital platforms to protect minors came out a year after DSA investigations of top social media platforms on the issue began. But we’re focused on age verification here. The Commission simultaneously announced a pilot project involving Denmark, France, Greece, Italy, and Spain, which will roll out a prototype online age verification app that the Commission claims will protect the privacy of users. The Commission hopes to integrate age verification functionality into the EU Digital Identity (eID) Wallet planned for next year. Online age verification mandates and technologies are moving forward in numerous jurisdictions, including France, the UK, the US, and Australia. While porn sites are generally the first targets, social media soon follows. In Australia, where imposing a 16-year-old age limit on social media was the initial focus, search engines will be required to age-check users as well in December.


Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.
