EU Commission Releases DSA Guidelines on the Protection of Minors

Jul 1, 2025

Report from Euronews

In Brief – The European Commission has released its guidelines on the protection of minors under the EU Digital Services Act, landmark legislation enacted in 2022 directing digital platforms to reduce access to objectionable content online. The new guidelines provide online platforms with further guidance on how to protect minors by addressing issues such as addictive design, cyberbullying, harmful content, privacy, and security. Recommendations include setting minors’ accounts to private by default, disabling features that contribute to excessive use, such as various kinds of notifications and the tallying of streaks, and preventing other accounts from downloading or taking screenshots of content posted by minors. Article 28 of the DSA directs platforms to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.” Last year the Commission opened DSA investigations of Meta’s Facebook and Instagram platforms, as well as TikTok, citing preliminary concerns that they failed to comply with Article 28 and questioning the platforms’ age-assurance and verification methods. Those investigations are ongoing.

Context – Yes, the Commission’s guidance for digital platforms to protect minors came out a year after DSA investigations of top social media platforms on the issue began. But we’re focused on age verification here. The Commission simultaneously announced a pilot project in which Denmark, France, Greece, Italy, and Spain will roll out a prototype online age-verification app that the Commission claims will protect the privacy of users. The Commission hopes to integrate age-verification functionality into the EU Digital Identity (eID) Wallet planned for next year. Online age-verification mandates and technologies are moving forward in numerous jurisdictions, including France, the UK, the US, and Australia. While porn sites are generally the first targets, social media soon follows. In Australia, where imposing a 16-year-old age limit on social media was the initial focus, search engines will be required to age-check users as well starting in December.

Latest Blog
Swedish Court Delays Ruling in PriceRunner v Google Damages Case

Report from Crowdfund Insider In Brief – Sweden’s Patent and Market Court has delayed until June 10th its decision in a major antitrust damages case between comparison shopping site PriceRunner, which is owned by Sweden-based fintech company Klarna, and Google, citing...

Indonesia Warns YouTube Over Not Complying with Age Limit Rules

Report from Business Today In Brief – Indonesia has issued a formal reprimand to Google over noncompliance by its YouTube service with the country’s new child-protection rules for social media platforms that took effect March 28. The regulations require “high-risk”...

Meta Sued for Addictive Design in Denmark by Nonprofit Association

Report from Anadolu News In Brief – A Danish non-profit association, SOMI, has filed a lawsuit against Meta in Denmark on behalf of parents and children, alleging that the company’s platforms cause psychological harm to minors. The complaint in Copenhagen City Court...

Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to stay on top of the news you should know. Sign up for this free email here.