
EU Commission Releases DSA Guidelines on the Protection of Minors

Jul 1, 2025

Report from Euronews

In Brief – The European Commission has released its guidelines on the protection of minors under the EU Digital Services Act, the landmark legislation enacted in 2022 that directs digital platforms to reduce access to objectionable content online. The new guidelines give online platforms further guidance on how to protect minors, addressing issues such as addictive design, cyberbullying, harmful content, privacy, and security. Recommendations include setting minors’ accounts to private by default, disabling features that contribute to excessive use, such as various kinds of notifications and the tallying of streaks, and preventing other accounts from downloading or taking screenshots of content posted by minors. Article 28 of the DSA directs platforms to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service”. Last year the Commission opened DSA investigations of Meta’s Facebook and Instagram platforms, as well as TikTok, that included preliminary concerns that they failed to comply with Article 28 and questions about the platforms’ age-assurance and verification methods. Those investigations are ongoing.

Context – Yes, the Commission’s guidance for digital platforms to protect minors came out a year after DSA investigations of top social media platforms on the issue began. But we’re focused on age verification here. The Commission simultaneously announced a pilot project involving Denmark, France, Greece, Italy, and Spain, which will roll out a prototype online age verification app that the Commission claims will protect the privacy of users. The Commission hopes to integrate age verification functionality into the EU Digital Identity (eID) Wallet planned for next year. Online age verification mandates and technologies are moving forward in numerous jurisdictions, including France, the UK, the US, and Australia. While porn sites are generally the first targets, social media soon follows. In Australia, where imposing a 16-year-old age limit on social media was the initial focus, search engines will be required to run age checks as well in December.

Latest Blog
Dutch Regulator Opens Digital Services Act Investigation of Roblox

Report from NL Times In Brief – The Netherlands Authority for Consumers and Markets (ACM) has launched a formal Digital Services Act (DSA) investigation of Roblox over concerns that the online gaming platform may not be doing enough to protect children. The DSA...

EU Commission Moves to Stop Meta from Banning Chatbots on WhatsApp

Report from Wall Street Journal In Brief – The European Commission has informed Meta that it plans to block the company’s ban on third-party AI chatbots from operating over WhatsApp. The antitrust regulator has reached a preliminary finding that Meta’s policy could...

Department of Justice and State AGs Appeal Google Search Remedies Order

Report from Bloomberg In Brief – The US Department of Justice has announced that it notified the US Court of Appeals for the District of Columbia Circuit that it will appeal US District Judge Amit Mehta’s remedies order in the federal antitrust lawsuit that found Google...

Governor Newsom Drops Funding for Media from California State Budget

Report from SFist In Brief – The latest budget proposal from California Governor Gavin Newsom (D) has eliminated funding for the News Transformation Fund, a state initiative to pay millions of dollars to California media companies. The fund was announced in 2024 as...

Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.
