Report from the New York Times
In Brief – The European Commission has announced its preliminary finding that Meta is failing to adequately prevent children under 13 from accessing Instagram and Facebook, potentially violating the Digital Services Act. EU regulators concluded that Meta lacks effective systems to verify users’ ages, relying too heavily on self-reported birthdates that are easy to falsify. They also criticized the company’s reporting tools for underage users as overly complex and ineffective, noting that flagged accounts are often not reviewed or removed. The regulators allege that 10–12% of children under 13 in the EU use these platforms, figures the company disputes. Protecting young people is a high-level directive for digital platforms under the DSA, and the same issue has been folded into investigations of Snap, TikTok and X. Beyond disputing the findings, Meta argues that age verification is an industry-wide challenge, says it invests heavily in systems for detecting and removing underage users, and plans to introduce additional measures soon.
Context – I can’t help but hear the EU regulators as Casablanca’s Captain Renault: they are “shocked, shocked” that kids have been using Instagram. And YouTube, Snapchat and all the other top digital platforms? Their parents know, and many of them appreciate the digital distractions. The drive to protect teens from online ills is a global digital policy phenomenon. Several EU countries have put the issue at the top of their agendas and are pressuring the Commission to do more. But this enforcement action is not simply a response to that pressure. The Commission recently rolled out its own age verification app, which uses government ID documents and face scans, and is discussing a social media age limit of 15 or 16. The DSA text, meanwhile, defines online risks at a very high level, such as “any actual or foreseeable negative effects in relation to… minors”. Most of the top social media platforms are now under investigation for not complying with the law, including failing to effectively protect young users by meeting a new technical and process threshold for enforcing their age rules. Round up the usual suspects.
