
Italian Consumer Protection Authority Targets DeepSeek for Hallucinations

Jul 1, 2025

Report from Reuters

In Brief – Italy’s joint antitrust and consumer rights regulator, the AGCM, has opened a probe into Chinese AI start-up DeepSeek for allegedly failing to sufficiently warn users about the risk of false information arising from so-called “hallucinations” in its chatbot. The AGCM claims that DeepSeek did not give users “sufficiently clear, immediate and intelligible” warnings about the risk of misleading information in AI-generated content. The regulator describes AI hallucinations as “situations in which, in response to a given input entered by a user, the AI model generates one or more outputs containing inaccurate, misleading or invented information”. In January, another Italian regulator, the country’s data protection authority, ordered DeepSeek to block access to its chatbot in Italy after the company failed to address concerns that it was not complying with the General Data Protection Regulation. DeepSeek argued that it was not subject to local regulation, and the app is reportedly still unavailable in Italian app stores.

Context – Italy’s data protection regulator made a splash in early 2023 by banning OpenAI for a few months for failing to conform with the EU’s privacy law. Among other issues, the regulator questioned how the GDPR applied to hallucinations, a novel concept at the time. The EU has since enacted its comprehensive AI Act to regulate all AI applications, including chatbots. However, regulatory overlaps, in particular involving member state data protection authorities, remain an industry concern. DeepSeek being based in China is also a concern for some governments. On the topic of AI hallucinations, however, it is worth noting that providing very clear warnings seems the most reasonable and sensible policy. All Generative AI (GAI) systems sometimes produce false results, and many of the latest and most advanced systems seem to be plagued more by the problem, not less. In a recent court order dismissing a hallucination-based defamation lawsuit against OpenAI in Georgia, the judge cited the company’s “extensive warnings to users that errors of this kind could occur.”


Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.
