Another Call for an AI Pause, This Time Against “Superintelligence”

Nov 1, 2025

Report from the Financial Times

In Brief – A group of over 1,000 individuals, led by Nobel laureates, policymakers, and celebrities, has signed a public statement calling for the development of AI “superintelligence” to be halted until there is “broad scientific consensus that it will be done safely and controllably” and “strong public buy-in.” Among the highest-profile signatories are top AI scientists Geoffrey Hinton and Yoshua Bengio, tech titans Sir Richard Branson and Steve Wozniak, public policy heavyweights Steve Bannon, Glenn Beck, and Susan Rice, and celebrities including the Duke and Duchess of Sussex. Many of the same notables signed a letter in March 2023 calling for a six-month pause in training any AI models more powerful than GPT-4, which was also organized by the Future of Life Institute, a nonprofit organization that has long warned of a wide range of AI risks. FLI’s president noted that the latest effort does not call for a halt to all AI development and that many positive applications can be achieved through specialized services, but he claims that superintelligence is uniquely risky and that most Americans, per recent polling released by the group, favor its regulation.

Context – We responded to FLI’s initial 2023 campaign by noting that many of the signers may be highly educated but know little about government. We were not surprised when no pause followed. By fall, the trend went in the other direction, with President Trump staking out a pro-industry campaign position that is now reflected in his anti-regulatory AI Action Plan. California’s Governor soon vetoed major AI safety legislation, and the UK and Japan both stepped back from AI regulation. In Europe, the pace of AI rules is being criticized by top leaders, and the European Commission is rolling out plans to spur more EU-made AI. The most serious AI risk seems to be the prospect of hype and investment “bubbles,” not superintelligence, and the top policy issue is whether copyright holders will get paid for content used in AI training.


Platform Economy Insights produces a short email four times a week that reviews two top stories with concise analysis. It is the best way to keep on top of the news you should know. Sign up for this free email here.