Report from the New York Times
In Brief – The Danish government wants to protect citizens from deepfakes by expanding copyright law to give individuals control over their own likenesses. The proposal is described as a proactive effort to combat so-called “deepfake” images, video, and audio of a person. Deepfakes have long been a top AI-related policy concern, with officials typically turning to the criminal code to target specific harms such as nonconsensual pornography, political disinformation, and scams built on celebrity deepfakes. The Danish proposal instead amends the country’s copyright law to make it illegal to share most deepfake images of another person without that person’s consent. Rather than sanctioning the creator of the digital image, the law would give the person whose likeness was copied a legal basis under copyright law to submit a removal request to any digital platform where the material appears, demanding that it be taken down, backed by potential financial compensation for the rights owner. The legislation is undergoing public review before formal submission to the Danish parliament this autumn. Denmark also holds the rotating presidency of the Council of the European Union through the end of the year.
Context – Earlier this year, the US Congress overwhelmingly passed, and President Trump signed, the TAKE IT DOWN Act, which criminalizes the online posting of nonconsensual intimate imagery (NCII) and requires digital platforms to remove it quickly. Political deepfakes are a trickier issue in the US because of the First Amendment. The biggest AI developers have agreed in principle to identify and label AI-generated images created by their services, but many experts consider watermarking of limited value because it can be circumvented and some AI tools do not use the technology. More than 20 states have enacted laws addressing election deepfakes, but none has been enforced, and California’s 2024 election deepfake law was blocked by a federal judge shortly after its enactment. X has sued the State of Minnesota to block enforcement of its election deepfake content law.
