Report from DigWatch
In Brief – Brazil has ordered X to stop its AI chatbot Grok from generating sexualized images, adding to the global regulatory pressure on Elon Musk’s social media platform and AI chatbot. The directive, issued by Brazil’s National Data Protection Agency, chief prosecutor, and National Consumer Rights Bureau, requires X to implement immediate safeguards preventing the creation of sexualized content involving children, adolescents, or non-consenting adults. Brazilian authorities said that despite X’s claims that it had deleted thousands of posts and suspended hundreds of accounts after an earlier warning, users were still able to produce sexualized deepfakes through Grok. Officials also criticized the company for a lack of transparency in its response. Elon Musk and X have had a fraught relationship with Brazilian authorities, including a lengthy standoff last year with Supreme Court Justice Alexandre de Moraes over a court-ordered crackdown on populist political activism online, which temporarily led to X being shut down in the country before the platform backed down.
Context – Last summer, xAI launched Grok Imagine for paid subscribers. It included a “spicy mode” that permitted the creation of sexually suggestive content, including partial nudity, in keeping with X’s more permissive approach to content moderation. The image tool quickly drew complaints from civil society and child safety advocates, but there was little public notice until Reuters reported on what it called a “flood of nearly nude images of real people” generated by Grok in markets around the world. Although Musk initially dismissed the charge, the company eventually responded to regulatory pressure with what he described as a local-law policy: the AI chatbot would generate only images he said were legal in each region, which in the US he claimed met an “R rating” and depicted “not real” people. The changes did not satisfy critics, who said sexualized images could still be created through Grok, especially via the standalone Grok app, which is not part of X. Defining what counts as “sexualized” or “intimate,” or whether an image resembles a real person, is anything but easy anywhere.
