Report from Reuters
In Brief – Elon Musk’s artificial intelligence company xAI has announced that it is preventing Grok, the AI chatbot integrated into X, from editing images of real people to place them in revealing clothing such as bikinis. The move is a major retreat by the company on an AI image-generating feature that let users create and post realistic manipulated images, including depictions of women, and sometimes children, in revealing clothing and in degrading, sexualized poses. The ease of creating such images triggered a growing backlash in recent weeks from government officials in Europe, Asia, and the United States, notably the Governor and Attorney General of California, many of whom alleged that the images were illegal and threatened to shut down the platform.
Context – In Ernest Hemingway’s novel The Sun Also Rises, Mike Campbell explains that he went bankrupt “gradually and then suddenly.” Last summer, xAI launched Grok Imagine, an image tool initially available only to paid subscribers. It included a “spicy mode” that permitted sexually suggestive content, including partial nudity, in keeping with X’s more permissive content moderation. The tool quickly drew complaints from civil society and child safety advocates. Cut ahead to January 3, 2026: Reuters reported on what it called a “flood of nearly nude images of real people” generated by Grok in markets around the world. Although Musk initially dismissed the charge, the company soon said it would tighten “guardrails” and later blocked non-subscribers from the tool entirely. The changes did not satisfy critics, who said “sexualized” images could still be created. In 1964, US Supreme Court Justice Potter Stewart said of pornography, “I know it when I see it.” Testers probing the latest change report that Grok on X would no longer alter photos but the standalone Grok app still would, and Musk defended a policy of following local law, with the service generating only images that are legal in each region; in the United States, he claimed, the output met an “R rating” and depicted people who were “not real.” Defining “sexualized” and “intimate,” or deciding whether an image looks like a real person, is anything but straightforward, anywhere.
