Taylor Swift’s name yields no search results on X days after explicit deepfake images go viral

The AI-generated images began circulating on the platform on Wednesday.

Days after sexually explicit deepfake images of Taylor Swift went viral on X (formerly Twitter), a search for her name on the platform has stopped yielding results.

As of Saturday afternoon, all searches for “Taylor Swift” trigger a message that reads, “Something went wrong. Try reloading.”

Representatives for Swift and X did not immediately respond to EW’s request for comment.

The result of a search for Taylor Swift’s name on X.

The search stoppage comes after AI-generated deepfake images depicting Swift in pornographic scenarios began circulating on the platform on Wednesday.

The images had been viewed more than 27 million times and accrued over 260,000 likes in the span of 19 hours before the account that shared them was suspended, NBC News reports.

While it’s unclear where the images came from, the outlet noted that they reportedly featured a watermark suggesting they originated on a website known for creating fake celebrity nudes.

Microsoft CEO Satya Nadella told NBC News on Friday that “guardrails” need to be in place when it comes to AI technology “so that there’s more safe content that’s being produced… And there’s a lot to be done and a lot being done there.”

That same day, SAG-AFTRA called the images “upsetting, harmful, and deeply concerning,” reported Variety. “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the organization said. “As a society, we have it in our power to control these technologies, but we must act now before it is too late.”

At a White House press briefing on Friday, a reporter asked if President Biden supported legislation to ban explicit AI-generated images.

“It is alarming,” said Karine Jean-Pierre, the White House press secretary. “We are alarmed by the reports of the circulation of images that you just laid out… There should be legislation, obviously, to deal with this issue.”