X blocks searches for Taylor Swift due to explicit deepfake images

29 January 2024

AI-generated deepfakes of Taylor Swift, including sexually explicit images, flooded X (formerly known as Twitter) this week, raising concerns about the potential misuse of the technology. In response, beginning Saturday, the social network took the extraordinary step of blocking all searches for Taylor Swift.

Joe Benarroch, X's head of business operations, confirmed the decision to the Wall Street Journal:

This is a temporary action taken with an abundance of caution, as we prioritize safety on this issue.

According to the report, Swift's fans flooded X with real images of the singer after the fake pictures appeared on the social network, and also attempted to report the accounts sharing the deepfakes.

Before X's decision to block searches for Taylor Swift, Microsoft CEO Satya Nadella commented on the flood of AI-generated images of the singer in an interview with NBC News. He said:

Yes, we need to act. I believe that having a safe online world benefits everyone. So I don't believe anyone would desire an internet environment that is absolutely unsafe for both content providers and content consumers. As a result, I believe it is in our best interest to act quickly on this.

According to 404Media, the deepfake images of Taylor Swift shared on X were created by a group using Microsoft Designer, an AI image-generation tool. Microsoft told NBC News that its own investigation into the matter has "not been able to reproduce the explicit images in these reports." However, it added, "Out of an abundance of caution, we have taken steps to strengthen our text filtering prompts and address the misuse of our services."