“Something went wrong. Try reloading.” If you’ve tried searching for “Taylor Swift” on X (formerly Twitter) since Saturday, you’ve likely encountered this message.
It’s not #TwitterDown, but a deliberate measure taken by Elon Musk’s social platform in response to a surge of AI-generated deepfakes depicting the artist in sexually explicit poses and situations. After both SAG-AFTRA (the U.S. union representing about 170,000 media and entertainment industry workers) and the White House expressed concern about the dangers of deepfake pornography spreading online, X implemented, not without some delay, a measure intended to limit the circulation of these images by blocking all search results related to Swift’s name.
The block has loopholes, however: as some users have pointed out, other search strings containing the singer’s name, such as “Taylor AI Swift,” still return results. According to Joe Benarroch, Head of Business Operations at X, “it’s a temporary decision made with careful consideration. Our priority right now is safety.”