Fake pornographic photos of Taylor Swift are breaking the internet

How to deal with deepfakes has become one of the central challenges of moderation in the age of artificial intelligence. The question is all the more pressing now that AI-generated pornographic images of superstar Taylor Swift are actively circulating on X (formerly Twitter), as reported by The Verge this Thursday.

45 million views

The fake images generated considerable interest on social networks, racking up over 45 million views, hundreds of thousands of likes and 24,000 reposts. The original post and the X account that published it were only removed from the platform after more than 24 hours.

According to our colleagues, this delayed response is a direct result of mass layoffs in the moderation teams and of relaxed rules within the company, even though the platform's own rules state that "the publication of images of non-consensual nudity (NCN) is strictly prohibited".

“Swifties” hit back

Overwhelmed by the wave of shares, Taylor Swift fans decided to counterattack to protect their idol's image. These "Swifties", as they call themselves, used the hashtag under which the images were circulating to flood social networks with authentic, non-explicit pictures of the artist.

Taylor Swift’s supporters also called out Elon Musk, demanding the immediate removal of these images, which could damage the 34-year-old singer’s reputation. A request that X eventually granted, after a long delay.

According to information from 404 Media, the images originate from a particularly active group on the Telegram messaging app, whose members reportedly share media created with Microsoft's Designer AI tool.
