Pop star Taylor Swift is “furious” that fake nudes of her generated with artificial intelligence (AI) technology have spread across social media platforms and is considering taking legal action against the website that first hosted them.

“Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” a source close to Swift told The Daily Mail.

“Taylor’s circle of family and friends are furious, as are her fans obviously. They have the right to be, and every woman should be,” they added.


The X account that originally posted the material, @FloridaPigMan, has since been taken down, the source acknowledged. The source also called on the government to act: “The door needs to be shut on this. Legislation needs to be passed to prevent this and laws must be enacted.”

Taylor Swift fans—colloquially known as “Swifties”—mounted a counteroffensive against the images by flooding X’s algorithm with celebratory posts about the 34-year-old billionaire.

The original propagator of the images was a website called Celeb Jihad, which functions as an aggregator of “deepfake” or manipulated pornographic images of celebrities. The user who uploaded the photos to X—where the post received over 45 million views and was re-shared over 20,000 times—was suspended for violating the app’s content policies, per tech news site The Verge. X officially prohibits material that falls under the categories of “synthetic and manipulated media” or “nonconsensual nudity.”


But the source close to Swift was not impressed with the scrubbing, which came 17 hours after the images were initially uploaded. “It is shocking that the social media platform even let them be up to begin with,” they told The Daily Mail.

Last week, Representatives Joseph Morelle (D-NY) and Tom Kean (R-NJ) reintroduced a bill, titled the “Preventing Deepfakes of Intimate Images Act,” that would make the non-consensual sharing of digitally altered pornography a federal crime, punishable by jail time, fines, or both. The House Committee on the Judiciary has yet to act on the bill.

Shane Devine is a writer covering politics, economics, and culture for Valuetainment. Follow Shane on X (Twitter).
