Taylor Swift is reportedly “furious” about AI-generated, NSFW images of her that have been circulating on social media. The photos — which depicted Swift in various sexual positions — began making the rounds on Wednesday and went viral on Thursday morning.
According to a report from the Daily Mail, a source close to the pop star said she is considering possible legal action against the website responsible for generating the images, Celeb Jihad.
“Taylor’s circle of family and friends are furious, as are her fans, obviously. They have the right to be, and every woman should be,” the source said.
“The door needs to be shut on this. Legislation needs to be passed to prevent this, and laws must be enacted. It is shocking that the social media platform even let them be up to begin with.”
Taylor Swift is said to be considering legal action against the deepfake website that generated explicit AI images of her which circulated online, Daily Mail reports. pic.twitter.com/fQ961NdZTU
— Pop Base (@PopBase) January 25, 2024
Swift’s fans also came to her defense on social media, slamming the website responsible for generating the “disgusting” photos and the accounts that were sharing them. The account where the images originated, @FloridaPigMan, “no longer exists.”
I hope there’s a place in hell specially reserved for people who thinks making explicit photos of women is ok and acceptable. I hope that hell is scary and torturous
PROTECT TAYLOR SWIFT
TAYLOR SWIFT AI pic.twitter.com/Hg2rkhco9O
— sai | NOLA 10/25 (@ttearsinexile) January 25, 2024
The images remained on X for approximately 17 hours before the account was removed and were viewed more than 45 million times.
If Swift does take legal action, it will be interesting to see whether she files the lawsuit against the person behind the @FloridaPigMan account, X, or Celeb Jihad.
Nonconsensual deepfake pornography is currently illegal in Texas, Minnesota, New York, Hawaii, and Georgia.
