Taylor Swift is ‘furious’ about AI nude images

Taylor Swift is reportedly "furious" that AI-generated nude images of her surfaced on social media on Thursday, and is said to be considering legal action against the site that generated the photos.

"Whether or not legal action will be taken is being decided, but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative, and done without Taylor's consent or knowledge," a source close to the 34-year-old pop star told the Daily Mail.

"Taylor's family and friends are furious, and of course her fans are as well. They have a right to be, and every woman should be," the source added, noting that the X account that originally posted the images "no longer exists."

According to the news site, the source close to Swift added: "The door needs to be closed on this. Legislation needs to be passed to prevent this."

After the photos took the internet by storm Thursday morning, Swifties banded together and tried to bury the images by flooding the platform with positive posts about the "Shake It Off" singer.

A source close to Swift told the Daily Mail that the singer is considering legal action after the X account @FloridaPigMan used artificial intelligence to generate nude images of her. AP

The photo in question showed Swift in a sexualized position at a Kansas City Chiefs game, a nod to her highly publicized romance with the team's tight end, Travis Kelce. As of Thursday afternoon, it appeared to have been pulled from the platform.

The crude images were traced back to an account with the handle @FloridaPigMan, but a search for that handle on X no longer returns any results.

The account reportedly obtained the images from Celeb Jihad, a site that boasts a collection of fake pornographic images, or "deepfakes," made using celebrities' likenesses.

The account that posted the images to X was suspended for violating the platform's rules, according to The Verge.

The image was posted on X for about 17 hours before it was removed, The Verge reported.

A source close to Swift told the Daily Mail: "It is shocking that any social media platform would allow these posts to go up in the first place."

X's help center outlines policies regarding “synthetic and manipulated media” and “nonconsensual nudity,” both of which prohibit posting X-rated deepfakes on the site.

The explicit images of Swift remained public for about 17 hours before being removed from X. In that time, they were viewed more than 45 million times and reposted approximately 24,000 times. AFP via Getty Images

Representatives for X did not immediately respond to The Post's request for comment. Swift has also yet to comment on the image in question.

However, her fan base has plenty of opinions on the matter, slamming lawmakers for not enacting stricter policies on the use of AI.

Some said the incident exposed a larger problem. "… It needs to be regulated because it is done without consent," one X user wrote.

"I wish the public had been horrified by AI porn before Taylor Swift was involved," said another. "The sad thing is that this may be the only thing that moves the needle and gets a crackdown on track. Never say she doesn't have the power to drive change."

Because regulations regarding AI vary from state to state, it is unclear what kind of charges could be filed against @FloridaPigMan, or whether Swift will sue Celeb Jihad, X, or the user behind the X account.

Swift previously threatened legal action after Celeb Jihad shared another fake nude image of her in 2011, but the site insisted it was "satire" rather than pornography and carried a disclaimer, and, according to the Daily Mail, nothing came of the threat.

The songstress' loyal Swiftie brigade said the incident shined a light on a larger issue surrounding deepfakes, with women both famous and unknown falling victim to the practice.

Nonconsensual deepfake pornography has been made illegal in Texas, Minnesota, New York, Hawaii and Georgia, but those laws did not prevent incidents at high schools in New Jersey and Florida, where AI-generated nude images of female students were circulated by their male classmates.

Last week, Rep. Joseph Morelle (D-N.Y.) and Rep. Tom Kean (R-N.J.) reintroduced a bill that would make the nonconsensual sharing of digitally altered pornographic images a federal crime, punishable by jail time, fines or both.

The Preventing Deepfakes of Intimate Images Act has been referred to the House Judiciary Committee, which has yet to act on the bill.

In addition to criminalizing the sharing of digitally altered intimate images, Morelle and Kean's bill would also allow victims to sue perpetrators in civil court.
