Explicit AI-generated photos of Taylor Swift were released, angering her fan community, not least because there are no legal regulations for handling cases in which a person's image is abused in this way.

On January 25, the New York Post reported that photos created with deepfakes – synthetic media in which one person's likeness is digitally swapped for another's – depict Taylor Swift in various sexually explicit positions at a Kansas City Chiefs game, a nod to the “Country Music Princess's” newly public relationship with football player Travis Kelce.

It is currently unclear who created the explicit photos or who first shared them on X (formerly Twitter). As of the morning of January 25 (US time), the keyword “Taylor Swift AI” was trending on the platform, with more than 58,000 posts.

Taylor Swift became a victim of deepfake technology.

The incident has upset Swifties (Taylor Swift's fan community), who object to seeing their idol treated as entertainment. They have shared a series of posts calling for justice and protection for the “Safe & Sound” singer:

“How is this not called sexual assault? We're talking about a woman's body/face being used for purposes she would likely never allow or feel comfortable with. Why is there no law to prevent this behavior?”

“When I saw AI photos of Taylor Swift, I couldn’t believe my eyes. Those pictures are disgusting.”

“Protect Taylor Swift and stop sharing AI porn pics. The people who created them are sick.”

Meanwhile, some believe that, wrong as it is, Taylor Swift's case has partly awakened the public to the downsides of artificial intelligence.

“I think we should be happy that Taylor Swift has inadvertently drawn public attention to the dangers created by AI. It's sad that it took this long for people to pay attention. There are other stories too: boys at a New York high school used deepfakes to make porn of more than 30 girls; Marvel actress Xochitl Gomez (17 years old) found 18+ photos of herself created by AI on social networks; streamer QTCinderella was distressed when AI pornographic photos of her spread on the Internet,” a verified X account wrote.

The Daily Mail, citing a separate source, reported that Taylor Swift is considering legal action against the deepfake website that created the objectionable AI images of her.

Taylor Swift has not spoken out officially but is said to be considering legal action over the manipulation of her image.

In fact, the US government has begun taking concrete steps to prevent the use of AI for malicious purposes.

In October 2023, US President Joe Biden signed an executive order aimed at preventing AI from creating child sexual abuse material or non-consensual intimate images of real individuals. The order also directs the federal government to develop guidance on watermarking and labeling AI-generated content.

Non-consensual deepfake pornography is already illegal in Texas, Minnesota, New York, Hawaii and Georgia. However, the lack of a nationwide legal framework has allowed AI-generated nude images to circulate in high schools in New Jersey and Florida.

In mid-January, US Congressmen Joseph Morelle and Tom Kean introduced to the US House Judiciary Committee a bill aimed at preventing deepfakes of intimate images. The bill would make the non-consensual sharing of digitally altered pornographic images a federal crime, punishable by jail time, fines, or both. If the bill passes, victims would also be able to sue perpetrators in civil court.