Graphic AI-generated images of Taylor Swift appeared online this week.
Millions came across fake sexually explicit AI-generated images of Taylor Swift on social media this week, underscoring for many the need to regulate potential nefarious uses of AI technology.
The White House Press Secretary told ABC News Friday that the administration is “alarmed” by what happened to Swift online and that Congress “should take legislative action.”
“We are alarmed by the reports of the…circulation of images that you just laid out – of false images to be more exact, and it is alarming,” White House Press Secretary Karine Jean-Pierre told ABC News White House Correspondent Karen L. Travers.
“While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people,” she added.
Taylor Swift performs onstage during “Taylor Swift | The Eras Tour” at Allianz Parque on Nov. 24, 2023 in Sao Paulo.
Buda Mendes/tas23/Getty Images
Jean-Pierre highlighted some of the actions the administration has taken recently on these issues, including launching a task force to address online harassment and abuse, and the Department of Justice’s launch of the first national 24/7 helpline for survivors of image-based sexual abuse.
And the White House is not alone: outraged fans were surprised to find out that there is no federal law in the U.S. that would prevent or deter someone from creating and sharing non-consensual deepfake images.
But just last week, Rep. Joe Morelle renewed a push to pass a bill that would make nonconsensual sharing of digitally-altered explicit images a federal crime, with jail time and fines.
“We’re certainly hopeful the Taylor Swift news will help spark momentum and grow support for our bill, which as you know, would address her exact situation with both criminal and civil penalties,” a spokesperson for Morelle told ABC News.
A Democrat from New York, the congressman authored the bipartisan “Preventing Deepfakes of Intimate Images Act,” which has been referred to the House Committee on the Judiciary.
Deepfake pornography is often described as image-based sexual abuse — a term that also includes the creation and sharing of non-fabricated intimate images.
A few years ago, creating AI-generated content required a certain level of technical skill, but with rapid advances in AI technology, it is now a matter of downloading an app or clicking a few buttons.
Now experts say there’s an entire commercial industry that thrives on creating and sharing digitally manufactured content that appears to feature sexual abuse. Some of the websites airing these fakes have thousands of paying members.
Last year, a town in Spain made international headlines after a number of young schoolgirls said they received fabricated nude images of themselves that were created using an easily accessible “undressing app” powered by artificial intelligence, raising a larger discussion about the harm these tools can cause.
The sexually explicit Swift images were likely fabricated using an artificial intelligence text-to-image tool. Some of the images were shared on the social media platform X, formerly known as Twitter.
One post sharing screenshots of the fabricated images was reportedly viewed over 45 million times before the account was suspended on Thursday.
Early Friday morning, X’s safety team said it was “actively removing all identified images” and “taking appropriate actions against the accounts responsible for posting them.”
“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” read the statement. “We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We’re committed to maintaining a safe and respectful environment for all users.”
Stefan Turkheimer, Vice President of Public Policy at RAINN, a nonprofit anti-sexual assault organization, said that on a daily basis “more than 100,000 images and videos like this are spread across the web, a virus in their own right. We are angry on behalf of Taylor Swift, and angrier still for the millions of people who do not have the resources to reclaim autonomy over their images.”