Taylor Swift’s Deepfake Porn Disaster Spotlights Major Threat of AI

AI-generated porn has been ravaging the internet—and innocent victims’ lives—for a while now, but it’s finally getting some serious mainstream attention. Why? First, because the tech is now sophisticated enough to create deepfake images and videos that are nearly indistinguishable from real ones. And second, because America’s Sweetheart, Taylor Swift, is the latest celebrity victim.

This week, AI-generated porn images of superstar Taylor Swift circulated on X, formerly Twitter, igniting outrage among “Swifties.” The images depict Swift in a pornographic scene involving a group of football players.

This prompted her loyal fanbase to flood the platform with real images of the pop star and the phrase “protect Taylor Swift” in an attempt to bury the nonconsensual images. But their attempt was unsuccessful. One image garnered over 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks before the verified user who shared the images had their account suspended for violating platform policy.

But Taylor Swift isn’t the only celebrity to be targeted by creators of AI porn. In the past, celebrities like Scarlett Johansson and Emma Watson were also victims of fake porn videos.

These AI-generated deepfakes, often labeled as “leaked,” are real enough to trick viewers into thinking they’re seeing the real thing.

While it’s tragic that Taylor Swift and other celebrity women have become targets, it’s even more concerning that this injustice is already inflicting trauma on high schoolers and children.

Most recently, teenage girls in New Jersey reported that fellow classmates had begun spreading AI-generated nudes of about 30 girls in their school. Dorota Mani, whose 14-year-old daughter was one of the victims, said she was “terrified by how this is going to surface and when. My daughter has a bright future and no one can guarantee this won’t impact her professionally, academically or socially.”

AI porn is vicious and no one is off-limits.

It’s also incredibly easy to create. Some deepfake porn creators charge a mere $20 for fabricated videos of anyone they wish, including children. And if that wasn’t bad enough, there are “nudify” apps available in app stores, allowing anyone with a clear photo of someone’s face to generate explicit images of that person at the click of a button.

While the technology behind AI porn is impressive, the consequences of its use are disturbing and far-reaching. The New York Post put it aptly, stating, “As a culture, we’ve given so much oxygen to the narrative that ChatGPT will take your job—and not enough to the sickening fact that AI is coming for your image and reputation. Your sanity and mental health, too. There’s no limit to the chaos and utter destruction these pernicious, reality-bending Artificial Intelligence tools can produce. And we’re only seeing the tip of the iceberg.”

RELATED: AI Porn Is Here and It’s Dangerous

According to Sensity.ai, 90% to 95% of deepfake videos online are nonconsensual pornography.

Deepfake pornography involves the use of advanced algorithms and machine learning to create realistic videos and images of individuals engaged in sexual acts. It can be created using photos or videos of people without their consent. With the prevalence of social media, this means that anyone with an online presence is vulnerable to having their image used in a pornographic context without their knowledge or permission.

The nonconsensual use of individuals’ images can have serious consequences. Victims of AI porn may experience feelings of violation, humiliation, powerlessness, and even post-traumatic stress disorder (PTSD) as a result of their experience. In some cases, the use of AI porn can even lead to harassment or blackmail, with perpetrators threatening to distribute the content unless the victim complies with their demands.

One female Twitch streamer who discovered herself in deepfake porn said, “I was wishing for eye bleach. I saw myself in positions I would never agree to, doing things I would never want to do.”

RELATED: Deepfake Porn and the Twitch Streamer Who Accidentally Brought it to Light

Major AI platforms have attempted to mitigate the generation of abusive or illegal content (like child sexual abuse material, or “child porn”) on their sites; however, these efforts are proving futile. One of the largest datasets used to train AI systems, LAION-5B, was recently found to contain over 3,000 suspected images of child sexual abuse.

In recent years, there has been an exponential increase in child sexual abuse material (CSAM) available on the internet, and deepfake technology is pouring gasoline on the already-raging fire of the child abuse image crisis.

The Legal Response to AI Porn

In October 2023, President Joe Biden signed an executive order regulating AI development after Elon Musk, Sam Altman, and other tech industry leaders warned that the technology’s unsupervised advancement “could pose a risk to humanity.” And while a few states have enacted laws against the practice, there’s currently no federal law on the books.

The Taylor Swift deepfake fiasco, however, coincided with the reintroduction of the Preventing Deepfakes of Intimate Images Act to the House Judiciary Committee by US Reps. Joseph Morelle (D-NY) and Tom Kean (R-NJ).

The bill would make the nonconsensual sharing of digitally altered pornographic images a federal crime, punishable by jail time, a fine, or both. It would also allow victims to sue perpetrators in court.

Similarly, legislators have proposed several bills aiming to address AI-generated content that “mimics another person, living or dead.” These bills respond to outrage from celebrities after recent viral videos featured AI-generated songs mimicking the voices of pop artists like Justin Bieber, Bad Bunny, Drake, and The Weeknd. However, these bills focus on protecting the creative and intellectual property of celebrities and fall horribly short of protecting the rest of us from nonconsensually created deepfakes.

Legislators and Big Tech Must Intervene

The Taylor Swift deepfake incident is not the first instance of nonconsensual deepfake pornography, and it certainly won’t be the last. Legislators MUST take action to hold creators of deepfake pornography accountable and place regulations on AI development.

Since the release of Taylor Swift’s deepfakes, X has posted a statement condemning the images and reaffirming its zero-tolerance policy on nonconsensual nudity.

But it should not take this happening to “America’s Sweetheart” to instigate change. There are countless normal, everyday people who have been victimized by this digital weapon—co-workers, classmates, and even children.

This is why we support bills like the PROTECT Act, which would require websites that allow sexually explicit material to verify consent from anyone uploading or appearing in uploaded content. The PROTECT Act would also require websites to delete images that have been uploaded without consent.

The need for robust legislation—targeting both the creators of deepfake porn and the social platforms and porn sites it’s distributed on—is paramount. But while we’re waiting for legislators and big tech to sort out this mess, what are we to do?

As a starting point, we can focus on porn-proofing our kids by giving them tools to navigate the pornified world they’re being raised in. And we can speak openly and honestly with our friends, family, and peers about the way pornography is poisoning our society. We wrote Raised on Porn (the book) and created Raised on Porn (the documentary) to help with that, and compiled this list of resources to help you and your loved ones live a life free from porn’s influence.


You Can Support Victims of Image-Based Sexual Abuse

Exodus Cry actively reaches out to sexually exploited women and girls, assists them in finding freedom, and provides resources to help them heal. Working with some of the nation’s best therapists, we provide specialized therapy for these incredible survivors. To date, we’ve provided over 1,000 hours of life-changing trauma therapy for victims of sex trafficking, exploitation, and image-based sexual abuse. After just six months, survivors reported, on average:

  • 87% decrease in self-harm
  • 63% decrease in suicidal thoughts
  • 61% decrease in depression
  • 53% decrease in substance-based coping
  • 51% decrease in anxiety

You can help these women and girls build beautiful new lives by giving below.

DONATE
