
In the News: AI-Generated Sexually Explicit Deepfakes

In the past couple of weeks, thousands of articles, posts, and news stories have come out about the AI-generated sexually explicit deepfake images of Taylor Swift that were circulating on social media platforms such as X, Telegram, and Reddit. The hashtag #ProtectTaylorSwift began dominating X, and the images have since been removed from various platforms.

Many people, advocates, and government officials have weighed in and are rightfully and seriously alarmed by the exploitation of Taylor Swift. It has sparked a long overdue conversation about the need for increased legislation surrounding AI and related privacy issues. However, as many of us already know, this is nothing new. The battle against AI-related sexual harm and exploitation has been going on for a while now.

In 2018, a NetClean report stated that law enforcement had already begun noticing the use of deepfakes in their investigations and believed that their use to sexually exploit people would increase. In June 2023, the FBI released a statement regarding the creation and use of sexually explicit deepfakes in sextortion and harassment schemes.

Here are some steps you can take to protect yourself and your family:

  • Update your privacy settings on social media. Keep your friends list, and any profile where you post more personal things, private. Update your passwords frequently, enable multi-factor authentication, and turn on notifications for login attempts. Educate your kids about these realities in age-appropriate ways. NCMEC’s NetSmartz provides games, animated videos, and other activities to help empower children to make safer choices about their online presence.
  • Be cautious whenever someone asks you to send any type of content over the internet, text, or messaging app – even if that person appears trustworthy. Predators are good at creating fake profiles and making things seem real. Remember that content can be recorded, screenshotted, manipulated, and shared further without your permission or consent.
  • Report any offense involving a young person to the FBI and the National Center for Missing and Exploited Children. NCMEC provides a free service known as Take It Down, which can help victims who have possession of the image or video files remove or stop the online sharing of nude, partially nude, or sexually explicit content that was taken while they were under 18 years old.
  • Contact your representatives and push for federal and state legislation to protect people against the harms of these schemes. As of today, fewer than 20 states have legislation specifically focused on deepfake AI. A few have laws that criminalize nonconsensual sexually explicit deepfake images, while others have legislation that gives victims the right to sue those who use AI to create images based on their likeness. Other states have legislation related only to the use of deepfake AI in political campaigns.

References

Federal Bureau of Investigation (2023). Malicious actors manipulating photos and videos to create explicit content and sextortion schemes.

NetClean (2018). A report about child sexual abuse crimes.