Blackmailers are using deepfaked nudes to bully and extort victims, warns FBI


The FBI has issued an advisory warning of an “uptick” in extortion schemes involving fake nudes created with the help of AI editing tools.
The agency says that as of April this year, it has received an increasing number of reports of such “sextortion” schemes. Malicious actors find benign images of a victim on social media, then edit them using AI to create realistic and sexually explicit content.
“The photos are then sent directly to the victims by malicious actors for sextortion or harassment,” writes the agency. “Once circulated, victims can face significant challenges in preventing the continual sharing of the manipulated content or removal from the internet.”
The FBI says blackmailers typically use such material to demand real nude images from a victim or payments of some sort. Says the agency: “The key motivators for this are a desire for more illicit content, financial gain, or to bully and harass others.”
The agency recommends that the public “exercise caution” when sharing images of themselves online, but this is difficult advice to follow. Only a few images or videos are needed to create a deepfake, and no one can be completely safe from such extortion schemes unless they remove all images of themselves from the web. Even then, a bad actor who knows their target personally could covertly photograph them in real life.
Nude deepfakes first began to spread online in 2017, when users on forums like Reddit began using new AI research methods to create sexually explicit content of female celebrities. Although there have been some attempts to counter the spread of this content online, tools and sites to create deepfake nudes are easily accessible.
The FBI notes that such extortion schemes “may violate several federal criminal statutes.” Only a limited number of laws worldwide criminalize the creation of such non-consensual fake images. In the US state of Virginia, for example, deepfakes are outlawed as a form of “revenge porn,” while the UK plans to make sharing such images illegal under its upcoming Online Safety Bill.