
The Federal Bureau of Investigation (FBI) is warning of a rising trend of malicious actors creating deepfake content to perform sextortion attacks.

Sextortion is a form of online blackmail in which malicious actors threaten to publicly leak explicit images and videos they stole (through hacking) or acquired (through coercion), typically demanding payment in exchange for withholding the material.

In many sextortion cases, the compromising content is not real, with threat actors only pretending to have it in order to scare victims into paying the extortion demand.

The FBI warns that sextortionists are now scraping publicly available images of their targets, such as innocuous pictures and videos posted on social media. These images are then fed into deepfake content creation tools that turn them into AI-generated sexually explicit material.

Although the produced images or videos are not genuine, they look convincingly real, so they can still serve the threat actor's blackmail purpose: sending the material to a target's family, coworkers, and others could cause the victim serious personal and reputational harm.

“As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats,” reads the alert published on the FBI’s IC3 portal.

“Based on recent victim reporting, the malicious actors typically demanded: 1. Payment (e.g., money, gift cards) with threats to share the images or videos with family members or social media friends if funds were not received; or 2. The victim sends real sexually-themed images or videos.”

The FBI says that the creators of this explicit content sometimes skip the extortion step entirely and post the fabricated videos directly to pornographic websites, exposing victims to a large audience without their knowledge or consent.

In some cases, sextortionists use these now-public uploads to increase the pressure on the victim, demanding payment to remove the posted images/videos from the sites.

The FBI reports that this media manipulation activity has, unfortunately, impacted minors too.

How to protect yourself

The rate at which capable AI-powered content creation tools are becoming available to the general public creates a hostile environment for all internet users, particularly those in sensitive categories.

Multiple content creation projects are freely available on GitHub and can produce realistic videos from just a single image of the target's face, requiring no additional training or datasets.

Many of these tools feature built-in protections to prevent misuse, but those sold on underground forums and dark web markets do not.

Porn creation tool offered on the dark web
Source: Kaspersky

The FBI recommends that parents monitor their children’s online activity and talk to them about the risks associated with sharing personal media online.

Furthermore, parents are advised to run online searches to determine how much of their children's information and imagery is exposed online, and to take action as needed to have that content removed.

Adults posting images or videos online should restrict viewing access to a small private circle of friends to reduce exposure. At the same time, children’s faces should always be blurred or masked.
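For readers who want to apply that advice before posting, the short Python sketch below shows one possible local workflow, assuming the opencv-python package is installed: it detects frontal faces with OpenCV's bundled Haar cascade and applies a heavy Gaussian blur to each detected region. The file names are hypothetical and used only for illustration.

import cv2

# Uses OpenCV's bundled Haar cascade to find frontal faces.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def blur_faces(input_path, output_path):
    """Blur every detected face in an image before it is shared online."""
    image = cv2.imread(input_path)
    if image is None:
        raise FileNotFoundError(input_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = image[y:y + h, x:x + w]
        # Strong Gaussian blur; the kernel dimensions must be odd.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (99, 99), 30)
    cv2.imwrite(output_path, image)

# Hypothetical file names for illustration only.
blur_faces("family_photo.jpg", "family_photo_blurred.jpg")

Because the Haar cascade runs entirely offline, the photo never has to be uploaded to a third-party service, but it can miss faces at unusual angles, so the output should still be checked manually before posting.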

Finally, if you discover deepfake content depicting you on pornographic sites, report it to the authorities and contact the hosting platform to request the removal of the offending media.

The UK has recently amended the Online Safety Bill to classify the non-consensual sharing of sexually explicit deepfakes as a crime.

Source: www.bleepingcomputer.com