Generative AI's ability to create lifelike images is impressive, but the U.S. Federal Bureau of Investigation warns that criminals are using deepfakes to target victims for extortion.

"The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content," the agency said in a PSA alert on Monday.

The FBI says law enforcement agencies received over 7,000 reports last year of online extortion targeting minors, with an uptick in victims of so-called "sextortion scams" using deepfakes since April.

A deepfake is video or audio content created with artificial intelligence that depicts events that never happened. Such fabrications are becoming increasingly common and harder to discern as fake, thanks to generative AI platforms like Midjourney 5.1 and OpenAI's DALL-E 2.

In May, a deepfake of Tesla and Twitter CEO Elon Musk made to scam crypto investors went viral. The video shared on social media contained footage of Musk from previous interviews, edited to fit the scam.

Not all deepfakes are malicious: a deepfake of Pope Francis wearing a white Balenciaga jacket went viral earlier this year, and more recently, AI-generated deepfakes have been used to bring murder victims back to life.

In its recommendations, the FBI warned against paying any ransom because doing so does not guarantee the criminals will not post the deepfake anyway.

The FBI also advises caution when sharing personal information and content online, including using privacy features like making accounts private, monitoring children's online activity, and watching for unusual behavior from people you've interacted with in the past. The agency also recommends running frequent searches for personal and family member information online.

Other agencies sounding the alarm include the U.S. Federal Trade Commission, which warned that criminals are creating audio deepfakes of a friend or family member claiming to have been kidnapped in order to trick unsuspecting victims into sending money.

"Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We're living with it, here and now. A scammer could use AI to clone the voice of your loved one," the FTC said in a consumer alert in March, adding that all the criminal needs are a short audio clip of a family member's voice to make the recording sound real.

The FBI has not yet responded to Decrypt's request for comment.