[PHISHING ALERT] “Hey Did You See That Fake AI Porn Movie Of Yourself?”

Heads-up. I am sorry to have to bring up a very distasteful topic, but in the very near future your users will get emails with something close to the ultimate click-bait, luring them to see an AI-generated porn video starring… themselves.

Forget about fake news. It's been all over Reddit for the last month: we are now in the age of fake porn.

It’s possible to near-seamlessly stitch anybody’s face onto existing porn videos using AI—see the AI Porn [NSFW!] Motherboard report—like Emma Watson’s face on an actress’s nude body.

Now, Photoshop has been around for years, and the idea is not new, but AI has brought it technically to a whole new level, and ethically lower than ever before.

Since December, production of AI-assisted fake porn has “exploded,” Motherboard reports. Thousands of people are doing it, and the results are ever more difficult to spot as fakes.

You don’t need expertise with sophisticated AI anymore. A redditor called deepfakeapp created an application named FakeApp, which he built to be used by people without computer science training.

The Porn AI Genie is out of the bottle

So here is how the bad guys are going to use this. They will stand up dedicated high-powered graphics servers running AI code to mass-generate short fake clips from pictures scraped off social media, host the clips on sites loaded with exploit kits, and send spear phishing attacks to high-risk targets like people in HR, Accounting, and the C-suite.

The spear phishing attacks are going to have a highly suggestive or embarrassing picture of the target, and the “only thing” they need to do to see it is click on the Play Icon.

A good number of people are going to fall for this social engineering tactic at first. The combination of shock value and the urge to prevent a negative consequence can be close to overwhelming, which is exactly the effect the bad guys are trying to create to manipulate someone into a click.

And in yet another case where technology is ahead of the law, these videos are actually not even illegal, despite the humiliation they can bring. WIRED has a new post that covers this.

I suggest you consider sending this email to your employees, friends and family. Given the very controversial nature of this threat, you will want to run it by HR and Legal and get management buy-in before you send it. Feel free to copy/paste/edit:

[IMPORTANT PHISHING WARNING] I am sorry to have to bring up a very distasteful topic, but criminals on the internet are expected to soon come out with a new wave of phishing attacks that are highly inappropriate.


It has always been possible to use Photoshop to put anyone’s face on a nude body. But now technology has advanced to the point where this is possible with video as well. You can imagine how this can be misused, and fake pornography has exploded onto the scene.


Bad guys are going to use this to manipulate innocent people, shocking them into clicking a video link in a phishing email in the hope of preventing very negative consequences if co-workers, friends and family might “find out” or see it.


DON’T FALL FOR IT. If one of these inappropriate phishing emails makes it into your inbox, DO NOT CLICK, and follow your normal IT procedure for reporting malicious emails like this one.

Excerpted from the KnowBe4 blog.

by Stu Sjouwerman
Founder and CEO, KnowBe4, Inc

