CEO Duped By Scammer
Deepfake Scammers Step it Up
Cristal M Clark
In case you haven’t heard, one of the newest tricks used by scammers is called a deepfake. From Wikipedia: “Deepfake (a portmanteau of “deep learning” and “fake”) is a technique for human image synthesis based on artificial intelligence. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique known as generative adversarial network.”
Until recently, deepfakes have relied mostly on images and video, and they have been used to spread propaganda, pornography, fake news, fake press conferences, memes, jokes, you name it. Deepfakes can power pretty much any type of scam you could dream up.
In this particular case, however, a scammer was able to deepfake a CEO’s voice and use it to trick an executive into transferring $243,000. Here’s how it played out: the CEO of an unidentified energy firm based in the UK thought he was taking a call from his boss, the CEO of the firm’s parent company, which is based in Germany.
The caller, whose voice sounded familiar, told him to send $243,000 (€220,000) to a Hungarian supplier within the hour. So that’s what the guy did.
The money was reportedly sent to a Hungarian bank account, moved to an account in Mexico, and then magically distributed to various other locations. Oddly, no suspects have been identified.
The scammer called the victim company three times in all. Once the transfer went through, the scammer called a second time to falsely claim that the money had been reimbursed, and then apparently a third time to ask for another payment. The third call used the same fake voice, but it came from an Austrian phone number, and the promised “reimbursement” had never arrived; between the repeated calls and the Austrian number, the victim finally felt something was not quite right.
According to a report over at the Wall Street Journal, the firm’s insurance company, Euler Hermes Group, said this was a first for them: they had never dealt with a client who had been duped using AI voice mimicry. The investigation found that the victim recognized his superior’s voice because it had a hint of a German accent and the same “melody.”
This is not the first sign of this type of technology, though. Back in May, the AI company Dessa released a simulation of podcaster Joe Rogan’s voice that was a pretty damn near-perfect replica. In fact, the replica was so similar to the real thing that a longtime listener would have difficulty distinguishing between Joe Rogan and “Joe Fauxgan.”
Because this technology is readily available commercially, companies and firms are going to have to develop protocols to prevent fraudulent transfers like this one, and those protocols need to be kept strictly confidential and changed often, just in case.
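One simple version of such a protocol is out-of-band verification: any large transfer request, no matter how familiar the voice on the phone sounds, is held until someone calls the supposed requester back on a number already on file. Here is a minimal sketch in Python; the threshold, the names, and the rule itself are hypothetical illustrations, not any firm’s actual policy:

```python
# Sketch of an out-of-band verification rule for wire-transfer requests.
# All names and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass

APPROVAL_THRESHOLD = 10_000  # transfers above this require a callback check


@dataclass
class TransferRequest:
    requester: str            # who the caller claims to be
    amount: float             # requested transfer amount
    callback_confirmed: bool  # did someone phone the requester back on a known number?


def should_release_funds(req: TransferRequest) -> bool:
    """Release small transfers immediately; hold large ones until an
    independent callback on a pre-registered number confirms the request."""
    if req.amount <= APPROVAL_THRESHOLD:
        return True
    return req.callback_confirmed
```

Under a rule like this, the scam described above would fail: a $243,000 request with no independent callback is simply held, regardless of how convincing the voice was.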
Until then, if someone dials you up at work, sounds like your boss, and tells you to transfer a large sum of money, it would be best to make sure the request is legitimate before you send anything.
Sooner or later, replicating anyone’s voice will be as easy as blinking.
Cristal M Clark