Cyber criminals use deepfake audio on phone call to extort €200k

Just when you thought the cost of overseas calls was decreasing…

As if battling cybercrime wasn’t hard enough, criminals have now weaponised artificial intelligence (AI) in the form of deepfake audio. In a recent example, the Chief Executive Officer (CEO) of the UK subsidiary of an energy company was tricked into wiring €200,000 to a Hungarian supplier on the instructions of what he believed was the Chief Executive of the German parent company. In reality, the CEO was speaking to a criminal gang using AI-equipped deepfake software to mimic the German Chief Executive. The software impersonated the voice convincingly, down to the tone, intonation and German accent, completely fooling the CEO. The call was accompanied by an email, supposedly from the Chief Executive, reiterating the payment instructions. As everything appeared to be in order, the funds were transferred to Hungary. From there they were quickly moved on to Mexico and various other locations, and law enforcement is still looking for suspects.

Who do you think you’re talking to?

Although this incident reads like the plot of a Mission: Impossible film, it is unfortunately not an isolated case. Since the fraudulent incident in March this year, other deepfake voice fraud cases have come to light, and this style of social engineering attack could be a sign of things to come. Video deepfakes of celebrities and public figures still take several hours of footage to produce convincingly, but faking a voice requires far less source material, and as computing power grows it will only become easier. It raises the question: can voice recognition be relied on as an accurate form of identity verification?

Do you know who I am?

In the future, deepfake audio fraud is likely to be heavily exploited in criminal activity. As the technology continues to evolve, it will become increasingly difficult to distinguish real audio from fake. If you want assured authentication of identity, you need a seriously secure mobile comms service.

Armour Mobile uses the MIKEY-SAKKE identity-based encryption protocol to secure multimedia services. It provides secure voice and video calls, voice and video conference calls, one-to-one and group messaging, and file attachments. The solution ensures that the parties exchanging calls and data are who they claim to be! Most importantly, Armour Mobile protects not only the content of communications but also the associated metadata. This means no one even knows you are having a conversation, let alone what that conversation is about.
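The identity-based idea behind MIKEY-SAKKE can be illustrated with a deliberately simplified sketch. This is not the actual SAKKE elliptic-curve cryptography (defined in RFC 6508/6509); it uses HMAC as a stand-in, and the identities and class names are hypothetical. The point it shows: a user's public identity (for example, a phone number plus a time period) serves as their key material, and only a trusted Key Management Server holding a master secret can issue the matching private key, so an impersonator without a KMS-issued key cannot authenticate.

```python
import hashlib
import hmac

# Toy sketch only: illustrates identity-based keying conceptually,
# not the real SAKKE mathematics. All names here are illustrative.

class KeyManagementServer:
    """Trusted authority holding the master secret for a closed user group."""

    def __init__(self, master_secret: bytes):
        self._master_secret = master_secret

    def issue_private_key(self, identity: str) -> bytes:
        # Deterministically derive a per-identity key from the master secret.
        # Only the KMS can do this, because only it knows the master secret.
        return hmac.new(self._master_secret, identity.encode(), hashlib.sha256).digest()

def sign(private_key: bytes, message: bytes) -> bytes:
    """Authenticate a message with an identity-derived key."""
    return hmac.new(private_key, message, hashlib.sha256).digest()

def verify(kms: KeyManagementServer, identity: str, message: bytes, tag: bytes) -> bool:
    # Simplification: in real identity-based cryptography the verifier needs
    # only the sender's public identity, not access to the KMS. The symmetric
    # shortcut here is purely for illustration.
    expected = sign(kms.issue_private_key(identity), message)
    return hmac.compare_digest(expected, tag)

kms = KeyManagementServer(b"example-master-secret")

# The "identity" is public information: phone number plus validity period.
caller_identity = "tel:+441234567890;2019-03"
caller_key = kms.issue_private_key(caller_identity)

message = b"Please pay supplier X EUR 200,000"
tag = sign(caller_key, message)

print(verify(kms, caller_identity, message, tag))            # True: genuine caller
print(verify(kms, "tel:+449999999999;2019-03", message, tag))  # False: impersonator
```

A deepfaked voice cannot help an attacker here: without a private key issued by the group's KMS for the claimed identity, the authentication check fails regardless of how convincing the audio sounds.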

Imitation – not always the sincerest form of flattery

Deepfakes might have arrived, but there are tools to distinguish the real from the fake. Armour Mobile helps prevent fraudulent activity by enabling secure collaboration between trusted colleagues. Communications are conducted within a closed user group, and only those added to the system can call and message others. So, when discussing commercially sensitive information such as corporate intellectual property, financial transactions, and customer details, you will know exactly who you are speaking with.

With deepfake ransomware among experts’ list of cyber fears for 2020, it’s time to armour up.

Contact us today for more details.
