Your principal pops up on video chat, asking you to change a route to go past a particular location, as their plans have changed and they have a meeting.
Luckily, when you make the pickup, you confirm with them. They're surprised at the request, as they don't recall calling, and so you don't make the route change.
Later investigation reveals that the call was made from a spoofed phone, using a real-time generated video and audio stream of the principal mapped to the live movements of some unknown adversary.
It sounds like science fiction, or rubber masks from Mission Impossible, but the technology to do this is now so commoditised that you’ve probably seen it being posted on social media to let people duet with celebrities, or insert themselves into film clips.
The Fake Obama
Possibly still the most famous and dramatic demonstration was a video of President Obama seated in a briefing room. The opening lines are the most striking: "We are entering an age in which our enemies can make it look like anyone is saying anything at any point in time."
This video was created in 2018, by Jordan Peele. The synthesized voice and image were synchronised to Jordan’s own lip movements and words, and were generated based on publicly available video and audio of Obama speaking. I highly recommend searching for and watching this video if you want a Halloween scare.
Since then the technology has improved by leaps and bounds, and can now be used in near real time. In 2019 a fraud was carried out against a UK-based energy firm (which remains unnamed), in which the CEO was convinced he was on the phone with his boss, the CEO of the parent company. He followed orders to transfer €220,000 to the bank account of a Hungarian supplier.
Is it a real threat?
This attack was purely voice-based, but the CEO stated that he specifically recognised the very subtle German accent in his boss’ voice, and that the patterns and rhythms of his speech were the same.
The fraudster carrying out the attack made three phone calls: the first to initiate the transfer, the second to claim it had been reimbursed, and a third to ask for a follow-up payment to be transferred. It was only after the second call, noticing that the supposed reimbursement had not arrived, that the CEO looked harder and saw that the call had been made from an Austrian phone number.
What is slightly shocking (and fortunate for the CEO) is that the fraudsters had not spoofed the phone number to appear to come from the right region, which is a trivial feat. If they had, the follow-up payment may well have happened.
In 2020 a bank manager in Hong Kong received a phone call from a company director he had spoken with before, talking about an acquisition he was looking to make and that he needed some transfers (around $35 million) authorised. He recognised the voice, spoke with the nominated lawyer after checking his inbox for e-mails from the right sources, and began making the transfers.
These are the attacks which are known to be successful and have been made public. Other attempts have been reported using both video and audio, and they are unlikely to go away any time soon. Detection technologies exist, but they are not commonly used, and the deception technologies continuously improve. We are likely to see many more such attacks in the future, given the reliance over the last two years on online meeting tools and the opportunities for attackers to use them to manipulate circumstances in their favour.
Even ignoring the more attention-grabbing financial attacks, there is another attack vector available based on images. In 2017, deepfake pornography became prominent, with an eventual estimate in 2019 that 96% of all deepfakes were pornographic. Celebrities were the most common victims, with some of the fakes featuring in articles, but anyone could be a victim.
Of course, if pornography is an option, blackmail material indistinguishable from genuine images can be generated. Arguably there is also the potential for this to devalue blackmail material, since anything can be dismissed as a fake. Deepfakes have also been used in politics (notably in 2020 by the Belgian branch of Extinction Rebellion, who published a deepfake of the Prime Minister talking about a link between deforestation and COVID-19), in art (debates are ongoing about artificially inserting deceased, or even simply unavailable, celebrities into media), in fraud, and to create fake social media profiles for non-existent persons in misinformation campaigns.
These fake social media profiles are often referred to as sockpuppets, with the most well-documented case occurring in 2018, when a persona named Oliver Taylor submitted articles (which were then published) accusing a British academic and his wife of being terrorist sympathisers. While the evidence strongly suggests that the persona is entirely synthesized, several newspapers have not retracted the articles or removed them from their websites.
What can you do about it?
Verify – if a phone call is asking you to do something, confirm through some other means, preferably a secure channel or in person. If an e-mail is asking for something, verify it. And if the impersonated person, or the target of the impersonation, has the type of profile that attracts capable adversaries, do not take any media of them at face value.
You can test your own abilities to spot fakes with a tool created by MIT at https://detectfakes.media.mit.edu. Other similar online quiz-type tools exist, some more challenging than others. The site https://thispersondoesnotexist.com generates random synthesized faces with each visit, highly popular for the simpler form of sockpuppet profile, and with some giveaways in the generated pictures that are useful for detection.
Ultimately, though, we are simply entering an age where the integrity of electronic media cannot be assumed, and we need to adjust our behaviours and processes appropriately until the tools to protect ourselves become available (if ever).
Cyber Security Fundamentals – Deepfakes and Impersonation
By James Bore
James Bore is an independent cybersecurity consultant, speaker, and author with over a decade of experience in the domain. He has worked to secure national mobile networks, financial institutions, start-ups, and one of the largest attractions companies in the world, among others. If you would like to get in touch for help with any of the above, please reach out at james@bores.com