
How AI is supercharging financial scams

Written and accurate as at: Aug 15, 2023

As with any new technology, we have to take the good with the bad. But in the case of artificial intelligence (AI), the can of worms it threatens to open might be unlike anything we’ve ever seen. 

Already, the technology is being used by cybercriminals to swindle people out of their money, and the methods they're using are getting increasingly difficult to detect.

Below are some of the ways AI is being deployed and what you can do to help keep your finances safe.


“Voice cloning” technology used in phone scams

Using just a short audio sample and some commercially available programs, it's now possible to clone someone's voice and prompt it to say anything you want. Fraudsters have jumped at this opportunity, using AI-generated voices to impersonate victims' relatives and leave messages urgently requesting money to get them out of a bind.

These scams can be surprisingly difficult to see through. A global survey by McAfee found that 70% of people had trouble distinguishing between a cloned voice and the real thing.1

Scammers are setting their sights on much larger targets too. In 2020, a manager in Hong Kong received a call from someone using deepfake technology to simulate his boss's voice. The call was so convincing that he wound up transferring USD 35 million to fund a bogus acquisition deal.2

Similar tactics were used against the CEO of a UK-based energy firm, who was duped into sending USD 243,000 after speaking to a deepfake imposter.3

Generative AI used to gain victims’ trust

Malicious actors can now use language models like ChatGPT to polish up the messages they send to potential victims. The usual giveaways, such as typos and missing punctuation, can be ironed out, giving messages a more realistic (and potentially persuasive) sheen.

Romance scams are also bound to receive a shot in the arm. With AI, a scammer can generate original images to use on dating profiles and social media, potentially making it easier to win someone’s interest and trust.

Once a victim is invested in the ‘relationship,’ scammers will start asking for money, gifts and bank details. They might even get the victim to perform tasks that amount to money laundering on their behalf.

On the more extreme end, deepfake video technology can be used to recreate prominent figures’ likenesses against their will. A recent example involved CommBank CEO Matt Comyn, whose image was digitally replicated and used to shill an investing website.4

Ways to protect your finances

The reason so many of these scams work is that they create a heightened emotional state in victims. Fraudsters assume that the more distressed or excited people are, the less guarded they're likely to be with their money.

Fortunately, many of the usual methods for protecting your finances can still apply:

  • Be wary of texts, emails or calls that seem crafted to create a sense of urgency. Keep a cool head and try to verify whether the message has come from a legitimate source.

  • If you receive a message from your bank asking for login details or other personal information, treat it with suspicion. Legitimate banks won't ask for these details by text or email.

  • It’s common for scammers to request payment via channels that are hard to trace and difficult to recover money from, such as gift cards and cryptocurrency, so consider any mention of these a potential warning sign.

  • Set up multi-factor authentication if you haven’t done so already.

  • Exercise caution when discussing your finances over the phone, even if you believe it's your child or grandchild on the other end of the line. If at any point you become suspicious, hang up and call them back on a number you know to be theirs.

Australians lost a record $3.1 billion to scams in 2022,5 and that figure is poised to grow as AI use becomes more widespread. Scammers can target anyone, no matter how tech-savvy they are, but awareness of this new breed of scams can go a long way.


Sources

1 https://www.mcafee.com/blogs/privacy-identity-protection/artificial-imposters-cybercriminals-turn-to-ai-voice-cloning-for-a-new-breed-of-scam/
2 https://www.forbes.com/sites/thomasbrewster/2021/10/14/huge-bank-fraud-uses-deep-fake-voice-tech-to-steal-millions/?sh=545c8b0b7559 
3 https://www.forbes.com/sites/jessedamiani/2019/09/03/a-voice-deepfake-was-used-to-scam-a-ceo-out-of-243000/?sh=499b2abc2241 
4 https://www.afr.com/companies/financial-services/cba-jostles-with-ai-generated-matt-comyn-scam-20230615-p5dgtx 
5 https://www.scamwatch.gov.au/news-alerts/accc-calls-for-united-front-as-scammers-steal-over-3bn-from-australians 
