Love Was in the Air; So Were AI-Based Scams

Valentine's Day highlighted the surge in romance scams, many drawing on AI capabilities that should have insurers worried. 


Valentine's Day brought a flurry of reports about devastating romance scams perpetrated by organized, remarkably patient cyber criminals. 

While insurance doesn't cover these individual scam victims, the scams dramatize what insurers are up against as criminals grow more sophisticated and harness AI's capabilities to go after the companies that insurers do cover. 

It's a scary sight — and may get worse.

Fortune magazine told the tales of three women who have come forward to warn others as part of their efforts to combat romance scams. They described the scammers' great patience and careful wooing; one scammer even spent $185,000 to fool his victim.

After meeting a man on a dating site, one of the women received numerous trinkets from him — then a $100,000 check. She worried the check was fake, but it was real, and she deposited it. He then began planning a birthday bash for her on a yacht and laid out $85,000 for deposits and various purchases. He texted her constantly — she counted 20,000 text messages in all — and asked her to pray with him daily. Eventually, he suggested she withdraw her life savings and invest in a sure-thing crypto venture, which she did. She received weekly email reports, likely generated by AI, that showed thousands of dollars of gains each time — but those were fake. She lost $1 million in savings, as well as her condo.

Another woman felt she had bonded with a man and was preparing to meet him, when he asked her to help him log into his bank account while he was abroad. He supposedly needed to pay a translator as part of a deal that would net him a $10 million payment. When the woman logged into the bank account, she saw $700,000 in cash. The next day, both were locked out of the bank account, and he said he needed $20,000 for an additional payment. Reassured by his impressive bank account, she sent him the money. He pocketed it. 

The third woman was wooed by a man who seemed to care not only for her but for her six rescue dogs. When she'd talk with the scammer on the phone, if one made noise in the background, he'd ask, "Is that Duffy? Is that Trixie?" He gradually talked her out of her life savings, as well as the life insurance payment she'd received when her husband died. She was left with so little money that she couldn't repair her air conditioner, a decision that led to a fire that burned her house down and killed all her dogs. She fled the house in her underwear at five in the morning, in a neighborhood where she'd lived for 42 years. 

Vox describes AI as a "force multiplier" for these romance scams, which occur all year but pick up around Valentine's Day. Someone who might have been able to run a few scams at a time pre-AI can now operate 20 or more simultaneously.

"On the dark web," Vox says, "fraudsters can purchase romance scam toolkits complete with customer support, user reviews, and tiered pricing packages. These toolkits come with pre-built fake personas with AI-generated photosets, conversation scripts for each stage of the scam, and deepfake video tools, [Chris Nyhuis, the founder of cybersecurity firm Vigilant] told me. 'The skill barrier to entry is essentially gone.'"

Richard Graham, the practice lead for financial crime at Moody's, told me these scams are becoming more ambitious. 

"Five years ago, six years ago, [scammers] were just people online, trying to get your information, get a couple bucks and move on. Now... instead of asking you for money on day one, they're building rapport. They're spending a lot of time with you to better understand who you are.... sometimes over weeks, but usually months.... It's not just a $3,000 payment they want. They want everything."

Graham says it's hard to know just how many billions of dollars a year are lost to these scammers, because the vast majority of victims are too embarrassed to report the crimes. But he says a UN report found that, two or three years ago, some 235,000 people worldwide, largely in Southeast Asia, were working in professional organizations to perpetrate these romance scams, and he assumes the number has grown since then. 

He says AI helps these professionals develop better scripts to use as they groom their victims. It can also help those who aren't native English speakers smooth out any issues with their language skills. AI certainly helps gather information from social media sites as scammers try to learn as much as possible about their marks. AI also makes it easier and less expensive to cast a wide net of messages that could begin interactions with potential victims. While the techniques used to woo victims are referred to as "love bombing," the thieves refer to their goal with a crasser term: "pig butchering."

Graham says AI isn't being used much at the moment to generate deepfakes as part of romance scams — but only because they aren't needed just yet. 

"That actually has been a surprise to me. I would have thought at this point that deep fakes would have been a much bigger problem," he said. "They are a problem, but they haven't really scaled yet, because these other scams [based on dating sites, social media and text messages] have just been so low-effort and so successful."

That's not a happy thought for cybersecurity, in general: Thieves have enormously powerful tools at their disposal — but don't even have to use them much just yet because people are still so easy to fool.

The only solution is to escalate our vigilance as fast as, or faster than, the thieves are escalating their capabilities. The stories of corporate victims aren't as dramatic as those of the poor women Fortune described, but the financial damage is orders of magnitude greater. 

Cheers,

Paul

P.S. Graham offered a pointer for anyone who thinks someone they know might be getting courted as part of a scam. He says thieves always want to move victims into a chat app as fast as possible. That way, they don't put at risk a social profile they've spent a great deal of time crafting, and they won't be caught by any safeguards the social site has set up. So you can simply ask whether the person you're concerned about has any chat apps on their phone. That question won't raise hackles the way a statement like "I'm worried about you" would. If the person has a chat app, you can explore further and perhaps head off a financial catastrophe.