If you receive a phone or video call from a distressed family member who needs you to send money now — for bail, hospital fees or even as ransom for a kidnapping — your first reaction is likely to be panic. But you may want to take a breath and think twice before acting.
Deepfakes use voice cloning and synthetic video to replicate a person’s likeness in an image, video or audio clip. The technology behind them is becoming increasingly sophisticated, making deepfakes ever harder to distinguish from the real thing.
And online fraudsters are cashing in. More than 4.2 million fraud reports have been filed since 2020, resulting in more than $50.5 billion in losses, “with a growing portion stemming from deepfake scams,” according to the FBI (1). But the true figures are likely much higher, since many victims of online scams never report them to law enforcement.
Financial losses from deepfake and other AI-generated scams are projected to reach $40 billion in the U.S. alone by 2027, according to Deloitte’s Center for Financial Services (2).
And the problem is only going to get worse. Cybersecurity firm DeepStrike estimates the number of deepfakes will grow from 500,000 in 2023 to 8 million in 2025. “This is consistent with a growth rate where the volume of deepfake videos increases by 900% annually. This isn’t linear growth; it’s a viral proliferation that outpaces nearly every other cyber threat,” according to DeepStrike (3). Here’s why these scams are spreading so quickly, and how to protect yourself.
The rise of an almost-perfect scam
Deepfakes aren’t just being used to trick corporations and banks. The same techniques are raising the stakes on the classic ‘grandchild scam,’ in which fraudsters mimic family members in distress — faking an emergency that requires an urgent transfer of funds to cover a hospital bill or bail, or even ransom for a kidnapping.
This scam is particularly effective — and particularly nefarious — because it bypasses logic and targets emotion, triggering a sense of urgency. That urgency is the point: it pushes even the most rational person to act before verifying.
“By the end of 2026, so we’re talking just over a year, it’s going to be the majority of the way scams are done,” Roger Grimes, a cybersecurity expert with Clearwater-based security awareness company KnowBe4, told Tampa Bay 28, during an investigation into how AI deepfake scams are targeting families (4).
“Hackers are doing these fake kidnappings. They’re calling grandparents to say that your grandchild has been in an accident. They need $5,000 to get out of trouble,” Grimes said (4).
AI tools are already cheap, accessible and easy to use, meaning nearly anyone with a social media presence could be vulnerable.
Audio deepfakes are one of the most widespread forms of deepfake attacks. “With as little as three seconds of genuine audio, AI voice generators can mimic someone’s voice with up to 95% accuracy,” according to cybersecurity firm McAfee (5).
For video deepfakes, scammers can take advantage of publicly available images and videos scraped from sources like Instagram, TikTok and YouTube. And human detection rates are just 24.5% for high-quality video, according to DeepStrike (3).
The aftermath of a deepfake scam can be devastating on many levels. You likely won’t be able to get your money back, since banks may refuse to reimburse losses for transactions that were ‘authorized.’ More than three-quarters (77%) of people targeted by an AI voice scam lost money as a result, with 36% losing between $500 and $3,000, according to a McAfee survey (6).
Plus, if you shared sensitive data with the scammers, they could potentially use that for identity theft — and commit more fraud, such as taking out loans in your name (and hurting your credit score). Victims may also feel ashamed and confused, triggering anxiety, depression and distrust.
How to protect yourself from deepfakes
While state and federal lawmakers are attempting to enact laws to protect consumers against deepfake scams, legislation struggles to keep pace with a rapidly advancing technology that currently has few adequate guardrails.
This, unfortunately, puts much of the onus on consumers to protect themselves. In this environment, you’ll need to assume that hearing a loved one’s voice or seeing them on a video is no longer proof of identity.
But there are a few red flags to look out for, as well as verification habits that could protect your finances.
“Even if it’s a deepfake AI, they’ve got to still scam you,” Grimes told Tampa Bay 28. “If it’s an unexpected message asking you to do something you’ve never done before, research it, even if it sounds and looks like somebody you know” (4).
In videos, look for signs such as jitter (brief flickers), unnatural eye movement and inconsistent lighting and shadows. With phone calls, the voice may sound exactly like your family member’s, but an out-of-character request and a sense of urgency should set off your Spidey senses. Listen for red flags such as unnatural interruptions, unusual pacing or repetitive phrasing.
Since deepfakes are getting better all the time, it may be hard to note these discrepancies, especially when you’re caught off guard and flooded with panic. The scammers try to create a sense of urgency so you don’t have time to think before you act.
Ask yourself how likely this scenario would be. If your child is at school, why are they suddenly calling you for bail money? And if they needed bail money, why would it be so urgent? And why would local authorities ask you to send a wire transfer?
Verify the facts before making any moves. For example, you could text your child or call the school to find out if they’re in class. Another method is simple and pretty much foolproof: establish a safe word with close family members. If a supposed family member calls from an unknown number frantically asking for money, ask for the safe word.
If you suspect you’ve been scammed, contact local police and the FBI’s Internet Crime Complaint Center (IC3) at www.ic3.gov. If you’ve already sent money to a potential fraudster, contact your bank immediately.
Article sources
We rely only on vetted sources and credible third-party reporting. For details, see our editorial ethics and guidelines.
American Bankers Association (1); Deloitte (2); DeepStrike (3); Tampa Bay 28 (4); McAfee (5, 6)
Vawn Himmelsbach is a veteran journalist who covers tech, business, finance and travel. Her work has been featured in publications such as The Globe and Mail, Toronto Star, National Post, CBC News, Yahoo Finance, MSN, CAA Magazine, Travelweek, Explore Magazine and Consumer Reports.
