If you're still worried about AI taking your job, the Federal Trade Commission (FTC) Chair has news for you: it could also be used as a tool to take your money from you.
If this sounds like some rejected subplot from a dystopian sci-fi novel, think again. FTC Chair Lina Khan has warned the public about the perils of customer data being leveraged for possible “surveillance pricing.”
During the 2024 Fast Company Innovation Festival in September, Khan used the example of a passenger being charged more money for an airline ticket “because the company knows that they just had a death in the family and need to fly across the country.”
As of now, no airline has implemented this practice or been investigated for it. But could Khan’s prediction become a reality?
From cunning scammers to commerce vendors
Over the years, airlines have already introduced additional fees for checked bags, seat selection, ticket changes, and even carry-on luggage — items once taken for granted as free.
The situation has become so dire that it stoked the ire of the U.S. Senate, which, in March, stepped up its investigation into the billions of dollars airlines rake in through excess fees.
"Given just how much intimate and personal information that digital companies are collecting on us, there's increasingly the possibility of each of us being charged a different price based on what firms know about us," Khan elaborated.
This isn’t just about air transport, either. Targeted data crunched by artificial intelligence (AI) is already here, arriving — as these things often do — via scammers.
Whether through events such as the 2017 Equifax breach that exposed the personal information of 147 million people, or companies such as Meta reneging on user privacy commitments, it’s increasingly possible for your valuable personal data to somehow get into the digital wild.
Sometimes, the danger can come from an innocuous source: purchase data that a company acquires through honest means (think: your favorite online merchants, for example).
If merged with broader information obtained and analyzed via cutting-edge analytics, the merchant could discover you’re in a vulnerable position in the blink of an AI, if you will. This can then possibly open the door for price gouging.
How to protect yourself
In February, popular fast-food chain Wendy’s endured online scorn and ridicule shortly after CEO Kirk Tanner announced a $20 million investment in digital menu boards to test dynamic pricing.
The AI-driven technology could, in theory, analyze consumer habits based on rushes and lulls, and charge accordingly. Customers slammed it as a backdoor way to rip them off. (Wendy’s has since said the digital menu boards would only provide discounts or recommend products.)
You can also tell politicians and media outlets when you spot price discrimination — and let companies know you're onto them through social media.
Meanwhile, AI scams have become part of the FTC’s "bread and butter fraud work," Khan noted, so reach out to the agency when you see or experience something amiss.
"Some of these AI tools are turbocharging that fraud because they allow some of these scams to be disseminated much more quickly, much more cheaply, and on a much broader scale," she said.
Before anyone comes to your rescue, know that criminals will continue to ply their cunning as AI grows more sophisticated.
One newer scam involves voice cloning, wherein malefactors use the voice imprint of a loved one (recorded in a coffee shop, for example) to plead for emergency cash, hoping your guard’s down because it’s a voice you recognize.
The technology is so widespread that reputable outlets, such as Descript, boast that they can clone your voice in a minute using only seconds of an audio recording.
In 2023, McAfee surveyed 7,000 people and found one in four people had experienced a voice cloning scam or knew someone who had.
In addition, one in 10 respondents revealed they’d received a phone call with an AI voice clone, with 77% of that cohort admitting they’d lost money as a result.
To protect yourself, don’t take calls from strange or unknown numbers. And if someone asks for money — even if they claim to be a person known to you — hang up and call that person back at their legitimate phone number.
With email phishing, scammers now use AI technology to make their copy more realistic, or to pose as someone who may oversee an account service, such as Netflix.
Lou Carlozo is a freelance contributor to Moneywise.
