“When it comes to frauds and scams targeting seniors, I am here to tell you that things aren’t as bad as you think—unfortunately, they are far worse. According to the FTC’s Consumer Sentinel report for 2022, older Americans reported more than $1.6 billion in losses to frauds and scams. This number is undoubtedly lower than the actual figure because many seniors, for a variety of reasons, including embarrassment or shame, fail to report the scams perpetrated against them. FTC estimates that in 2022 the actual amount lost by seniors to scams could be as high as $48.4 billion. And now with Artificial Intelligence (AI), the scams are getting worse.
“AI has become a sophisticated weapon that can be deployed by even the most unsophisticated scammers.”
Steve Weisman, a senior lecturer at Bentley University, offered this testimony in November before the Senate Special Committee on Aging, at a hearing on how Artificial Intelligence (AI) is being used in furtherance of scams against older adults. AI makes a scam more believable to the person being targeted by mimicking human-like behavior, often through speech or writing. It can take the form of chatbots (online virtual assistants), voice cloning technology and deepfakes (AI-generated videos). With voice cloning, scammers can replicate the voice of a target’s loved one, which has made emergency scams far more realistic.
In Sugar Land, Texas, ABC affiliate KTRK reported on a scam perpetrated against an 82-year-old man named Jerry (last name purposefully not reported). According to the report, Jerry received a call from what he believed was the San Antonio Police Department, telling him his son-in-law was in jail for causing a car accident. Jerry then thought he heard his son-in-law get on the phone to explain the situation. In fact, the voice was a clone: Jerry paid money to bail out a son-in-law who was never in jail, and he and his wife were swindled out of $17,000. To make matters worse, the couple lives in a costly assisted living facility, and Jerry says the scam will force him to find a job to make up for the money they lost.
Equally concerning is how easy this technology is to use. Scammers need only a few seconds of a person’s recorded audio (often found on the internet) to clone that person’s voice for fraudulent purposes. The same AI techniques also let scammers craft more personalized emails, phone calls and chatbots in furtherance of a scam.
Despite the level of sophistication associated with these scams, people can practice good habits to decrease the likelihood of being victimized. Advice from the Senate Special Committee on Aging includes:
- Do not share sensitive information via phone, email, text, or social media.
- Do not transfer or send money to unknown locations.
- Consider designating a “safe word” for your family that is only shared with family members and close contacts.
- Do not provide any personal or sensitive information to an online chatbot.
- Report potential scams to the authorities and the companies involved.
Thankfully, government agencies and elected representatives are taking notice.
At the Senate Special Committee on Aging hearing in November, “Modern Scams: How Scammers Are Using Artificial Intelligence & How We Can Fight Back,” senators heard testimony from witnesses who had been victimized by scammers and from experts in the field. Each witness brought a unique perspective on how to address the issue. Chairman Bob Casey (D-PA) indicated a federal response is required to combat this problem. After the hearing, Senators Casey, Richard Blumenthal (D-CT), John Fetterman (D-PA) and Kirsten Gillibrand (D-NY) sent a letter to the Federal Trade Commission (FTC) requesting more information regarding the use of AI in furtherance of scams. The committee also runs a Fraud Hotline (1-855-303-9470), which gives older adults and their family members a way to report scams and suspicious activity.
In addition, the private sector is responding to the problem. A Financial Times article, “AI heralds the next generation of financial scams,” details how companies like iProov and Catch are developing AI technologies to combat scammers. Catch’s product weeds out email scams and advises its clients on how to proceed. “The cash that [older adults] tend to lose is a lot more valuable to them—on average the cheque size they lose is higher and if they’re retired, they don’t have the time to make that money back,” co-founder of Catch Uri Pearl told the Financial Times.
“Shock, relief and anger—one emotion followed the other. I said to Brett that there was no doubt in my mind that it was his voice on the phone—it was the exact cadence with which he speaks. I sat motionless in my car just trying to process these events. How did they get my son’s voice? The only conclusion I can come up with is that they used artificial intelligence, or AI, to clone his voice.” This is what Gary Schildhorn told the Special Committee on Aging at the November hearing, describing how he felt when he realized he had almost paid a scammer $90,000 even though his son was never in jail.
Often, it feels like scammers are always one step ahead of the authorities. But with a proactive approach from government and the private sector, older Americans will hopefully be in a better position to protect themselves from these bad actors.
Evan Carmen, Esq. is the Legislative Director for Aging Policy at the B’nai B’rith International Center for Senior Services.