Older Americans reportedly lost $1.1 billion to fraud in 2022, according to the Senate Special Committee on Aging's annual report released this month, and many of the scams used AI technology to clone the voices of people the victims knew, among other AI-generated ploys.
During a Thursday committee hearing on AI scams, committee chairman Sen. Bob Casey, D-Pa., published the group's annual fraud book highlighting the top scams of last year. The book noted that, according to the FBI, "individuals reportedly lost $13 million to grandparent and person-in-need scams" from January 2020 to June 2021.
Sen. Elizabeth Warren, D-Mass., also a member of the committee, said the $1.1 billion figure in total losses is "almost surely an underestimate," since it does not account for victims who do not report scams out of embarrassment.
Casey said in a statement that "federal action" is needed to put up guardrails to protect consumers from AI-generated scams. There are currently few regulations on AI capabilities, a gap witnesses urged lawmakers to address through legislation.
“Any consumer, no matter their age, gender, or background, can fall victim to these ultra-convincing scams, and the stories we heard today from individuals across the country are heartbreaking,” he said. “As a parent and grandparent, I relate to the fear and concern these victims must feel.”
The top 10 categories of scams reported in the fraud book included financial impersonation and fraud, robocalls, computer scams, catfishing on dating profiles and identity theft.
The most prominent scams used AI technology to mimic the voices of victims' family members or loved ones in calls asking for money. Several witnesses testified at the hearing that they received calls that sounded exactly like a loved one who was in danger, injured or being held hostage.
Tahir Ekin, PhD, director of the Texas State Center for Analytics and Data Science, testified at the hearing that this deliberate strategy of impersonation catapults "their believability and emotional appeal."
“Prioritizing the enhancement of data and AI literacy among older Americans, and actively involving them in prevention and detection efforts, stands as a cornerstone,” he said.
One older couple, featured in a video testimony at the hearing, received a call from someone they believed was their daughter. She sounded distressed and asked for help.
“My daughter was, she was crying on the phone, profusely crying and saying, ‘mom, mom, mom,’ and of course my wife was saying, ‘LeAnn, LeAnn, what is the matter?’, and she repeated it again, ‘mom, mom, mom’ and it sounded exactly like her,” Terry Holtzapple, one of the victims, said.
Gary Schildhorn, a Philadelphia-based attorney and another target of an AI voice-cloning scam, also testified at the hearing. He nearly sent $9,000 to the scammer before his daughter-in-law confirmed it was an extortion attempt.
The scammer, posing as an attorney, called Schildhorn requesting funds to bail his son out of jail after allegedly causing a car accident and failing a breathalyzer test.
“There was no doubt in my mind that it was his voice on the phone — it was the exact cadence with which he speaks,” he said. “I sat motionless in my car just trying to process these events. How did they get my son’s voice? The only conclusion I can come up with is that they used artificial intelligence, or AI, to clone his voice… it is manifestly apparent that this technology… provide[s] a riskless avenue for fraudsters to prey on us.”
Because no money was sent, however, law enforcement told Schildhorn that no crime had been committed, and no further action was taken.
“With crypto and AI, law enforcement does not have a remedy,” Schildhorn said during the hearing. “There needs to be some legislation to allow these people to be identified… so that there’s a remedy for the harm that’s being caused. Currently, there’s no remedy,” he said.
According to the Federal Trade Commission, elderly Americans are more likely to fall prey to online scams than younger people.