WASHINGTON – Calls made using artificial intelligence-generated voices are illegal, the Federal Communications Commission said on Thursday after a fake robocall impersonating President Joe Biden tried to dissuade people from voting for him in New Hampshire’s Democratic primary.
FCC Chair Jessica Rosenworcel said the declaratory ruling gives prosecutors new tools to go after entities behind robocalls.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, impersonate celebrities and misinform voters. We’re putting the scammers behind these robocalls on notice,” Rosenworcel said.
The FCC noted that previously, attorneys general could target the result of an unwanted AI voice-generated robocall, but the new action makes the use of AI to generate the voice in those robocalls itself illegal.
Earlier this week, New Hampshire Attorney General John Formella said the fake Biden robocall was traced back to Texas-based Life Corp. He said a cease and desist letter had been sent to the company, which is run by Walter Monk, and that a criminal investigation is underway.
“The use of generative artificial intelligence has brought a new threat to voter suppression schemes and the campaign season with the increased believability of fake robocalls,” said Democratic FCC Commissioner Geoffrey Starks.
“Voice cloning,” the FCC said, “can convince the called party that a trusted person or someone they care about, such as a family member, wants or needs them to take some action that they would not otherwise do.”
In 2023, the FCC finalized a $5.1 million fine imposed on conservative activists for making more than 1,100 illegal robocalls ahead of the 2020 US election.
The solicitations sought to discourage voting by telling potential voters that if they voted by mail, their “personal information would be part of a public database used by police departments to track old warrants and used by credit card companies to collect debts.”