Jessica Rosenworcel, Chairwoman of the US Federal Communications Commission (FCC), has proposed making robocalls that use AI-generated voices illegal.
Today we announced a proposal to make AI-voice generated robocalls illegal – giving State AGs new tools to crack down on voice cloning scams and protect consumers. https://t.co/OfJUZR0HrG
— The FCC (@FCC) January 31, 2024
“Attackers are increasingly using AI technology to imitate human voices through robocalls. We are taking measures to protect consumers from such fraud,” the agency said in a statement.
According to the FCC, the number of such calls has increased sharply in recent years because technology can “mislead consumers by imitating the voices of celebrities, political figures and even family members.”
If the proposal is adopted, the regulator would give state attorneys general “new tools to pursue the criminals behind these nefarious call bots and hold them accountable to the fullest extent of the law.”
Such calls would be regulated under the Telephone Consumer Protection Act of 1991, which sets standards for political and marketing calls made without the subscriber's consent.
The announcement came after a scandal involving an AI imitation of US President Joe Biden's voice. A robocall urged recipients not to vote in the primaries because doing so would “allow the Republicans to elect Donald Trump again.” One of these calls was recorded by NBC journalists.
NBC reports that NH voters are getting robocalls with a deepfake of Biden's voice telling them to not vote tomorrow.
— Alex Thompson (@AlexThomp) January 22, 2024
The New Hampshire attorney general's office issued a statement calling the calls misinformation and advising voters to “ignore the content of the messages entirely.” Trump's representatives deny involvement in the incident.
Recall that in July 2023, the UN said that deepfakes created by artificial intelligence threaten the integrity of information and fuel incitement of hatred in society.
In March 2023, journalist Joseph Cox bypassed a bank's voice ID system using a free AI speech synthesis service. According to the Washington Post, scammers are increasingly using voice imitation technology to extort money from victims' relatives.