Study Warns About AI Programs Impersonating Biden, Other Candidates, To Manipulate Elections
A study from the Center for Countering Digital Hate (CCDH) revealed that AI programs can mimic the voices of politicians like President Joe Biden and former president Donald Trump, which could be used to manipulate elections.
Tests conducted by CCDH found that AI voice-cloning tools produced convincing but false statements about 80% of the time.
The organization's CEO, Imran Ahmed, told The Hill that these tools lack adequate safeguards and that the level of skill required to use them today is low.
"These platforms can be easily manipulated by virtually anyone to produce dangerous political misinformation," he lamented.
In 2024, cloned voices have already been used to influence voters. Robocalls using a fake Biden voice during the New Hampshire Democratic primary told voters to stay home in an attempt to suppress turnout.
Steve Kramer, the man behind the scheme, said he was motivated by a need to warn people about how dangerous AI can be. He was charged with voter suppression and a misdemeanor for impersonating a candidate, and the Federal Communications Commission (FCC) fined him $6 million.
Following the New Hampshire primary incident, the FCC banned AI-generated voices in robocalls. It also proposed requiring political TV ads to disclose whether AI was used.
"As artificial intelligence tools become more accessible, the commission wants to make sure consumers are fully informed when the technology is used," said Jessica Rosenworcel, FCC Chair.
"Today, I've shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue," she said.
CCDH tested six AI tools and found that only a few of them have any built-in safeguards against disinformation, AOL revealed. The six tools were ElevenLabs, Speechify, PlayHT, Descript, Invideo AI, and Veed.
CCDH also tested the tools using different politicians' voices, including those of Biden and Trump. Other politicians included in the test were UK Prime Minister Rishi Sunak and French President Emmanuel Macron.
Examples of the AI-generated messages included a warning telling people not to vote because of a bomb threat, a claim that Biden had manipulated election results, and Macron confessing to misusing funds.
Of the AI programs tested, only ElevenLabs passed, in that it blocked the creation of fake statements using politicians' voices.
© Copyright IBTimes 2024. All rights reserved.