Fact check: How do I spot audio deepfakes? – DW – 08/14/2024 (2024)

Did former US President Barack Obama suggest that the Democrats were behind the failed assassination attempt of his successor, Donald Trump?

Several audio recordings have been circulating in the US, ostensibly of Obama speaking with his former advisor, David Axelrod, about the upcoming US presidential election in November.

In one of the snippets, a voice that sounds like Obama's says: "It was their only opportunity and these idiots missed it. If only they could get rid of Trump, we would ensure their victory against any Republican candidate."

But the audio has been identified as fake, meaning Obama never said any of this. Instead, the audio was synthetically generated with the help of artificial intelligence (AI).

NewsGuard, a misinformation and media watchdog in the United States, published an analysis of the audio files. It used multiple AI detection tools and interviewed a digital forensics expert before concluding they were fake. It also spoke with an Obama spokesperson, who confirmed they were not authentic.

Obama is not the only politician with deepfakes of their voice out there. Earlier this year, an AI-generated audio of US President Joe Biden's voice urged voters in the New Hampshire primary election not to vote.

And it's not just a problem in the United States. Last year, shortly before an election in Slovakia, an audio deepfake went public impersonating liberal party leader Michal Simecka. In the UK, London Mayor Sadiq Khan also fell victim to a fake AI recording of him supposedly making controversial remarks.

Deepfakes: Manipulating elections with AI

Audio deepfakes have become a significant disinformation threat, especially in times of political uncertainty, such as during elections.

Easier to make, harder to debunk

AI-generated audio fakes can be particularly harmful in election cycles because they are so easy to create and disseminate.

"Compared to video deepfakes, they require less training data and computing power to produce nearly realistic outcomes," said Anna Schild, an expert in media and communication in DW's Innovation team.

She has been examining the impact of audio deepfakes together with her colleague Julia Bayer, and explains why their popularity is growing so fast.

"Their versatile applicability, from robocalls to voice messages and video voice-overs, offers many different dissemination channels," Schild said.

Audio deepfakes are also harder to detect than other forms of disinformation.

"Audio fakes are a bit more difficult to recognize than video deepfakes because we simply have fewer clues," Nicolas Müller, a machine-learning engineer at Germany's Fraunhofer Institute for Applied and Integrated Security, told DW.

"In a video, we have the audio, the video, and a certain synchronicity between them," said Müller, who has studied people's abilities to detect fake recordings. He and his colleagues found that, in audio files, there are fewer elements people can rely on to detect whether the recording is authentic or not.

So, what can users do if they encounter an audio file that they feel may have been created by AI?

One solution would be to combine standard verification techniques with the use of AI software specialized in identifying audio deepfakes.


Honing the senses

One way to verify whether an audio recording is real is to check the file for telltale patterns that could indicate AI interference.

In the above-mentioned example of Barack Obama, that would mean comparing the suspicious file with a known and verified audio track of his voice to find any possible deviations from Obama's normal manner of speaking.

This could include differing pronunciations, unnatural pauses, or unrealistic breathing patterns.
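To make "unnatural pauses" concrete in signal terms, here is a minimal, purely illustrative sketch that flags unusually long low-energy stretches in a mono audio signal. It is not a forensic tool, and the window size, energy threshold, and minimum pause length are arbitrary assumptions, not values used by any real deepfake detector.

```python
def find_long_pauses(samples, sample_rate, window_s=0.05,
                     energy_threshold=0.01, min_pause_s=1.0):
    """Return (start_s, end_s) spans whose RMS energy stays below a threshold.

    Illustrative values only: 50 ms analysis windows, pauses of 1 s or more.
    """
    win = max(1, int(window_s * sample_rate))
    pauses, start = [], None
    for i in range(0, len(samples) - win + 1, win):
        chunk = samples[i:i + win]
        # Root-mean-square energy of this window.
        rms = (sum(x * x for x in chunk) / len(chunk)) ** 0.5
        if rms < energy_threshold:
            if start is None:
                start = i  # a quiet stretch begins
        elif start is not None:
            if (i - start) / sample_rate >= min_pause_s:
                pauses.append((start / sample_rate, i / sample_rate))
            start = None
    # Handle a quiet stretch that runs to the end of the clip.
    if start is not None and (len(samples) - start) / sample_rate >= min_pause_s:
        pauses.append((start / sample_rate, len(samples) / sample_rate))
    return pauses
```

A real investigation would compare such pause patterns against verified recordings of the same speaker, since natural pause lengths vary from person to person.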

Beyond this, a closer look at the audio in question could involve checking for background noise or unnatural sounds.

Finding these clues can be hard for untrained listeners, but several tools have been designed to help people practice recognizing this kind of disinformation.

One of them is the Digger deepfake detection project, designed in cooperation with DW. The project has developed practical exercises for people to train their critical listening skills.

Nicolas Müller's team has also developed a game for participants to test how well they can spot audio deepfakes.


Using AI tools to combat AI disinformation

An additional verification layer involves using AI-supported software trained to detect audio deepfakes.

In our example with the synthetic voice of Obama, NewsGuard used deepfake checkers like TrueMedia, which has a deepfake detector bot that it says can reply to users' verification requests on the social media platform X (formerly Twitter).

Fraunhofer, meanwhile, developed Deepfake Total, a platform where users can upload suspicious audio files to have them analyzed. All uploaded files are rated with a score on a "fake-o-meter," which indicates the likelihood of the file being artificial.

It’s important, however, to stress that tools to detect deepfakes are not infallible. While they can estimate the likelihood of a file being AI-generated, they are not always correct. That is why such tools should always be used with caution, as one of multiple verification steps.
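The advice to treat any single detector score with caution can be sketched in code: combine the fake-likelihood scores of several detectors and only commit to a verdict when they agree. The function below is a hypothetical illustration of that principle; the thresholds are arbitrary, and no real tool works exactly this way.

```python
def consensus_verdict(scores, high=0.8, low=0.2):
    """Combine fake-likelihood scores (0..1) from several hypothetical
    detectors into a cautious verdict. Thresholds are illustrative only."""
    if not scores:
        return "insufficient evidence"
    avg = sum(scores) / len(scores)
    if all(s >= high for s in scores):
        return "likely fake"
    if all(s <= low for s in scores):
        return "likely authentic"
    # Detectors disagree: defer to manual verification.
    return f"inconclusive (average score {avg:.2f}); verify manually"
```

For example, if one detector reports 0.9 and another 0.1, the function refuses to call the file fake or authentic, which mirrors the article's advice to treat detector output as one step among several.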

This could include browsing fact-checking sites and platforms to see if the audio in question has already been debunked by other fact-checkers.

Several media outlets have also developed tools to identify audio deepfakes, such as VerificAudio, by Spain's global media company PRISA Media, which aims to detect fakes in the Spanish-speaking world.

Jose Gutierrez from PRISA Media explained to DW that the tool is based on two AI-guided processes: the first compares suspicious audio files with authentic recordings of the same person, while the second analyzes acoustic features such as bandwidth, pitch, frequency and sound texture.

Gutierrez also stressed that the tool did not provide definitive answers, only percentages of plausibility.


Checking the context

If all this seems too complicated or technical, it also helps, when trying to identify audio deepfakes, to fall back on more traditional verification skills that are not exclusive to audio recordings.

DW's Innovation team suggests "zooming out" to check content, source and any other relevant information on the file in question. They list some working tools on a website called "How to Verify".

Some helpful tips include comparing the audio content with known facts, checking the person’s social media channels, and searching for additional context on trustworthy news sources.

In the end, it's all about using a mixture of techniques. As DW Innovation points out, "there's no one-button solution that can assist and detect any kind of manipulation in audio."

This article is part of a DW Fact Check series on digital literacy. Other articles include:

  • How do I spot manipulated images?
  • How do I spot AI-generated images?
  • How do I spot a deepfake?
  • How do I spot state-sponsored propaganda?
  • How do I spot fake social media accounts, bots and trolls?

And here you can read more about how DW fact-checks fake news.

Edited by: Rachel Baig
