Election deepfakes are here and better than ever

These days, it's not difficult to create a reasonably good deepfake, especially if someone has access to artificial intelligence tools, a decent gaming computer, and tons of audio and video samples of the person they want to clone.

Fraudsters use these video and audio clones, often played in real time, to defraud everyone from companies that think they're wiring money to a top executive to parents who transfer money in a panic after receiving a cry for help from someone they believe is their child.

And now the increasingly convincing fake videos, some of which are being shared on social media and promoted by people like Donald Trump and Elon Musk, are being used to dupe Americans ahead of the presidential election in November. Experts worry that these deepfakes could influence how, or even whether, people vote at all.

“It's important to make people aware of these things because the election is, what, three months away and it's already happening,” said Brandon Kovacs, a senior red teamer at cybersecurity firm Bishop Fox. Red teamers help companies bolster their cyber defenses by attacking them to find security vulnerabilities.

Election misinformation, whether spread by politicians or enemies of the United States, is nothing new. What is new for the 2024 presidential campaign is the rise of open-source, AI-based tools, complete with YouTube tutorials, that allow virtually anyone to create potentially convincing deepfakes and use them to spread misinformation.

“It's not like you're using some crazy secret tools,” Kovacs said. “Everything is out there and ready to go.”


Brandon Kovacs (Bishop Fox) demonstrates how easy it is to deepfake someone by transforming himself into a female colleague.

Bree Fowler/CNET

That's one of the main reasons why Kovacs spent a long weekend earlier this month demonstrating how easy it is to create fake videos at the AI Village at Defcon, the annual Las Vegas conference that brings together tens of thousands of hackers and other cybersecurity experts.

Using only a consumer gaming laptop, a simple DSLR camera, some lighting, and a green screen, Kovacs transformed eager Defcon attendees into everyone from better-known hackers on his team to celebrities like Jackie Chan and Keanu Reeves.

While the results were admittedly not perfect, it was amazing how well the face-swapping software worked. Participants were transformed into the deepfake person of their choice in real time on a TV screen next to them. Background scenes such as an office or TV newsroom replaced the green screen, and props such as wigs helped frame the swapped face and add elements of natural movement that made the overall image more convincing.

When the attendee moved and spoke, so did the deepfake image. Voices were also cloned in real time, but were barely audible in the packed convention center. There wasn't much video lag, and Kovacs said using a more powerful computer rather than a consumer model would have minimized it significantly.
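To make the mechanics a bit more concrete, here is a minimal sketch of what a real-time pipeline like the one described above could look like in Python with OpenCV: capture frames from a camera, chroma-key out the green screen, run each frame through a face-swapping model, and show the result live. The swap_face() stub and the newsroom.jpg backdrop are placeholders for illustration only; the article doesn't specify the actual models or tooling Kovacs used.

```python
# Minimal sketch of a real-time "green screen + face swap" preview loop.
# Assumes OpenCV, a webcam at index 0, and a background image file.
import cv2
import numpy as np


def swap_face(frame):
    """Placeholder for a real-time face-swapping model.

    A real pipeline would detect the face, align it, and blend in the
    target identity; here the frame is passed through unchanged.
    """
    return frame


def replace_green_screen(frame, background):
    """Chroma-key the green backdrop and composite the frame over a background."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Rough HSV bounds for a studio green screen; tune for your lighting.
    mask = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))
    mask = cv2.medianBlur(mask, 5)  # soften mask edges
    background = cv2.resize(background, (frame.shape[1], frame.shape[0]))
    return np.where(mask[..., None] > 0, background, frame)


def main():
    cap = cv2.VideoCapture(0)  # webcam or capture card
    background = cv2.imread("newsroom.jpg")  # fake backdrop (assumed file)
    if background is None:
        raise FileNotFoundError("background image not found")

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = replace_green_screen(frame, background)
        frame = swap_face(frame)
        cv2.imshow("deepfake preview", frame)  # live preview, as in the demo
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```

Even this toy version makes the point Kovacs was demonstrating: the hardware is consumer-grade, the building blocks are freely available, and the heaviest part of the setup is simply the model doing the face swap.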

The goal of the demonstrations was to make people aware of how advanced deepfakes have become and to help computer system defenders develop better models to detect them.

Deepfakes as disinformation

Deepfakes don't have to be cutting-edge technology to be convincing – especially when they're spread by a celebrity.

Trump recently posted images on his Truth Social account, at least some of which appeared to be artificially generated. These images suggested he was supported by megastar Taylor Swift and her fans. The images, which he captioned “I accept,” were originally posted on X, formerly Twitter, by a user who called them satire. One of the images reposted on Trump's Truth Social account even includes the word “satire” in the caption.

On the other hand, Trump has also falsely accused Vice President Kamala Harris' campaign of doctoring a photo taken at Detroit Wayne County Metropolitan Airport, claiming that she “artificially edited” it to show a huge crowd that he said did not exist. Yet numerous other videos and photos of the event showed a crowd similar in size to the one in the Harris campaign photo. Local reporters at the event estimated the crowd at about 15,000 people.

In the same Truth Social post, Trump also repeated his false claims of voter fraud by Democrats. Nearly four years after being voted out of office, Trump continues to spread the lie that the 2020 election was rigged, despite there being no actual evidence to support this.

Representatives for both the Trump campaign and Swift did not respond to emails seeking comment.

Elon Musk, the owner of the Trump-supporting platform X, drew criticism in July after posting a video on the platform that used voice-cloning technology to mimic Harris' voice, layering a deepfaked voiceover over footage from one of her campaign videos.

Musk did not initially describe the post as satire; he clarified that it was meant as such only after receiving criticism.

It's still okay to be funny, right?

Whether it comes from late-night talk show hosts or the internet, satire is hard to separate from modern American politics. Granted, it can sometimes inadvertently misinform people. But deepfakes take things to a new level when the people behind them intentionally use them to spread disinformation for their own benefit.

And Adam Marrè, chief information security officer at cybersecurity company Arctic Wolf, says these deepfakes need to convince relatively few people to be effective.

“When people look through their feeds, they're going to see these and I think some of them might not understand that these are deepfakes, or maybe they don't care,” Marrè said. “And all of that will influence opinion in some way.”

That's why he believes it's crucial for social media companies, as well as AI companies, to do their best to track down the people behind malicious deepfakes.

“I still worry, though, that we're relying on them to do this, on their goodwill, on their desire to be a good citizen,” he said. “There's no regulatory basis that we can use, and no policymakers that can enforce this. That's something we still lack.”
