The Real Threat of Fake Faces

Imagine a video of Donald Trump appearing in your social media feed. Would you ask yourself whether the video is real? And why shouldn’t it be? This is the reality of deepfakes. How can such fabricated imitations threaten the integrity of democratic elections?

[Illustration: deepfakes]


In the article “Anticipating and addressing the ethical implications of deepfakes in the context of elections”, Nicholas Diakopoulos and Deborah G. Johnson discuss how deepfakes could affect the U.S. presidential election in November 2020. Deepfakes are videos in which, with the help of synthetic media technologies, anyone can be made to look or sound like an entirely different person. Using a total of eight constructed scenarios as the basis for their analysis, the authors examine two ethical questions: (1) What harms result from the use of deepfakes in elections? and (2) What can be done to limit those harms?


The harm and how to deal with it

The analysis concludes that deepfakes can cause harm in elections. Extending the existing literature, the authors identify a novel misattribution harm they call “persona plagiarism”, which they hope will be useful in further analyses of deepfakes in elections and beyond. To limit the harms of deepfake use, the article points out four strategies: education and media literacy, subject defense, verification, and publicity modulation.


The scenarios

In the first scenario, Diakopoulos and Johnson show how entertainment allows politically motivated individuals to exercise their free speech by creating parodies. It also shows how moving content from one context to another can lead to misunderstandings, and the scenario raises ethical questions about distributing videos of “people” without their consent.

“As part of its coverage of the primaries, a satirical news show develops a website where viewers can interactively create synthesized videos of the various candidates. The users can act like a puppet-master with their web camera, making gestures and mouth movements that are mapped onto a candidate in a synthesized video. If the user records their voice, it is also dubbed and lip synced onto the candidate. Several users create humorous parody videos of candidates, exaggerating characteristic facial expressions and gestures while adding passable voice impersonations. Some of the caricatures reveal genuine concerns and reflect critical commentaries of the candidates, while others more vacuously mock or ridicule. Users download the videos and re-upload them to social media platforms such as YouTube, Facebook, and Reddit where they begin circulating. While most viewers (e.g. ~90%) can easily recognize the videos as parodies, a few are convincing enough that some people can’t tell. In particular, one of the more visually believable videos is an exaggerated portrayal of a female candidate’s approach to health policy that goes viral”.

The second scenario illustrates how convincing synthetic video can be. By combining real events with synthetic content, a deepfake becomes harder to spot. As Diakopoulos and Johnson point out, a candidate’s qualifications are often exaggerated, but deepfakes have expanded the possibilities for doing so:

“A small veterans’ organization would like to see a particular candidate win the democratic primary because although he is not the only candidate with military experience, he is the only one with significant combat experience and the only one making veterans issues a central component of his campaign. The group becomes a Political Action Committee (PAC) and raises funds to make a promotional video for the candidate. The video consists of a combination of video clips with voice over that valorize the candidate’s bravery. One of the clips is a synthesized depiction of an incident in Iraq when the candidate heroically saved the lives of several members of their platoon. The video is posted on YouTube without any indication that one portion is synthesized. Thousands view the video; hundreds make comments. From the comments, it is apparent that some viewers believe the video is real footage taken by a reporter present at the event. Other comments include complaints from soldiers in the candidate’s platoon who claim that the depiction is an exaggeration of what happened”.