YouTube Bans 'Deepfakes' in 2020 Election

Google's video service clarified its rules ahead of the Iowa caucuses.

This March 20, 2018, file photo shows the YouTube app on an iPad.
AP Photo/Patrick Semansky, File

Better late than never, YouTube is making clear there will be no “birtherism” on its platform during this year's U.S. presidential election. Never mind that the conspiracy theory about former President Barack Obama's citizenship emerged in 2008 and has not been a widespread issue since he last ran for president in 2012.

The Google-owned video service is also reiterating that it won't allow election-related “deepfake” videos and anything that aims to mislead viewers about voting procedures and how to participate in the 2020 census.

Neither of these policies is new either, but YouTube clarified its rules ahead of the Iowa caucuses Monday in an apparent attempt to ensure that it is working to prevent the spread of election-related misinformation on its service. Google, Facebook, Twitter and other technology platforms are under intense pressure to prevent interference in the 2020 elections after they were manipulated in 2016 by Russia-connected actors.

The company is mostly reiterating guidelines that it has been putting in place since the last presidential election in 2016.

Its ban on technically manipulated videos of political figures was made apparent last year when YouTube became the first major platform to remove a doctored video of House Speaker Nancy Pelosi. But the announcement Monday further clarifies that it will take down any election-related videos that are technically altered to mislead people in a way that goes beyond simply taking clips of speech out of context. The company also said it would remove doctored videos that could cause “serious risk of egregious harm” — such as to make it appear that a government official is dead.

Facebook, which last year had resisted early calls to yank the Pelosi video, said in January that it was banning “deepfake” videos, the false but realistic clips created with artificial intelligence and sophisticated tools. Such videos are still fairly rare compared with simpler “cheap fake” manipulations, like the one that slowed Pelosi's speech to make it seem as if she were slurring her words.

Google also said Monday that it will remove any videos that advance false claims about whether political candidates and elected officials are eligible to serve in office. That had been policy before, but wasn't made explicit.

The company's announcement comes about nine years after celebrity businessman Donald Trump began drawing attention for claiming that Barack Obama, the nation's first African American president, was not born in the United States.

Trump repeatedly voiced citizenship doubts even after Obama produced his long-form birth certificate. Trump only fully backed off from the idea in the final stages of his 2016 presidential campaign.

YouTube said it will also crack down on any attempts to artificially inflate the number of views, likes and comments on videos. Last year, it changed the systems that recommend which videos users watch in a push to curb harmful misinformation. Twitter and Pinterest also outlined their efforts to reduce election misinformation on their platforms last week.
