Facebook will reject political ads from either party that claim victory before the election results are officially announced.
Earlier this month, Facebook CEO Mark Zuckerberg announced his action plan to “protect democracy” in the run-up to the election on November 3. The 2020 election has been called one of the most important in our lifetimes and, in a year like 2020 – with riots and a deadly pandemic taking center stage in the national and international psyche – tensions are high.
One of Facebook’s main concerns is the chance that the results could be delegitimized, particularly by President Trump. Due to the pandemic, a huge number of voters, most of whom are likely to be Democrats, will choose to vote by mail-in ballot. The chances are that the initial result we see on election night will be quite different from the official result we see once all of the mail-in ballots have been counted, something that could take days or even weeks.
Trump has already attacked mail-in voting, suggesting it is fraudulent and may produce unreliable results (claims that have been debunked by experts), and Zuckerberg, like many others, is concerned that either candidate could take the result on election night and claim it as the official result.
Initially, Zuckerberg said Facebook would use its Voting Information Center to make users aware that election-night results aren’t official, and planned to fact-check and label any posts claiming victory before the true results were out. However, Fast Company pointed out that, although Facebook was attempting to limit political disinformation by banning political ads in the run-up to the election, that would do nothing to “prevent a candidate such as Donald Trump from declaring victory before the final results are tabulated” at 12:01am on November 4.
Hours later, a Facebook spokesperson told Fast Company that they would now be “rejecting political ads that claim victory before the results of the 2020 election have been declared.”
This policy is coupled with a move to potentially restrict content around election day to quell any violence that might erupt, including hate speech and fake news. Speaking to the Financial Times, Facebook’s Head of Global Affairs, Nick Clegg, said the company was looking at “some break-glass options available to us if there really is an extremely chaotic and, worse still, violent set of circumstances.”
Time and time again, Facebook has failed to act on hate speech, some of which has led to genuine atrocities, such as the 2017 genocide in Myanmar and, more recently, the killing of two protesters by a teenager reportedly connected to a right-wing militia group whose page Facebook had failed to remove. The company has also come under considerable scrutiny this year for allowing the spread of political misinformation and health disinformation about the pandemic, as well as for failing to curb the rise of QAnon, a far-right conspiracy theory.
However, this recent move does mark a step in the right direction for Facebook’s election policy, especially after the Cambridge Analytica scandal surrounding the 2016 election. The outcome of election night is still up in the air, and many people are fearful of the events that may follow; Facebook’s new policies are a direct response to those concerns.