In a move to keep their millions of users safe and combat misinformation and harmful content – especially ahead of the upcoming election – TikTok is making changes to their community guidelines. According to a company blog post, this will include updating their policies on misleading content, broadening their fact-checking partnerships to help verify election-related misinformation, adding an in-app reporting option for election misinformation and working with experts, including the U.S. Department of Homeland Security, to protect against foreign influence on their platform.
They are especially focused on eliminating deepfakes and disinformation: “Our intent is to protect users from things like shallow or deepfakes, so while this kind of content was broadly covered by our guidelines already, this update makes the policy clearer for our users.”
Deepfakes are largely associated with both revenge porn and misinformation campaigns. According to The Verge, there are no reports of any election campaign using deepfake technology, although the Trump campaign has shared “less sophisticated,” or shallow, edits which have been branded similarly misleading.
To combat fake news, TikTok will be working closely with PolitiFact and Lead Stories to help with fact-checking content relating to the election. They will also be adding an election misinformation option to the in-app reporting feature and introducing an election information center to connect people to authoritative information, according to the blog post.
In January this year, TikTok came under fire for quietly building a system that would make creating deepfakes much easier. Fears surrounding deepfakes often centre on their potential to influence major election campaigns – and there’s no denying that the 2020 election will be exactly that.