This Tuesday, Republicans introduced a bill that would begin to strip away some of the protections afforded to internet platforms – particularly social media platforms – relating to the content they do (or don’t) choose to moderate.
Section 230 essentially means that platforms are not responsible for the content posted on their sites, shielding them from legal liability when a user posts something illegal (with exceptions for copyright violations, sex work-related material, and violations of federal criminal law). Although it’s often misunderstood, the provision is much contested, with many politicians on both sides, including President Trump, believing tech companies hold too much power when it comes to both limiting free speech and allowing dangerous and incorrect content to spread.
The bill introduced on Tuesday is called the Online Freedom and Viewpoint Diversity Act. Introduced by Senators Lindsey Graham, Roger Wicker, and Marsha Blackburn, the bill aims to pressure platforms and moderators to clearly state why content has been removed or restricted, or risk losing the protection which Section 230 provides.
Notably, the act seeks to clarify the terms platforms use when restricting posts – for example, by swapping the vague term “otherwise objectionable” for more concrete explanations like “promoting terrorism” or “self-harm”. This means companies would need to have more concrete policies in place, and would need to hold an “objectively reasonable belief” that content violates a specific policy.
“Social media companies are routinely censoring content that, to many, should be considered valid political speech,” Graham said in a statement Tuesday. “This reform proposal addresses the concerns of those who feel like their political views are being unfairly suppressed.”
Senator Blackburn made the point that big social media platforms like Facebook and Twitter “exert unprecedented influence over how Americans discover new information, and what information is available for discovery,” and since there is “no meaningful alternative” to these platforms, “there will be no accountability for the devastating effects of this ingrained ideological bias”.
While the sentiments seem good, I fear the motives behind this bill are less benevolent than they are self-interested. There is persistent talk of social media platforms skewing their moderation policies in favor of left-wing politics. Disinformation and hate speech are consistently removed or labeled as such, often leaving President Trump caught in the crossfire. Facebook has been especially diligent throughout the pandemic and the 2020 election campaign, aiming to label and dispel misinformation relating to the spread of Covid-19 and the voting process.
Still, after being accused of an anti-conservative bias by President Trump, and with free speech at the heart of its values, Facebook has fumbled the bag more than once when it comes to its moderation policy. Most recently, we saw this with the Kenosha Guard page (belonging to a militia group organizing a counter-protest to the BLM marches in the city), which almost undoubtedly led to the shooting and killing of two protesters by a young Facebook user. Notably, Facebook’s striving for impartiality has also facilitated the rise of QAnon, an unfounded conspiracy theory suggesting that Trump is trying to unearth a pedophile ring in the upper classes of America. QAnon has links to the anti-lockdown movement, the 5G conspiracy, and the idea that the coronavirus does not actually exist.
On the other hand, the bill may lead to companies enforcing stricter policies and further limiting freedom of speech on their platforms, especially since revoking Section 230 protections means these companies would be liable for the content posted on them. To protect themselves, companies might remove even more content than before, having the opposite effect of the one the bill is striving for.
So it seems that, rather than protecting society’s freedom of expression, these policies might incite further harm.