Mark Zuckerberg made several newsworthy choices this week. One, citing Holocaust denial as an example of content that Facebook should keep up because “there are different things that different people get wrong” and “it’s hard to impugn [their] intent,” was ill-advised.
But another, keeping Facebook from diving deeper into the business of censorship, was the right call. On Wednesday, Facebook announced a new policy to remove misinformation that contributes to violence, following criticism that content published on the platform has led to attacks against minorities overseas. When pushed to go further and censor all offensive speech, the company declined.
While many commentators are focusing legitimate criticism on Zuckerberg’s comments about Holocaust denial, others are calling for Facebook to adopt a more aggressive takedown policy. What's at stake here is the ability of one platform that serves as a forum for the speech of billions of people to use its enormous power to censor speech on the basis of its own determinations of what is true, what is hateful, and what is offensive.
Given Facebook’s nearly unparalleled status as a forum for political speech and debate, it should not take down anything but unlawful speech, like incitement to violence. Otherwise, in attempting to apply more amorphous concepts not already defined in law, Facebook will often get it wrong. Given the enormous amount of speech uploaded every day to Facebook’s platform, attempting to filter out “bad” speech is a nearly impossible task. Relying on automated tools to try to deal with the volume is only likely to exacerbate the problem.
If Facebook gives itself broader censorship powers, it will inevitably take down important speech and silence already marginalized voices. We’ve seen this before. Last year, when activists of color and white people posted the exact same content, Facebook moderators removed the activists’ posts while leaving the identical posts from white users up. When Black women posted screenshots and descriptions of racist abuse, Facebook moderators suspended their accounts or deleted their posts. And when people used Facebook as a tool to document their encounters with police, Facebook chose to shut down their livestreams. The ACLU’s own Facebook post about censorship of a public statue was also taken down by Facebook.
Facebook has shown us that it does a bad job of moderating “hateful” or “offensive” posts, even when its intentions are good. Facebook will do no better at serving as the arbiter of truth versus misinformation, and we should remain wary of its power to deprioritize certain posts or to moderate content in other ways that fall short of censorship.
There is no question that giving the government the power to separate truth from fiction and to censor speech on that basis would be dangerous. If you need confirmation, look no further than President Trump’s preposterous use of the term “fake news.” A private company may not do much better, even if it’s not technically bound by the First Amendment to refrain from censorship.
As odious as certain viewpoints are, Facebook is right to resist calls for further outright censorship. When it comes to gatekeepers of the modern-day public square, we should hope for commitment to free speech principles.