Until corporate media and the neoliberal establishment, refusing to acknowledge their direct role in the election of Donald Trump, threw a temper tantrum about misinformation on social media to scapegoat blame, Facebook CEO Mark Zuckerberg balked at the notion that faulty reports circulating on social media had anything at all to do with the November 8th shocker.
“Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes,” Zuckerberg wrote in a post to his platform last Saturday. “The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
Now, rather than stand by that original assertion, Zuckerberg instead cast all logic aside and unleashed a Machiavellian seven-point plan to eradicate the “very small amount” of false information — read: all opinion not in lockstep with the establishment narrative — from the newsfeeds of Facebook’s billion-plus users.
Because, apparently, we can’t be trusted to think for ourselves.
“The bottom line is: we take misinformation seriously,” Zuckerberg wrote late Friday evening, apparently forgetting what he posted exactly one week ago. “Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We’ve been working on this problem for a long time and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done.”
Curiously, the head of the Facebook Ministry of Truth neglected to explain how the 65 corporate presstitutes and myriad mendacious mainstream outlets exposed in Wikileaks’ Podesta Files for colluding with the Clintonite establishment were awarded a free pass to spread propagandistic disinformation — and, frequently, flagrant lies.
Worse, what Zuckerberg wrote next should send chills down the spines of anyone who has ever been forced to deal with fallout from the social media platform’s already-rampant and oft-inexplicable censorship via erroneous and revenge reporting on posts, arbitrary unpublishing of pages, ghosting, and newsfeed suppression — as well as those who look to Facebook for alternatives to vapid mainstream media:
“Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it’s much less likely to spread.”
Snopes? Really? The same Snopes that took it upon itself to “debunk” an inside joke in meme form that happened to go viral?
In just those three sentences, Zuckerberg does more to expose the innate perils of censorship than any scholarly tome on the subject ever could — personal opinion always guides the censor’s heavy hand. It’s an inescapable fact that what one individual deems devoid of value, another may find sacrilegiously offensive — while yet another may laugh off as innocuous.
Dismissing that maxim — or, perhaps, forgetting it formed the foundation for First Amendment protections of free speech, press, and expression — Zuckerberg laid out his plan to combat the ‘relatively small percentage of misinformation,’ encompassing the following points:
- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.
- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.
- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.
- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.
- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.
- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.