In a post on November 9, 2016, Facebook founder Mark Zuckerberg outlined how he was going to tackle the “fake news” occurring on his massive social media site. He stated:
The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible.
So far so good.
But he then went on to outline a seven-point plan that would rely on users, technical means, and third parties to identify and flag fake news. Why could that be a problem? Because the third parties they intend to use – Snopes.com has been mentioned, among others – have their own biases. As do all Facebook users; one person’s most trusted source can be viewed by someone else as unreliable. So is Facebook going to censor posts based on the advice of biased sources?
Let’s fast-forward to Dec. 27, 2016. Brendan Larsen of the GodOrAbsurdity.com website reported that he was now on his fourth Facebook page – the three previous editions having been shut down by Facebook for violating their Community Standards – and that he’d had a total of 35 posts banned by Facebook. According to Larsen:
The original page had about 13,000 likes and was reaching millions of people until atheists got it shut down. I’m taking a new approach now where we avoid posting anything that might get us banned – it’s just too difficult trying to rebuild followers from zero each time they shut us down.
While some of Larsen’s posts were graphic – he showed the brutalized bodies of aborted children – Facebook says it removes graphic images only “when they are shared for sadistic pleasure or to celebrate or glorify violence.” That was certainly not the case here.
Facebook also says they will remove:
…content that directly attacks people based on their: Race, Ethnicity, National origin, Religious affiliation, Sexual orientation, Sex, gender, or gender identity…
This seems the most likely reason Larsen was banned (Facebook didn’t provide an explanation) since he has shared posts about Islamic terrorism – to link terrorism and Islam is, in some circles, automatically “hate speech.” This is the problem with biased users policing speech on Facebook – instead of censoring what’s fake, they may simply censor what they don’t like.
On February 20, LifeSiteNews.com reported that Christian “vlogger” (video blogger) Elizabeth Johnston was having similar troubles for posting Biblical commentary on homosexuality. Johnston said:
They are muzzling me and my biblical message while Mark Zuckerberg claims that FB is unbiased…. The post Facebook deleted included no name-calling, no threats, and no harassment. It was intellectual discussion and commentary on the Bible.
This story has a happier ending – on February 24, after LifeSiteNews.com brought publicity to her situation, Facebook apologized for this “error” and restored her post.
What’s the takeaway? In asking Facebook to eliminate “fake news” we are also asking them to become the arbiter of truth for their users. But do we really want them “policing” the news we read? God tells us that it is the presence of multiple counselors (Prov. 11:14) and access to the other side of the story (Prov. 18:17) that helps us find the truth. This is why Christians, overall, oppose censorship – we don’t want someone limiting who we can hear from. We shouldn’t trust Facebook, or anyone, with such enormous power.
Of course there is a time and place for censorship, but it is a blunt tool, and should only be used for clear and pressing problems. So, for example, Facebook should ban posts that promote pornography and human trafficking – these are, on the one hand, enormous evils, and on the other, clear evils. To confront this sort of wickedness requires very little in the way of judgment or discernment on the part of Facebook – it would be hard for them to mess up here. But when it comes to “fake news” the problem simply isn’t big enough or clear enough to turn to censorship as the solution.
Instead, we should simply test what we read, and pass along only that which we know to be true. If in doubt, don’t pass it on – a simple but effective solution if ever there was one!