That's something I wrote on November 15, 2016, and I'm seeing it today because my son John is quoting me on Facebook.
John is reacting to a new story at Politico: "Facebook lifts ban on posts claiming Covid-19 was man-made as Wuhan theories surge."
That headline has since been changed to "Facebook no longer treating 'man-made' Covid as a crackpot idea/Facebook’s policy tweak arrives as support surges in Washington for a fuller investigation into the origins of Covid-19." The new headline minimizes the problem with Facebook, which was censorship. Censorship is more than just treating something as a crackpot idea. One way to treat a crackpot idea is to call it a crackpot idea — that's the "more speech" remedy that should be our first resort for dealing with bad speech. In this case, more speech would have given us a better chance to approach the truth. By resorting to censorship, Facebook gummed up the speech process for a year.
Here's my original post from November 2016, "November 15, 2016/Facebook, don't even try to censor fake news. You can't draw that line, and you shouldn't want to." That was just after Trump won the election (and long before the arrival of Covid-19).
Mattman26 writes:
Even if you were to ascribe a ton of good-faith intent to Facebook in deciding what’s fake or not (and I certainly don’t), and even if you believe that some organizations possess the competency and trustworthiness to reliably make such a determination on one issue or another (again, I pretty much don’t), you’re left with, “Who the hell is Facebook to presume to possess the competency to determine the origin of a virus?”
There’s the logical fallacy of “appeal to authority” (experts say X, therefore X is true), and it seems like that’s about the only fuel we’re running on these days. How many people have the time or capacity to carefully evaluate competing theories of a virus’s origins, or whether claims of voting irregularities are “baseless”? Does anyone at Facebook who decided to stifle the “lab origin” theory even have any college biology under their belt, much less the highly specialized knowledge and aptitude you’d need to unwind the question? I’m going to guess no. Instead, it’s “all the right people are saying X, so anybody saying not-X should be shut out altogether.” The implications are horrifying.
Temujin writes:
It's good to see that Facebook, like our media and our government, reacts not to facts but to consensus. When they know where the pack is leaning, they're all in with it, as long as the pack is mostly progressives or people on the Left. Facts? Truth? Well, that depends on what the consensus of the group is. We'll get back to you on this 'truth' thing you ask about.
The evil part is that Facebook has BILLIONS of users. They curate the news, and they do so in a way designed to produce a desired result. So when they decide to shut off a fact, it's gone. In this case, enough of the facts finally leaked out, albeit too late for a proper investigation to take place and with an election already in the rearview mirror for many. But enough leaked out that even their media friends had to admit that it's a very real possibility.
So the largest global purveyor of information was dragged, kicking and screaming, to the truth. How hard it must be to have their mentality. At some point they must lose the connection between reality and their fake world and actually not know what's real and what's Facebook.