There are good reasons to worry about Facebook's "fact checking" efforts, primarily that the company is a raging garbage fire that destroys everything it touches and only has two approaches to solving any problem:

1/
1. Make it fully automated in a way that guarantees that there will be innumerable false positives in which legitimate material is erroneously censored, with no effective means of appeal, even as dedicated trolls exploit the system's blind spot to carry on as normal;

2/
2. Hire an army of underpaid, traumatized subcontractors and hold them to quotas that leave only seconds to weigh each complex judgment call.

3/
But there's one reason NOT to worry about Facebook factchecking, and that's the "Backfire Effect," a discredited psychological principle that holds that when people learn facts that challenge their worldview, they double down on their false beliefs.

4/
Again, I don't trust Facebook to do anything well, let alone factchecking. But among the things Facebook does badly, apparently, is "understanding how factchecking works."

7/
This is well put in an op-ed by @EthanVPorter and @ThomasJWood, authors of "False Alarm: The Truth about Political Mistruths in the Trump Era," a peer-reviewed book from Cambridge University Press.

https://www.cambridge.org/us/academic/subjects/politics-international-relations/american-government-politics-and-policy/false-alarm-truth-about-political-mistruths-trump-era

8/
And they ran a new study showing that this holds on Facebook, too: "Across all issues, people who had seen misinformation and then a related fact-check were substantially more factually accurate than people who had only seen the misinformation."

10/
"Prior research has found that, on social media, fake news is disproportionately shared by older, more conservative Americans. In our study this group did not show any special vulnerability to backfire effects. When presented with fact-checks they became more accurate too."

eof/