🚂Naturally, once I finally finished laboriously trying to solve the problem of free speech, I remembered the relevant concept for that interview. It's something I called "shouldful thinking" in parallel with the wishful and fearful forms of motivated reasoning.
You can see why it's hard to overcome. People's consciences prescribe what they should and should not think, not just what they should and should not do--so willingness to change your mind depends not only on social pressure but on whatever internal "rules" apply.
I struggled terribly with this--it was one reason it took me so long to see scholarship as unreliable. (I had this idea that doubting people, questioning them, or seeing myself as more likely to be right was insufferably arrogant and insulting, so I just...wouldn't think so.)
What I would have needed instead was a "morality of thought" that prioritized reasoning and critical thinking. These words were used in school but the underlying concepts weren't taught, and no one else I knew ever talked about them at all.
What also happens, even among people who do prioritize reasoning and critical thinking (in practice, not just in name), is that they'll carve out exceptions. These could be political, religious, relational ("I'll question God but not my wife")...anything, really.
So if you have a whole culture or group believing that it is *morally wrong* to hold a certain view or idea, of course they won't change their minds in that direction (or admit to it), even if all the evidence is right in front of them. They'd have to value evidence over virtue.
Which most people don't, I think, because otherwise we'd collectively be much better at evaluating evidence than we are.
I've heard this sort of thing denigrated as "religious," but I don't think that's valid. Religions do prescribe morality, but they aren't the only things that do. And someone who does prioritize critical thinking probably does so for moral reasons, too--just different morals.
Anyway...this is how I understand motivated reasoning of the "shouldful" kind. If it makes you feel guilty and wrong to think a certain way, you'd have to really prioritize evidence and reason to overcome that sense of guilt (and, in my case, fear of hurting someone).
I don't see this as cowardice or an indictment of anyone's character, but I do think it makes us vulnerable to serious mistakes. It's also a problem when people simply switch the moral valence of an idea from bad to good.
As well as when they *claim* to prioritize evidence and reason but really don't (or aren't very good at it). But that's a problem for another day.🚂
And--holding a view for non-evidentiary reasons doesn't make that view wrong, nor does holding it for evidentiary reasons make it right. And just because a whole group believes strongly in something doesn't mean they're (all, only) doing so because they think they should. They could be justified, too.
🚂FYI, guys, this thread was *really helpful.* I didn't think so at the time, but I'd forgotten how it was to live like that in the complete absence of malice or awareness from anyone--very ironic, since "resist conformity" was such a strong ~idea~ in my school/home/generation.