I’ve been cycling through The Righteous Mind and Moral Tribes, by Jonathan Haidt and Joshua Greene respectively. Haidt is a social psychologist; Greene, a moral philosopher. I started each of these books with the preconception that I would neither like nor agree with the content. As for liking, I suppose that’s a silly preconception better captured by whether or not I agree: that with which I don’t agree, I don’t like.
This said, I like the style of both of the authors, and I am finding the material to be less contentious than I first thought. I can already envisage myself agreeing with much of the substance but waiting to disagree with the conclusions.
Although I committed to documenting The Righteous Mind in situ, I’ve been listening to the audiobook whilst driving and getting ahead of myself, so I’ll have to rewind and retread in order to do this. In fact, the reason I switched back to Greene’s Moral Tribes was so that I wouldn’t progress even further in Haidt’s work.
I am writing this post to acknowledge this. I’d also like to document that I don’t believe humans are good reasoners, a position both Haidt and Greene hold to be generally true. Humans are post hoc rationalisers, which is to say that they make up their minds and then construct a narrative to justify that position. Haidt uses the analogy of an elephant and a rider, and he asserts that humans might more accurately be described as groupish than selfish. Certainly not shellfish. Greene notes that people have been shown to subordinate self-interest to political party interest, which helps to explain how people continually and predictably vote against their own interests. This also supports my position that democracy is a horrible form of government. Of course, Haidt would argue that this proves his point: people tend to adopt facts that support their perspective and diminish or disregard those that don’t.
Haidt suggests that reason is overvalued, but then he proposes intuition as a better alternative. I agree with him that reason is overvalued and for the same reasons (no pun intended) that he does. But it doesn’t follow that intuition is (1) better, (2) significantly better, or (3) good enough for (a) long term viability or (b) grasping complexity.
I am no more immune to this than anyone else. I recall Kahneman writing in Thinking, Fast and Slow that even though he is well aware of cognitive biases and fallacies, he himself can’t escape them either. When I used to teach undergraduate economics, I’d give some sort of policy assignment. As a preamble, I’d instruct the students that, without exception, all policy decisions have pros and cons. In their submissions, they’d need to gather both supporting and detracting arguments and then articulate why one should be adopted over another. At a minimum, I’d expect three pros and three cons.
The students would almost invariably complain about how difficult it was to imagine a counter-position. Even when they’d include some, they were usually weak tea. Oftentimes, the students all shared the same perspective, so they couldn’t even see the opposing side until we debriefed after the assignments had been graded. Although I do recall instances where students would admit that they hadn’t considered this or that opposing view, I can’t recall a case where a position was flipped after hearing new evidence—not that this was my intention. People do engage in escalating commitment, doubling down on existing beliefs and generating defensive, sometimes tortuous, arguments to support their positions.