I read an interesting interview today, Sean Illing of Vox.com talking to Prof. Steven Sloman, a psychologist at Brown. Sloman’s on a book tour pitching The Knowledge Illusion: Why We Never Think Alone, co-written with Philip Fernbach. It seems bound to be a very worthwhile book and I’ve ordered a copy.
It’s a subject of particular interest to me, having spent a lifetime studying at close hand how people make big, consequential decisions on issues of war, money, and engineering design. Often remarkably bad decisions.
It has long been speculated that humans have two quite different modes of problem-solving mental activity. They go by various names, but one way of putting it is that we can either (1) call up remembered skills, ideas, and facts and thread them together in our amazing associative memory, or (2) engage in rational analysis. Any reader of Daniel Kahneman’s marvelous Thinking, Fast and Slow will immediately recognize these as his “system 1” and “system 2,” respectively. As Kahneman explains, the associative thinking of system 1 is very fast and very energy-efficient, while the analytical thinking of system 2 is slower and costs much more energy.
It’s a little amazing that our brains are as big and powerful as they are, for that big, energy-hungry brain was a major burden for our early ancestors, who always lived perilously close to the edge of starvation. Virtually all of our evolutionary history was spent as hunter-gatherers living in tiny bands with very minimal resources, and we can feel sure that the architecture of our minds is almost entirely optimized for survival under such conditions.
This seems to imply a strong inherent tendency to minimize brain energy expenditure, among other things through reliance on the associative system 1 and avoidance of the analytical system 2. Consider a concrete example: driving a car. When you first start driving you have to think about everything you do, and it’s exhausting. You come home from a driving lesson ready to veg out, and you’re liable to make careless mistakes in your driving. But once you have learned the necessary repertoire of skills, driving becomes much easier, and you can drive to a familiar location with little conscious thought or mental effort.
We all have the illusion of doing more rational, analytical system 2 thinking than we ever do in reality, because it’s the kind we’re aware of. It generally takes somewhere in the neighborhood of 400 milliseconds to become aware of a sensation, and because associative system 1 thinking happens faster than this, we are at best only dimly aware that it is taking place at all. The ideas and solutions that the associative memory presents come to us as if revealed, and if asked where they came from we cannot give an accurate account. If we are socialized to conceive of ourselves as rational beings, we will rationalize them with spurious analyses.
So fast and automatic is associative thinking that it is entirely unavoidable; presented with a problem we will almost always conceive an associative response. In the vast majority of cases these associative system 1 responses are acted upon without any further analysis. And quite a large proportion of us never expend the energy and time to subject our associative responses to analytical scrutiny. Most people have the illusion of rational thinking, but almost never engage in it. And even those with a genuine analytical bent nevertheless rely very largely on associative thinking.
Sloman points out in the interview (and presumably in his book) that a substantial portion of our associative memory is filled with things we have heard and absorbed from other people. This is surely very natural, given that we are (as Samuel Bowles and Herbert Gintis have shown in one of my favorite books) above all a cooperative species. Were it not for the operation of a sort of collective group mind, it is unlikely that our ancestors could have survived in a hostile world. And of course it follows from the Bowles-Gintis argument that this tendency will be especially marked in groups involved in conflict, whether commercial, political, or military.
Sloman concludes by saying:
My colleagues and I are studying whether one way to open up discourse is to try to change the nature of conversation from a focus on what people value to one about actual consequences. When you talk about actual consequences, you’re forced into the weeds of what’s actually happening, which is a diversion from our normal focus on our feelings and what’s going on in our heads.
It’s a reasonable idea, but I’m not hopeful. I’ve spent a lot of time and effort on institutional ways to improve deliberation, and I have found that, at least in the kinds of senior decision groups I have mostly dealt with, getting people off their focus on values is very difficult, and getting them to engage the consequences at all realistically is a great deal harder still.
Will – I just noticed your blog site. Your illusion of thinking parallels my thoughts based on Kahneman, Haidt, Shermer, and Pinker. I phrase it as “the fallacy of knowledge”: the idea that, because of the points you address, we’re not as smart as we think we are (your illusion of thinking). The core of my thinking on this is the litany of cognitive biases we are saddled with because of the evolutionary forces on our neural development. Based on my readings, I’ve made a list of a dozen or so cognitive disjunctions – fallacious thinking resulting from our inherent cognitive biases. These disjunctions create a false sense of “certitude” about our beliefs – which I would argue is why we find it so hard to “talk politics” in a rational manner and find solutions to our problems!