Tribe, part 2
We all have a “tendency to search for, interpret, favor, and recall information in a way that confirms [our] preexisting beliefs or hypotheses” (Wikipedia). Thucydides observed, around 400 BCE, that "it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy." Dante's Divine Comedy notes, "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind." Thomas Jefferson said, "The moment a person forms a theory, his imagination sees in every object only the traits which favor that theory." We look for – and we see everywhere – support for our own theories.
A cousin of confirmation bias is the behavioral confirmation effect – also known as the self-fulfilling prophecy. What you think will happen influences how you act, and your action makes it happen. Typically you don’t notice the role your own behavior played in bringing the result about.
I mention all this because I want to raise this question: Why do we have confirmation bias? Why is there a behavioral confirmation effect? Whenever we reason – that is, whenever we advance a claim and seek to support it with evidence – we’re likely to be under the sway of confirmation bias. Why is that?
If the evolutionary function of reasoning – supporting claims with evidence – were to better discern the truth, or to make better decisions, then natural selection would have weeded out confirmation bias and the behavioral confirmation effect a long time ago. In fact, they never would have arisen. This tells us that the evolutionary function of reason is NOT to discern truth or to arrive at better decisions.
So what is the evolutionary function of human reason? Scholars Hugo Mercier and Dan Sperber argue that the function of human reason is to persuade one another. That is, in human evolutionary history, it was typically valuable for a group to agree on its viewpoints and theories. It was typically less valuable that those viewpoints and theories be true, or correspond to reality. As long as the resulting decisions were merely somewhat suboptimal rather than terrible, group agreement was worth more than a marginal improvement in decision-making.
We are deeply social animals, and having a shared view of things helps us like each other and get along. Very often that’s more important than whether the shared view of things is true. If the objective is to produce conclusions supported by the most careful examination of the widest possible range of evidence, then confirmation bias is a significant obstacle. But if the objective is to produce conclusions that the group collectively endorses, then confirmation bias is quite handy: it keeps us focused on the evidence we can point out to each other to reinforce our consensus and bring lagging skeptics on board.
We evolved not only in a context of dependency on others within our tribe, but also in a context of recurring conflict with other tribes. In other words, we really needed to get along with our own people and also really needed to be able to fight against outsiders. Tribal survival depended on being able to defend our stuff (our turf, our food, our males' access to our reproductive-age females), and, when times got tough, survival sometimes depended on being able to conquer a neighboring tribe and take their stuff.
Shared viewpoints would have been doubly useful. First, shared viewpoints functioned to strengthen the bonds within our tribe. Second, shared viewpoints also functioned to facilitate a useful hatred of neighboring tribes who had different viewpoints. We needed to have viewpoints that were OURS – that were a product of tribal conversation – and we also needed those viewpoints to NOT be too closely determined by reality – because then the other tribe would arrive at the same conclusions, and we wouldn't be able to hate them for their corrupt beliefs. Confirmation bias suits this need with amazing efficacy.
* * *
This is part 2 of 3 of "Tribe"
Part 1: Left Alone
Part 3: What To Do About Confirmation Bias