2023-02-13

Know Thyself

Know Thyself. It’s an Ancient Greek aphorism. Socrates makes reference to it in one of Plato’s dialogues. And, before we’re done today, we will find a Valentine’s Day message in this maxim.

Supposedly, “know thyself” was the first of three maxims inscribed at the Temple of Apollo at Delphi – along with “nothing to excess” and “certainty brings ruin.” So: know yourself, but not excessively, and, anyway, don’t be so sure.

It appears that the ancient Greeks used “know thyself” primarily in two ways. First, people whose boasts exceeded reality were cautioned to know themselves – because, if they did, they wouldn’t make such boasts. Second, saying “know thyself” was a way to suggest “be true to thyself” – like “you do you” – pay no attention to the opinion of the multitude. So: you’re not as good as you think you are – but you’re better than your critics say.

Know thyself, the Greeks said – recognizing that we aren’t always as self-aware as we could be – or, aren’t as self-aware as others think we could be – or, others tend not to be as self-aware as we think they could be – or something. In any case, knowing thyself is no easy thing. Thyself has been built to systematically fool itself in a number of ways.

The 86 billion neurons of your brain are firing away across hundreds of trillions of synapses – firing anywhere from once a second up to, at peak excitement, about 200 times a second. Some of those neurons, we don’t know what they’re there for – but all of them are firing away, doing something in there. Even where scientists do have a pretty good idea which neurons do what, that, of course, doesn’t mean that I know what mine are doing at any particular time.

Among the things I wouldn’t detect are the biases those neurons have – yet we know from myriad studies that a long list of cognitive biases bedevil human brains. For instance, when we explain other people’s behavior we overemphasize their personality and underemphasize situational factors – yet when we explain our own behavior we do the opposite, overemphasizing situational factors and underemphasizing our personality. That’s called actor-observer bias.

The odds of a coin toss coming up heads are 50-50. But if it’s landed on heads five times in a row, we think that it’s more likely to land tails the sixth time. Nope. It’s still 50-50. That’s the gambler’s fallacy.
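Don’t take my word for it. Here’s a minimal Python sketch (my own illustration; the million-toss count is arbitrary, and a fair coin is assumed) that records the toss immediately following every run of five or more heads:

    import random

    random.seed(1)  # fixed seed so the run is reproducible

    next_flips = []  # outcomes observed right after five-plus heads in a row
    streak = 0       # current run of consecutive heads

    for _ in range(1_000_000):
        heads = random.random() < 0.5
        if streak >= 5:               # the previous five flips were all heads
            next_flips.append(heads)
        streak = streak + 1 if heads else 0

    print(f"P(heads after 5 heads) = {sum(next_flips) / len(next_flips):.3f}")
    # Prints roughly 0.500: the coin keeps no memory of its streak.

The proportion comes out right at about 50 percent, streak or no streak.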

We consistently underestimate the costs and duration of basically every project we undertake. That’s optimism bias – sometimes called the planning fallacy.

We are likely to think traveling by plane is more dangerous than traveling by car because images of plane crashes are more vivid and dramatic in our memory and imagination, and hence more available to our consciousness. That’s called availability bias.

When making decisions, we over-rely on the first piece of information offered, particularly if that information is numeric. This was illustrated in one study in which participants observed a roulette wheel, and then were asked an unrelated question. Half the participants saw the roulette wheel land on 10. The other half saw the roulette wheel land on 65. All were then asked what percentage of United Nations countries is African. “The participants who saw the wheel stop on 10 guessed 25 percent, on average. The ones who saw the wheel stop on 65 guessed 45 percent.” (At the time of the experiment, the correct answer was 28 percent.) (Ben Yagoda, “Your Lying Mind,” The Atlantic, September 2018.) This tendency to be pulled toward whatever number has most immediately entered our consciousness is called the anchoring effect.

Then there’s sunk-cost thinking, which tells us to stick with a bad investment because of the money we have already lost on it. Nations will continue to pour money and lives into unwinnable wars – and will have widespread popular support to do so – because, people think, “we’ve already invested so much.” The thought that pulling out would mean that the early casualties died for nothing is so powerful that we send more and more lives to die for virtually nothing. That’s the sunk-cost fallacy.

What you already have is more valuable to you than what you could get. We will put more energy and thought into avoiding the loss of $100 than into gaining $100 – and if we do lose $100 it bothers us a lot, whereas gaining a windfall of $100 feels only a little good. In one study, participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. But when participants weren’t given the mug, only shown it, and asked how much they would pay to buy it, the average answer was $2.21. Once the mug is yours – even if it’s been yours for less than 60 seconds – it’s more than twice as valuable to you. That’s loss aversion – and its close cousin, the endowment effect.

Confirmation bias: that’s a big one. We look for evidence that confirms what we already think, and we discount or ignore disconfirming evidence. We interpret ambiguous results as supporting what we already believe. Consequences of confirmation bias can be enormous. The 2005 report to the president on the lead-up to the Iraq War said:
“When confronted with evidence that indicated Iraq did not have [weapons of mass destruction], analysts tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it.” (Yagoda)
Why do our brains do this? Because using the available information to solve an abstractly presented problem is never the only thing your brain is working on. Your brain is always also working on relationships.

Whatever else your brain is working on, it is always also processing relationships: monitoring relationships, assessing relationships, considering how to build relationships. It's got an eye on your relationship with A, your relationship with B, and is also watching the relationship between A and B.

If you have an opinion about abortion rights, or about gun ownership rights – and who doesn’t? – that’s partly about assessing fetal value versus valuing women’s equality and autonomy, or about assessing the safety afforded by a gun versus the danger of having a gun around. But those aren’t the only factors your brain is crunching on. It’s also working on relationships – and, arguably, in the case of these two examples, relationships are the main thing. Who are your people? Who are the opponents of your people? You need to be in solidarity with your friends and associates, so you adopt the opinion and the reasoning of your friends and associates.

The function of human reason is to persuade one another – that is, to bring people into agreement. The function of human reason is not (or not only, and not even primarily) to use available information to solve abstractly presented problems. In human evolutionary history, it is typically valuable for a group to agree on its viewpoints and theories. It is typically less valuable that those viewpoints and theories be true, or correspond to reality, or be usefully applicable for addressing the widest variety of isolated problems. As long as the resulting decisions are not terrible but only a little less than optimal, group agreement is more valuable than a marginal improvement in decision-making. Having a shared view of things helps us like each other and get along. Very often that’s more important than whether the shared view of things is true.

So our brains are oriented to produce conclusions that the group collectively endorses. Once we understand that, then we see that confirmation bias is quite handy: it keeps us focused on the evidence we can point out to each other to reinforce our consensus and bring lagging skeptics on board.

Whatever particular problem it might be thinking about, your brain is always also thinking about relationships. Why do we value the mug more if we already have it? Duh! It was a gift. Gifts are tokens of and reinforcers of relationship. Even if it was given by some experimenter you only just met, now she’s given you a gift. Aww. Isn’t that sweet? Of course you’re gonna value that more than you would some mug presented to you for purchase.

Every one of those cognitive biases is there because it was, in some way, functional for our ancestors’ survival. Your brain always has at least one eye on your relationships. So the source of our cognitive biases is also the great superpower of being human. At some point in about the last million years, our ancestors developed shared intentionality – that is, the ability to share mental representations of a task so that multiple people can work on it. Take something as seemingly simple as one person pulling down a branch for the other to pluck the fruit, and then both of them share the meal. Chimps never do this. We are profound collaborators, connecting our brains together to solve problems that single brains can’t. We distribute the cognitive tasks. No individual knows everything it takes to build a cathedral, or an aircraft. Our species’ success comes not from individual rationality but from our unparalleled ability to think in groups. Our great glory is how well we rely on each other’s expertise.

Our strong bias toward relationship means that we run into problems sometimes when addressing isolated questions abstracted from relational context. Knowing thyself means knowing this fact about how ineluctably relational you are.

* * *

We know from a number of studies that a main job of the cerebral cortex is to create an illusion of intentionality – that is, it’s not so much deciding what to do as noticing what you did and making up a story about it.

Here’s one such study. Participants are asked to look at two pictures of very different looking people, and choose which one they think is more attractive. The experimenter then turns the two photos face down, and pushes the selected photo toward the participant. The experimenter says, “Explain, please, why you found that person to be more attractive. You can turn the picture over.”

So the participant picks up the photo, turns it over, and begins telling some sort of made-up rationale for why the person in this picture is more attractive. How do we know it’s a made-up rationale? Here’s the kicker. The experimenter is, in fact, a professional magician, hired for just this purpose. With deft sleight of hand, he switches the pictures half the time. So half the participants pick up the picture they really did select as the more attractive, and half pick up the one they thought was less attractive, believing it to be the one they’d selected as more attractive.

You’d think, surely, they would notice. They’d be like, “wait, that’s not the one I picked.” And, yeah, that’s what did happen – 27 percent of the time. Only 27 percent of subjects noticed that the experimenter had slipped them the wrong photograph – despite the fact that the two photographs were very dissimilar, and participants had had unlimited time to choose which one they thought was more attractive. The other 73 percent of participants, when asked to explain their choice, explained it in ways that were no different from
“the reports given by those who were explaining the photo they’d actually chosen as being more attractive....People who were shown the card they had not chosen nevertheless told a completely compelling story explaining why they had chosen that photograph. They failed to notice the card switch and so they devised a perfectly good explanation for a choice they had not actually made.” (Nicholas Epley, Mindwise, p. 32)
This is because your cerebral cortex is good at its job – and its job is to notice what you’ve done and to fabricate a story about why you’ve done it. The story purports to explain how you made your decision, but in fact the decision has already been made unconsciously by the time the cerebral cortex invents its story. It’s so good at explaining your decisions that it seamlessly does so even when the decision it’s explaining wasn’t one you made.

So how can we possibly know ourselves? Well, for one thing, if you know that your motives are opaque to you, that’s already knowing something very important about yourself – something you wouldn’t know if you went around naively believing what you think.

For another thing, we know ourselves about the same way we know other people. We read our own minds with the same inferential habits we use to read other minds. We aren’t mind readers in the sense of “telepathy, clairvoyance, or any kind of extrasensory power that creates a psychic connection” (Epley, p. xi). But dozens of times a day we “infer what others are thinking, feeling, wanting, or intending.” This is the basic human mind-reading that gives us the power
“to build and maintain the intimate relationships that make life worth living, to maintain a desired reputation in the eyes of others, to work effectively in teams....[It] forms the foundation of all social interaction, creating the web of presumptions and assumptions that enables large societies to function effectively.” (Epley, p. xi)
So: back to that participant smoothly explaining why she chose the photo she actually didn’t choose. Rather than saying she is unaccountably self-deluded, let’s see what she’s doing in a different light. What she’s doing, with remarkable skill, is imagining why someone who did pick that picture would do so. She’s simply applying to her own mind the mind-reading skills she uses to make sense of other minds. As psychologist Nicholas Epley points out:
“The only difference in the way we make sense of our own minds versus other people’s minds is that we know we’re guessing about the minds of others.” (p. 32)
In fact, we are just as much guessing about our own mind – but, with our own mind, we have the illusion of special, privileged access to the causes and processes that guide our thoughts and behavior.

We learned from George Herbert Mead, writing 100 years ago, that the self is the generalized other. We form our conception of who we are as a generalization of the people around us. We get a sense of what makes them tick – how to infer from their behavior what they believe, desire, and intend. Then we apply this to ourselves: inferring from what we see ourselves doing and saying what we ourselves must be believing, desiring, and intending. As Epley writes:
“If you see someone smiling at a cartoon, you will assume that they find it funny. If you find yourself smiling at a cartoon, even if you are smiling only because you’ve been asked to hold a pen in the corners of your mouth so that it makes a smile, then you are likely to report finding the cartoon funny as well. If you see someone hunched over, you will assume that they are not feeling very proud. Find yourself hunching over in the same way, even if only because you’re filling out a survey on a table with very short legs, and you may report being less proud of yourself and your accomplishments, too.” (p. 32)
There are two important spiritual messages here. One is that we ARE other people. We are made of relationships. The way we think – or, more precisely, the way we construct our impression of what we think we’re thinking – was built out of interactions, starting in infancy, with other people. Yes, each of us is unique. That uniqueness lies in the distinctiveness of the significant people in your childhood, combined with the somewhat quirky way you generalized from them to form your self.

Each other is our belonging. Each other is our place. Each other is where we live and breathe and have our being.

The second spiritual lesson is humility. Now that you know that the story of your decisions is just as much guesswork as the story you might have of someone else’s decisions – that your impression of having special, privileged access into yourself is an illusion – you can be a bit more humble about the accuracy of that story.

And this is important. The illusion that we know our own minds more deeply than we actually do has a disturbing consequence: it can make your mind appear superior to the minds of others.
“If the illusions you hold about your own brain lead you to believe that you see the world as it actually is and you find that others see the world differently, then they must be the ones who are biased, distorted, uninformed, ignorant, unreasonable, or evil. Having these kinds of thoughts about the minds of others is what escalates differences of opinion into differences worth fighting (and sometimes dying) for.” (Epley, p. 33)
Know thyself – but remember also those other two maxims at Delphi: “nothing to excess” and “certainty brings ruin.” Don’t know thyself so excessively that you regard the interior of your thought-world as the standard of truth. And know thyself – but don’t be so sure. Know that thou, too, art built with unavoidable biases and illusions.

The good news is that those biases are there because we are built for relationship. We are built to love. We aren’t built to be right.

Here, then, is the Valentine’s Day message I promised at the beginning:
Love. And give up on being right.
Blessed be. Amen.
