Do people believe what they want to believe? It's a bit more complicated than that, I argued in last week's Southern Star. The column is reproduced below.
Sometimes, you might fall out with someone close to you and think: how can they not see they’re wrong? How can they say black is white? How can they not see what’s right in front of their eyes?
It’s often said people believe what they want to believe, but that’s not the whole story. It’s more accurate to say we are prone to ‘motivated reasoning’, to borrow a term used by Cornell psychologist and researcher Prof Thomas Gilovich. ‘People don’t simply believe what they want to believe’, writes Gilovich. ‘The psychological mechanisms that produce motivated beliefs are much more complicated than that’.
That said, it’s not particularly complicated. Gilovich’s argument can be boiled down to this: when we want to believe something, we ask, ‘can I believe it?’ When we don’t want to believe something, we ask, ‘must I believe it?’
Essentially, we are inclined to see what we expect to see and to conclude what we expect to conclude. Information that confirms our beliefs tends to be accepted at face value, while information that contradicts them is more likely to be closely examined and discounted. In the former cases, we are satisfied by what we have learned and stop looking for more information; in the latter cases, we ‘dig deeper’, hoping to find ‘more comforting information’, or to uncover reasons that suggest the original information was flawed or incomplete.
'MUST I BELIEVE IT?'
Here’s a simple example that illustrates this “Can I?/Must I?” distinction. In an experiment, participants were told they would be tested for an enzyme deficiency that could lead to pancreatic problems in later life. The test involved spitting a small amount of saliva into a cup and then dipping a piece of litmus paper into it. Half the participants were led to believe they had the enzyme deficiency if the paper changed colour; the other half were told they had the deficiency if the paper didn’t change colour (in reality, the test had no validity and the litmus paper remained unchanged in all cases).
People who thought they had got good news were quick to accept the verdict and didn’t keep the paper in the cup very long. In contrast, those who thought they had got bad news kept the paper in the cup for much longer, trying out many different testing behaviours – placing the litmus paper directly on their tongue, redipping it in the saliva (up to 12 times), shaking the paper, blowing on it, and so on. In other words, those given the “bad” news adopted an attitude of “must I believe this?”, desperately searching for evidence that the test result might have been false.
Various other experiments and studies confirm this point. When people hear something they don’t want to hear, they don’t simply ignore or deny it. However, they do look much more closely for any extra information that might ease their discomfort. Most of the time, they will find it; as Gilovich points out, ‘even the most comprehensive web of evidence will have a few holes’. Thus, people ‘end up believing what they want to believe, not through mindless wishful thinking but rather through genuine reasoning processes that seem sound to the person doing it’.
This “Can I?/Must I?” distinction is at the root of so much interpersonal conflict. Firstly, remembering it can help you understand why good, intelligent people sometimes do bad, dumb things.
Secondly, it should be a reminder to be sceptical about accounts you hear of others’ behaviour. As Gilovich notes in his book How We Know What Isn't So, we should ask ourselves ‘where the information originated, and how much distortion – deliberate or otherwise – is likely to have been introduced along the way’.
Thirdly, remember that none of us is perfect in this regard. We are, as Gilovich puts it, ‘inclined to acquire and retain beliefs that make us feel good’. People often criticise others because doing so makes them feel more moral, more caring, more competent. However, our negative assessments may be as skewed as anyone else’s. Maybe your foes aren’t as bad as you think; maybe you should give them another chance.