There’s a close play at the plate. Under a cloud of dust, the runner’s hand appears to sneak in just before the catcher applies the tag, but in real time, it’s hard to say. The umpire signals safe. There’s an instant replay challenge. Video is shown on the board at Chicago’s Wrigley Field, and 41,000 Cubs fans cheer lustily at what they think is conclusive evidence that the call on the field was correct.
But at home, watching on television, a million St. Louis Cardinals fans are certain the video shows the runner was out, and that the call will be overturned. Cubs broadcasters wonder aloud why umpires are even looking at the call, the play is so obvious. Cardinals broadcasters wonder why the reversal is taking so long.
Same play. Same video. Same evidence. Same million-dollar, freeze-frame technology. Two very different conclusions. “My side is right. I’m sure of it,” people on both sides say. How can individuals looking at the exact same set of facts come to such diametrically opposed conclusions? And, perhaps more curious, why are these conclusions so predictable?
This kind of scene plays out at nearly every serious sporting event, sometimes several times each game. Something very similar happens after every car accident (“He cut me off!”), in many family “discussions,” and in business meetings everywhere.
“All lies and jest, still a man hears what he wants to hear and disregards the rest,” Simon and Garfunkel sang in their hit song, “The Boxer.”
This phenomenon is described in many ways: my-side bias; having an internal “yes man;” cherry picking; selective memory. Humans have a very bad habit of seeing what they want to see and being blind to what they don’t. This inconvenient trait is known by social scientists as confirmation bias, and it’s one of the most important and most studied cognitive biases.
It’s also among the hardest to overcome.
Tests about the phenomenon usually go something like this: subjects are shown research about the death penalty. They accept at face value evidence that confirms their point of view, while they subject “disconfirming” evidence to intense scrutiny. Despite what the evidence suggests, subjects on both sides end up feeling even more strongly about their original point of view after the experiment.
This might seem obvious based on today’s political situation, but Stanford University researchers who ran one such study concluded that sharing information with people might not be a good way to settle political debates.
“The result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be, not a narrowing of disagreement, but rather an increase in polarization,” they wrote.
Confirmation bias is one reason it’s so hard to change people’s minds about almost anything.
Confirmation bias plays out at work all the time. Many bosses surround themselves with people who agree with them (yes men), because it makes them feel good. Many underlings go along with the boss, believing that’s the way to get ahead. Those with contradictory opinions are labeled as difficult, or naysayers. They are ignored when they speak, interrupted, or not given prominent platforms.
Co-workers can be victims, too. If you’re told that Joe is a lazy millennial, you’ll notice the occasional day when he is late for work, and ignore all the days he arrives early. If you are told Mark is an airhead, you’ll note that he laughs a bit too loud at meetings, and won’t bother to read the excellent reports he writes. If you are in competition with Joe or Mark – say you are fighting for the same promotion, or the boss’ attention – you have all the more incentive to see the world this way.
“It is difficult to get a man to understand something, when his salary depends upon his not understanding it!” Upton Sinclair famously wrote of his run for governor of California.
Confirmation bias is also vexing because it protects us from internal conflict. Being wrong is hard; it can shake your world view. If you were sure that BlackBerry made the best mobile phone, and spent a bunch of company resources outfitting your employees with BlackBerrys, you were naturally slow to shift toward other smartphone platforms. After all, doing so meant admitting a mistake. Worse yet, it meant enduring cognitive dissonance – that uneasy feeling which arises when you realize your perceptions don’t match reality. It often seems easier to rationalize your internal world view, and mentally bend reality to fit your perceptions, than to tear down and rebuild your view of the world.
Our dastardly tendency to draw conclusions based on scant evidence and connect dots that don’t connect also contributes to confirmation bias. In the 1950s, psychiatrist Klaus Conrad coined the term “apophenia” to describe the human tendency to see patterns in disparate events – to take two instances of lateness and paint an entire picture about someone’s work habits. He compared this “abnormal meaningfulness” to hallucinations.
That’s taking confirmation bias a bit too far, but such “abnormal meaningfulness” is often the root of a lot of old-fashioned stereotypes. If you believe older workers can’t keep up with technology, a single example of a 50-something struggling with a new app could be all you’ll need to perpetuate that mental myth, and you’ll conveniently ignore hundreds of contradictory examples.
How can you avoid, or at least minimize, your own confirmation bias? Wall Street traders face this issue constantly, whenever they invest in a clunker. When do they admit defeat, reverse course, concede they’ve made a bad investment, and sell? Usually later than they should – a mistake that can cost millions.
Billionaire investor Warren Buffett employs an interesting strategy that he credits to Charles Darwin.
“Darwin used to say that whenever he ran into something that contradicted a conclusion he cherished, he was obliged to write the new finding down within 30 minutes,” Buffett wrote in an essay more than a decade ago. “Otherwise his mind would work to reject the discordant information, much as the body rejects transplants. Man’s natural inclination is to cling to his beliefs, particularly if they are reinforced by recent experience – a flaw in our makeup that bears on what happens during secular bull markets and extended periods of stagnation.”
Another popular method is to appoint a designated devil’s advocate for each decision – someone who pokes holes and points out pitfalls in every strategy.
Disinterested parties can be helpful, too. Way back in 1924, psychologist L.L. Thurstone wrote in The Nature of Intelligence, “If we have nothing personally at stake in a dispute between people who are strangers to us, we are remarkably intelligent about weighing the evidence.”
At least, that’s how it’s supposed to work. Forty-one thousand Cubs fans would colorfully disagree.