If you read my recent entry on the Dunning-Kruger Effect, you’ve probably figured out that I enjoy a story that takes (some) people down a peg or two. Maybe it was all those years covering tech companies in Silicon Valley and Seattle, where fortunes are made on vaporware, owing mainly to the bravado of founders and investors.
“Illusory superiority” fits in this genre, too. Arrogance is the cause of many downfalls, and folks sure do enjoy a good comeuppance. But confidence is a complicated topic; a bit of arrogance can be a good thing. And it’s not just the bravado dividend; sometimes it’s good to try things when the odds are against you.
But illusory superiority, Dunning-Kruger, and survivor bias (“I survived a plane crash; you can too!”) are playing out with dangerous implications during the Covid-19 pandemic. No, you are not better than the virus. No, you don’t know more than Dr. Anthony Fauci. No, you probably aren’t a better epidemiologist than people who’ve studied epidemiology for decades. And, I’m sorry to say, there’s only about a 50-50 chance that you are better than average looking.
Here’s part of my piece for PeopleScience.com; please read the rest at their site.
Self-confidence is a tricky thing. We need some to get through life, but if we have too much, we can make some pretty big mistakes. Just ask anyone who’s flexed beer muscles against a much stronger adversary. Being self-aware about your skills and picking the right battles are important keys to success.

Turns out humans aren’t very good at this. Plenty of people overestimate their abilities, often by wide margins. Simple numbers tell the story:
- Tests have shown that about 90% of drivers declare themselves above average
- 87% of MBA students estimated they were doing better than most of their peers, and 94% of professors rate themselves above average, too
- Some 70% of high school seniors say they have above-average leadership skills, but only 2% say below average
- People think they are less likely to get the flu than others, so they don’t get a flu shot
- Stockbrokers are convinced they make better bets than their peers, which is demonstrably false, and
- If you’ve ever been to a dating site, you know most people say they are above-average looking, when mathematics would suggest otherwise.
Researchers have plenty of names for this phenomenon: “Above average effect” suits pretty well, but it’s also known as “illusory superiority” or the superiority bias. Or simply, the “Lake Wobegon Effect,” recalling Garrison Keillor’s town where all the “women are strong, the men good-looking and the children above average.”
Setting aside mathematical anomalies, most people cannot be above average, so what’s going on here? Some are lying to themselves, or at least living a life of blind spots, delightfully unaware of what others think of them. They don’t notice that family and friends avoid being passengers in their car, for example.
Illusory superiority is not entirely a bad thing. A certain amount of overconfidence probably helped early humans take on larger beasts and win themselves a good dinner. In modern life, thinking you are better than others can be good for mental health. The opposite is certainly true. “Depressive realism,” in which people have an all-too-real appreciation of their limitations, is a contributing factor to depression.
But naturally, overconfidence can cause a whole lot of problems. It can lead a boss to ignore the advice of employees and take the group in the wrong direction. It can lead a company to ignore data and double down on a failing product. (“This internet thing is a fad, I know it!”)
Even worse, additional research has shown that many of the people who are overconfident are precisely those who can least afford to be. The reverse is true, too. This phenomenon is called the “Dunning-Kruger Effect,” after the researchers who identified it in a paper with the brutal title “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” The researchers gave grammar tests to college students and found two equally depressing results.
First: The worst performers thought they were among the best. That result led David Dunning to write this rather damning phrase: “If you’re incompetent, you can’t know you’re incompetent … The skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.”
The second, and perhaps even more troublesome result: grammar nerds who took the test and finished near the top dramatically underestimated their results. This is sometimes called the “worse-than-average effect.” This group knows enough to doubt its own skills, and that doubt makes them hesitant.
That humility might sound charming, but it’s bad for business.
Imagine your last brainstorming session at work. If Dunning-Kruger is correct, it suggests the people who had the most to say – and were most confident saying it – knew the least. Meanwhile, the smartest folks in the room remained quiet. Susan Cain, in her book about introverts called “Quiet,” explores this phenomenon and implores managers to encourage contributions from the non-Type A types in every room.
Why is illusory superiority so persistent? As we’ve noted, it has evolutionary benefits. Dunning offers other theories, too. Honest feedback is rare, he says, which keeps people blissfully ignorant.
People’s thoughts are also colored – scarred? – by life experiences, which can clutter their brains with misinformation that blocks the truth. Trying something once, and failing, might be valuable experience. Or it might just be a small sample size.
“An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors and hunches that regrettably have the look and feel of useful and accurate knowledge,” Dunning wrote for PacificStandard.com.
So if you want to get around confidence bias and blind spots, honest feedback is a good start. Other suggestions include use of the Socratic Method – during explanations, begin with misconceptions and disarm them one by one. Plenty of behavioral scientists advocate appointing a “devil’s advocate” during discussions, to make sure that alternate points of view are raised. Hackers call this “penetration testing” – hiring good guys to “hack” your system – but the concept works just as well in any setting. And of course, test, test, test. No one’s opinion, or experience, should ever carry the day without data.
The real trick with illusory superiority is, of course, that the people who need to learn about it are the least likely to be reading an article like this one, and the reverse is true. If this piece made you worried about your blind spots, odds are pretty good you need more confidence, not less. Use that newfound boldness to forward this story to someone who really needs it.