I’ve been fascinated lately by people’s inability to use the simple phrase: “I was wrong.” Being wrong is great! It’s a powerful life tool! Sure, it hurts for a moment or two, like ripping off a band-aid. But after that, it’s liberating. Letting go of an idea that worked for you in childhood, or in your early career, but doesn’t work now, feels great. It’s hard work to hold up the pretense that you are never wrong about things. It’s also a bit crazy. Life throws random events at us all the time. We never have all the facts at our disposal. So we make the best judgment we can at the time, and if we’re lucky and smart, we course-correct along the way.
The inability to acknowledge wrongness, to make course corrections, is one of the key contributors to The Plateau Effect, our book about the science of getting stuck. Something works for a while, so you do it, you succeed, you grow. But then one day, you don’t succeed anymore. Your revenue is flat. Your relationship sours. Your piano playing just isn’t getting any better. One reason: a trick that got you where you are has run its course, and now you need to let it go and learn some new tricks.
We call this “learning,” and it’s fairly uncontroversial. At least, it was until now.
There’s something about television, and now social media, that seems to inhibit this process. The digital era has created the golden age of the beginner’s bubble: people read about something for 10 minutes and think they’ve become experts. And now, it’s really hurting us.
Neil Postman, author of the great book Amusing Ourselves to Death, wrote about this phenomenon decades ago. You never see an opinion-giver on television say, “Well, I don’t know. That’s a good question,” he pointed out. Opinion-givers have to know everything, or appear as if they do. In a related way, leaders of all stripes are never allowed to change their minds, to admit flaws, to concede: “Well, we did make a mess of things in the early days of the pandemic, but here’s how we are fixing that.” Instead, they must insist everything is fine, and always was.
That might be just for show. Or, it might be a function of what’s called the Dunning-Kruger Effect: people overestimate their abilities, and the less competent they are, the more likely they are to do so. Worse yet, genuinely smart people are smart enough to realize they don’t know everything, so they are less confident than faux-smart people. It’s a bad thing for knowledge.
I examined Dunning-Kruger in more detail for PeopleScience.com recently. Here’s an excerpt of that story, which you can read in full at their site.
It can sure feel smart to mention the Dunning-Kruger Effect at a cocktail party. It’s fun to talk about and it often gives the speaker a kick of feeling superior. There’s usually a giggle or two that accompanies mention of the “people are too stupid to know they are stupid” cognitive bias. Be careful, though – ironically, many people overestimate how well they understand it.
Named after social psychologists David Dunning and Justin Kruger, the concept is based on their original study, published in 1999. Their experiments were simple: participants were asked to rate their own skills in areas like grammar, and then they were tested. Poor performers wildly overestimated their abilities. Those who scored in the 12th percentile – at the very bottom of the class – had estimated they’d land in the 62nd percentile, squarely above average. Anyone who’s ever corrected someone’s grammar on a social media post probably knows the feeling.
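To make the size of that gap concrete, here’s a toy sketch in Python. Only the bottom-quartile pair (actual 12th percentile, estimated 62nd) comes from the study as described above; every other number, and the quartile breakdown itself, is invented purely to show the shape of the effect, not to report real data.

```python
# Toy illustration of the miscalibration Dunning and Kruger reported.
# Only the bottom-quartile pair is from the 1999 study described above;
# the other rows are hypothetical numbers invented for illustration.

groups = [
    ("bottom quartile", 12, 62),  # from the study: scored 12th, guessed 62nd
    ("second quartile", 37, 65),  # hypothetical
    ("third quartile", 63, 70),   # hypothetical
    ("top quartile", 88, 75),     # hypothetical: strong performers underestimate
]

for label, actual, estimated in groups:
    gap = estimated - actual
    direction = "over" if gap > 0 else "under"
    print(f"{label:>15}: actual {actual}th percentile, "
          f"self-estimate {estimated}th ({direction}estimates by {abs(gap)} points)")
```

Notice that the gap doesn’t just shrink as skill rises; in the top row it flips sign, which is the mirror-image effect discussed further below.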
The idea seems pretty straightforward: People don’t know what they don’t know. The less they know, the less likely they are to seek out new information that might change their views. But there’s something more subtle about the concept:
“The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club,” Dunning told Vox in a recent interview. “People miss that.”
Beware the notion that Dunning-Kruger is about “stupid people.” The 1999 research project found that people who scored as high as the 80th percentile still overestimated their abilities. While the effect is more dramatic at the bottom of the test scores, four out of five people didn’t know as much as they thought they did. Odds are, you overestimate yourself, too. We all have a little Dunning-Kruger in us.
Since that initial study, the results have been replicated around the world; meanwhile, Dunning and Kruger have honed the concept. The latest research shows that learning a little about a topic can actually make things worse. People often erroneously think they’ve become very knowledgeable about something after spending just a short amount of time researching it. Doctors struggle with this problem daily, seeing patients who have just spent 10 minutes reading about a disease on the Internet.
Complete neophytes seem to be aware of how little they know, but the Dunning-Kruger Effect spikes hard once people start to take on a new subject.
In a 2018 study, Dunning and co-author Carmen Sanchez called this the “beginner’s bubble.” In a concocted experiment that had subjects identify “zombie diseases,” participants “rapidly surged to overconfidence” after a few learning experiences. The authors then replicated their findings in the real world. They found young adults who enter the financial world quickly learn to overestimate their financial literacy skills, before slowly realizing how little they know about money.
“When it comes to overconfident judgment, a little learning does appear to be a dangerous thing,” the authors write. “Although beginners start with humble self-perceptions, with just a little experience, their confidence races ahead of their actual performance.”
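That trajectory (humble start, quick surge, crash, slow recovery) is easy to sketch in code. The formula and every number below are made up purely to illustrate the shape Sanchez and Dunning describe; nothing here is fit to real data.

```python
import math

# Toy sketch of the "beginner's bubble": confidence surges after a little
# experience, collapses as the learner discovers what they don't know, then
# climbs slowly alongside real skill. All values are invented for illustration.

def confidence(experience: float) -> float:
    """Made-up confidence score (0-100) as a function of experience (0-100)."""
    bubble = 25 * experience * math.exp(-experience / 8)  # early surge, then decay
    mastery = 0.9 * experience                            # slow, steady real growth
    return min(100.0, bubble + mastery)

for x in range(0, 101, 10):
    c = confidence(x)
    print(f"experience {x:3d}: {'#' * int(c / 2)} {c:5.1f}")
```

Run it and the printed bars shoot up fast, collapse, and then grind back toward the top: the bubble in miniature.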
That’s a bad combination in the age of the instant expert. It’s also bad news for Internet commenters, many of whom comfortably pontificate about everything from global warming to cancer-causing foods.
There are additional patterns to be aware of when discussing Dunning-Kruger. The effect appears to have a gender dimension: plenty of research shows men are more likely than women to overestimate their abilities. Here’s one example: in a study of college physiology students, the average male student had a 61 percent chance of thinking he was smarter than a classmate, while the average female student had only a 33 percent chance.
Money might be a factor, too. Those from higher social classes tend to be more overconfident than those from lower ones. In a paper titled “The Social Advantage of Miscalibrated Individuals,” the authors found that business owners and job applicants who came from means were more likely to be overconfident.
“Overconfidence, in turn, made them appear more competent and more likely to attain social rank,” the authors note, lamenting that this characteristic of human nature can help perpetuate socioeconomic inequality.
But perhaps the least well-understood part of the “stupid people don’t know they are stupid” bias is that it includes an equally pernicious mirror image: smart people don’t know they are smart. Back to Dunning and Kruger’s initial work: they found that the top 20 percent of performers underestimated how good they were. Put another way, experts tend to assume the overconfident among them are their equals, or even smarter. That really opens the door for overconfident idiots.
So how to deal with Dunning-Kruger in the office? First, it’s best not to call it the “stupid” bias.
One thing Kruger suggests is to be comfortable with saying, “I don’t know.” In study after study, he says, it’s remarkable how few subjects opt for “I don’t know” when given the choice. Neil Postman also makes this point in the book “Amusing Ourselves to Death” – how often do you see an expert on television give “I don’t know” as an answer?
We all have blind spots. The more open we are to acknowledging them, the less likely we are to fall into overconfidence.
On the other hand, it’s important to recognize that there may very well be smart people in your office who are shouted down by the overconfident. Actively solicit their opinions. Don’t assume it’s a bad sign that an employee sounds hesitant or cautious when making a suggestion; it might mean they are a “superforecaster,” someone who wisely treats their own opinions as potentially wrong.
But also, most critically, be humble. Accept that, statistically, you are probably wrong several times each day. Test your choices. Examine data after the fact, and leave the door open for course correction. Don’t fall into the trap of thinking you can become an instant expert on a topic, that you can read a brief or a book and command a subject in a day or two.
The real lesson from Dunning-Kruger is that the learning curve doesn’t follow anything like the shape we imagine – a slow, steady climb up a staircase. Instead, it’s more like scrambling quickly up a false summit, falling off a cliff, and then slowly climbing the real mountain.
Good post, as usual. Another aspect pushing folks to not say “I don’t know” is the Internet: we sort of think everything is knowable. Who hasn’t wasted an hour searching for an article/story/person/whatever that we KNOW we dimly remember, only to finally give up? And when we can’t find it, we get angry: “C’mon, if I could just figure out the right search terms…”
So if everything is knowable, obviously saying “I don’t know” means you’re just lazy/stupid. That’s logical. Except the axiom is flawed, of course!