The person who’s best at lying to you is you


In 2008, the psychologist Stephen Greenspan published Annals of Gullibility, a summary of his decades of research into how to avoid being gullible. Two days later, he discovered that his financial advisor, Bernie Madoff, was a fraud whose scheme had cost Greenspan a third of his retirement savings.

This anecdote comes from a presentation by University of Michigan social psychologist David Dunning, due to be delivered at the 20th Sydney Symposium of Social Psychology in Visegrád, Hungary, in July. It highlights an unfortunate but inescapable truth: we are always most gullible to ourselves. As Dunning explains it, Greenspan, despite being the expert on gullibility, fell prey to Madoff's fraud not simply because Madoff was some master manipulator, but because Greenspan had, essentially, tricked himself.

“To fall prey to another person you have to fall prey to your belief that you’re a good judge of character, that you know the situation, that you’re on solid ground as opposed to shifty ground,” says Dunning. Greenspan, Dunning notes, failed to take his own advice and exercise appropriate caution before trusting someone in a field he knew little about. Though he wrote the book on not being overly confident in your own judgments, Greenspan handed over his savings without properly interrogating either Madoff’s confidence in himself or his own sense of confidence in Madoff. Had he followed his own counsel, Greenspan would have recognized that he knew little about financial investments and would have done far more research before entrusting his money to Madoff.

Dunning is an expert on the human tendency to be overconfident in our own knowledge and beliefs. In 1999, together with social psychologist Justin Kruger, Dunning identified the eponymous Dunning-Kruger effect: people who are incompetent and lack knowledge in a field tend to massively overestimate their abilities because, quite simply, they don’t know enough to recognize what they don’t know. Hugely unqualified people thus erroneously believe that they’re perfectly qualified. (This effect has an unfortunate tendency to create the worst possible bosses. It’s also the opposite of imposter syndrome, in which qualified people worry that they aren’t qualified.)

In his latest presentation, Dunning highlights the studies that collectively show how we repeatedly and consistently fool ourselves into thinking we know more than we do, and so convince ourselves that our opinion or choice is right—even when there’s absolutely no evidence to support this. There are dozens of studies supporting this hypothesis, showing, for example, that British prisoners rate themselves as more ethical and moral than typical citizens, and that people mistakenly believe they’re better than others at reaching unbiased conclusions.

People tend to be just as confident in their false beliefs as their accurate ones. In one 2013 study, participants were asked a physics question about the trajectory of a ball after it was shot through a curved tube. Those who said the trajectory would be curved (wrong) were just as confident that their answer was correct as those who correctly stated the ball would have a straight trajectory.

A body of research has also established what scientists call “egocentric discounting”: when participants are asked to estimate a particular figure, such as the unemployment rate or a city’s population, and are then shown someone else’s estimate and given the chance to revise their own, they consistently give greater weight to their own view than to others’, even when they’re not remotely knowledgeable in these areas.

Our false confidence in our own beliefs also deters us from asking for advice when appropriate, or from even knowing whom to ask. “To recognize superior expertise would require people to have already a surfeit of expertise themselves,” notes Dunning.

Gullibility to oneself is not a modern phenomenon. But the effects are exacerbated in the age of social media, when false information spreads rapidly. “We’re living in a world in which we’re awash with information and misinformation,” says Dunning. “We live in a post-truth world.”

The issue is that the current environment convinces people they’re more informed than they actually are. It might, says Dunning, actually be better for people to feel uninformed. “When people are uninformed, they know they don’t know the answer,” he says, and so they will be more open to hearing from others with real expertise. If we think we know enough, however, we’ll just “cobble together what seems to us to be the best response possible to someone asking us our opinion, or a policy, or what we think,” says Dunning. And, he adds, “unfortunately we’re programmed to know enough to cobble together an answer.”

There’s no quick fix to this, but there is a key step we could take to avoid being so willfully misinformed. We need not only to evaluate the evidence behind newly presented facts and stories, but to evaluate our own ability to evaluate that evidence.

The same questions we consider when deciding whether to trust another person should apply to ourselves: “Are you too invested in this thought or belief you have? Are you really giving the conclusions you’re reaching due diligence? Are you in over your head?” asks Dunning.

That said, constantly questioning ourselves would be impractical, trapping us in a perpetual state of self-doubt and uncertainty. Most effective, says Dunning, is to focus on situations that are new to us and where the stakes are high. “Normally those two situations go together,” says Dunning. “We only buy so many cars in our lives, we only invest large sums of money every so often, we only get married every so often.”

Of course, as that last example shows, at some point you have to give up being savvy and just trust your own judgment—both in yourself and others. Dunning quotes novelist Graham Greene: “It is impossible to go through life without trust…that would be to be imprisoned in the worst cell of all, oneself.”

We can, though, learn to be a little more careful and wise. Just as we don’t blindly trust every person we meet, there’s no reason to be utterly trusting and gullible toward ourselves.