You know by now that decision making biases are one of the centerpieces of my academic research and my consulting. So it was great to read about a series of studies that changed my mind about something I thought I knew pretty darn well. It was also good to see that I still have an open mind.
Confirmation bias happens whenever we hold an opinion, whether about something as big as whether the death penalty works or as small as which team to pick in tonight's basketball game. When the evidence is mixed, a slew of research shows that the evidence supporting our first impression has a much bigger impact on our final decision than the evidence that contradicts it.
But what I learned from these studies is about how this happens. I always thought it was because we ignored the contradictory evidence and focused on the confirming evidence. But it turns out to be just the opposite.
In a study of football betting, researchers found that when people win a bet, they don't think about it much at all. They just chalk it up to being smart about football. The subjects thought things like "Of course they won. I knew the quarterback would pull them through." And then they moved on.
But when they lose, they think about it in much more detail to explain the error. They think things like "Well, they only lost because of that bad call in the second quarter. And because the receiver dropped that easy pass. Otherwise they would have won and I would have been right." So instead of counting it as a bad bet, they count it as a bet they "should have won." In effect, there are only two possible outcomes of a bet: either you were right, or you were right but had bad luck. Either way, the outcome supports whatever process you used to pick the winner.
The same thing happened in a study of people's views on the death penalty. Researchers had subjects read two articles about the death penalty, one with evidence supporting it and one with evidence against it. Both articles had flaws, but an equal number in each. When people read the article they agreed with, they just skimmed it, didn't notice the flaws, and added a mental check mark confirming that they were right. But when they read the opposing article, they scrutinized it carefully to find flaws, and of course found them. So they discounted that article. In the end, supporters and opponents of the death penalty both came away with stronger versions of their prior opinions after reading the same two articles.
So the basis of confirmation bias is not necessarily that we ignore contradictory evidence. Instead, we work very hard to prove it wrong. If you want to be a convincing person, the best thing to do is keep the details to yourself and give your audience no ammunition to discredit you. The adage that "it's better to be quiet and be thought a fool than to open your mouth and remove all doubt" seems to be supported by the evidence.