As usual, I read journal articles in groups. Here is a similar paper (ungated) from Dartmouth that looked at why our political views resist evidence to the contrary. Once we get an idea in our head, we don’t like to change our minds. This research, and the work he reviews, finds that the resistance is even stronger for opinions that are strongly held, central to our self-image, or central to our worldview.
His study looks at issues including the Iraq surge, Obama’s jobs plan, and global warming. If you were against the war, you were less likely to give any credit to the surge (and if you were for the war, you gave less credit to evidence for other explanations). If you didn’t like Obama, you wouldn’t credit his plan for creating jobs (and if you liked Obama, you wouldn’t credit alternative explanations). If you don’t want there to be global warming, you discount the evidence for it (and if you believe in it, you discount the contradicting evidence). He cites previous work that found similar effects with abortion, the death penalty, and other issues.
There are many reasons for this. We are more likely to focus on evidence that supports our opinion (conservatives watch Fox, liberals watch MSNBC). We also counterargue information we disagree with (how many times have you yelled at a talking head on TV?), but gladly accept information we already agree with.
This study looked at two possible interventions to even the score. The first was to deal with the threat to our self-image if we turn out to be wrong. He had participants complete a task that made them focus on their own good qualities. In that state, they were less biased when evaluating a totally unrelated political view.
The second intervention was intended to prevent counterarguing. They presented evidence in graphic, visual formats whose basic meaning jumps out at you from the design but which offer no specific facts to argue against. In this condition, people were more receptive to, and more likely to believe, information that contradicted their political opinions, even the important ones.
So next time you are in a political argument, start out by giving your opponent a self-affirmation: “Look, you are one of the smartest people I know, so I am sure you will understand these new scientific findings.”
Then, don’t hedge what you say with an “on one hand . . . on the other hand . . .” argument. The person will just listen to the hand they agree with. Stick with the best, hardest, least questionable evidence you have.