Thursday, January 05, 2012

political views are hard to change

As usual, I read journal articles in groups.  Here is a similar paper (ungated) from Dartmouth that looked at why our political views resist evidence to the contrary.  Once we get an idea in our heads, we don’t like to change our minds.  This research, and the earlier work he reviews, finds that the effect is even stronger for opinions that are strongly held, important to our self-image, or important to our world view.

His study looks at issues including the Iraq surge, Obama’s jobs plan, and global warming.  If you were against the war, you were less likely to give the surge any credit (and if you were for the war, you gave less credit to evidence of other explanations).  If you didn’t like Obama, you wouldn’t credit his plan for creating jobs (and if you liked Obama, you wouldn’t credit alternative explanations).  If you don’t want there to be global warming, you don’t credit evidence for it (and if you believe in it, you don’t credit contradicting evidence).  He cites previous work that found similar effects with abortion, the death penalty, and other issues.

There are many reasons for this.  We are more likely to seek out evidence that supports our opinion (conservatives watch Fox, liberals watch MSNBC).  We also counterargue information we disagree with (how many times have you yelled at a talking head on TV?), but gladly accept information we agree with.

This study looked at two possible interventions to even the score.  The first dealt with the threat to our self-image if we turn out to be wrong.  He had participants engage in a task that made them focus on their own good qualities.  In that state, they were less biased against a totally unrelated political view.

The second intervention was intended to prevent counterarguing.  They looked at graphic, visual ways of presenting evidence, where the basic meaning jumps out at you from the design but there are no specific facts to argue against.  In this condition, people were more receptive to, and more likely to believe, information that contradicted their political opinions, even the important ones.

So next time you are in a political argument, start out by giving your opponent a self-affirmation.  “Look, you are one of the smartest people I know, so I am sure you will understand these new scientific findings.”

Then, don’t hedge what you say with an “on one hand . . . on the other hand” construction.  The person will just listen to the hand they agree with.  Stick with the best, hardest, least questionable evidence that you have.

Unrealistic optimism

There was a fascinating paper in a recent issue of Nature Neuroscience that investigates why unrealistic optimism is so pervasive.  We all think our team can win, that we are above-average drivers, that this lottery ticket will be the one.  The same is true of important judgments like whether we will pass the test, get the job, or have enough money to retire.  Over the past few decades, many possible reasons for this optimism have been suggested:

  • Perhaps because we like good news, we pay more attention to it. 
  • Perhaps because we like good news, we think about it more and therefore process it more deeply.
  • Perhaps good things happen more often, so we are more familiar with them. 
  • Perhaps good news is more exciting, so it increases brain activity during memory storage.

As you can probably guess from the name of the journal, these researchers hooked their participants up to fMRI machines to scan their brains and find out what causes the optimism.  They statistically controlled for all of the other possible explanations so they could be sure.  And here is what they found.

Positive information is processed in several parts of the brain, primarily the frontal cortex and the left inferior prefrontal gyrus.  Negative information activates only the left inferior prefrontal gyrus.  So when you see positive information, your mental model gets stronger based on the activation of several brain areas, but when you see an equal amount of negative evidence, your model changes based on just one area, so it changes less.  It’s like that old Little Rascals episode where Alfalfa divides up the candy with Darla.  One for you, one for me.  Two for you, one-two for me.  Three for you, one-two-three for me . . .
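You can see how this asymmetry produces optimism with a little toy simulation (my own illustration, not the paper's actual model or data): if your belief moves more for good news than for bad news, perfectly balanced evidence still drifts you toward the rosy side.

```python
# Toy sketch of asymmetric belief updating (an illustration only --
# the rates and numbers here are made up, not from the paper).

def update(estimate, evidence, rate_good=0.8, rate_bad=0.3):
    """Move the estimate toward the evidence, but at different speeds
    depending on whether the news is good or bad."""
    error = evidence - estimate
    # "Good news" here means evidence that our risk is lower than we thought.
    rate = rate_good if evidence < estimate else rate_bad
    return estimate + rate * error

belief = 0.5  # estimated chance of a bad outcome
# Perfectly balanced stream of good news (0.4) and bad news (0.6):
for evidence in [0.4, 0.6, 0.4, 0.6, 0.4, 0.6]:
    belief = update(belief, evidence)

# The evidence averaged out to 0.5, but because good news moved the
# belief more than bad news did, the final belief sits below 0.5.
print(round(belief, 3))
```

The mechanism doesn't require any motivated reasoning at all; a simple difference in how strongly the two kinds of news are processed is enough to tilt the final belief.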

This is dangerous because optimism decreases the chance that we will take steps to protect ourselves.  We don’t always wear our seatbelts, we don’t save enough for a rainy day, we don’t go to the doctor because “it’s probably nothing.”  Optimistic people have less stress and live longer, but only as long as they don’t kill themselves first.  So it is important to understand what to do about this.

When you get information that supports your pre-existing ideas, take it with a grain of salt.  And if you get contradictory information, try to take it more seriously.  This is the opposite of what comes naturally, but could make your life much better.