Monday, January 16, 2012

Science is never 100%

The new twist in Mass Lt Gov Murray's car crash is very relevant to forensic investigations of all kinds, especially in human factors.

For those of you unfamiliar with the story, here is a basic recap.  He was driving down the highway late at night and crashed into a tree.  He claimed that he was going about the speed limit when the car slipped on some black ice.  He had no problem releasing the black box recorder (like the ones in planes) that was in his government-owned car.  When the data was analyzed, investigators found he had accelerated to 100 mph just before hitting the tree.  They concluded that he must have fallen asleep at the wheel.

From a human factors point of view, this makes sense.  If you are nodding off and don't realize it, then suddenly wake up as the car hits the grass, it would seem like you had slipped on ice.  It is also possible that slumping forward would push your weight harder on the pedal and accelerate the car.  So his story made sense.  No alcohol or drugs were found in his system, and the forensic science fit the story.

The Lt Gov was convinced and admitted the possibility that he could have fallen asleep, although he didn't remember it.  The media was convinced too.  Case closed.

Then today, they reported a new analysis of the black box data suggesting he really did slip on ice.  He was going 75 mph (not the speed limit, as he had claimed) just prior to the acceleration.  He acknowledges that is possible.  So now the case is not closed but confused.

All science makes conclusions based on less than 100% certainty.  The standard can be 95% for many things, or 99.999% if it's life-threatening or mission-critical.  But nothing is ever 100%.  So it is not surprising that this new conclusion could arise from a more detailed analysis of the data.
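
To make that concrete, here is a quick simulation sketch (my own illustration, not part of the original post) of what a 95% standard means in practice: even when a 95% confidence interval is computed correctly, roughly one run in twenty misses the true value.  The true mean, spread, sample size, and trial count are arbitrary choices.

```python
# A toy check of what a "95%" standard means in practice (illustration only;
# the true mean, spread, sample size, and trial count are arbitrary choices).
import random
import statistics

TRUE_MEAN = 50.0
SAMPLE_SIZE = 100
TRIALS = 10_000

misses = 0
for _ in range(TRIALS):
    sample = [random.gauss(TRUE_MEAN, 10.0) for _ in range(SAMPLE_SIZE)]
    mean = statistics.mean(sample)
    sem = statistics.stdev(sample) / SAMPLE_SIZE ** 0.5
    low, high = mean - 1.96 * sem, mean + 1.96 * sem  # approximate 95% interval
    if not (low <= TRUE_MEAN <= high):
        misses += 1

# Roughly 5% of the intervals miss the true value: correct science, less than 100% certain.
print(f"Intervals that missed the true mean: {misses / TRIALS:.1%}")
```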

But people outside of science are led to believe (by the media and our high school science books) that science is always 100% sure of everything.  The media was sure he had fallen asleep at the wheel. The driver was convinced by the black box analysis, even though he didn't remember falling asleep.  The public bought it too.  100%.

In one sense, it is good that the public thinks in absolutes.  When scientists generally accept a theory at the appropriate level of confidence, it is better to design public policy as if it were 100%.  When the public thinks there is uncertainty in the science, even 0.01%, they can engage our remarkable capacity for willful blindness and clutch onto a more comforting conclusion (my religion hasn't been lying all these years; there is no such thing as evolution).  It is actually better that the media usually report science as if it were 100%.  Imagine if we weren't totally convinced about gravity.  Or if we still believed there was a chance the earth was flat.

But when a theory turns out to be wrong, all of science immediately becomes suspect in the public's mind.  Wait - you have been lying to us all these years?  Anti-depressants don't work any better than a placebo?  Mammograms are not beneficial to 40-something women with no family history of breast cancer?  So I guess I don't have to give up my Hummer because global warming could be false too.  Maybe smoking doesn't increase the risk of lung cancer.

Thursday, January 05, 2012

Political views are hard to change


As usual, I read journal articles in groups.  Here is a similar paper (ungated) from Dartmouth that looked at why our political views resist evidence to the contrary.  Once we get an idea in our head, we don’t like to change our minds.  This research, and the work he reviews, finds that this is even stronger with opinions that are strongly held, important to our self-image, or important to our world view.  

His study looks at issues including the Iraq surge, Obama’s job plan, and global warming.   If you were against the war, you were less likely to give any positive credit to the surge (and if you were for the war, you gave less credit to evidence of other explanations).  If you didn’t like Obama, you wouldn’t give positive credit to his plan for creating jobs (and if you liked Obama, you wouldn't give credit to alternative explanations).   If you don’t want there to be global warming, you don’t give credit to evidence for it (and if you believe in it, you don't give credit to contradicting evidence). He cites previous work that found similar effects with abortion, the death penalty, and others.

There are many reasons for this.  We are more likely to focus on evidence that supports our opinion (conservatives watch Fox, liberals watch MSNBC).  We also counterargue against information we don’t agree with (how many times have you yelled against a talking head on TV?), but gladly accept information that we do agree with. 

This study looked at two possible interventions to even out the score.  The first was to deal with the challenge to our self-image if we turn out to be wrong.  He had participants engage in a task that made them focus on their own good qualities.  In that state, they were less biased against a totally unrelated political view.

The second intervention was intended to prevent counterarguing.  They looked at graphic, visual ways of presenting evidence whose basic meaning jumps out at you from the design.  But it doesn't give you any specific facts, so you can't argue against it.  In this condition, people were more receptive to, and more accepting of, information that contradicted their political opinions, even the important ones.

So next time you are in a political argument, start out by giving your opponent a self-affirmation.  "Look, you are one of the smartest people I know, so I am sure you will understand these new scientific findings."

Then, don't hedge what you say with an "on one hand . . . on the other hand . . ."  The person will just listen to the hand they agree with.  Stick with just the best, hardest, least questionable evidence that you have.

Unrealistic optimism

There was a fascinating paper in a recent issue of Nature Neuroscience that investigates why unrealistic optimism is so pervasive.  We all think our teams can win, we are above-average drivers, this lottery ticket will be the one.  The same is true with important judgments like whether we will pass the test, get the job, or have enough money to retire.  Over the past few decades, many possible reasons for this optimism have been suggested:

  • Perhaps because we like good news, we pay more attention to it. 
  • Perhaps because we like good news, we think about it more and therefore process it more deeply.
  • Perhaps good things happen more often, so we are more familiar with them. 
  • Perhaps good news is more exciting, so it increases brain activity during memory storage.
As you can probably guess from the name of the journal, these researchers hooked their participants up to fMRI machines to scan their brains and find out what causes the optimism.  They statistically controlled for all of the other possible explanations so they could be sure.  And here is what they found.

Positive information is processed in several parts of the brain, primarily the frontal cortex and the left inferior prefrontal gyrus.  Negative information only activates the left inferior prefrontal gyrus.  So when you see positive information, your mental model gets stronger based on the activation of several brain areas, but when you see an equal amount of negative evidence, your model changes based on just one area, so it changes less.  It's like that old Little Rascals episode with Alfalfa and Darla where Alfalfa is dividing up the candy.  One for you, one for me.  Two for you, one-two for me.  Three for you, one-two-three for me . . .
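
To see how that asymmetry plays out, here is a toy sketch (my own illustration; the learning rates and the evidence stream are made up, not taken from the study): a belief that is nudged harder by good news than by bad news drifts well above where perfectly balanced evidence should leave it.

```python
# A toy model of asymmetric belief updating (illustration only; the learning
# rates and the 50/50 evidence stream are made up, not taken from the study).
import random

random.seed(1)
belief = 0.5                 # estimated chance that things will turn out well
GOOD_NEWS_RATE = 0.3         # good news pulls the belief up strongly
BAD_NEWS_RATE = 0.1          # bad news pulls it down, but more weakly

for _ in range(1000):
    outcome = 1.0 if random.random() < 0.5 else 0.0    # evidence is perfectly balanced
    rate = GOOD_NEWS_RATE if outcome > belief else BAD_NEWS_RATE
    belief += rate * (outcome - belief)                 # simple delta-rule update

# With balanced evidence the belief should hover near 0.5; the asymmetric
# rates make it settle around 0.75 instead -- unrealistic optimism.
print(f"Belief after 1000 pieces of balanced evidence: {belief:.2f}")
```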

This is dangerous because optimism decreases the chance that we will take steps to protect ourselves.  We don’t always wear our seatbelt, we don’t always save enough for a rainy day, we don’t go to the doctor because “it’s probably nothing.”  Optimistic people have less stress and live longer.  But only as long as we don’t kill ourselves first.  So it is important to understand what to do about this. 

When you get information that supports your pre-existing ideas, take it with a grain of salt.  And if you get contradictory information, try to take it more seriously.  This is the opposite of what comes naturally, but could make your life much better.