Saturday, October 05, 2013

Meta-awareness is a strong form of expertise



This post comes from two things I read recently. One is a discussion on Big Think about trusting your common sense.  The other is a paper on “The Hobgoblin of Consistency” in the Journal of Personality and Social Psychology.

The Big Think article looks at experts.  Experts are good at something, and they know it.  So they get pretty confident that their instincts (which are just the application of well-learned rules) will lead them to success.  And it works, so it makes sense for them to rely on those instincts.  The problem arises when they do something where they are not experts.

It could be that the world changed when they weren’t looking.  This happens to experts all the time, as we all saw when the real estate market collapsed in 2007.  The “rule” that real estate prices always go up stopped working.  At the company level, it is a major reason for the failure of BlackBerry when companies changed to “bring your own device” policies.

Or it could be that they are in a new environment where there are different rules.  Apple did pretty well shifting from Macs to iMacs to iPods to iPhones to iPads.  But Sony couldn’t make the shift from TVs and Walkmen (their early mobile phones weren’t bad, but they never made it to smartphones).  Motorola couldn’t shift from feature phones to smartphones either.

Or it could be that they have no business applying a business rule to some other environment.  How many business execs try to “manage” their kids and spouses like employees (and usually fail)?

So Maria Konnikova at Big Think warns experts not to get too comfy with their instincts, to recognize that these are simply good domain-specific rules, and to switch rules when they switch domains.  If they don’t know the rules of the new domain, they shouldn't just apply the old ones.  They have to learn all over again.  This meta-awareness is the sign of true expertise.

The Hobgoblin paper is similar but looks at poor performers.  It examines the Dunning-Kruger effect, the crazy phenomenon in which people who perform badly somehow think they are doing well.  We all think we are better-than-average drivers, but Dunning-Kruger poor performers are really clueless.

The Hobgoblin paper identifies at least one reason why.  It turns out that people who apply a simple and logically rational rule on a consistent basis get very confident in their performance.  This makes sense until you realize how many simple and logically rational rules are totally wrong.  “The world is flat.”  It is simple.  It is logically rational; after all, the horizon extends pretty darn far and looks flat.  Or take the rule that objects naturally come to rest: balls don’t keep rolling when we put them down.  Simple, rational, and . . .  wrong.

Unfortunately, this confidence leads to blindness toward feedback.  After all, why look for feedback when you already know you are correct?  The study identified several kinds of people.  Experts who had a correct, simple, rational rule applied it blindly and did very well.  They were also very confident.  Poor performers who had an incorrect, simple, rational rule applied it just as blindly and did very poorly.  They, too, were very confident.  Then there was a huge group of people in the middle.  Some performed well, some performed poorly, and some were in between.  What makes them similar is that they all considered alternatives.  Whether they were consistently correct, consistently incorrect, or back and forth, just the fact that they considered alternatives made them less confident.

For the poor performers, considering alternatives at least leaves room to learn and get better.  The study didn’t do the follow-up to see if they really did, but at least it was possible.  They could also learn to avoid situations where they didn’t know what to do and defer to others instead.  But the blind, confident poor performers (can you say US Congress?) can’t.

2 comments:

Dave Marsay said...

I have often observed a similar effect when scientists and economists talk to policy makers. How confident can we be that areas like finance and climate change, which would seem to require a degree of realism and mutual respect, have been adequately worked through?

Do you have any thoughts on whether this is due to culture or some more innate property of those who become narrow experts? (As a mathematician, I always blame the nearest human scientist, but it looks like they may be getting to grips with the problem.)

Regards.

Trey Roady said...

Something that's also important is validation. Anyone can make perfect predictions in hindsight; it's the power of prediction that is truly valuable.

Part of the problem is that, as people, we're wonderful pattern-recognizers and frequently convince ourselves of our efficacy post hoc.

Without validation, you can simply build a Dunning-Kruger case where you have plenty of practice at "expertise" that doesn't reflect the actual situation.