Tuesday, March 2, 2010

Science's Dirty Secret

The problem with science isn't that most experiments fail - it's that most failures are ignored.

An interesting article in Wired magazine, courtesy of Dan Pink's blog, challenges what author Jonah Lehrer calls "the myth of objectivity".

In the early 1990s Kevin Dunbar undertook a research project at Stanford University looking into how scientific experiments fail or succeed. His first surprise was how often experiments were deemed failures when the results failed to support, or even contradicted, the original theories; the second was how frequently those dissonant results were simply ignored. In effect, the scientists had discovered a new fact, but discarded it as a failure because it did not meet their expectations.
As Lehrer writes: "The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we’re empiricists — our views dictated by nothing but the facts — we’re actually blinkered, especially when it comes to information that contradicts our theories. The problem with science, then, isn’t that most experiments fail — it’s that most failures are ignored."
Dunbar subsequently used fMRI to track the brain activity of both 'experts' and 'non-experts' as they watched two videotaped experiments, one accurate and one inaccurate. Not surprisingly, the part of the brain called the anterior cingulate cortex (ACC) - associated with the perception of errors and contradictions - was triggered in the experts' minds when they were shown the inaccurate video, and in the non-experts' minds when they were shown the video that was accurate but which they believed to be incorrect.

But Dunbar's analysis didn't stop there. Enter another area of the brain - the dorsolateral prefrontal cortex (DLPFC) - whose task is to censor thoughts that don't mesh with our preconceptions, a bit of a problem for a scientist aiming for objective observation. In Lehrer's words, if the ACC is the "Oh Shit" response, the DLPFC is the Delete key. When the ACC and DLPFC “turn on together, people aren’t just noticing that something doesn’t look right,” Dunbar says. “They’re also inhibiting that information.”
As Lehrer puts it: "The lesson is that not all data is created equal in our mind’s eye: When it comes to interpreting our experiments, we see what we want to see and disregard the rest... Belief, in other words, is a kind of blindness."
So what are we to do if we can't trust our 'objective' reality? In the section "How to Learn From Failure," Lehrer presents four main points, which he fleshes out in the article:
  1. Check your assumptions
  2. Seek out the ignorant
  3. Encourage diversity
  4. Beware of failure-blindness
To discover how we can learn to see more accurately, and make the most of our perceived failures, access the full article HERE.

Lehrer concludes:
What turned out to be so important, of course, was the unexpected result, the experimental error that felt like a failure. The answer had been there all along — it was just obscured by the imperfect theory, rendered invisible by our small-minded brain. It’s not until we talk to a colleague or translate our idea into an analogy that we glimpse the meaning in our mistake. Bob Dylan, in other words, was right: There’s no success quite like failure.
~~~~
In the beginner's mind there are many possibilities
In the expert's there are few
- Shunryu Suzuki Roshi

~~~~
Related RoadKill Posts:
Thought For Right Now - Success and Failure
Leave The Door Open
The Freedom To Not Know
Pay Attention
~~~~
