Monday, October 26, 2009

Motivated Reasoning - A Non-Religious Analysis That Is Also Applicable to Religion

In the September 25, 2009 edition of The Atlantic, Lane Wallace wrote an article called "All Evidence to the Contrary."  The article examines two specific examples of irrational cognitive bias that can be explained by a cognitive pattern called "motivated reasoning."  Quoting from the article, "Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy."  Perhaps a simpler way of putting it is, "When I know something, don't try to confuse me with the facts."  People are motivated to reason their way to supporting their currently held position, in spite of what the facts say.

In the first example, Wallace tells of two competing claims to having been the first to reach the North Pole.  Robert E. Peary led one expedition, and Frederick A. Cook led the other.  Each claimed to have reached the Pole while arguing that the other's claim was untrue.  This controversy has continued for over 100 years.  Interestingly, as more evidence and data have been collected over that time, it seems most likely that neither of them actually reached the North Pole.  Based upon the currently available evidence, neither explorer apparently made the celestial measurements needed to confirm that he had arrived at the North Pole.  From the article:

Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth.  They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle gets extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.

The second example relates to solving the mystery of a young man's disappearance, a mystery that had been mythologized over many decades.  David Roberts published an article in National Geographic Adventure and received a backlash of anger and even threats for bringing forward a probable solution to the mystery.

In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk myth movement sprang up around this young man who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests that were compatible with other family members.

The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar.

Both of these examples should ring a bell for readers of this blog.  Apologetic thinking and the unreasoned thinking of many religious believers may come to mind.

Some psychologists have theorized that most people employ "Bayesian updating" to change their attitudes, perceptions, and beliefs.  The basic idea behind Bayesian updating is that when people receive data that contradict currently held beliefs, they rationally consider and process the data, perhaps making incremental adjustments to their beliefs.  "But researchers at Northwestern University found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic."  To me, this seems to be a classic representation of confirmation bias.  This bias, simply stated, is that we see what we believe.  I will sketch a toy numerical example of this kind of updating after the next excerpt.

...although the researchers pointed out that this finding, itself, runs counter to the idea that the reason people continue to hold positions counter to all evidence is because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.

Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues.
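
To make the Bayesian-updating idea concrete, here is a toy sketch of my own in Python; it is not from the article or the Northwestern study, and every probability in it is invented purely for illustration.  Bayes' rule says the updated belief equals the prior belief times the likelihood of the evidence assuming the belief is true, divided by the total probability of the evidence either way.

    # A minimal sketch of Bayesian updating. My own illustration, not
    # from the article; all numbers are invented for demonstration.
    def bayesian_update(prior, p_evidence_if_true, p_evidence_if_false):
        """Return P(belief | evidence) via Bayes' rule."""
        numerator = p_evidence_if_true * prior
        denominator = numerator + p_evidence_if_false * (1.0 - prior)
        return numerator / denominator

    # Suppose I start out 90% confident that Peary reached the Pole.
    # New navigation analysis would be much more likely to surface if
    # he did NOT reach it (say 80%) than if he did (say 10%).
    belief = 0.9
    belief = bayesian_update(belief, p_evidence_if_true=0.1,
                             p_evidence_if_false=0.8)
    print(round(belief, 2))  # ~0.53: confidence drops, doesn't vanish

    # A second, independent piece of similar evidence lowers it further.
    belief = bayesian_update(belief, p_evidence_if_true=0.1,
                             p_evidence_if_false=0.8)
    print(round(belief, 2))  # ~0.12

A motivated reasoner, in effect, never concedes that the unwelcome evidence is more likely under the rival hypothesis, so the belief never moves no matter how much data arrives.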

This is unlikely to surprise anyone who walks on the fringes of Mormonism, or of any other literalistic, conservative religion for that matter.  People are unlikely to give up their previously held beliefs easily.  To do so is painful.  Kleiman suggested that part of the reason is:

"the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."

People have an innate need to be right.  Their team is the right team.  Their idea is the best.  Their group is "chosen."  To work towards a solution, Kleiman points to philosopher Karl Popper.  Kleiman says that Popper "believed fiercely in the discipline and teaching of critical thinking because it 'allows us to offer up our opinions as a sacrifice, so that they can die in our stead.'"

I found this article intensely interesting.  It is useful to me that the article does not deal directly with religion, yet it is clearly applicable to it.  I will add more on this topic in another post, focused on the Northwestern study.
