Are you the sort of person who will change their position on a subject when given further information? I personally think this is the behaviour we should all be striving for. If we do not, we end up withdrawing further and further from reality, insulating ourselves from rational argument and retreating into our own self-contained worlds of comforting fantasy. Not unlike a Pauly Shore fan.*

Seriously though, when confronted with information that contradicts previous knowledge there are a number of possible reactions. One is to reject the previous knowledge in favour of the new information; there are good reasons why we shouldn't do this uncritically. If we simply accepted every new thing we were exposed to, we would become like motes in the wind, changing direction constantly. Long-term decisions and actions would become impossible and our lives would be subject to the merest whim. A poor strategy.

A different approach is to consider the new information in the context of previous knowledge, determine which is more likely to be correct and then act accordingly. This is the most favourable approach in my view, but it may still result in a distressingly high chance of having to change your mind on a regular basis, again possibly making it difficult to forge and maintain long-term alliances and life strategies. A variation on this theme might be strategically best.

Another option is to summarily reject any new fact or opinion that runs counter to your own. While this may make life simpler, it is also fraught with difficulties in that it can make us too rigid and resistant to change. This is a recipe for ultimate downfall. A variation on this is to not only discount opposing views but to actually reinforce your commitment to prior beliefs.

This last variation was investigated in the paper "When Corrections Fail: The Persistence of Political Misperceptions". The primary experiment demonstrating the effect involved presenting 130 participants with fake news stories about the US invasion of Iraq. The news stories contained an actual quote from then-President Bush regarding the possible existence of WMDs and the risk that this capability would be passed to terrorist groups.

The stories were then split into two conditions: one version included information from the Duelfer Report stating that, directly prior to the US invasion, Iraq had neither WMD stockpiles nor an active program to produce them, while the other version omitted this information.

Subjects were then asked to rate how much they agreed (on a 5-point scale from "Strongly agree" to "Strongly disagree") with the following statement:

“Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.”

When the responses of the subjects who received the corrective information were plotted against their political stance (liberal to conservative), it was found that conservative subjects were more likely to agree with the statement, with more conservative subjects agreeing more strongly.
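In other words, the key result hinges on an interaction between whether a subject received the correction and that subject's ideology. As a rough illustration of what such a test looks like (not the authors' actual analysis, and using entirely made-up data with hypothetical variable names), a minimal sketch in Python might be:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 130  # roughly the sample size mentioned above

# Hypothetical data: ideology coded from -3 (strongly liberal) to +3 (strongly
# conservative); correction = 1 if the subject saw the Duelfer Report correction.
df = pd.DataFrame({
    "ideology": rng.integers(-3, 4, n),
    "correction": rng.integers(0, 2, n),
})

# Simulate a backfire-style pattern purely for illustration: the correction
# lowers agreement for liberals but raises it for strong conservatives.
df["agreement"] = (
    3.0
    + 0.3 * df["ideology"]
    + df["correction"] * (-0.5 + 0.4 * df["ideology"])
    + rng.normal(0, 0.8, n)
).clip(1, 5)

# The quantity of interest is the correction x ideology interaction term: a
# positive coefficient means the correction's effect shifts towards *more*
# agreement as ideology becomes more conservative.
model = smf.ols("agreement ~ correction * ideology", data=df).fit()
print(model.params)
print(model.pvalues)
```

On real data, a positive and reliable interaction term of this kind is the statistical signature of the correction backfiring among conservatives.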

It would seem from this that new information which not only contradicts previous knowledge but clashes with deeply held ideological convictions will have what the researchers termed a "backfire" effect. In this case the information apparently strengthened the previously held belief.

Now, the experimental conditions apparently favoured an especially strong effect on politically conservative individuals, but I don't think that those on the opposite end of the political spectrum are immune to this phenomenon. A further experiment aimed at this group did not show a clear "backfire" effect, but it did show that increasingly liberal subjects were less likely to be affected by the corrective information, in this case regarding the incorrect notion that stem cell research had been banned in the US.

In conclusion I think that, as always, we should take care to examine all of our beliefs critically and evaluate new information not only on whether it conforms to those beliefs but on how accurately it conforms to reality.

*I don’t really mean that.

Thanks to the Badscience blog for the topic. Read Ben Goldacre’s post here.

Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions Political Behavior, 32 (2), 303-330 DOI: 10.1007/s11109-010-9112-2
