A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias
By Michael Shermer
The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises ... in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate. --Francis Bacon, Novum Organum, 1620
Pace Will Rogers, I am not a member of any organized political party. I am a libertarian. As a fiscal conservative and social liberal, I have found at least something to like about each Republican or Democrat I have met. I have close friends in both camps, among whom I have observed the following: no matter the issue under discussion, both sides are equally convinced that the evidence overwhelmingly supports their position.
This surety is called the confirmation bias, whereby we seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirmatory evidence. Now a functional magnetic resonance imaging (fMRI) study shows where in the brain the confirmation bias arises and how it is unconscious and driven by emotions. Psychologist Drew Westen led the study, conducted at Emory University, and the team presented the results at the 2006 annual conference of the Society for Personality and Social Psychology.
During the run-up to the 2004 presidential election, 30 men--half self-described as "strong" Republicans and half as "strong" Democrats--underwent fMRI brain scans while assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Not surprisingly, in their assessments Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own candidate off the hook.
The neuroimaging results, however, revealed that the part of the brain most associated with reasoning--the dorsolateral prefrontal cortex--was quiescent. Most active were the orbital frontal cortex, which is involved in the processing of emotions; the anterior cingulate, which is associated with conflict resolution; the posterior cingulate, which is concerned with making judgments about moral accountability; and--once subjects had arrived at a conclusion that made them emotionally comfortable--the ventral striatum, which is related to reward and pleasure.
"We did not see any increased activation of the parts of the brain normally engaged during reasoning," Westen is quoted as saying in an Emory University press release. "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts." Interestingly, neural circuits engaged in rewarding selective behaviors were activated. "Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones," Westen said.
The implications of the findings reach far beyond politics. A jury assessing evidence against a defendant, a CEO evaluating information about a company or a scientist weighing data in favor of a theory will undergo the same cognitive process. What can we do about it?
In science we have built-in self-correcting machinery. Strict double-blind controls are required in experiments, in which neither the subjects nor the experimenters know the experimental conditions during the data-collection phase. Results are vetted at professional conferences and in peer-reviewed journals. Research must be replicated in other laboratories unaffiliated with the original researcher. Disconfirmatory evidence, as well as contradictory interpretations of the data, must be included in the paper. Colleagues are rewarded for being skeptical. Extraordinary claims require extraordinary evidence.
We need similar controls for the confirmation bias in the arenas of law, business and politics. Judges and lawyers should call one another on the practice of mining data selectively to bolster an argument and warn juries about the confirmation bias. CEOs should assess critically the enthusiastic recommendations of their VPs and demand to see contradictory evidence and alternative evaluations of the same plan. Politicians need a stronger peer-review system that goes beyond the churlish opprobrium of the campaign trail, and I would love to see a political debate in which the candidates were required to make the opposite case.
Skepticism is the antidote for the confirmation bias.
First printed in Scientific American on June 26, 2006