Confirmation bias is of interest in the teaching of critical thinking, as the skill is misused if rigorous critical scrutiny is applied only to evidence challenging a preconceived idea but not to evidence supporting it.
Relatedly, Murphy's Law of Research holds that "Enough research will tend to support your theory."
Within a single experiment, confirmation bias on the part of the experimenter may exhibit itself as expectation bias in the final published results. Data agreeing with the experimenter's expectations may be more likely to be considered "good", while data that conflicts with those expectations may be more likely to be discarded as the product of assumed experimental error.
Subjects were shown the triple of numbers 2, 4, 6 and told that the triple conforms to a particular rule. They were then asked to discover the rule by generating their own triples and using the feedback they received from the experimenter. Every time a subject generated a triple, the experimenter would indicate whether it conformed to the rule. The subjects were told that once they were sure of the correctness of their hypothesized rule, they should announce it.
While the actual rule was simply "any ascending sequence", the subjects seemed to have a great deal of difficulty in inducing it, often announcing rules that were far more complex than the correct one. The subjects seemed to test only "positive" examples: triples that they believed would conform to their rule and confirm their hypothesis. What they did not do was attempt to challenge or falsify their hypotheses by testing triples that they believed would not conform to their rule. Wason referred to this phenomenon as confirmation bias, whereby subjects systematically seek only evidence that confirms their hypotheses.
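The asymmetry between positive and falsifying tests can be illustrated with a short sketch. The function name and the subject's hypothetical rule ("numbers increasing by two") are illustrative assumptions, not part of Wason's materials; only the actual rule, "any ascending sequence", is from the task itself.

```python
# Sketch of Wason's 2-4-6 task. The experimenter's actual rule:
# the triple is strictly ascending.
def conforms(triple):
    a, b, c = triple
    return a < b < c

# A subject who hypothesizes "numbers increasing by two" and tests only
# positive examples (triples expected to fit) receives pure confirmation:
positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print([conforms(t) for t in positive_tests])  # [True, True, True]

# A falsifying test, one the subject expects NOT to conform, is far more
# informative: it also conforms, refuting "increasing by two" outright.
print(conforms((1, 2, 10)))  # True
```

The point of the sketch is that every positive test leaves the overly narrow hypothesis standing, while a single deliberately disconfirming test exposes it.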
The confirmation bias was Wason’s original explanation for the systematic errors made by subjects in the Wason selection task. In essence, the subjects were choosing to examine only cards that could confirm the given rule rather than refute it. Confirmation bias has been used to explain why people believe pseudoscientific ideas.
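In the standard version of the selection task, four cards show E, K, 4, and 7, and the rule to test is "if a card has a vowel on one side, it has an even number on the other." A small sketch (function names are illustrative) shows which cards could actually falsify the rule:

```python
# Wason selection task: a card can falsify "vowel implies even" only if
# its hidden side could pair a vowel with an odd number.
def can_falsify(visible):
    if visible.isalpha():
        # A visible vowel might conceal an odd number.
        return visible.lower() in "aeiou"
    # A visible odd number might conceal a vowel.
    return int(visible) % 2 == 1

cards = ["E", "K", "4", "7"]
print([c for c in cards if can_falsify(c)])  # ['E', '7']
```

Subjects typically choose E and 4, the cards that could confirm the rule, while neglecting 7, the card that could refute it, which is the pattern Wason attributed to confirmation bias.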
It has been argued that, as with the matching bias, using more realistic content in syllogisms can facilitate more normative performance, while more abstract, artificial content has a biasing effect on performance.
One explanation may lie in the workings of the human sensory system. Human brains and senses are organised to facilitate rapid evaluation of social situations and of others' states of mind. Studies have shown that this behaviour is evident in the choice of friends, partners, and houses, even though it is largely subconscious. Although it can be a very fast process, the initial impression has a lasting effect, a byproduct of the brain's tendency to fill in gaps in what it perceives and of the believer's unwillingness to admit a mistake.
In 1979, Lord, Ross, and Lepper conducted an experiment to explore what would happen if they presented subjects harboring divergent opinions with the same body of mixed evidence. They hypothesized that each opposing group would use the same pieces of evidence to further support their opinions. The subjects chosen were 24 proponents and 24 opponents of the death penalty. They were given an article about the effectiveness of capital punishment and were asked to evaluate it. Then, the subjects were given detailed research descriptions of the study they had just read, but this time it included procedure, results, prominent criticisms and results shown in a table or graph. They were then asked to evaluate the study, stating how well it was conducted and how convincing the evidence was overall.
The results were congruent with the hypothesis. Subjects judged studies that supported their pre-existing view to be superior to those that contradicted it, in a number of detailed and specific ways. In fact, the studies all described the same experimental procedure, with only the purported result changed.
Overall, there was a visible increase in attitude polarization. Initial analysis of the experiment showed that both proponents and opponents reported shifting their attitudes slightly in the direction of the first study they read, but once subjects read the more detailed study, they returned to their original beliefs regardless of the evidence provided, pointing to the details that supported their viewpoint and disregarding anything contrary.
It is not accurate to say that the subjects deliberately viewed the evidence in a biased manner; rather, because they already held such strong opinions about capital punishment, their reading of the evidence was colored by their point of view. Looking at the same piece of evidence, an opponent and a proponent would each argue that it supported their own position, thus pushing contrary opinions even further into their opposing corners.
Polarization can occur in conjunction with other assimilation biases, such as illusory correlation, selective exposure, or the primacy effect. The normative model for this bias is the neutral evidence principle. A formulated belief can prevail even if the evidence used in its initial formation is entirely negated.
"I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life".
A related Tolstoy quote is:
"The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him."
Jonathan Baron describes many instances where myside bias affects our lives. For example, students who perform poorly suffer from irrational belief persistence when they fail to criticize their own ideas and remain rigid in their mistaken beliefs. These students suffer from myside bias because they do not look for, or tend to ignore, evidence against their mistaken claims. Baron also mentions certain forms of psychopathology as good examples of myside bias. Delusional patients, for instance, might persist in believing that a cough or a sneeze means they are dying, even when doctors insist that they are healthy.
Aaron T. Beck describes the role of this type of bias in depressive patients. He argues that depressive patients maintain their depressive state because they fail to recognize information that might make them happier, and focus only on evidence showing that their lives are unfulfilling. According to Beck, an important step in the cognitive treatment of these individuals is to overcome this bias, and to search for and recognize information about their lives more impartially.
Morton was at one time a Young Earth creationist who later disavowed this belief. The demon was his way of referring to his own bias and that which he continued to observe in other Young Earth creationists. With time it has become a common shorthand for confirmation bias in a variety of situations.