Humanist Perspectives: issue 205: Belief in the Face of Contrary Evidence

Belief in the Face of Contrary Evidence
by James Alcock

Photo: Trump crowd, by Michael Candelori

Excerpt from: Alcock, James E. (2018). Belief: What It Means to Believe and Why Our Convictions Are So Compelling. Amherst, NY: Prometheus Books. (pp. 180–186)

Author’s note: While most of us like to consider our beliefs to be rational and reasoned, many beliefs form automatically without being vetted by critical thinking. And while we might like to think that we will readily correct important beliefs when challenged by disconfirming evidence, humans have a remarkable capacity for protecting and preserving cherished beliefs from challenge. Some of the ways that we do so are discussed in the following excerpt.

Every man who attacks my belief, diminishes in some degree my confidence in it, and therefore makes me uneasy; and I am angry with him who makes me uneasy.
~Samuel Johnson (1709–1784)

Even if we like to think of ourselves as open-minded and willing to consider new information carefully, there are bound to be times when that is not the case. For one thing, people are often unwilling to admit their mistakes because to do so would challenge self-esteem, and the higher one’s station, the greater the risk of public humiliation and blame. As psychologists Carol Tavris and Elliot Aronson point out, when people are directly confronted by evidence that they are wrong, most do not change their point of view or course of action but instead strive to justify it even more tenaciously.

As for open-mindedness, we are all usually quite selective in terms of what information we bother even to consider. The deeply rooted capitalist is unlikely to pore over books on communist theory, the devout Christian does not give serious consideration to atheist arguments, the atheist is not interested in Christian dogma, and the environmentalist is not likely to spend time considering industry arguments that downplay environmental concerns.

Furthermore, it is not all that difficult to defend a firm belief against challenges and contrary information. Indeed, research has found that beliefs can survive even when the evidence that originally supported them has been completely demolished. As psychologists Lee Ross and Craig Anderson have observed,

…beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential basis.

Several psychological processes operate to protect and preserve cherished beliefs even in the face of compelling contradictions.

Discounting principle

Belief can be preserved by discounting the source of contrary information. One might have expected that Donald Trump’s insistence that Barack Obama was not eligible for the presidency of the United States because he had been born outside the country would have ended once Obama produced his birth certificate proving that he had been born in Hawaii. Trump and his followers instead discounted the source of the evidence by questioning the authenticity of the birth certificate.

Another example: Edgar Nernberg is an amateur fossil expert with a strong belief in Creationism (the belief that God created the universe and its earthly inhabitants about six thousand years ago). While operating a backhoe to excavate a basement in 2015, he uncovered the outlines of five fish embedded in sandstone. Immediately recognizing their significance, he contacted a paleontologist, who subsequently concluded that the fish had lived some 60 million years ago. If true, that would directly counter Nernberg’s Creationist belief. However, according to a newspaper report, Nernberg dismissed the paleontologist’s fossil-dating as erroneous and instead maintained that the fish most likely lived shortly before the Great Flood described in the Bible, some 4,300 years ago by his reckoning. By discounting the expert’s assessment, he was able to maintain an important belief.

Loopholism

Instead of discounting the source of contradictory information, people can also preserve belief by downplaying the relevance of the new facts. This is what psychologist Ray Hyman refers to as loopholism. The new information is rejected by finding a loophole, arguing that “it is not the same thing.” For example, if research finds that a homeopathic remedy has no effect, the homeopathic believer may dismiss the research on the grounds that the remedy was not properly prepared, or that the circumstances under which it was administered were not appropriate.

Rationalization

People generally find inconsistency among their various beliefs, feelings, and behaviors to be uncomfortable. Psychologist Leon Festinger labeled this discomfort cognitive dissonance. The inconsistency can often be removed through rationalization. For example, suppose that you have always believed that people should do everything they can to get a good education, but you have just abandoned your university studies in order to have more time to go traveling with your friends. This produces cognitive dissonance. You remove it by rationalizing that while you can always go back to school, if you withdraw from your friends at this time you may lose them. Therefore, you conclude, you have made a good decision.

Confirmation bias

We all tend to seek information that supports our beliefs, although we may not be aware of doing so. Often, this results in a confirmation bias, which can contribute to misinterpretation of evidence. For example, many people have made efforts to find something magical or paranormal in the shapes of the Egyptian pyramids. Some have claimed that the dimensions of the pyramid correspond to the dimensions of the earth, suggesting the builders must have had knowledge far beyond what the Egyptians of the time could have possessed. There are many aspects of the pyramids that can be measured, and focusing only on those that fit with one’s belief can generate what appears to be a profound correspondence, even if it is actually meaningless.

Scientists, too, can become so invested in a particular theory or hypothesis that a confirmation bias influences their interpretation of new information. Sociologist Ian Mitroff studied forty-two scientists involved in soil analyses during the moon exploration program. Some were theoreticians and others were empirical researchers. Each was interviewed about their predictions concerning the composition of the moon rocks that were being collected and then interviewed again after the rocks had been brought to earth and analyzed. When faced with actual data from soil analysis that contradicted their predictions, the empiricists modified their views in line with the data, but the theoreticians tended to maintain their prior beliefs and find ways to interpret the data so that it was not incompatible with their predictions. Physicist Max Planck (1858–1947) was referring to such intransigence of scientific belief when he wrote,

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.

More recently, two groups of researchers, one whose members believed that playing video games leads to increased aggression and another whose members had found no evidence to support that claim, were provided with new evidence that was actually mixed and inconclusive. Individuals in both groups interpreted the information as supporting their own position; as a result, each group became more confident in its views, and the gap between the two groups’ beliefs widened.

Confirmation bias often shows up during disputes, in which case it is sometimes referred to as the myside bias. Because of a preference for “my side” of an issue, an individual’s beliefs interfere with evaluating whatever challenges those beliefs, and, as a result, opposing arguments are often given short shrift. In such disputes, not backing down plays an important role, and people typically engage in debate primarily to justify their own point of view. Research has found that the myside bias is not related to intelligence, but is influenced more by how skilled one is at logic and reasoning. Odd as it may seem, people who maintain that good arguments should be based in facts are actually more prone to the myside bias, for they often overvalue their own “facts.”

Pollyanna principle

In general, we prefer pleasant beliefs over unpleasant ones. It is more pleasant to believe, for example, that most people would help a stranger during an emergency than to believe otherwise. The Pollyanna principle leads us to apply a lower standard of evidence when assessing information that we welcome.

Compartmentalization

People are often successful in isolating incompatible beliefs from each other. Religious beliefs are often isolated from secular beliefs, so that changes in the latter do not affect them. Thus, a scientist may at the same time be a devout Christian, Hindu, or Muslim and espouse beliefs that are directly at odds with her scientific beliefs. The same is often true with regard to beliefs in psychic phenomena. Such compartmentalization, even by scientists, has long been recognized. In 1896, William James observed,

At one hour scientists, at another they are Christians or common men, with the will to live burning hot in their breasts; and holding thus the two ends of the chain, they are careless of the intermediate connection.

Similarly, psychologist Gordon Allport wrote in 1955,

No paradox is more striking than that of a scientist who as citizen makes one set of psychological assumptions and in his laboratory and writings makes opposite assumptions respecting the nature of man.

Disconfirmation and the boomerang effect

As noted earlier, people are less likely to back down from their belief if they have declared it publicly. Indeed, falsification of an important belief that has been publicly shared with others can at times boomerang and actually strengthen the belief. A good example is that of a self-styled 1950s religious leader, Dorothy Martin (1900–1992), who became better known as “Marian Keech,” the pseudonym given her by the social psychologists who infiltrated her group of believers and observed their reactions first-hand on the night she predicted the world would end. Mrs. Keech claimed to have received a message from God indicating that the world was about to be destroyed by a flood. Reflecting her earlier involvement with Dianetics, the precursor to Scientology, she persuaded her followers that spaceships would be sent by an advanced extraterrestrial society to save them.

Following her instructions, her adherents prepared themselves for rescue on the appointed date. They had sold their possessions, quit their jobs, and said goodbye to their disbelieving families. They were careful to remove from their clothing all bits of metal that might interfere with the spaceships’ electronics, including the metal eyelets on their shoes and the hooks from their brassiere straps. Having endured the disbelief and even ridicule of friends and family, they now gathered together at the appointed hour. Nothing happened. Hours passed but no spaceships came, and the world did not end. More hours passed before Mrs. Keech retired to her bedroom to pray. She emerged a short time later with the glorious news that, because of their show of faith, God had decided to spare the world.

While one might expect that her followers would be disillusioned and fall away from Mrs. Keech, their reaction was just the opposite. They began to proselytize, making efforts to bring others into her group. They sought interviews with newspapers and, contrary to their earlier avoidance of publicity, undertook a campaign to spread their beliefs as widely as possible. The explanation for this strange outcome lies in cognitive dissonance. Members of the group either had to admit to themselves and their friends and family that they had been foolish, or they could continue to believe that they had acted reasonably and that their faith had spared the world. They essentially doubled down, extolling the importance of Mrs. Keech’s teachings in an effort to persuade others.

The psychologists who observed Mrs. Keech’s group concluded that several conditions are necessary for people to become even more strongly committed to a belief after it has been proved false:

  1. The belief has had some influence on the believer’s behavior, making it observable.
  2. The individual has taken some nearly irrevocable action in light of the belief.
  3. The belief must be capable of being refuted by real events.
  4. The disconfirming evidence must be recognized by the believer.
  5. Subsequent to the disconfirmation, there must be continuing social support for the original belief. This became apparent when others of Mrs. Keech’s devotees who lived in another city and lacked the group support soon fell away from her influence and gave up their beliefs after her prophecy failed to materialize.

Mrs. Keech, of course, was not alone in predicting the end of the world, and other apocalyptic predictions come and go. For example, in 2011, evangelist Harold Camping concluded, based on his own biblical analysis, that the world was to end that year. A series of earthquakes would move around the world on May 21, striking each region at 6 p.m. local time, bringing with it the Rapture through which 200 million devout Christians would be taken up into heaven as a prelude to the second coming of Christ. The world was then to be destroyed by a fireball five months later, on October 21. His California-based religious broadcasting group, Family Radio International, spent millions of dollars advertising the May 21 Judgment Day around the world, and this included putting up five thousand billboard announcements around the United States. In anticipation, many of Camping’s followers abandoned their extended families, quit their jobs, sold all their possessions, and budgeted their finances on the basis that they would not need money after May 21.

The failure of the Rapture to materialize was all but incomprehensible within their belief system. They were presumably mightily relieved when two days later, on May 23, Camping announced that he had misunderstood, but now realized that May 21 had been “an invisible Judgment Day,” when “Christ came and put the world under judgment.” He maintained that the timetable was unchanged and that the world would end with the Apocalypse on October 21, 2011. As we know, he was mistaken. Unlike Mrs. Keech, Camping apologized for his error, depriving his followers of the opportunity to believe that their faith had saved the world.

James Alcock is Professor of Psychology at Glendon College, York University. He is a Member of the College of Psychologists of Ontario, a Fellow of the Canadian Psychological Association, and is a long-serving member of the Executive Committee of the International Committee for Skeptical Inquiry.
