Monday, August 20, 2012

self-deception

Study after study has found that the vast majority of people think they are above average, less biased, more congenial, less susceptible to improper influence, and more competent than the majority of their peers.

Ninety-four percent of university professors think they are better at their jobs than their colleagues. Seventy percent of college students think they are above average in leadership ability. Only two percent of college students think they are below average in leadership ability. Twenty-five percent of college students believe they are in the top one percent in terms of their ability to get along with others.

Eighty-five percent of medical students think it’s improper for politicians to accept gifts from lobbyists, but only forty-six percent think it’s improper for physicians to accept gifts from drug companies. A study of medical residents found that eighty-four percent thought their colleagues were influenced by gifts from pharmaceutical companies, but only sixteen percent thought that they themselves were similarly influenced.

Most people believe that those who agree with them on issues they consider important are high-minded, diligent, and keen observers of the human condition, while those who disagree with them are biased, sloppy thinkers acting on selfish motives with little concern for the truth.

Self-deception is natural and pervasive, and while an exaggerated opinion of our abilities or character may sometimes boost our sense of well-being, self-deception is probably always detrimental to critical thinking. Those who consider themselves immune to cognitive biases, while supposing all their opponents are biased and ill-motivated, are unlikely ever to correct the errors they maintain. Those who think they are immune to the effects of bribes, while their colleagues are not, are likely to deceive themselves into thinking they are acting properly when, in fact, their behavior crosses the line into immorality. Those who think they are immune to flattery and appeals to their vanity are likely to be suckered into actions they will later regret.

The human brain evolved to be a great deceiver. One of the most common deceptions we face is thinking we see patterns and meaning in random events or coincidences. We like to fit our perceptions into a running narrative that holds our worldview together, whatever that worldview might be. Because of this natural tendency to confirm what we already believe and discount evidence that conflicts with it, science has developed methods such as the double-blind, randomized, controlled study to minimize self-deception. The likelihood of self-deception increases for those who give greater weight to the evidence of personal experience than to the evidence of scientific studies.

Does this mean that scientists are free from self-deception? Of course not. Scientists can unconsciously bias experiments to confirm their beliefs, while convincing themselves that they are unbiased and objective. Experimenters can unconsciously influence human subjects with their expectations and biases, and thereby skew their data to favor a treasured hypothesis. One of the more famous examples of self-deception by a scientist is that of French physicist RenĂ© Blondlot, who thought he had discovered a new form of radiation. He named it the N-ray, after his university and home town of Nancy. The N-ray was a delusion, however. Actually, it was a “self-induced visual hallucination,” as Martin Gardner noted. Recent examples of self-deception in science have been provided by Pons and Fleischmann, who claimed to have produced energy from cold fusion, and by Jacques Benveniste, who claimed to have proof that the water in homeopathic potions has a selective memory.

There are numerous ways that scientists can deceive themselves about their objectivity and fairness in designing and evaluating experiments. One common way that many scientists deceive themselves is in how they use statistics. Many parascientists—those who study ESP, psychokinesis, distance healing with chants or prayers, homeopathy, and the like—have deceived themselves regarding the value of statistical significance and meta-analysis. They assume that if the data from a single study show results unlikely to be due to chance, according to some arbitrary mathematical formula, then they have found convincing evidence for their hypothesis. Or they lump together the data from several small studies and apply an arbitrary statistical formula to the combined data as if it had been collected in a single larger study; if they find statistical significance, they think they’ve found strong evidence for their beliefs.
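The statistical trap described above can be illustrated with a minimal simulation (all numbers here are hypothetical, not from any actual parapsychology study). Suppose subjects guess one of four hidden cards purely at random, so the chance hit rate is 25% and there is no real effect. Run enough small studies and some will still come out "significant" at p < 0.05 by luck alone:

```python
# Hypothetical sketch: 200 small "ESP" studies with NO real effect.
# Each study: 100 guesses at one of four cards (chance hit rate = 0.25).
# Even under the null, roughly 5% of studies reach p < 0.05 by chance.
import random
import math

random.seed(42)  # fixed seed so the run is reproducible

def one_sided_p(hits, n, p0=0.25):
    """Normal-approximation p-value for observing at least `hits` successes."""
    mean = n * p0
    sd = math.sqrt(n * p0 * (1 - p0))
    z = (hits - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail probability

n_studies, trials = 200, 100
significant = sum(
    one_sided_p(sum(random.random() < 0.25 for _ in range(trials)), trials) < 0.05
    for _ in range(n_studies)
)
print(f"{significant} of {n_studies} null studies were 'significant' at p < 0.05")
```

A believer who reports only the handful of "significant" studies, or who pools cherry-picked small studies into one meta-analysis, can manufacture apparent evidence from pure noise; that is why a single significant result in a small study is weak evidence on its own.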

Unlike those who rely solely on personal experience as a guide, however, scientists have other scientists who criticize their work and publish their criticisms in peer-reviewed journals, at public meetings, and in the mass media. Personal biases are easy to reinforce by associating with people who share them and by distancing yourself from those who disagree with you. Scientists can do this, too, but it is much more difficult for a self-deceived scientist to get away with his or her deceptions, since science is essentially self-correcting over the long haul. Science, as Jacob Bronowski put it, “is a tribute to what we can know although we are fallible.”

Self-deception encrusts the worldviews of the arrogant, whether they be highly intelligent or simply incompetent. Highly intelligent people who are arrogant are capable of fending off any counterevidence to their beliefs. These are the folks described by Michael Shermer in his attempt to explain why some smart people believe dumb things. They are the great rationalizers. At the other extreme are those who are cognitively incompetent; they are incapable of recognizing their erroneous judgments. The cognitively incompetent are not necessarily stupid. Often enough they are of average intelligence, but lack knowledge and experience relevant to understanding how self-deception works and how we are all susceptible to many cognitive illusions and biases.

So, even though it might be true that if we were too brutally honest and objective about our own abilities and about life in general, we might become debilitated by depression, we should not forget that there is a dark side to self-deception. Our ability to think critically depends on our ability to overcome the many biases that lead us to self-deception.

3 comments:

  1. Once again Bob Carroll provides an excellent summary of a complex topic.

  2. Stumbled across this by accident and am very impressed. I expect to buy a copy of your book soon. Thanks.