Monday, February 27, 2012

irrelevant appeal to authority

The irrelevant appeal to authority is a fallacy in reasoning in which one argues that a practice or belief is justified because some authoritative person or text asserts it.

If a practice or belief is justified, there must be good reasons for it, and those reasons should explain why the practice is a good one or why the authoritative person or text supports it. The irrelevant appeal to authority differs from the appeal to an irrelevant authority, an example of which is given below.

An example of an irrelevant appeal to authority would be to claim that vaccines are not safe because Dr. Jay Gordon, a pediatrician and assistant professor of pediatrics at UCLA Medical School, says they're not. Quoting Gordon's reasons does not make the appeal to his belief relevant to whether vaccines are safe. The following claims don't become true just because Dr. Gordon asserts them:
Studies showing that vaccines and their many constituents do not contribute to this problem [of triggering autism] are flawed, filled with specious reasoning and, for the most part funded by the pharmaceutical industry. Even articles in reputable medical journals are often written by doctors with an economic interest in continuing the vaccination program's status quo. This does not invalidate all of these studies but it certainly makes them suspect and a poor foundation for an argument excluding vaccines from the list of environmental influences on the increase in autism in America and elsewhere.
Since there could be nothing more relevant than scientific studies to the issue of whether vaccines trigger autism, it begs the question to dismiss scientific studies as "suspect." To cite Dr. Gordon in support of not considering scientific studies when trying to determine whether vaccines trigger autism is irrelevant. A proper approach would be to analyze and evaluate the studies that defenders of the safety of vaccines put forth as the best ones showing there is no association between vaccines and autism. That is the approach Dr. Gordon should take, and it is the approach anyone citing him to support the belief that vaccines aren't safe should take. Dr. Gordon may be an expert in medicine, but the value of the studies on the association between vaccines and autism depends on the nature of those studies, not on his say-so. In any case, there are many other experts, just as qualified as Dr. Gordon, who disagree with him. The fact that Gordon and other experts disagree with each other does not make the issue controversial, however. Gordon is out of step with the consensus of medical experts that vaccines are safe and not associated with autism. Finding an outlier who disagrees with the scientific consensus does not mean you've established that there is a controversy over an issue. Some in the mass media present outliers in a feeble attempt at fairness (pseudosymmetry). To be controversial, an issue must involve widespread disagreement among the experts.

(As an aside, I have looked at the scientific studies, and my opinion is that there is no compelling evidence that vaccines are associated with autism or that pharmaceutical firms have corrupted the research process in this area. Don't take my word for it, though. Read what I have to say about the studies and then check them out for yourself.)

An example of the appeal to an irrelevant authority would be appealing to the advice of an actress with no education or background in medicine to justify seeking some offbeat cancer treatment or to justify claiming that common vaccines are harmful to children. Citing Jenny McCarthy on scientific or medical issues is to cite an irrelevant authority. Being the mother of a child you declare is autistic does not make you an instant expert, no matter how many conversations you've had with supporters like Dr. Jay Gordon.

It’s often the case that arguers combine the irrelevant appeal to authority with the irrelevant appeal to popularity. If it is irrelevant to appeal to one authority to prove a point, then it is irrelevant to appeal to many authorities to prove the same point. However, it is not always irrelevant to appeal to authorities. If you know nothing about medicine and your physician goes over the results of a medical test with you and recommends a course of action, you are not committing the fallacy of irrelevant appeal to authority when you justify taking that action because your physician recommends it. You might consult another physician for a second opinion, but you would be foolish to consult, say, the janitor, Suzanne Somers, or the local newspaper’s astrologer.

We must rely on experts sometimes, but experts don’t always agree with each other. If, for example, your medical test involved some back problems you’ve been having, you might get five different opinions from five equally competent physicians on what course of action would be best for you. Why? Recommendations for back problems are notoriously controversial. It would obviously be silly to claim that one recommendation must be the best one since it was made by an expert when there are five different recommendations from five equally competent experts. Ultimately, you should consider all the pros and cons of each of the recommendations and select the option that seems best to you. On the other hand, if four of five equally competent physicians recommend the same course of action, unless you can find a compelling reason for rejecting their position, it would seem that the reasonable course of action would be to follow their advice.

When the majority of experts in a field agree on something, we say there is a consensus. Such is the case with climate experts on the issue of anthropogenic global warming. There are many people, some of them scientists, who do not agree with the consensus that human activities, such as deforestation and the burning of fossil fuels that add greenhouse gases like carbon dioxide to the atmosphere, are causing changes in our planet's climate that may prove devastating and irreversible. One tactic of the climate change deniers is The Petition Project, which features over 31,000 scientists signing a petition stating "there is no convincing scientific evidence that human release of carbon dioxide will, in the foreseeable future, cause catastrophic heating of the Earth's atmosphere." It is true that 31,000 scientists is a large number, but it is irrelevant to the issue of whether humans are largely responsible for climate change. Most of those 31,000 scientists aren't experts in climate science, and in this case that matters: when people speak outside their own area of expertise, their view carries no more weight than that of any other non-expert.

What makes it reasonable to accept anthropogenic climate change is not the fact that almost all climate scientists agree. It's why they agree. Even non-experts can figure out that the experts agree: a survey of all peer-reviewed abstracts on the subject 'global climate change' published between 1993 and 2003 showed that not a single paper rejected the position that global warming is largely caused by human behavior. Climate scientists are not arguing about whether global warming is happening. They're not arguing about whether humans are largely responsible for it. They may be arguing about what action to take, and in that case they should be considered advisers by those who make policy. Unfortunately, many of those who make policy seem to be ignoring the climate scientists in favor of beliefs pushed by gas, oil, and other corporate interests. Those interests should be considered, but not to the exclusion of the science experts.

Sunday, February 19, 2012

backfire effect

The backfire effect is a curious response many people have to evidence that conflicts with their beliefs: instead of becoming open to the possibility that the evidence might be correct and that they might have to change their minds, many people become more convinced that they were right in the first place. Yes, that's right. Some people's beliefs get stronger when evidence against those beliefs is presented to them. You would think that a rational person would base beliefs on the strength of the evidence, and that evidence against a belief should weaken rather than strengthen it, but a growing body of scientific research has found that most of us are not that rational when it comes to dealing with evidence that conflicts with beliefs we already hold.

Journalist David McRaney sums up the backfire effect nicely: "When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger."

Political scientists Brendan Nyhan and Jason Reifler coined the term “backfire effect” to describe this irrational response of holding one's original position even more strongly when confronted with evidence against it. For journalists and anyone engaging in debate who hopes to persuade others, change minds, or correct misinformation, the backfire effect is more than annoying; it suggests our efforts are pointless and we are guaranteed to fail. Why bother to provide evidence for global warming or point out the errors of the climate change deniers if the better our arguments are, the more the deniers will dig in and be encouraged in their errors? Why bother explaining evolution to young Earth creationists and rebutting their inane arguments when the likely effect is to strengthen their erroneous beliefs? Why waste time explaining to anti-vaccinationists the benefits of vaccination and the harm done by not vaccinating children when all we are likely to accomplish is to fuel their hostility toward the truth? It becomes a futile exercise to argue with people who believe Obama is a Muslim or wasn't born in Hawaii. Not only will no amount of evidence change their minds, but the more evidence we provide to show they're wrong, the stronger their conviction becomes that they're right.

Is there any hope, then, of debunking myths such as the birther myth that President Obama was not born in Hawaii, the creationism myth that a magical being created all species at once and there has been no evolution, or the anti-vaxxer myth that vaccinations are chock full of harmful substances that cause everything from mental retardation to autism to death and so should not be given to our children? According to John Cook of the Global Change Institute, University of Queensland, and Stephan Lewandowsky, School of Psychology, University of Western Australia, there is a way to debunk myths that shows some promise.

...an effective debunking requires three major elements. First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation. (The Debunking Handbook.)

Sounds simple, eh? Anyway, I think there is little dispute among those of us who frequently engage in public arguments that simply providing good information is not enough to change the hearts and minds of those we argue against. On some issues, the propaganda machinery is a Goliath that no army of independent Davids can hope to take down. For example, Donald Prothero [How We Know Global Warming is Real and Human Caused] reports that "the day that the 2007 IPCC report was released (Feb. 2, 2007), the British newspaper The Guardian reported that the conservative American Enterprise Institute (funded largely by oil companies and conservative think tanks) had offered $10,000 plus travel expenses to scientists who would write negatively about the IPCC report." The Intergovernmental Panel on Climate Change provides an abundance of solid information that journalists, government officials, and citizens alike can look to for guidance in what to believe about climate change. The propaganda machine of the self-interested oil and coal companies and the conservative think tanks and media outlets can easily outpace the flow and influence of scientific data, however.

There's an old saying: "if you repeat something often enough it becomes true." Of course, repeating something has no effect on the truth-value of a claim, but familiarity with a claim does increase the chances of its being accepted as true. If propaganda machines are good at anything, it is at getting the same misinformation repeated again and again in various media outlets. Combating the misinformation by providing accurate information may have the undesired effect of strengthening belief in the misinformation. So, what's a fellow to do? It sounds like one of those damned-if-you-do, damned-if-you-don't situations.

The fact is, though, that those of us who do battle with astrologers, birthers, anti-vaxxers, homeopaths, Holocaust deniers, evolution deniers, climate change deniers, and the like aren't really hoping to change the minds and hearts of those we confront. We're hoping that many of those who read our arguments, attend our debates, hear our presentations, or watch our videos are not fully committed to the beliefs of those we challenge. The hope is that among the bystanders, the audience, the viewers, and the readers there will be many who will be influenced by the information and arguments we provide. The goal of combating the tobacco companies' propaganda about smoking, for example, was not to change the minds of tobacco executives, but to provide good information that the general public, public health officials, and even politicians, might consider in making decisions about smoking and regulations governing the sale of tobacco products. The goal of debunking the myth that the world will end in 2012 as predicted by the Maya centuries ago is not to change the minds of those writing books or posting on websites promoting this ridiculous idea. The goal is to provide some counterpoints to the myth mongers that might alleviate some of the unnecessary fear and anxiety they've created.

Are we justified in believing that we can influence some people by providing good arguments for accepting evolution, anthropogenic climate change, science-based medicine, etc.? If we're not, then we may as well abandon education altogether. People do change their minds about many things and all of us have learned new things from time to time. Still, it is worthwhile to know that our brains may not be the unbiased truth-seeking missiles we imagine them to be. Knowing our own weaknesses can help us in our attempts at persuading others. For example, studies have shown that people are more receptive to ideas that conflict with their worldviews when they're in a good mood. We might try priming a hostile audience with some affect bias by getting them to think about how wonderful they are or about some time in the past when they felt really great because they acted on a value that was important to them (self-affirmation). Once they're feeling good about themselves maybe they won't feel so threatened by information that conflicts with their beliefs. This might work, but I wouldn't count on it.

Monday, February 13, 2012

argumentum ad ignorantiam (argument to ignorance)

The expression argumentum ad ignorantiam (usually translated from the Latin as argument to ignorance) was apparently first used by the philosopher John Locke (1632-1704) to describe a debater's tactic:
Locke described the argumentum ad ignorantiam as a way that 'men ordinarily use to drive others and force them to submit their judgments and receive the opinion in debate.' Locke defined this type of argument as the kind of move where one party in such a debate requires the other party to admit what the first party alleges as a proof or assign a better. In other words, what the arguer is saying is, 'I offered you what I think constitutes a proof, so we have to tentatively accept it unless you can offer a proof to the contrary.' In other words, the arguer is saying he has a right to put this proposition forward as a judgment that both parties should receive or accept, at least tentatively, until the other party can disprove it, or put some proposition in its place that is proved. (Douglas Walton)
That said, the expression argumentum ad ignorantiam has morphed to mean something very different from what Locke intended. Some of its various uses can be seen in the ways it has been translated into English: argument to ignorance, argument from ignorance, and appeal to ignorance. One will also find closely related discussions regarding the evidence of absence and the absence of evidence. Also, some writers have associated the argumentum ad ignorantiam with the idea of proving, or not being able to prove, a negative.

Many logic texts list the argumentum ad ignorantiam as a fallacy of reasoning. Examples vary, but some of the more popular ones refer to Sen. Joseph McCarthy's justifying a name remaining on a list of suspected Communists because "there is nothing in the files to disprove his Communist connections." I used to call this the "Mike Wallace fallacy" when I was teaching logic courses; I named it after a tactic Mr. Wallace frequently used on "60 Minutes." He would show up unannounced, confront a surprised person with accusations of some sort of wrongdoing, and then the scene would cut to a slamming door or a grainy film of a car driving out of a parking lot. Wallace would then announce something to the effect of: Mr. X refuses to answer our questions and still has not shown any signs that he is innocent of the charges we've made. It should be obvious that not having proof that someone is not a Communist is not proof that he is, and that not defending yourself against charges is not the same as admitting they are true.

Another common example given in textbooks is from the Salem witch trials of 1692, in which some of those testifying claimed that they could see specters or auras around the accused, though these specters were visible only to the witnesses. Such claims are impossible to disprove. They're in the same class as the claims of mediums who say they are getting messages from the dead. One would assume that a reasonable person would require more evidence than just the word of a witness or medium when judging either the cause of the perception or the veracity of the sensations reported. Furthermore, the fact that an accused witch could not prove that she didn't have a demon's specter around her, or that a skeptic cannot prove that John Edward is not getting messages from someone's Aunt Sadie, does not imply that the accused is a witch or that Edward is really psychic.

I remember, and hope I am remembering accurately, a televised speech in which Ronald Reagan defended the notion that a fetus is a person by noting that scientists haven't proved that the fetus isn't a person. It is true that scientists haven't proved that a fetus isn't a person, but being a person in this context is not a matter of discovery but of definition. There is no imaginable discovery any scientist could make that would be proof that a fetus is a person, so it is irrelevant to the issue to point out that scientists haven't proved that fetuses are not persons. The U.S. Supreme Court has ruled that corporations are persons. One day, perhaps, dolphins and chimpanzees will be declared persons by lawmakers somewhere. You can't turn a corporation into a biological human being by definition, but you can put them both in the class of persons by definition.

Clearly, there are times when not knowing whether something exists does not mean that it does not exist. The fact that the U.S. and other international agents did not discover weapons of mass destruction in Iraq before President George W. Bush ordered the invasion of that country did not prove that there were no such weapons in Iraq. Now, several years after the invasion, with plenty of time having passed to locate such weapons, it seems highly unlikely that Iraq possessed them.

Given all the time there has been to find evidence for Biblical stories like the universal flood that the god of the Hebrews allegedly inflicted on creation, it is reasonable to reject the story as a myth. Scientists know what kinds of evidence there should be on the planet had such a universal flood ever occurred. The lack of such evidence, and the appeal to such things as the Grand Canyon as evidence of The Flood, make belief in this story rather absurd. To appeal to miracles or divine intervention to clean up the evidence just makes the belief even less defensible. Such appeals are clearly ad hoc and have no basis in reality. Nobody can prove that The Flood didn't occur, but no reasonable person can believe that it did without giving up the basis of reasonable belief: considering all the available evidence rather than speculating about miracles and question-begging interventions from supernatural forces. A common tactic of Bible defenders who are challenged to provide positive evidence for some story or belief is to respond by trying to shift the burden of proof, challenging an opponent to prove the supernatural speculations wrong. Here it seems appropriate to bring in Bertrand Russell's famous illustration involving a celestial teapot:
If I were to suggest that between the Earth and Mars there is a china teapot revolving about the sun in an elliptical orbit, nobody would be able to disprove my assertion provided I were careful to add that the teapot is too small to be revealed even by our most powerful telescopes.

But if I were to go on to say that, since my assertion cannot be disproved, it is an intolerable presumption on the part of human reason to doubt it, I should rightly be thought to be talking nonsense.

If, however, the existence of such a teapot were affirmed in ancient books, taught as the sacred truth every Sunday, and instilled into the minds of children at school, hesitation to believe in its existence would become a mark of eccentricity and entitle the doubter to the attentions of the psychiatrist in an enlightened age or of the Inquisitor in an earlier time.

Monday, February 6, 2012

priming effect

The priming effect is a biasing effect on judgment or action caused by the cognitive meaning or emotive aura of memories, words, images, or symbols. Most of us have had an experience where we misheard some words in a song, a prayer, or a pledge and then continued to mishear the same words--sometimes for years--until somebody corrected us. We might call such cases examples of self-priming. (This kind of mishearing is called a mondegreen.) Another example of priming comes from backmasking: what at first sounds like gibberish becomes a clear message after somebody tells you what to listen for. A third example involves allegedly outraged parents and a talking doll, the "Little Mommy Real Loving Baby Cuddle and Coo" doll from Fisher-Price. Some folks swear the doll mumbles "Satan is king" and "Islam is the light." Some might even hear "Palin is a terrorist who is perpetrating voter fraud" once they're told that's what the doll is saying.

A person's prejudices, preoccupations, or vital interests might prime one to mishear or misread words. For example, an evolutionary biologist might misread a headline in a magazine article as saying that Charles Darwin committed fraud when the headline actually says that Charles Dawson (of Piltdown infamy) was the miscreant. Because the headline would strike the scientist as false, however, a quick review would probably correct the misreading. In most cases of priming, however, we are unaware of the influence. Many studies have demonstrated that we are influenced in our judgments and actions both by words themselves and by the order in which words, images, or statements are presented to us or which present themselves to us naturally.

Just hearing someone utter the word 'beautiful' before you view a sunset or a work of art may influence both your judgment and the speed with which you make the judgment. Psi researcher Daryl Bem tested for precognition by modifying a standard test of priming. Instead of showing subjects a word like 'ugly' or 'beautiful' before they viewed a picture of something like a sunset or a sex act and then testing how long it takes to respond either favorably or unfavorably to the picture, Bem showed the picture first, measured response time, and then showed the "priming word."
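To make the design change concrete, here is a minimal sketch in Python. The show and simulated_response_time helpers are invented stand-ins for real experiment software, and the timing values are randomly simulated; none of this is Bem's actual code or data, only an illustration of the reversed trial order.

```python
import random

# Invented stand-ins for real experiment software; they only simulate
# a display call and a response-time measurement.
def show(stimulus):
    pass  # a real experiment would draw the stimulus on screen

def simulated_response_time():
    return random.gauss(0.7, 0.1)  # seconds; purely illustrative

def standard_priming_trial(picture, prime_word):
    """Classic order: prime word first, then picture, then timed judgment."""
    show(prime_word)
    show(picture)
    return simulated_response_time()

def bem_reversed_trial(picture, prime_word):
    """Bem's reversal: picture and timed judgment first, 'prime' word after."""
    show(picture)
    rt = simulated_response_time()
    show(prime_word)  # the prime arrives only after the response is recorded
    return rt

if __name__ == "__main__":
    rt1 = standard_priming_trial("sunset.jpg", "beautiful")
    rt2 = bem_reversed_trial("sunset.jpg", "beautiful")
    print(f"standard trial RT: {rt1:.3f}s, reversed trial RT: {rt2:.3f}s")
```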

Sometimes we see or hear things without being conscious of seeing or hearing them. Evidence of unconscious perception may become clear at a later time. For example, a person may go many years without understanding why seeing a road sign with the words “hidden meadow” in it produces sexual arousal. Then, one day she returns to a place she hadn’t been in many years. She remembers that this was where she met her first lover and the place is called Hidden Meadow.

The priming effect is evident in the unconscious influence of beliefs on actions, such as the hearing of intelligible speech by bird owners and devotees of EVP, and the ideomotor effect on dowsers, Ouija board users, table tilters in séances, assistants in facilitated communication, subjects of hypnotic suggestion, and both parties in applied kinesiology.

Priming has been shown to be powerful enough to create false memories. Priming is especially problematic in hypnotherapy. Many hypnotherapists seem unaware that they are priming their patients. The dangers of this practice are stated by Martin Orne: "The cues as to what is expected may be unwittingly communicated before or during the hypnotic procedure, either by the hypnotist or by someone else, for example, a previous subject, a story, a movie, a stage show, etc. Further, the nature of these cues may be quite obscure to the hypnotist, to the subject, and even to the trained observer."

The priming effect is also evident in the unconscious influence of symbols and metaphors, as Sigmund Freud noted long ago. There is a reason that presidents pose for photos while sitting at a desk with a library of books in the background, guarded by a hanging American flag and fronted by a family photo. A recent study found that a person will usually vote more conservatively if he or she votes or completes a survey in or near a church. "These same voters are also more negative toward non-Christians, as compared to people who vote or answer polls near government or non-Christian buildings." Also:
A study of voting patterns in precincts of Arizona in 2000 showed that the support for propositions to increase the funding of schools was significantly greater when the polling station was in a school than when it was in a nearby location. A separate experiment showed that exposing people to images of classrooms and school lockers also increased the tendency of participants to support a school initiative. The effect of the images was larger than the difference between parents and other voters! (Daniel Kahneman, Thinking, Fast and Slow, p. 55, Macmillan, Kindle Edition.)

It's easy to understand why a person won't pick up a wallet with a red circle drawn around it, or how cameras mounted on traffic signals would affect the number of drivers who run red lights. It is not so obvious why putting a poster of two eyes looking down at you above an "honesty box" for dropping in money to cover the cost of tea or coffee taken would increase the amount of money collected, compared with the amount collected in the same office over the same length of time when there was no poster.

Pollsters know, or should know, that they will get different results from a random sample of adults depending on whether they ask about support for "affirmative action" or for "preferential treatment of underrepresented groups." Differences in opinion will also occur if the question is put negatively rather than positively. Apparently, opposing something is not understood as the direct opposite of supporting something.

Pollsters know, or should know, that how people answer a question is affected by what question or questions were asked previously. That is why professional pollsters should, and usually do, ask everyone in the sample the same questions, but ask them in different orders to different segments of those polled.
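As a rough illustration of how such question-order rotation might be set up, here is a minimal sketch in Python; the question wording, respondent IDs, and function names are invented for illustration and are not drawn from any pollster's actual toolkit.

```python
# A toy sketch of question-order rotation (counterbalancing).
# Every respondent gets the same questions, but the starting point
# rotates so that no single ordering dominates the sample.

QUESTIONS = [
    "Do you approve of the president's handling of the economy?",
    "Do you approve of the president's performance overall?",
    "Do you support increased funding for public schools?",
]

def assign_orders(respondent_ids, questions):
    """Map each respondent to the same questions in a rotated order."""
    orders = {}
    for i, rid in enumerate(respondent_ids):
        shift = i % len(questions)
        orders[rid] = questions[shift:] + questions[:shift]
    return orders

if __name__ == "__main__":
    sample = [f"respondent_{n:02d}" for n in range(6)]
    for rid, qs in assign_orders(sample, QUESTIONS).items():
        print(rid, "starts with:", qs[0])
```

Averaging results across the rotated segments lets order effects cancel out rather than bias the overall numbers.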