Monday, November 12, 2012

negativity bias

  "The evil that men do lives after them; The good is oft interred with their bones." --Mark Antony, in Shakespeare's Julius Caesar (act 3, scene 2)
"I hate losing more than I love winning." --Billy Beane
"Brief contact with a cockroach will usually render a delicious meal inedible. The inverse phenomenon—rendering a pile of cockroaches on a platter edible by contact with one's favorite food—is unheard of." So begins a classic paper by Paul Rozin and Edward B. Royzman: "Negativity Bias, Negativity Dominance, and Contagion" (2001).

You might think you're weird when a thousand good things happen but you focus on the one bad thing. You're not. That's the way our brains are hardwired. We're designed by nature to pay more attention and to react more quickly and more strongly to negative news than to positive news. One salient misdeed will often outweigh years of good works: a positive image built up over years can be destroyed in an instant by a single misstep. This tendency to give more weight to the negative is called negativity bias, defined as "the propensity to attend to, learn from, and use negative information far more than positive information." Our brains evolved to react more quickly to fear than to hope, to respond to a threat more quickly and more intensely than to an opportunity for pleasure, and this trait has carried over into modern times in ways that are not always beneficial.

A friend of mine is a headhunter for executives in the field of education. His latest job ended with the Board of a college split between two candidates for the position of president. Their solution? Each Board member would call someone who works with one of the candidates and ask that person questions about the candidate. The Board also plans to ask the former president of the college for his opinion on the finalists. It is very likely that a single negative comment about either candidate will outweigh several positive comments, and that the negative comment will doom one candidate or the other even if it is just a personal opinion or simply untrue (it is unlikely any investigation will be made to verify whatever the Board members are told over the telephone).

Daniel Kahneman writes:
The brains of humans and other animals contain a mechanism that is designed to give priority to bad news. By shaving a few hundredths of a second from the time needed to detect a predator, this circuit improves the animal’s odds of living long enough to reproduce. (Thinking, Fast and Slow, p. 301.)
However, negativity bias makes us vulnerable to manipulation by those who would play on our fears. For example, National Security Advisor Condoleezza Rice may not have had any evidence that Saddam Hussein had weapons of mass destruction, but she could put the fear of god into many people just by warning that, when it came to evidence of Iraqi WMDs, "we don't want the smoking gun to be a mushroom cloud."

Loss aversion is another way that negativity bias manifests itself. Roy Baumeister and his colleagues write in "Bad Is Stronger Than Good":
Bad emotions, bad parents, and bad feedback have more impact than good ones, and bad information is processed more thoroughly than good. The self is more motivated to avoid bad self-definitions than to pursue good ones. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones. (Quoted in Daniel Kahneman's Thinking, Fast and Slow, p. 302.)
Potential losses affect us more deeply than potential gains. This can lead to irrational behavior, as when we pass up an opportunity to benefit, either financially or psychologically, because we are afraid to risk a loss. Long-term investors who put large chunks of their portfolios in bonds are a prime example. Over the past 80 years, stocks have provided a 6.5% annual return (adjusted for inflation), while bonds have returned 0.5% (Lehrer, Jonah. How We Decide, p. 77). Bonds are considered the safer investment because there is a greater chance of losing money in stocks. For many people, the chance of losing money has a larger effect on their decision making than the chance of gaining money by investing in what may be riskier in the short run but more profitable in the long run. Most people, if offered a chance to win $150 or lose $100 on a coin toss, won't take the deal. The potential loss, though substantially less than the potential gain, isn't worth the risk.
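The arithmetic behind these two examples can be checked in a few lines of Python. The figures (the 6.5% and 0.5% inflation-adjusted returns, the 80-year horizon, and the $150/$100 coin toss) come from the passage above; the helper function is just an illustrative sketch, not anything from the cited sources.

```python
# Illustrative sketch: the arithmetic behind the loss-aversion examples above.

def expected_value(p_win: float, gain: float, loss: float) -> float:
    """Net expected value of a bet: win `gain` with probability p_win,
    otherwise lose `loss`."""
    return p_win * gain - (1 - p_win) * loss

# The coin toss most people refuse, despite a positive expected value.
print(expected_value(0.5, 150, 100))  # 25.0 -- a $25 edge per toss

# Growth of $1 over 80 years at the cited inflation-adjusted returns:
# the "safe" choice forgoes roughly a hundredfold difference.
print(round(1.065 ** 80, 2))  # stocks: ~154
print(round(1.005 ** 80, 2))  # bonds:  ~1.49
```

The point of the compounding comparison is that the cost of loss aversion is not a rounding error: over a long horizon, the same dollar grows to about $154 in stocks but only about $1.49 in bonds at those rates.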

Loss aversion may explain why Pascal's wager seems reasonable to many people. The 17th-century mathematician Blaise Pascal argued that it is wise to believe in the god of Abraham: by not believing you risk losing eternal life, while if this god doesn't exist, what you lose by believing is nothing in comparison. People don't want to take the risk of losing eternal life by not believing; the safe bet is to believe. Of course, if eternal life with this god is not attractive to you, then the potential loss from not believing isn't likely to affect you much. And if you have no fear of hell (i.e., eternal suffering of some sort) because you consider its probability to be near zero, then loss aversion is unlikely to drive you to believe in this god, even though the stake is just your life, i.e., acting as if this god exists. On the other hand, the general principle behind the wager seems sensible: only a fool wouldn't wager next to nothing when the prize, if you win, is of infinite value. You might not bet $100 for a chance to win $150 on a coin toss, but you would be a fool not to bet $1 on a chance to win, say, $1,000,000 on a coin toss.
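Under a straightforward expected-value calculation (assuming a fair coin), the contrast between the two tosses is stark. This sketch just runs the numbers from the paragraph above; the function name is my own:

```python
# Comparing the two coin tosses mentioned above, assuming a fair coin.

def net_ev(prize: float, stake: float, p_win: float = 0.5) -> float:
    """Net expected value: win `prize` with probability p_win, else lose `stake`."""
    return p_win * prize - (1 - p_win) * stake

print(net_ev(150, 100))      # 25.0 -- positive, yet most people refuse
print(net_ev(1_000_000, 1))  # 499999.5 -- next to nothing risked, enormous prize
```

As the prize-to-stake ratio grows, the expected gain dwarfs the possible loss, which is the intuition behind Pascal's general principle (setting aside whether the probabilities in the religious case are knowable at all).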

One effect of negativity bias is that we are likely to give more credence and more weight to negative claims about positions or candidates that we oppose than we are to positive claims about them. We are unlikely to be very critical in our examination of such negative claims, certainly not as critical as we are when negative claims are made against views we cherish.

Another effect of negativity bias is that we are likely to fear things out of proportion to the evidence. For example, most people who are afraid of flying in airplanes have little fear of riding in an automobile, even though their chance of being killed in an automobile crash is much higher than their chance of being killed in an airplane crash.

Negativity bias manifests itself in various ways involving contagion. For example, a person of the Brahmin (priestly) caste can be sullied by contact with a member of the Shudra (servant) caste, but Shudras are not purified or elevated in status by contact with Brahmins. Rozin and Royzman write:
The contamination often occurs by eating food prepared by a lower caste. On the other hand, when people of lower castes consume foods prepared by higher castes, there is no corresponding elevation in their status. Stevenson summarized this feature of the caste system with the phrase "pollution always overcomes purity" ("Status evaluation in the Hindu caste system." Journal of the Royal Anthropological Institute of Great Britain and Ireland, 84, p. 50).
On the other hand, an argument has been made for a positivity bias in The Pollyanna Principle: Selectivity in Language, Memory, and Thought by Margaret Matlin and David Stang (1978). Matlin and Stang claimed that their research showed that people are more likely to expose themselves to positive stimuli than to avoid negative stimuli, and that they encounter more positive stimuli than negative stimuli,* which is intuitively what you'd expect from us pleasure-seeking animals.

If you are familiar with the Forer effect, you know that people tend to agree with positive statements made about themselves (whether they're true or not), and that such statements are more likely to be accepted than negative statements about themselves. Studies on self-deception consistently find that most people overestimate their possession of positive traits. So, when it comes to evaluating oneself, negativity bias seems to be overpowered by positivity bias.

When it comes to politics, however, negativity bias reigns supreme. The overwhelming appeal in ads for candidates or ballot propositions is to arouse negative feelings about an opponent or a position on an issue. After the last national election, there were several letters to the editor of my local fishwrap decrying the negativity of Republicans, who did much more poorly than they had expected in the election that saw Barack Obama re-elected. (Some cynics might say that Obama beat Romney because Romney aroused slightly more fear in a certain segment of the electorate than Obama did.) One wrote that negativity "dominates the Republican attitude of the 21st century. Everything is bad, everything government touches is poison--there's no positive policy regarding the economy, women's rights, minority rights, environmental issues, etc." On the same day, in an op-ed, Paul Krugman bemoaned the Republican method of using threats as their main negotiating tool regarding the economy: "They're threatening to block any deal on anything unless they get their way." Some even call the Republican Party the "party of 'No.'" Whether these criticisms are true or not, negativity bias is, and probably always will be, the lifeblood of politics.

Finally, I recall my local fishwrap responding to a reader request that it print more good news by agreeing to set aside a segment of the paper for the "Good News" once a week. The only catch was that readers were given the responsibility of finding the good news and reporting it to the Sacramento Bee editors. The column never got off the ground.


  1. In addition to the Forer effect, you could talk about the Dunning-Kruger effect here.

    1. For those who don't know, the Dunning-Kruger effect is the claim that poor performers of some task overestimate their abilities relative to other people and, to a lesser extent, high performers underestimate their abilities. Dunning and Kruger attribute this difference to different abilities to recognize competence or incompetence.

      I think the jury is still out on the explanation.

      There are good data showing that this distinction breaks down for more difficult tasks, i.e., at very difficult tasks the incompetent are as likely to recognize their incompetence as the competent are to recognize their competence.

      There are also other possible explanations, e.g., regression to the mean and positivity bias (the tendency of people to overestimate their possession of positive traits relative to others).