NEWS

The Road to Rational Decision-Making

By John Barham, senior editor

Most of us are lousy decision-makers, and unfortunately many of the decisions we take, as individuals or as a society, can be both badly wrong and enormously costly. But there are many tried and tested ways of making relatively sound decisions. Cost-benefit analysis, risk management, and related disciplines that draw on statistics, economics, the law, and psychology allow us to weigh choices rationally and improve our odds of choosing wisely.

Cass Sunstein, a prolific University of Chicago law professor, has a new book (his 32nd) on the topic, called Worst-Case Scenarios. In it, he explains why our intuitive gut reactions are usually wrong and how we can instead control our emotions and use our brains to make decisions. He makes a passionate argument for reintroducing rationality into the policy process and into our everyday lives.

Many of his points can be applied to security-related decision-making. Sunstein uses the challenge of global climate change and the threat of terrorism for most of his illustrations, but he also mentions other risks, such as tsunamis, meteor impacts, and avian flu.

Speaking at a Washington, D.C., book launch, Sunstein said humans are victims of three broad types of faulty thinking.

First is availability bias. This refers to our tendency to respond to terrifying but very rare crises with panic, needlessly consuming time, money, and resources.

Next is probability neglect, the reverse of availability bias. Probability neglect describes our habit of ignoring the risk of high-impact, low-probability events. This is why people complacently build houses in flood plains that suffer infrequent but catastrophic floods, or ignore building codes in hurricane zones.
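
The arithmetic behind that complacency is simple expected value. As a rough, hypothetical illustration (the figures are invented here, not taken from Sunstein’s book), suppose a house in a flood plain faces a 1 percent chance each year of a catastrophic flood causing a $300,000 loss:

\[
\text{expected annual loss} = p \times L = 0.01 \times \$300{,}000 = \$3{,}000
\]

Over a 30-year mortgage that adds up to roughly $90,000 in expected losses, so even costly mitigation, such as elevating the structure or buying flood insurance, can be the rational choice. Probability neglect is precisely the failure to run this calculation.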

Finally, we are prone to unreasoning fears. Sunstein calls this the Goldstein Effect. In George Orwell’s 1984, the Party declares Emmanuel Goldstein an Enemy of the People and makes him the target of the mindless Two Minutes Hate ritual. Sunstein says an example of the Goldstein Effect in contemporary America is our irrational rejection of nuclear power.

There is a good case to be made for expanding our use of nuclear power, once we balance the high environmental cost of increased coal consumption against the small risk of a catastrophic nuclear accident. But Sunstein says psychologists have run experiments showing we have a deep, visceral aversion to nuclear power that resists rational analysis. This has made it politically unviable for governments to approve new nuclear power stations and storage sites, at least in the United States.

Sunstein, an expert himself, says we should trust the judgment of experts. But experts need to be independent and competent, with no hidden agendas or conflicts of interest, and that is not a safe assumption by any means. Often the experts don’t even agree among themselves. Sunstein may be right about the reliability of experts in most cases, but it’s a hard sell in a society where citizens suspect, sometimes rightly, that the policy agenda is being shaped by ill-informed media and populist politicians.

To most people, it seems cold and inhuman to allow technocrats to weigh decisions in terms of lives and money.

Danish economist Bjorn Lomborg has become the environmental movement’s Goldstein for questioning some of its most deeply held beliefs. He became notorious for analyzing climate data and running cost-benefit analyses that cast doubt on the wisdom of environmental policies that, he says, will deliver negligible results at a very high cost.

Today’s experts are aided by astonishing levels of computing power. Ian Ayres, a Yale University law professor, has just published a book called Super Crunchers that describes how data mining and sophisticated programming enable computers to make far better decisions than people can in fields that include teaching, medicine, finance, and even the movie business. Sunstein recognizes that expert knowledge has its limits and that all models are imperfect. It is still humans who collect the data, enter it, and set the parameters for its analysis, which means that even computed analyses can be wrong. Call that the garbage in, garbage out rule.

Another problem is what might be called the hammer of accountability: public officials forever worry about how they will be held to account for actions taken in good faith if something goes wrong. For instance, government officials make overcautious decisions about food and drug regulation, since they are penalized for being too lax in managing risk, not for being too strict.

As a society, we are constantly struggling with whether it is better to accept the small cost, in money or lives, of authorizing a life-saving drug quickly, as was done with some early AIDS treatments. But we often err on the side of safety, as when the Food and Drug Administration pulled Vioxx because of its harmful effects on a small percentage of users, or when it raised warnings about the suicide risk of prescribing antidepressants to teenagers. A new report now suggests that this cautious approach had a tangible cost: it led to higher death rates, as anguished teenagers who could not obtain antidepressant medication committed suicide more often than before.

Governments, banks, and corporations use cost-benefit analysis all the time, but they could do a better job of it. One Homeland Security Department official at the Sunstein event said during the Q&A that his department often lacks the time to make a thorough cost-benefit analysis of its actions and policies. Implicit in his statement was the fear that not acting fast enough might cost lives, particularly when society seems willing to bear almost any cost, however irrational, for the sake of security. He is as much a victim of availability bias as the rest of us.
