An Inconceivable Exposure to Risk

By Sherry Harowitz

Some people lump all of Wall Street’s current problems together and collectively blame greed and malfeasance. Doing so obscures lessons that can be learned from honest, but mistaken, business strategies. Bernard Madoff’s Ponzi scheme was simple fraud, but insurance giant AIG’s downfall had its roots in flawed risk analysis.

At AIG and other firms, financial experts genuinely believed they were smart enough to eliminate risk altogether. That they might miscalculate risk so badly that it could bring down the firm was, to them, inconceivable.

What led to this overconfidence was something approaching a religious faith in the infallibility of number crunching, what Wall Street calls quantitative analysis. All of Wall Street—and much of Main Street—have come to believe in the math path to enlightenment. Ian Ayres’ book Super Crunchers is something of a paean to the concept that because computers make it possible to analyze larger and larger databases, “thinking by numbers is the new way to be smart.”

Statistical analysis has its place, but the problem with the deification of number crunching is that it makes the experts think they are working with more precision than is possible in the real world. In his book I Am a Strange Loop, Douglas Hofstadter notes, “Things that happen in the mathematical world strike mathematicians as happening without exceptions, for stable, understandable reasons.”

In a Washington Post series on the collapse, AIG executives said they had relied on a computer model “based on years of data,” and some attributed the failure to the firm’s drift away from strict adherence to a culture of “assessing data daily…counterbalancing one risk against another and making hedges.”

They are missing the real lesson: the world is too complex for anyone to calculate risk infallibly. No level of quantitative analysis can anticipate every contingency. That’s why, despite increasingly large historical datasets, we still can’t accurately predict the weather, and why even companies with the brightest analysts will always be at the mercy of events that break from what was presumed to be the stable, understandable pattern.

In the movie The Princess Bride, the evil mastermind dismisses each warning of a bad event, which then comes to pass, with a single word: “Inconceivable!” Finally, his exasperated sidekick tells him, “You keep using that word. I do not think it means what you think it means.”

Studies show that statistical models do beat human experts at predicting outcomes, but they still get the answer wrong about 25 percent of the time. So to those who think quantitative analysis means taking risk out of the equation, I say: I do not think it means what you think it means.
