NEWS

Intelligence Chiefs Learn From Experience

By John Barham, International Editor

Security professionals, as well as intelligence officers, have plenty to learn from the debacle over the search for Saddam Hussein’s weapons of mass destruction, as a rare seminar given by two of the men most closely involved in one of America’s greatest intelligence failures made clear.

John E. McLaughlin, a former top Central Intelligence Agency (CIA) official, and Charles A. Duelfer, the former chief U.S. weapons inspector in Iraq, explained at a George Washington University seminar how and why the intelligence community went wrong in Iraq—and what it got right. Their accounts apply as much to sifting through secret intelligence as they do to solving business challenges. Business executives, too, must weigh information, assess risk, understand different cultures, and learn from their own errors and those of others.

McLaughlin says that success or failure in the intelligence world is never “black and white. Sometimes it is hard to characterize [them] clearly and it’s easier to see [them] along a spectrum.” Some successes are easy to identify, such as the 2003 dismantling of the network set up by Pakistan’s A.Q. Khan to supply nuclear technology to Iran, Libya, and North Korea. However, success in intelligence work rarely attracts as much attention as failure does. McLaughlin worked for 30 years at the CIA, where he rose to become acting director in 2004. He says there are five broad types of failure.

The first is a failure of “mind-set,” in which analysts fail to understand events as they unfold, such as the 1979 fall of the Shah of Iran. Intelligence also fails when analysts are victims of deception, as when the Soviet Union fooled the CIA prior to its 1968 invasion of Czechoslovakia. Lack of resources or inattention also leads analysts to misunderstand events, such as the turmoil that led to the fall of Indonesian strongman Suharto in 1998.

Timing is another. Analysts who forecast events correctly will still fail unless they get the timing right. Examples include India’s atomic weapons test in 1998 and the 9-11 attacks: both risks were identified correctly, but analysts were surprised when they materialized. The final type of failure is what McLaughlin terms the failure to persuade. He said “the intelligence community clearly understood that the Soviet Union was about to collapse” but failed “to take people by [the] shoulder and shake them” in Washington. As a result, U.S. leaders were taken by surprise by the speed of the Soviet Union’s implosion in the early 1990s.

The CIA’s record in Iraq contained elements of all five types of failure, but it also got many things right. McLaughlin said these include rejecting the existence of operational ties between Al Qaeda and Saddam’s regime. The intelligence community also did “reasonably” well in forecasting Iraqi conventional weapons capabilities. And it was “eerily prescient” about the consequences of a prolonged occupation and the scope and resiliency of the insurgency.

However, the intelligence community did very poorly in assessing Saddam’s purported WMD stockpiles. McLaughlin says this was due to three factors.

To begin with, the intelligence community lost access to a wealth of raw data from the United Nations when Saddam threw UN inspectors out of the country in 1998. The U.S. also suspended U-2 overflights for fear that Iraqi missiles could hit the spy planes. CIA leadership also diverted scarce resources away from Iraq to other parts of the world where information was even sketchier and risks were equally severe, such as North Korea. Nonetheless, assumptions about Iraq remained in place and went unchallenged even as the intelligence became stale.

In fact, Saddam had dismantled his WMD stockpiles, but he deceived his archenemy Iran into believing that Iraq still retained a strategic weapons program. As a result, the conviction that Iraq’s weapons program remained active was widely shared in both U.S. and foreign intelligence agencies and in the academic world. “Red teaming the data probably would not have helped,” said McLaughlin.

A second problem concerned what McLaughlin calls “policymakers’ passivity.” Because the CIA’s erroneous findings fit their expectations, government officials made little effort to probe or question the intelligence reports. “A healthy dialogue sometimes verging on argument is essential for the relationship” between government and the intelligence community, said McLaughlin. According to him, few people in government or Congress actually read the (admittedly incorrect) 2002 National Intelligence Estimate on Iraq. Although it was wrong about Saddam’s chemical and nuclear weapons, it contained about ten pages of footnotes and dissents that provided “a lot of grist for argument,” said McLaughlin. Officials, he said, “did not absorb it in all its texture.”

Underlying this was a demand from intelligence consumers in the government that findings be cut and dried, stripped of nuance. Officials were too busy to assess reports carefully and demanded definitive “bottom line” findings on which to base their policies. Critics of the CIA and the Bush administration go further, accusing government officials of manipulating intelligence data to support decisions they had already reached. But the intelligence community’s leadership was at the very least guilty of passivity.

Finally, and most famously, the CIA was guilty of errors of analysis and tradecraft. McLaughlin said assumptions “formed in the 1990s were passed along and not questioned.” These faulty assumptions gained a momentum of their own, affecting judgments throughout government. Furthermore, caveats “dropped off” reports over time. The CIA did not vet its sources properly, its record keeping was inadequate, and staff failed to detect Saddam’s deception of Iran and the West.

Once the extent of the WMD intelligence failure became clear in 2003, McLaughlin said, the CIA ordered its best staff to go over their work and they “tore the intel apart.”

The new intelligence paradigm now favors extensive “red teaming,” the use of devil’s advocates to question assumptions, the avoidance of “bottom line” findings, an emphasis on nuance, and the inclusion of alternative hypotheses in reports.

Duelfer, a highly respected intelligence veteran who spent years working in Iraq with the UN, emphasized how cultural subtleties shape assumptions both in the West and in the Middle East. His years in Iraq taught him how our views of the world are shaped by “assumptions that we are not even aware of.” He said: “We do cost-benefit analyses without realizing it.” The West and Saddam were unable to understand each other clearly because their worldviews were so distinct.

He gave an example of the rigidity of U.S. thinking on Iraq. He recalled that a top Saddam aide asked him why the U.S. would attack buildings suspected of being illegal weapons research and storage sites. “The assumption is that we equate buildings with the capability we are trying to degrade,” said Duelfer. “We have a technically driven mind-set,” he added, and “we assume that there is a single objective reality.” Inspectors and intelligence officers focused on locating objective evidence that Saddam’s weapons program remained operational. But Duelfer said it “became abundantly clear that the way Saddam ran his government” relied on “implicit guidance” as much as on explicit written instructions. Definitive evidence confirming or denying the existence of a clandestine weapons program could not be found because no such evidence existed.

Duelfer said confirmation bias, the tendency to discard evidence that conflicts with one’s established outlook, pervaded the West’s approach to Saddam. “Everyone had this hypothesis that Saddam had WMDs and so you organize the data that you understand,” he said. Saddam’s history of deception deepened this erroneous belief. “It led to the view that Iraq was clearly hiding something and not telling the truth. But that did not prove they had WMD. That was the analytical leap that people were taking,” said Duelfer.

As these cultural disconnects became clear to him, Duelfer said that he struggled to explain the apparent irrationality of Saddam’s policies to Americans. “How do we convey something that is not in the lexicon of your consumer?” he said.

The outlook for the future is challenging. While the lessons learned in Iraq have probably improved the process of gathering and analyzing intelligence, the very public discussion of CIA methods has altered the environment in which the agency works. McLaughlin said, “So many of our intelligence methods have been exposed, so we are very vulnerable to deception.” Targets now have a better understanding of how to trick U.S. satellites and manipulate electronic intercepts.

Perhaps the most difficult lesson from the WMD debacle concerns the very nature of intelligence. McLaughlin and Duelfer both emphasized that intelligence is never definitive. “It is mathematically certain that you will make mistakes,” said McLaughlin. Suppliers and consumers of intelligence products both need to be aware of the fragility of the data and be prepared to “think all the time about why [an error] happened and how you can learn from it.”
