
Of Note: Mirror-Imaging and Its Dangers
Lauren Witlin

Of the obstacles the intelligence community must overcome, mirror-imaging is among the most challenging. Mirror-imaging occurs when an analyst perceives and processes information through the filter of personal experience.1 It imposes the analyst's own perspective and cultural background on incomplete data, undermining objectivity. Because objectivity is a key component of intelligence analysis, mirror-imaging impedes efforts to reach accurate judgments from incomplete information.

Mirror-imaging frequently results in gross distortions of intelligence and raw data, forcing the information to fit a framework for which it may not be suited. This forced fit has led to massive oversights and poor planning in the face of national security threats, such as those related to the attacks on Pearl Harbor and the World Trade Center.2 Part of the problem with mirror-imaging is its prevalence throughout the intelligence community and the very ease of its use. Indeed, when confronted with a new situation or challenging data, an analyst may feel a strong temptation to resort to what is familiar.

With a greater share of the world embracing Western culture in the form of media, technology, and even government, it is more tempting than ever for intelligence analysts to rely on mirror-imaging, "estimating the risk-benefit calculations of a foreign government or non-state group based on what would make sense in a US or Western Europe context."3 Analytical decisions based on this assumption, however, are both shortsighted and unsound. The problem of mirror-imaging extends deep into intelligence history, and hindsight allows detailed analysis of how it might be prevented in the future.

The Japanese attack on Pearl Harbor reflects the fallacy of decisions based upon mirror-imaging. According to Douglas Porch and James Wirtz, "It seemed inconceivable to the U.S. planners in 1941 that the Japanese would be so foolish to attack a power whose resources so exceeded those of Japan, thus virtually guaranteeing defeat."4 The failure of US planners to think beyond their own cultural experiences and reasoning left open the possibility of a surprise attack by the Japanese, who evidently held different perceptions of the strategic value and repercussions of the Pearl Harbor assault. In "15 Axioms for Intelligence Analysts," Frank Watanabe of the Directorate of Intelligence warns that simply because something seems logical to an analyst does not mean that the subject being analyzed will see it that way—especially when differences in thought processes and beliefs are factored into the equation.5

Consider the insurgency in Iraq, which poses a great challenge for US forces. Troops are no longer faced with straightforward, conventional military tactics. The enemy is not organized in a traditional military structure, but rather in a complex, loose network of combatants. Their tactics are not conventional but asymmetrical, relying heavily on car bombs and suicide attacks. Finally, their weapons are not tanks and fighter planes, but improvised explosive devices (IEDs).6 All of these components indicate that the insurgency in Iraq should not be subjected to mirror-imaging, yet in the face of this distinct situation, analysts may still rely on their prior experience and personal reasoning to draw conclusions.

Fortunately, the intelligence community is aware of the dangers of mirror-imaging. Intelligence analysts have cited a variety of traps that can cause an analyst to fall back on mirror-imaging. One trap is a simple lack of data. When an analyst lacks information, he or she is more likely to look within his or her own experiences to fill in any informational gaps. Another trap is a lack of training.7 Training is crucial for ensuring proper analysis of different forms of raw data.8

Finally, the most effective way to combat mirror-imaging is to supplement an analyst's personal experience. Robert Steele, a former CIA officer, notes that "the average analyst [in the CIA] has 2 to 5 years experience. They haven't been to the countries they're analyzing."9 Naturally, it may not always be possible to have analysts with extensive experience in the region to which they are assigned...
