SKINNING A CAT FROM THE INSIDE OUT: DIFFERENT APPROACHES TO PREDICTIVE ANALYSIS
A post at the Armchair Generalist and a new article up at Scientific American set up an excellent juxtaposition of analytical methodologies for predicting potential outcomes.
The Armchair Generalist was reviewing a laudatory article about INR, the State Department’s Bureau of Intelligence and Research. At the conclusion of his post, AG asks: “It's an interesting comparison between the CIA and the INR, and it makes you wonder why the CIA persists in hiring young (and cheap) analysts without much foreign service or technical experience, as opposed to the INR's efforts to recruit and retain seasoned veterans.”
The virtues of the INR type of analyst – and they are considerable in my view – are also a reason for the CIA to recruit and train analysts of a different kind (though moving CIA analysts around to different fields as frequently as the article indicates is not a requirement for this different perspective). INR analytical expertise epitomizes the field-depth, vertical-thinking model.
An INR specialist who has mastered the language and spent considerable time “in-country” in the Foreign Service – enough time among ordinary folk as well as elites to internalize some aspects of their cultural rule-set – will have several advantages. First, true language fluency shapes thought and gives the analyst insight into the mental architecture that frames the perspective of the target state’s decision makers; stepping into their shoes becomes easier when faced with incomplete information. Second, in terms of pattern recognition, fitting new data into the mosaic is easier because the INR analyst has a greater sum total of the mosaic in their head – something one of my old profs liked to call the advantage of having “a bigger cognitive map”. Or, to excerpt from the article: “And while the CIA's young analysts occasionally travel to their countries of responsibility and bone up by reading at their desk, they have little first-hand experience of their regions. INR couldn't be more different. Among the civil servants who make up two-thirds of its staff are many scholars lured out of the academy who come with years of knowledge. Fingar is one of them: He spent a decade-and-a-half as a scholar at Stanford's U.S.-China relations program, speaks fluent Mandarin, and has traveled widely in China. The other third of INR's staff are Foreign Service officers rotating through who usually have spent several diplomatic tours in the country or region they are focusing on at INR, and who thus have both a reservoir of knowledge about its personalities and history, and a deep well of personal contacts.”
The drawback, of course, to relying solely upon the intuitive judgment calls of experts is “educated incapacity”, where “…The more expert—or at least the more educated—a person is, the less likely that person is to see a solution when it is not within the framework in which he or she was taught to think. When a possibility comes up that is ruled out by the accepted framework, an expert—or well-educated individual—is often less likely to see it than an amateur without the confining framework”. This intrinsic blind spot may explain why a well-designed Bayesian probability analysis often proves more accurate in predicting outcomes than an expert’s forecast. There is also the danger of an expert “buying in” to the status quo upon which their expertise is based and reacting with hostility to a hypothesis or a strategy that contemplates radical change – case in point, the Sovietologists on both ends of the political spectrum who missed seeing the USSR’s imminent collapse. This is one reason to train CIA analysts differently, so that their biases are not in sync with those of the folks at INR.
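The mechanics behind that Bayesian point are simple enough to sketch. Here is a minimal illustration in Python, with entirely invented numbers: an analyst starts from a low prior (the expert framework says collapse is very unlikely) and mechanically revises it as indicators arrive that are each somewhat more likely under collapse than under stability. The scenario, the indicators, and the probabilities are all hypothetical.

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H | E) via Bayes' rule."""
    numerator = p_e_given_h * prior
    denominator = numerator + p_e_given_not_h * (1 - prior)
    return numerator / denominator

# Hypothetical prior: the accepted framework says collapse is very unlikely.
p = 0.05

# Three hypothetical indicators, each given as
# (P(indicator | collapse), P(indicator | no collapse)).
indicators = [(0.7, 0.3), (0.6, 0.2), (0.8, 0.4)]

for likelihood_h, likelihood_not_h in indicators:
    p = update(p, likelihood_h, likelihood_not_h)

print(round(p, 3))  # -> 0.424
```

The point of the exercise: the arithmetic forces the 5% prior up to roughly 42% after three moderately diagnostic indicators, whereas an expert anchored to the framework that produced the prior may simply dismiss each indicator as noise.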
This brings me to an article up at Scientific American advocating a flexible combination of scenarios and computer modeling as an analytical base for dealing with a concept they call “Deep Uncertainty”. The authors are geared toward climate modeling, but the premise would apply equally well to complex human systems (assuming, of course, that the mathematicians and software designers were to get cracking): “The three of us--an economist, a physicist and a computer scientist all working in RAND's Pardee Center--have been fundamentally rethinking the role of analysis. We have constructed rigorous, systematic methods for dealing with deep uncertainty. The basic idea is to liberate ourselves from the need for precise prediction by using the computer to help frame strategies that work well over a very wide range of plausible futures. Rather than seeking to eliminate uncertainty, we highlight it and then find ways to manage it… Our approach is to look not for optimal strategies but for robust ones. A robust strategy performs well when compared with the alternatives across a wide range of plausible futures. It need not be the optimal strategy in any future; it will, however, yield satisfactory outcomes in both easy-to-envision futures and hard-to-anticipate contingencies… In contrast, for robust decision making the computer is integral to the reasoning process. It stress-tests candidate strategies, searching for plausible scenarios that could defeat them. Robust decision making interactively combines the complementary abilities of humans and machines. People excel at seeking patterns, drawing inferences and framing new questions. But they can fail to recognize inconvenient facts and can lose track of how long chains of causes relate to effects. The machine ensures that all claims about strategies are consistent with the data and can reveal scenarios that challenge people's cherished assumptions.
No strategy is completely immune to uncertainty, but the computer helps decision makers exploit whatever information they do have to make choices that can endure a wide range of trends and surprises.”
This approach would seem to be, at a minimum, complementary to both the expert-driven analysis at INR and the multi-field analytical model used at the CIA. It could also greatly enhance strategic planning for initiating or defending against system perturbation.
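The “robust, not optimal” distinction in the RAND excerpt can be sketched in a few lines of Python. The payoff table below is entirely invented for illustration – rows are hypothetical candidate strategies, columns are plausible futures – but it shows the core move: the optimal strategy for any single assumed future differs from the strategy with the best worst case across all of them.

```python
# Hypothetical payoffs for three strategies across three plausible futures.
payoffs = {
    "bet_on_growth":  [9, 2, 1],  # brilliant in future 0, fragile elsewhere
    "bet_on_decline": [1, 2, 9],  # the mirror image
    "hedge":          [6, 5, 6],  # never the best, never disastrous
}

def optimal_for(future_index):
    """Strategy with the highest payoff in one assumed future."""
    return max(payoffs, key=lambda s: payoffs[s][future_index])

def robust():
    """Strategy with the best worst-case payoff across all futures (maximin)."""
    return max(payoffs, key=lambda s: min(payoffs[s]))

print(optimal_for(0))  # -> bet_on_growth
print(robust())        # -> hedge
```

A real robust-decision-making exercise would stress-test strategies against thousands of computer-generated scenarios rather than three hand-picked columns, and would use richer criteria than maximin (minimax regret, satisficing thresholds), but the logic is the same: search for the strategy that survives the widest range of futures, not the one that wins in a favored forecast.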