Cultural Revolution in Intelligence

The piece below is my contribution to a special report on the revolution in intelligence affairs and was originally published by the International Relations and Security Network. Particularly insightful are the editorial by Kris Wheaton and a topic piece by Ken Egli on the potential role of academia in intelligence collection and analysis.

Cultural Revolution in Intelligence: From Government to Business Enterprise

Earlier this year, the Office of the Director of National Intelligence published a document entitled Vision 2015: A Globally Networked and Integrated Intelligence Enterprise. The first part of this bold intelligence community statement begins with an evaluation of the “shifting strategic landscape,” the defining characteristic of which is said to be uncertainty:

“We live in a dynamic world in which pace, scope, and complexity of change are increasing. The continued march of globalization, the growing number of independent actors, and advancing technology have increased global connectivity, interdependence and complexity, creating greater uncertainties, systemic risk and a less predictable future.”

Uncertainty has become one of the trendier concepts of the past few years and is currently used profusely in the jargon of a variety of disciplines, from intelligence to complexity and network sciences to corporate and risk management. The intelligence community is not the trendsetter. Originally stemming from the physical and natural sciences, the concept of uncertainty has been accompanied in its spread by a homogenized lexicon for talking about “new risks” generated and driven by globalization and network growth, to the point where domains previously falling outside the scope of intelligence and security have been securitized. These domains run the gamut from society and culture to demographics and health to economics and finance to innovation and technology to natural resources and the environment. Regardless of the domain, we now talk about complex adaptive systems, whether we are examining conceptual physical models, biological organisms, tribes and clans, financial markets, terrorist or organized crime networks, or corporate knowledge management.

The list of globalized mashed-up vocabulary is long. It would appear that whichever way we turn, we find researchers, analysts and managers trying to detect emergence patterns, spot uncertain and unstable environments, aggregate and mine various types of data, develop systemic and holistic strategies and approaches, build resilient models, integrate systems within systems, collaborate and share knowledge across domains, form strategic partnerships, build agile infrastructures, transform organizational cultures and mindsets, and win the war for talent.

So how, apart from adopting a new vocabulary, is the intelligence community going to achieve the transformation it so vehemently advocates? How is a largely static government enterprise to turn into a dynamic business enterprise? What is actually happening in the process of transforming the culture and mindset of the intelligence community so it may accomplish its mission to create decision advantage? What kind of education is needed to kick-start the transformation? Is descriptive qualitative analysis obsolete, and should the intuition-led approach be replaced by formal structured methodologies?

Vision 2015 proposes that in order for the intelligence community to transform into an enterprise able to provide decision advantage to policymakers, it must transform from a government enterprise into a “globally networked and integrated intelligence enterprise.” In other words, the intelligence community must start thinking and acting like a business. How well does the business metaphor hold in the government/national security context?

Government, critics of the business analogy have argued, is not comparable to business because it cannot be responsive to market forces since it has a higher purpose: public welfare. These critics also see the competitive advantage of intelligence in the community’s ability to “steal secrets”, which further implies a stronger emphasis on collection over analysis. Such an argument epitomizes the mentality and culture that the new vision is trying to counter. It is a snapshot, a still life if you will, of the Cold War mindset as to what characterizes intelligence. This mindset envisages a centralized national customer, promotes the obsession with secrecy, places value on the finished intelligence product rather than the process of intelligence, and treats flexibility as a foreign word.

Applying a business metaphor to intelligence processes in the national security context is not only valid; it is highly desirable. What the market is to business, international relations is to government. Are we to believe that government should not pay attention to the forces driving developments in the international arena and respond accordingly? Where once, before globalization, particular domains were immune to changes outside their immediate environment and cause-effect analysis had a more linear dimension, the interconnectedness and resulting complexity of drivers cutting across disciplines now call for non-linear approaches to both collection and analysis.

For at least two decades now, it has been widely acknowledged that the so-called intelligence cycle (the process of collection, analysis and dissemination) is an idealized Platonic model that is not only obsolete in today’s environment but also dangerous and misleading. The first step toward transforming the intelligence community from a creeping and decrepit government apparatus into a dynamic enterprise is providing whatever education is necessary to curb the old mindset. Business and national security intelligence share the same strategic objectives: avoid surprises, identify threats and opportunities, gain competitive advantage by decreasing reaction time, and improve long- and short-term planning. With this in mind, the intelligence community should most certainly be responsive to market forces. It should allow for the formation and dismantlement of processes on a need basis. If a process is recognized to be “unprofitable,” it should not be allowed to drag on for decades simply because government institutions have a “higher calling”!

Vision 2015 recognizes that the most difficult part of implementing the envisaged transformation is cultural change:

“The first and most significant impediment to implementation is internal and cultural: we are challenging an operating model that worked, and proponents of that model will resist change on the basis that it is unnecessary, risky, or faddish.”

Yet the real challenge of transforming the culture lies neither at the top (the Cold War veterans of the intelligence community who, by the sheer force of nature, are on their way out) nor at the bottom (the fresh-out-of-college Generation Y recruits who may have the “right” attitude and ideas but too little real-world experience to know how best to apply them). The challenge lies in the lack of mid-level leadership, as this is the level at which bottom-up ideas are filtered to form strategic direction at the top and win buy-in from the customer. Inability to recruit and sustain competent middle management will translate either into empty rhetoric and a hodge-podge of recycled vocabulary, or into stagnation, lack of flexibility, and death by a thousand paper cuts.

If the intelligence community is serious about winning “the war for talent” (an expression around which its human capital strategy is centered), it should aim at developing its mid-level capabilities. “Investing in our people” is a nice-enough-sounding cliché. It does not, however, mean merely ensuring competitive compensation and benefits, because in the war for talent there will always be someone ready to offer a bigger, better, more competitive package. Adequate compensation should not be a strategic human capital goal; it should be a given. Strategically speaking, investing in people should translate into offering them the opportunity to grow their potential through continuous learning, which in turn will increase their sense of ownership and loyalty. True, one can change a culture by throwing money at it, but the resulting culture is hardly the type likely to live up to the values set out in Vision 2015: commitment, courage, and collaboration.

Winning the war for talent is not a silver bullet for a successful cultural transformation. If we think of information as the currency of the world of intelligence affairs, then surely we must observe fluctuations in this currency as the external environment changes. The relative scarcity of information during the Cold War era put a high price tag on information. Not only was there far less information available compared to today’s web- and telecommunications-networked world, but this information was collected secretly by means of human intelligence (HUMINT). Hence the culture in which the intelligence community operated was one that, first, placed far greater emphasis on collection than on analysis and, second, created a glamorous, cult-like image of secrecy.

The “information tsunami,” as information overload is figuratively called, together with the proliferation of telecommunications and media technology, has clearly devalued not only information as a currency but also its “secret” attribute, thereby shifting the emphasis from collection to analysis. More value is now placed on sorting relevant information from the ubiquitous noise, which has created a grey area somewhere between collection and analysis, namely synthesis. Yet synthesis is no new fad. It is an analytic process that every person in academia, from freshman to graduate researcher to established professor, engages in daily. While some more progressive elements of the intelligence community have supported “outsourcing” the synthesis of open source information (the most voluminous type of information) to knowledge workers outside the community, be it academic institutions, think tanks or, in some ultra-progressive cases, crowdsourcing, such initiatives still number in the single digits.

There is some evidence of cultural change in the intelligence community’s acknowledgement of the value of open source intelligence (OSINT), such as the creation of the Open Source Center at the Office of the Director of National Intelligence (ODNI) and the ODNI-sponsored Open Source conferences in 2007 and 2008, which served as outreach activities bringing together intelligence professionals, academic institutions, think tanks, private sector intelligence providers and the media. Nevertheless, a successful cultural transformation from obsession with classified information to wide use (not just acknowledgement) of OSINT has not been achieved.

Adaptability is one of the key design principles upon which Vision 2015 rests, and the document duly declares: “The keys to adaptability are active engagement and openness to outside ideas and influences.” Yet the implementation plan fails to mention either OSINT exploitation or openness to collaboration and contribution by non-community members, such as think tanks and academia, where a large volume of vetted OSINT resides. Failure to take actionable steps in this regard will not serve the community well in its attempts at cultural transformation. Promoting ideas without an actionable plan is like taking one step forward and two steps back; worse, it creates a “cry-wolf” image.

All that said, it should be acknowledged that the United States is a pioneer in promoting the use of OSINT among intelligence professionals. The OSINT discussion at the EU level is lagging behind. As for countries with an alternative understanding of democracy, transparency and accountability, such a discussion is not only non-existent but very likely provokes ripples of cynical laughter in the midst of planning the next black PR campaign.

Another due acknowledgement in this discussion is that cultural transformation rarely occurs with the swipe of a blade; it undergoes various phases over a period of time. Following a re-evaluation of the definition of intelligence in the post-Cold War environment, of the type of human capital the community wants to attract and retain, and a makeover of inward- and outward-looking operating models comes a re-evaluation of what constitutes a quality intelligence product and the development of quality benchmarks. In this respect, Vision 2015 provides a bullet point under the section on adaptability actions, which reads as follows:

• Build the organic capability to conduct exercises and modeling and simulations throughout our processes (e.g., analytics, collection, mission management, etc.) to innovate and test new concepts and technologies.

For the reader unfamiliar with the intelligence community’s internal debates, the above provision might sound somewhat surprising. What? Doesn’t the community already have such capabilities? Aren’t collection and analysis done according to structured methodologies? Stephen Marrin, a CIA analyst from 1996 to 2000, reveals a different picture. In an article for the American Intelligence Journal (Summer 2007), he clearly outlines what is known in the community as the “intuition vs. structured methods” debate:

“Even though there are over 200 analytic methods that intelligence analysts could choose from, the intelligence analysis process frequently involves intuition rather than structured methods. As someone who worked at the CIA from 1996 to 2000, I possess firsthand knowledge of the kind of analytic approaches used at the time. While I was there, the reigning analytic paradigm was based on generalized intuition; an analyst would read a lot, come up with some analytic judgment, and send that judgment up the line without much focus on either the process involved in coming to that judgment, or making that process transparent to others. No one I knew – except for maybe the economic analysts – used any form of structured analytic process that was transparent to others. No quantitative methods; no special software; no analysis of competing hypotheses; not even link charts.”

For the sake of clarity, it should be said that “intuition” is meant here not in the sense of some extrasensory paranormal activity. It simply refers to arriving at a judgment by means of extensive experience in a way that cannot be clearly demonstrated. Another word commonly used to describe this process is heuristics, or rules of thumb. The preference of old-school intelligence analysts for intuition over structured methodologies stems from the historical Cold War mindset described above, and the reasons for its perpetuation are to be found in…human nature.
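
To make the contrast concrete, below is a minimal sketch (in Python) of one structured method, Heuer’s analysis of competing hypotheses (ACH), the very method Marrin notes was absent. The hypotheses, evidence items and ratings are entirely hypothetical, invented for illustration; the point is only to show how a structured method makes the path to a judgment transparent and contestable in a way intuition is not.

# A toy ACH matrix: each piece of evidence is rated against each hypothesis.
# "C" = consistent, "I" = inconsistent. All entries are invented for illustration.
matrix = {
    "E1: large-scale troop movements":    {"H1: exercise": "C", "H2: attack": "C"},
    "E2: no medical/logistics buildup":   {"H1: exercise": "C", "H2: attack": "I"},
    "E3: state media rhetoric unchanged": {"H1: exercise": "C", "H2: attack": "I"},
}

def inconsistency_scores(matrix):
    """Heuer's rule: favor the hypothesis with the LEAST evidence against it,
    not the one with the most evidence for it."""
    scores = {}
    for ratings in matrix.values():
        for hypothesis, rating in ratings.items():
            scores[hypothesis] = scores.get(hypothesis, 0) + (rating == "I")
    return scores

def diagnostic_evidence(matrix):
    """Evidence rated identically across all hypotheses (E1 above) has no
    diagnostic value and can be set aside."""
    return [e for e, ratings in matrix.items() if len(set(ratings.values())) > 1]

print(inconsistency_scores(matrix))  # {'H1: exercise': 0, 'H2: attack': 2}
print(diagnostic_evidence(matrix))   # E2 and E3 discriminate; E1 does not

Whatever one makes of the toy arithmetic, the matrix leaves an audit trail: a colleague can challenge any individual rating, which is precisely the transparency Marrin found missing.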

During the Cold War, the intelligence community operated in an environment characterized by opposing ideologies, and the bulk of analysis was political: situation assessments, profiling of political leaders, and the like. To attempt to quantify such analysis would rightly be considered pseudo-science. Qualitative analysis, which is often based on intuition (that is, opinion rather than fact), is suitable to such an environment and to the problems it is tasked to analyze. However, with the securitization of domains previously not on the agenda of national security professionals, such as energy security, environmental issues and the proliferation of networked non-state actors, qualitative analysis falls short of providing the type of rigorous analysis the new vision outlines. Perhaps even more importantly, in the aftermath of 11 September, analysis based on non-structured methodologies evades both transparency about how an analytic judgment was formed and the ensuing accountability.

Significantly, a number of academic programs have sprung up during the past decade offering advanced education in the field of Intelligence Studies. It is interesting to note that most of the advanced degrees on offer are Master of Arts rather than Master of Science degrees, which indicates that the debate over whether intelligence is an art or a science persists. A cultural change will not follow until people in the community stop thinking along black-and-white lines. Intelligence is both an art and a science. Resistance to implementing structured methodologies stems from habit, from a “this is not the way we do things around here” mentality, from the numerical illiteracy inherent in the humanities and many social sciences, and from an “if it were so great, why do you have to keep proving it to me” attitude. Countering such deeply ingrained habits will take time, and there are no quick fixes other than investing in people’s learning on the job. The intelligence community’s return on investment will be nothing short of realizing its lofty vision.

Donald Rumsfeld’s Legacy: strategic thinking in a world of unknown unknowns

The Atlantic has published an excellent new article by Robert D. Kaplan, What Rumsfeld Got Right, summarizing Donald Rumsfeld’s career, including a well-balanced argument about the rights and wrongs of his strategic thinking, from the 1990s U.S. intervention in the Balkans to present-day Iraq. As usual, Kaplan is thought-provoking, smart and beautifully versed – a good antidote to raving left critics and biased media reports.

Also of interest:

Warrior Politics: Why Leadership Demands a Pagan Ethos – Kaplan’s lessons learned from Greek, Roman and Chinese history and military strategy for today’s political leadership

Scenario Planning as Part of Strategy Development

In my daily work, I hear the phrase “strategic planning” with such frequency that it has by now become a signal for “switching off”, i.e. to stop paying attention…here we go again…not another strategy…and not the same strategy in different words, please! What a relief to read something actually intelligently written, like Conway’s article “Scenario Planning: An Innovative Approach to Strategy Development”, in which he makes a distinction between strategic planning and strategic thinking. Sure, it’s a piece dense with “management-speak”, but given the quality of ideas, one doesn’t mind the odd “innovative approach” or “setting direction” or “immersion in foresight concepts”.

Conway argues that traditional strategic planning, based on deductive reasoning, falls short of being effective in a complex, interdependent and highly uncertain environment in that it focuses on past experience and on data- and fact-driven thought processes. I’d like to call this the microscope-focused approach. What he advocates instead, or rather in addition, is “strategic thinking” as part of the planning process: the ability to develop foresight capacity, a “big picture” view that is less concerned with the here and now of the details and the particular but adjusts the aperture to provide a universal, telescopic view. Of course, he doesn’t say “universal” vs. “particular”, which is how a philosopher might phrase the concept; nor does he use the microscope-telescope analogy, which would be more at home in the fields of myth, anthropology and psychology. As he writes for a corporate audience, “global” vs. “local” is what one might expect to hear.

I think an interesting conclusion can be drawn from this observation, and I believe Conway himself reaches it even if he doesn’t explicitly say so. In an environment of growing complexity and interdependence, strategic thinking implies being able to see connections that one might miss by adhering to linear logic, and, what’s more, might never see alone, without the contribution of others. The crisscrossing of concepts from traditionally distinct disciplines and the fusion of individual brains into one collective intelligence is what strategic thinking for the 21st century seems to be all about.

I concur with Conway’s take on scenario planning as a way of creating alternative future narratives, and was happy to see the Dave Snowden reference on page 12. Although Snowden has in the past few years become a household brand name in (knowledge) management, and a quote by him adorns the annual or centennial corporate strategy paper of every Tom, Dick and Harry organization that likes to talk about “innovative approaches”, Snowden does often provide food for thought. [In my capacity as CKO, I once attended a presentation of his, which sparked enough interest to add his Cognitive Edge website/blog to my RSS feeds and to read a number of articles he provides there under a Creative Commons license.] Conway outlines Snowden’s thoughts on the irrationality of human decision processes as a way of stressing the influence of human agency in strategy development. However, the quote ends too quickly. Elsewhere, when discussing the assumption of rational choice, Snowden takes great pains to distinguish between “objective” reality and perceptions of reality. He argues that understanding these different perceptions or perspectives of reality can lead to strategic advantages, and he sees narrative techniques (scenarios for our purposes) as a way to gain greater exposure to different perspectives:

The assumption of rational choice
Relaxing this assumption means that context and perspective become as important as rationality. This is an important reason that the Cynefin framework is not about “objective” reality but about perception and understanding; it helps us to think about the ways in which different people might be perceiving the same situation. For example, there is an old folk tale from India in which a wise man decides that in order to escape an impossible royal demand, he will fake insanity in the king’s court. He is operating in complex space because he is using cultural shorthands to provoke predictable reactions but is gambling that his ruse will seed the pattern he wants to create. He knows that from the perspective of his audience, who are operating in the space where things are bound by tradition and thus known, he appears to be acting chaotically, because they can conceive of no other reason for him to act this way in front of the king (who would surely behead him if he was faking). Thus by proving that he cannot be faking, he pulls off the fake. Understanding not only that there are different perspectives on an event or situation, but that this understanding can be used to one’s advantage, is the strategic benefit of relaxing this assumption. Narrative techniques are particularly suited to increasing one’s exposure to many perspectives on a situation. In management, there is much to be gained by understanding that entrained patterns determine reactions. This realization has major implications for organizational change and for branding and marketing. Our own work on narrative as a patterning device is gaining presence in this and other areas. Speculating, one of the most significant possible applications of this understanding is a move away from incentive-based targets and formal budgeting processes—both of which, we contend, produce as much negative as positive behavior. It is a truism to say that any explicit system will always be open to “gaming.” Paradox and dialectical reasoning are key tools for managers in the un-ordered domains.

C. F. Kurtz and D. J. Snowden, “The new dynamics of strategy: Sense-making in a complex and complicated world,” IBM Systems Journal, Vol. 42, No. 3, 2003, http://www.research.ibm.com/journal/sj/423/kurtz.pdf

Back to Conway: the scenario planning process he outlines is the same as that of Project Horizon. I wonder whether the people responsible for managing that project intentionally followed Conway’s model or whether it was more of a Snowdean serendipity moment. I also wonder whether the Project Horizon team answered the model questions from the decision tree for scenario planning that Conway provides. I must confess, I put the tree to the test in terms of my own work and found it extremely difficult to answer the questions with “yes” and “no” (the sketch below illustrates why). The reason is that it is often impossible to speak about an organization with one voice. There are individuals, teams, silos, middle management, senior management – all open to change and dialogue to different degrees. Should one engage in scenario planning if the staff are open to change and dialogue but management isn’t, and vice versa? If I attempt to answer this question, another point Conway makes comes to mind: “The organization will need to focus its foresight work – is it about helping the organization develop its preferred future and documenting that in a plan, or is it about considering all potential futures, whether possible, plausible or probable.” (p. 21) From the point of view of management, I would say the focus tends to be on the former – developing your preferred future. From the point of view of staff – considering all options.

How feasible is it, then, to apply scenario planning in a government organization, where both planning and decisions are more likely than not to be strictly top-down? It takes a certain entrepreneurial spirit, of which government institutions are devoid almost by default, to engage in this type of exercise. Strategic or visionary thinkers are hardly welcome in such environments. At best, their boldest big-picture strategy is dismissed as day-dreaming; at worst, it is seen as a threat to the managing body. In this respect, I find it commendable of the US government to support a project such as Horizon, and I would be very curious to know where the project stands two years after the preliminary report was published. In particular, it would be interesting to know the progress on the Global Hazards Planning and Response capability, the US Government Partnership framework and the Global Affairs Learning Consortium, since these are all sub-projects I am also trying to pursue in my own work.
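
Here is a toy sketch of the difficulty. The readiness questions are invented stand-ins, not Conway’s actual wording, but they show why a binary decision tree is hard to walk for a real organization: each constituency returns its own answer.

questions = [
    "Is the organization open to change?",
    "Is there genuine dialogue about the future?",
]

# Hypothetical answers per constituency, invented for illustration.
answers = {
    "staff":             {questions[0]: True,  questions[1]: True},
    "middle management": {questions[0]: False, questions[1]: True},
    "senior management": {questions[0]: False, questions[1]: False},
}

def organization_says(question):
    """The decision tree wants a single yes/no; the organization has many."""
    votes = [group[question] for group in answers.values()]
    if all(votes):
        return "yes"
    if not any(votes):
        return "no"
    return "depends on whom you ask"

for q in questions:
    print(q, "->", organization_says(q))

For both questions the sketch returns “depends on whom you ask”, which is exactly the position I found myself in when trying to answer the tree for my own organization.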

Scenario planning and the national security issue I’m working on (Russia and the Balkans):

I had already considered using scenarios prior to reading Conway and the Project Horizon report; however, I’m not sure scenario planning is appropriate when developing a situation assessment, which is the overarching analytical technique I’m applying to my project. In my view, a situation assessment should be limited to objective capabilities rather than alternative futures. I don’t think a situation assessment is, or should be, concerned with forecasting; rather, it should be based on what Conway refers to as “traditional strategic planning” – deductive, not inductive.

Still, had I chosen the same topic but a different method, I would apply scenario planning to examine potential “new” alliance formations in the Balkans. Bulgaria is a particularly interesting case due to its membership in both NATO and the EU and its historical ties to Russia. Alternative scenarios could throw light on the role the country is going to play in European energy security. With four major pipeline routes planned to transit the country – two Russian projects and two EU/US-sponsored ones – it would be interesting to develop a scenario exercise to determine whether Bulgaria will “bandwagon” with the EU/US greater powers or become a Trojan horse by strengthening its ties with Russia. Other countries in the region (Romania and Serbia, for instance) are less problematic because they have markedly stronger affinities to one camp or the other; the relatively low uncertainty would not merit the use of scenario planning. Greece could be another potential wild card despite its long history of NATO/EU membership: given its dissatisfaction with some EU policies and the prospect of becoming a regional energy power through a closer alliance with Russia, Greece’s behavior will be anything but predictable. Throw in Turkey’s contested EU application and the possibility that it actually succeeds, and the region’s alliances and spheres of influence are sure to undergo some major shifts. Cyprus, for one, will open its arms toward Russia even wider than it currently does.
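
For what it’s worth, here is a minimal sketch of how such an exercise is typically scaffolded. The two driving uncertainties and their poles are my own illustrative assumptions, not analytic findings: the poles of the most important and most uncertain drivers are crossed to produce the skeletons of alternative futures, which are then fleshed out into narratives.

from itertools import product

# Illustrative driving uncertainties for the Bulgaria question; the labels
# are assumptions made for this sketch, not analytic conclusions.
driving_uncertainties = {
    "EU/US-sponsored pipeline projects": ("advance", "stall"),
    "Russian energy leverage in Sofia":  ("strengthens", "weakens"),
}

def scenario_skeletons(uncertainties):
    """Cross the poles of each driving uncertainty; each combination is the
    seed of one scenario narrative (a 2x2 matrix for two drivers)."""
    names = list(uncertainties)
    for poles in product(*uncertainties.values()):
        yield dict(zip(names, poles))

for i, scenario in enumerate(scenario_skeletons(driving_uncertainties), 1):
    print(f"Scenario {i}: {scenario}")

# The 'bandwagon vs. Trojan horse' question is sharpest in the scenario where
# EU/US projects advance while Russian leverage simultaneously strengthens.

Two drivers yield four skeleton futures; for low-uncertainty neighbors like Romania or Serbia, as noted above, the exercise would not be worth the effort.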

To sum up, I would not apply scenario planning to my national security issue as long as I’m doing a situation assessment. However, I do believe scenario planning as a technique could be a valuable addition to long-term strategic analysis, especially when used to challenge assumptions about rational choice whether on an individual or a collective level.

Can structured methodology reduce intelligence failure?

If there is one common “concept” that occupies the minds of all four authors – Jervis, Betts, Marrin and Clemente – it is uncertainty. All four present their views on uncertainty as applied to intelligence analysis, with varying degrees of optimism, pessimism and fatalism. It seems that uncertainty has become quite a fashionable concept, currently used profusely in a variety of disciplines from complexity to intelligence to military strategy to corporate management to the social sciences.

The temporal aspect of this fascination with uncertainty has struck me as rather exaggerated, in that it is hardly specific to modernity. The examples from history, from the 19th-century Prussian general Carl von Clausewitz (I take this as an arbitrary starting point; any other would do as well) to as far back as the Delphic oracle, are pertinent illustrations of mankind’s preoccupation with uncertainty and the desire to eliminate it, be it by military force or strategy (from Gr. strategos, ‘general’) or by means of spiritual evocation of an intermediary oracle to interpret – call it what you will – God’s, fate’s, or Chance’s “intentions”.

The negative aspects that all four authors attribute to uncertainty, and particularly Betts’ fatalistic approach, are precisely a result of our modern obsession with certainty, security and risk elimination. In my view, uncertainty is not only not an obstacle but the very heart of opportunity, which in turn is more a source of optimism than of pessimism. The statement “failure is inevitable” (substitute failure with success or any other noun) sounds like little more than a false truism. It is reminiscent of the type of Parmenidean philosophy which asserts that all is one and change is impossible. I would go as far as to argue that this type of reasoning is a product of conscious and/or subconscious Christian rhetoric of the kind that burns scientists and philosophers at the stake.

Being better at ease in Heraclitean waters, I would argue that nothing is inevitable because everything is in flux. This flux, which is often ambiguous and uncertain, now visible, now not, sometimes linear, more often not, is precisely the strategic place of opportunity, the forking of the path, and the place where the potential for great leadership can emerge. To go back to Clausewitz: it is precisely in times of great uncertainty, he argued, that great leaders are born.

All that said, there are a number of points Betts raises with which I concur. First of all, the idea that intelligence reform, whether procedural or product-oriented, is based on trade-offs seems to me a logical observation within a world view based on polarities. What’s more important (Betts makes mere mention of this, but Marrin takes the argument a few steps further) is how, or rather where, such trade-offs can be optimally utilized once the binary system of polarities assumes a more complex and amorphous form through the injection of sub-polar categories, i.e., when we are presented with additional “circumstantial evidence” that dilutes the black-and-white picture.

This place and agent of change is, I believe, correctly identified by all four authors as the margin, or the periphery. It is particularly fitting to think of strategy in spatial, not only temporal, terms. The periphery, not the center, is often the space of pragmatic change. The center of gravity, to use a military term, represents not only the strength of a system but also its vulnerability. Again, history is rich with precedents of the shifting dynamics between center and periphery. Bolko von Oetinger, a strategist for the Boston Consulting Group, argues in “Constructing Strategic Spaces” (Nov 2006) that “Strategy requires regular visits to the periphery in order to explore and learn,” precisely because the center does not always remain the center and because “outsiders on the periphery are happy to traverse the distance to the center and conquer it.” He provides a fitting example of a center-periphery dynamic with consequences the center could not have anticipated at the time. He takes 31 October 1517 as a temporal marker of radical spatial change and asks (rhetorically): could Pope Leo X in Rome have anticipated that the 95 Theses Martin Luther nailed to a church door in Wittenberg (the far end of the periphery by Roman standards) on that day would eventually result in Rome losing its privileged position as the center of Christianity?

Going back to Betts’ article, I found his descriptions of patterned behavior in the face of strategic surprise well thought out and instructive for a student of intelligence analysis. I agree with his evaluation that worst-case scenario methods are difficult to apply, of ultimately small benefit, and particularly ineffective in operational terms. Further, if multiple advocacy increases rather than decreases ambiguity and uncertainty, I would argue that the method should only be used where decision-making and good leadership go hand in hand, i.e., where political leadership is synonymous with intellectual rigor, courage and a dose of entrepreneurship.

The devil’s advocacy method is to me an intellectual exercise that should be limited to academia. While guilty of the pleasure of playing this game myself, I believe it does little more than encourage mistrust among the “wrong people” (decision-makers) at the “wrong time” (time for decisive action rather than intellectual speculation).

Jervis identifies more or less the same causes of intelligence failure – uncertainty, ambiguity and deception – and offers valuable practical examples for improving intelligence processes and products. One thing that struck me as rather unique in his paper was his emphasis on human resources. In my professional function as chief knowledge officer, I am confronted with similar HR issues that shape the internal environment. Particularly worthy of note were the sections on multidisciplinary training and on vertical versus horizontal organizational structures, and how quality analysis and performance can be inhibited for the sake of organizational politics.

With regard to Perrow’s “error-inducing system,” which Jervis chooses to support through what he calls the “informal norms and incentives of the intelligence community,” my response is the same as to Betts’ argument: providing alternative competing hypotheses should be done with caution, depending on the customer. Further, the idea that intelligence analysis should borrow the academic method of testing hypotheses by drawing predictions is theoretically sound, and perhaps even applicable to long-term strategic analysis (or, on second thought, maybe not, as the more distant the future, the harder it is to make accurate predictions, except in Black Swan cases, where prediction becomes irrelevant), but it carries the operational trade-offs of time and money. Therefore, I don’t think any analytic method on its own can improve the analytic product. As Jervis himself argues, interlocking and supporting factors must reflect the requirements imposed by appropriate style: the length of the analysis according to the consumer’s requirements, peer review processes among analysts, and a horizontal organizational structure, to name but a few.

Another point that struck me as particularly apt was Marrin’s observation that “The CIA’s Directorate of Intelligence – the home of analysts – appears to operate according to a culture that rewards service to policy makers but does little to distinguish between information and conceptual products.” Jervis expresses a similar opinion when he criticizes analysts for producing political reporting rather than political analysis. My personal opinion here is that this shortcoming is due to a certain type of educational system that promotes knowledge over learning, the statement format over the question format. If I’m correct in thinking so, it would take a long time to overhaul the fundamental principles that shape our didactic processes. Another explanation could be a cognitive one, i.e., it takes less mental effort to produce a statement than to come up with a question. And, finally, it could be psychological: reporting facts is a largely anonymous activity that more people are comfortable with, whereas making a prediction or asking a question is an expression of individualism and, by extension, more open to criticism.

In this light, I read the final Marrin and Clemente article with great enthusiasm, because the medical analogy is a comparative method I have spent some time musing over myself in its application to religion, philosophy, writing and memory. My only concern with this method is that it might attract a certain breed of human, be it academic dilettante or professional, whose passion for comparative analysis (of any type) would lead to an emphasis on theory over practice. And while I think a certain amount of theory can be beneficial to intelligence, its main purpose is, and should remain, actionable intelligence.

The analogy Marrin and Clemente draw between how medical professionals arrive at a diagnosis and how intelligence professionals articulate an analytic assessment provides an alternative way of looking at the discipline of intelligence analysis, and it is, for the most part, useful.

First, the authors identify parallels between the two disciplines. In terms of collection practices, they draw attention to the similarities in the techniques employed to gather the information upon which different hypotheses can be identified. They compare the medical history questionnaire a doctor compiles at the start of the diagnostic process to what can roughly be summarized as a situation assessment in intelligence, i.e., any known historical precedents or patterns of events and relationships between actors.

Second, the “review of systems,” i.e., the assessment of specific organs, can be viewed as similar to the individual steps in a country profile assessment, including foreign policy, domestic policy, politicians and political leadership, diplomatic relations, and cultural and socio-economic relations. Marrin and Clemente claim that the stage involving the physical examination itself is the least conducive to analogy, except in the form of overseas visits aimed at gathering first-hand knowledge of the area or, alternatively, cables from government representatives stationed there.

Finally, additional information provided by various technological systems, such as MRI and IMINT respectively, further reinforces the analogy.

What was interesting to note was the observation that “90% of all diagnoses are made by clinical history alone, 9% by the physical exam, and 1% by laboratory tests and imaging studies such as CT and MRI scans.” (p. 710) This finding has interesting implications for at least two reasons. First, if we extend the analogy to the intelligence field, it would seem that in the science vs. intuition debate, intuition is the clear winner in practice. Second, the finding poses a serious question regarding collection requirements, needs and spending: if a situation assessment is to be the core component of the final intelligence product, and most of the data can be obtained from open sources, collection requirements and costs could be cut substantially. Further, as Marrin and Clemente observe, the human element, i.e., the experience and developed intuitive capabilities of a professional from either field, remains indispensable in interpreting the raw data gathered from MRIs, IMINT, SIGINT and other technical subfields.

In the analytical process, there seems to be a strong argument for comparing how a medical doctor arrives at a diagnosis by examining alternative hypotheses with the way an intelligence analyst might employ Heuer’s method of analysis of competing hypotheses (ACH), sketched earlier in these pages.

Parallels also exist in the causes of inaccurate diagnosis and of intelligence failure: inevitable limitations in collection and analysis; cognitive limitations of the practitioner/analyst, such as biases and stereotypes; and failures in the application and implementation of scientific methods.

Marrin and Clemente also identify limitations to the proposed analogy between medicine and intelligence. Three key differences are worth acknowledging here. First, medicine has an advantage over intelligence as a scientific discipline in the sheer length of the field’s existence, which offers medical practitioners a much wider empirical and theoretical knowledge base. Second, the difference in degrees of denial and deception also favors medicine: only rarely do patients conceal or deliberately manipulate their symptoms, whereas in intelligence, denial and deception are standard practice. Finally, the doctor-patient relationship does not sustain a parallel to the analyst-decisionmaker relationship, in that “National security decisionmakers, however, do not make decisions only after receiving finished intelligence analysis (i.e. what a doctor would do prior to initiating treatment), in many cases they are their own analysts, and they have entirely separate sources of information.” (p. 722)

The lack of trust between intelligence professionals and decisionmakers, and the inadequate feedback mechanisms between them, are well-known problems; they leave the doctor-patient relationship the closer to a symbiotic one, while the analyst-policymaker relationship remains incomplete at best, parasitic at worst, or even self-destructive.