Were the lack of precision the only criticism of the intelligence cycle, it might be able to weather the storm. As suggested previously, there do appear to be general themes that are relevant, and the cycle’s continued existence suggests that its inconsistencies are outweighed, to some extent, by its simplicity.

Unfortunately, the second type of criticism typically leveled against the cycle is much more damning. In fact, it is fatal. Simply put, there is virtually no knowledgeable practitioner or theorist who claims that the cycle reflects, in any substantial way or in any sub-discipline, the reality of how intelligence is actually done.

Consider these quotes from some of the most authoritative voices in each of the three intelligence communities:

Once you start looking for them, it is easy to find detailed critiques of the intelligence cycle (and, please, don't hesitate to add your own). The only argument that still seems worth debating is whether or not the cost of maintaining this flawed model of the process is worth the benefit (a question about which readers of this blog were almost evenly split).

Once again, themes emerge from the general discontent with the inadequacies of the intelligence cycle. I will touch on many of these themes as I discuss alternatives to the intelligence cycle in later posts. One theme, however, leaps off each page and tends to dominate the discussion: the intelligence cycle is linear and intelligence, as practiced, is not. In the cycle, tasks move from one part to another like an assembly line, where parts are bolted on in a specific order to create a consistent product.

While this approach might be appropriate for early 20th century manufacturers, it doesn’t work with intelligence, where each product, ideally, contains information that is somehow unique. Consider, for example, this hypothetical dialogue between Mary, the CEO of Acme Widgets and Joe, her chief of competitive intelligence:

Mary: I need to know everything there is to know about the Zed Widgets Company.
Joe: Sure. What’s up?
Mary: We are thinking about introducing a new widget and I want to know what the competition is up to.
Joe: Anything in particular you are interested in?
Mary: Well, I can see their marketing efforts on the TV every day, so I am not really interested in that. I guess the most important thing is their cost structure. I want to know how much it costs them to make their widgets and where those costs are.
Joe: Right. Labor, overhead, materials. Got it. Is one part of the cost structure more important than another to you?
Mary: They pay about the same amount in labor and overhead that we do, so I guess I am most interested in the materials; particularly Material X. That is our most expensive material.
Joe: I just read a report that indicated that the cost of material X is set to rise worldwide. Would you also like us to take a harder look at that and give you our estimate?
Mary: Absolutely.

While this example is simplistic, it makes the point. Intelligence, even in this one minor example drawn from just one of the many parts of the traditional intelligence cycle, is, or at least should be, interactive, simultaneous, and iterative. In the above example, the interaction between the intelligence professional and the CEO resulted in a more detailed and nuanced intelligence requirement, going, as it did, from the very general, “Tell me everything…” requirement to the highly focused, “Tell me about Zed Company’s Material X costs and give me an estimate of where the price of Material X is likely to go.”

It is equally easy to imagine this kind of interaction within and between parts of the cycle as well. Collectors and analysts will inevitably go back and forth as the analysts attempt to add depth to their reporting and as the collector develops new collection capabilities. It is even likely that parts of the cycle that are not adjacent to one another will work very closely together, such as an analyst and the briefer responsible for the final dissemination of the product (in its oral form). Decisionmakers, too, may well remain involved throughout the process, seeking status reports and perhaps even modifying the requirement as new information or preliminary analysis becomes available.

The US military's Joint Staff Publication 2.0, Joint Intelligence, states the case more strongly:

"In many situations, the various intelligence operations occur nearly simultaneous with one another or may be bypassed altogether. For example, a request for imagery will require planning and direction activity but may not involve new collection, processing, or exploitation. In this example, the imagery request could go directly to a production facility where previously collected and exploited imagery is reviewed to determine if it will satisfy the request. Likewise, during processing and exploitation, relevant information may be disseminated directly to the user without first undergoing detailed all-source analysis and intelligence production. Significant unanalyzed combat information must be simultaneously available to both the commander (for time-critical decision-making) and to the intelligence analyst (for production of current intelligence assessments). Additionally, the activities within each type of intelligence operation are conducted continuously and in conjunction with activities in each of the other categories of intelligence operations. For example, intelligence planning is updated based on previous information requirements being satisfied during collection and upon new requirements being identified during analysis and production."

The situation is even more complex when you imagine an intelligence unit without teams of people working each of the discrete parts of the cycle. In situations involving small intelligence shops, where a single individual collects, processes, translates, analyzes, formats and produces the intelligence, the cycle breaks down completely.

The human mind simply does not work in this strictly linear fashion. Instead, it jumps from task to task. Imagine your own habits when researching a topic. You think a bit, search a bit, get some information, integrate that into the whole and then search some more. This approach inevitably leads to analytic dead ends, requiring more collection. At the same time, you are thinking about the form of the final report. If you are putting together an intelligence product that will use multimedia in its final form, for example, you are constantly on the lookout for relevant graphics or film footage you can use, regardless of its analytic value. To even suggest that you should collect all of your information, stop, and then go and do analysis without ever doing any further collection, is absurd.

One of the most recent and widely publicized innovations within the US national security community is the advent of “Intellipedia”, a Wikipedia-like tool for the intelligence community. Wikipedia, of course, is the online encyclopedia that is free to use and editable by anyone. It is one of the most popular sites on the web and, according to at least some research, is as accurate as other generally accepted encyclopedias. It has become, in its short lifespan, the tertiary source of first resort for both analysts and academics.

One of the things it is not is linear. There is no "Table of Contents" and researchers, authors and editors choose their own path through the resource. Some people generate full articles; others only dive in occasionally to fix a particular fact or even a grammatical or spelling error. There are even full-fledged “edit wars” where a particular version of an especially hot topic changes back and forth between competing points of view until either one side gets tired and gives up or, more likely, the sides reach a version acceptable to all. In the end, it is openness and interactivity that give Wikipedia its strength.

The US national security community acknowledged the value of such a tool, at least with respect to its descriptive products, when it launched Intellipedia. Begun in April, 2006, Intellipedia, according to information from June, 2010, now has 250,000 registered users and is accessed over 2 million times per week. This effort, which is clearly far beyond the experimental stage, plainly shows that collaboration and interactivity – the anti-intelligence cycle -- are core to any modern description of the intelligence process.

Despite its popularity, the intelligence cycle is widely criticized by intelligence professionals. These criticisms generally break down along three lines. First, there are those who say that what appears to be a theoretical monolith is actually open to a wide variety of interpretations, depending on perspective. Indeed, there is not one intelligence cycle but a series of intelligence cycles, each substantively different from the rest.

Second, many authors have claimed, quite convincingly, that the intelligence cycle, as generally described, does not, in many material ways, reflect the reality of how intelligence actually is done. The simplicity of the cycle, to these critics, is both seductive and deceiving.

Finally, there are those critics who claim that the so-called intelligence cycle is simply a marketing tool that rebrands overly simplistic "cycles" from business and leadership courses. I intend to discuss each in turn.

Which Intelligence Cycle?

Compare the diagram of the intelligence cycle which, until recently, graced the Intelligence.gov website (owned by the Director of National Intelligence -- DNI) with the diagram of the intelligence cycle from the Federal Bureau Of Investigation (FBI) below.


FBI Version Of The Intelligence Cycle


Recent DNI Version Of The Intelligence Cycle


Besides the obvious differences in graphic representation, what differences in content do you notice? If you look carefully, you will see that the FBI has decided to include a phase that is not in the DNI’s image, the “requirements phase”.

For a seasoned professional, this difference is trivial. Indeed, as I discussed in my overview of the intelligence cycle in Part 4 of this series, there is an explicit need for requirements and the FBI’s inclusion of them as a separate phase of the process might seem to be a matter of professional choice.

A student of intelligence, particularly a new student, might legitimately question this explanation, however. Perhaps there is a difference. Perhaps the FBI’s characterization represents a new way of thinking about intelligence as a process.

Perhaps, in fact, one description of the process is substantively better than the other. If this is not the case, then what is the explanation for the differences? There does not seem to be a good reason why the FBI’s take on the intelligence cycle should differ from that of the main intelligence site for the US Government, particularly since the FBI’s intelligence function, since the 2004 restructuring of the US intelligence community, is, in many ways, subordinate to that of the Director of National Intelligence. In short, does this difference represent legitimate theoretical differences or is it merely the result of a lack of coordination or, worse, sloppy thinking?

To make matters worse, the DNI's recently updated version of the intelligence cycle confuses the issue even more. You can see the graphic currently in use at intelligence.gov below:

A quick examination shows that the current version of the DNI's cycle differs from the previous version in several substantive ways. "Direction", "Exploitation" and "Production" all appear to be subsumed into broader categories of activities. Is there a reason for this? Did the DNI conduct studies to determine the best, most accurate description of the cycle? Or was this a graphic design decision made because there was simply not enough room in the graphic for the additional words?

It gets worse.

On the same page that contains the graphic above, the DNI promotes not one but two additional variations of the cycle. In the first, more modest, variation (contained in the text that describes the picture), the DNI says, "The process begins with identifying the issues in which policy makers are interested and defining the answers they need to make educated decisions regarding those issues. We then lay out a plan for acquiring that information and go about collecting it." If this is true, then why doesn't the graphic also "begin" with requirements? Why does the graphic seem to begin with planning?

It gets even worse.

The third variation of the cycle (all on the same page) from the DNI comes at the very top of the page. Here one finds five links, "Management", "Data Gathering", "Interpretation", "Analysis and Reporting", and "Distribution". Clicking on the "Management" link indicates that management -- not requirements, not planning -- "is the initial stage of the intelligence cycle".

Sigh.

I wonder which version is taught in the Intel 101 courses?

I wonder how you grade a student who uses an "alternative" cycle as an answer on a test?

Were these differences the only ones within the US national security intelligence community, they might be explained away easily enough, but they are not. In fact, there is very little consistency across, or even within, a number of important elements of the US national security community. These inconsistencies exist across disciplines as well.

Examine the chart below. Only one function, collection, is universally attributed to intelligence across all 10 organizations examined.

Within the DNI, CIA and FBI there are minor but important differences – not one of the three is exactly like either of the other two.

Even more baffling are the differences within the US military, however. The Defense Technical Information Center (“the premier provider of Department Of Defense technical information”) has a streamlined four-part description of the cycle, one that largely (but not completely) agrees with the cycle as taught at Fort Huachuca, the Army’s home for its military intelligence professionals. This cycle, however, is substantially different from the process defined in the US Military’s highest-level publication on intelligence doctrine, Joint Publication 2.0.

The differences evident in the US military may well be due to different publication dates or my own lack of access to the most recent revisions of some of these documents. In this regard, though, the 2007 Joint Pub is worthy of further commentary. In it, the US military seems to abandon the intelligence cycle in favor of a more generic intelligence "process". Some have suggested that this proves the military has already killed the intelligence cycle (but it just didn't get the memo...).

While it is (from my viewpoint, at least) a step in the right direction, it only exacerbates the impression that either the left hand is not speaking to the right in the US national security intelligence community or that the DNI doesn't control or doesn't care what the Joint Staff puts out with respect to the intelligence process. All of those alternatives make the US IC look sloppy and disorganized.

I also think the Joint Staff is trying to have its cake and eat it, too. Compare the two images below. The first is from the most recent public version of Joint Pub 2.0. The second is from the 1990 version of the US Army's Field Manual 34-3, Intelligence Analysis. While the text of the two publications differs in many significant ways, the pictures seem to say that the military has not backed too far away from its conception of the process as a cycle.

Joint Pub 2.0 Intel Process 2007

FM 34-3 Intel Cycle 1990


These descriptions of the cycle differ, again, in significant ways from the descriptions provided by the two oversight bodies listed on the chart that were commissioned to examine intelligence activities, the 1996 Graham-Rudman Commission and the 2004 Weapons of Mass Destruction Commission. To round out the confusion, the description of the cycle offered by the International Association Of Law Enforcement Intelligence Analysts and the classic competitive intelligence model (as described by longtime private sector intelligence specialist John McGonagle) also differ from each other and from the other eight examples.

This analysis, while interesting, may come across as a bit pickier than it should. Other processes in other disciplines lend themselves to various descriptions. Indeed, despite the differences, there are clear themes that emerge even from this analysis. Few, for example, would question whether requirements, needs, direction, and planning fall into a single, generic category.

These, however, are only themes. A rigid approach to intelligence, implied visually in the pictures above and in many of these organizations' descriptions of the process, seems inappropriate under these conditions, whether for teaching these concepts to new members of the intelligence profession or for explaining the process to the decisionmakers that intelligence supports. Instead, a more nuanced and less absolutist approach appears to be called for.

There is one specific area where this analysis does create cause for concern, however. Only three of the 10 organizations examined include a feedback or evaluation function within their versions of the cycle.

While some of the other organizations did include feedback as a subset of the dissemination process, subordinating this crucial evaluative process is not likely to endear the intelligence function to the decisionmakers that intelligence supports. It seems much better practice to include explicitly the role of feedback in the process, whether the decisionmaker chooses to take advantage of it or not.

Finding descriptions of the intelligence cycle is not difficult. Virtually every organization, company or law enforcement agency that has even a modest intelligence capability has a picture, much like the one to the right (which, until recently, graced the US national security intelligence community’s main web site, Intelligence.gov).

So pervasive is this traditional image of the intelligence process that it comes across as generally accepted theory. Indeed, many private sector practitioners have built much of their marketing campaigns on touting the benefits of the cycle.

Likewise, it is commonplace to see the cycle featured prominently in government publications, statements of doctrine, training publications and even in critiques of intelligence. The entire architecture of intelligence, across all three major sub-disciplines of intelligence, is caught up in this more or less common vision of how intelligence professionals perform the functions of intelligence.

Despite its popularity, the history of the cycle is unclear. US Army regulations published during WWI identify collection, collation and dissemination of military intelligence as essential duties of what was then called the Military Intelligence Division (fun fact: according to Congressional testimony in 1919, the whole budget for military intelligence in 1913 was $10,000 -- or roughly $227,000 in 2011 dollars), but there was no suggestion that these three functions happen in a sequence, much less in a cycle.

By 1926, military intelligence officers were recommending four distinct functions for tactical combat intelligence: requirements, collection, "utilization" (i.e. analysis), and dissemination, though, again, there was no explicit mention of an intelligence cycle.

The first direct mention of the intelligence cycle (see image to the right) I could find is from the 1948 book, Intelligence Is For Commanders (more on this book here). I hypothesize that the intelligence cycle probably came into use during WWII as a training aid but I have not been able to find any evidence to corroborate this bit of speculation on my part.

Since that time, the cycle, as a model of how intelligence works, has become pervasive. A simple Google image search on the term "Intelligence Cycle" rapidly gives one a sense of the wide variety of agencies, organizations and businesses that use some variant of the cycle.

(Note: Experienced intelligence professionals might want to skip the next few paragraphs, which outline a more or less generic version of the intelligence cycle based on the image at the top of the post. I include it here for readers who are not familiar with the cycle or for students of intelligence who might need a refresher.)

While the actual details vary dramatically (something we will turn to in the next post), a typical description of the intelligence cycle usually begins with planning and direction, or similar language. Direction usually comes from the decisionmakers that the intelligence activity supports, although it can also come from the senior leaders of the intelligence activity or even from the analysts themselves.

It is important to note here that direction and planning can be formal but are often done informally (most variants of the cycle make no distinction). This is seen most often when there is no time for a more formal process. Less formal tasking is also often seen in smaller intelligence units, such as in business environments, or in units where the intelligence professionals and decisionmakers have a long-term relationship.

After the planning and direction phase, the collection phase, in a typical version of the intelligence cycle, begins. Here the intelligence professional begins to execute the plan to collect the types of information necessary to understand and answer the requirement.

Once the intelligence unit collects the information necessary, other intelligence professionals within the same unit might have to process and exploit it. Processing and exploitation takes on a number of forms including decrypting encrypted transmissions, turning a variety of conversations into a single cohesive report, translating source information from one language into another or identifying buildings and other features in an aerial image, among others. In short, processing and exploitation is the phase where very raw information becomes usable to the largest number of people authorized to view it within the intelligence organization.

With planning, direction, collection, exploitation and processing complete, the focus of the traditional intelligence cycle shifts to analysis and production, or the interpretation of the collected data and the creation of a product that best meets the decisionmaker’s needs.

Analysts need the widest possible variety of information sources in order to be able to corroborate other information and to answer the requirement with which they are dealing. The notional source of the information is much less important than its relevance to the requirement at hand and that source’s reliability.

Production carries its own concerns and they are often independent of the analysis. If the analyst is concerned with the content of the analysis, the intelligence production specialist is concerned with its form. Appropriate forms, in turn, depend on the needs and desires of the decisionmaker the intelligence unit supports. For example, while the traditional intelligence product within many areas of the US national security community is a written document with a smattering of pictures and graphs to explain key points, business decisionmakers tend to be much more comfortable with charts, graphs, and numerical data accompanied by a few explanatory bullets.

The final phase according to the diagram is dissemination. This is where the intelligence specialist delivers the final product to the decisionmaker. While this sounds fairly easy, it, too, has a number of pitfalls associated with it. Questions concerning exactly to whom an intelligence document should go and exactly how it should get there are fundamental to this phase. For example, classification, or the level of secrecy or confidentiality associated with the intelligence product, is one such issue.

Likewise, many people in developed countries take high-speed voice and data communications capabilities for granted. Yet, in many cases, particularly in intelligence work, such capabilities are scarce or degraded due to geographic isolation. In these cases, the bandwidth available may determine where the intelligence product is sent or even if it is sent via electronic means at all.

Intelligence is not something that appears autogenously; it is something that gets done, a process. This idea, that intelligence is a process, is one of the least controversial among intelligence professionals.

However, a general description of the process of intelligence -- that is, the best way to characterize and classify those consistent elements across intelligence sub-disciplines -- is still very much an open theoretical question.

Intelligence professionals have long known that the traditional way of describing the intelligence process, the so-called "intelligence cycle", is flawed; yet none of the alternatives proposed has yet captured the nuance of the process as practiced or, for that matter, the mind of the intelligence community.

This disconnect between theory and practice, between the imperfections of the intelligence cycle and the way intelligence is actually done, has real-world consequences. While I will return to this theme many times throughout this series of posts, it is useful to get a sense of the costs associated with perpetuating a faulty model of the process.

For example, without a consensus on the way in which intelligence "happens" that works across the various sub-disciplines of law enforcement, business and national security intelligence, it is impossible to study the process for potential improvements.

In addition, reforms proposed under flawed models are likely to be flawed reforms, incapable of solving systemic problems because the system itself is so poorly understood.

Furthermore, training students with a model of the process that falls apart when first touched by reality reduces the perceived value of training as well as the morale of those trained.

Budgets built around a flawed model are likely to mis-allocate funds and require work-around solutions that consume even more scarce resources.

Hiring people to fill positions created under an unsound model of the process is nearly certain to create a mismatch between the skills and competencies needed and the skills and competencies acquired.

The list goes on.

In this series of posts, I will begin by examining the intelligence cycle and some of the critiques of it. Next, I will examine the alternatives to the intelligence cycle. Finally, I will lay out my own understanding of the process. While every intelligence project is different, my own experience and the evidence I have collected over the last eight years indicates that there are patterns in this activity, whether in the national security, business or law enforcement fields, that are consistent across the entire intelligence profession.

The final goal of this exercise, then, is to outline this new description of the process as clearly as possible based on intelligence, as it is practiced, across all its sub-disciplines and regardless of the size of the intelligence activity involved. Furthermore, I want to balance the need for both simplicity and detail such that this explanation of the process is accessible to all students of intelligence -- at whatever age or level of experience.

Monday, May 23, 2011

My intent today was to jump right into my series on the intelligence cycle and why we should get rid of it (put a wooden stake through its heart were the exact words I think I used...).

However, over the weekend, I received a torrent of emails and the post received a number of comments, and it occurred to me that, before I got started in earnest, it might be useful to do a little wholly unscientific sentiment analysis on this issue.

Using the Swayable tool (which many of you have already tested here and here), I intend to first test the underlying assumption behind this study and second to ask two related but independent questions about your perceptions of the intelligence cycle and its place in intelligence theory.

The Assumption Check

The first question is: "Is the traditional intelligence cycle a perfect representation of the current intelligence process?" By "perfect" I mean perfect -- does the intelligence cycle accurately model the intelligence process as it is currently done? Trivial issues count here (we will deal with them later).

Something A Bit More Substantial

The second question addresses the degree to which the cycle is imperfect (assuming you thought it was imperfect in the first place): "Do the benefits derived from continuing to use the intelligence cycle as a depiction of the intelligence process outweigh the costs?" I would ask you to think carefully about both the costs and the benefits before answering.

Finally, I want to get at your beliefs: "Without reference to perfection (or imperfection), costs or benefits, do you believe that a better general description of the modern intelligence process is possible?" (Note: Extra credit for guessing why I chose pictures of Leibniz and Voltaire and double secret extra credit for knowing which is which...)

That's it. Please do not hesitate to pass this post and the series on to anyone who might be interested. In addition, please do not hesitate to join in the discussion by dropping me an email or posting a comment (comments are better as they can be seen by all but I understand if that is not possible).