The weakest link in anti-terror systems

By William Jackson

Jan 11, 2010

President Barack Obama has described recent lapses in intelligence that allowed a would-be terrorist to board a Christmas Day flight to Detroit as systemic failures. But these were not failures of systems; they were failures of people using the systems.

It's important to remember that even the best systems are no good if they are not used properly, or not used at all. The use of computers and networks to share information and make it available throughout a widespread intelligence community has improved greatly in the last eight years, but, in the end, our ability to use that intelligence depends on our ability to put eyeballs on the data and make decisions.

It was the eyeballs that apparently failed in the Christmas Day incident, not the technology. Correcting the problems that led to that failure is not going to happen quickly.

Effective sharing of intelligence traditionally has been hampered by technology and by culture. Computer systems that store, process and transmit the data were built more with an eye toward controlling the data than sharing it. And in a culture where knowledge is power, you don't give away information.

When it became clear that this environment was failing to protect the nation from many threats, correcting the technology part was relatively simple. The databases and other systems now serving the intelligence community might not be perfect, but in this case they appear to have performed as intended, and the necessary information was available.

It is not clear that the problem of culture has been completely solved. Things have improved, but priorities still seem to fall along organizational lines, so that pieces of information do not get the attention they deserve outside the organization that generated them. That probably is due in part to the limited manpower available to do the analysis.

The failures described in Obama's review of the incident were primarily on the analysis side. Available information was not properly prioritized and followed up, and the would-be terrorist's name, although in the system, was not moved to the proper list. It was in focus and priorities that things failed, not in information sharing.

The president has promised to sharpen the focus and make officials accountable for following up critical information. But no mention has been made of the critical element needed to make those promises pay off: manpower. We have automated systems to collect, filter, process and transmit the data. But we still need eyeballs to examine it and make decisions.

We need people making critical decisions because computers, although they can be fast and efficient, are stupid. They will search for and find exactly what they are told to look for, but they need people to tell them what that is. In the end, it is far more effective for a human brain to pick out the tell-tale traits of a terrorist than it is to try to describe those traits to a computer.

Computers are useful in filtering data according to set criteria, flagging it when criteria are met and then alerting someone. But eventually someone with reasoning power needs to make the decision whether the conditions identified by a computer amount to terrorism.

We don’t know exactly why the whistles did not go off when a person whose name was in a database purchased a one-way ticket with cash and then checked no luggage on the flight. That is the kind of correlation a computer should be good at. But the plot could have been identified well before that point, and even if the system is tweaked to blow the whistles when something like that occurs, we still need more people looking at the data.
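To make the point concrete, the kind of correlation described above is straightforward to express in code. The sketch below is a hypothetical illustration, not any real screening system's logic; the record fields, indicator rules and review threshold are all assumptions chosen to mirror the article's example.

```python
# Hypothetical sketch of rule-based correlation: the computer flags bookings
# where several independent indicators co-occur, then alerts a human analyst.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class BookingRecord:
    passenger_name: str
    on_watchlist: bool   # name already present in a database
    one_way: bool        # one-way ticket
    paid_cash: bool      # purchased with cash
    checked_bags: int    # number of checked bags


def risk_flags(record: BookingRecord) -> list:
    """Return the list of indicators this booking triggers."""
    flags = []
    if record.on_watchlist:
        flags.append("name matches a database entry")
    if record.one_way and record.paid_cash:
        flags.append("one-way ticket purchased with cash")
    if record.checked_bags == 0:
        flags.append("no checked luggage")
    return flags


def needs_review(record: BookingRecord, threshold: int = 2) -> bool:
    """Blow the whistle when enough indicators co-occur."""
    return len(risk_flags(record)) >= threshold
```

Note what the code does and does not do: it can correlate set criteria quickly and consistently, but deciding whether the flagged combination actually amounts to a threat is exactly the judgment call that still requires a trained analyst.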

If Obama's reforms are to work, if focus is going to be sharpened and people held accountable for improved results, we are going to need more trained, experienced people examining suspicious data, and that is going to take time. We should start finding, hiring and training those people now.