US Intelligence Drowning In Information

The US intelligence community has more information at its disposal than ever. Unfortunately, it can't efficiently process it and make the necessary connections.

NPR begins its story on this (“Why America’s Spies Struggle To Keep Up”) by following the wrong trail–the old bureaucratic infighting and stovepiping meme:

Before Sept. 11, 2001, there were 16 intelligence agencies in the United States. But after the attacks, the 9/11 Commission recommended creating a 17th intelligence agency — the Office of the Director of National Intelligence (ODNI) — to coordinate intelligence operations.

The 16 already existing agencies didn’t react well, says historian and former intelligence analyst Matthew Aid.

“They hated the idea of a [so-called] ‘intelligence czar,’ ” he tells Fresh Air’s Dave Davies. “Each of the 16 intelligence agencies that existed before the creation of the ODNI [are] bureaucracies. They have a bureaucratic identity … and they love their independence.”

After much debate, the ODNI was created — but given almost no authority over the 100,000 or so spies who work for the Pentagon. The result? Essentially two separate spy networks within the intelligence community: the civilians who work for the 16 agencies reporting to the ODNI, and the 100,000 spies at the Pentagon who report to the undersecretary of defense for intelligence.

“They have separate budgets, they report to separate committees, and it is a structural nightmare,” says Aid.

In his book Intel Wars, Aid details how overlapping jurisdictions, bureaucratic policies and a glut of data have crippled the intelligence community in its war against would-be terrorists.

“You talk to officials who used to work or work today at the ODNI, and there’s just frustration,” he says. “I quoted one official as saying, ‘It would be nice if the boys over at the Pentagon let us know what they were up to,’ which I think gives a hint that says things could be more tightly controlled than they are right now.”

I’m sure that turf wars exist–they’re the nature of bureaucracy in general and of intelligence, with its understandable culture of secrecy, in particular–but the real point is this:

Before Sept. 11, Aid says, the U.S. had 200 drones collecting data all over the world. That number climbed to over 6,000 after the attacks. Many of these drones provide essential information for intelligence forces, says Aid, but there’s a problem: Mixed in with the good stuff is also a lot of nonessential information. Aid says intelligence analysts are drowning in the data — particularly because there aren’t enough analysts to sift through what’s potentially important.

“I’ve interviewed a number of collectors who worked in Iraq and Afghanistan, and one of their complaints is, ‘I’m sitting in a foxhole and I’ve got 3,000 emails coming in from Washington every morning with all the latest intelligence.’ And the guy said, ‘It’s wonderful that they’re sharing this stuff with me; I just wish they were a little more selective about what they were sending me.’ ”

“They’re essential, but the problem is … the amount of data is literally drowning the analyst on the order of something like 275 operators and analysts to analyze the result of each drone intelligence mission,” says Aid.

The main problem is that there aren’t enough people who can intelligently skim all this data and decide what’s worth passing on, so they pass it all on. The corollary problem is that agencies have different priorities and missions, and information that’s completely worthless to most can be essential to one or two. So, the raw information really has to be made available to everyone and the culling decisions have to be made at the local level.

Except that it’s really impossible to do. Even with unlimited budgets, it would be impossible to recruit and train enough competent people.

Collecting information, at least this sort of information, is easy. Performing competent analysis quickly enough to be useful to policy-makers and action officers is really, really hard.

FILED UNDER: General
James Joyner
About James Joyner
James Joyner is a Security Studies professor at Marine Corps University's Command and Staff College and a nonresident senior fellow at the Scowcroft Center for Strategy and Security at the Atlantic Council. He's a former Army officer and Desert Storm vet. Views expressed here are his own. Follow James on Twitter @DrJJoyner.

Comments

  1. michael reynolds says:

    I’m not surprised by this. (Of course if the Pentagon hadn’t been hunting gays they might have at least a few more analysts.) Who wants to do analysis for the intelligence community? The pay is civil service, the chance of meaningful advancement to serious income levels is minimal, and you’re drug-tested and vetted for everything up to and including chronic masturbation. Why would some smart kid choose this path?

  2. James Joyner says:

    @michael reynolds: It’s a bizarre lifestyle choice, to be sure. There are those willing to put up with a whole lot in the service of their country, but it’s really absurd.

  3. I’ve been retired for 16 years – but I can still agree with your conclusion.

  4. There’s a public choice theory issue here. At each stage along the way, a decision must be made about what to pass on and what to throw out. If you choose to throw out something that later proves to be important, even if it couldn’t reasonably have been known to be important at the time, you’re the one who’s going to be punished for it.

    If you pass it on and the receiver fails to act on it, they’re the one who’s going to be punished for it.

    So there’s a perverse incentive to pass on pretty much everything, no matter how useless.

  5. Rob in CT says:

    Maybe the problem is, much like with the military, the scope of what we’re trying to do.

    Perhaps if we were to reduce our role (judiciously – I’m not arguing for an instant sea-change in how we interact with the world), the strain would be reduced?

  6. tps says:

    @Rob in CT:

    I doubt it. A lot of intel product comes from the everyday ‘keeping an eye on things’: trying to make sure another Pearl Harbor or 9/11 doesn’t happen. It’s all got to be processed and made sense of somehow.

  7. nicolas says:

    This problem arises in many industries. Like the one I know of, finance.
    There will be a solution to tackle this and I think I might have part of the solution….
    We’ll see !!

  8. Tillman says:

    Sounds like the American SIGINT fetish is going strong.

  9. John Peabody says:

    This data is worthless! Delete it all! …but be sure to make copies first, just in case.

  10. Rob in CT says:

    @tps:

    Right. I’m suggesting that we might not have to keep an eye on *everything* if we weren’t quite so engaged all over the world.

    That might ease the flow of info a little bit. It wouldn’t change the nature of the problem, certainly.

  11. A funny thing happens with very large data sets. It actually becomes less necessary to completely review it all in order to return a significant answer. Google’s top page for a search is often useful, even though no human ever graded it.

    Basically, you are telling us that intel still operates on the Yahoo model, with curation rather than statistical significance. That’s actually old news. I remember a story that FBI agents could only search on one key, post-9/11. They were talking about adding a second search key, rather than just dumping it into a secure-clearance Google pool.

    Put another way, if a soldier gets a Siri-style adviser, expect her to be running on correlations, and not full comprehension of all available intel.

  12. @nicolas:

    This problem arises in many industries. Like the one I know of, finance.

    I just wrote of statistical methods. This is the state of the art in finance. There is no AI that can truly read the umpty-ump gigabytes of daily financial text, so what they do is mine for a simple pairing: stock symbol and happy/sad words.

    A high-speed trading system that finds a stock symbol near “disappoints” will go short in a fraction of a second. But it isn’t because it really knows. It can very easily be wrong, as in an article talking about the B&N Nook and Amazon Kindle in the same paragraph. Should it short the symbol the fewest words away from the sad word?

    They are tuning these algorithms as we speak.
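    For what it’s worth, the naive pairing described above can be sketched in a few lines of Python. This is a hypothetical toy with made-up word lists, not anyone’s actual trading system:

```python
# Toy sketch of the "stock symbol + happy/sad words" pairing heuristic.
# The word lists and scoring are illustrative assumptions, nothing more.

SAD_WORDS = {"disappoints", "misses", "plunges", "lawsuit"}
HAPPY_WORDS = {"beats", "surges", "record", "upgrade"}

def score_snippet(text, known_tickers):
    """Pair each ticker mention with the nearest sentiment word
    (by word distance) and return {ticker: +1 or -1}."""
    words = [w.strip(".,;:!?()\"'") for w in text.split()]
    signals = {}
    for i, token in enumerate(words):
        if token not in known_tickers:
            continue
        best = None  # (distance, sign) of the closest sentiment word
        for j, other in enumerate(words):
            o = other.lower()
            if o in SAD_WORDS:
                sign = -1
            elif o in HAPPY_WORDS:
                sign = 1
            else:
                continue
            d = abs(i - j)
            if best is None or d < best[0]:
                best = (d, sign)
        if best is not None:
            signals[token] = best[1]
    return signals

# e.g. score_snippet("BKS disappoints while AMZN beats estimates",
#                    {"BKS", "AMZN"}) → {"BKS": -1, "AMZN": 1}
```

    Run it on a sentence mentioning two companies and the weakness is obvious: the symbol closest to the sad word gets shorted, whether or not the bad news is actually about that company.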

  13. Dave Schuler says:

    Acquisition of information: easy. Analysis: hard.

    Another issue is that our system encourages specialists while generalists may make better analysts.

  14. Person of Interest is of course the fictionalized version of this future intelligence.

  15. Liberty60 says:

    @Rob in CT:

    Hmm, so it sounds like you are saying that micromanaging the entire fucking planet is difficult to impossible?

    Or are you just a communist who hates America?

    It’s funny that one of the arguments I used to use against socialism was that managing the entire economy centrally was impossible, even with the perfect set of ideal conditions.

    Sure enough, after the fall of Communism, one of the reports that came out was that the East German Stasi had a file on virtually every single man, woman, and child in the country; everything from your kindergarten grades to who you dated to what books you read was meticulously catalogued and cross checked.

    Yet it was the Stasi most of all who were shocked and surprised at how suddenly the end came. They simply couldn’t digest or analyze the vast sum of data they had.

  16. Gold Star for Robot Boy says:

    @michael reynolds: Define “chronic.”

  17. Barry says:

    @James Joyner: And throw in the corruption level (probably already endemic, but pushed up in the Bush-Cheney era). On another blog, it was pointed out that the NIE for Afghanistan was released; it was pessimistic. The DoD intelligence estimates are more optimistic.

    And wrong.

    For ~9 years now.

    Dealing with all of the crap, only to have somebody flop a report back on your desk and order a rewrite to please the brass, has got to be a morale killer. And that’s before you get blamed for the rewrite not being true.

  18. Barry says:

    @john personna: “A funny thing happens with very large data sets. It actually becomes less necessary to completely review it all in order to return a significant answer. Google’s top page for a search is often useful, even though no human ever graded it.”

    The point with Google is that there’s vast human input. People are constantly searching and making choices from the results. And Google has a massive amount of data to pull from, and massive breadth – they can tie together results which would be compartmentalized in the intelligence world.

  19. Barry says:

    @john personna: “A funny thing happens with very large data sets. It actually becomes less necessary to completely review it all in order to return a significant answer. Google’s top page for a search is often useful, even though no human ever graded it.”

    Addendum – this statement is only true for some technical meanings of ‘large’, and only under certain circumstances.

    “…so what they do is mine for a simple pairing: stock symbol and happy/sad words.”

    Yes, leading to the Anne Hathaway/Berkshire Hathaway effect.

  20. Barry says:

    @Dave Schuler: “Another issue is that our system encourages specialists while generalists may make better analysts. ”

    I’ve heard it the other way; generalists get promoted, while specialists are stuck in place. The end result is a management of generalists.