When it comes to nation-states, you'd think that the smaller an event is, the more likely the U.S. intelligence community is to miss it. If something big is going to happen, lots of moving parts are going to change from a state of rest to a state of action. There will be chatter. There will be activity. The big intelligence dragnet will surely pick up something and send a pulse up the food chain.
But historically, the opposite has been true: Huge events, events that should have been predictable (we think), seem to take us by surprise, over and over. To name a few examples: Nuclear tests in North Korea and India, the 1973 Arab invasion of Israel, the Iranian revolution, the vulnerability of countries to revolution, and now, the passive-aggressive invasion of Crimea by Russian troops all caught America unaware.
The intelligence community will respond that secrecy ensures its successes stay hidden while only its failures are made public, but it's very difficult for intelligence officials to explain how they missed what now seems obvious to everyone in hindsight.
In the spy world, this mechanism, the alerting of policymakers to sudden and imminent changes in the global condition, is called "warning," and its function is to provide decision advantage so that the president can respond more intelligently.
Like many other intelligence disciplines, warning relies on what political scientists like to call "priors." Priors are, basically, biases, or preconceptions that influence the confidence one has in drawing a conclusion. Separating the novel from the prior is an art form. In your mind, maybe you're thinking that some NSA or Navy intercept post failed to pick up a change in the traffic pattern used by the Black Sea Fleet, or that maybe U.S. reconnaissance satellites didn't pick up the tempo at which Russian border troops were exercising. Or maybe there was some other failure of technical intelligence.
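The "priors" idea can be made concrete with a toy Bayesian update. This is a sketch with invented numbers, not anything resembling a real assessment; it just shows why a strong prior makes ambiguous indicators move an analyst's estimate only modestly:

```python
# Toy Bayesian update: how a "prior" shapes a warning estimate.
# All probabilities here are invented for illustration.

def update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return posterior P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Analyst's prior: invasion is unlikely, say 5 percent.
prior = 0.05

# New indicator: troops massing near the border. Likely if an
# invasion is coming (90%), but also fairly common during
# ordinary exercises (30%) -- the evidence is ambiguous.
posterior = update(prior, 0.90, 0.30)
print(round(posterior, 3))  # prints 0.136
```

With a low prior, even an indicator three times more likely under "invasion" than "exercise" leaves the estimate well under 50 percent, which is one way to see why warning analysts hedge.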
But these ex post facto explanations fail to account for the challenges of warning. Warning, generally, relies on evaluating patterns of intelligence and integrating them with known priors, some of which are assumptions. Warning is controversial because although its function is the essence of what we expect an intelligence agency to be able to do, getting it right is often beyond the ken of humans. Of all intelligence analytical disciplines, warning requires the most speculation, the greatest degree of insight into the mind of a foreign leader, and, importantly, the most efficient and proper use of all-source technical intelligence to test conclusions.
What is warning? If the CIA and the Defense Intelligence Agency both noticed that the Black Sea Fleet and Russian ground troops were acting as though they were preparing to invade, and this conclusion alone was passed up the food chain, wouldn't this be enough? The question that comes next — will these moves cross the line between exercise and war? — is much more difficult to answer, and, in any event, it is up to policymakers to use the warning to help forestall what now appears to be inevitable.
That's always the push-back.
Policymaker: Hey, how come you didn't predict that Putin was gonna invade?
DIA: We told you that they were massing forces, that propaganda in Crimea was getting kinda intense, that our intercepts showed an increase in military traffic in the region, that Russian generals were called to a meeting at the Kremlin. With these facts in hand, you had enough time to game out your response either way.
That's not to excuse the CIA, the DIA, or the Office of the Director of National Intelligence for guessing wrong, or to suggest that they weighted all their priors correctly.
But as we're learning, the CIA actually wouldn't hazard a firm guess about Putin's next move, probably because analysts were not confident in making one. The DIA did guess, and it got it wrong. I wonder what the State Department's Bureau of Intelligence and Research projected.
The fact is, intelligence isn't something that's handed to policymakers on a silver platter for them to consume. They are cooks in the kitchen, too. The community gives them guesses and information, and then it's up to those in power to use the information properly. In retrospect, it's easy to say that of course the KGB-trained Vladimir Putin, whose aim has been to retake Crimea and reclaim the territorial glory of the USSR, would take advantage of the chaos in Kiev to force his way onto the peninsula. It's easy to say that, though, only if you assume a finite universe of possible alternatives.
Certainly, the prospect of an invasion was considered as a potential move. The fact that different agencies concluded differently is not a sign of failure. It's a sign that, given the same information, smart people can come to different conclusions. It's a reminder that intelligence and prognostication are two entirely different disciplines, and that policymakers who expect intelligence analysts to behave like clairvoyants are always going to be disappointed.
If something did go wrong here, if there was a piece of intelligence that was missed, I'm guessing that there was a collection gap somewhere down the line. The U.S. devotes billions of dollars to gathering and analyzing intelligence on Russia, but we only have so many reconnaissance satellites, for example, and there are a lot of priorities that often trump, let's see, a 24/7 overhead view of Russian troop formations. Maybe the U.S. lacks good human intelligence. Maybe the Russians practiced strategic deception, knowing that the CIA or DIA was watching. In these cases, the U.S. can do something: It can spend more money on Russia. It can devote more national technical means to Russia. It can integrate signals intelligence resources more effectively. It can try to use counterdeception more efficiently.
What it can't do is predict the future.