The NSA's big problem, explained by the NSA

June 19, 2014, at 9:17 AM

Among the new trove of classified documents published by Der Spiegel is a rather academic discussion, in the NSA's own foreign affairs journal, of the differences between American and German signals intelligence collection.

One passage in particular stands out, as it highlights how the Germans give far more weight to privacy than the NSA does.

For the Germans, "...spam filters are used to process large data volumes. Selected traffic is passed through an automated privacy protection system, ensuring analysts cannot view German protected traffic. On-site BND analysts then manually assess all selected traffic to determine intelligence value." (BND refers to the German foreign intelligence agency, the Bundesnachrichtendienst.) It continues:

NSA analysts discussed NSA's "hunt versus gather" philosophy, our multi-stage selection and filtering process, and the evolution of DNI targeting systems from GRANDMASTER to WEALTHYCLUSTER, and in the future, TURMOIL. BND appeared especially interested in the TURMOIL approach of scanning and making judgments at the packet level prior to sessioning.

...NSA and BND use opposite selection and filtering approaches. Where NSA primarily relies on equipment for selection ... and analyst minimization for privacy protection, the BND relies on analysts to manually scan traffic for selection, and then equipment to filter data for privacy protection.

Full use of NSA DNI processing systems and technologies at the JSA will be key to influencing the BND to alter their strategic DNI approach.

Let me translate. The NSA has chosen the "gather" philosophy: collect as much information as possible, from everywhere and anywhere; use sophisticated technology and analytics to figure out automatically what's important; determine targets and selectors by algorithms and metrics; and then, once the data has "won" this Darwinian competition to reach the human analyst's desk, have the analyst determine whether the target is legitimate.

BND, by contrast, employs privacy protection technology before raw intelligence reaches an analyst. The analyst therefore sees only the data that survives pre-determined screening criteria. In real terms, that means the BND programs a bunch of domestic German domain names and IP addresses into its system and asks the system to delete the "digital network information" — metadata and such — that matches them.
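The match-and-delete step described here can be sketched roughly as follows. This is purely illustrative — the selectors, field names, and record format are all hypothetical, not anything drawn from the documents:

```python
# Illustrative sketch of a BND-style pre-ingest privacy filter:
# records matching protected (domestic) selectors are deleted before
# any analyst sees them. All selectors and fields are hypothetical.

import ipaddress

# Hypothetical protected selectors: domestic domain suffixes and
# IP ranges (192.0.2.0/24 is a reserved documentation range).
PROTECTED_DOMAINS = {".de"}
PROTECTED_NETWORKS = [ipaddress.ip_network("192.0.2.0/24")]

def is_protected(record):
    """Return True if a record matches a protected domestic selector."""
    domain = record.get("domain", "")
    if any(domain.endswith(suffix) for suffix in PROTECTED_DOMAINS):
        return True
    try:
        addr = ipaddress.ip_address(record.get("ip", ""))
    except ValueError:
        return False
    return any(addr in net for net in PROTECTED_NETWORKS)

def privacy_filter(records):
    """Yield only records that survive screening; protected traffic
    is dropped outright, never stored or shown to an analyst."""
    for record in records:
        if not is_protected(record):
            yield record

traffic = [
    {"domain": "example.de", "ip": "192.0.2.7"},      # protected: deleted
    {"domain": "example.org", "ip": "198.51.100.9"},  # passes to analyst
]
visible = list(privacy_filter(traffic))
```

The key design point, per the documents, is where the filter sits: here it runs before ingest, so the protected records never exist in storage at all.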

The NSA's cable taps do this a little bit. But the agency errs on the side of gathering information. It empowers (and obligates) the analyst herself to determine, using both her brain and other data, whether a selector, a target, is legitimate and foreign.

The BND approach, if adopted by the NSA, would probably reduce the number of so-called over-collects — where U.S. persons' information is inadvertently ingested and stored. The BND does not store the content that it deletes.

The NSA justifies its approach, generally, by pointing to the scope of its mission. It would be, the agency believes, far too time-intensive to manually assess traffic for intelligence value. It is much easier to let computers see patterns and make matches — there is just so much damn data — and then "minimize" any domestic data that's collected. The NSA would also say that there is simply no other way to prosecute its mission.

I think the NSA might be correct. But I also see, from this assessment, that other countries have determined that the balance between collection and privacy can tip in the opposite direction and still be steady enough to get the job done.
