What if Google ruled North Korea?
Wired's vision of a surveillance utopia is basically that
Kevin Kelly of Wired magazine wants us to welcome the future, where the ubiquity and power of surveillance technology make our lives completely transparent to society, government, and even ourselves. It will be fine, he says, if we just use the same technology to watch the watchers:
Everything that can be measured is already tracked, and all that was previously unmeasureable is becoming quantified, digitized, and trackable.
We're expanding the data sphere to sci-fi levels and there's no stopping it. Too many of the benefits we covet derive from it. So our central choice now is whether this surveillance is a secret, one-way panopticon — or a mutual, transparent kind of "coveillance" that involves watching the watchers. The first option is hell, the second redeemable. [Wired]
Where to begin? First, not everything that can be measured is tracked. Lots of crunchable data about our lives is not totaled in a margin anywhere — we are not actually plugged into the Matrix yet. We still make cash transactions, for instance, and I highly doubt anyone could establish to a judge's satisfaction whether I took a walk around my town today. Assuring us that everything measurable is "already tracked" is a rhetorical trick meant to quell any objections to tracking it in the future.
Second, Kelly's assertion that "all that was previously unmeasureable is becoming quantified, digitized, and trackable" is false. Almost "all" that was previously unmeasurable will remain so simply because most of human experience doesn't actually translate into something you put in Mongo databases.
But far worse is Kelly's simple fatalism. There's no stopping this, says Kelly, because mass surveillance "is the bias of digital technology." Wrong again. Digital technology doesn't have biases; it does what we program it to do. The desire for surveillance is as old as espionage itself. A real bias towards mass surveillance of society exists, but it comes from a socio-political context geared toward what James Burnham called "managerialism," the mid-20th century ideology that promised the rule of experts. Big data just gives managerialism a steroidal strength and rage. It also gives the digirati and their apologists a leg up on sociology majors in the scramble for managerial authority.
Let's test Kelly's rhetoric by applying it to another technology: firearms technology means we will all be armed with semi-autos with the safety off, because we covet protection and "that's what the bullets want"? Does anybody buy that?
But it actually gets even more ludicrous. He continues,
We shouldn't be surprised by this bias because transparency is truly ancient. For eons humans have lived in tribes and clans where every act was open and visible and there were no secrets. [Wired]
That's right, Kelly argues that not-knowing-everything about everyone is a historical novelty. Privacy and discretion are barely older than Pogs when you think about it, and just as unnecessary. No one in those long-ago eons closed their tents, really, or spoke to each other out of earshot of others. Our "bias" towards mass surveillance and a wealth of data about not just everyone in our local tribe but everyone on planet earth is driven by an evolutionary desire to return to a pre-civilizational state of privation — who knew?
Occasionally, Kelly simply asserts things about this future in which we're all monitored and monitoring the monitors:
The cold justice of every tiny infraction by a citizen, whether knowingly or inadvertent, would be as inescapable as the logic of a software program. Yet we need the humanity of motive and context. One solution is to personalize justice to the context of that particular infraction. A symmetrically surveilled world needs a robust and flexible government — and transparency — to enforce adaptable fairness. [Wired]
In other words, there will be some serious debugging to do. But the part about every tiny inadvertent infraction being coldly logged and infallibly punished is a comfort. Whereas under constitutional government we aimed for an approximation of law and order, and sent police to the sites of breakdown, in the "coveillance" future, we will seek perfect obedience at all times through harmonization with an omnipresent, omnicompetent law enforcement system. It's New Jerusalem on a Pentium chip, or North Korea with hip multi-colored Google as Supreme Leader. At The New Atlantis, Alan Jacobs writes that Kelly has asserted a new theology, one that demands we desire such a future. "[The] one remaining spiritual discipline in Kelly's theology," Jacobs writes, "is learning to love Big Brother."
Kelly's vision of the future is more straightforwardly dystopian than any of the other controversial visions of a libertarian "opt-out" society emanating from Silicon Valley. He believes human life can be, and ought to be, reduced to data points, that people will consent to be ruled like a mere cell in an Excel spreadsheet and will even begin to understand themselves in the rudimentary categories of computation.
His essay is an occasion to remind ourselves that the future doesn't unfurl out of present trends extrapolated into infinity; it is contested. And we have older, rather durable technologies that can prevent us from being dropped into a digitized fishbowl: the art of political resistance, for one. Our Constitution, another.