People who tuned in to Sky News' livestream of the Royal Wedding over the weekend got a glimpse of the future of technology: An Amazon service called Rekognition identified celebrities' faces and labeled them with on-screen captions. Can't place that familiar-looking face? Now you don't have to. How nifty!
But what seemed like a cute addition to the broadcast took a more sinister turn on Tuesday when the ACLU published a report on other ways Amazon is deploying Rekognition. This facial-recognition technology is being hawked to police departments all over the country as a way to cheaply track and catch suspects, and some, such as the Orlando PD, have already begun testing it. An Amazon director has even bragged that the service could be used by Orlando to "find the whereabouts of the mayor through cameras around the city," The New York Times reports.
The adoption of cutting-edge technology by police forces and other arms of government isn't new, of course. But the reports about Rekognition come on the heels of revelations about the tech world's broader involvement in building instruments of surveillance and tools for the military, almost a new "military-technology complex" to replace the military-industrial complex of the 20th century. As technologies like AI and machine learning become more commonplace and sophisticated, consumers are going to have to ask whether they want to support companies that dabble in the business of war and law enforcement.
The big tech companies are already dabbling, however. Google recently saw something of an internal revolt over whether the company should be providing its AI to the U.S. military to improve the capacity of drones to recognize people and objects. Some 4,000 employees signed a letter asking CEO Sundar Pichai to pull the company out of Project Maven, as the operation is known.
Google is not alone. Just this week, Microsoft won a contract to provide cloud services to 17 intelligence agencies. It's part of a broader push in which tech companies are bidding to provide cloud services for the Pentagon, a deal that could be worth billions.
And it's that entanglement that is the core of the issue here. For giant companies like Google or Microsoft, interaction with government is almost inevitable. Both companies already provide software and hardware for governments around the world, and do so because government contracts, military ones in particular, provide large and stable sources of revenue.
In that sense, the mixing of Silicon Valley and the state is just an extension of the obvious reality that governments need to buy stuff too.
But there is something unique about digital technology that bears reflection. Technologies like artificial intelligence and the cloud enable a whole host of applications: facial recognition, enormous databases of information like DNA or consumer behavior, crime prediction, battlefield analysis, and many more. Some of these uses might be quite harmless, but others are the very stuff of dystopia, because they enable a level of surveillance that simply wasn't possible before.
Consider facial recognition technology applied at large scale. It would dramatically increase the capacity of police departments or intelligence agencies to track people anywhere they might be recorded. If we lived in a perfect world, perhaps this wouldn't be so bad; but in the real world, the potential for misuse is high. In fact, China is already using this kind of technology to track its citizens' movements, assigning them "social scores" that can be dinged for infractions as minor as jaywalking. The U.S. might never adopt this particular system, but it's easy to see how the country could embrace its own unique dystopia. Would America's communities of color, for instance, bear the brunt of government surveillance?
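To make the tracking-at-scale point concrete, here is a minimal sketch of how systems like this typically work under the hood: each detected face is reduced to a numeric "embedding" vector, and two faces are declared the same person when their vectors are similar enough. This is not Rekognition's actual code; the function names, toy vectors, and threshold below are invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, reference, threshold=0.9):
    """Declare two face embeddings a match when their similarity clears the threshold."""
    return cosine_similarity(probe, reference) >= threshold

# Toy embeddings: two camera captures of the same person, plus a stranger.
person_a_cam1 = [0.90, 0.10, 0.30]
person_a_cam2 = [0.85, 0.15, 0.32]
stranger      = [0.10, 0.90, 0.40]

print(is_match(person_a_cam1, person_a_cam2))  # True: near-identical embeddings
print(is_match(person_a_cam1, stranger))       # False: dissimilar embeddings
```

The unsettling part is how cheap this comparison is: matching one probe face against millions of stored embeddings is just millions of these arithmetic operations, which is exactly what lets a camera network track a person across a city.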
The world is on the cusp of a profound shift in the capacity of both the state and companies to surveil, track, and influence people's lives. Artificial intelligence can be found in drones but also in the technology that decides whether someone gets a loan or gets parole. The responsible use of this incredibly powerful technology must be subject to intense scrutiny and public pressure.
After all, while a tool like facial recognition can be a fun addition to a wedding spectacle, the marriage of digital technology and the new military-technology complex could end up being far more sinister.