Facial recognition is for-profit policing
The only thing standing between you and a jail cell is your ability to prove that you are not the person in a grainy video taken five months or a year ago
On a Thursday evening five months ago, one day before his 42nd birthday, Robert Julian-Borchak Williams was arrested in front of his wife and children at his home in Farmington Hills, Michigan. Asked what crime he was accused of committing, police refused to say. In response to questions about where her husband was being taken, one officer told Williams's wife that she ought to "Google it."
According to The New York Times, Williams was then brought to a police detention center. His mugshot was taken, as were samples of his fingerprints and DNA. He spent the night in jail. It was not until the middle of the next day that he would learn why he was there: because of a computer.
That, at any rate, is the excuse officers made when it became clear that Williams was not the man identified by an employee of a private security corporation, who passed on surveillance footage from Shinola, the Detroit-based luxury goods manufacturer, to the police department, which in turn ran the low-quality image through a database of some 49 million pictures, and on this basis arrested him for allegedly stealing $3,800 of merchandise from a store he had not visited in six years.
I say "excuse" because at no point in the course of the investigation — if that is the right word for what took place here — did officers attempt to verify or even question the identification yielded by the computer system. No one seems to have questioned whether the large Black man in the St. Louis Cardinals hat who appeared in the Shinola footage even slightly resembled Williams; no one bothered to ask in advance of his arrest whether he owned such a hat, whether he had been to Shinola recently, or indeed to put questions of any kind to him, including whether he had an alibi. (He did: it would in fact have taken all of 30 seconds for Williams to prove his innocence beyond any doubt, reasonable or otherwise, via his Instagram feed.)
What happened to Williams — whose case was recently thrown out, albeit without prejudice — is being explained away as the result of a software error. Such an explanation conveniently elides the question of why the case was taken up in the first place and, more important, why it was carried out in this fumbling manner. The algorithm provides cover for the shoddy investigation that would not have been undertaken without it.
This was not police work. Officers in Williams's case played the role of middlemen between two private companies that exist for the sole purpose of profiting from ensuring that men like him can be locked up with horrifying pseudo-efficiency on the basis of a supposedly disinterested technological assessment. This de facto privatization of police work is the logical continuation of a trend that began in 1984, when the hideously named Corrections Corporation of America (now CoreCivic) was given management of a prison in Shelby County, Tennessee. Since then, hundreds of thousands of Americans have lived in the custody of for-profit jailers.
The facial recognition technology that landed Williams in jail for a crime he could not possibly have committed is one of the greatest dangers to peace and justice in this country. It has the potential to be weaponized against the entire population. In the billions of hours of security footage taken in nearly every public location, to say nothing of the virtually limitless number of images of ourselves we have all made available online, it has an infinite amount of material to work with. The only thing standing between you and a jail cell is your ability to prove that you are not the person in a grainy video taken five months or a year ago. Here's hoping you have a tweet that can bail you out.
As I write this, some municipalities are banning the use of facial recognition technology in police work: Oakland, San Francisco, and six cities in Massachusetts, including Boston and Cambridge. This is a good thing, but its value will be limited if state police and federal law enforcement are not similarly restricted, and I, for one, am dubious about the prospects of a federal ban. A proposed moratorium in February went nowhere. The technology is already used to screen passengers for international flights and in countless other situations, many of them presently unknown.
No American should be subject to the techniques being used by the authoritarian regime in China to carry out its depraved campaign of repression against the Uighur minority. Everyone remembers what Ben Franklin said about sacrificing liberty for security. In this case the stakes are even clearer because there is no proposed tradeoff between the two. Instead we are being asked to sacrifice both our freedom and our safety so that robots can put us in jail for no reason.