I've been thinking about RoboCop lately.
In that landmark 1987 dystopian satire by Paul Verhoeven, the titular character is actually the scrappy B-team's alternative to the corporation's main crime-fighting product: a terrifying, fully automated peacekeeping machine with obvious military applications, being test-driven against America's criminal class.
When we meet the ED-209 enforcement droid, though, it fails its first test, in the corporate boardroom. An employee is told to threaten it with a gun, but when the employee puts the gun down, the robot fails to recognize that fact, and obliterates the innocent man in a barrage of automatic weapons fire.
Its programming was faulty, and so it was rejected in favor of an alternative, RoboCop, who was at least semi-human. Because you couldn't put an indiscriminate killing machine on the streets, not even in a dystopian future Detroit. The public wouldn't stand for it.
Watching the dash cam video of the shooting of Philando Castile, a black motorist killed by a cop in Minnesota, I can't help wondering whether we wouldn't accept the ED-209 today. Indeed, in one particular way, I wonder whether we might be better off: If the ED-209 were to malfunction spectacularly, everyone would see the inadequacy of responding by putting the robot on trial.
St. Anthony Police Officer Jeronimo Yanez is clearly responsible for his actions on July 6, 2016, in both a moral and a legal sense. But from watching the footage of the shooting, it's plain that his actions sprang not from malice or cruelty but from pure, blind panic — a panic that his partner did not participate in, and for which no adequate justification has been provided.
As David French has argued in a pointed criticism of the verdict in the case, irrational panic is not supposed to be a legal defense against culpability. But by the same token, it's easy to see why convincing a jury beyond a reasonable doubt that an officer's fear was unreasonable, which is the standard for criminal conviction, is so difficult. In a sense, jurors are reluctant to consider themselves the peers of an accused officer in such a circumstance, and thus to pass judgment on that officer's judgment, however fatally poor.
But what if Officer Yanez were a robot?
In that case, there would be no question of morally identifying with the officer, or of a juror questioning whether he or she could really know what it's like to be in that situation. The case would likely be open and shut. It just wouldn't be a case against the robot, but against the manufacturer, who put an incredibly dangerous machine on the street without adequately testing whether it functioned properly.
Bluntly, if Officer Yanez were a robot, the corporation responsible for building him would be staring at a massive lawsuit, and a very expensive product recall. Why isn't the same thing true of the St. Anthony Police Department?
Yes, there's a legitimate question of justice with respect to Officer Yanez's culpability. But the far more important question of justice relates to how someone who would malfunction this badly could have wound up in his position in the first place.
Think of the question in terms of incentives and deterrence. Who really needs to be deterred from doing something that could get an innocent person killed — the cop who might panic? Or the people who trained him, vetted him, assigned him, supervised him, and run his department? Who actually has the power to improve outcomes in these kinds of situations?
The investigations of the St. Anthony Police Department since the shooting have focused on the question of racial bias. But the miscarriage of justice is not primarily that a disproportionate number of African Americans are shot by panicky cops, but that anyone was shot while plainly complying with an officer's orders. If suspects' lives matter, then an outrageous failure of training that gets an innocent person killed should automatically trigger questions about that training and the management of the department, questions that could lead to serious consequences for the department and even the city as a whole.
There are genuinely bad apples out there — corrupt cops, brutal cops, deeply racist cops. They are a small minority of our nation's law enforcement personnel, and many of the cops who are being acquitted in killing after killing after killing are being acquitted substantially because they do not fit that profile. But that doesn't make the people they killed any less dead, or their deaths any less unjust.
If it were robot cops killing innocents, we wouldn't be focusing on removing bad apples. We'd be focused on rewriting bad apps.