In The Minority Report, novelist Philip K. Dick spun a fanciful tale (later turned into a movie by Steven Spielberg) of a futuristic society practically devoid of crime. With the help of seers (known as precogs) who can spot criminal acts before they happen, the officers behind the PreCrime police force reduce felonies by 99.8 percent, and New York City goes murder-free for five consecutive years. Of course, the police have to arrest and imprison criminals before the thought of a misdeed even occurs to them. But who could argue with the results?
Does that scenario sound far-fetched? It shouldn't. While police departments today aren't even close to eliminating crime altogether, they are developing something akin to digital versions of precogs.
Thanks to innovations in data analysis and surveillance technology, law enforcement officials are increasingly able to predict who will commit crimes, and when and where, long before they occur. And as these technologies spread among law enforcement agencies, they're raising serious questions about the implications of pre-emptive policing.
Here, a few of the most cutting-edge predictive technologies being deployed across the country:
Suspicious-behavior cameras
San Francisco has installed surveillance cameras in its subway system that can purportedly identify criminals or terrorists based on their actions, before any crime is perpetrated.
The cameras are pre-programmed with a list of "normal" behaviors. Once they detect anything "unusual" (such as a person loitering, making sudden movements, entering a restricted area, leaving luggage in a crowded area, or groups of two or more approaching a solitary individual), guards are alerted via phone call or text message. Capable of tracking up to 150 suspects at a time, the cameras are also able to learn by building a memory of suspicious behavior that they use to determine potential criminal activity in the future.
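To make the idea concrete, here is a toy sketch of the kind of pre-programmed rule matching described above. Everything in it — the rule names, thresholds, and `Event` fields — is hypothetical and invented for illustration; it is not based on any real surveillance product.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str          # e.g. "loitering", "bag_left", "sudden_movement"
    duration_s: float  # how long the behavior has persisted, in seconds
    zone: str          # e.g. "platform", "restricted"

def is_suspicious(event: Event) -> bool:
    # Pre-programmed notions of "unusual" behavior, echoing the article's
    # examples: restricted areas, loitering, unattended luggage.
    if event.zone == "restricted":
        return True
    if event.kind == "loitering" and event.duration_s > 300:
        return True
    if event.kind == "bag_left" and event.duration_s > 60:
        return True
    return False

# A guard would be alerted only for events the rules flag.
alerts = [e for e in (Event("loitering", 400, "platform"),
                      Event("bag_left", 30, "platform"))
          if is_suspicious(e)]
```

The point of the sketch is that "suspicious" is whatever the rules say it is — a fact that becomes important in the bias discussion later in the article.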
In 2010, a similar system was installed in East Orange, N.J. It reduced police response times to mere seconds. More recently, those cameras have been outfitted with a red spotlight that will shine on the potential offender to signal that they are being watched and recorded.
Soon, surveillance cameras could become even more powerful with the addition of facial recognition technology. The Department of Homeland Security (DHS) is at work on BOSS (Biometric Optical Surveillance System), which uses powerful computers with facial recognition software hooked up to surveillance cameras to automatically find and track targeted individuals within a crowd.
Hostile intent scanners
Moving further into the realm of Minority Report, DHS is developing FAST (Future Attribute Screening Technology), a surveillance system designed to sense malicious intent by remotely reading changes in an individual's vital signs, like heart rate, temperature, and respiratory patterns, as well as physical cues like body language and eye movements. DHS believes these physical cues betray malicious intent, much like the assumptions underlying a polygraph, and claims the system yielded a 70 percent accuracy rate in lab tests.
Predictive analytics
On the data-driven side, an increasing number of police departments around the country, including Los Angeles, Washington, D.C., and Philadelphia, are using predictive analytics to stop crime before it happens. This computer software sifts through thousands of criminal cases and examines variables like geographic location, criminal records, and time of attack to predict where and when future crimes will be committed and by whom.
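The core of that sifting can be illustrated with a toy "hotspot" ranker: count historical incidents per location-and-time cell and surface the cells with the most activity. Real predictive-policing software is far more sophisticated, and the data below is invented purely for illustration.

```python
from collections import Counter

# Historical incidents as (neighborhood, hour-of-day) pairs -- hypothetical data
incidents = [
    ("downtown", 23), ("downtown", 23), ("downtown", 22),
    ("riverside", 14), ("downtown", 23), ("riverside", 2),
]

def top_hotspots(records, n=2):
    """Rank (place, hour) cells by historical incident count."""
    return Counter(records).most_common(n)

# The top cell here is downtown at 11 p.m., with three past incidents --
# the kind of output that would steer extra patrols to that place and time.
hotspots = top_hotspots(incidents)
```

Even this trivial version shows the shape of the approach: the software doesn't identify a culprit, it directs attention to where and when a crime is statistically most likely.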
While it may not be the Precrime Squad, the Virginia State Police used predictive analytics in the summer of 2011 to determine where and when the culprit behind a string of shootings would most likely strike next. Lying in wait at Arlington National Cemetery, police stopped Yonathan Melaku, who was behaving erratically and was found to be carrying a backpack filled with shell casings and homemade bomb-making materials. He eventually pleaded guilty and was sentenced to 25 years in prison.
While these technologies may be powerful law enforcement tools, they also raise serious concerns.
In an analysis of pre-crime police tactics and their effects in Europe, University of Brussels law professor Paul De Hert and researcher Rosamunde Van Brakel worry that the combination of pre-crime police tactics and modern surveillance technology "undermines the presumption of innocence." In other words, rather than simply deterring crime, law enforcement agencies are now seeking to actively pre-empt it, fundamentally altering the interactions between the police and citizens.
"Mass surveillance promotes the view…that everybody is untrustworthy," criminology professor Clive Norris testified to the U.K. House of Lords Select Committee on the Constitution. "If we are gathering data on people all the time on the basis that they may do something wrong, this is promoting a view that as citizens we cannot be trusted."
Advocates of pre-crime technology, however, downplay this new mentality and insist that technology like predictive analytics is worth it.
Zach Friend, a crime analyst with the Santa Cruz Police Department, which has seen a 19 percent reduction in crime since implementing predictive analytics software, defends the technology, stating, "It doesn't replace what [cops] do. When they get into those locations, they still need to be good cops." Unlike the precogs in Minority Report, predictive technology doesn't automatically lead to arrests, but rather guides law enforcement agents to potential crime scenes and points to who should be examined more closely.
Yet even if we take Friend's defense at face value, these technologies still have ramifications for reasonable suspicion, let alone probable cause.
Because computer software is pre-programmed with notions of what constitutes "suspicious behavior," De Hert and Van Brakel write that "questions need to be addressed about what is 'normal' and 'abnormal' behavior." Without public debate and transparency about how police pre-define suspicious behavior, these machines can encode the same biases and prejudices as the humans who program them.
When police searches are based on a computer algorithm, issues of transparency and accountability inevitably arise, and in the event that a case is challenged in court, "How do you cross-examine a computer?" asks Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia.
Defenders push back, however, and say these technologies can actually bolster accountability and help improve privacy.
"You can pinpoint the record of who has access to information, you have a solid history of what's going on, so if someone is using the system for ill you have an audit trail," says Mark Cleverly, the head of the IBM unit for predictive crime analytics, which has worked with police departments around the world.
Dismissing concerns over pre-crime technology's potential to create a police state like Minority Report, Cleverly tells AFP, "It was a great film and great short story, but it's science fiction and will remain science fiction. That's not what this is about."