Law: Amazon’s facial recognition software
Amazon is under fire for “essentially giving away facial recognition tools to law enforcement agencies,” said Elizabeth Dwoskin in The Washington Post. In late 2016, America’s biggest e-commerce company introduced Rekognition, an online service that helps companies and other customers identify faces and objects in images in real time. The technology “works through pattern recognition”: Customers put images into a database, and the software uses artificial intelligence to scan other known images for a match. Amusement parks have used Rekognition to locate lost children, and during the royal wedding last month, the U.K.’s Sky News deployed it to identify guests. But Amazon has also been pitching the technology to police departments across the country, said Nick Wingfield in The New York Times. The Orlando Police Department in Florida and the Washington County Sheriff’s Office in Oregon are already customers, and are reportedly paying very little in exchange for letting Amazon cite their experience in sales pitches. Last week, more than two dozen civil rights organizations demanded that Amazon stop selling Rekognition to law enforcement, saying it could become “an instrument of mass surveillance.”
“We shouldn’t be using unaudited facial recognition systems on public streets,” said Russell Brandom in TheVerge.com. Even if Rekognition is used only for legal purposes, facial recognition software has long struggled with higher error rates for women and people of color—“error rates that can translate directly into more stops and arrests.” It’s estimated that more than 130 million American adults are already in facial recognition databases, yet Amazon has not revealed what testing, if any, it has done on false identifications. The potential for abuse is obvious, said Maya Kosoff in VanityFair.com. In China, facial recognition is “widely deployed,” used for everything from ticketing jaywalkers to pinpointing accused thieves in huge crowds. “It’s not hard” to envision how the technology could be used here in the U.S. “in the wake of another deadly school shooting, or under a federal government that’s openly hostile to immigrants.”
I wanted to give police here in Orlando the benefit of the doubt, said Scott Maxwell in the Orlando Sentinel. The police chief repeatedly assured me last week that Amazon’s software is used only inside police headquarters, where it is first being tested on officers’ faces. The next day, however, he abruptly “changed his story,” acknowledging that the department has three facial recognition cameras operating in downtown Orlando. I “don’t inherently object to law enforcement using technology—in public places—to fight crime and stop bad guys. But I can’t support any program when I don’t know what’s really going on”—and neither should the public.