Amazon Is Selling Cops Its Facial Recognition Tool. That’s A Bad, Potentially Racist Move.

Facial recognition software often does not properly identify darker-skinned people.

Law enforcement agencies are using Rekognition, Amazon’s facial recognition system, to identify people and track them in real time. This isn’t a case of an outside party making opportunistic use of an emerging technology. The American Civil Liberties Union of Northern California recently obtained documents that show just how closely Amazon is working with cops around the country to implement its product.

In one email, an account manager for Amazon Web Services eagerly offered up his or her services to a Washington County, Oregon, employee: “I am the Account Manager for AWS covering Oregon, and I noticed that you were leveraging our new Rekognition service. Because the service is so new, we are reaching out to customers to make sure they get all the support they need to succeed with their particular use case.”

As the people “leveraging” the Rekognition technology may or may not realize, facial recognition software tends to be racist.

The artificial intelligence-powered system can analyze a face and almost immediately run it against databases containing tens of millions of faces to return likely matches. Amazon describes police work as a “common” use of its tool and pitches the technology as an “easy and accurate way” to monitor public spaces and identify “people of interest.” Law enforcement agencies in Orlando, Florida, and Washington County have praised the software.
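
For a concrete sense of what “running a face through a database” looks like, here is a minimal sketch using boto3, AWS’s publicly documented Python SDK. The region, collection name, image file, and similarity threshold are hypothetical placeholders, not details from any agency’s deployment.

```python
import boto3

# Hypothetical setup: the region, collection name, file name, and
# threshold below are illustrative, not any agency's real values.
rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("probe_photo.jpg", "rb") as f:
    probe_bytes = f.read()

# Search the collection for faces similar to the largest face in
# the probe image; matches come back ranked by similarity score.
response = rekognition.search_faces_by_image(
    CollectionId="example-face-collection",
    Image={"Bytes": probe_bytes},
    MaxFaces=10,              # return up to 10 candidate matches
    FaceMatchThreshold=80.0,  # minimum similarity, in percent
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"{face['ExternalImageId']}: {match['Similarity']:.1f}% similar")
```

Note that the match threshold is left to the caller: lower it, and the API returns more, and weaker, candidate matches.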

Orlando has the ability to identify faces in real time via a network of cameras located across the city. Washington County has built a database of at least 300,000 mugshots to use with Rekognition.
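
Building such a database amounts to creating a Rekognition face collection and indexing photos into it. A minimal sketch, again with boto3 and invented names:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Hypothetical names, for illustration only.
COLLECTION_ID = "example-mugshot-collection"
BUCKET = "example-mugshot-bucket"

# Create the collection once, then index each booking photo into it.
rekognition.create_collection(CollectionId=COLLECTION_ID)

# Rekognition stores a face vector (not the image itself) and ties
# it to the ExternalImageId supplied by the caller.
rekognition.index_faces(
    CollectionId=COLLECTION_ID,
    Image={"S3Object": {"Bucket": BUCKET, "Name": "booking-00001.jpg"}},
    ExternalImageId="booking-00001",
    MaxFaces=1,  # index only the largest face in the photo
)
```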

But the technology is fallible: it often fails to properly identify darker-skinned people. Amazon says the Rekognition software can identify “all faces in group photos, crowded events, and public places such as airports,” and can detect up to 100 people in a single image. Such broad capabilities, coupled with error-prone technology, could lead to the wrong people being targeted during investigations.
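
The 100-person figure corresponds to Rekognition’s face-detection call, which returns up to 100 of the largest faces it finds in an image. A short sketch under the same hypothetical setup:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Hypothetical crowd photo; DetectFaces returns up to 100 of the
# largest faces, each with a bounding box and a confidence score.
with open("crowd_photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["DEFAULT"],
    )

print(f"Faces detected: {len(response['FaceDetails'])}")
```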

Research from Joy Buolamwini of the MIT Media Lab found that three AI-powered facial recognition systems, from Microsoft, IBM, and Face++, identified men more accurately than women and performed better on lighter-skinned faces than on darker-skinned ones, despite fairly high accuracy overall.

A 2016 study from Georgetown Law’s Center on Privacy and Technology found that over 117 million American adults are included in facial recognition databases used by law enforcement agencies. Clare Garvie, one of the study’s lead researchers, said at the time that the databases have a disproportionate effect on African-Americans: black people are overrepresented in the databases, yet the software identifies them less accurately.

“If police are looking for an African-American suspect, they may miss even if that person is in their database — it may not find that person,” she said. “But these systems are not designed to give no for an answer. They’re designed to give a list of possible matches. So if they don’t find the right person, they provide a list of the wrong people — and that will happen more with African-Americans.”

Academic research has not yet examined the accuracy of Amazon’s product, but history tells us what happens when black people are thrown into the panopticon. Local police and the federal government have a long record of surveilling social movements, most notably through COINTELPRO, the civil rights-era FBI program that worked to stifle progressive organizations and black social movements.

In an August 2017 report, the FBI claimed that “black identity extremists” concerned with “alleged police brutality” were likely to take up arms against law enforcement. The Los Angeles Police Department already monitors the “suspicious activities” of civilians through the Nationwide Suspicious Activity Reporting Initiative; an inspector general’s audit of the program in January 2015 found that more than 30 percent of the reports involved black residents, even though black people make up only 9.6 percent of the city’s population. Baltimore’s police department tracks residents’ cellphone use and films their movements with drones.

The Border Patrol has expressed a desire to build face-recognizing drones and has tested facial scans at airports to track undocumented immigrants who may have overstayed their visas. (Just over 1 percent of all travelers overstayed their visas in 2016.) Former Congressman Jason Chaffetz (R-Utah) also proposed using the software to track undocumented immigrants.

“We already know that facial recognition algorithms discriminate against Black faces, and are being used to violate the human rights of immigrants,” said Malkia Cyril, executive director of the Center for Media Justice, in a press release.

“We know that putting this technology into the hands of already brutal and unaccountable law enforcement agencies places both democracy and dissidence at great risk,” she added. “Amazon should never be in the business of aiding and abetting racial discrimination and xenophobia — but that’s exactly what Amazon CEO Jeff Bezos is doing when he sells these loosely regulated facial recognition tools to local police departments.”
