Microsoft Is Developing An App That Can Predict Crimes Of The Future

Conjuring images of the dystopian short story Minority Report, Microsoft is developing a new program it says can accurately predict which inmates will wind up back in jail within six months of release.

“It’s all about software that has the ability to predict the future,” said Jeff King, a senior program manager at Microsoft, in a webcast with police departments earlier this year that went virtually unnoticed. (A YouTube video of the broadcast had about 100 views as of last week.)

King said the program, which plugs standard inmate data into a complex algorithm, works with up to 91 percent accuracy based on historical data. “Does the offender have a gang affiliation? Are they part of a rehab program? How many violations do they have in jail? How many hours in administrative segregation?” King explained in the webcast. “Things like that.”
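
Microsoft has not described how the model works internally, but the kinds of inputs King lists map naturally onto a standard classification setup. The following sketch is purely illustrative: it trains a simple logistic-regression classifier on fabricated records (Microsoft’s own proof of concept reportedly used fabricated inmate data as well), and every feature name, weight, and data point here is hypothetical rather than drawn from Microsoft’s program.

```python
# Purely illustrative sketch; not Microsoft's model. Trains a simple
# classifier on FABRICATED inmate records mirroring the features King
# describes. All names, weights, and data here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical features: gang affiliation (0/1), rehab enrollment (0/1),
# count of in-jail violations, hours in administrative segregation.
X = np.column_stack([
    rng.integers(0, 2, n),      # gang_affiliation
    rng.integers(0, 2, n),      # in_rehab_program
    rng.poisson(2.0, n),        # jail_violations
    rng.exponential(40.0, n),   # admin_segregation_hours
])

# Fabricated labels: 1 = returned to jail within six months. The weights
# below are invented solely to give the synthetic data some structure.
logits = 1.2 * X[:, 0] - 1.0 * X[:, 1] + 0.4 * X[:, 2] + 0.01 * X[:, 3] - 1.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

# Fit on one split, report accuracy on the held-out split.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.0%}")
```

An accuracy figure produced this way reflects only the synthetic data it was measured on; a claim like 91 percent says little until it is validated against real, independent records.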

In principle, he said, the software isn’t unlike what the company does with its Halo games on the Xbox console, which predict how a player will react based on how they’ve responded in similar situations before.

A Microsoft representative cautioned in a statement to Fusion that the program is still in the development stages, and that the inmate population data used in the model was fabricated to provide a “proof of concept.”

But the back-to-jail product, whatever it ends up being called, is only the latest entry in the realm of predictive policing: when police use historical data to identify criminal patterns, and in some cases take preemptive measures to stop a crime from happening. Civil-rights groups recognize the promise of the technology, but worry that what sociologist Gary Marx called the “tyranny of the algorithm” will lead to categorical suspicion of certain neighborhoods or of people who share a set of characteristics.

“Every way of looking is also a way of not looking,” said Marx, a professor emeritus of sociology at the Massachusetts Institute of Technology who is at work on a book about the shortcomings of data as a solution to the world’s problems. Putting too much faith in the technology, he warned, could make police lazy about gathering human intelligence. Over time, it could foster a “fallacy of confusing data with knowledge,” Marx argues in the forthcoming book.

A larger issue is what gets done with the algorithm’s results, said Jay Stanley, a senior policy analyst at the American Civil Liberties Union: would the person get more social services, or more surveillance and harsher prison sentences? Humans must remain a vital part of the process, he said. (One only has to look at the destructive long-term effects of mandatory sentencing laws, which removed discretion from sentencing for some crimes, to see what happens when humans are removed from a crucial step of the justice system.)

Across the U.S., major cities from New York and Atlanta to Los Angeles are already using data-heavy predictive policing software to make educated guesses about future crimes, with varying degrees of success.

In Chicago, the city has listed 400 residents on a “heat list” of “potential victims and subjects with the greatest propensity for violence,” based on “certain actions and associations within an individual’s environment.” The commander in charge of the program once said it “will become a national best practice.”

As part of the program, Chicago police knock on the doors of those featured on the list. The message: “We are watching you, and if you slip we will come down on you extra hard.”

“My fear is that these programs are creating an environment where police can show up at anyone’s door at any time for any reason,” Hanni Fakhoury, staff attorney at the Electronic Frontier Foundation, a digital rights group based in San Francisco, said of the Chicago program earlier this year.

In Kansas City, Mo., a smaller but similar program seemed to be making a dent in the homicide rate until a recent spate of killings occurred in places and situations that defied the data patterns police were tracking. Instead of known criminals killing each other, a series of domestic disputes ended in bloodshed, along with several arguments outside bars and clubs.

“When you break the homicides down, there is not an organized group doing anything. There is not a drug or gang nexus,” Kansas City Police Chief Darryl Forté told the Kansas City Star of the data’s inability to predict the homicides. “If you look at these individual incidents, how do you stop someone from killing their girlfriend when nobody outside of them knows that there is a problem? How do you keep someone from killing three people because they are jealous?”

The Chicago and Kansas City police departments did not respond to requests for comment on these programs and their effectiveness.

“We have to think very carefully about what the roles for this technology are,” said Stanley, of the ACLU. “The data that goes into these algorithms needs to be open and transparent, and the outcome of what gets done with it needs to be closely examined, especially when we’re talking about corporate, proprietary software that nobody knows how it works.”

Though Minority Report was originally published as a short story by Philip K. Dick in 1956, the 2002 film version arrived at a time when advances in technology let us actually envision a future in which we might grapple with the ethical and moral issues at play throughout. As 2015 draws to a close, we seem to be standing on that precipice.
