Precognition software used to predict which prisoners will murder

Minority Report-style systems might seem creepy, but they're no more flawed than human parole officers

Topics: precognition, Murder, Prison System, criminal, minority report

When a prisoner goes on parole, a parole officer determines the level of supervision the individual requires based on the perceived likelihood of their committing another crime. In a number of states, these determinations are increasingly being taken out of the flesh-and-blood hands of parole officers and settled instead by algorithms.

Precognition software, already in use in Baltimore and Philadelphia, determines how likely a prisoner is to commit murder and thus how much parole supervision he should receive. Wired explains how the algorithm, developed by University of Pennsylvania criminologist Richard Berk, works:

To create the software, researchers assembled a dataset of more than 60,000 crimes, including homicides, then wrote an algorithm to find the people behind the crimes who were more likely to commit murder when paroled or put on probation. Berk claims the software could identify eight future murderers out of 100.

The software parses about two dozen variables, including criminal record and geographic location. The type of crime and the age at which it was committed, however, turned out to be two of the most predictive variables.
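The scoring step described above (weigh a couple dozen variables, with crime type and age at offense among the most predictive, then set a supervision level) can be sketched in a few lines. Everything below is hypothetical: the feature names, weights, and threshold are invented for illustration and bear no relation to the actual model.

```python
# Hypothetical sketch of a parole risk-scoring step. The real system
# reportedly parses about two dozen variables; here we invent weights
# for the two the article names as most predictive (type of crime and
# the age at which it was committed).

def risk_score(crime_type, age_at_offense):
    """Return an invented risk score in [0, 1], for illustration only."""
    # Made-up weights: violent priors and younger offenders score higher.
    type_weight = {"violent": 0.6, "property": 0.2, "drug": 0.1}
    score = type_weight.get(crime_type, 0.1)
    if age_at_offense < 21:
        score += 0.3
    return min(score, 1.0)

def supervision_level(score, threshold=0.5):
    """Map a score to a supervision tier, as a parole board might."""
    return "intensive" if score >= threshold else "standard"

print(supervision_level(risk_score("violent", 19)))  # intensive
print(supervision_level(risk_score("drug", 40)))     # standard
```

The point of the sketch is only that the output is a mechanical function of a handful of inputs: the threshold, not any human judgment, decides who lands in the heavier supervision tier.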

Of course, the software produces false positives, condemning some parolees who would never go on to commit murder to heavy supervision. This Minority Report-reminiscent precognition system rightly raises concerns about criminal profiling based on all too few variables. Shawn Bushway, a professor of criminal justice at the State University of New York at Albany, noted the software could result in “punishing people who, most likely, will not commit a crime in the future.”
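The false-positive concern can be made concrete with some hypothetical arithmetic. If the article's "eight future murderers out of 100" is read as eight true positives per 100 people flagged, the remaining 92 would face heavy supervision despite never going on to kill. These numbers are an illustrative assumption, not figures from the study.

```python
# Illustrative arithmetic only: the split below is an assumed reading
# of "eight future murderers out of 100", not a result from the study.
flagged = 100        # parolees the model flags as high risk
true_positives = 8   # those who would actually go on to commit murder

false_positives = flagged - true_positives
precision = true_positives / flagged

print(f"{false_positives} of {flagged} flagged parolees would not murder")
print(f"precision: {precision:.0%}")
```

Under that reading, more than nine in ten flagged parolees would be wrongly subjected to intensive supervision, which is exactly the trade-off Bushway's objection points at.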

However, in this instance directing concerns at the technology itself is misplaced. To do so assumes that parole officers don’t also base their judgments of which parolees remain “criminal risks” on limited, flawed and highly problematic variables, and punish people accordingly. As the software’s creator told Wired, the algorithms simply replace the ad hoc decision making done by parole officers. As long as parole boards employing the software don’t treat it as a perfect predictor of criminal futures, the precognition technology is no more troubling than the vagaries of human decision making in our prison systems.

Natasha Lennard

Natasha Lennard is an assistant news editor at Salon, covering non-electoral politics, general news and rabble-rousing. Follow her on Twitter @natashalennard, email
