Some things are true even if you can't explain them.
It's true of a sixth sense and, apparently, the ability to predict recidivism. According to researchers, random people can predict a defendant's likelihood of reoffending better than a computer.
It's another confirmation that people really do have brains and that computers don't. Who'd have thought?
Published in Science Advances, the study found that people who took an online survey accurately predicted recidivism at least 67 percent of the time. The smart software scored about 65 percent.
It's just one study, but a humbling result for COMPAS, also known as Correctional Offender Management Profiling for Alternative Sanctions. It has been used to assess over 1 million offenders in bail, parole, and sentencing decisions since 1998.
The software program uses 137 pieces of information to make a prediction. But the study showed the same level of accuracy with only two variables: a defendant's age and number of prior convictions.
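That two-variable finding is easy to picture as a tiny logistic model. The sketch below is purely illustrative: the weights, threshold, and function names are assumptions for demonstration, not the fitted values from the study or anything inside COMPAS.

```python
import math

# Illustrative (assumed) weights -- NOT the study's fitted coefficients.
W_AGE = -0.05      # younger defendants tend to score as higher risk
W_PRIORS = 0.25    # more prior convictions pushes the score up
BIAS = 0.5

def risk_probability(age, prior_convictions):
    """Logistic score built from just two features: age and prior count."""
    z = BIAS + W_AGE * age + W_PRIORS * prior_convictions
    return 1.0 / (1.0 + math.exp(-z))

def predict_reoffend(age, prior_convictions, threshold=0.5):
    """Flag a defendant as likely to reoffend if the score clears a cutoff."""
    return risk_probability(age, prior_convictions) >= threshold
```

With these made-up weights, a young defendant with several priors gets flagged while an older defendant with none does not, e.g. `predict_reoffend(22, 4)` versus `predict_reoffend(50, 0)`. The point of the study is that a model this simple matched the accuracy of one fed 137 inputs.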
For the online survey, respondents saw a defendant's sex, age, and previous criminal history. In more than two-thirds of cases, they correctly predicted whether defendants would commit another crime within two years of their last conviction.
The participants were not told the defendants' race, yet their predictions were more accurate for black defendants. According to the study, respondents correctly predicted recidivism for black defendants 68.2 percent of the time.
However, the rate of false predictions -- non-reoffenders wrongly flagged as future risks, or false positives -- was higher for black defendants than for white ones. COMPAS was wrong in this way 40 percent of the time for blacks and 25 percent for whites, while the survey respondents were more accurate overall.
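The two metrics in dispute here measure different things: overall accuracy counts every correct call, while the false positive rate looks only at people who did not reoffend and asks how many were wrongly flagged. A minimal sketch with made-up predictions and outcomes (not data from the study):

```python
def accuracy(predictions, outcomes):
    """Fraction of all predictions that matched the actual outcome."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(predictions)

def false_positive_rate(predictions, outcomes):
    """Among people who did NOT reoffend, the fraction wrongly flagged."""
    non_reoffenders = [p for p, o in zip(predictions, outcomes) if not o]
    return sum(non_reoffenders) / len(non_reoffenders)

# Made-up example: True means predicted (or actually committed) a new crime.
preds   = [True, True, False, True, False, False]
actuals = [True, False, False, True, True, False]
```

Here `accuracy(preds, actuals)` is 4/6, while `false_positive_rate(preds, actuals)` is 1/3: one of the three non-reoffenders was wrongly flagged. A tool can post a respectable overall accuracy and still distribute its false positives unevenly across groups, which is exactly the complaint against COMPAS.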
Attorneys and others have challenged COMPAS in the past. ProPublica, an investigative journalism group, called its predictive ability "somewhat more accurate than a coin flip."
The organization found in Broward County, Florida, that COMPAS was right about 61 percent of the time. However, the journalists said, the algorithm was nearly twice as likely to falsely flag black defendants as "future criminals" as it was white defendants.