
A widely used commercial software program is no better, and no less biased, than untrained human judges at predicting criminal recidivism, finds research in Science Advances. Researchers asked 400 online participants to read about a real criminal case and, using only seven pieces of information (such as sex, age, and number of previous convictions), decide whether the person was likely to reoffend. These untrained raters accurately predicted recidivism 67 percent of the time, compared with a 65 percent success rate for the commercial system that many jurisdictions use to help make bail and other judicial decisions. Both the human participants and the software were also more likely to incorrectly predict that black offenders would reoffend than that white offenders would, even when not told the offender's race. The researchers suggest this may be because a combination of other factors, including the number of previous convictions, acts as a proxy for race in both human and computer judgments. (Monitor on Psychology)