News

Starr critical of risk assessment scores for sentencing

July 20, 2016

Some courts use automatically generated risk scores in sentencing and paroling criminals. Generated by algorithms from factors such as prior arrests, types of crimes, and demographic information, the scores are intended to help judges overcome unconscious bias. But critics like Sonja Starr argue they can hide prejudice beneath a façade of computerized objectivity. Starr says that although social science research uses socioeconomic and demographic variables to predict population-level behavior, such models should not be used to predict a single person’s future, especially when those variables are themselves shaped by decades of racial and socioeconomic disparity; the result, she argues, amounts to a computerized version of racial profiling. Starr says she and others are particularly concerned that “every mark of poverty serves as a risk factor” in these assessments.
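As a purely hypothetical sketch of the dynamic Starr describes (the factors and weights below are invented for illustration and do not come from any actual court instrument), a toy risk score shows how socioeconomic proxies can raise an individual’s score independently of criminal history:

```python
# Hypothetical illustration only -- not any real risk-assessment model.
# A toy weighted score built from the kinds of factors the article names:
# prior arrests, offense type, and socioeconomic proxies.

def risk_score(prior_arrests: int, violent_offense: bool,
               unemployed: bool, unstable_housing: bool) -> float:
    """Return a toy 0-10 risk score from invented weights."""
    score = (
        1.5 * prior_arrests          # criminal history
        + 2.0 * violent_offense      # offense type
        + 1.0 * unemployed           # socioeconomic proxy
        + 1.0 * unstable_housing     # socioeconomic proxy
    )
    return min(score, 10.0)

# Two defendants with identical criminal histories: the second's
# "marks of poverty" alone raise the score -- Starr's core objection.
print(risk_score(2, False, unemployed=False, unstable_housing=False))  # 3.0
print(risk_score(2, False, unemployed=True, unstable_housing=True))    # 5.0
```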
