Machine Bias

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

I just wanted to bring the above article into our discourse.  I apologize for adding it so late in the week.

It covers a set of algorithms designed to estimate recidivism risk in convicted criminals, for use in sentencing and parole decisions.  Even though the algorithm neither knows a subject’s race nor asks questions designed to determine it, it has been shown to significantly overestimate recidivism risk for black convicts while significantly underestimating it for white convicts.
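
To make that disparity concrete: ProPublica’s finding is about unequal error rates, not unequal scores alone. Below is a minimal sketch of how such an audit works. Everything in it is hypothetical and illustrative: the field names, the score cutoff, and the toy records are my own, not COMPAS’s actual schema or data.

```python
# Sketch of a disparate-error-rate audit, in the spirit of ProPublica's
# COMPAS analysis. All data and thresholds below are hypothetical; a real
# audit would use actual defendant records and the tool's decile scores.

from dataclasses import dataclass

@dataclass
class Defendant:
    race: str
    risk_score: int      # e.g., a 1-10 risk score from the tool
    reoffended: bool     # observed recidivism over a follow-up window

HIGH_RISK = 5  # illustrative cutoff: scores >= 5 treated as "high risk"

def error_rates(defendants: list[Defendant], group: str) -> tuple[float, float]:
    """Return (false positive rate, false negative rate) for one group."""
    members = [d for d in defendants if d.race == group]
    flagged = [d for d in members if d.risk_score >= HIGH_RISK]
    cleared = [d for d in members if d.risk_score < HIGH_RISK]
    # False positive rate: flagged high risk, but did NOT reoffend
    fpr = sum(not d.reoffended for d in flagged) / len(flagged)
    # False negative rate: labeled low risk, but DID reoffend
    fnr = sum(d.reoffended for d in cleared) / len(cleared)
    return fpr, fnr

# Toy records shaped like the pattern ProPublica reported: a higher FPR
# for black defendants and a higher FNR for white defendants.
sample = [
    Defendant("black", 8, False), Defendant("black", 7, False),
    Defendant("black", 6, True),  Defendant("black", 2, False),
    Defendant("white", 2, True),  Defendant("white", 3, True),
    Defendant("white", 4, False), Defendant("white", 7, True),
]

for g in ("black", "white"):
    fpr, fnr = error_rates(sample, g)
    print(f"{g}: FPR={fpr:.0%}, FNR={fnr:.0%}")
```

The key point the sketch illustrates is that a tool can be “blind” to race as an input and still produce racially skewed false positives and false negatives, which is exactly what an audit of outcomes (rather than inputs) can reveal.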

The fact that the algorithms were designed and are owned by a private company ties directly into everything we have been reading and thinking about regarding algorithmic transparency, the economic incentives to resist that transparency, and how much more easily bias can be perpetuated without it.

3 thoughts on “Machine Bias”

  1. Sabina Pringle (she/ella)

Hey Rob, thanks for this troubling article. When a judge assesses risk based on “criminogenic needs” and other factors determined by a statistician in a commercial venture, we are in serious trouble. It’s extremely hard for a judge – or anyone – to predict the risk of an offender committing future crimes, so using algorithmic risk-assessment tools is a cop-out on the part of the judge. It’s interesting to note that Broward County chose COMPAS because it produced “simple yet effective charts and graphs for judicial review.” User-friendly with an attractive interface? And it costs Broward County about $22,000 a year. Far more expensive and dangerous than a crystal ball.
