I just wanted to bring the above article into our discourse. I apologize for adding it so late in the week.
It covers a set of algorithms designed to estimate recidivism risk in convicted criminals for use in sentencing and parole decisions. Even though the algorithms are given no information about a subject's race and ask no questions designed to infer it, they have been shown to significantly overestimate recidivism risk for black convicts while significantly underestimating it for white convicts.
The fact that the algorithms were designed and are owned by a private company ties into everything we have been reading and thinking about: algorithmic transparency, the economic incentives to resist that transparency, and how much more easily bias can be perpetuated without it.