
Tuesday, April 18, 2017

Ind. Decisions - Still more on "Wis. Supreme Court to Rule on Predictive Algorithms Used in Sentencing"

Updating this July 14, 2016 ILB post, WIRED has an April 17th article by Jason Tashea headed "Courts Are Using AI to Sentence Criminals. That Must Stop Now." Some quotes:

Algorithms pervade our lives today, from music recommendations to credit scores to now, bail and sentencing decisions. But there is little oversight and transparency regarding how they work. Nowhere is this lack of oversight more stark than in the criminal justice system. Without proper safeguards, these tools risk eroding the rule of law and diminishing individual rights.

Currently, courts and corrections departments around the US use algorithms to determine a defendant’s “risk”, which ranges from the probability that an individual will commit another crime to the likelihood a defendant will appear for his or her court date. These algorithmic outputs inform decisions about bail, sentencing, and parole. Each tool aspires to improve on the accuracy of human decision-making, allowing for a better allocation of finite resources.
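[ILB note: For readers curious how such a tool might turn intake answers into a “risk” score, here is a minimal sketch in Python. The questions, weights, and cutoffs below are invented for illustration; they do not reflect Compas or any actual vendor's model, which are not public.]

    import math

    def risk_score(answers: dict) -> float:
        """Toy logistic model: map intake answers to a 0-1 risk estimate."""
        # Hypothetical features and weights, of the kind a vendor might fit
        # to historical data; real products' inputs and weights are not public.
        weights = {
            "prior_arrests": 0.40,
            "age_at_first_arrest": -0.05,
            "failed_to_appear_before": 0.80,
        }
        bias = -1.5
        z = bias + sum(w * answers.get(k, 0.0) for k, w in weights.items())
        return 1.0 / (1.0 + math.exp(-z))  # logistic squash to (0, 1)

    def risk_band(score: float) -> str:
        """Bucket the score the way reports label defendants low/medium/high risk."""
        return "high" if score >= 0.7 else "medium" if score >= 0.4 else "low"

    # Example: a hypothetical defendant's answers yield a band, not an explanation.
    s = risk_score({"prior_arrests": 4, "age_at_first_arrest": 19,
                    "failed_to_appear_before": 1})
    print(f"{s:.2f} -> {risk_band(s)}")  # 0.49 -> medium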

Typically, government agencies do not write their own algorithms; they buy them from private businesses. This often means the algorithm is proprietary or “black boxed”: only the owners, and to a limited degree the purchaser, can see how the software makes decisions. Currently, there is no federal law that sets standards or requires the inspection of these tools, the way the FDA does with new drugs.
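[ILB note: The “black box” problem described above can be pictured as an interface the purchasing court can call but not inspect. A hypothetical sketch, again in Python; the class and method names are invented and correspond to no vendor's actual product.]

    class ProprietaryRiskTool:
        """Stand-in for a vendor product whose internals are a trade secret."""
        def score(self, answers: dict) -> int:
            # The purchaser cannot see what happens here: the features used,
            # the weights, and the training data are all hidden by the vendor.
            return 7  # an opaque number, with no accompanying derivation

    def presentence_report(tool: ProprietaryRiskTool, answers: dict) -> str:
        # All the court receives is the output; there is no way to audit
        # how it was produced or to cross-examine its assumptions.
        return f"Risk score: {tool.score(answers)} (methodology: proprietary)"

    print(presentence_report(ProprietaryRiskTool(), {"prior_arrests": 4}))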

This lack of transparency has real consequences. In the case of Wisconsin v. Loomis, defendant Eric Loomis was found guilty for his role in a drive-by shooting. During intake, Loomis answered a series of questions that were then entered into Compas, a risk-assessment tool developed by a privately held company and used by the Wisconsin Department of Corrections. The trial judge gave Loomis a long sentence partially because of the “high risk” score the defendant received from this black box risk-assessment tool. Loomis challenged his sentence because he was not allowed to assess the algorithm. Last summer, the state supreme court ruled against Loomis, reasoning that knowledge of the algorithm’s output was a sufficient level of transparency. * * *

The legal community has never fully discussed the implications of algorithmic risk assessments. Now that these tools have proliferated, attorneys and judges are grappling with their lack of oversight and their impact.

Posted by Marcia Oddi on April 18, 2017 12:52 PM
Posted to Ind. Sup.Ct. Decisions