Last month, when asked about the possible future use of AI-driven “smart machines” in courtroom fact-finding, Chief Justice John G. Roberts Jr. said that day is already here.
“And it’s putting significant strain on how the judiciary goes about doing things,” he said.
Here’s an example: Eric L. Loomis was sentenced to six years in prison based in part on a report compiled by Compas, a software program sold by the for-profit company Northpointe Inc.
Judges do not have access to the Compas formula, which according to the company is a “trade secret.”
The Compas report is kind of like a poorly informed credit score – only it’s used to make decisions about a person’s liberty. The report takes into account several factors about a person’s life before assigning a “risk score,” which predicts how likely that person is to commit future crimes.
The Compas report on Loomis showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge told Loomis that the assessment identified him as “an individual who is a high risk to the community.”
Loomis rightly argued that his right to due process was violated by the judge’s consideration of the report, which was generated by a secret algorithm. The Wisconsin Supreme Court ruled against Loomis, reasoning that he would have received the same sentence even without the Compas report.
Justice Ann Walsh Bradley seemed uncomfortable with the decision and drew the court’s attention to a ProPublica study that examined the accuracy of Northpointe’s algorithm.
ProPublica obtained the risk scores for 7,000 people who had been arrested in Broward County, Florida. Over the next two years, only 20% of those who had been predicted to commit a violent crime actually did so. ProPublica said the algorithm was “remarkably unreliable.” Even worse, the program was twice as likely to falsely flag black defendants as it was to falsely flag white defendants.
“This study and others raise concerns regarding how a Compas assessment’s risk factors correlate with race,” wrote Bradley. Even so, her opinion allows sentencing judges to keep using the software because it can be helpful “in providing the sentencing court with as much information as possible in order to arrive at an individualized sentence.”
Northpointe disputes the results of the study, but refuses to release its algorithm. So a computer is using a secret, objectively racist formula to misclassify people. Courts know this, yet they continue to let it influence judges’ decisions.
Wisconsin Attorney General Brad D. Schimel, who urged the United States Supreme Court not to hear Loomis’s case, says the “use of risk assessments by sentencing courts is a novel issue, which needs time for further percolation.”
How many more people will receive unjust sentences in the meantime?
Editor’s note: This is very simple, folks. The (much simplified) basis of AI in its current state is the “neural network,” which learns a set of weights from its inputs. If race is an input, then it is a factor. So yes, this software is indeed racist.
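To make that concrete, here is a toy sketch in Python – synthetic data, invented feature names, and an ordinary logistic regression standing in for whatever Northpointe actually runs, since its formula is secret. The point is only that any column fed to a learner, including a demographic indicator, ends up with its own weight, and that weight moves the score.

```python
# Toy sketch only: synthetic data and a plain logistic regression standing in
# for whatever Northpointe actually uses. The point is that any column fed to
# the learner, including a demographic indicator, receives its own weight.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

priors = rng.poisson(2, n)            # hypothetical: number of prior arrests
age = rng.integers(18, 70, n)         # hypothetical: age at assessment
demographic = rng.integers(0, 2, n)   # hypothetical 0/1 group indicator

# Synthetic "reoffended" labels that correlate with the indicator, the way
# biased historical data would; there is no causal story here.
logit = 0.4 * priors - 0.03 * age + 0.8 * demographic - 0.5
reoffended = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([priors, age, demographic])
model = LogisticRegression().fit(X, reoffended)

# The demographic column ends up with a nonzero learned weight: it is "a factor."
for name, w in zip(["priors", "age", "demographic"], model.coef_[0]):
    print(f"{name:12s} weight = {w:+.2f}")
```

Whether the real model is a neural network, a regression, or something else doesn’t change this: an input only needs a weight to be a factor.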
I recently attended an artificial intelligence conference where Northpointe was discussed as an example, and it was indicated that race was indeed an input factor.
Want to check? Very easy – take this model, with its current training, to Moscow, Nairobi, or Beijing and see if it still works. I bet it won’t. The bottom line is that race may correlate with other socioeconomic factors in a given area; race itself is not the factor.
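And the flip side of that, again as a rough sketch with made-up data: drop the demographic column entirely and train only on a correlated stand-in (call it a neighborhood code), and the predicted risk still splits by group, because the proxy carries the same information.

```python
# Rough illustration of the correlation point: the group label is never an
# input, but a correlated proxy is, and the proxy transmits the same skew.
# All data and names are invented for the sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

demographic = rng.integers(0, 2, n)
# A proxy that matches the group label 90% of the time, standing in for
# neighborhood, arrest-density, or similar socioeconomic features.
neighborhood = np.where(rng.random(n) < 0.9, demographic, 1 - demographic)

# Biased historical labels, as in the previous sketch.
reoffended = (rng.random(n) < 0.3 + 0.3 * demographic).astype(int)

# Train WITHOUT the demographic column; only the proxy is available.
X = neighborhood.reshape(-1, 1)
model = LogisticRegression().fit(X, reoffended)
scores = model.predict_proba(X)[:, 1]

# Mean predicted risk still differs by group even though the group label
# was never seen by the model.
for g in (0, 1):
    print(f"group {g}: mean predicted risk = {scores[demographic == g].mean():.2f}")
```

Which is also why simply deleting a race column would not fix things on its own; the correlated factors would carry it right back in.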
After thinking about this a bit, there are a lot of factors that could play into this: income levels, number of clubs, number of churches, kinds of businesses, vacancy rates, concentrations of arrests, job history, parental and sibling relationships, change of residence after incarceration, and so on.
Unfortunately, that data is “expensive.” In other words: rather than do the work to get it right using the factors that actually drive crime, let’s build a racist model that judges humans in a racist manner.
Sorry, folks, but I’m a technologist by training, and it really irks me when technology gets built with such tragic and obvious flaws.