Judges Now Using Artificial Intelligence to Rule on Prisoners

07 February, 2018

Machines powered by artificial intelligence, or AI, are increasingly used to help people perform many different jobs. One area where AI is currently being used is in the American court system.

In U.S. courts, defendants appear before a judge shortly after they are arrested. The judge then sets a trial date for the defendant, which could be weeks or months in the future. The judge must decide whether the defendant should remain in jail while awaiting trial or can be safely released until the court date.

Many courts also use a bail system. This is where a judge sets an amount of bail money that a defendant can pay to avoid having to remain in jail while awaiting trial. In this system, judges often set a very high bail amount for defendants they see as having a high risk of not returning for trial.

Artificial intelligence to assess risk

In some American courts, judges are beginning to use AI systems to help decide whether – and for how long – defendants should be jailed.

To create the AI system, researchers use computers to analyze data from thousands of court cases. The computers then use that data to predict whether a defendant will commit a new crime or fail to return to court.

One AI system being used by U.S. judges is called the Public Safety Assessment. The tool was developed by the privately financed Laura and John Arnold Foundation, based in Texas. The foundation says the system is designed to give judges the most objective information available to make fair decisions about prisoners.

State judges in New Jersey are now using the Public Safety Assessment to assist in making pretrial decisions about defendants. Judges in other states have also used the system.

The assessment process begins as soon as a suspect is fingerprinted, with information going into a centralized computerized system. At the first hearing from the jailhouse, defendants appear by videoconference and their risk score is presented to the judge. Defendants with lower scores are often released under court supervision until the next court date.

In this Aug. 30, 2017, photo, probation officer Stephanie Pope-Earley sorts through defendant files scored with risk-assessment software on the first day of the software's use at the Cleveland Municipal Court. (AP Photo/Dake Kang)

Judge Ernest Caposela told the Associated Press he supports the state's efforts to use technology to provide the best information available to help judges make careful decisions about defendants.

Caposela compared the automatic system to "the same way you buy something from Amazon. Once you're in the system, they've got everything they need on you."

Can data replace judgment?

Some legal experts have praised the system for keeping dangerous people off the streets, while letting other defendants go free if they are not a safety threat.

The AI system also aims to reduce biased rulings that could be influenced by a defendant's race, gender or appearance. The risk factors used in the assessment include age and past criminal convictions. But they do not include race, gender, employment background, where a person lives or a history of arrests.

Some critics say they worry that AI-powered data could end up replacing a judge's own judgment in pretrial decisions and sentencing.

Kristian Hammond is a computer scientist at America's Northwestern University who co-founded his own AI company. He says the danger is that judges – like all people – may find it easy to drop their own critical thinking skills when presented with what seems like an easy answer.

Hammond told the Associated Press he thinks the solution is to "refuse to build boxes that give you answers." Instead, judges need "boxes that give you answers and explanations and ask you if there's anything you want to change."

The Arnold Foundation makes clear its Public Safety Assessment is only designed for the pretrial process, not for use by judges to decide on actual prison sentences. The group also notes that the workings of the AI-powered system are open to inspection by all. "There's no mystery as to how a risk score is arrived at for any given defendant," foundation official Matt Alsdorf said.

I'm Bryan Lynn.

Bryan Lynn adapted this story for VOA Learning English, based on reports from the Associated Press, the Des Moines Register and other sources. Hai Do was the editor.

We want to hear from you. Write to us in the Comments section.


Words in This Story

bail n. an amount of money given to a court to let a prisoner leave jail and return later for a trial

assessment n. a judgment made about various parts of something

objective adj. based on facts rather than feelings or opinions

biased adj. showing unfairness to a particular group

factor n. something that helps produce or influence a result

critical adj. using or involving careful judgment about the good and bad parts of something