An Algorithm to Reduce an Overcrowded Prison System?

Philadelphia's Journey into Automating Its Court Processes

Across the United States, more than 600,000 people sit in jails without having been convicted of a crime. Philadelphia's jails, like many across the U.S., are overcrowded with people who are still awaiting their bail hearings. To combat this, in 2010 a state Senate bill authorized the development of a Pennsylvania sentencing risk-assessment tool to address the growing prison population. The hope was that the program would sort through pre-trial inmates quickly and efficiently, determining who should be released on bail and at what bail amounts. The program would also be used to aid judges in their sentencing decisions.

What would this risk-assessment algorithm look like? Richard Berk, professor of criminology at the University of Pennsylvania, created the algorithm to weigh several pre-determined factors and rank an individual as at "low" or "high" risk of being re-arrested. Those identified as "low risk" can be fast-tracked for release, saving the city the cost of housing inmates unnecessarily while they await parole evaluation. The ranking could also be used during sentencing to give judges more information when making their decisions.

The algorithm could evaluate several variables, including past arrest records, age, gender, and employment history. Certain factors, such as race and zip code, would not be considered. A 2016 CityLab article quoted Berk as saying that excluding race from the considered factors would keep the algorithm from performing at its best.
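To make the general shape of such a tool concrete, here is a minimal sketch of a weighted-factor risk score with a "low"/"high" cutoff. The factor names come from the article; the weights, cutoff, and scoring formula are entirely hypothetical — the actual Pennsylvania tool is a statistical model whose internals are not reproduced here.

```python
# Hypothetical illustration only: weights and cutoff are invented,
# not taken from the actual Pennsylvania risk-assessment tool.

def risk_score(prior_arrests, age, employed):
    """Combine pre-determined factors at fixed (hypothetical) weights."""
    score = 0.0
    score += 2.0 * prior_arrests        # each prior arrest raises the score
    score += 1.5 if age < 25 else 0.0   # younger defendants weighted higher
    score += 0.0 if employed else 1.0   # unemployment adds risk
    return score

def risk_label(score, cutoff=3.0):
    """Map the numeric score to the binary label used for fast-tracking."""
    return "high" if score >= cutoff else "low"

print(risk_label(risk_score(prior_arrests=0, age=30, employed=True)))   # low
print(risk_label(risk_score(prior_arrests=2, age=22, employed=False)))  # high
```

Even in this toy version, the central policy question is visible: every weight encodes a value judgment about which facts about a person should count against them.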

The idea behind an algorithm's methodical approach to the bail process is that it eliminates any chance of human bias. However, opponents argue that human evaluation is inherently required: a bail commissioner can better weigh the gravity of all the factors involved, including whether the defendant shows remorse for the crime committed.

An Ethical Dilemma?

Several concerns have been raised about the proposal. First, many parties worry that the program will exhibit racial bias. This was the case in Broward County, Florida, where a risk-assessment algorithm showed a racial bias against African-Americans: the system ranked African-Americans arrested for low-level crimes as "high risk," while their Caucasian counterparts arrested for riskier crimes were assessed as lower risk.

Another concern is that being arrested for a crime and being convicted of it are two very different situations. You are innocent until proven guilty, and an arrest does not mean guilt. African-Americans are 3.73 times as likely as Caucasians to be arrested on marijuana-possession charges despite using the drug at similar rates. An algorithm that counts arrests would therefore inherit that disparity, rating African-American defendants as higher risk for reasons that reflect policing patterns rather than behavior.
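A back-of-the-envelope calculation shows how this disparity flows into an arrest-based feature. The 3.73x ratio comes from the article; the base arrest probability and time span below are hypothetical placeholders.

```python
# If two groups behave identically but one is arrested 3.73x as often,
# an "expected prior arrests" feature reproduces that same 3.73x gap.
# base_arrest_prob and years are hypothetical placeholders.

base_arrest_prob = 0.02   # hypothetical annual arrest probability, group A
disparity = 3.73          # arrest-rate ratio cited in the article
years = 10

expected_arrests_a = base_arrest_prob * years
expected_arrests_b = base_arrest_prob * disparity * years

ratio = expected_arrests_b / expected_arrests_a
print(round(ratio, 2))  # 3.73 -- identical behavior, unequal feature values
```

Switching the feature from arrests to convictions, as the commission later did, narrows but does not necessarily eliminate this gap, since disparities can persist at the conviction stage as well.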

A third objection is that female defendants are significantly less likely than men to commit a violent crime upon release. However, this objection is less relevant, as the algorithm does take gender into consideration as one of its many factors.

The loudest and strongest objection of all is that this kind of algorithm has not stood up to testing. Dartmouth College researchers Julia Dressel and Hany Farid published a paper in the journal Science Advances in which they found that a risk-assessment algorithm called COMPAS (used by several jurisdictions across the U.S.) predicted recidivism no better than a random online poll of people with no criminal-justice training at all.

These programs also come under public and media scrutiny, as people feel violated when a program is used to comb through their personal data, as seen with the New Orleans Police Department (NOPD). The NOPD used software from Palantir to comb through social-media and criminal-history data to predict whether an individual would commit a violent crime. The NOPD's use of the system dates back at least to the 2012 trial of 22-year-old Evans "Easy" Lewis, when prosecutors brought forward over 60,000 pages of documents purporting to prove that Easy was a gang member. The NOPD did not disclose to the court at the time where these documents were acquired. However, after the media revealed the NOPD-Palantir partnership in February 2018, it became very clear how the incriminating information on Easy had been obtained. A lawsuit has been filed claiming that this method of gathering information violated Easy's basic human rights. If the suit succeeds, the NOPD could be forced to re-examine hundreds of convictions dating back to 2012. Since 2012, the NOPD has renewed its contract with Palantir three times, but when the contract most recently expired in February 2018, the mayor's office decided not to renew it given the public backlash.

Other Experiences in Different Cities

That isn’t to say that every city that has implemented a similar program has seen negative results. Agencies in Virginia used an algorithm to predict a defendant’s probability of skipping trial and of being arrested if released. Using the algorithm, almost twice as many defendants were released without any increase in pretrial crime.

In 2017, New Jersey also implemented an algorithm for its pretrial release process and saw a 16% decrease in pretrial jail populations without an increase in pretrial crime.

Lastly, this approach has worked for Philadelphia before. The case for this implementation likely began with a 2008 Philadelphia experiment in which an algorithm was used to identify probationers and parolees at low risk of future violence. The study found that parole officers could decrease their attention on low-risk parolees and increase it on high-risk individuals without seeing any increase in rates of re-offense.

Philadelphia’s Journey with the Algorithm: 2010 – Current Day

Eight years later, the Philadelphia risk-assessment algorithm still has not been launched. However, a lot has happened and changed since the bill passed in 2010. The Sentencing Commission has spent those eight years developing and improving the algorithm. After concerns were raised that using past arrests as a factor would create a racial bias ("as African Americans are incarcerated at more than 5 times the rate of Caucasians"), the program was adapted to look at convictions instead of just arrests.

A recurring issue with these programs is that they are often used without the public’s knowledge, as was the case in New Orleans. Another issue is the lack of transparency in how the tools work: several U.S. states use COMPAS as a risk-assessment tool, yet its vendor refuses to share key details of how it calculates its scores. Pennsylvania’s commission, however, has been incredibly transparent with its process. “Since 2010, the commission has released more than 15 reports detailing the development of the algorithm and has held 11 public hearings to invite feedback.” This transparency is a refreshing move and allows the community directly impacted by the algorithm to feel it has a say in the matter.

While many cities have begun to use risk-assessment tools successfully for parole decisions, extending these ‘ratings’ to sentencing is another issue entirely. Many advocates argue that if a judge were to see a “high risk” rating for an individual, they might be biased toward sentencing that person more harshly. However, “the committee said the assessment alone should not be used as a factor to add or subtract prison time, but should help judges determine whether to seek more information before imposing a sentence.”

In June 2018, dozens of speakers, including community activists, defense attorneys, and officials from the DA’s office, showed up to condemn the algorithm at a hearing. A petition opposing it had gathered over 2,000 signatures. Concerns were raised once again that the algorithm would encourage racial biases and would not hold an acceptable rate of accuracy.

The commission had planned to vote on the algorithm in July 2018, but the decision has been delayed by at least six months to consider the objections that have been raised. Protestors are not saying the program should never be used; they demand proof that the algorithm is accurate and will not perpetuate racial biases.

“No tool like this should be introduced into such a high-stakes decision-making process unless we can control it to make sure it’s winding down mass incarceration as well as winding down the racial, ethnic, and other bias that exists currently in the system.”

Hannah Sassaman

Policy Director, Media Mobilizing Project

Risk-assessment algorithms are likely the way of the future; technology is a way for us to fast-track systems and processes. Technology should operate above the human standard, introducing no biases or mistakes of its own. Until we get there, the kinks must continue to be worked out.

About the Author

Kristina Obodovskiy

Marketing Specialist

Kristina Obodovskiy is a Marketing Specialist at InTime. With a BBA in Marketing Management and over 4 years of marketing experience, Kristina has written guest contribution content for several organizations in the past. If you would like to connect with Kristina, find her on LinkedIn here.
