Opinion

Predicting recidivism – the questionable role of algorithms in the criminal justice system

30th Mar 2023

Many American courts are using an opaque computer algorithm to help judges decide whether a person should be granted bail or parole, and how long their sentence should be. However, critics have pointed to apparent racial bias against African-Americans in such algorithms and have questioned their effectiveness in predicting recidivism (reoffending).

Courts increasingly using AI algorithms for predicting recidivism 

One such artificial intelligence algorithm is called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). Over the past 10 years it has been used in more than a million court hearings from California to New York to predict whether a person facing the court will commit another crime if released on bail or parole. Judges in some US states also use COMPAS to decide the length and severity of the sentences they hand down, relying on the algorithm's prediction of the likelihood of recidivism and its assessment of the convicted person's danger to society.

Code and processes of algorithm used for predicting recidivism kept secret

It works like this: defendants fill in a questionnaire about their personal history, job, education and family, to which their criminal record is added. The algorithm scores them from one to ten based on 137 factors to indicate their potential to reoffend. The code and processes underlying COMPAS are secret, known only to the algorithm’s maker, US company Northpointe (now Equivant). Judges, prosecutors and defence lawyers do not know how COMPAS reaches its conclusions, but many US states have ruled that judges can use it to help make decisions on bail, parole and sentencing.
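Because the model is proprietary, no one outside the company can say how those 137 factors are actually combined. As a purely hypothetical illustration of how a tool of this general kind might turn questionnaire answers into a one-to-ten score, consider the sketch below; the factor names, weights and scaling are invented for the example and bear no relation to COMPAS's real methodology.

```python
# Purely illustrative sketch of a generic recidivism risk scorer.
# COMPAS's real inputs, weights and scaling are secret; every factor
# name and number below is hypothetical.

HYPOTHETICAL_WEIGHTS = {
    "prior_convictions": 0.35,      # more priors -> higher score
    "age_at_first_offence": -0.02,  # younger first offence -> higher score
    "unemployed": 0.20,
    "education_years": -0.03,
    "family_criminality": 0.15,
}

def raw_score(answers: dict) -> float:
    """Weighted sum of questionnaire answers (a stand-in for a real model)."""
    return sum(w * answers.get(name, 0.0)
               for name, w in HYPOTHETICAL_WEIGHTS.items())

def decile(score: float, low: float = -1.0, high: float = 4.0) -> int:
    """Map a raw score onto the one-to-ten scale that such tools report."""
    # Clamp, normalise to [0, 1], then bucket into ten bands.
    frac = min(max((score - low) / (high - low), 0.0), 1.0)
    return min(int(frac * 10) + 1, 10)

defendant = {
    "prior_convictions": 4,
    "age_at_first_offence": 17,
    "unemployed": 1,
    "education_years": 10,
    "family_criminality": 1,
}
print(decile(raw_score(defendant)))  # prints 5 with these made-up numbers
```

Even in this toy version, the final number reveals nothing about which answers drove it, which is exactly the transparency problem that confronts judges, prosecutors and defence lawyers using the real tool.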

Claims algorithm no better than humans at predicting recidivism – and biased

Some studies have found the algorithm is no better at predicting criminal recidivism than humans. (See Sentence by numbers: The scary truth behind risk assessment algorithms, published by the Center for Digital Ethics and Policy.) There are also accusations that COMPAS is biased against African-Americans. In 2016 ProPublica compared the outcomes of two risk assessments in Florida. In one case, an 18-year-old African-American woman who had taken a child’s bicycle was judged by the algorithm to be at ‘high risk’ of reoffending, while a 41-year-old white man, a seasoned criminal who had been convicted of armed robbery, was assessed as ‘low risk’. The algorithm got both assessments wrong: two years later, the woman had not been charged with any new crime, while the man had been sentenced to eight years for burglary.

In another case, in Wisconsin, the prosecution and defence agreed a plea deal of a year in jail with follow-up supervision for an African-American man found guilty of stealing a lawnmower and some tools. However, the judge said COMPAS predicted a high risk of future violent crime, overturned the plea deal and instead imposed a two-year prison sentence followed by three years of supervision. (See Machine bias.)

Australian research on computerised screening for predicting recidivism 

Australian academic research into risk assessment has examined COMPAS and other forms of computerised prediction of criminal behaviour. This includes research by the NSW Bureau of Crime Statistics and Research on risk assessment of offenders. (See Improving the efficiency and effectiveness of the risk/needs assessment process for community-based offenders.)

In response to the increasing use of AI in different jurisdictions, the Australasian Institute of Judicial Administration recently released a guide for courts and tribunals covering the different AI and automated decision-making tools available, as well as the opportunities and challenges they bring. (See AI Decision-Making and the Courts: A guide for Judges, Tribunal Members and Court Administrators.)

Future of AI in Australian courts for predicting recidivism

The increasing popularity of artificial intelligence in courts internationally raises the question of its place in Australian courts. (See AI is creeping into our courts. Should we be concerned?) While NSW judges must make difficult decisions on whether to grant bail and on the length of sentences to protect society, they also have a duty to consider justice. In my opinion, it is not possible to remove the human element from sentencing (for example, the nuances involved in submissions on sentence) and other procedures without undermining the basis of the current legal system.

Bail and parole decisions are similar exercises, where AI may in fact be a belated attempt to catch up with, and exploit, the databases already available to lawyers and the judiciary.

The AI input seems to be a method of ‘averaging’ that is more suited to the online betting outlets now expanding massively in Australia than to the judicial system.

It would be interesting to see whether the use of algorithms in the US criminal justice system delivers any beneficial outcome whatsoever over time.

Juries are the ultimate leveller in our legal system, and the mere availability of an alternative system – such as the algorithms underpinning the many wonderful social media sites that now exist – should not replace the jury without qualitative outcome data. Using a secret computer algorithm to decide whether to grant bail or how long a person should be in jail flies in the face of the legal ethics and moral principles of our courts.

While technology may be able to help in risk assessment, it should never be left up to a computer algorithm to decide a person’s fate.

This is an edited version of an article first published by Stacks Law Firm.

The ALA thanks John Gooley for this contribution.

John Gooley practises across criminal, family and employment law and associated commercial matters at Stacks Collins Thompson. He enjoys Legal Aid briefs and duty work, as well as defended hearings, special fixtures and severity appeals. John has experience in the regulation of advertising and its content, NSW local government planning and licensing and CTTT matters, as well as HREOC and NSW ADB actions. He has wide-ranging experience in both the federal and state employment jurisdictions, and is a life member of the NSW Public Service Professional Officers Association.

The views and opinions expressed in this article are the author's and do not necessarily represent the views and opinions of the Australian Lawyers Alliance (ALA).

