Ken Strutin

The algorithms of “risk and needs assessment” are the new bedrock of sentencing and parole; their accuracy, fairness and quality, the unpronounced measures of justice. For these tools that calculate recidivism and social order are too rapidly acquiring an exalted place in human decision-making.

For instance, the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), like any analytic tool, can be flawed and biased, yet it is relied on without question. See Jeff Larson et al., How We Analyzed the COMPAS Recidivism Algorithm, ProPublica, May 23, 2016.

The problem is compounded when human prejudices; data accuracy and currency; coding and classification of people; and selection bias take refuge behind the inscrutability of computer thinking. See Parmy Olson, Racist, Sexist AI Could Be a Bigger Problem Than Lost Jobs, Forbes, Feb. 26, 2018. Machines cannot completely purge our prejudices, but they can magnify and expose them.

From Facts to Factoring

We have only just arrived at a time when implicit bias in judicial sentencing has been acknowledged. See Mark W. Bennett, Confronting Cognitive ‘Anchoring Effect’ and ‘Blind Spot’ Biases in Federal Sentencing, 104 J. Crim. L. & Criminology 489 (2014).

Meanwhile, the faults and prejudices of human thinking are being passed down to, and legitimized by, machine learning and big data. See Stephen Buranyi, Rise of the Racist Robots—How AI Is Learning All Our Worst Impulses, The Guardian, Aug. 8, 2017. A new computerized prejudice against the poor and people of color has been sanitized by algorithms adopted without scrutiny. See Julia Angwin et al., Machine Bias, ProPublica, May 23, 2016.

Risk assessment instruments reduce people to digits with criminal histories. At the least, the risks of inaccuracy, bias, and error in their outcomes demand disclosure and transparency of their inner workings. See Dan Rosenblum, The Fight to Make New York City’s Complex Algorithmic Math Public, City & State New York, Nov. 21, 2017.

Risk of Risk Assessment

In State v. Loomis, 881 N.W.2d 749 (Wis. 2016), the court held that due process was not violated where COMPAS was only one factor in sentencing. However, the decision cautioned against “reliance” on COMPAS forecasts and condoned only their “consideration.”

The Wisconsin Supreme Court concluded that presentence reports must warn that: (1) COMPAS scoring, a proprietary commercial process, is not open source; (2) risk assessments are based on national (group) sampling, not state sub-populations; (3) studies have raised questions about racial bias in scoring; and (4) risk assessment accuracy is based on changing populations.

Automated decision-making’s reputation has been inflated by an unjustified confidence in technology. However, Loomis’s admonition puts the burden of evaluating evaluations on the sentencing judge. See Jeffrey A. Butts and Vincent Schiraldi, Recidivism Reconsidered (Harvard Kennedy School 2018).

The chief criticism of computerized risk assessment is that it focuses on “static factors and immutable characteristics,” true of groups but not of individuals. See Katherine Freeman, Algorithmic Injustice, 18 N.C. J.L. & Tech. 75 (2016). That is something it has in common with human decision-making.

Dying for Due Process

An incarcerated man or woman can reform their behavior, learn, achieve, become a new person, but no one has yet mastered the art of rewriting history. And history will always condemn them.

Concentrating release determinations on immutable facts, while disregarding human change, undermines re-entry and encourages indifferent computerized decision-making. See Barbara Hanson Treen, Parole Board Ignores the Capacity for Change, Albany Times Union, Aug. 26, 2017.

Appellate decisions have rejected Parole Board emphasis on the original offense to the exclusion of all else. See, e.g., Wallman v. Travis, 18 A.D.3d 304, 794 N.Y.S.2d 381 (1st Dep’t 2005); Gelsomino v. N.Y.S. Bd. of Parole, 82 A.D.3d 1097 (2d Dep’t 2011); Friedgood v. N.Y.S. Bd. of Parole, 22 A.D.3d 950 (3d Dep’t 2005); Johnson v. N.Y.S. Div. of Parole, 65 A.D.3d 838 (4th Dep’t 2009).

Still, prison has become the grey ward of generation-length punishments for people who have paid a debt society will not forgive. See New York State’s Aging Prison Population (NYS Comptroller 2017).

Consider MacKenzie v. Stanford, No. 2789-15 (Dutchess Cty. Sup. Ct. May 24, 2016), where the Parole Board denied a sanguine 70-year-old candidate’s release twice in as many years, the second time after being ordered to conduct a de novo hearing.

Finding that the board had ignored the Supreme Court’s earlier mandate to apply all statutory criteria, the judge held the board in contempt pending a de novo parole hearing before a new panel.

The question is now moot. After spending 40 years incarcerated for his 25-year sentence, Mr. MacKenzie died in confinement. About a week earlier, he had been denied parole for the tenth time.

So, prison has become the vanishing point, where the incarcerated are literally dying for due process.

Parole decision-making can be infected with implicit cognitive biases, strained by caseload fatigue, and marred by inequity and inequality. See Robert Gebeloff et al., A Parole Decision in Minutes, N.Y. Times, Dec. 4, 2016. And it stamps the poor and people of color with a destiny of absorbing shrapnel from the war on crime. See Michael Winerip et al., For Blacks Facing Parole in New York State, Signs of a Broken System, N.Y. Times, Dec. 5, 2016, at A1.

Conclusion

The millstone of “conviction severity” in parole decision-making eliminates doubt and forecloses the space left for personal transformation and atonement. And when the quantum of guilt at conviction defines the measure of personhood at parole, it rescinds the promise of liberty.

By consistently denying release on facts that never change, human and computerized risk assessments create the impression that people don’t change.

Still, courtroom sentencing sets the stage for parole and creates a moral responsibility to speak to the future, where reformation is possible and rehabilitation real. And justice is ill-served by broken bargains and unkept promises. Ultimately, the fault lies not in machine thinking but in ours.

Ken Strutin is director of legal information services at the New York State Defenders Association.