AI Algorithms in Judicial Decision-Making

Overview of the State v. Loomis Case

  • Legal Context: The case highlights the use of proprietary algorithms in judicial decisions, particularly in assessing risk of recidivism.

  • Central Figure: Eric Loomis, whose sentence was heavily informed by an algorithmic risk assessment rather than human judgment alone.

Key Events in the Case

  • Court's Decision:

    • The sentencing court cited Loomis’s high risk of recidivism in denying probation.

    • Loomis was sentenced to six years in prison in a decision that relied in part on a proprietary risk-assessment tool called COMPAS.

Role of Algorithm in Legal Decision-Making

  • COMPAS Software (Correctional Offender Management Profiling for Alternative Sanctions):

    • A proprietary tool used to assess the likelihood of a defendant committing further crimes.

    • Generated visual outputs (e.g., bar charts) that influenced judicial decisions.

  • Judge's Interaction with the Algorithm:

    • The judge relied on the risk score without significant engagement with Loomis’s defense.
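The bar-chart outputs described above can be pictured with a minimal sketch. The 1–10 decile scale, the band cut-offs, and the scale names below are illustrative assumptions, not COMPAS's actual proprietary methodology:

```python
# Illustrative sketch of a decile-style risk summary rendered as a text bar
# chart. The 1-10 scale and the Low/Medium/High cut-offs are hypothetical.

def risk_band(decile: int) -> str:
    """Map a 1-10 decile score to a coarse risk band (hypothetical cut-offs)."""
    if not 1 <= decile <= 10:
        raise ValueError("decile must be between 1 and 10")
    if decile <= 4:
        return "Low"
    if decile <= 7:
        return "Medium"
    return "High"

def bar_chart(scores: dict) -> str:
    """Render scores as a simple text bar chart, one row per scale."""
    lines = []
    for scale, decile in scores.items():
        bar = "#" * decile + "." * (10 - decile)
        lines.append(f"{scale:<22} [{bar}] {decile}/10 ({risk_band(decile)})")
    return "\n".join(lines)

# Hypothetical scores for demonstration only.
print(bar_chart({
    "Pretrial risk": 9,
    "General recidivism": 8,
    "Violent recidivism": 6,
}))
```

The point of the sketch is how compressed such a display is: a judge sees a bar and a band label, not the inputs or weights that produced them.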

Ethical and Legal Implications

  • AI in Legal Judgments:

    • The decision raises concerns about the use of AI in making life-altering legal judgments.

    • Loomis’s inability to scrutinize the algorithm’s decision-making process points to potential injustices.

  • Duties of Professionals Relying on AI:

    • Attorneys and HR professionals bear the responsibility of ensuring ethical and legal standards in decisions made using AI.

    • A warning to all professionals: reliance on AI tools cannot replace due diligence and ethical judgment.

The Issue of Transparency in AI Tools

  • Right to Know - Due Process:

    • Loomis argued his due process rights were violated due to lack of transparency in how the risk score was computed.

    • The proprietary nature of COMPAS prevented him from verifying its scientific validity or assessing it for potential biases.

  • Trade Secrets vs. Individual Rights:

    • The court acknowledged the conflict between protecting proprietary information and a defendant’s right to challenge the evidentiary basis for judgments.

Court's Ruling and its Implications

  • Wisconsin Supreme Court Decision:

    • Although the court recognized the concerns surrounding COMPAS, it upheld Loomis’s sentence.

    • The ruling allows judges to consider proprietary risk assessments, provided the reports are accompanied by specific written cautions.

  • Warning Label Established:

    • COMPAS scores are to be treated as supplementary evidence rather than a definitive basis for sentencing.

    • Judges must be advised that these scores may not be determinative in individual cases, emphasizing the need for a complete consideration of the individual’s circumstances.

  • Group Data vs. Individual Reality:

    • The algorithms provide assessments derived from group behavior statistics, which may not accurately reflect an individual’s actual risk.
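The group-versus-individual gap can be made concrete with a minimal sketch: an actuarial score assigns every member of a statistical group the same risk number, even though outcomes within the group vary widely. All numbers below are made up for illustration:

```python
# Minimal sketch of the group-vs-individual gap. An actuarial "risk score"
# is a group-level statistic; everyone who shares the same input features
# receives the identical number. All data here is hypothetical.

from statistics import mean

# Hypothetical historical outcomes (1 = reoffended) for one group of people
# who share the same input features (age band, prior record, etc.).
group_outcomes = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]

group_rate = mean(group_outcomes)  # the group's base rate becomes the "score"

# Two hypothetical defendants with identical input features receive the
# identical score, regardless of individual circumstances the features
# do not capture.
score_for_defendant_a = group_rate
score_for_defendant_b = group_rate

print(f"Group base rate: {group_rate:.0%}")
assert score_for_defendant_a == score_for_defendant_b
```

In this toy group, 40% of past members reoffended and 60% did not; the score communicates only the 40%, saying nothing about which outcome any particular individual will actually have.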

Call to Action for Professionals

  • Responsibilities of Legal and HR Professionals:

    • Professionals must act as a ‘human firewall’, ensuring that AI does not dictate judgments about liberty, employment, or reputation.

    • There is a strong emphasis on demanding transparency and accountability from AI systems used in decision-making.

  • Need for Education on AI Usage:

    • Subscribe to platforms like AI Justice Unpacked for further guidance on integrating AI responsibly into professional settings.