# COMPAS Algorithm Case Study
> [!metadata]- Metadata
> **Published:** [[2025-02-09|Feb 09, 2025]]
> **Tags:** #🌐 #learning-in-public #artificial-intelligence #ethical-ai #bias-mitigation #cognitive-science
Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is a risk-assessment tool that has become a prominent example of [[Algorithmic Bias]] in the criminal justice system. This case study demonstrates the real-world impact of biased AI systems on people's lives.
## Overview
COMPAS is a software tool developed by Northpointe (now Equivant) that:
- Assesses the likelihood of a [[Recidivism|defendant reoffending]]
- Is used in various U.S. jurisdictions (New York, Wisconsin, California)
- Aids courts in decisions regarding:
  - Bail
  - Sentencing
  - Parole
## Bias Discovery
A 2016 ProPublica investigation revealed significant racial bias:
- The tool disproportionately labeled Black defendants as high-risk
- Black defendants who did not reoffend were nearly twice as likely as white defendants to be flagged as high-risk (a higher false positive rate)
- Error types skewed against Black defendants even though overall accuracy was similar across groups
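The false positive disparity described above is measurable with a simple per-group audit. The sketch below is illustrative only: the data is made up (not actual COMPAS data) and `false_positive_rate` is a hypothetical helper, but the metric itself is the one ProPublica compared across racial groups.

```python
def false_positive_rate(predictions, outcomes):
    """Share of people who did NOT reoffend (outcome 0) but were labeled high-risk (prediction 1)."""
    # Keep only the predictions for true non-reoffenders
    negatives = [p for p, y in zip(predictions, outcomes) if y == 0]
    if not negatives:
        return 0.0
    return sum(negatives) / len(negatives)

# Illustrative toy data: 1 = labeled high-risk / reoffended, 0 = low-risk / did not
group_a_preds = [1, 1, 0, 1, 0, 0, 1, 0]
group_a_truth = [0, 1, 0, 0, 0, 0, 1, 0]
group_b_preds = [0, 1, 0, 0, 0, 1, 0, 0]
group_b_truth = [0, 1, 0, 0, 0, 1, 0, 0]

fpr_a = false_positive_rate(group_a_preds, group_a_truth)  # 2 of 6 non-reoffenders flagged
fpr_b = false_positive_rate(group_b_preds, group_b_truth)  # 0 of 6 non-reoffenders flagged
print(f"Group A FPR: {fpr_a:.2f}, Group B FPR: {fpr_b:.2f}")
```

An audit in this style, run separately for each demographic group, is exactly what surfaces the disparity: overall accuracy can look similar while the burden of false positives falls on one group.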
## Impact on Justice System
The COMPAS case highlighted several critical issues:
- Automated decision-making in sensitive contexts
- Lack of [[Procedural Justice|procedural justice]]
- Need for [[Fairness Definitions|algorithmic fairness]]
- Transparency in AI systems
## Key Findings
1. **Predictive Accuracy**:
- No more accurate than predictions made by untrained laypeople (Dressel & Farid, 2018)
- Showed systematic bias against minority defendants
- Fell short of its stated objective of fair risk assessment
2. **Bias Manifestation**:
- Black defendants who did not reoffend were nearly twice as likely to be misclassified as high-risk
- White defendants who did reoffend were more likely to be misclassified as low-risk
- Perpetuated existing systemic biases
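This pattern of opposite-direction errors is not just an implementation flaw; a known result in the fairness literature (Chouldechova, 2017) shows that when reoffense base rates differ between groups, a score with equal predictive value (PPV) and equal miss rate (FNR) across groups *must* produce unequal false positive rates. A minimal numeric sketch of that identity, with illustrative numbers:

```python
def fpr_from(base_rate, ppv, fnr):
    """Chouldechova (2017) identity:
    FPR = base_rate / (1 - base_rate) * (1 - PPV) / PPV * (1 - FNR)
    """
    return base_rate / (1 - base_rate) * (1 - ppv) / ppv * (1 - fnr)

# Same calibration (PPV = 0.7) and miss rate (FNR = 0.35) for both groups,
# but different base rates of reoffense (numbers are illustrative):
fpr_high = fpr_from(0.5, 0.7, 0.35)  # group with the higher base rate
fpr_low = fpr_from(0.3, 0.7, 0.35)   # group with the lower base rate
print(f"FPR (higher base rate): {fpr_high:.3f}")
print(f"FPR (lower base rate):  {fpr_low:.3f}")
```

The group with the higher base rate ends up with a higher false positive rate even though the score treats both groups identically on the other two metrics, which is why "the algorithm is calibrated" and "the algorithm has unequal error rates" were both true of COMPAS at once.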
## Lessons Learned
1. **Importance of Auditing**:
- Regular evaluation of AI systems
- Independent third-party assessments
- Transparency in algorithmic decision-making
2. **Need for Safeguards**:
- [[Bias Mitigation Techniques]]
- Human oversight in critical decisions
- Clear appeals process
3. **Policy Implications**:
- Regulation of AI in sensitive domains
- Standards for algorithmic fairness
- Protection of individual rights
[Learn more about the COMPAS algorithm and its implications](https://datatron.com/real-life-examples-of-discriminating-artificial-intelligence/)