The sexist algorithm
Author(s) - Melissa Hamilton
Publication year - 2019
Publication title - Behavioral Sciences & the Law
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.649
H-Index - 74
eISSN - 1099-0798
pISSN - 0735-3936
DOI - 10.1002/bsl.2406
Subject(s) - recidivism , equity (law) , algorithm , criminal justice , risk assessment , machine learning , psychology , criminology , law , political science
Algorithmic risk assessment tools are informed by scientific research concerning which factors are predictive of recidivism and thus support the evidence‐based practice movement in criminal justice. Automated assessments of individualized risk (low, medium, high) permit officials to make more effective management decisions. Computer‐generated algorithms appear to be objective and neutral. But are these algorithms actually fair? The focus herein is on gender equity. Studies confirm that women typically have far lower recidivism rates than men. This differential raises the question of how well algorithmic outcomes fare in terms of predictive parity by gender. This essay reports original research using a large dataset of offenders who were scored on the popular risk assessment tool COMPAS. Findings indicate that COMPAS performs reasonably well at discriminating between recidivists and non‐recidivists for men and women. Nonetheless, COMPAS algorithmic outcomes systemically overclassify women in higher risk groupings. Multiple measures of algorithmic equity and predictive accuracy are provided to support the conclusion that this algorithm is sexist.
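The abstract's central fairness test, predictive parity, asks whether a given risk label carries the same observed recidivism rate for each gender. A minimal sketch of that check, using hypothetical toy records rather than the article's COMPAS dataset (field names, risk bands, and values are all assumptions for illustration):

```python
# Illustrative sketch only: checking predictive parity by gender on toy data.
# Field names ("gender", "risk_band", "recidivated") and the toy records are
# assumptions, not the article's actual COMPAS data or methodology.

from collections import defaultdict

def predictive_parity(records, group_key="gender", band_key="risk_band",
                      outcome_key="recidivated"):
    """Observed recidivism rate per (group, risk band) pair.

    Predictive parity holds when, within each risk band, the observed
    recidivism rate is approximately equal across groups.
    """
    counts = defaultdict(lambda: [0, 0])  # (group, band) -> [recidivists, total]
    for r in records:
        key = (r[group_key], r[band_key])
        counts[key][0] += 1 if r[outcome_key] else 0
        counts[key][1] += 1
    return {key: events / total for key, (events, total) in counts.items()}

# Hypothetical toy sample (not real data).
toy = [
    {"gender": "F", "risk_band": "high", "recidivated": False},
    {"gender": "F", "risk_band": "high", "recidivated": True},
    {"gender": "M", "risk_band": "high", "recidivated": True},
    {"gender": "M", "risk_band": "high", "recidivated": True},
]

rates = predictive_parity(toy)
# In this toy sample the "high" label corresponds to a 0.5 observed rate for
# women but 1.0 for men: the same label signals different risk by gender,
# the kind of disparity the article's overclassification finding describes.
```

A within-band gap like this is what the essay's "overclassification" claim amounts to: women placed in higher risk groupings reoffend at lower rates than men given the same label.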