
Justice by Algorithm: Are Artificial Intelligence Risk Assessment Tools Biased Against Minorities?

Conklin, Michael and Wu, Jun, Justice by Algorithm: Are Artificial Intelligence Risk Assessment Tools Biased Against Minorities? (June 30, 2021). Available at SSRN: https://ssrn.com/abstract=3877686 or http://dx.doi.org/10.2139/ssrn.3877686

“This is a review of Katherine B. Forrest’s new book When Machines Can Be Judge, Jury, and Executioner. The book does an excellent job discussing issues of fairness and racial disparities arising from the use of artificial intelligence risk assessment tools (hereinafter “AI”) for decisions such as pretrial release and likelihood of recidivism. This is a timely topic, as the technology is currently at a tipping point. While Europe has begun to implement protections for defendants regarding AI, the U.S. is increasing its reliance on AI without such safeguards. This review includes a discussion of how AI compares to human judges’ predictions and decisions, fairness and racial outcomes, how recidivism is frequently misunderstood and why that matters, how human decisions are inextricably intertwined with AI, and the proper understanding of an AI’s “error rate.””
