When Bail Is Denied By An Algorithm

June 7, 2016

Algorithms are finding their way into the criminal justice system, and some argue that they make the system fairer by taking human bias out of the equation. There is evidence, however, that they may have the opposite effect, because bias can be built right into the algorithm. A recent ProPublica article leveled that charge, specifically a charge of racial bias, against COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), a system widely used in U.S. courts to score the risk that a defendant will reoffend, a factor judges weigh when setting bail. Boston Globe writer Hiawatha Bray examines those claims, along with similar problems reportedly built into more familiar algorithms, such as those behind Google’s search engine and Facebook’s “Trending Topics.” The idea that algorithms may not be neutral is especially troubling in the criminal justice system, where a case can be made that relying on them leaves the accused with no way to confront the accuser. Like most companies whose core product is an algorithm, the company that sells COMPAS considers it proprietary.

Read full article at:
