000 02826cam a2200337 4500
005 20250121115550.0
041 _afre
042 _adc
100 1 0 _aBeaudouin, Valérie
_eauthor
700 1 0 _aMaxwell, Winston
_eauthor
245 0 0 _aPredicting risk in criminal justice in the United States: The ProPublica-COMPAS case
260 _c2023.
500 _a37
520 _aAn article published by the independent non-profit news organization ProPublica in 2016 argued that COMPAS software, used in the United States to predict recidivism, was ‘biased against blacks’. The publication sent shockwaves through the public sphere, fuelling broad debate on the fairness of algorithms and the merits of risk prediction tools – debates that had previously been limited to specialists in criminal justice. Starting with the ProPublica-COMPAS case, we explore the various facets of this controversy, both in the world of data science and in the world of criminal justice. In the media sphere, the COMPAS affair brought to the surface the potential abuses associated with algorithms and intensified concerns surrounding artificial intelligence (fear of AI replacing human judgment, worsening inequalities, and opacity). In the academic world, the subject was pursued in two separate arenas. First, in the data science arena, researchers focused on two issues: fairness criteria and their mutual incompatibility, which shows just how problematic it is to translate a moral principle into statistical indicators; and the supposed superiority of machines over humans in prediction tasks. In the criminal justice arena, which is much more heterogeneous, the ProPublica-COMPAS case strengthened the realization that predictive tools need to be evaluated more thoroughly before use, and that it is necessary to understand how judges use these tools in context, leading lawmakers and NGOs defending prisoners’ rights to revise their positions. While the data science arena is relatively self-contained, focusing on data and algorithms out of their operational context, the criminal justice arena, which brings together heterogeneous actors, focuses on the tools’ actual usage in the criminal justice process.
690 _arisk assessment
690 _aalgorithms
690 _acriminal justice
690 _aalgorithm evaluation
690 _afairness
690 _acontroversy
690 _apredictive algorithms
786 0 _nRéseaux | n° 240 | 4 | 2023-09-21 | p. 71-109 | 0751-7971
856 4 1 _uhttps://shs.cairn.info/journal-reseaux-2023-4-page-71?lang=en&redirect-ssocas=7080
999 _c549363
_d549363