Marcela del Pilar Roa Avella
Crime prediction is concerned with identifying and examining the predisposing factors for criminal activity in people or places. Currently, there exist A.I. tools that attempt to predict levels of crime risk, as well as facial recognition software. All of these aim to anticipate criminal decisions by tracking trigger factors. Among the most recognised algorithms are COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), PROMETEA, Big Brother Watch, PredPol, the Harm Assessment Risk Tool (HART) and Actuarial Risk Assessment Instruments (ARAIs).
The aim of this paper is to determine the impact of these tools on human rights through a descriptive investigation using deductive analysis. The results demonstrate that incorporating A.I. tools for crime prediction does not guarantee the absence of discrimination or bias, because humans intervene in selecting the data that feeds the algorithm. Moreover, the so-called black box prevents reverse engineering to understand how the software reasons when making decisions and weighing the factors under analysis, which constitutes a violation of human rights.