Ethical Algorithmic Justice: Risks of AI in Decision-Making in Criminal Justice

Abstract

Statement of the Problem: Many countries are developing artificial intelligence (AI) to make decisions in criminal justice. However, efforts have focused on the technology and its technical components, without creating a regulatory framework that guarantees that the human rights of those involved will be respected. AI systems are programmed to make decisions automatically, which entails risks that legal systems have not yet addressed. These risks have not been studied previously. The purpose of this study is to describe the risks associated with the implementation of AI in decision-making in criminal justice.
Methodology & Theoretical Orientation: A data analysis and a hermeneutical method were used during this research. The analysis started from the identification of the “algorithmic decision-making” related to the administration of justice, taking into accounts the doctrinal contributions and concepts of experts in charge of the development of these technologies. 
Findings: Three main risks were identified:
1) Biases that come from the data with which these systems are fed. Racial bias stands out, leading to decisions that violate equality, impartiality, and objectivity in judicial decisions.
2) The private provenance of the algorithms. These systems are designed and offered by private companies (legaltech startups), so the algorithms are trade secrets that cannot be subjected to public scrutiny.
3) Difficulties in accountability. AI systems are based on machine learning, so it is difficult to access the way in which they perform their analysis internally, making it impossible to hold them accountable.
Conclusion & Significance: Various risks associated with the implementation of AI in criminal justice may mean the violation of the human rights of those involved. Legal systems need to take legal measures to solve these difficulties. Recommendations are made to guarantee that the implementation of these systems respect the rights of the people.