Upgrade Alerts: Preventing the Use of Bad App Versions; Discrimination in the U.S. Criminal Justice System from Recidivism Score Algorithms

Author:
Kisly, Grace, School of Engineering and Applied Science, University of Virginia
Advisors:
Bloomfield, Aaron, EN-Comp Science Dept, University of Virginia
Forelle, MC, EN-Engineering and Society, University of Virginia
Abstract:

Software engineers must understand the potential repercussions of automation before launching new software, and must have safeguards in place to keep their software in check if problems arise. Courts have been using artificial intelligence that scores defendants on their likelihood of reoffending to aid in sentencing, which can be extremely detrimental to defendants when the AI produces incorrect scores. The science, technology, and society (STS) research examines the consequences of racial and gender bias in recidivism score algorithms and how that bias manifests. While the STS topic explores where the discrimination in the algorithm stems from in order to learn what must be addressed to reduce and control bias, the technical project strives to create a feature that blocks app users from accessing broken or outdated software. Both topics present solutions that prevent software misuse and resolve the issues encountered by those who interact with the respective technologies.

The technical project focuses on Mastercard, a financial services company that had no way of preventing users from running faulty or outdated versions of its expense-tracking application, which could lead to crashes and failures on the user's end. To address this, I developed two types of alerts for iOS that either suggest or force an update before the user can enter the app. The alerts restrict access to broken software and let developers control which versions users can run by setting a minimum required version number. This prevents poor user experiences, limits bugs and defects, and ensures users are interacting with the newest features and app improvements. The feature was tested successfully and deployed to over 500 users.
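
Mastercard's exact implementation is proprietary and not reproduced here, but the following is a minimal Swift sketch of how such a version gate is commonly structured on iOS. The VersionGate type, the remotely configured minimumVersion and latestVersion values, and the App Store URL are illustrative assumptions rather than details from the project.

    import UIKit

    /// Hypothetical version gate: compares the installed app version against
    /// remotely configured minimum and latest versions, then shows either a
    /// blocking ("force") alert or a dismissible ("soft") alert.
    struct VersionGate {
        let minimumVersion: String   // below this, the app is blocked
        let latestVersion: String    // below this, an update is suggested

        /// Numeric, component-wise comparison (e.g. "1.9" < "1.10").
        private func isVersion(_ lhs: String, olderThan rhs: String) -> Bool {
            lhs.compare(rhs, options: .numeric) == .orderedAscending
        }

        /// Opens a placeholder App Store page (real code would use the app's ID).
        private func openAppStore() {
            if let url = URL(string: "https://apps.apple.com/app/id0000000000") {
                UIApplication.shared.open(url)
            }
        }

        /// Presents the appropriate alert on launch, if any.
        func checkAndAlert(from viewController: UIViewController) {
            guard let installed = Bundle.main.infoDictionary?["CFBundleShortVersionString"] as? String else { return }

            if isVersion(installed, olderThan: minimumVersion) {
                // Force update: no cancel action, so the user cannot proceed.
                let alert = UIAlertController(
                    title: "Update Required",
                    message: "This version is no longer supported. Please update to continue.",
                    preferredStyle: .alert)
                alert.addAction(UIAlertAction(title: "Update", style: .default) { _ in
                    self.openAppStore()
                })
                viewController.present(alert, animated: true)
            } else if isVersion(installed, olderThan: latestVersion) {
                // Soft update: a "Later" action lets the user dismiss and continue.
                let alert = UIAlertController(
                    title: "Update Available",
                    message: "A newer version is available with the latest improvements.",
                    preferredStyle: .alert)
                alert.addAction(UIAlertAction(title: "Later", style: .cancel))
                alert.addAction(UIAlertAction(title: "Update", style: .default) { _ in
                    self.openAppStore()
                })
                viewController.present(alert, animated: true)
            }
        }
    }

The distinction between the two alert types comes down to the actions attached: the force alert offers only "Update", so it cannot be dismissed and the user never reaches the app, while the soft alert adds a cancel action that lets the user proceed with the older version.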

The STS research explored the criminal risk assessment tool Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) and how it perpetuates discrimination in the criminal justice system. Recidivism score algorithms were introduced to decrease inconsistency and inaccuracy in judicial decisions, with the goal of reducing the United States' incarceration rate, the highest in the world, and the disproportionate incarceration of people of color. In practice, however, COMPAS is used to determine sentencing, where it gives harsher scores to people of color and to women, perpetuating the disproportionate incarceration of minorities. The research examines how biased historical criminal data, the underrepresentation of people of color and women on developer teams, ambiguous guidelines for judges, outdated civil rights acts, and inconsistent AI regulation contribute to COMPAS giving inaccurate and unjust scores to defendants. To control and reduce the bias, there is a need for adequate testing of data for racial and gender bias, diverse representation on developer teams, clear guidelines and training for judges on proper use, civil rights acts updated to cover AI, and stricter AI regulation. With these changes, COMPAS would be used only to determine parole eligibility while being continuously checked for appropriate and unbiased use. The solutions proposed by my research aim to help COMPAS move toward its intended goal of reducing the prison population of low-risk, non-violent offenders and thereby combating disproportionate incarceration rates.

Both projects encouraged me to employ solution-oriented thinking when examining ethical and technical issues with software. I approached each project with the goal of proposing pathways toward positive change, rather than solely explaining what the problem is and why it matters. Working on the thesis prompted me to consider more critically the social motivations for developing my technical project, rather than just the desire to build a new feature. It made me reflect on why developers need to be able to control the software that users can access, and it gave me an appreciation for an element found in many other apps that often seems like a hindrance on the surface. Moreover, my technical work informed my thesis research on what can be done from the developer's side when social issues arise. While preliminary research suggested that the COMPAS algorithm was simply being wrongfully used in courts, resulting in discrimination, understanding the engineer's perspective helped me realize that the problems trace back to the technology itself and the teams that create it. This taught me that software developers have a responsibility to be mindful of the detrimental consequences of what they design and develop. By studying a severe case in which software negatively impacts the lives of those who depend on an algorithm's output, I now know what can go wrong when implementing technology and will remember COMPAS in the growing age of artificial intelligence. The STS research coupled with the technical project taught me to be cognizant of social ramifications as I move forward in my software engineering career.

Degree:
BS (Bachelor of Science)
Keywords:
artificial intelligence, recidivism score algorithm, COMPAS, criminal justice, criminal risk assessment tool, update alert, force update, soft update
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: Aaron Bloomfield
STS Advisor: MC Forelle

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2023/05/11