Integration of Augmented Reality Vision and Speech Modules into the Cognitive Assistant System; Investigating the Effects of Biased Facial Recognition AI Algorithms Used by Law Enforcement in Criminal Investigations on People of Color

Author:
Iyer, Sneha, School of Engineering and Applied Science, University of Virginia
Advisors:
Foley, Rider, EN-Engineering and Society, University of Virginia
Alemzadeh, Homa, EN-Elec & Comp Engr Dept, University of Virginia
Abstract:

Emergency medical technicians (EMTs) collect, filter, and interpret information from real-time sources in order to provide timely and appropriate medical interventions during emergencies. Doing so under high pressure, however, places significant cognitive strain on those responding to a crisis. Assistive technologies can lessen this burden on first responders by improving situational awareness and facilitating sound decision making. In addition, integrating machine learning into cognitive assistant systems aims to improve the accuracy and effectiveness of EMTs by using analytical algorithms to provide relevant feedback. To this end, we developed the Cognitive Assistant for Emergency Medical Services, a system that observes and processes data during response operations and interacts with responders to provide them with feedback and insights. It is important to consider the human and social dimensions of such technology because tools like these can help health care professionals reduce waiting times in the emergency department, decrease errors, and increase the efficiency of care.

While the emergency care cognitive assistant technical project has significant human and social dimensions, the societal portion of this thesis turns to examining how biased facial recognition technology (FRT) algorithms used by law enforcement in criminal investigations affect people of color. I used Latour’s actor-network theory (ANT) to analyze each actor’s role in a network involving AI, law enforcement agents, and government, along with other smaller actors, and investigated the consequences of biased algorithms in this environment. For my research method, I examined media accounts, online articles, and prior research on this topic. I also explored both the development and application of FRT and the challenges of creating complex, unbiased “smart” technology. I investigated three cases in which FRT was inaccurate or led to a wrongful arrest because a person had darker skin: Robert Williams, Michael Oliver, and Nijeer Parks. Finally, I interviewed Dr. Sheng Li, a professional researcher who has done extensive work in the field of facial recognition technology.

I found that the social and cultural contexts in which FRT is deployed have exacerbated existing biases; in law enforcement settings, for example, FRT has disproportionately affected certain minority groups. Bias has arisen among several different actors in the network. Based on this research and analysis, key requirements for the successful implementation of FRT by law enforcement include clear ethical standards for its development, regulation of the creation and use of FRT by both private and public entities, and training for police officers to use FRT as an assistive technology alongside other evidence and investigative methods. Collaboration among the different actors is also essential to advancing FRT.

Overall, both my technical and STS research investigate the role of AI technology in different environments: emergency medicine and criminal investigations, respectively. In both cases, I analyze how AI agents interact with the other actors in these environments, as well as the consequences of those interactions.

Degree:
BS (Bachelor of Science)
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: Homa Alemzadeh
STS Advisor: Rider Foley
Technical Team Members: Keshara Weerasinghe

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2023/05/11