Algorithmic Bias in Facial Recognition: Exploring Racial and Gender Disparities through CNN Models; Gender Shades and George Floyd’s Death: Raising Public Awareness of Algorithmic Bias

Author:
Yoon, Claire, School of Engineering and Applied Science, University of Virginia
Advisors:
Vrugtman, Rosanne, EN-Comp Science Dept, University of Virginia
Neeley, Kathryn, EN-Engineering and Society, University of Virginia
Abstract:

Algorithmic bias in facial recognition technology has emerged as a significant concern because these systems misidentify people of certain races and genders at higher rates, leading to harmful outcomes. To investigate how facial recognition systems work and the factors that contribute to misidentification, I worked on a project titled Celebrity Facial Recognition with two fellow students. The project was designed to test whether our machine learning models could accurately identify the correct person from a group of people with similar appearances. Using a convolutional neural network (CNN) for image classification, web scraping and web crawling to collect data for six demographic groups (White male, White female, Black male, Black female, Asian male, and Asian female), and the PyTorch framework for transfer learning, we trained and evaluated the models. Once images were gathered from a search engine, the models were trained on this input data, passing it through multiple layers to identify the correct subject. The models identified the correct person with about 80% validation accuracy across all six groups, with the White male and Asian male groups showing the lowest validation accuracy. Future work on the project could include expanding the high-quality datasets and testing different model configurations to achieve equitable performance across diverse populations.
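
To illustrate the transfer-learning approach described in the abstract, the sketch below shows one way such a pipeline might be set up in PyTorch: a pretrained backbone with its final layer replaced to classify the collected identities, followed by a per-epoch validation accuracy check. The ResNet-18 backbone, folder layout, and hyperparameters are assumptions for illustration and are not taken from the project itself.

```python
# Minimal transfer-learning sketch (illustrative only): pretrained ResNet-18
# with a new classification head for celebrity identities. The backbone
# choice, data paths, and hyperparameters are assumptions, not the project's
# actual configuration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),               # input size expected by the backbone
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406],  # ImageNet statistics
                         [0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: one subfolder of scraped images per identity.
train_data = datasets.ImageFolder("data/train", transform=transform)
val_data = datasets.ImageFolder("data/val", transform=transform)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)
val_loader = DataLoader(val_data, batch_size=32)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                  # freeze pretrained layers
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))  # new head

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

for epoch in range(5):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

    # Report validation accuracy after each epoch.
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for images, labels in val_loader:
            preds = model(images).argmax(dim=1)
            correct += (preds == labels).sum().item()
            total += labels.size(0)
    print(f"epoch {epoch + 1}: validation accuracy = {correct / total:.2%}")
```

A setup like this also makes it straightforward to report validation accuracy separately for each demographic group, which is how disparities such as the lower accuracy observed for the White male and Asian male groups would surface.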

Degree:
BS (Bachelor of Science)
Keywords:
Artificial Intelligence, Algorithmic Bias, Facial Recognition Technology, Gender Shades, George Floyd
Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2023/12/18