Automated Ring Light: How Technology Shapes Gender Bias

Author:
Knorr, Daniel, School of Engineering and Applied Science, University of Virginia
Advisors:
Rogers, Hannah, EN-Engineering and Society, University of Virginia
Powell, Harry, EN-Elec/Computer Engr Dept, University of Virginia
Abstract:

The technical project in this portfolio was inspired by an interaction the author had during an internship in summer 2019. He was designing a website for a new project and was using headshots from the company's human resources system to accompany biographies of key people involved with the project. His manager approached him with a concern about the headshot of one of her colleagues, also a woman, who was not looking her best in the photo. The manager explained that women in technical roles often, and wrongly, have their technical abilities judged by their appearance: a woman seen without makeup or proper lighting in an image may unfairly be perceived as less capable. As remote workplaces and video conferencing become more common, close-up appearance grows more important. The Automated Ring Light project aimed to present users at their best during video conferences. In the end, the light was able to track the user's head movements about their workstation, providing optimal lighting and improving their appearance.

The Automated Ring Light project is a form of technological fix. Ideally, people would not judge the abilities of others based on their appearances. This ring light project was intended to alleviate inequities in the use of video conferencing technologies and combat bias in the virtual workplace. That goal inspired the STS project in this portfolio: an investigation into the gender biases present in the use and development of digital technologies at large. The project found the relationship between gender bias and technology to be reciprocal. In some cases, gender biases at work during the development process yielded biased products; in others, biased products perpetuated gender biases in their users. During his research, the author corresponded with a leading author in the field of technofeminism, Professor Judy Wajcman of the London School of Economics. Wajcman shared the author's view that pandemic technologies would unequally affect men and women; specifically, time spent on video conferencing technologies would harm women more because of their unique experience of appearance bias. Through an examination of industry practices and the successes and failures of digital products, this project produced suggestions for technology designers seeking to reduce the bias built into and perpetuated by their products.

Degree:
BS (Bachelor of Science)
Keywords:
TechnoFeminism, Political Technologies, Automation, Bias
Notes:

School of Engineering and Applied Sciences
Bachelor of Science in Computer Engineering
Technical Advisor: Harry Powell
STS Advisor: Hannah Rogers
Technical Team Members: Sophia Fasano, Charles Ferraro, Ethan Staten

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2021/05/05