Gesture Watch; Computer Algorithms and Their Inherent Social Bias

Author:
Tan, Pearak, School of Engineering and Applied Science, University of Virginia
Advisor:
Powell, Harry, EN-Elec/Computer Engr Dept, University of Virginia
Abstract:

Computer algorithms and the technologies built on them are easily misunderstood, and their use can have unintended effects. At present there is little national regulation in place in the U.S., and no sign that this will change in the near future. The technical project associated with this paper consists of creating a gesture-based watch that interprets hand gestures as commands and sends those commands to a paired device. The software is designed to meet only its intended use cases, and the watch's functionality is disabled when not in use in order to eliminate unintended uses that could introduce bias. The STS research paper proposes a viable solution to the social discrimination problems that accompany computer algorithms. In the majority of cases, discriminatory computer algorithms are dealt with only after they have caused damage. The problem is exacerbated by the fact that there is little to no interaction between the U.S. government, its population, and the entities that create and publish these algorithms. Every computer algorithm is created to solve a societal problem; however, while an algorithm might solve the problem it sought to fix, other problems can arise, especially in the realm of social discrimination. The gesture watch serves to facilitate an easier means of performing functions remotely while also avoiding potential bias in its software algorithm.

The technical project was successfully completed at the conclusion of the Fall 2019 semester. The watch has two major modes: timekeeping and gesture. The timekeeping mode displays the time using a ring LED array. The gesture mode allows the watch to be paired with a device via Bluetooth and to send commands to that device; for example, performing a swiping gesture with the hand advances a presentation slide. The device produced has little chance of being used in a malicious, biased manner, and all of the coded gestures work well; however, more work could have been done to mitigate false-positive gestures.

How can we make sure that the computer algorithms being developed have little to no unintended social bias? This paper outlines a solution based on increased interaction between relevant stakeholders such as the U.S. population and the government. The solution seeks to change the status quo of current software development practices and to introduce government intervention into the software development environment. It is built on accountability for the entities that develop computer algorithms, citizen awareness, and regulation, and it is outlined using an Actor-Network Theory model. The model introduces two new actors that do not currently exist: technologically literate computer ethics testers and a new government agency to regulate computer algorithms. As new software technologies and computer algorithms are introduced, the need to protect society from their harmful effects becomes increasingly pressing. The paper explores current instances of software causing discriminatory harm, such as Amazon's AI-based resume screening and the University of Wisconsin-Stout ranking students based on website cookies, and uses these instances, together with a review of the software legislation currently in place, to provide an alternative to current practices.
Certain computer algorithm guidelines are also explored, chiefly the Ten Commandments of Computer Ethics, a set of ethical commandments created by the Computer Ethics Institute. Having a concrete ethics guideline for software developers to abide by mitigates the possibility that unintended uses of software algorithms will be exploited. These measures would be a step in the right direction toward making computer algorithms more transparent.
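
As an illustration of the gesture mode described above, the following is a minimal sketch, not the project's actual firmware, of how a single accelerometer reading might be classified as a swipe and forwarded as a command over the Bluetooth pairing. The helper names readAccel and bleSendCommand, the threshold value, and the command strings are assumptions introduced for illustration; the watch's real hardware and libraries are not specified in this abstract.

```cpp
// Hypothetical sketch of the gesture-mode flow: read motion data, classify it
// as a gesture, and send the matching command to a Bluetooth-paired device.
// The sensor and BLE helpers below are placeholder stand-ins, not the real
// watch's API.
#include <cstdio>

enum class Gesture { None, SwipeLeft, SwipeRight };

struct Accel { float x, y, z; };

// Placeholder for reading the watch's accelerometer.
Accel readAccel() { return {1.2f, 0.1f, 0.0f}; }

// Simple threshold classifier: a strong lateral acceleration counts as a swipe.
// Raising the threshold trades missed gestures for fewer false positives,
// the trade-off noted in the abstract.
Gesture classify(const Accel& a) {
    const float kThreshold = 1.0f;  // assumed units of g
    if (a.x > kThreshold)  return Gesture::SwipeRight;
    if (a.x < -kThreshold) return Gesture::SwipeLeft;
    return Gesture::None;
}

// Placeholder for sending a command (e.g., "next slide") over the BLE link.
void bleSendCommand(const char* cmd) { std::printf("BLE -> %s\n", cmd); }

int main() {
    bool gestureModeEnabled = true;  // functionality is disabled when not in use

    if (gestureModeEnabled) {
        switch (classify(readAccel())) {
            case Gesture::SwipeRight: bleSendCommand("NEXT_SLIDE"); break;
            case Gesture::SwipeLeft:  bleSendCommand("PREV_SLIDE"); break;
            case Gesture::None:       break;  // ignore motion below threshold
        }
    }
    return 0;
}
```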

Degree:
BS (Bachelor of Science)
Keywords:
Actor-Network Theory, Computer Algorithms, Social Bias, Gesture Watch
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Engineering
Technical Advisor: Harry Powell
STS Advisor: Catherine Baritaud
Technical Team Members: Julian Nguyen, Edward Ryan

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2020/05/08