Thermoelectric Water Bottle Cooling Station; Societal Implications From Racial Bias In Machine Learning and Artificial Intelligence
Patterson, Everett, School of Engineering and Applied Science, University of Virginia
Powell, Harry, EN-Elec/Computer Engr Dept, University of Virginia
Ferguson, Sean, EN-Engineering and Society, University of Virginia
The technical portion of this portfolio details a project undertaken by me and my group members, all of whom share a genuine interest in sports science and medicine. As a computer engineer and someone who has always enjoyed athletics as both a competitor and a spectator, I have long been intrigued by technological innovation in sports. Drawing on this mutual interest, our group brainstormed several project ideas and eventually settled on the problem of athletes staying properly hydrated when training outside in hot weather. From there, we narrowed our idea down to a self-cooling water bottle device, which we later refined into an innovation we called the Water Bottle Cooling Station. The goal was to design a refrigeration device portable enough to sit on an office desk, for example, that could efficiently cool a water bottle down to 40°F. A surprising amount of water is wasted by people either throwing out warm water or running the tap while waiting for it to turn cool. In an attempt to address this issue, we wanted to see whether we could engineer a portable, convenient, self-contained cooling process. Even though our final project deviated from the initial proposals relating to sports science, we still enjoyed undertaking a complex project that aimed to address what seemed to be a simple problem.
My STS thesis, on the other hand, concentrated on a much larger-scale issue. Growing up, I had always noticed subtle instances of racial bias in technology, such as internet search engine results reinforcing certain racial stereotypes. I decided to look further into racial bias and technology, and more specifically racial bias in artificial intelligence. At first it might seem a ridiculous proposition to declare that computers are capable of racism, as their purpose is quite literally to operate on logic alone. After conducting my initial research, however, I realized that the issue is not with the technology but with human behavior. In other words, the data used to train AI systems can contain biases that reflect human biases, especially if that data was improperly collected and maintained. This is why I chose to write about racial bias in predictive policing. My STS research paper takes a deeper look into how poor policing practices and a lack of hierarchical oversight in the law enforcement system lead police departments across the nation to adopt powerful AI tools that only magnify existing poor policing practices.
BS (Bachelor of Science)
wicked problem, artificial intelligence, predictive policing
School of Engineering and Applied Science
Bachelor of Science in Computer Engineering
Technical Advisor: Harry Powell
STS Advisor: Sean Ferguson
Technical Team Members: Micah Harris, Robin Watkins, Mac Baskin
English
All rights reserved (no additional license for public reuse)
2020/05/05