Alexa Data Monitoring: Virtual Personal Assistants and Information Privacy Protection; Bias in Policing: Predictive Policing Advocates and Opposition in the U.S.

Author:
Lemley, Chase, School of Engineering and Applied Science, University of Virginia
Advisors:
Norton, Peter, Department of Engineering and Society, University of Virginia
Tian, Yuan, Department of Computer Science, University of Virginia
Abstract:

Data can be misleading, causing everyday systems to make discriminatory or biased decisions about the people from whom the data were gathered. How can the dangers of such misjudgments be minimized?
The Amazon Alexa is a voice-recognition tool that answers questions and assists with simple tasks around the house. It has become a common feature of many households, but it poses a threat to families' privacy. To help users better understand what data the Alexa device retains, a dashboard displaying all conversations held with the device was developed. User testing evaluated how people responded to this more readily available information, and the dashboard proved a better process than the current technology developed by Amazon.
In predictive policing, police departments use data analytics to allocate law enforcement resources. The entered data originate from case files. More police departments are adopting the technology, believing it can help them reduce crime rates. As this rapid adoption continues, concerns are growing over the biases embedded in these algorithms. Social groups are competing to determine the future of predictive policing and the extent of its use in law enforcement.

Degree:
BS (Bachelor of Science)
Keywords:
Predictive Policing, Voice Monitoring, Data Collection
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: Yuan Tian
STS Advisor: Peter Norton

Language:
English
Issued Date:
2021/05/10