Interfaces and Web Applications for Military Systems: Conversations Surrounding Lethal Autonomous Weapons

Levine, Abigail, School of Engineering and Applied Science, University of Virginia
Elliott, Travis, EN-Engineering and Society, University of Virginia

As technology advances, autonomy has become one of the most frequently discussed topics. Autonomy appears in many different fields, most notably in self-driving vehicles. While self-driving vehicles are the best-known implementation of autonomy, weapons are another major area of application. Lethal autonomous weapons systems (LAWS) have been a significant topic of conversation and controversy regarding the use of autonomous technology. Some weapons systems now apply artificial intelligence (AI) to “replicate the human decision-making process …, outside the confines of a script” (Bills, 2015). One of the main concerns is, and always will be, the effect that this technology has on human lives.

Distinct groups hold their own opinions on how the development of LAWS should be managed, some advocating a preemptive ban on such weapons and others taking a more lenient approach. Through the Social Construction of Technology (SCOT) framework, the effect of human involvement on weapons development can be analyzed. This framework broadly holds that human action shapes technological development. Its main analytical concepts are interpretive flexibility, relevant social groups, closure and stabilization, and wider context. By examining the social groups affected by the development of LAWS and the controversies surrounding them, we can analyze the way LAWS are created.

BS (Bachelor of Science)
LAWS, autonomous

School of Engineering and Applied Science
Bachelor of Science in Computer Science
STS Advisor: S. Travis Elliott

All rights reserved (no additional license for public reuse)