Autonomous Obstacle Avoidance for Unmanned Aerial Vehicles (UAV); Lack of International Consensus Regarding Usability of Lethal Autonomous Weapons Systems (LAWS)

Author:
Nayhouse, Sammy, School of Engineering and Applied Science, University of Virginia
Forelle, MC, Engineering and Society, University of Virginia

As weaponry has progressed from primitive spears to modern nuclear bombs, the requirement of a human agent to wield it has remained unchanged. However, new advances in artificial intelligence and autonomous technologies are paving the way for a new class of weaponry less reliant on human agents. To address the ethical and technical questions raised by this shift, this portfolio explores autonomy in aerial systems and weaponry through a technical report, involving the development of semi-autonomous drones, and an STS research paper, which investigates the historical impediments to an international consensus surrounding the definition and usability of lethal autonomous weapons systems (LAWS).

The technical report presents a novel approach to drone autonomy that addresses the challenges associated with both fully autonomous unmanned aerial vehicles (UAVs) and fully human-controlled aerial vehicles. This project investigates and enables shared autonomy, a combination of human control input and onboard autonomy, to maintain safety while preserving human input for desired actions. My team designed and developed an aerial platform for shared autonomy and obstacle avoidance in UAVs, utilizing a custom-made printed circuit board (PCB) with 1-D time-of-flight LiDAR sensors. An embedded Robot Operating System (ROS) is employed to visualize real-time LiDAR data and simulate obstacle avoidance for UAV systems. This report highlights our approach to shared autonomy, which allows for safer and more efficient drones and distinguishes our project from past work that relied on switching between human control and onboard autonomy. By striking a balance between human intervention and automation, this technical project contributes to the responsible and ethical development of drone technology in applications such as aerial photography, infrastructure inspection, surveillance, and search and rescue operations.
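The blending of human control and onboard autonomy described above can be illustrated with a minimal sketch. This is a hypothetical example of one common shared-autonomy scheme, not code from the actual project: the pilot's forward-velocity command is attenuated as the nearest LiDAR return approaches a safety threshold, and vetoed entirely inside a stop distance. The function name and threshold values are illustrative assumptions.

```python
def blend_forward_command(pilot_v, ranges, stop_dist=0.4, safety_dist=1.5):
    """Scale the pilot's forward velocity by proximity to the nearest obstacle.

    pilot_v     -- commanded forward velocity in m/s (positive = forward)
    ranges      -- iterable of 1-D time-of-flight distances in meters
    stop_dist   -- distance (m) at which forward motion is fully vetoed
    safety_dist -- distance (m) beyond which the pilot's command passes unmodified
    """
    nearest = min(ranges)
    if nearest <= stop_dist:
        return 0.0            # onboard autonomy overrides: obstacle too close
    if nearest >= safety_dist:
        return pilot_v        # pilot retains full authority
    # Linear blend between full veto and full authority
    scale = (nearest - stop_dist) / (safety_dist - stop_dist)
    return pilot_v * scale
```

Unlike a mode-switching design, this kind of continuous blend keeps the human in the loop at all times while the autonomy only intervenes in proportion to risk.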

In the STS research paper, the Social Construction of Technology (SCOT) framework was utilized to examine the ethical and political considerations of the United States, Russia, and China in relation to lethal autonomous weapons systems (LAWS). The paper argued that the lack of global agreement on LAWS stems from divergent views on the delegation of life-and-death decisions to machines, and that progress may come from prioritizing defensive autonomous systems and promoting trust through transparency and confidence-building measures. Because China supports a ban on only offensive LAWS, the paper suggests that the development of strictly defensive LAWS may be a point of common ground among the global leaders, as it would allow Russia to continue developing autonomous systems for its military goals while addressing the ethical concerns raised by the United States and other countries. Ultimately, this paper provides insights for stakeholders, proposes future research directions, and highlights the implications of ethical guidelines and international collaboration regarding LAWS.

BS (Bachelor of Science)
Artificial intelligence, Autonomous technologies, Weaponry, Unmanned Aerial Vehicles (UAVs), Lethal Autonomous Weapons Systems (LAWS), Shared Autonomy, Obstacle Avoidance, International Consensus
Issued Date: