EEG Controlled Robotics: The Cause of Costly Prosthetics

Author:
Rivas-Zelaya, Joshua, School of Engineering and Applied Science, University of Virginia
Advisors:
Sun, Sarah, EN-Mech & Aero Engr Dept, University of Virginia
Francisco, Pedro Augusto, EN-Engineering and Society, University of Virginia
Abstract:

In an age where science fiction rapidly turns into reality, the dream of futuristic technology enhancing our capabilities as humans has become tangible. Developing an EEG controlled prosthetic is one step toward this enhancement: an external robotic limb controlled by a user’s thoughts, with the goal of providing a non-invasive means of prosthetic control for amputees. The project’s socio-technical component examines the causes of prosthetic device and orthotic service inaccessibility for amputees, a significant demographic affected by a loss of daily capabilities. The junction of the two allows a prosthetic device to be designed with current social issues in mind, mitigating the circumstances that cause inaccessibility so that an inexpensive and broadly accessible solution can be made.

Currently, EEG controlled prosthetics exist only in the field of research; they are not yet available on the market for amputees to use outside of academia. The closest counterpart, electromyography (EMG), which uses external signals from the muscles, is currently the primary method of non-invasive prosthetic control on the market. These devices, however, cost several thousand dollars on the lower end and can exceed $80,000. Developing EEG controlled prosthetics enables a more seamless connection between the device and the user while remaining conscious of cost. The device was 3D printed to save cost, and hardware was kept under an $800 budget to keep accessibility a priority. The OpenBCI Mark IV EEG headset was used for control, and its data was fed into a deep Q-network (DQN) on a Raspberry Pi, which drove motors from its GPIO pins according to the DQN’s output for the user’s EEG signals.
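The abstract does not detail the motor wiring, so the following is a minimal sketch, in Python, of how a predicted class from the DQN could be mapped to GPIO output on the Raspberry Pi. The pin numbers, class labels, and two-motor open/close layout are illustrative assumptions, not the project’s actual configuration.

    # Minimal sketch (assumed configuration): map the DQN's predicted class
    # to motor pins on the Raspberry Pi. Pin numbers and the two-motor
    # open/close layout are illustrative, not the project's actual wiring.
    import RPi.GPIO as GPIO

    MOTOR_PINS = {"open": 17, "close": 27}  # assumed BCM pin assignments

    GPIO.setmode(GPIO.BCM)
    for pin in MOTOR_PINS.values():
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

    def actuate(action: int) -> None:
        """Drive the motor matching the predicted class (0=open, 1=close, else rest)."""
        GPIO.output(MOTOR_PINS["open"], GPIO.HIGH if action == 0 else GPIO.LOW)
        GPIO.output(MOTOR_PINS["close"], GPIO.HIGH if action == 1 else GPIO.LOW)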
A DQN was created and trained on pre-collected EEG data, reaching 99% accuracy at correctly categorizing labeled data. Live data was then tested on the Raspberry Pi using the BrainFlow library, with the trained model loaded into a test script. On live data, the model reached an accuracy of over 88%, surpassing the 80% benchmark set for labeling the model successful.
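As an illustration of the live-testing step, the sketch below pulls a one-second window of EEG samples through BrainFlow’s BoardShim API. The Cyton board ID, the serial port, and the predict() stub standing in for the trained DQN’s forward pass are all assumptions made for the example.

    # Minimal sketch of the live-testing loop, assuming an OpenBCI Cyton board
    # on /dev/ttyUSB0. predict() is a stub standing in for the trained DQN.
    import time
    from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

    def predict(window):
        """Stub for the trained DQN's forward pass (illustration only)."""
        return 0

    params = BrainFlowInputParams()
    params.serial_port = "/dev/ttyUSB0"  # assumed dongle port on the Pi
    board_id = BoardIds.CYTON_BOARD.value
    board = BoardShim(board_id, params)

    board.prepare_session()
    board.start_stream()
    time.sleep(1)  # let the ring buffer fill with ~1 s of samples

    fs = BoardShim.get_sampling_rate(board_id)       # samples per second
    eeg_channels = BoardShim.get_eeg_channels(board_id)
    data = board.get_current_board_data(fs)          # most recent ~1 s window
    window = data[eeg_channels, :]                   # keep only EEG rows

    action = predict(window)                         # DQN class for this window

    board.stop_stream()
    board.release_session()

In a full control loop, the predicted class would then be handed to a motor-actuation routine like the one sketched earlier.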
The STS research study examined why prosthetic and orthotic services remain globally inaccessible to amputees. With 40 million amputees around the world and almost 200,000 amputations annually in the US, a large population is affected by prosthetic inaccessibility. A literature review of scientific journals, articles, and interviews, along with data analysis of key statistics, was used to identify the reasons behind this inaccessibility.

Inaccessibility of prosthetic devices and orthotic services derives from social factors such as ethnicity and economic status, a lack of relevant government and healthcare infrastructure, and a lack of aid from insurance companies. Certain ethnic groups have an increased risk of health issues such as diabetes, which can result in amputation if left untreated and, consequently, the need for prosthetic and orthotic services. These same groups also tend to have higher rates of poverty, so the people most susceptible to needing prosthetic and orthotic services are often those with the fewest resources to afford them.

Degree:
BS (Bachelor of Science)
Keywords:
Prosthetics, Electroencephalography (EEG), Reinforcement Learning, Machine Learning, Wearable Robotics
Notes:

School of Engineering and Applied Science

Bachelor of Science in Mechanical Engineering

Technical Advisor: Sarah Sun

STS Advisor: Pedro Francisco

Technical Team Members: Abigail Dodd, Hailey Boyd, Joshua Rivas-Zelaya, Cayla Celis

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2025/05/08