Human Activity Recognition and Movement Visualization using Smartwatches
Mondol, Md Abu Sayeed, Computer Science - School of Engineering and Applied Science, University of Virginia
Stankovic, John, EN-Comp Science Dept, University of Virginia
Automatic recognition of human activities is of great importance because of its widespread potential applications in domains including healthcare, safety, behavior monitoring, energy management, manufacturing, and elderly care. Human Activity Recognition (HAR) is the cornerstone of most of these applications, so their performance largely depends on the accuracy and robustness of the underlying activity recognition models. However, activity recognition is challenging, particularly in natural settings, due to issues such as confounding gestures shared across different activities, large diversity in how the same activity is performed, the wide range of possible human activities, and the usability of the solutions. In addition, on-device processing, required by many real-time applications, is challenging due to the limited resources available on wearable devices. In contrast to state-of-the-art methods that mostly emphasize feature engineering and classification techniques for HAR, our work focuses on leveraging the orientation of the device and the distribution of the data to develop more efficient, robust, and accurate solutions for activity recognition. We developed a novel and efficient algorithm for detecting eating events from wrist-worn accelerometers. The algorithm improves eating gesture detection F1-score by 0.19 while requiring less than 20% of the computation of a baseline method. We also developed and deployed a comprehensive system for monitoring family eating dynamics that uses this eating event detection solution. We present a solution for hand washing detection that reduces false positives from unseen activities by about 77% and improves the overall F1-score by 0.17 compared to a baseline method. Data visualization is useful for understanding data and their characteristics, and it is a fundamental step toward developing data-driven solutions. 
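To illustrate the kind of orientation signal the paragraph above refers to: when the wrist is roughly still, a smartwatch accelerometer measures mainly gravity, so the tilt of the device can be estimated from a single sample. This is a minimal, illustrative sketch; the axis convention and function name are assumptions, not the dissertation's actual algorithm.

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (in degrees) from one accelerometer
    sample, treating the measured vector as gravity.
    Illustrative only: axis convention is assumed, not taken from
    the dissertation's method."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A watch lying flat, gravity along +z: zero pitch and zero roll.
print(tilt_angles(0.0, 0.0, 9.81))  # -> (0.0, 0.0)
```

Features like these tilt angles are cheap to compute per sample, which matters for the on-device, real-time processing constraints mentioned above.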
A visualization method that provides utility beyond existing approaches is therefore highly desirable. We developed a novel method for visualizing movement and orientation using inertial sensors. We analyzed a dataset related to smoking activity recognition and provided several insights by representing the data with our method. Using these insights, we developed an efficient solution for smoking puff detection. Additionally, we illustrated how our method can be used to monitor movement patterns in several rehabilitation exercises. Reminders are often used in activity recognition systems; although the focus of the thesis is activity recognition and movement visualization, we developed a voice-based interactive reminder system to close the loop. Finally, this dissertation describes an easy-to-use tool that we developed to collect sensor data from smartwatches.
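One simple building block for visualizing orientation over time is to normalize each accelerometer sample to a unit direction vector, which can then be traced on a sphere or projected to 2D. This sketch shows only that generic preprocessing step under the assumption that each sample is dominated by gravity; it is not the dissertation's actual visualization method.

```python
import math

def unit_gravity_directions(samples):
    """Normalize each (ax, ay, az) accelerometer sample to a unit
    vector. Plotting these directions over time is one generic way
    to visualize wrist orientation; the dissertation's method may
    differ."""
    out = []
    for ax, ay, az in samples:
        norm = math.sqrt(ax * ax + ay * ay + az * az)
        out.append((ax / norm, ay / norm, az / norm))
    return out

# Two samples: gravity along +z, then along +y (a 90-degree tilt).
stream = [(0.0, 0.0, 9.81), (0.0, 9.81, 0.0)]
print(unit_gravity_directions(stream))
```

Because each direction is independent of the accelerometer's magnitude scale, the same trace can be compared across devices with different sensor ranges.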
PHD (Doctor of Philosophy)
Human Activity Recognition, Smartwatch, Eating Detection, Hand Washing, Movement Visualization
All rights reserved (no additional license for public reuse)