Toward Accurate and Comprehensive Multimodal Health Sensing Using Wearables

Samyoun, Sirat, Computer Science - School of Engineering and Applied Science, University of Virginia
Stankovic, John, EN-Comp Science Dept, University of Virginia

Modern wearable devices, equipped with embedded physiological and motion sensors, have emerged as an unobtrusive way to sense the physical and mental health of the user in daily life. For example, it is possible to monitor daily activities such as handwashing and exercising using motion sensor data, as well as to continuously monitor different mental health states (e.g., stress, anxiety, emotions) based on physiological signals. However, these health sensing solutions, when built on wrist-based wearables only (e.g., smartwatches), generally suffer from performance degradation due to poor quality of the sensor signals. Although learning representations across multiple modalities helps to improve detection accuracy, existing works also have several limitations in terms of comprehensive health sensing. First, existing works fail to provide qualitative interactive assistance for following guidelines, for example, indicating which step(s) or repetition(s) were missed during an exercise or a handwashing event. Moreover, these works do not focus on building a generalized or unified solution that can exploit the commonalities of these services. Second, these systems are highly dependent on the availability of expert-annotated data from wrist wearables, which is often hard to acquire for wide-ranging health states, and this dependence limits fine-grained detection of these states. Third, compared to other devices, wrist wearables have very limited resources in terms of modalities, processing power, and battery, which makes it difficult to build and run accurate and comprehensive solutions.

In this thesis, we overcome these notable limitations of the state-of-the-art. This thesis promotes the idea of using wearable sensors from the wrist alone for health sensing by intelligently combining several novel machine learning and multimodal sensing techniques, reducing the need for modalities that are either inconvenient for daily life (e.g., chest- or head-based wearables) or privacy-invasive (e.g., video cameras). We build multimodal sensing solutions for wide-ranging healthcare applications that monitor the physical and mental health of the user; these solutions provide accurate, resource-efficient, and fine-grained detection, reduce the burden of expert annotation, and support interactive and integrated healthcare services on a smartwatch. We extensively evaluate our solutions using several datasets and real-life user studies to demonstrate the performance improvements toward making the state-of-the-art more accurate and comprehensive.

PHD (Doctor of Philosophy)
All rights reserved (no additional license for public reuse)
Issued Date: