DARA: Development of a Chatbot Support System for an Anxiety Reduction Digital Intervention; Second-Order Consequences of Differential Privacy

Author:
Schwartz, R.X., School of Engineering and Applied Science, University of Virginia (ORCID: orcid.org/0000-0002-8072-6420)
Advisors:
Barnes, Laura, EN-Eng Sys and Environment, University of Virginia
Earle, Joshua, EN-Engineering and Society, University of Virginia
Abstract:

This portfolio contains a technical report, “DARA: Development of a Chatbot Support System for an Anxiety Reduction Digital Intervention,” and an STS research paper, “Second-Order Consequences of Differential Privacy.” The technical report covers the design and evaluation of a chatbot for mental health support, while the STS research paper examines the wider sociotechnical problems posed by differential privacy, a data-protection technique.

The technical report was developed in response to the high rate of participant attrition from digital mental health applications, particularly those targeting anxiety reduction. For many digital interventions, attrition (leaving the study before an adequate dose of treatment is received) poses a substantial challenge to the intervention’s success. A high rate of attrition has been confirmed within the MindTrails program, a UVA cognitive bias modification intervention that aims to treat anxiety.

Previous research suggests that human coaching (regular check-ins by non-specialist humans who respond to user needs) may reduce attrition in digital health interventions, and coaching was implemented in a recent version of the MindTrails program. However, some participants in that version who did not complete the full intervention reported that interacting with human coaches was itself anxiety-producing and a factor in their quitting the study. To meet the simultaneous goals of reducing attrition and supporting anxious participants, our student team designed a chatbot, “DARA,” to help users when they have questions or technical issues. The chatbot was designed and implemented on the Juji platform. Its final version offers users suggested replies (rather than free-text responses) across three domains: Technical Issues, Usability and Knowledge Issues, and Implementation Issues.
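Juji chatbots are built with the platform’s own designer and scripting tools rather than general-purpose code, but the quick-reply routing described above can be sketched in Python. This is a hypothetical illustration: the example prompts and the route function below are assumptions, not DARA’s actual script.

    # Hypothetical sketch of quick-reply routing across DARA's three domains.
    QUICK_REPLIES = {
        "Technical Issues": ["I can't log in", "A page won't load"],
        "Usability and Knowledge Issues": ["How do I navigate the program?"],
        "Implementation Issues": ["How often should I complete sessions?"],
    }

    def route(domain: str, choice: str) -> str:
        """Return the scripted answer for a suggested (non-free-text) reply."""
        if choice not in QUICK_REPLIES.get(domain, []):
            # The final version had no free-text fallback; unknown inputs
            # would be redirected back to the suggested options.
            return "Please pick one of the suggested options."
        return f"[scripted answer for '{choice}' under '{domain}']"

Constraining users to suggested replies trades flexibility for predictability, a trade-off that reappears in the expert feedback reported below.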

We tested the chatbot with a group of subject-matter experts, who evaluated its usability using a standardized questionnaire and supplemental questions. Participants found the chatbot experience streamlined and enjoyed its design, describing it as friendly and personable. However, they expressed concerns about the chatbot’s ability to answer complex questions and about the inflexibility of its quick-reply options, among other issues. We proposed a number of improvements for future work, such as adding free-text responses and hybridizing human-chatbot interactions.

The STS research paper was developed to better understand the conflicts and concerns that arise when differential privacy is used in society. Differential privacy (DP) is a computational technique that reduces the accuracy of a query on a given dataset, typically by adding calibrated random noise, in order to better protect the privacy of individuals in the dataset. DP has become increasingly useful in recent years because growing computational power to combine and analyze datasets allows adversaries to extract information from datasets that may previously have been considered sufficiently anonymous. Another benefit of DP is its mathematical guarantee of privacy protection: a permanent claim that is unaffected by further increases in computational power or the presence of other datasets.
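Formally, a randomized mechanism M satisfies ε-differential privacy if, for any two datasets D and D′ that differ in one individual’s record and any set of outputs S,

    Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S].

The canonical way to achieve this for numeric queries is the Laplace mechanism; the minimal Python sketch below illustrates that standard construction and is not code from the report.

    import numpy as np

    def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
        """Release a query answer with Laplace noise calibrated for epsilon-DP."""
        # Smaller epsilon means a larger noise scale: stronger privacy, lower accuracy.
        scale = sensitivity / epsilon
        return true_answer + np.random.laplace(loc=0.0, scale=scale)

    # Example: a counting query (sensitivity 1) released at epsilon = 0.5.
    noisy_count = laplace_mechanism(true_answer=1000.0, sensitivity=1.0, epsilon=0.5)

The parameter ε (the “privacy budget,” listed among the keywords below) quantifies the trade-off: smaller values give stronger privacy at the cost of noisier answers.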

The analysis in the STS research paper draws on Bijker’s social construction of technology (SCOT) and Bauer et al.’s concept of second-order consequences. SCOT posits that technological development is not autonomous from social influences but is instead driven by social groups that hold “problems” relevant to a technology. The concept of second-order consequences holds that any technological innovation produces additional social effects beyond the problem it was originally intended to solve; these consequences should be managed and, when possible, anticipated and directed toward positive ends.

The STS analysis identifies four social groups considered essential to DP: DP researchers, DP implementers, dataset analysts, and dataset participants, each with its own objectives and problems with respect to DP. The paper then examines three second-order consequences: the concretization of conflict, centralization, and the obscuring of certain forms of analysis. It closes with recommendations for resolving unintended and negative second-order consequences: diffusing DP education, training, and discussion; using simulations (sketched below); and treating DP as a single solution within a wider risk framework.
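As one illustration of the simulation recommendation, stakeholders choosing a privacy budget could estimate the accuracy cost of each candidate ε before deployment. The sketch below is a hypothetical example, assuming the Laplace mechanism shown earlier; it measures the mean absolute error of a noised counting query at several ε values.

    import numpy as np

    def mean_abs_error(epsilon: float, sensitivity: float = 1.0, trials: int = 10_000) -> float:
        """Average absolute error of a Laplace-noised query at a given epsilon."""
        noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon, size=trials)
        return float(np.mean(np.abs(noise)))

    for eps in (0.1, 0.5, 1.0, 2.0):
        print(f"epsilon = {eps}: mean |error| ≈ {mean_abs_error(eps):.1f}")

In expectation the error equals sensitivity/ε, so halving ε doubles the noise; a table of such results would give the social groups above a concrete basis for negotiating the privacy-accuracy trade-off.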

Degree:
BS (Bachelor of Science)
Keywords:
digital mental health, chatbot, conversational agent, anxiety, differential privacy, social construction of technology, second-order consequences, epsilon
Sponsoring Agency:
National Institute of Mental Health
Notes:

School of Engineering and Applied Science

Bachelor of Science in Systems Engineering

Technical Advisor: Laura Barnes

STS Advisor: Joshua Earle

Technical Team Members: Annabel Lynch, Disha Patel, Aparna Ramanan

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2022/05/14