Refining a Novel AI Restaurant Recommender Application: A Systems Approach to Increasing User Engagement and Retention; Gendered Bias in AI: A Feminist Analysis of Societal Norms in Voice Assistants and a Novel Restaurant Recommender System

Author:
Darbha, Shreya, School of Engineering and Applied Science, University of Virginia
Advisors:
Riggs, Robert, EN-SIE, University of Virginia
Chung, Seokhyun, EN-SIE, University of Virginia
Earle, Joshua, EN-Engineering and Society, University of Virginia
Carrigan, Coleen, EN-Engineering and Society, University of Virginia
Abstract:

Technical Project Abstract
Deciding where to eat is a surprisingly common source of friction: the choice is shaped by personal preferences, social dynamics, and an overwhelming number of options. Our study addresses this issue in partnership with dinemait, a startup whose mobile application uses an artificial intelligence (AI) recommendation model to provide curated restaurant suggestions to its users. Our objective was to enhance user engagement and retention by applying a systems engineering approach that integrates product evaluation with targeted outreach strategies; this work centered on improving dinemait's application so that it can retain and grow its active user base. Our team pursued this in two stages: (1) evaluating the existing application, and (2) improving outreach features and techniques. After an internal review of the application, we conducted a study consisting of focus group discussions and surveys to gather user-centric data on necessary improvements and the application's perceived value. Key usability challenges were identified, including unclear onboarding terminology, limited feature discoverability, and gaps in user trust in AI-generated suggestions. We then researched specific marketing strategies and ideated push notifications to encourage interaction with the application, providing suggestions that dinemait can deploy to gain exposure. Our findings offer dinemait a data-driven foundation for sustainable growth, combining iterative UX enhancements with targeted marketing efforts. More broadly, this paper demonstrates how a holistic, systems-based methodology can support AI-driven applications, especially during the initial stages of a startup.

STS Research Paper Abstract
This paper examines the influence of societal gender norms on AI-driven systems, using Amazon's Alexa and a novel AI restaurant recommender system as case studies. Employing a Feminist STS framework and Erscoi et al.'s Pygmalion lens, it argues that these systems perpetuate gendered biases through biased training data, skewed word embeddings, and algorithmic feedback loops. The analysis reveals how feminine-voiced AI assistants, like Alexa, reinforce subservient roles for women, mirroring historical expectations of female politeness and service. Furthermore, biases entrenched in the word embeddings of Natural Language Processing models, which underlie Large Language Models such as ChatGPT and DeepSeek, can skew AI recommender systems: 'feminine' terms tend to be associated with caregiving roles and 'masculine' terms with authority-based roles, and these associations have the potential to shape restaurant recommendations along gendered lines. Algorithmic confounding and feedback loops exacerbate these biases, personalizing user experiences in ways that reinforce stereotypes and limit user choice. While acknowledging the inherent challenge of fully eliminating bias in a world already rife with it, this paper argues for tackling bias early in AI system development, using the novel AI restaurant recommender system as a model for incorporating the Trustworthy Recommender Systems (TRS) framework, bias quantification models, and mitigation techniques. These strategies aim to enhance explainability, fairness, and transparency, ultimately fostering more equitable outcomes and challenging the perpetuation of harmful gender norms in AI-driven technologies.
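
As a hedged illustration of the kind of embedding bias the abstract describes (not an analysis drawn from the thesis itself), the sketch below projects occupation words onto a simple "gender direction" built from feminine/masculine word pairs. The tiny embedding dictionary and word lists are hypothetical placeholders; an actual study would load pretrained vectors such as word2vec or GloVe.

# Minimal sketch, assuming toy 3-d embeddings purely for demonstration.
# A strongly positive or negative projection suggests a gendered association.
import numpy as np

def gender_direction(emb, pairs):
    # Average the difference vectors of (feminine, masculine) word pairs.
    diffs = [emb[f] - emb[m] for f, m in pairs]
    d = np.mean(diffs, axis=0)
    return d / np.linalg.norm(d)

def bias_score(emb, word, direction):
    # Cosine-style projection of a word vector onto the gender direction.
    v = emb[word]
    return float(np.dot(v / np.linalg.norm(v), direction))

# Hypothetical embeddings; real analyses would use pretrained vectors instead.
emb = {
    "she": np.array([0.9, 0.1, 0.0]),
    "he": np.array([0.1, 0.9, 0.0]),
    "nurse": np.array([0.8, 0.2, 0.1]),
    "boss": np.array([0.2, 0.8, 0.1]),
}
d = gender_direction(emb, [("she", "he")])
for w in ("nurse", "boss"):
    print(w, round(bias_score(emb, w, d), 3))

On real pretrained embeddings, the same projection is one of the standard ways such gendered associations between role words and gendered terms have been quantified.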

Synthesis of Technical and STS Research Paper
The technical and STS projects outlined above are deeply intertwined, addressing both the creation and the societal impact of AI-driven recommender systems. The technical project focused on refining dinemait's user experience through a foundational systems engineering methodology (improving usability, engagement, and retention), but it quickly became evident that technical solutions alone could not account for the social dynamics embedded within AI design, particularly for a system that recommends restaurants, a domain laden with social and cultural implications. The STS research complemented this by critically examining how societal gender norms influence AI behavior, highlighting risks such as algorithmic bias and the reinforcement of stereotypes. Insights from usability testing and user feedback in the technical project revealed patterns that aligned with concerns explored in the STS analysis, namely how personalization features could inadvertently reflect gendered assumptions. For example, recommendation outputs or interface language risked reinforcing normative expectations around dining preferences. This overlap emphasized the need for continual ethical awareness in technical development, particularly regarding how data, design choices, and feedback loops shape user interactions in the ever-growing AI space. The STS framework provided a feminist lens for questioning gendered biases, encouraging proactive strategies such as transparency, fairness metrics, and bias detection within the technical workflow. In turn, the technical project grounded the STS research in practical application, offering real-world scenarios where social considerations directly informed design decisions. Taken together, these projects illustrate that effective AI system development requires more than technical efficiency: it demands a holistic approach that integrates ethical, social, and cultural awareness to ensure responsible technology.
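
To make "fairness metrics and bias detection within the technical workflow" more concrete, the following sketch (an assumption-laden illustration, not part of the delivered dinemait system) compares how often a cuisine category is surfaced to two hypothetical user groups, a simple exposure-parity style check; the log entries and group labels are placeholders.

# Illustrative sketch only: a crude exposure-parity check on recommender output.
# The recommendation log and group labels below are hypothetical placeholders.
from collections import Counter

def exposure_rates(recommendations):
    # recommendations: list of (user_group, cuisine) tuples from a recommender log.
    counts = Counter(recommendations)
    totals = Counter(group for group, _ in recommendations)
    return {
        (group, cuisine): counts[(group, cuisine)] / totals[group]
        for group, cuisine in counts
    }

log = [
    ("group_a", "steakhouse"), ("group_a", "salad bar"), ("group_a", "steakhouse"),
    ("group_b", "salad bar"), ("group_b", "salad bar"), ("group_b", "steakhouse"),
]
rates = exposure_rates(log)
gap = abs(rates.get(("group_a", "steakhouse"), 0) - rates.get(("group_b", "steakhouse"), 0))
print(rates)
print("steakhouse exposure gap:", round(gap, 2))  # a large gap flags a potential skew

A check of this shape could run over logged recommendations as part of routine evaluation, flagging categories whose exposure diverges sharply across user groups for closer human review.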

Degree:
BS (Bachelor of Science)
Keywords:
Artificial Intelligence, Systems Engineering, Recommender System, Feminist STS Framework, Hughes Award 2025 Finalist
Notes:

School of Engineering and Applied Science

Bachelor of Science in Systems and Information Engineering

Technical Advisor: Robert Riggs, Seokhyun Chung

STS Advisor: Joshua Earle, Coleen Carrigan

Technical Team Members: Kat Fong, Kirsten Fung, Elizabeth Hunter

Language:
English
Issued Date:
2025/05/02