Shared Data Models: Synchronizing Schemas Across Codebases; Dead Internet Theory: How Online Interactions are Becoming More Robotic

Author:
Soh, Chiao, School of Engineering and Applied Science, University of Virginia
Advisors:
Morrison, Briana, EN-Comp Science Dept, University of Virginia
Earle, Joshua, University of Virginia
Abstract:

Technical Project Abstract
After years of development, a fleet management solutions company wanted to connect its onboarding system to its main fleet management software so that new users could easily upload their data. While implementing an API for this purpose, I discovered data model inconsistencies between the two codebases.
To address this, I explored several solutions: making no changes and tolerating the slow transfer speed; extracting the model code into a shared repository and including it via Git submodules or a Python package; and extracting the model code into a standalone microservice. While I was unable to implement a fix within my internship timeline, I proposed that direct code inclusion would best resolve the schema inconsistencies, improve data transfer speeds, and minimize unnecessary complexity.
In re-exploring this issue, I found value in another solution: merging both codebases into a single monolith, given that the onboarding system is intrinsically coupled to the main fleet management system and lacks sufficient justification to exist as a standalone system. This approach would significantly improve maintainability and save developer time in the long run. Implementing it would require ensuring that the main fleet management software remains functionally unchanged while the onboarding system is adapted to the standardized schema.
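The shared-model idea proposed above can be sketched briefly. This is a minimal illustration, not the company's actual code: the class name, field names, and validation rules are all hypothetical, standing in for whatever canonical schema the two systems would agree on. The point is that both codebases import the same definition from one package, so a schema change happens in exactly one place.

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class Vehicle:
    """Hypothetical canonical vehicle record, shared by both systems."""
    vin: str
    license_plate: str
    odometer_km: int

    def __post_init__(self):
        # Validation lives with the model, so both codebases
        # enforce identical constraints on incoming data.
        if len(self.vin) != 17:
            raise ValueError("VIN must be 17 characters")
        if self.odometer_km < 0:
            raise ValueError("odometer cannot be negative")

# Either system can serialize a record for transfer over the API,
# confident the other side deserializes against the same schema.
record = Vehicle(vin="1HGCM82633A004352",
                 license_plate="ABC-1234",
                 odometer_km=42000)
print(asdict(record))
```

Whether this module is vendored via a Git submodule or published as an internal Python package, the effect is the same: the schema inconsistency disappears because there is only one definition to drift from.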

STS Project Abstract
I examine the growing role of bots and AI-generated content within social media platforms, using Actor Network Theory to analyze both individual motivations and corporate policies.
I begin by contextualizing the issue in the rapidly evolving digital landscape, where automation and artificial intelligence intersect with human interaction. I highlight the real and escalating problem of bots, showing how their proliferation disrupts communities and alters the dynamics of online communication. Bots, once seen as a niche phenomenon, are now integral to online interactions, with their presence having significant consequences for user trust and engagement.
I explore the motivations driving bot creators, specifically trolling and harassment, politics, pornographic advertising, content creation and art, and scams, all of which exploit platform algorithms designed to maximize engagement. In response, some companies, like TikTok and YouTube, have implemented policies to disclose AI-generated content and limit bot activity. However, platforms such as X (formerly Twitter) have created environments that inadvertently incentivize bot activity through features like paid verification. Together, these forces create an ecosystem in which bots are effectively welcome.
Ultimately, there is an inherent tension between platform profit motives and ethical concerns. While regulations and policy changes can mitigate some of the damage, the deep integration of bots into platform ecosystems makes a complete solution unlikely. Instead, a careful balance of transparency, stricter identity verification, and a reevaluation of corporate incentives is needed to foster a healthier, more authentic digital environment.

Similarity
Although the topics of the two papers, one exploring bots and AI-generated content on social media platforms and the other addressing technical challenges in fleet management system integration, may appear unrelated, they share underlying themes of design, optimization, and the balancing of competing priorities. Both papers explore the tension between complexity and simplicity in system development, whether in the behavior of automated bots on social platforms or in the technical challenge of aligning two disparate software systems.
In the context of bots, the issue lies in the unintended consequences of platform algorithms and policies, where engagement optimization often leads to the spread of inauthentic, automated content. Similarly, in the fleet management system, optimizing for efficiency and speed requires aligning two inconsistent data models, an issue that could introduce complexity and slow the entire system down. Both scenarios involve balancing the need for efficient, effective outcomes against the risks that complicated systems create: the erosion of user trust and platform integrity caused by bot-driven engagement on one hand, and the difficulty of ensuring seamless data transfer between systems on the other.
Additionally, both papers reflect on the role of oversight and control in system implementation. In the case of bots, social media companies must regulate bot activity and prioritize authenticity, while in the fleet management system, the need for a clear solution (such as merging codebases) reflects the importance of direct, hands-on intervention to prevent further complications and technical debt. Ultimately, both situations underscore the challenges of integrating complex systems, whether social or technical, and the importance of strategic decision-making to ensure long-term sustainability and functionality.

Degree:
BS (Bachelor of Science)
Keywords:
Schema, Codebases, Bots, Dead Internet Theory
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: Briana Morrison
STS Advisor: Joshua Earle

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2025/05/03