Detecting and Tracking Polarized Communities in Temporal Networks; Social Media As a Public Good: Finding a Compromise Between Public And Private Interests In Social Media Content Moderation

Author:
Spaeth, Joseph, School of Engineering and Applied Science, University of Virginia
Hott, John, EN-Comp Science Dept, University of Virginia
Neeley, Kathryn, EN-Engineering and Society, University of Virginia

As social media has become increasingly ubiquitous over the last two decades, the user bases of those sites have become increasingly fragmented. This fragmentation allows for the formation of “echo chambers”: insular communities that exert a large amount of sway over their members, such as the communities that formed around Donald Trump’s false claims about the 2020 election. As politics have become more polarized, the echo chamber phenomenon has become more important than ever. My technical work uses graph theory to find and analyze the development of echo chambers over time, particularly in the initial debate over mask-wearing in the United States. Political extremists often operate in these chambers, and the spread of false news in particular is driven by them; the growth of social media has been accompanied by the rise to prominence of false news and hate speech spreading via those platforms. To combat the spread of false news and hate speech, my STS research examines how social media companies ought to handle content moderation, particularly as echo chambers become easier to identify.
The goal of my technical work was to examine a dataset of coronavirus-related tweets to determine whether, and how, echo chambers formed around the pro- and anti-mask communities on Twitter between March and June of 2020. To do this, I created networks relating users to other users, as well as users to the hashtags they tweeted, over time. I then ran community detection algorithms on the resulting temporal networks. This approach allows communities to be detected and examined both at discrete time steps within the dataset and over the entire time series or smaller sections of it. By doing this, I was able to identify specific groups, including groups split along ideological lines, and track their development. Though my results in finding both pro- and anti-mask groups were mixed, my approach was effective. Additionally, my method and the methods of others can be iterated upon, and together they provide a framework by which time-series social network data can be analyzed for community detection.
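The core of this approach, building a network snapshot for each time window and running community detection on each snapshot, can be sketched as follows. This is a minimal illustration rather than the thesis pipeline: the tweet records are invented, the hashtags are hypothetical, and it assumes the networkx library and its modularity-based `greedy_modularity_communities` detector stand in for whichever algorithms were actually used.

```python
# Minimal sketch: per-timestep user–hashtag graphs with community detection.
# Assumes the networkx library; data and hashtags below are invented.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy (timestep, user, hashtag) records standing in for a tweet dataset.
tweets = [
    (0, "alice", "#WearAMask"), (0, "bob", "#WearAMask"),
    (0, "carol", "#NoMasks"),   (0, "dave", "#NoMasks"),
    (1, "alice", "#MasksWork"), (1, "bob", "#MasksWork"),
    (1, "carol", "#NoMasks"),   (1, "dave", "#Freedom"),
    (1, "carol", "#Freedom"),
]

def snapshot_graphs(records):
    """Group records by timestep into bipartite user–hashtag graphs."""
    graphs = {}
    for t, user, tag in records:
        g = graphs.setdefault(t, nx.Graph())
        g.add_edge(user, tag)  # an edge means the user tweeted the hashtag
    return graphs

def communities_over_time(records):
    """Run community detection independently on each snapshot."""
    return {
        t: [set(c) for c in greedy_modularity_communities(g)]
        for t, g in snapshot_graphs(records).items()
    }

comms = communities_over_time(tweets)
for t in sorted(comms):
    print(t, comms[t])
```

Tracking a group's development then reduces to matching communities with overlapping membership across adjacent snapshots; richer temporal methods run detection over the whole series at once rather than slice by slice.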
Although we are capable of detecting echo chambers and other types of communities in social networks, it is also important to understand the role social media companies play in fostering the creation of those communities, as well as in the propagation of false news and hate speech. Currently, moderation of false news and hate speech is handled poorly, and this poor handling is the focus of my STS research. I applied the private and public regulation framework of Emmanuel Mesthene to determine how this regulation should work, as well as who should be making content moderation decisions online. In short, Mesthene claims that as new technologies (in this case, social media) develop and integrate into society, government oversight of those technologies becomes more important. Because of how widely used social media is, as well as its continuing development, it is vital to make decisions that help maintain those technologies without harming society as a whole. Mesthene’s framework demonstrates that government intervention is needed to best regulate these massive corporations, as compromises must be struck between the private interests of the corporations owning social media sites and the public interest of free and safe speech online. This research has demonstrated to me that, as the technical work of myself and others continues to advance, it will only become more important for these websites to develop well-defined content moderation policies.
Stan Lee once opined that “with great power comes great responsibility.” That has been the primary takeaway from the work outlined here. Though my technical work gives us the ability to leverage data about community formation, my STS research has demonstrated the dangers of automation and the care with which we must treat widely used technologies such as Facebook and Twitter. This work, however, is not just my own. It would not have been possible without the help and support of my advisor, Robbie Hott, who has given me the ability to explore these ideas as I saw fit. Additionally, the emotional support and encouragement I received over the last year from my friend and roommate Chris Zaino, my girlfriend Julia Pasco-Anderson, my parents, George and Julie Spaeth, and the University of Virginia through CAPS have been vital to my success.

BS (Bachelor of Science)
graph theory, temporal networks, community detection, social media, content moderation, dynamic networks, twitter, COVID-19

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: John Hott
STS Advisor: Kathryn Neeley

Issued Date: