Facebook algorithms and user polarization; Exploring methods to address COVID-19 misinformation on Facebook

Author:
White, Zachary, School of Engineering and Applied Science, University of Virginia
Advisors:
Baritaud, Catherine, EN-Engineering and Society, University of Virginia
Pettit, Raymond, EN-Comp Science Dept, University of Virginia
Sun, Yixin, EN-Comp Science Dept, University of Virginia
Abstract:

Facebook News Feed algorithms have become a gateway to much of the information seen on the platform. Because these algorithms specialize in content personalization, they have been cited for enabling and reinforcing an increasingly polarized online environment and for contributing to the spread of harmful misinformation. Both parts of this project discuss problems surrounding algorithms and informational integrity. The technical report examines the algorithm as an entity, how it interacts with users, and how it comes to propagate biased tendencies; it does so by exploring existing research on algorithms and on constraining bias. The STS research paper analyzes how News Feed algorithms specifically contributed to the widespread presence of misinformation around the COVID-19 pandemic, and further methods to mitigate misinformation online are discussed in relation to this topic.

The technical research report analyzes two studies of Facebook and its algorithms. The first study interviews Facebook users in Myanmar in 2016, a country that saw unprecedented surges of Facebook use for information access. The author observes that users engage as much as or more often with unfamiliar contacts online to access news and information content in their News Feeds. Actions like these cause those information connections to be prioritized by the algorithm, showing how the symbolic meaning and interpretation of Facebook as freedom of information altered its use and effects in this environment.
The second study demonstrates the effects of constraining personalization algorithms by implementing a balanced-opinion News page built on explicit limits on bias. It shows how constraining a greedy selection algorithm at every step can diversify the set of opinions a user sees online, reducing the bias that would otherwise result from extreme convergence on a single viewpoint. Researching how users interact with and understand algorithms is a necessary supplement to directly changing the outputs the algorithm selects for its personalized worlds.
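To make the idea of a bias-constrained greedy selection concrete, the sketch below is a minimal illustration, not the study's actual implementation: items are picked greedily by personalization score, but a pick is skipped whenever it would push the feed's average opinion score outside a neutral band. The Item fields, the balanced_feed function, and the max_tilt threshold are all hypothetical assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    relevance: float  # personalization score; higher = more relevant to this user
    stance: float     # opinion polarity in [-1.0, 1.0]; 0 is neutral

def balanced_feed(candidates, k, max_tilt=0.3):
    """Greedily pick up to k items by relevance, rejecting any pick that
    would push the feed's mean stance outside [-max_tilt, +max_tilt].
    A hypothetical sketch of constraining a greedy algorithm at each step."""
    chosen = []
    for item in sorted(candidates, key=lambda i: i.relevance, reverse=True):
        if len(chosen) == k:
            break
        stances = [c.stance for c in chosen] + [item.stance]
        if abs(sum(stances) / len(stances)) <= max_tilt:
            chosen.append(item)
    return chosen
```

In this sketch the relevance ranking still drives selection, but the running-average check acts as the per-step constraint described above; tightening or loosening max_tilt trades personalization against opinion diversity.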

The STS research paper seeks to answer the research question: how can Facebook better create an online environment free of misinformation around the COVID-19 pandemic? Facebook can do this by diversifying networks, actively seeking truth in journalism, and integrating more effective user participation in fact-checking. Studies have shown that manifesting social norms online is a necessary supplement to third-party fact-checking, and this is accomplished more easily when accurate information appears coherently in the same place a user sees personalized content.
These two reports analyze crucial links in the chain of Facebook news content: the technical report shows how user interactions and algorithm constraints are vital components needing consistent attention and research, and the STS paper explores how coronavirus misinformation can be mitigated through healthier online environments.

Degree:
BS (Bachelor of Science)
Keywords:
Facebook algorithms, Actor-network theory, Coronavirus misinformation, Polarization, Fact checking
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: Raymond Pettit
STS Advisor: Catherine Baritaud

Language:
English
Rights:
All rights reserved (no additional license for public reuse)
Issued Date:
2021/05/14