Smart Charlottesville: Designing the Future; The Effects of Facebook's Response to Content Moderators

Ayers, Cory, School of Engineering and Applied Science, University of Virginia
Wayland, Kent, EN-Engineering and Society, University of Virginia
Ibrahim, Ahmed, EN-Comp Science Dept, University of Virginia

How can websites promote positive online interactions while protecting users from explicit content? Social media and the internet have greatly changed the way we communicate in the 21st century. People can now hide behind screen names and talk to others with little to no consequence. Over time, our online interactions have uncovered many new problems, including cyberbullying, exposure to graphic images and videos, and the spread of hate speech. These interactions can have long-lasting effects on the people who witness them, and the problem deserves more attention. Every year, millions of posts are removed from online platforms for graphic or inappropriate content. This does not mean that online interactions cannot benefit people and their communities. When fostered correctly, the internet can bring people together and promote positive experiences for users. Online collaboration allows companies to do business with one another across the world, among other feats that were impossible before computers. If hateful content remains on sites, though, users will begin to look elsewhere to spend their time. For this reason, new sites are carefully designed and monitored to promote healthy interactions. I will be examining the effects of harmful posts online and why combating them is an extremely important issue in protecting not only a site's users, but also the people who moderate this negative web traffic.
Technical online platforms have become an increasingly popular way to engage residents of a city with local government, and the University of Virginia plays a vital role in this effort due to its technical expertise. My team's technical project involved creating a platform that promotes positive interaction and communication among residents, government, and members of the university community. We built a web application with Django, a Python web framework, incorporated the Google Maps APIs, and hosted it on Amazon Web Services. Our team gathered feedback from community members on what an ideal site would look like and, with the help of our clients, created a platform where members can share projects, report problems, communicate, and exchange valuable resources with one another. The goal of our project is to connect as much of the greater Charlottesville area as possible to promote positive change, which we believe will happen as community members continue to use the site.
The STS research paper examines Facebook content moderators and recent reports about how they are treated in their current work environments. These workers are paid less than $30,000 a year and are tasked with moderating the worst and most graphic content the web has to offer. Hundreds of employees have reported developing post-traumatic stress disorder from this job, while others lie awake at night with guns underneath their pillows, fearful of the entire world. The overall goal of my research was to show the importance of these employees and to narrow the knowledge gap between moderators and the public, drawing on testimonials, books on the subject, and ongoing lawsuits against Facebook. Although my research ultimately suggests that Facebook is doing all it can to hide the truth about these workers, I believe it is a valuable step toward presenting the situation as a whole and helping people better understand what is truly going on.
Overall, I was very pleased with the value of my work on these two projects this year. At the beginning, I set out to create a clean, usable platform for the Charlottesville community, and it was completed on time and with praise from our client. We were able to implement the majority of his requests while also gathering feedback from students and community members to improve the site's usability. On the STS research side, I was slightly disappointed with Facebook's response to the content moderation debate, but I believe it makes my work all the more necessary in helping people gain reliable information on this sensitive subject. For anyone picking up where I left off, I recommend looking deeper into the lawsuits that are still ongoing. Facebook Ireland is being sued by workers of a contracting company called CPL, a case still awaiting a result, and a senior Facebook employee has sued the company over its content moderation efforts. These cases will likely shape Facebook's continued response and could change the narrative going forward.

BS (Bachelor of Science)
moderation, community, interaction

School of Engineering and Applied Sciences
Bachelor of Science in Computer Science
Technical Advisor: Dr. Ahmed Ibrahim
STS Advisor: Kent Wayland
Technical Team Members: Sanjana Hajela, Jared Tufts, Luke Deni, Anthony Lancaster, Conner Hutson, Kajal Sheth

All rights reserved (no additional license for public reuse)
Issued Date: