Media Perceptions of Big Tech: A Political Shift; Demographic Data Reconciliation

Author:
Bala, Shruti, School of Engineering and Applied Science, University of Virginia
Advisors:
Morrison, Briana, EN-Comp Science Dept, University of Virginia
Forelle, MC, EN-Engineering and Society, University of Virginia
Abstract:

While working on my STS research paper and technical capstone project, I had the opportunity to engage with two very different aspects of computer science. My STS research paper examined mainstream media's view of big tech companies, while my technical capstone involved building a data remediation pipeline. While these two topics are not directly related, they both concern data ethics and technology's role in protecting consumers. Many of the news articles analyzed in my STS research paper dealt with the implications of insecure data practices at big tech companies and called out these companies for failing to protect their users. The main themes from this analysis were public trust, transparency, and technological accountability, all of which were also apparent in my technical capstone project. Similarly, my technical project addressed concerns of customer privacy and improved standardization, compliance, and data privacy by creating an automated data reconciliation process. Furthermore, cybersecurity experts attribute a 78% rise in cyberattacks between 2022 and 2023 to risk assessments based on outdated data (Ackerman, 2024). With the rise of digital information, institutions must rely on accurate, up-to-date data to protect against these vulnerabilities. This need for updated data directly reflects the media's coverage of the shortcomings of big tech, which I was able to address in my technical project. Working on both projects allowed me to approach each one with a more nuanced understanding both of the broader implications of data governance and of how external narratives shape public expectations for ethical oversight of companies.
My technical capstone addressed the challenge of inconsistent and undocumented customer data corrections. Online institutions manage vast amounts of sensitive customer data, where inaccuracies can lead to security vulnerabilities and noncompliance with regulatory requirements. To address inefficiencies in the data correction process, I worked with a team to develop an automated data remediation pipeline. Before this project, updates to user records were handled manually and inconsistently, introducing compliance risks and inefficiencies. In response, my team of interns and I designed a fully automated pipeline that standardizes the correction of demographic data, built primarily with AWS services and an Apache Spark job. This solution improved workflow efficiency, traceability, and human oversight in my team's data remediation process. Additionally, its robust documentation supports compliance with regulatory standards while fostering transparency across all operations. Future work includes expanding the pipeline's capabilities to handle more complex data correction scenarios, such as correcting multiple attributes at once or supporting additional customer attributes.
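The abstract itself contains no code, but the core idea of the remediation step it describes, applying validated corrections to customer records while logging every change for traceability and human review, can be sketched in plain Python. This is an illustrative stand-in only: the field names, the `Correction` type, and the validation rules are hypothetical, and the actual pipeline runs on AWS services with an Apache Spark job rather than in-memory dictionaries.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical correction request; the real pipeline applies these
# to customer records via an Apache Spark job on AWS.
@dataclass
class Correction:
    record_id: str
    attribute: str      # e.g. "country" or "date_of_birth"
    new_value: str
    requested_by: str   # who authorized the fix (human oversight)

def apply_corrections(records: dict, corrections: list) -> list:
    """Apply validated corrections in place and return an audit trail.

    Each change is logged with the old value, new value, requester,
    and a timestamp, so every correction is traceable and reviewable.
    """
    audit_log = []
    for c in corrections:
        record = records.get(c.record_id)
        if record is None or c.attribute not in record:
            # Reject unknown records or attributes rather than
            # silently skipping them, to keep the audit trail honest.
            raise ValueError(f"invalid correction: {c}")
        old_value = record[c.attribute]
        record[c.attribute] = c.new_value
        audit_log.append({
            "record_id": c.record_id,
            "attribute": c.attribute,
            "old": old_value,
            "new": c.new_value,
            "requested_by": c.requested_by,
            "applied_at": datetime.now(timezone.utc).isoformat(),
        })
    return audit_log

# Example: correct one customer's recorded country.
records = {"cust-001": {"country": "US", "date_of_birth": "1990-01-01"}}
log = apply_corrections(
    records, [Correction("cust-001", "country", "CA", "analyst@example.com")]
)
print(records["cust-001"]["country"])  # -> CA
```

In the production setting, the audit log would be persisted (for example to an S3 bucket or a database table) so that compliance reviewers can trace every demographic change back to its authorization.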
In contrast, my STS research paper analyzed how media narratives around big tech have shifted from admiration to skepticism. I used critical discourse analysis to examine 13 articles published between 2015 and 2025 by outlets such as The New York Times, CNN, and The Guardian. I showed how early reporting embraced cyberlibertarian ideals, praising innovation and self-regulation, while more recent narratives have focused on privacy violations, the rise of generative AI, and the entanglement of big tech with political and global power. Through the framework of David Golumbia's critique of cyberlibertarianism, I argued that this shift in discourse represents the media's distrust of self-regulation and the need for more robust accountability structures (Golumbia, 2024). This framework also allowed me to show how the media has revealed that cyberlibertarian reliance on minimal government intervention leads to corporate prioritization of profit over ethics. The media's increasing conflation of big tech with governmental influence, such as reports on lobbying efforts, political donations, and international data controversies, highlights the flaws in this ideal. The paper ultimately showed that media narratives are not only reactive to big tech's actions, but also actively shape regulatory priorities and societal values.
While I did not work on these two projects at the same time, completing my technical capstone first, and gaining experience in a corporate setting, enriched my understanding of my STS research paper. Being part of a team whose main purpose is to protect and manage sensitive customer data made me more aware of the challenges companies face when implementing data governance and privacy. This hands-on experience helped me engage critically with media portrayals of big tech as a reflection of the tensions between innovation, AI, and regulation that appeared in many of the articles I analyzed. It also helped me recognize the struggle to balance innovation with customer consent and privacy in a corporate setting, a struggle increasingly reported by mainstream media in the U.S.

Degree:
BS (Bachelor of Science)
Keywords:
Computer Science, Data Remediation, Cloud Computing, AWS, Mainstream Media, Big Tech
Notes:

School of Engineering and Applied Science
Bachelor of Science in Computer Science
Technical Advisor: Briana Morrison
STS Advisor: MC Forelle

Language:
English
Issued Date:
2025/05/08