
Social Media & Privacy: Why Should We Care?

By Anika Sukthankar, VI Form


Editor’s Note: This project was made possible with the support of the Thomas H. Kean ’53 Fellowship. At their 25th Reunion in 1987, the Class of 1962 established the Thomas H. Kean ’53 Fellowship Program to honor Tom Kean, their teacher, advisor, mentor, and friend. The purpose of this fellowship is to enable students to explore important public policy topics and to embark upon exemplary lives of public service in the spirit of Governor Kean. 

Student-Submitted Note: As part of the Kean Fellowship, I took a college-level course called STS 1101: Science, Technology, and Politics. We studied several scientific controversies to explore the relationship between technology and politics and to understand its societal implications. My deliverable was to write a LEO article on my findings and research.

“Behavioral advertising generates profits by turning users into products, their activity into assets, their communities into targets, and social media platforms into weapons of mass manipulation.”

-Rohit Chopra in his 2019 dissent against Facebook

As technology evolves and becomes an integral part of our society, the controversies surrounding its proper use and the policies that govern it have become increasingly complicated. We are building complex socio-technical systems that seem to guide our very behaviors and thinking. From the addictive nature of social media to privacy concerns, government policy seems to lag behind technological advancement. Events such as the Capitol Hill hearings have made this topic one of great interest.

Social media has become incredibly popular in recent years, with over 400 million new users joining these platforms annually. Despite this popularity, a majority of users are uncomfortable with the collection of their personal data and believe that the government needs to do more to regulate tech companies. Rebuilding trust between users and social media companies will take a triumvirate of public awareness, self-regulation by the companies themselves, and government regulation.

Social platforms use algorithms to predict the content that will best appeal to each user. The more engrossing or outrageous a post is, the more views it gets, maximizing the platform’s ad revenue and profit. Most users, wittingly or unwittingly, provide data simply by clicking on websites and their content. Even the amount of time one lingers on a page, called ‘hover time,’ is measured and recorded for each user. One of the industry’s measures of performance, the monetizable Daily Active User (mDAU), is used to gauge the attractiveness of a social media platform: the higher a platform’s mDAU count, the greater its draw for advertisers, and therefore the higher its ad revenue. Most tech companies are built on the premise of extensive data collection, processing, and exploitation for commercial use, generally in the form of sharing personal data and preferences with marketing firms to create “micro-targeted” ads. Many users place blind trust in these platforms, sharing aspects of their personal lives and, through their clicks and “hovers,” their preferences.

While companies do disclose their data-sharing policies, the disclosures are embedded in multi-page legal documents that can only be described as incomprehensible to the average user. This lack of awareness about the data collection and usage practices of tech firms is what causes users to put their privacy at risk. Organizations such as the Electronic Privacy Information Center (EPIC) are trying to change that by raising public awareness and lobbying the US Congress to enact privacy laws like Europe’s General Data Protection Regulation (GDPR).

Over the past few years, there has been much discussion of the need to moderate user-generated content on social media. Many companies have taken action to suspend content they deem hateful, inappropriate, or violence-inciting. These private companies lawfully control the content on their platforms, which has created much debate about whether they obstruct users’ freedom of expression and speech. The suspension of former President Trump’s social media accounts further fueled this debate, with world leaders like Angela Merkel questioning whether social media companies should be able to restrict freedom of expression by “de-platforming” individuals and groups.

Social media outlets differ inherently from traditional media outlets in that they are shielded by Section 230 of the Communications Decency Act, which limits companies’ liability for user-generated content on their websites. The sheer volume of content disseminated on social media (an estimated four billion videos viewed daily on Facebook alone) far outweighs the “curated and controlled” content shared via traditional media. In response to public pressure, social media companies have created mechanisms for self-governance. Meta, the parent of Facebook, has published community standards and a transparency report that highlights its enforcement actions. These initial steps toward self-regulation are both necessary and positive.

The US has several federal and state-level statutes that govern data collection, privacy, and protection, all aimed at protecting consumers. However, unlike Europe, which has implemented the General Data Protection Regulation (GDPR), the US is still working to advance the American Data Privacy and Protection Act (ADPPA), which would provide similar protections for consumers. Several other laws offer specific protections: the Children’s Online Privacy Protection Act (COPPA), which governs the collection of information about minors; the Health Insurance Portability and Accountability Act (HIPAA), which governs the collection of health information; the Gramm-Leach-Bliley Act (GLBA), which governs personal information collected by banks and financial institutions; and the Fair Credit Reporting Act (FCRA), which regulates the collection and use of credit information. The Federal Trade Commission (FTC) is responsible for enforcing these laws to ensure consumer protection around online data collection, security, privacy, and sharing.

The public in general is now more curious than ever about securing its data. Many have attempted to distance themselves from social media and are looking more closely at the moral responsibilities that social media platforms claim to uphold. But until effective privacy laws come into practice, the protection of digital privacy will rely on educated, aware consumers and on self-regulation by social media companies to keep each user safe. As a society, we can at least be careful with the personal information we share online and report inappropriate content to the authorities when needed.

Anika Sukthankar is a VI form day student from Shrewsbury, MA. Anika enjoys studying math, Latin, physics, English, and ethics. She hopes to major in science and technology related fields in college.
