Moderation and Censorship in Media

July 28, 2022

Introduction

Social media plays a crucial role in our lives. I believe the things we see on the internet can severely affect our perspective of the world. Social media also serves as a platform for people to share their thoughts and emotions, which can range from an individual expressing love to expressing hatred towards someone or something. In cases of hatred or anger, for instance, a person can write hurtful things about someone or, even worse, threaten them through social media with fake images and rumors. Many examples like this show why moderation and censorship are important on social media, but at the same time, moderation and censorship should not take away people's right to speak freely on the internet. Elon Musk (Musk, 2022) said on Twitter that "failing to adhere to free speech principles fundamentally undermines democracy." He further suggested that his move to buy Twitter was an attempt to support free speech online and that he aimed to build a better platform by scaling back some of Twitter's current content moderation. Throughout this essay, I will discuss why moderation is important, my own experience as a moderator, Facebook's Oversight Board, which I see as a notable example of the idea I am trying to convey, and the diverse ways of moderating and censoring content.

Problem

One of the critical questions is who should be given the power to moderate and censor speech on social media: the government or the tech companies? The Oversight Board is a useful example for thinking through this question. Facebook created the Oversight Board to give users an independent body through which they can appeal Facebook's moderation decisions. The board also helps Facebook decide what to delete and what to keep. Upholding Facebook's ban on Donald Trump, who had posted a video addressing the supporters who stormed the Capitol, was one of the board's most notable decisions. This choice, in my opinion, was entirely appropriate: Facebook is a private company, and people should respect that it has its own ways of handling things. Mark Zuckerberg (Zuckerberg, 2022) said, "We believe the risks of allowing the President to continue to use our services during this period are simply too great"; this motivated other big tech companies to take the same step as well.

Thoughts

As we see in the above example, the power of making decisions was not in the hands of an individual but of a group of people. However, millions of posts that violate the guidelines are made not by famous people but by regular users, and these posts still need to be moderated and censored. To handle this volume, companies maintain lists of prohibited words that cannot be used on their platforms, and some are also developing AI to make decisions about these posts. From my perspective, these methods are valid because humans simply cannot moderate this many posts. However, it is our responsibility to ensure that these methods stay within the guidelines and do not take away users' right to express themselves. Discord bots are a good example of this; one of the most famous bots on Discord is MEE6. Although it has many other features, its moderation capabilities are where it really shines. It can time out users who spam, post abusive messages, share suspicious links, or write in all caps, and it notifies moderators when anyone uses harsh language. The bot also has premade commands to help moderators; a rough sketch of how this kind of rule-based filtering works is shown below. Throughout my time as a moderator, this bot helped me do my tasks more effectively and saved me a lot of time.
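To make the automated side of this concrete, here is a minimal sketch in Python of the kind of rule-based filtering such a bot performs. The word list, thresholds, and function name are my own illustrative assumptions, not MEE6's actual implementation or Discord's API.

    # Illustrative, assumed word list; real bots use configurable, much larger lists.
    PROHIBITED_WORDS = {"badword1", "badword2", "scamlink.example"}
    CAPS_RATIO_LIMIT = 0.7   # flag messages that are mostly uppercase
    REPEAT_LIMIT = 3         # flag a user who repeats the same message

    def check_message(text: str, recent_messages: list[str]) -> list[str]:
        """Return the rule violations found in a single message (sketch only)."""
        violations = []

        # 1. Prohibited words or suspicious links
        lowered = text.lower()
        if any(word in lowered for word in PROHIBITED_WORDS):
            violations.append("prohibited word or link")

        # 2. Excessive caps ("shouting")
        letters = [c for c in text if c.isalpha()]
        if letters and sum(c.isupper() for c in letters) / len(letters) > CAPS_RATIO_LIMIT:
            violations.append("excessive caps")

        # 3. Spam: the same message repeated too many times recently
        if recent_messages.count(text) >= REPEAT_LIMIT:
            violations.append("spam / repeated message")

        return violations

    # Example: a bot would delete the message and alert moderators if violations are found.
    print(check_message("BUY NOW AT scamlink.example", ["BUY NOW AT scamlink.example"] * 3))

In a real deployment, a check like this would run on every incoming message inside the platform's event loop, and repeated violations could escalate from deletion to a timeout, which is roughly how bots like MEE6 behave from a moderator's point of view.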

In contrast to the perspective above, I also feel that moderation can affect a platform's profitability. Suppose that, as a moderator, I decide not to take down a post that offends other users on the platform; this will eventually lead to users leaving. Many big tech companies make most of their profit from advertising, and they cannot afford to lose their user base and engagement. Social media moderators are humans, too, and they will surely feel disturbed while moderating vulgar content. In an interview titled "Who Will Guard the (Social) Guardians with Ella Dawson," Ella Dawson said that "There are moments where it does feel like you are the one being insulted or attacked" (Crossfield, 2022). Facebook has reportedly allocated 5% of its revenue to moderation, which is more than Twitter's entire yearly revenue, yet many people are still upset with the company's efforts. I believe this failure to satisfy users is connected to the toll the content takes on the moderators themselves.

Conclusion

Since so many people express themselves very strongly on social media, moderation and restriction are crucial in today's online world. At the level of individual posts, where millions are generated and humans cannot evaluate each one, tools like bots and AI can improve moderation, giving people freedom of speech while still enforcing certain norms and regulations. This would not only make the process quicker but also less stressful for the moderators. Instead of a single person deciding what is wrong or right on the platform, a group of people (like the Oversight Board) should set the rules that everyone on the platform must abide by. I would certainly not be comfortable making these decisions alone, nor, I expect, would you. Critics have written extensively about the internet's risks, both real and imagined, but since it is here to stay, it is time to pay close attention to its positive potential.

References

Musk, E. (2022). [Tweet]. Twitter. Retrieved 27 July 2022, from https://twitter.com/elonmusk/status/1507777261654605828?s=20&t=YLZVnPSzMwq3FcaUpngckg.

Zuckerberg, M. (2022). Quoted in The Independent. Retrieved 27 July 2022, from https://www.independent.co.uk/news/world/americas/us-politics/why-donald-trump-banned-facebook-b1859976.html.

Crossfield, J. (2022). The Hidden Consequences of Moderating Social Media's Dark Side. Content Marketing Institute. Retrieved 28 July 2022, from https://contentmarketinginstitute.com/cco-digital/july-2019/social-media-moderators-stress/.