Good morning, Cornellians. This is my final post for the writing seminar summer semester. To Dr. Jason Luther, thank you.
“21-year-old Brandon Andrew Clark posted a series of graphic images on Sunday of the slain corpse of 17-year-old Bianca Devins to Instagram and Discord, users immediately began spreading the gory pictures online, often alongside brutal, misogynist commentary… The Instagram post of Devins’ body was left on a platform shared by 1 billion monthly active users, for what was reportedly most of Sunday.” (Merchant). This case, in which images of a murder were left up on Instagram, a company owned by Facebook, shows exactly the flaws of Facebook and Instagram’s moderation system. Facebook has one of the highest user counts of any social media platform. As such, its content moderation team features a staggering 15,000 reviewers (Newton). How, then, was this post kept up for almost an entire day? Unfortunately, posts staying up for long periods is nothing new for the company, and this, combined with recent exposure by multiple news sources of its treatment of content moderators, has ultimately put pressure on the company to find a way to combat the criticism. Although researchers and journalists generally agree that Facebook is trying to fix its content moderation system, I suggest that Facebook is struggling to address the real issues with content moderation by focusing on and publicizing an inefficient and ineffective method of review: a system they call their ‘supreme court’.
Facebook is a public social media platform under the direction of CEO Mark Zuckerberg. With its large force of content moderators, most content that is inappropriate for the platform is removed almost immediately. Recently, however, there has been skepticism about Facebook’s methods for moderation. For the longest time, content moderators have been largely unknown to the public. In “The Trauma Floor,” an article for The Verge, Casey Newton fully explores the lives of content moderators at Facebook. He opens the article with the real-life example of a content moderator in training at Facebook. This moderator, after finishing her last task in training, leaves the room in despair: “No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office” (Newton). The moderator is just one example of the psychological toll the job entails. Interviewing more moderators, Newton describes “a workplace that is perpetually teetering on the brink of chaos… an environment where workers cope by telling dark jokes about committing suicide… a place where employees can be fired for making just a few errors a week” (Newton). The small branch interviewed in Phoenix is representative of Facebook’s content moderation team as a whole; these conditions are seen all around the world. Recent reporting has suggested that some moderators now deal with PTSD (Chen). When the public presented complaints about this system, Facebook responded, “We know this work can be difficult and we work closely with our partners to support the people that review content for Facebook. Their well-being is incredibly important to us. We take any reports that suggest our high standards are not being met seriously and are working with Cognizant to understand and address concerns” (Sullivan). It seems as though a new system will be implemented by Facebook to combat these issues.
One new system Facebook has been discussing is a ‘supreme court’: a board of 40 members who have the final say on whether a post is deleted (Newton). It works much like the United States judicial system in the sense that ordinary moderators appeal to this ‘supreme court’ to ratify decisions that are particularly tough to make alone. However, this supreme court has faced a lot of criticism from multiple sources. In an article for Bloomberg, “Zuckerberg’s Facebook ‘Supreme Court’ Can’t Handle All the Disputes,” Shira Ovide explores exactly why the supreme court will not work. Addressing the idea of the supreme court, she states, “It’s a worthy step but also a 1% solution for an unimaginably vast problem” (Ovide). Her major concern is the one most frequently voiced by the public: how can a 40-person board tackle the millions of posts Facebook hosts? Facebook itself states that “the oversight board would be helpful for ‘dozens’ of cases every year in which there is debate within the company on the right approach for a post or video” (Ovide). A chart Ovide includes from Facebook’s own Transparency Report for January through March 2019 shows the millions of posts that human moderators must review; the ‘violence/graphic content’ category alone accounts for more than 30 million posts in just three months (Ovide). She states, “The Supreme Court cannot possibly scale to 2.7 billion people who use Facebook’s internet hangouts” (Ovide). Her statements are factual, and they identify the biggest problem with Facebook’s supreme court: although the court is designed for only dozens of cases, the public perceives it as covering far more. The team of 40 should be expanded to cover more posts. The supreme court is still in testing, and Facebook states that it is looking into the user feedback it has gathered since its initial news release back in 2018. Although the development of this supreme court does show that Facebook is trying to fix its moderation system, the common view among reporters and journalists is that the ‘supreme court’ is inefficient and ineffective, and that by keeping the board at its current size Facebook is providing a false sense of hope to the public. Overall, without addressing this issue, Facebook’s supreme court will be largely ineffective at curbing the larger problems that have been documented regarding content moderation.
Facebook, with one of the highest user counts of any platform, has a large content moderation team of around 15,000. Under recent exposure of the conditions facing that team, the company announced the development of a supreme court: a 40-member board given certain posts on which to ratify a final decision. This is a step toward fixing one small aspect of its moderation system, the high-profile cases that may affect its public image. However, despite these efforts, a team of 40 designed to ratify only a handful of cases out of more than 30 million violent or graphic posts is ineffective. Thus, the public should look beyond Facebook’s supreme court and put real pressure on the company to continue finding ways to fix its larger problem: its content moderation team.