In the most recent issue of The New Yorker, Andrew Marantz, author of Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation, published an article that holds Facebook’s hypocrisies up to the light yet again. While Netflix’s documentary The Social Dilemma brought the dark sides of Facebook and similar social media networks to the forefront of our minds, Marantz continues this theme of holding billion-dollar social media corporations to account.
The particular theme Marantz focuses on throughout his piece is hate speech and misinformation. Through interviews with various people who have worked, or are still working, for Facebook, Marantz uncovers just how hypocritical and flippant Facebook’s Implementation Standards are. These are the key things about Facebook that we learn from the interviews:
1. Moderators have a much harder job than you might think
To enforce its Implementation Standards, Facebook hires roughly 15,000 content moderators across the globe. Their job is to flag or delete any content that conflicts with the Implementation Standards. On top of working long hours, often at strange times to cover time-zone differences, these moderators spend all day, every day viewing some of the most disturbing footage imaginable: violence, child pornography, terrorist threats and so on.
Martin Holzmeister, a Brazilian art director who used to work as a moderator in Barcelona, told Marantz, “You’re sleep-deprived, your subconscious is completely open, and you’re pouring in the most psychologically radioactive content you can imagine”.
It is no surprise, then, that many moderators walk away from their jobs at Facebook with PTSD. In May 2020, thousands of moderators came together in a class-action lawsuit against Facebook over these severe mental-health impacts.
2. The Implementation Standards don’t actually guide what is taken down
You’d think that if you were working such crazy hours and enduring such horrific content every day, then maybe you could justify it all with the idea that you were contributing to the greater good; that by putting yourself through it, millions of other people wouldn’t have to endure the same thing. Unfortunately, this doesn’t seem to be the case.
What shines through in Marantz’s interviews is that Facebook does not take content down simply because it goes against the Implementation Standards; what matters is whether the content is generating bad press for the site. Many of the moderators Marantz spoke to said that they will often see something they deem against the guidelines, such as a comment or post with racist undertones, but unless there is specific negative press about it, no reason is found to take it down.
For example, Britain First, an alt-right group that spouts nationalist, anti-Muslim hatred, was not taken down from Facebook for some time. A recruiter in Facebook’s London office wrote for the Guardian, declaring Britain First “pretty much a hate group” and noting that “today YouTube and Twitter banned them”. So why wasn’t Facebook doing the same?
In response, Facebook stated that it did not consider Britain First a hate organisation, because “we define hate organisations as those that advance hatred as one of their primary objectives, or that have leaders who have been convicted of hate-related offences”. Under this definition, Facebook decided there was no need to take Britain First’s page down.
However, a few weeks later, Darren Osborne, a man who had been radicalised in part by Britain First’s online content, drove a van into a crowd close to a London mosque, killing one Muslim man and injuring nine other people. During the trial, prosecutors repeatedly drew on evidence that Osborne’s desire to kill had been fuelled by Britain First’s influential presence on social media. Six weeks later, Britain First was removed from Facebook.
But why does it take horrific murders for Facebook to call out and silence this sort of rhetoric? If people on the inside were red-flagging this page, why wasn’t it taken down earlier? Time and time again, members of Facebook’s moderating team have noticed that their concerns about content are only acted on when there is a “press fire” surrounding the issue (such content is tagged #PRFire).
3. Trump and his seeming ability to change the rules
Facebook has often been criticised for its failure to hold Trump to account on its platform, which he, along with similar sites such as Twitter, often uses to spread misinformation and lies. The key motivation for Facebook here seems to be the potential loss of revenue should Trump be banned from the platform. Given Trump’s huge sway over conversations not only in the US but around the world via his social media presence, Facebook is well aware of the threat that censoring him could pose to its membership community.
When Facebook initially created its “Community Standards” (which would later be renamed the “Implementation Standards”), it outlined that it would prohibit all “content that directly attacks people based on race, ethnicity, national origin, or religion”. However, when Trump was running for the Republican presidential nomination back in 2015, he repeatedly used his Facebook page to call for “a total and complete shutdown of Muslims entering the United States”. According to the definitions outlined in the Community Standards, this was hate speech, and it should therefore have been deleted. Instead, Facebook’s executives announced that they would make “a one-time exception” for Trump’s post, despite the Standards stating that any “calls for exclusion or segregation” would be removed from the site.
Later, in 2017, a loophole for hate speech was created within the Implementation Standards. Facebook now decided that it would “allow content that excludes a group of people who share a protected characteristic from entering a country or continent”. In practice, this meant that Trump could get away with posting things such as “I am calling for a total and complete shutdown of Muslims entering the United States” and “We should build a wall to keep Mexicans out of the country”.
This set a precedent: Facebook showed it was willing to bend, nitpick and change its policies to suit those who contributed most to its revenue. The same pattern was seen again in October of last year, when Trump used his platform to spread lies and misinformation about his electoral opponent, Joe Biden. Again, Facebook rejected the idea that it should censor Trump, saying its “approach is grounded in Facebook’s fundamental belief in free expression”.
However, this stands in direct opposition to Zuckerberg’s own statement to Representative Alexandria Ocasio-Cortez, in which he said, “if anyone, including a politician, is saying things that are calling for violence. . . or voter or census suppression, we will take that content down”. This emphasises yet again that the Implementation Standards Facebook supposedly ‘enforces’ are really more a list of suggestions than an enacted framework.
4. So what has Facebook actually done since then?
As recently as August of this year, Facebook officially changed its guidelines. The changes include the following:
- organisations and movements that pose significant risks to public safety, including US-based militia organisations, will be restricted
- 300 militarised social movements have been banned from Facebook pages, groups and Instagram accounts
- all content relating to QAnon has been banned
- posts from Trump regarding misinformation regarding the coronavirus have been removed
- all political ads have been banned for an indefinite period beginning on Election Night
Whilst these actions are definitely a step in the right direction, many critics fear not only that the changes are too little, too late, but also that more structural changes to Facebook and its algorithm are needed if these problems are to stop cropping up.
Elizabeth Warren described the new plans as “performative changes”, stating that Facebook would ultimately still fail to change until it reformed “its broke[n] algorithm” and took “responsibility for the power it’s amassed”.
It appears that Warren is right. If Facebook continues to run on an algorithm in which the most scandalous and emotionally manipulative content captures the most attention, there is no hope that misinformation will cease to spread around the world like wildfire.
If you want to read Marantz’s article in The New Yorker, you can click here. Alternatively, if you want to learn more about Netflix’s documentary The Social Dilemma, you can also click here.
Subscribe to FIB’s Weekly Alchemy Report for your weekly dose of music, fashion and pop culture news!