The Guardian has obtained more than 100 internal training manuals, spreadsheets, and flowcharts that show how Facebook handles the moderation of sensitive content involving violence, terrorism, pornography, hate speech, and other closely guarded guidelines of the social network.
The rules are recent and were disclosed to the company’s employees who review reported posts, but someone inside leaked them, shocking the press by revealing how decisions on these topics are made.
For example, under Facebook’s rules, posting violent or misogynistic comments like “kick that girl with green hair” is completely allowed, while anyone who suggests killing President Trump is banned immediately from the social network (in the US, threatening the president is a criminal offense).
Real vs. digital world
For the most popular social network in the world, which reaches close to 2 billion people, what users say online does not necessarily represent what they would say face to face in the real world, as the following passage makes explicit:
“We intend to allow as much speech as possible, but we need to draw the line at content that can cause credible damage in the real world. People commonly express disdain or disagreement by threatening or calling for violence in ways that are not meant seriously.
Our goal is to stop the potential for real harm caused by people who incite or coordinate damage to other people or property, requiring certain details to be present in order to consider a credible threat. In our experience, it is this detail that helps establish that a likely threat will occur.”
This leaves room for comments like “to break a woman’s neck, put pressure on the middle of her throat” not to be deleted. After all, according to Facebook, they are not a direct threat. The same goes for videos of very violent deaths. As much as they are marked as disturbing, the social network will not delete them because, according to the company, these videos help to create awareness of problems.
Fun Fact: handmade art that depicts nudity or sexual activity is permitted; digitally made art is not.
Photos of physical abuse of children, as long as they are not sexual in nature, do not have to be deleted unless they contain sadistic elements. The same goes for photos of animal abuse, which can be shared without any problem. Abortion videos are also allowed if they don’t show nudity.
Live videos showing self-harm? Allowed. The argument is that Facebook is not a publisher and therefore cannot punish or censor “suffering” people. “Moderate displays of sexuality such as open-mouth kisses, simulated sex while clothed, and pixelated sexual activity,” as well as sexual jokes, are also allowed. Likewise, offensive expressions containing sexual profanity are not banned.
Facebook now also allows, among other things, “photos of very young nude babies if the focus is not on the genitals… [and] adult nudity images in the context of the Holocaust”. However, images of the Holocaust depicting nude children should be removed if users complain.
Fun Fact: the page of a person with over 100,000 followers is considered a public figure, depriving them of the same protections given to private individuals.
How to moderate billions of posts every day?
Monika Bickert, head of global policy management at Facebook, explained that it’s impossible to police the entire diverse global community that exists on the network. And that’s when we’re only talking about the English language. Faced with pressure in recent months, Facebook is hiring 3,000 new moderators, in addition to the 4,500 it already employs, to try to reduce the spread of hate speech and child exploitation.
“We have a very diverse global community and people will have very different ideas about what is OK to share. No matter where you draw the line, there will always be gray areas. For example, the line between satire and humor is very thin, it’s very gray. It’s very difficult to decide whether some things belong on the site or not,” Bickert tells The Guardian.
The executive adds: “We feel responsible to our community to keep them safe, and we feel accountable. It’s absolutely our responsibility to stay on top of it. It’s a company commitment. We’ll continue to invest proactively to keep the social network safe, but we also want to empower people to report any content that violates our standards.”
Combating sextortion and revenge porn
Although some of the rules have shocked the press, there is also good news in this story. In January, Facebook deactivated more than 14,000 accounts that were spreading sexual extortion and revenge porn, including 33 incidents involving children.
Facebook recently declared war on sensational headlines and took the opportunity to say that it is developing and testing various tools to combat the spread of such content.
What do you think of Facebook’s policies? Do you agree with them? Share your opinion in the comments!
Source: The Guardian