The internal documents lay out the social media giant’s stance on moral and ethical decisions related to user content.
Facebook’s secret rules and guidelines, which lay out how moderators should tackle everything from self-harm to violence and hate speech, reveal the firm’s daily struggle to control the content shared on its network.
Facebook Live is a livestreaming service originally intended to rival other livestreaming platforms, including YouTube, Ustream, and Twitch. Recording and sharing video footage on the go has become popular in recent years, and since Facebook added the feature to its platform, more of us than ever are recording or tuning into our contacts’ streams.
However, criticism has been leveled against Facebook since the feature’s launch, as the livestreaming service has been used not only for benign videos but also to broadcast a number of violent incidents.
According to the Wall Street Journal, at least 50 acts of violence have been broadcast to the network’s 1.8 billion users, including live suicides, murders, and sexual assaults.
The line between free speech and protecting users from such content is a difficult one to draw. Due to the size of the network, Facebook often takes a few hours, even with the assistance of user reports, to take down unacceptable footage. To help shoulder this task, CEO Mark Zuckerberg has promised to hire an additional 3,000 people on top of the 4,500 members of staff already monitoring livestreams and video uploads.
In the meantime, moderators need to know where the line is drawn between unacceptable content and footage that should not be censored or taken down.
On Sunday, The Guardian revealed the social media giant’s files. In over 100 internal training manuals, spreadsheets, and charts, Facebook has laid out guidelines for everything from violence and terrorism to pornography, racism, self-harm, and even cannibalism.
Threats against so-called “protected categories,” such as President Trump, should be deleted, according to the publication’s files, and so “Someone shoot Trump” is not acceptable to post. However, general, emotive threats such as “f*** off and die” are allowed to stand.
In addition, threats which are not considered credible, such as “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat,” are permissible.
The files say that videos of violent deaths, alongside non-sexual physical abuse and child bullying, as well as self-harm livestreams, may be marked as “disturbing” but do not have to be deleted unless there is a “sadistic or celebratory element,” according to The Guardian.
The files reportedly state that such footage can be part of awareness campaigns and can “be valuable in creating awareness for self-harm afflictions and mental illness or war crimes and other important issues.”
Facebook will allow livestreams of self-harm for a time, as the company “doesn’t want to censor or punish people in distress who are attempting suicide,” but the footage will be deleted once there is “no longer an opportunity to help the person.”
In addition, images of animal abuse may be shared, as may artistic works showing nudity and sexual activity, as long as they have not been digitally created.
It will always be difficult, if not impossible, for everyone to agree on which content is acceptable to users and which violates community standards.
By its nature, what content you would consider “offensive” or “obscene” is subjective, with potentially a few exceptions such as child abuse, and maintaining the “right” balance of censorship, deletion, and free speech is a challenge that will only get harder for Facebook to tackle as the network expands and digital video viewing becomes even more popular.
“Keeping people on Facebook safe is the most important thing we do,” Monika Bickert, head of global policy management at Facebook, said in a statement to sister site CNET. “In addition to investing in more people, we’re also building better tools to keep our community safe.”
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards, and easier for them to contact law enforcement if someone needs help,” Bickert added.