
What do Facebook's internal content guidelines reveal?


The social media company’s rulebook says it won’t necessarily remove violent or abusive images. But should it?
There’s no smoking gun in the documents, published by the Guardian newspaper: they don’t differ hugely from the guidelines Facebook itself publishes. But they offer an insight into just how tricky these issues are, and show some inconsistencies in the grey areas.
Dealing with millions of flagged posts is a team of just 4,500 moderators — although as Mark Zuckerberg recently announced in his own Facebook post, 3,000 more will be joining them.
That’s a total of 7,500 moderators for nearly 2 billion users.
Some of the details seem shocking at first sight, namely that Facebook won’t necessarily remove images of violent death, non-sexual abuse of children and self-harm.
For each, though, there’s a counterargument.
Violent death might be part of a news story that deserves an audience. Non-sexual abuse or bullying of children is allowed so long as there is no sadistic or celebratory element — again, creating awareness of a problem might override that.
And with live-streaming self-harm, Facebook’s response is in fact quite nuanced: “We don’t want to censor or punish people in distress who are attempting suicide,” the Guardian reports. “Experts have told us what’s best for these people’s safety is to let them live-stream as long as they are engaging with viewers.”
And beyond all of these instances is a concern not to act as a global censor.
Politicians have called on Facebook to do more to remove this content.
But do we really want Facebook to decide what is and what isn’t appropriate speech?
We already have laws to deal with real abuse and hate speech. Those should be enforced properly online, rather than putting the onus on Facebook.
But Facebook’s scale makes the challenge much bigger. Images and posts can circulate widely before they’re flagged and removed, and they can be reuploaded time and time again.
Part of the challenge is that each country has its own norms and laws. Some might find it ridiculous that the Thai government can threaten Facebook with legal action over anti-monarchy posts.
But those are the local laws and it makes more sense for Facebook to obey them than ride roughshod over them.
I think there are a couple of other options for content here in the UK.
One is that Facebook signs up to an Ofcom-like regime, as broadcasters do here. If someone complains about a post, as they do with a broadcast, it could be investigated. More leeway would be given to Facebook, as a platform with 30 million users here in the UK. The emphasis would be on how quickly it removes content, rather than on whether it should have been published in the first place.
Regulating social media has an Orwellian tinge, though.
Another option: all social media companies could sign up to a voluntary, independent body, a bit like the Independent Press Standards Organisation (IPSO). This could offer quick redress to people who make complaints. It should also offer more transparency on moderation processes, whether on Facebook, YouTube or Twitter. Like IPSO, it would be self-regulating.
IPSO is the successor to the maligned Press Complaints Commission. Despite its existence, newspapers routinely published inaccurate and damaging stories.
We’re seeing a similar thing with Facebook — just on a massive scale. With newspapers, no one has found a solution that strikes the balance between protecting newspapers’ freedom of speech and protecting the subjects of their stories.
It’s going to be even harder doing the same for 2 billion people.
