Leaks 'expose peculiar Facebook moderation policy'

The guidelines Facebook uses to decide what users see are 'confusing', say staff

How Facebook censors what its users see has been revealed by internal documents, the Guardian newspaper says.

It said the manuals revealed the criteria used to judge whether posts were too violent, sexual, racist or hateful, or whether they supported terrorism.

The Guardian said Facebook's moderators were "overwhelmed" and had only seconds to decide if posts should stay.

The BBC understands the documents seen by the newspaper closely resemble those Facebook currently uses to guide staff.

The leak comes soon after British MPs said social media giants were "failing" to tackle toxic content.

Careful policing

The newspaper said it had managed to get hold of more than 100 manuals used internally at Facebook to educate moderators about what could, and could not, be posted on the site.

The manuals cover a vast array of sensitive subjects, including hate speech, revenge porn, self-harm, suicide, cannibalism and threats of violence.

Facebook moderators interviewed by the newspaper said the policies Facebook used to judge content were "inconsistent" and "peculiar".

The decision-making process for judging whether content about sexual topics should stay or go was among the most "confusing", they said.

The Open Rights Group, which campaigns on digital rights issues, said the report started to show how much influence Facebook could wield over its two billion users.

"Facebook's decisions about what is and isn't acceptable have huge implications for free speech," said an ORG statement. "These leaks show that making these decisions is complex and fraught with difficulty."

It added: "Facebook will probably never get it right but at the very least there should be more transparency about their processes."

'Alarming' insight

In a statement, Monika Bickert, Facebook's head of global policy management, said: "We work hard to make Facebook as safe as possible, while enabling free speech.

"This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously," she added.

As well as employing human moderators to look over potentially contentious posts, Facebook is known to use AI-derived algorithms to review images and other information before they are posted. It also encourages users to report pages, profiles and content they feel are abusive.

In early May, the UK parliament's influential Home Affairs Select Committee strongly criticised Facebook and other social media companies, saying they were "shamefully far" from tackling the spread of hate speech and other illegal and dangerous content.

The government should consider making sites pay to help police content, it said.

Soon after, Facebook revealed it planned to hire more than 3,000 additional people to review content.

British charity the National Society for the Prevention of Cruelty to Children (NSPCC) said the report into how Facebook worked was "alarming to say the least".

"It needs to do more than hire an extra 3,000 moderators," said a statement from the organisation.

"Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe."

Analysis: Rory Cellan-Jones, BBC Technology Correspondent

It has been clear for a while that dealing with controversial content is just about the most serious challenge that Facebook faces.

These leaked documents show how fine a line its moderators have to tread between keeping offensive and dangerous material off the site - and suppressing free speech.

A Facebook insider told me he thought the documents would show just how seriously and thoughtfully the company took these issues.

Why then does it not publish its training manual for moderators so that the world can see where it draws the line?

Facebook does publish community guidelines, but the company fears that giving away too much detail about its rules would act as a guide for those trying to game the system.

But what will strike many is that they have seen this kind of document before. Most big media organisations will have a set of editorial guidelines, coupled with a style guide, laying out just what should be published and how. Staff know that if they contravene those rules they are in trouble.

Of course, Facebook insists that it is a platform where people come to share content, rather than a media business.

That line is becoming ever harder to maintain, as governments wake up to the fact that the social media giant is more powerful than any newspaper or TV channel in shaping how the public sees the world.