Facebook: leaked documents revealed internal rules

Facebook is allowing users to share death threats, videos of self-harm and pictures of animal torture, it emerged yesterday.
The British newspaper The Guardian has obtained leaked copies of more than 100 internal documents outlining Facebook's rules for handling sensitive content, which reveal that staff moderating the social media website are told not to delete such material.

Images of self-harm may be removed from the site "once there's no longer an opportunity to help the person," unless the incident has news value, according to the documents. Facebook is said to have an extensive list of secret rules and guidelines for deciding what its 2 billion users can and cannot post.

The leaked policies also state that videos of violent deaths can be marked as disturbing but do not necessarily have to be deleted, because they can create awareness of “self-harm afflictions and mental illness or war crimes and other important issues”.
The site’s policies on a wide range of issues are said to be “overwhelming” its moderators, with one source telling The Guardian the site has “grown too big, too quickly” and “cannot keep control of its content”.

Moderators who spoke anonymously to The Guardian also revealed they find themselves overwhelmed with work and review more than 6.5 million reports of fake accounts a week.

They also say new challenges are creating more problems, citing revenge porn, in which users post nude or explicit photos of someone without their permission, as an example.

Facebook had no specific comment on the report but said safety was its overriding concern. "Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech," Facebook's head of global policy management, Monika Bickert, said in a statement.

"This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."

Facebook confirmed that it was using software to intercept graphic content before it went on the website, but said the technology was still in its early stages.

The company has taken flak for removing material with social significance, such as a livestream showing the aftermath of the shooting of a black man at a traffic stop in July, and a posting of an iconic Vietnam War photo that was taken down because it included child nudity.

To address the issue, CEO Mark Zuckerberg said earlier this month that Facebook will hire 3,000 more people over the next year to monitor reports about violent videos and other objectionable material. That team already had 4,500 people reviewing millions of reports every week.