The social media giant has published new guidelines on the standards it expects from its users and has urged people to “share responsibly”.
In an exclusive interview with Sky News, one of Facebook’s top executives has admitted the company’s enforcement of its policies is not perfect but insisted the company has the interests of its users at heart.
Siobhan Cummiskey, Facebook’s head of policy for Europe, the Middle East and Africa, said: “We absolutely do have the interests of the community of people who use Facebook at heart, and safety is really important to us, and that’s really why we are publishing this new set of community standards.”

Facebook is also publishing, for the first time, a copy of the guidelines its 7,500 content reviewers use to help weigh up whether a post violates the company’s policies and should be removed.
She added that Facebook was in the process of hiring more people to help review content.
“It’s very important that we use a combination of technology, human reviewers and the flagging of problem content in order to remove posts that violate our community standards,” she said.
“We use automation in order to route the various reports that we receive every week. We receive millions of reports every week, and automation helps us to route those reports to the right content reviewer.
“Technology is therefore really helping us here. In the context of child exploitation imagery, we use technology in order to stop the re-upload of known child exploitation images.
“Technology is also helping to counter terrorism. Ninety-nine percent of terrorist content is removed before it is ever flagged by our community of users.”

The new community standards, published on Facebook’s platform from Tuesday, highlight the firm’s determination to act on unacceptable content, but are also an admission that the organisation needs to improve.
Facebook states: “Our policies are only as good as the strength and accuracy of our enforcement and our enforcement isn’t perfect. We make mistakes because our processes involve people and humans are not infallible.”
Facebook has been under heavy criticism in recent weeks, with the company’s founder Mark Zuckerberg forced to apologise to US politicians over the Cambridge Analytica data mining scandal.
The company has also been accused by a number of governments of not doing enough to tackle terrorist material online.
On Sunday, Health Secretary Jeremy Hunt gave the company until the end of the month to come up with a more robust system for protecting children on Facebook.
Ms Cummiskey said she would welcome the opportunity to work with Mr Hunt and the UK government more generally on improving safety.
“Safety is extremely important to us and I think we’re only getting better,” she said.
“There’s always more that we can be doing in this space and that’s really what today is all about. It’s about being very clear that harmful content has no place on Facebook, while at the same time we are trying to create a platform for all ideas.”