Facebook launches independent body for content policy

November 17, 2018 • Online & Social, Top Stories

An independent body to be constituted in the coming year will act as a “higher court” of sorts

Over the past couple of months, social networking service Facebook has come under scrutiny over its data privacy and security practices.

At a media briefing on Thursday, 15 November, Facebook reported that it has ramped up its ability to quickly detect “hate speech” and other posts violating community rules, with the leading social network under pressure from regulators and activists in various countries to root out abusive and inappropriate content.

An independent body to be constituted in the coming year will act as a “higher court” of sorts, considering appeals of content removal decisions made by the social network, Facebook CEO Mark Zuckerberg said.

“I have come to believe that we shouldn’t be making so many decisions about free expression and safety on our own,” Zuckerberg said at the briefing.

The composition of the appeals body, along with how to keep it independent while remaining in line with Facebook principles and policies, is to be determined in the coming year.
Facebook also plans to begin releasing quarterly content removal reports next year, on the same cadence as its earnings reports, according to executives.

According to Zuckerberg, challenges faced by the California-based social network include the fact that people naturally tend to engage with more sensational content that, while perhaps at the edge of violating Facebook policies, is unhealthy for civilised discourse.

“We have made progress getting hate, bullying and terrorism off our network. It’s about finding the right balance between giving people a voice and keeping people safe,” Zuckerberg said.

Detecting bullying or hate can also require an understanding of the full range of languages used on Facebook, along with their cultural contexts.

“We are getting better at proactively identifying violating content before anyone reports it, specifically for hate speech and violence and graphic content,” Facebook said in the new transparency report.

Facebook said that since its last transparency report, the amount of hate speech detected proactively, before anyone reported it, has more than doubled.

Edited by Neo Sesinye
