Facebook releases long-secret content removal policy


An international daily spoke to Monika Bickert, head of global policy management at Facebook, who said, "These are issues in the real world".

"We do not allow hate speech on Facebook because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence," the standards read.

Facebook has for the first time published its Community Standards that dictate what its nearly 2 billion users can and can't post on the site.

The company can recognize images that have been posted before but cannot recognize new images.

The Community Standards were developed by Facebook's content policy team.

However, if you go too far, Facebook might remove the content on other grounds, such as hate speech or threats.

For example, Facebook says it won't allow hate speech about "protected characteristics". The company is also introducing a new appeals process, allowing users to request a review if they believe their post has been removed unfairly. (Until now, Facebook users have been able to appeal the shutdown of an entire account but not the removal of individual posts.) The Washington Post previously documented how people have likened this predicament to being put into "Facebook jail" - without being given a reason why they were locked up.

In some cases, entire profiles, pages, or groups have been taken down for violating Facebook policies, making all of their content unavailable. Given that engagement from Facebook India is controlled and barely existent, the appeals process will be useful for many complainants. Under pressure from several governments, Facebook has been beefing up its moderator ranks since 2017.

"I worked on everything from child safety to counter terrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher".

"In the context of child exploitation imagery, we use technology in order to stop the re-upload of known child exploitation images", she said. At the events, Facebook representatives will discuss the platform's content policies and request feedback. Also in the new rules expanded the list of situations when Facebook will indicate the specific reason for deleting content.

At the April 17 meeting, about 25 employees sat around a conference table while others joined by video from New York, Dublin, Mexico City, Washington and elsewhere. There was little mention of what competitors such as Alphabet Inc's Google do in similar situations.

Facebook says the consequences of breaching its Community Standards vary depending on the severity of the breach and a person's history on Facebook. For example, video of people affected by cannibalism is generally not allowed, but such videos are permitted if a warning appears and the patient "is in a medical situation".

For the first time, the social network is publishing detailed guidelines on what does and doesn't belong on its service - 27 pages' worth of them, in fact. Presently, the team includes 7,500 content reviewers, 40% more than the number it had the previous April. "Now everybody out there can see how we're instructing these reviewers," said Monika Bickert, Vice President of Product Policy and Counterterrorism, last week at a press briefing at Facebook's Menlo Park, California headquarters, according to an article on CNET.

Content deemed permissible under the Community Standards but in violation of local law - such as a prohibition in Thailand on disparaging the royal family - is blocked in that country, but not globally.
