Facebook releases community standards update


It’s no secret that Facebook has been in image repair mode for the better part of two years.

Although they won’t come right out and say it, Facebook puts part of the blame on us and some of the content we post and share.

In an effort to clean up their act (and ours, apparently), Facebook has spent considerable time and money deleting user posts that violate “community standards.”

To do this, they’ve hired an army of reviewers to eliminate bad content. They’re also constantly tweaking algorithms to identify potential violations.

Specifically, Facebook hopes to curtail posts that include adult nudity and sexual activity, fake accounts, hate speech, spam, terrorist propaganda, and violent and graphic content.

“Over the last two years, we’ve invested heavily in technology and people to more effectively remove bad content from our services,” said Guy Rosen, Facebook’s vice president of product management.

Facebook published the first report of these policing efforts in May.

That report was meant to show us the types of content that were detected and removed so that we could see just how well Facebook was enforcing its standards.

The report also revealed how much work is left to do.

“People will only be comfortable sharing on Facebook if they feel safe,” Rosen said.

A second report posted last week detailed improvements over the past six months. The update added new categories covering posts removed for bullying and harassment and for the sexual exploitation of children.

“We are getting better at proactively identifying violating content before anyone reports it, specifically for hate speech and violence and graphic content. But there are still areas where we have more work to do,” Rosen added.

Two content categories Facebook highlighted in its most recent report focused on successes in deleting posts with hate speech and graphic content.

The share of hate speech Facebook removed proactively, before users reported it, more than doubled from its first report, from 24 percent to 52 percent. Proactive detection of violent and graphic content also improved, from 72 percent to 97 percent.

The biggest win, according to Facebook, is that most of these posts are being removed before anyone sees them.

“The majority of posts that we take down for hate speech are posts that we’ve found before anyone reported them to us,” Rosen said.

To get a sense of the volume of content, consider this: during a single three-month period, Facebook took action on 15.4 million pieces of violent and graphic content. And that figure counts only the content acted on; millions of other pieces may have been reviewed and left up.

The report also noted the types of actions Facebook took on content violations.

“This included removing content, putting a warning screen over it, disabling the offending account or escalating content to law enforcement,” Rosen said.

You can read the report by searching “Community Standards Enforcement” at newsroom.fb.com.

Adam Earnheardt is chair of the department of communication at Youngstown State University. Follow him on Twitter at @adamearn and on his blog at adamearn.com.
