Facebook's Rules for Managing Political Speech


A New York Times article describes the complex rulebook Facebook uses to help employees decide how to handle political speech on the site. The decision process is daunting, and the guidelines include extensive examples of content, both text and visuals, that may indicate hate or inspire violence.

The author questions whether Facebook employees are making rational, consistent decisions:

The guidelines for identifying hate speech, a problem that has bedeviled Facebook, run to 200 jargon-filled, head-spinning pages. Moderators must sort a post into one of three “tiers” of severity. They must bear in mind lists like the six “designated dehumanizing comparisons,” among them comparing Jews to rats.

Others say Facebook has too much power because the company controls speech in international political situations. For example, during a 24-hour media blackout before an election in Pakistan, Facebook may have been the go-to source for news. In advance, Facebook distributed 40 pages of “political parties, expected trends, and guidelines” to its employees. But the guidelines sometimes contradict one another, and Facebook relies on Google Translate, which may not be accurate or precise enough.


Discussion:

  • When Facebook faced criticism about sharing information with technology partners, company leaders responded in a blog post. If they were to do something similar in this case, what should they say to rebuild confidence in the site?

  • Although the rules can be a “bit baffling,” as the author says, Facebook is trying to address other criticism about its response to hate and violence on the site. What, if anything, can the company do differently to accept accountability?