LEAKED: Facebook’s Draconian Secret Rulebook for Regulating Political Speech
Facebook is trying to control political speech, but it’s doing it inconsistently and arbitrarily. A Facebook employee has given The New York Times over 1,400 pages from its rulebook on regulating political speech. He said he feared Facebook was exercising too much power.
The Times discovered that Facebook is censoring mainstream speech in some countries. In others, it allows extremist speech. For example, one moderator said he felt compelled by the rules to leave up a post that might incite violence. “You feel like you killed someone by not acting,” he said. Moderators in India were mistakenly told to flag for possible removal comments critical of religion.
The rulebook is a series of guidelines drawn up by several dozen Facebook employees who meet every other Tuesday morning. The guidelines are then sent to all of Facebook’s 15,000 moderators around the world.
Overly Broad Description of Hate Speech
Definitions of Hate Speech
• Call for exclusion or segregation
• Designated dehumanizing comparison
• Dehumanizing claims
• Call for disease
• Mocking hate crimes
• Visual hate
Moderators sort posts into three tiers of severity. Some rules are quite broad: mere calls for exclusion or segregation might count as hate speech, no matter what the content. Other rules determine when “martyr” or “jihad” indicates pro-terrorism speech, or when a banned group should not be discussed at all.
Words like “brother” or “comrade” may cross the line. The rules even warn about emojis that bully, mock or serve as a call to action.
Facebook maintains a list of people and groups it has banned for hate speech. The Times notes that not all of them are “on the fringe.” Facebook users are not allowed to support or praise them. Facebook bans more groups in countries where pressure is applied. Far-left activist groups like the Southern Poverty Law Center are able to exert undue influence on Facebook to get people and groups on the right in the U.S. banned. Facebook in fact partners with the SPLC.
Facebook has different standards depending on the country. For example, the rulebook runs into problems with calls for an independent Kashmir in India. Some legal scholars say Indian law prohibits that speech, but others disagree. Nevertheless, Facebook warns moderators to look out for the phrase “Free Kashmir.”
Problematic Moderators
The moderators are required to scan about 1,000 pieces of content a day. This leaves them with only eight to 10 seconds per post, longer for videos. Much of the work is outsourced to companies that employ mostly unskilled workers.
There is a legitimate tension between removing speech that is truly dangerous and avoiding the wrongful removal of relatively harmless language. Facebook hasn’t quite found the right balance. In the U.S., it errs on the side of removing too much harmless speech.
The Times says Facebook has become “arguably one of the world’s most powerful political regulators.” Facebook is a private company, not subject to the free speech protections of the Constitution. But because so much political speech today takes place on its platform, people have real concerns about how it controls what appears on the site.
No one elected Facebook to decide how speech should be regulated.
Follow Rachel on Twitter at Rach_IC. Send tips to rachel.alexander@stream.flywheelstaging.com.