Get an inside look at how Facebook censors its content.
Facebook has officially hit 2 billion users. Mark Zuckerberg, the social network’s CEO, announced the milestone on Tuesday, saying, “As of this morning, the Facebook community is now officially 2 billion people! We’re making progress connecting the world, and now let’s bring the world closer together.” But policing a social network with that many users, drawn from very different cultures and communities, is no simple task, as internal documents published by ProPublica reveal.
Under a 1996 federal law (Section 230 of the Communications Decency Act), most tech companies are not required to censor user-generated content, so Facebook is free to set its own censorship rules. The company currently employs around 7,500 content reviewers, who act as human censors and follow a specific algorithm when deciding whether to delete posts. The result is a platform where certain subsets of people can be targets of online attacks while others are protected.
Protected categories (PC):
Race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation, and serious disability/disease.
Non-protected categories (NPC):
Social class, continental origin, appearance, age, occupation, political ideology, religions, and countries.
Facebook deletes the following types of attacks when directed at protected categories:
Calls for violence, calls for exclusion, calls for segregation, degrading generalizations, dismissing, cursing, and slurs.
Facebook created rules to guide its content reviewers on whether to delete or allow certain posts.
PC + PC = PC
Example: African-American + Jewish = Protected category
PC + NPC = NPC
Example: African-American + child = Non-protected category
Therefore, an attack on African-American Jews would be deleted by Facebook content reviewers. But an attack on African-American children would not be deleted because age is not a protected category.
Under this algorithm, white men are a protected category (race and sex are both protected), while African-American children are not (age is not); the sketch below illustrates the rule.
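Here is a minimal sketch of that combination rule, assuming a simple set-based representation. The category names and the function are illustrative, not Facebook's actual code:

```python
# Hypothetical sketch of the PC/NPC combination rule described above.
PROTECTED = {
    "race", "sex", "gender identity", "religious affiliation",
    "national origin", "ethnicity", "sexual orientation",
    "serious disability/disease",
}

def is_protected(categories):
    """A group keeps protection only if every category describing it
    is protected: PC + PC = PC, but a single non-protected modifier
    (age, occupation, etc.) yields PC + NPC = NPC."""
    return all(c in PROTECTED for c in categories)

# African-American Jews: race + religious affiliation -> protected
print(is_protected({"race", "religious affiliation"}))  # True
# African-American children: race + age -> not protected
print(is_protected({"race", "age"}))                    # False
# White men: race + sex -> protected
print(is_protected({"race", "sex"}))                    # True
```

In effect, protection is the logical AND of the individual categories’ statuses: one non-protected trait strips protection from the whole group.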
More recently, Facebook introduced a third designation: quasi-protected (QPC). It applies to migrants, who are protected only against calls for violence and dehumanizing generalizations; they are not protected against calls for exclusion or against degrading generalizations that stop short of dehumanization.
PC + QPC = QPC
NPC + QPC = NPC
Therefore, Muslim migrants are a quasi-protected category (religious affiliation is protected, migrant status is quasi-protected), but teen migrants are a non-protected category (age is non-protected).
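Extending the same sketch to the three-tier scheme shows how the QPC rules reduce to taking the weakest tier in any combination. The tier assignments and blocked attack types follow the rules described above; the names and structure are assumptions, not Facebook's implementation:

```python
# Hypothetical sketch of the three-tier deletion decision.
from enum import IntEnum

class Tier(IntEnum):
    NPC = 0   # non-protected
    QPC = 1   # quasi-protected (migrants)
    PC = 2    # protected

# Illustrative tier assignments for the categories in the examples.
TIERS = {
    "religious affiliation": Tier.PC,
    "race": Tier.PC,
    "migrant status": Tier.QPC,
    "age": Tier.NPC,
}

# Attacks that trigger deletion at each tier (dehumanizing generalizations
# are treated here as a subset of degrading ones, per the rules above).
BLOCKED = {
    Tier.PC:  {"violence", "exclusion", "segregation",
               "degrading generalization", "dehumanizing generalization",
               "dismissing", "cursing", "slur"},
    Tier.QPC: {"violence", "dehumanizing generalization"},
    Tier.NPC: set(),
}

def combined_tier(categories):
    # PC + QPC = QPC and NPC + QPC = NPC both reduce to the minimum tier.
    return min(TIERS[c] for c in categories)

def should_delete(attack, categories):
    return attack in BLOCKED[combined_tier(categories)]

# Muslim migrants (PC + QPC = QPC): violence deleted, exclusion allowed.
print(should_delete("violence", {"religious affiliation", "migrant status"}))   # True
print(should_delete("exclusion", {"religious affiliation", "migrant status"}))  # False
# Teen migrants (NPC + QPC = NPC): nothing is deleted.
print(should_delete("violence", {"age", "migrant status"}))                     # False
```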
“The policies do not always lead to perfect outcomes,” Monika Bickert, head of global policy management at Facebook, told ProPublica. “That is the reality of having policies that apply to a global community where people around the world are going to have very different ideas about what is OK to share.” Danielle Citron, a law professor at the University of Maryland and an expert on information privacy, disagrees, saying the rules are “incorporating this color-blindness idea which is not in the spirit of why we have equal protection.” Citron also believes the “rules protect the people who least need it and take it away from those who really need it.”