Jessica Guynn, USA Today, December 3, 2020
Facebook puts a higher priority on detecting and deleting racist slurs and hate speech against Black people, Muslims, Jews, the LGBTQ community and people of more than one race than on statements such as “White people are stupid” and “Men are pigs.”
The company said Thursday its automated moderation systems are being retrained to focus on hate speech targeting historically marginalized and oppressed groups, which “can be the most harmful.”
“We have focused our technology on finding the hate speech that users and experts tell us is the most serious,” Facebook spokesperson Sally Aldous said in a statement.
Facebook has been reckoning with its role in systemic racism in the aftermath of George Floyd’s death. An internal civil rights audit faulted Facebook for prioritizing free expression over nondiscrimination. The audit was released in July as civil rights groups led a massive boycott of Facebook, and nearly 1,000 companies pulled millions of dollars in advertising to protest the spread of hate speech, violent threats and misinformation on Facebook’s platforms.
“Over the past year, we’ve also updated our policies to catch more implicit hate speech, such as content depicting Blackface, stereotypes about Jewish people controlling the world and banned Holocaust denial,” Aldous said. “Thanks to significant investments in our technology, we proactively detect 95% of the content we remove, and we continue to improve how we enforce our rules as hate speech evolves over time.”
Facebook bans hate speech based on race, gender and other characteristics. It relies on a set of rules called “Community Standards” to guide decisions about what violates that ban. The standards are enforced by computer algorithms and human moderators.
Under Facebook’s written hate speech policy, derogatory statements about men and white people still violate the rules, just as anti-Semitic statements or racial epithets do. What is changing is enforcement: the automated systems will now prioritize the violations deemed most harmful.
For years, civil rights activists have lobbied Facebook to change its policy of protecting all groups equally. Neil Potts, public policy director at Facebook, told USA TODAY last year that applying more “nuanced” rules to the daily tidal wave of content rushing through Facebook and its other apps would be challenging.