Facebook will begin banning white nationalist, white separatist, and white supremacist content, and direct users who attempt to post such content to the website of the nonprofit Life After Hate, which works to de-radicalize people drawn into hate groups.
The change, first reported Wednesday by Vice’s Motherboard, comes less than two weeks after Facebook was heavily criticized for its role in the Christchurch mosque attack. The gunman went live on the platform for several minutes before the attack began, showing off his guns and at one point ironically saying, “Remember lads, subscribe to PewDiePie,” a reference to the Swedish YouTuber connected to a number of racist and anti-Semitic controversies.
BuzzFeed News has reached out to Facebook for more information on how the ban will work. According to a blog post titled “Standing Against Hate,” which Facebook published today, the ban takes effect next week. As of midday Wednesday, the feature did not yet appear to be live, based on searches by BuzzFeed News for terms like “white nationalist,” “white nationalist groups,” and “blood and soil.”
“It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services,” the blog post reads. “Over the past three months our conversations with members of civil society and academics who are experts in race relations around the world have confirmed that white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups.”
Earlier this week, a French Muslim advocacy group filed a lawsuit against Facebook, along with YouTube, for not removing footage of the attack quickly enough.
Facebook did not respond to an inquiry from BuzzFeed News last week on whether white nationalism and neo-Nazism were being moderated using the same image-matching and language-understanding technology it uses to police ISIS-related content. According to internal training documents that were leaked last year, Facebook has typically not considered white nationalism intrinsically linked to racism.
Based on information in Motherboard’s report, the platform will use content-matching to delete images previously flagged as hate speech. There was no further elaboration on how that would work, including whether URLs to websites like the Daily Stormer would be affected by the ban.
The progressive nonprofit civil rights advocacy group Color of Change called Facebook’s new moderation policy a critical step forward.
“Color Of Change alerted Facebook years ago to the growing dangers of white nationalists on its platform, and today, we are glad to see the company’s leadership take this critical step forward in updating its policy on white nationalism,” the statement reads. “We look forward to continuing our work with Facebook to ensure that the platform’s content moderation guidelines and trainings properly support the updated policy and are informed by civil rights and racial justice organizations.”
In another change to Facebook’s moderation policy following public outcry, last month, the platform announced that anti-vax misinformation would appear less frequently across people’s News Feeds, public pages and groups, private pages and groups, search predictions, and in recommendation widgets around the site. The announcement came after weeks of pressure from lawmakers and public health advocates to crack down on anti-vax content.
Originally published at BuzzFeed News