Carolyn Wysinger’s post removed for comments about the fragility of white men.
As social media platforms take a more aggressive stance against racist, abusive, and hateful language, those efforts sometimes end up blocking conversations about race and racism itself. The heightened likelihood that Black users will have posts removed or accounts suspended has been referred to as “Facebooking while Black.”
As covered in USA Today, the situations can become complicated quickly:
A post from poet Shawn William caught [Carolyn Wysinger’s] eye. “On the day that Trayvon would’ve turned 24, Liam Neeson is going on national talk shows trying to convince the world that he is not a racist.” While promoting a revenge movie, the Hollywood actor confessed that decades earlier, after a female friend told him she’d been raped by a black man she could not identify, he’d roamed the streets hunting for black men to harm.
For Wysinger, an activist whose podcast The C-Dubb Show frequently explores anti-black racism, the troubling episode recalled the nation’s dark history of lynching, when charges of sexual violence against a white woman were used to justify mob murders of black men.
“White men are so fragile,” she fired off, sharing William’s post with her friends, “and the mere presence of a black person challenges every single thing in them.”
Facebook quickly deleted the post, claiming that it violated the site’s “hate speech” policies. Wysinger was also warned that attempting to repost the content would lead to a 72-hour ban.
Under Facebook’s rules, an attack on a “protected characteristic” (such as race, gender, sexuality, or religion) violates its “hate speech” policies. Wysinger’s post was flagged for deletion because it targeted a group defined by such a characteristic: in this case, “white men.”
Questions to consider:
- How should a site handle sensitive conversations regarding discrimination?
- If a policy defines “protected characteristics,” are all groups defined by one of those characteristics to be treated equally? If so, is that in itself a form of disparate treatment for historically oppressed groups?
- If not, does that risk accusations of bias?
- Is there any way to take wider context into account during human or technological reviews?
- Should the race/gender/sexuality/religion of the speaker be taken into account? What about the target of the speech?
- Is there a way to determine if a comment is “speaking up” to power or “speaking down” from a position of power?
In the case described above, Wysinger decided not to risk losing her Facebook access for any amount of time, and simply declined to repost the statement about Liam Neeson. Facebook, for its part, has continued to adapt and adjust its policy. It streamlined its “appeals” process to handle many of these kinds of cases, and has announced (after two years of planning and discussion) the first members of its Facebook Oversight Board, an independent body tasked with reviewing particularly tricky content takedown cases on the platform.
Case study written by The Copia Institute, June 2020.