Meta Reverses Decision To Leave Up Anti-Trans Facebook Post

Meta's Oversight Board reversed the company's decision to leave up a Facebook post encouraging trans people to kill themselves.

Illustration: Todd Franson; Original Photo: Mark Zuckerberg – Alessio Jacona, via Flickr

The Oversight Board for Meta, the parent company of Facebook, reversed a company decision to leave up a Facebook post that appeared to advocate for transgender individuals to kill themselves. 

The Board found that the post violated Meta's Hate Speech and Suicide and Self-Injury Community Standards, noting that the problem in this case was not the policies themselves, but Meta's enforcement of them.

The Board advised Meta to close gaps in how the company enforces its “community standards” by improving the internal guidance it gives to reviewers when they are determining whether users have violated those standards.

The offending post, which was placed online by a Facebook user in Poland in April 2023, included an image of a striped curtain in the blue, pink, and white colors of the transgender flag.

The text on the image, in Polish, read, “New technology… Curtains that hang themselves.”

Above that text were the words “spring cleaning” with a heart emoji next to it.

The user's profile biography included the description "I am a transphobe," which critics of the post pointed to as evidence that it was meant to be inflammatory and deliberately offensive to trans people.

The post received fewer than 50 reactions, with 11 different users reporting the post a total of 12 times. But only two of the 12 reports were prioritized for human review by Meta’s automated systems, while the other ten were closed without any action being taken.

The two reports that underwent human review, both alleging that the post violated Facebook's prohibition on content encouraging suicide or self-harm, were determined not to be in violation. Meanwhile, none of the reports alleging that the post was a form of "hate speech" were reviewed by Meta employees.

Three users subsequently appealed Meta’s decision to leave up the Facebook post, with one appeal resulting in a human reviewer upholding the original decision that the post didn’t violate the Suicide and Self-Injury Community Standard. But appeals alleging violation of Meta’s Hate Speech Community Standard never underwent human review. 

One of the users who reported the original post appealed to the Oversight Board, an independent body that reviews and issues binding rulings on content moderation cases across Meta's platforms, such as Facebook and Instagram.

The Oversight Board chose to hear the appeal of the decision to let the post stand. Meta then determined that the post violated both its Hate Speech and Suicide and Self-Injury policies, removed it from Facebook, and disabled the account of the user who posted it, though not for the anti-trans post but for previous, unrelated violations of community standards.

The Board subsequently ruled that the post violated Meta's prohibition on "hate speech" because it included "violent speech" in the form of a call for a protected-characteristic group's death by suicide. The post, which advocated suicide for transgender people, "created an atmosphere of intimidation and exclusion, and could have contributed to physical harm," the Board found.

Given the nature of the text and image, the post also exacerbated the mental health crisis affecting members of the transgender community, the Board wrote in an explanation of its decision. It also concluded that the post fits into an ongoing pattern of attacks and political rhetoric directed at the trans community by politicians and public figures in Poland.

“The Board urges Meta to improve the accuracy of hate speech enforcement towards LGBTQIA+ people, especially when posts include images and text that require context to interpret. In this case, the somewhat-coded references to suicide in conjunction with the visual depiction of a protected group (the transgender flag) took the form of ‘malign creativity.’ This refers to bad actors developing novel means of targeting the LGBTQIA+ community through posts and memes they defend as ‘humorous or satirical,’ but are actually hate or harassment.”

The Oversight Board expressed concern that Meta’s human reviewers did not pick up on, or perhaps deliberately ignored, context clues that would have allowed them to reach the conclusion that the post violated community standards.

It also said that Meta’s defense of the human reviewers, arguing that they hewed closely to existing guidance, reveals flaws in that guidance. Lastly, the Board criticized Meta’s automated review prioritization systems, which failed to flag the offending post as problematic.

The Board recommended that Meta clarify its Suicide and Self-Injury Community Standard to expressly forbid content that promotes or encourages suicide aimed at an identifiable group of people, not just at individual users.

It also recommended that Meta amend its internal guidance to clarify to human reviewers that “flag-based visual depictions of gender identity that do not contain a human figure are understood as representations of a group defined by the gender identity of its members.”

GLAAD, which submitted a comment to the Oversight Board ahead of its decision, celebrated the ruling and called on Meta to proactively address flaws in how the company handles alleged violations of its community standards.

“I personally want to hear Meta CEO Mark Zuckerberg tell the world, today, that his company cares about the safety, rights, and dignity of transgender people,” Sarah Kate Ellis, the president and CEO of GLAAD, said in a statement. “This dangerous hate on his platforms is causing devastating real-world harm and it must stop.”

Ellis noted that GLAAD, the Human Rights Campaign, and more than 250 LGBTQ celebrities and allies had signed onto an open letter in July 2023 urging all major social media companies, including Meta, to create a plan of action for how to deal with content expressing anti-transgender hate or animus on their platforms.

None of the social media companies ever responded to the letter. 

“This new Oversight Board ruling presents a vitally important opportunity for Meta,” Ellis said. “The company must address this urgent and terrifying phenomenon of violent anti-trans hate content. The weaponization of lies targeting historically marginalized groups has a long and terrible history and the spread of such disgusting bigotry should be vehemently and immediately denounced by Meta as not in alignment with their company values.”
