Facebook and its parent company, Meta, have failed to identify violent hate speech in advertisements submitted by the non-profit organizations Global Witness and Foxglove. Global Witness ran a similar test in March, and the social media giant failed then too. The test could hardly have been easier, yet Facebook failed it again.
The group prepared 12 text-based advertisements that used dehumanizing hate speech to advocate for the killing of people from Ethiopia’s three main ethnic groups: Tigrayans, Amhara, and Oromo. Facebook’s review processes cleared all of the advertisements for publication, failing the test.
Facebook Has Failed With Hate Speech Tests Previously
Back in March, Global Witness conducted a similar test with hate speech in Myanmar, which also went undetected by the social media giant’s systems.
This time, however, the group notified Meta of the undetected violations. The company stated that the advertisements should not have been approved and pointed to its efforts to detect hateful content on its platforms. After hearing from Meta, Global Witness submitted two additional advertisements for approval, both containing blatant hate speech. Both ads, written in Amharic, Ethiopia’s most widely spoken language, were approved.
Meta’s Statements On Approval Of Violent Adverts
In November, Meta said it had removed a post by Ethiopia’s prime minister that urged citizens to rise up against the Tigray forces advancing on the country’s capital. In the since-deleted post, Prime Minister Abiy wrote that the “responsibility to die for Ethiopia belongs to all of us” and urged citizens to organize “with every weapon or capability.”
Abiy, who has 4.1 million followers on the platform, has continued to post. The United States and other nations have previously warned the Ethiopian government over such rhetoric, after the prime minister referred to Tigrayan forces as “weeds” and “cancer” in earlier statements.
What Can Be Concluded From The Repeated Approval Of Hateful Content On Facebook
“When advertisements advocating for genocide in Ethiopia repeatedly get through Facebook’s net, even after the matter is flagged to Facebook, years after the slaughter in Myanmar, it is evident that Facebook has not learned its lesson,” said Rosa Curling, director of Foxglove, a London-based legal non-profit that collaborated with Global Witness on the investigation.
Meta Has Failed To Firmly Enforce Its Community Standards On Social Media
For a social media giant such as Meta, the inability to enforce its own community standards on Facebook and Instagram (by failing to recognize, label, or remove violations) is a massive failure. Users have previously reported that the company interprets these standards so narrowly that enforcement was, in some cases, insufficient to begin with.
A recent study found that Meta’s hate speech policy does not adequately protect transgender and nonbinary users, or users who do not speak English. Media Matters identified over a thousand violations of Meta’s hate speech policy, including Instagram profiles promoting white supremacy and comments spreading anti-LGBTQ slurs.
Media Matters also found over 700 breaches of Meta’s election policy, including entire Facebook groups whose sole purpose was to misinform people, as well as unlabeled posts containing election misinformation.
This article is published by Infostor.com ©