
Meta struck 21.7 million pieces of violence-inciting content in Q1 2022

Meta, Facebook’s (FB) parent company, has been redoubling its efforts to check inflammatory posts. According to a TradingPlatforms.com analysis, Meta struck 21.7 million violence-inciting posts and comments in Q1 2022. That’s a 175% jump from the number of similar actions it took in Q4 2021.

That revelation has caught the attention of TradingPlatforms’ Edith Reads. She holds, “Meta owes its 2.8B monthly FB users the responsibility of ensuring they have wholesome interactions on it. Its actions are an admission of FB’s potency as a communication medium. It’s an acceptance of its importance in reining in any content urging violent and illegal activity.”

FB has advanced its detection capabilities

Facebook has faced accusations that it is ineffective at curtailing hate-mongering and misinformation. These accusations came to light following Frances Haugen’s leaking of FB’s internal communications on the matter.

The Haugen leaks portray the social media giant as lacking the staff and local-language expertise to flag incendiary content. FB’s artificial intelligence (AI) systems fare no better. According to the leaks, those AI tools lack algorithms capable of effectively screening some native tongues.

Meta has, however, moved to assure its users of its commitment to upholding ethical postings on its platforms. It claims to have improved at detecting and removing hateful content in a timely manner. It has achieved that by adopting an expanded, proactive system that enables it to neutralize 98% of malicious content before users report it.

FB is no stranger to controversy

Controversy seems to be FB’s second nature. A UN investigation into the ethnic cleansing of Myanmar’s Rohingya Muslims linked FB to the spreading of hate against them. The Haugen documents show FB lacked classifiers for flagging disinformation and hate-mongering in Burmese, Oromo, and Amharic.

News outlet Reuters claims it discovered posts in Amharic, a common language in Ethiopia, labeling some ethnic groups as enemies and calling for their deaths.

The firm also hit the headlines recently after the live streaming of a racially motivated shooting in Buffalo, NY. Many have criticized FB for taking too long to pull down the footage.

The full story and statistics can be found here: Meta struck 21.7 million pieces of violence-inciting content in Q1 2022
