Facebook banned 400,000 posts mistakenly flagged as TERRORISM

Facebook has admitted to mistakenly marking more than 400,000 posts as terrorism-related and removing them from its platform.

The U.S. social media giant said Tuesday that the error resulted from a bug that caused its moderation algorithms to mislabel content.

Meta, the company that owns Facebook, shared the blunder in its quarterly community standards enforcement report.

It was forced to restore a total of 414,000 posts that had been wrongfully removed for violating policies related to terrorism.

A further 232,000 were binned because they were mistakenly deemed to be related to organised hate groups.

Apparently, the posts were blocked due to the same bug. The issue has now been resolved.


In its report, Meta said that it has nearly doubled the amount of violent content that it takes down from Facebook.

During the first quarter of 2022, Meta removed 21.7 million posts for breaking its rules on violent content.

That's up from 12.4 million in the previous quarter.

There were also rises in the amount of spam and drug-related content removed from Facebook and Instagram respectively.


But Meta said the prevalence of harmful content had decreased slightly in some areas, including bullying and harassment content.

That's due to improvements to the company's proactive detection technology.

"Over the years we've invested in building technology to improve how we can detect violating content," Meta vice president of integrity, Guy Rosen, said.

"With this progress we've known that we'll make mistakes, so it's been equally important along the way to also invest in refining our policies, our enforcement and the tools we give to users."

Mr Rosen also said the company was ready to refine policies as needed when new content regulations for the tech sector are introduced.

The UK's Online Safety Bill is currently making its way through Parliament and would introduce strict new content rules around online harms for platforms such as Facebook and Instagram.

The EU is also working on its own regulation, with a similar approach expected in the United States in the future too.

"As new regulations continue to roll out around the globe, we are focused on the obligations they create for us," Mr Rosen said.


"So we are adding and refining processes and oversight across many areas of our work.

"This will enable us to make continued progress on social issues while also meeting our regulatory obligations more effectively."

