Facebook reports spike in violent content

In an 86-page report, Facebook revealed that it deleted 865.8 million posts in the first quarter of 2018. The vast majority were spam, with a smaller share removed for nudity, graphic violence, hate speech and terrorism.

The report provides figures for each category of violation: content featuring nudity or sexual activity, graphic violence, terrorist propaganda, hate speech, spam and fake accounts. The company has pledged in recent months to use facial recognition technology - which it also uses to suggest which Facebook friends to tag in photos - to catch fake accounts that use another person's photo as their profile picture.

Facebook took action on 3.4 million pieces of graphically violent content in the first three months of this year, either removing it or labelling it with a warning - almost triple the 1.2 million pieces in the previous three months. The company's improved detection systems flagged 85.6 per cent of that content before any user reported it.

Fake accounts plague social networks beyond Facebook: Twitter and YouTube also contend with bots flooding their platforms.

The amount of content moderated by Facebook reflects both the company's ability to find and act on infringing material and the sheer quantity of items posted by users.

The social network said it took down 21 million pieces of content featuring adult nudity in the first three months of the year, according to its first Community Standards Enforcement Report. Nearly 100 per cent of the spam and 96 per cent of the adult nudity was flagged for takedown, with the help of technology, before any Facebook users complained about it. Facebook cautioned, however, that "technology like artificial intelligence, while promising, is still years away from being effective for most bad content because context is so important".

The report does not reveal how many incorrect takedowns happen, or how many appeals result in content being restored. Nor did the company say how often offending content was viewed before removal, describing that number only as "extremely low".

Facebook says 0.22-0.27% of content views in the period - the equivalent of 22 to 27 in every 10,000 views - were of material that violated its standards on graphic violence.

This, the company says, is because there is little such content in the first place and because most of it is removed before it is widely seen. Facebook found and flagged 95.8% of it before users reported it.

Facebook plans to continue publishing enforcement reports and to refine its methodology for measuring how much rule-breaking content circulates on the platform.

"These kinds of metrics can help our teams understand what's actually happening to 2-plus billion people", he said. But only 38 percent had been detected through Facebook's efforts - the rest flagged up by users.

The social media giant said the report will be the first in a series measuring how prevalent violations of its content rules are, how much content it removes or otherwise takes action on, how much it finds before users flag it, and how quickly it acts on violations.
