Google pledges 10,000 staff to tackle extremist content

In an interview with the British daily newspaper The Daily Telegraph, YouTube's chief executive Susan Wojcicki said "bad actors" had used YouTube to "mislead, manipulate, harass or even harm" others.

In a blog post yesterday clearly intended to address these concerns, Wojcicki said the platform was "taking actions to protect advertisers and creators from inappropriate content".

To combat the issue, the video-hosting service intends to "apply stricter criteria and conduct more manual curation" while simultaneously boosting its team of human reviewers "to ensure ads are only running where they should".

It is hard to know at this stage whether machine learning can adequately flag disturbing content aimed at children; much of this material is difficult for an algorithm to recognise as disturbing or creepy, which is why human content reviewers are still needed.
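
The division of labour described here, machines for scale and humans for judgment, is commonly implemented as a confidence-based triage. The sketch below is purely illustrative of that general pattern, not YouTube's actual system; the thresholds, field names and scores are invented.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    # Hypothetical classifier score in [0, 1]: how likely the video
    # contains disturbing content aimed at children.
    disturbing_score: float

AUTO_REMOVE_THRESHOLD = 0.95   # assumed cutoff for automatic action
HUMAN_REVIEW_THRESHOLD = 0.50  # assumed cutoff for manual review

def triage(video: Video) -> str:
    """Route a video based on the classifier's confidence."""
    if video.disturbing_score >= AUTO_REMOVE_THRESHOLD:
        return "remove"        # algorithm is confident enough to act
    if video.disturbing_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"  # ambiguous case: needs human judgment
    return "allow"

for v in (Video("a1", 0.97), Video("b2", 0.62), Video("c3", 0.10)):
    print(v.video_id, "->", triage(v))
```

Only the middle band of uncertain cases reaches the human reviewers, which is how a relatively small workforce can sit behind an automated system screening millions of uploads.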

At the time, YouTube also said it was developing a way to redirect users who search for specific keywords on the site to playlists of videos that counter extremist content.
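
YouTube has not published how this redirection works, but the basic idea, intercepting flagged search terms and surfacing a counter-narrative playlist in place of the normal results, can be sketched as follows. The keywords, playlist ID and function here are invented for illustration.

```python
# Hypothetical example of keyword-triggered redirection; not YouTube's code.
REDIRECT_KEYWORDS = {"extremist propaganda", "join the cause"}
COUNTER_PLAYLIST_ID = "PL-counter-narratives"  # invented playlist ID

def search_results(query, default_results):
    """Return the counter-narrative playlist for flagged queries,
    otherwise fall through to the normal search results."""
    if query.lower().strip() in REDIRECT_KEYWORDS:
        return [COUNTER_PLAYLIST_ID]
    return default_results

print(search_results("extremist propaganda", ["video1", "video2"]))
# -> ['PL-counter-narratives']
```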

Going forward, she said that YouTube would be "carefully considering which channels and videos are eligible for advertising".

The video-streaming company has built automated software to identify videos containing extremist content.

In 2018, YouTube will grow its content-moderation workforce to more than 10,000 employees, who will screen videos for problems and, in doing so, help train YouTube's machine learning algorithms to find and remove problematic content aimed at children.

The company has been under increasing pressure from politicians, law enforcement and advertisers to remove content promoting terrorism, child pornography and other illegal activity from the video-sharing site.

Just last week it was reported that YouTube had removed 150,000 videos over predatory comments targeting children, and had disabled comments on more than 625,000 videos for the same problem.

The technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess, according to Wojcicki.
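
To put that figure in perspective, 180,000 reviewers at 40 hours each works out to 180,000 × 40 = 7.2 million person-hours of review work a week.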

Google earlier this year announced that it would give £1 million to fund projects that tackle extremism in the UK.
