YouTube creating team of 10K to moderate, purge dicey videos


Wojcicki said machine learning is helping YouTube's human reviewers remove almost five times as many videos as previously, and that today, 98 per cent of the videos removed for violent extremism are flagged by machine-learning algorithms.

Apart from expanding its current staff, the company plans to crack down on inappropriate comments by introducing new comment-management tools and blocking mechanisms.

Google is responding to the latest scandal it finds itself in, involving pedophiles posting comments on YouTube videos of children, by revealing a plan to boost its video content moderation team to as many as 10,000 people.

Several advertisers, including Mars Inc., Adidas and Diageo, said they would pull their campaigns off YouTube in the aftermath, fearing the videos would attract pedophiles, according to the Wall Street Journal.

In recent weeks, there have been reports of disturbing videos aimed at children, and of pedophiles posting comments on children's videos.


YouTube is also taking steps to reassure advertisers that their ads won't run next to objectionable videos. "Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors," Wojcicki wrote.

According to Wojcicki, YouTube has spent the past year "testing new systems to combat emerging and evolving threats" and investing in "powerful new machine learning technology", and is now ready to apply this expertise to tackle "problematic content".

"Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content", the CEO wrote in a blogpost, saying that moderators have manually reviewed almost 2m videos for violent extremist content since June, helping train machine-learning systems to identify similar footage in the future. "We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should", concluded Susan.

She said adding more people to identify inappropriate content will provide more training data for, and potentially improve, its machine-learning software.
