YouTube to ramp up fight against violent content

The YouTube CEO also revealed plans to launch a new comment moderation tool and, in some cases, to shut down comments altogether.

Addressing creators' monetisation concerns in a separate post on YouTube's Creators blog, Wojcicki said creators have made it clear that YouTube needs to review content more carefully so that valid videos are not demonetised.

YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.

Reuters reported last month that several major advertisers - such as Lidl and Mars - had pulled ads from the platform over "clips of scantily clad children".

The company is recruiting thousands of reviewers to reduce the amount of "problematic content" on its video platform.

To be sure, YouTube is more than a cesspool for haters and kooks, and Wojcicki reminded the world of that. Still, some 250 advertisers said earlier this year that they would boycott YouTube over extremist videos promoting hate and violence.


YouTube will also expand the number of reviewers who check that ads run next to appropriate videos.

The video-sharing platform has been criticised in recent weeks for failing to prevent predatory accounts and commenters from targeting children, and for the ease with which terrorist propaganda can be uploaded to the site. "Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors." The platform has already been using the technology to help remove violent extremist videos.

Google will increase the number of teams identifying and removing extremist content, hate speech and child cruelty from YouTube, following allegations that it profited while failing to remove unsuitable footage.

"We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should".

The technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess, according to Wojcicki.