YouTube will penalize extremist videos that don’t qualify for removal

Why it matters to you

YouTube promised to crack down on extremist content, and those changes have now led to removing twice as many offending videos.

About 75 percent of YouTube's extremist videos are now removed by an artificial intelligence program before ever being flagged by a human — but now the platform will also penalize content in a gray area: videos that do not violate its usage policies but still contain controversial hate speech or extremism. On Tuesday, in an update on its effort to fight terror-related content, YouTube shared progress on its current efforts as well as where it is headed next.

In July, the company introduced four new focus areas to help remove extremist content, a tall order considering YouTube has around 400 hours of video uploaded every minute. Those areas included both more software detection through AI and more human evaluators. While the change was implemented less than two months ago, the AI software has improved in both speed and accuracy, YouTube says, in many cases removing content before the video was flagged by a viewer. That same software has helped double the number of videos removed from YouTube for extremist content.

YouTube says it is continuing to make more hires as well as refine the software.

YouTube now also has more than 15 organizations on an advisory board to help shape new policies on what is and is not allowed on the platform. That number will continue to grow, the company says.

Removing content from YouTube is often a balance between prohibiting offensive and potentially extremist content and protecting free speech. For videos that do not violate the terms of use enough to warrant removal, but still contain "controversial religious or supremacist content," YouTube will penalize the videos by removing several features. Videos that fall into that gray area will have comments and likes disabled, and they will not show up in suggested videos.

Videos in that category also cannot be monetized, part of a change to YouTube's content guidelines announced in June. The change was prompted, in part, by several advertisers leaving the platform after their ads were placed alongside offensive content.

The changes do not just target creators of extremist videos, however — when a user searches for keywords related to those extremist videos, YouTube now surfaces a curated playlist of videos that debunk the extremist messages instead.

“Altogether, we have taken significant steps over the last month in our fight against online terrorism,” the official blog post reads. “But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat.”