YouTube removing online terrorism content faster, thanks to machine learning
The sheer volume of content uploaded to YouTube's video-sharing site can sometimes be unsettling, because not all of it is harmless; some of it, such as terrorist content, can do real harm. But of course, YouTube is not simply standing by and watching this happen.
Nearly a month after YouTube said it was going to combat online terrorist content on its platform, the video-sharing site said the process is going well, thanks to machines and humans alike.
In an August 1 blog post, the Google-owned site said its multi-pronged approach is being aided by machine learning, with computers detecting and removing the content more quickly. "Our machine learning systems are faster and more effective than ever before," YouTube said in the post. "Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag."
YouTube added that over the past month, machine learning has helped it more than double "both the number of videos we've removed for violent extremism, as well as the rate at which we've taken this kind of content down."
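YouTube has not published details of these systems, but the basic triage idea, automatically removing high-confidence detections while escalating borderline ones to human reviewers, can be sketched. The snippet below is purely illustrative: the thresholds, the `extremism_score` field, and the `triage` function are hypothetical stand-ins, not YouTube's actual pipeline.

```python
from dataclasses import dataclass

# Illustrative only: YouTube has not published its detection pipeline.
# A placeholder score stands in for the output of a trained classifier.

@dataclass
class Video:
    video_id: str
    extremism_score: float  # 0.0-1.0, hypothetical classifier confidence

AUTO_REMOVE_THRESHOLD = 0.95   # high confidence: take down with no human flag
HUMAN_REVIEW_THRESHOLD = 0.60  # medium confidence: escalate to human experts

def triage(video: Video) -> str:
    """Route a video by classifier confidence (hypothetical thresholds)."""
    if video.extremism_score >= AUTO_REMOVE_THRESHOLD:
        return "removed"       # taken down before receiving a single human flag
    if video.extremism_score >= HUMAN_REVIEW_THRESHOLD:
        return "review_queue"  # sent on for human review
    return "published"

if __name__ == "__main__":
    for v in (Video("a1", 0.97), Video("b2", 0.72), Video("c3", 0.10)):
        print(v.video_id, "->", triage(v))
```

Under this kind of scheme, the "75 percent removed before a single human flag" figure would correspond to videos scoring above the automatic-removal threshold.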
Over 400 hours of content are uploaded every minute to the platform, which has more than 1.5 billion logged-in users, according to YouTube CEO Susan Wojcicki.
In addition to using computers, YouTube is utilizing human experts through its Trusted Flagger program, adding 15 non-governmental organizations including the Anti-Defamation League, the No Hate Speech Movement and the Institute for Strategic Dialogue. There will also be tougher standards on videos that users have flagged as potential hate-speech or violent-extremism violations but that may not actually be illegal.
"If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state," YouTube said.
The push comes after several terrorist attacks around the globe, including in the U.K. In June, YouTube said it would redirect people looking for extremist content to videos that confront and discredit the search topics, via the Redirect Method, created by another team at Google and its parent company, Alphabet. Despite the progress, YouTube said more needs to be done to combat extremist content that lives on its site.
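The core mechanism of the Redirect Method, as publicly described, is matching searches against curated lists of extremist terms and surfacing counter-narrative videos in response. A minimal sketch of that idea follows; the terms, playlist identifier, and `search_with_redirect` function are placeholders for illustration, since the real curated lists are not public.

```python
# Placeholder terms and playlist ID; the real curated lists are not public.
CURATED_TERMS = {"example extremist slogan", "example recruitment phrase"}
COUNTER_NARRATIVE_PLAYLIST = "playlist:counter-narratives"

def search_with_redirect(query: str, normal_results: list[str]) -> list[str]:
    """Surface counter-narrative content when a query matches the curated list."""
    if any(term in query.lower() for term in CURATED_TERMS):
        return [COUNTER_NARRATIVE_PLAYLIST] + normal_results
    return normal_results

print(search_with_redirect("example recruitment phrase videos", ["v1", "v2"]))
# ['playlist:counter-narratives', 'v1', 'v2']
```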
"With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat," YouTube wrote. "We look forward to sharing more with you in the months ahead."
Source: Written by Chris Ciaccia, Fox News (URL)