
YouTube Takedowns Of Hate-Speech Content Gather Momentum After Recent Content Policy Change, With Q2 2019 Setting A Record

YouTube says it is taking down more and more videos that spread hate and incite violence. The video-sharing platform claims that in the second quarter of 2019 it deleted over 100,000 videos and terminated over 17,000 channels for hate speech. According to YouTube’s statistics, that is a five-fold increase over the first quarter of 2019. In addition to videos, YouTube has also stepped up vigilance in comment sections and claims to have deleted over 500 million comments. However, the majority of those comments were taken down along with the videos under which they appeared.

YouTube Content Policy Update Causes Confusion, Claim Users

YouTube has been facing an acute problem with the growing number of videos that propagate hate and indirectly urge violence. The platform has been trying hard to detect such hate-filled and provocative content for quite some time. Google and YouTube claim they have significantly improved their detection engines, which comb through uploads and autonomously decide which content incites hate and urges violence.

However, several users have openly complained that YouTube’s content policy isn’t comprehensive or fair. Moreover, quite a few have cited examples to argue that YouTube’s content filtering algorithms are biased. Some users claim YouTube is hastily attempting to clean up its platform through rushed development of its filtering engines. This has reportedly resulted in YouTube mistaking free-speech content for hate-speech content.

The inability to quickly and clearly distinguish between the two has resulted in several legitimate videos and user accounts being suspended. YouTube does have a clearly laid out appeals process, but many claim it often fails to address the issue or reinstate the videos and user accounts. On the other hand, a “significant number” of channels that peddle anti-Semitic and white supremacist content have been left online following the June 2019 changes to the content policy, claims a report from the U.S.-based Anti-Defamation League. The report cites examples of videos featuring anti-Semitic content, anti-LGBTQ messages, Holocaust denial, white supremacist material and more.

YouTube Aware Of The Issues With Content Filtering Algorithms But Defends Platform

YouTube has been increasingly relying on machine learning algorithms to detect hateful content and remove offensive and provocative videos even before they are available for the general public to view. YouTube claims more than 80 percent of the videos auto-detected by its machine learning algorithms in Q2 2019 were taken down without a single view. In addition to automated systems, the platform relies on 10,000 people tasked with detecting, reviewing and removing content that violates its guidelines.

Interestingly, more than 87 percent of the 9 million videos that YouTube took down in Q2 2019 were first flagged by automated systems. Moreover, steady improvements to spam detection systems led to a 50 percent jump in the number of channels flagged for removal over spam violations. YouTube is also looking into cases of creator-on-creator harassment.

Despite its massive efforts to detect and take down content that includes or promotes hate speech, YouTube still has a long way to go, admits the platform’s CEO, Susan Wojcicki. Right after the report from the Anti-Defamation League, Wojcicki published a post on the YouTube Creator Blog that defended the company’s rather tricky position on the matter:

“A commitment to openness is not easy. It sometimes means leaving up content that is outside the mainstream, controversial or even offensive. But I believe that hearing a broad range of perspectives ultimately makes us a stronger and more informed society, even if we disagree with some of those views.”

Needless to say, the statement leaves users even more confused. Still, as with any inappropriate content, it remains important for YouTube users and viewers to flag such material and file complaints as well.

