YouTube found to have a list of approved content ‘super-flaggers’

YouTube’s parent company Google has drawn up a list of 200 approved ‘super-flaggers’, including police organisations and companies, that can flag videos that breach its content guidelines.

Under the new system, these selected parties will be able to flag 20 videos at any one time, which will then be reviewed by the video-sharing site.

News of the super-flagger programme emerged after the Financial Times in the UK reported (behind a paywall) that the UK Metropolitan Police’s Counter Terrorism Internet Referral Unit had been systematically flagging dozens of YouTube videos it believed contained extremist content.

This has raised alarm in some circles, amid fears that the UK government and other governments around the world are censoring online content they consider harmful to their interests.

Under the standard system, any YouTube user can flag a video as inappropriate or offensive, and YouTube’s moderators will then examine the content and decide whether it does indeed conflict with the site’s guidelines.

Violations of Google’s terms of service include the uploading of pornography, abuse of humans or animals, hate speech and copyright infringement.

In other YouTube news, it is being reported that the site is to introduce a child-friendly version for those under 10 years of age, and it has called on video producers to make content for the new channel.

As things stand, YouTube allows parents to moderate content to make it more child-friendly, but this new incarnation would let them install apps on their devices that grant access only to particular children’s content.