YouTube Will Determine What ‘Conspiracy’ Is and Stop Recommending Such Videos

While the evolution of Google’s YouTube from a free expression platform into something entirely different has been underway for a while, it just took another step in a very short-sighted and restrictive direction. NBC News reports:

YouTube has announced that it will no longer recommend videos that “come close to” violating its community guidelines, such as conspiracy or medically inaccurate videos…Chaslot said that YouTube’s fix to its recommendations AI will have to include getting people to videos with truthful information and overhauling the current system it uses to recommend videos.

There’s a lot to unpack here, so let’s get started. First, it appears YouTube has announced the creation of a new bucket for content uploaded to the site. It’s no longer just videos consistent with company guidelines and those that aren’t; there’s now a category for “conspiracy or medically inaccurate videos.” This is a massive responsibility, one that neither YouTube nor anyone else seems fit to shoulder as judge and jury. In other words, YouTube is saying it’s comfortable deciding what is “conspiracy” and what isn’t. Which brings up a really important question.