YouTube is equally to blame for Logan Paul’s video

It appears that YouTube is more responsible for the first crisis of the year on its video platform than was initially thought.

Yesterday, the internet was rightly outraged by news that YouTube star Logan Paul, who has 15 million subscribers and is part of YouTube’s Red subscription service, posted and later deleted a video that included extensive footage of a suicide victim filmed at Japan’s ‘Suicide Forest’.

Paul deleted the video less than 24 hours after posting it, amid widespread outrage, but not before it had been watched some six million times and, it emerges, been okayed by YouTube’s moderation team.

That revelation comes from a member of YouTube’s own content moderation team, who posted a screenshot showing that the video had been approved on January 1 after being flagged by concerned viewers, as BuzzFeed first noted.

Logan Paul's video was reported and YouTube manually reviewed it; they decided to leave it up without even an age restriction… people who have re-uploaded it since have received strikes for graphic content. Ridiculous. pic.twitter.com/Hj9lyiQwE2

Let that sink in for a minute. A person who is paid to keep unsuitable content off the platform reviewed this video, including the footage of the victim’s barely-blurred corpse hanging from a tree, and decided it was the kind of thing that should exist on the internet’s most popular video service.

The video included the hanging body in its thumbnail and was titled “We found a dead body in the Japanese Suicide Forest.” Yet despite that, and the disturbing scenes it included, it not only passed YouTube’s moderation check, but also went on to rank among the site’s top ten trending videos, exposing those scenes to viewers well beyond Paul’s already-popular channel. (Notably, many of Paul’s subscribers are under 18.)

YouTube’s guidelines specifically state that “it’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational, or disrespectful.”

“Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center,” a YouTube spokesperson told TechCrunch in a statement.

The spokesperson did not comment on whether YouTube had taken additional action against Paul, such as issuing a strike against his channel. According to its policies, channels that receive three strikes inside a three-month period are removed from the service, but each strike expires after three months.

According to the pseudonymous YouTube content moderator, other channels that reposted Paul’s video, in many cases out of outrage, were hit with strikes.

The incident may seem like a mere wrinkle in YouTube’s ongoing troubles, given that the video was deleted within 24 hours, but it exposes just how broken YouTube’s current moderation system is. That’s all the more worrying when you consider that YouTube claims over a billion users, who “each day … watch a billion hours of video, generating billions of views.”

“It encompasses everything wrong with YouTube: the clickbait, the sensationalism, the thing that’s got to keep pushing [the envelope]. At the end of the day, it just shines bad on everyone,” said PewDiePie, real name Felix Kjellberg, who has more than 50 million subscribers, in a video.

Note: article updated to correct that PewDiePie has 50 million subscribers, not 50, because of course he does.