A new controversy this week over the sexualized exploitation of children on YouTube has set off a wave of advertiser boycotts and heightened tensions among the site's biggest channels. The scandal has quickly become a flashpoint for a larger debate about how creators should address the very real problems plaguing the platform they're dependent on for their livelihood.

YouTube currently has just under a billion monthly users. It's the world's second-largest search engine, just behind Google, and people watch about a billion hours of video on the platform per day. It's only been about a year since YouTube's last large-scale scandal involving child exploitation on its site. The platform promised to address the problem at the time.

But following this new scandal, instead of hiring more moderators and building better tools to flag abuse, YouTube has, once again, put the responsibility on YouTubers. Now, on top of the burnout-inducing production schedule required to remain in algorithmic favor, the company expects creators to act as their own community moderators.

YouTube tweeted Thursday that it will start holding creators accountable for the content of their own comment sections. "Even if your video is suitable for advertisers, inappropriate comments could result in your video receiving limited or no ads (yellow icon). Let us know if you have any questions," the company wrote.

The controversy erupted after YouTuber Matt Watson published a video exposé showing how the comment sections on videos featuring minors were being used to sexualize children. The video's virality inspired the #YouTubeWakeUp hashtag, which Watson and others used to demand advertiser boycotts. Brands including Purina, Epic Games, Disney, Nestlé, and GNC have all said they're suspending advertising on YouTube until further notice.

"Any content — including comments — that endangers minors is abhorrent, and we have clear policies prohibiting this on YouTube," a YouTube spokesperson told BuzzFeed News. "We took immediate action by deleting accounts and channels, reporting illegal activity to authorities, and disabling comments on tens of millions of videos that include minors. There's more to be done, and we continue to work to improve and catch abuse more quickly."

The platform also reported illegal comments to the National Center for Missing & Exploited Children (NCMEC), is working to block certain search terms from autocompleting, and removed dozens of videos that were innocent in nature but featured young people who could be at risk of exploitation.

Many of the platform's most prominent voices, however, began to speak out against Watson for making his video exposé in the first place. Some accused him of making the video for views — and his own ad money. Popular YouTube stars like Daniel “Keemstar” Keem and Philip DeFranco have criticized Watson. Keem called him a fraud, tweeting, "This guy is trying to get advertisers to stop advertising on YouTube but puts 30 ads on his videos."

YouTube fans joined in and spent the week attacking Watson for taking possible ad revenue away from their favorite video makers. Following an outpouring of online abuse directed at him, Watson has gone silent on social media since releasing his video. BuzzFeed News has reached out to Watson via his Twitter and Instagram for comment.

The entire mess is a perfect summation of how fraught the relationship is between YouTube and YouTube's creators.

DeFranco, in a video about the controversy, said, "It's important that instead of saying, 'YouTube allows this and they’re happy about it' — because once again that is an insane argument — the best thing we can do is report disgusting monsters like we would anywhere else on the internet." As part of the backlash against Watson, an old clip from one of his now-private videos is also currently circulating. It's a scene from a man-on-the-street prank video which allegedly shows Watson asking an underage girl if she wants to make an adult video.

"When companies like Epic Games abandon the platform, it will hurt thousands of people who play and promote their products," Steven Jay Williams, who goes by boogie2988 on YouTube, told BuzzFeed News. "If enough advertisers pull out, it can ruin thousands of careers. We are talking billions of dollars in lost revenue for both YouTube and its creators."

Because YouTube creators depend on the overall health of the platform for their revenue, the scandal has turned a sexual exploitation issue into a larger existential question: whether creators should speak out about the site's structural problems at all, for fear of rocking the boat and spooking advertisers.

"There is no way to 100% stop 'pedos' on YouTube," Keem told BuzzFeed News. "The only thing that can be done is what is already being done. Flag comments, flag videos, and let YouTube ban these accounts. But let's be clear, these people will just make new accounts."

Ironically, one of creators' main complaints about Watson is that he made a video about the problem instead of simply reporting it to the platform. "I reported on this over a year ago [in a video], but not to the extent it's being highlighted now," YouTube-based journalist Tim Pool told BuzzFeed News. "During the creepy kids algorithm scandal I came across a few videos like this and reported on it."

Daniel Keem's video about Watson's exposé:

YouTubers are defensive about the platform, warts and all, because at the end of the day, it's how they make money. Watson's video, for many of them, seemed more interested in bringing down the entire system than in exposing systemic child exploitation.

"This is horrible and these are garbage human beings, but also please understand that YouTubers have been punished before for things they had nothing to do with — and understand that they're going to be vocal about that," YouTuber Roberto Blake told BuzzFeed News. "If Matt revealed this and didn't list the advertisers and how to contact them in his video, you'd see a completely different [reaction from other YouTubers]."

Another criticism is that Watson's approach may itself have been exploitative. Some argued that making a viral video with actual examples of how you could game YouTube to share sexualized content featuring children was not the right way to police the platform.

"I'm very conflicted about whether 'awareness' is the right solution to this problem," Een from the YouTube channel Nerd City told BuzzFeed News. "Awareness is a very positive-sounding word, but if someone finds a stash of child porn on the internet, who exactly needs to be 'aware' of it? Law enforcement? Yes. The service where it was hosted on? Yes. The parents of the children? Presumably. But that may be where the need for awareness stops."

YouTube video creators don't work directly for YouTube. Instead, they use its tools, like ads and Super Chat, to generate revenue for the company, of which they get a cut. It's meant to be a mutually beneficial relationship. However, the people creating the content that drives all that revenue have very little insight into how the recommendation algorithm works, nor do they have control over YouTube's policies. Yet they are often directly affected when something changes. That disconnect has led to several cycles of outrage, and even violence. Last April, YouTuber Nasim Najafi Aghdam, who had an obsession with the platform's algorithms, opened fire at the company's California headquarters, injuring three people before killing herself.

"I think it is entirely possible to combat this problem without advertisers boycotting the platform and causing a lot of people who live paycheck-to-paycheck to suffer," Williams said.