"We want to be consistent," YouTube CEO Susan Wojcicki said at the CodeMedia conference Monday. "When someone violates our policies three times, we terminate. We terminate accounts all the time."

YouTube has come under fire over the last year from one controversy after another. This year, it's been all about 23-year-old Logan Paul, the popular video blogger who mixes youth comedy with outrageous pranks that some say go too far. In 2017, the criticism was for hosting kids' content that had been re-edited to include themes of violence and sex, and for allowing extremist videos on the network.

Paul responded with an apology tour: first a short video, then, after a break from the platform, a longer video on suicide prevention. But then he returned with the Tide Pods tweet and the rat video.

His infractions count as two strikes. "We can’t just be pulling people off of our platform," she said. "They need to violate" three times.

Friday, YouTube updated its overall violations policy for all creators to include removal from the preferred ad program, pulling ads from a creator's channel and excluding their videos from the recommendations tab.

Wojcicki said the policy update "gives us more levers and opportunity to pull back services" if creators violate YouTube terms.

In late 2017, when it said it would dramatically increase the number of people overseeing content in 2018, YouTube also updated a policy to deal with videos aimed at children that carried violent and sexual themes.

Alex Kruglov, who runs the Los Angeles tech startup pop.in, asked Wojcicki why his 9-year-old daughter was able to see videos on YouTube featuring characters like Curious George and Elsa that had been altered and were not suitable for kids.

She said parents should stick with the YouTube Kids app, which is meant to be a safer place for children, and that when they find objectionable content, they should report the videos so they can be pulled down.

"We’re working really hard on this," she said. "We’re working through the content to make sure we’re making the right recommendations."

Wojcicki has said YouTube will increase the number of people working to oversee content to more than 10,000 next year. "Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content," she said in a 2017 blog post.

At CodeMedia, she said that computers would flag the content first, and the 10,000 humans would work hand in hand with them, double-checking. "That many people and the machines will help," she said. "And if it doesn’t, we’ll add more people and more machines."