I learned about this restriction when I posted a story about whether it was OK for Call of Duty: Modern Warfare developers to include a scene in their popular game that showed a child being forced to defend herself and kill a Russian soldier during a chemical attack on her village in a fictional Middle Eastern country. The scene also showed dead children and a dead dog.

“We know there’s a difference between real-world violence and scripted or simulated violence – such as what you see in movies, TV shows, or video games – so we want to make sure we’re enforcing our violent or graphic content policies consistently,” said Google in a post. “Starting [December 2], scripted or simulated violent content found in video games will be treated the same as other types of scripted content.”

YouTube flagged my video for that story, ruling it would be age-restricted and not fully monetized. But a new Google support post today said that “future gaming uploads that include scripted or simulated violence may be approved instead of being age-restricted” and that “there will be fewer restrictions for violence in gaming.”

YouTube said it will continue to protect viewers from real-world violence. If a video focuses on things like “dismemberment, decapitations, [or] showing of human corpses with severe injuries,” then Google said it will likely get restricted.

Google also said that lifting the restriction on games would have no effect on its guidelines for advertisers. Videos that depict in-game violence could still see limited reach with advertisers.