“Inevitably, both humans and machines make mistakes, and as we have increased the volume of videos for review by our teams, we have made some errors. We know we can get better and we are committed to making sure our teams are taking action on the right content. We are working on ways to educate those who share video meant to document or expose violence on how to add necessary context.”

Google said that in the last month, more than 83% of deleted videos were removed before receiving any human flags, up 8 percentage points. YouTube has previously relied heavily on its community to flag content that violates its policies. Its moderators have trained YouTube’s machine learning algorithms by reviewing more than 1 million videos.

But the company hasn’t provided any guarantees accidental deletions won’t happen again, and its statement puts the onus on video creators to add “necessary context.” That might include voiceover narration, as well as additional context in video titles and captions.

British researcher Eliot Higgins complained in August that YouTube had deleted his videos about Syria without warning. And American journalist Alexa O’Brien reportedly had a video that was used in Chelsea Manning’s trial deleted; her channel was suspended after being mistaken for an outlet for terrorist propaganda, according to a Gizmodo report.