Restricted Mode was originally created to give public institutions like libraries and schools the ability to prevent people from watching mature content like porn or graphic violence on their computers. As Wojcicki admitted, however, the system also filtered out innocuous LGBTQ content (examples she gave included “kissing at weddings, personal accounts of difficult events, and speaking out against discrimination”).

Even accidental censorship can be detrimental, especially to LGBTQ youth who use YouTube on school or library computers because it isolates them while they are seeking a community. As YouTuber Rowan Ellis noted, the platform is “one of the only places that queer and trans youth, gay youth, bisexual youth, pansexual youth, asexual youth, any of these kids, have a way into community, have a way into knowledge, have a way of feeling that they are not alone.”

In April, YouTube said it had corrected an engineering problem that was wrongly filtering LGBTQ videos. Fixing Restricted Mode, however, isn’t simply a matter of adjusting its system. The platform’s rewritten guidelines seek to clarify its position by specifically allowing personal accounts from victims of discrimination or violent hate crimes, as long as they don’t contain graphic language or content.

To be sure, that’s still confusing (after all, it might be hard to avoid graphic language when describing what was said during a hate-motivated assault). Because the way YouTube’s filters work has led to a lot of uncertainty, Wojcicki also said the company will add new content to its Creator Academy, a set of resources for video makers, to help them create videos that won’t be blocked in Restricted Mode. She also asked people to submit videos they think were wrongly restricted and promised that YouTube will review each one.

For some YouTube creators, the platform’s algorithms don’t just affect their view counts; they also have a measurable impact on their income. The controversy over Restricted Mode’s treatment of LGBTQ-related videos arose at the same time that many YouTube creators saw their ad revenue drop as several major advertisers boycotted the platform, citing videos with extremist content (a boycott also fueled by discontent over Google’s dominance in online advertising).

In her blog post, Wojcicki said the two issues were unrelated, but added that the platform has “rigorous training to ensure that anyone who reviews content that’s been flagged for review ensures all content is treated fairly.”