The revelation prompted many other Twitter users to try the same search, and they were surprised to find that their results, too, surfaced the names of nearby women’s hostels and PGs. Many rightly expressed their outrage online.

I'm a tad pissed with the subtle shaming with responses featuring girls' schools. Way to associate :-((

So why would Google do this? The answer is simple: its algorithm inferred that ‘bitches’ could be a term referring to women, and so returned the results that would have appeared had the word ‘women’ been searched for instead.

This isn’t the first time Google searches have revealed negative stereotypes about women.

So is Google to blame? Yes and no. While Google’s algorithm in these cases has merely picked up the misogynistic language that people tend to use online, as a responsible tech platform the company must not shy away from fixing the problem.

To be fair, Google users can report inappropriate predictions. The link to do so appears at the bottom of the autocomplete tab.

And it’s not as if Google’s predictions have no filter whatsoever: Google’s autocomplete policies ban violent, hateful, sexually explicit, and dangerous predictions.

Nonetheless, that a platform as powerful and universally accessible as Google can be seen playing into sexist stereotypes and outright misogynistic values is a dangerous sign. That must change, and soon.