Google Instant Blacklist Attributed to Imperfect Autocomplete

Publication 2600 compiled a "Google Blacklist" of search queries that won't return results via Google Instant. Google attributed it to an autocomplete system that is still being refined.

Google chalked up a so-called blacklist of search terms that users found banned when using the new Google Instant predictive search technology to an autocomplete system the company is continuing to refine. Publication 2600 compiled the list of words that won't surface when users search for them using Google Instant, which renders results automatically on the fly.

The list includes single terms such as "amateur," "lesbian" and "bisexual," as well as compound queries such as "barenaked ladies" (the name of a band) and "girls gone wild" (a line of reality videos in which women bare themselves).

Words that aren't blacklisted include "sadist," "feminazi" and "commie." To see results for blacklisted words, users must hit Enter. 2600 described the so-called Google Blacklist (found via Mashable):
"Like everything these days, great care must be taken to ensure that as few people as possible are offended by anything. Google Instant is no exception. Somewhere within Google there exists a master list of 'bad words' and evil concepts that Google Instant is programmed to not act upon, lest someone see something offensive in the instant results... even if that's exactly what they typed into the search bar. We call it Google Blacklist."
2600 invited users to test the blacklist themselves by first typing "puppy" and then "bitch," a word for a female dog that has long been used as a derogatory term for, among other things, a disliked woman. Results for "puppy" fill the screen via Google Instant, while "bitch" requires users to manually hit the Enter key to complete the search.
The point, 2600 argued, is that Google Instant is a tool whose results can be "filtered, controlled and ultimately suppressed." 2600 punctuated the insinuation that Google is censoring its search by noting: "It is indeed a good thing that Google isn't evil."
This "filtering" is the way Google's Autocomplete technology has worked for a few years, predating Google Instant, a Google spokesperson told eWEEK.