A couple of days ago I wrote that I thought the msnbot was the dumbest bot in town, since it was the only bot fooled by my ilizer service. But I was wrong. The msnbot is no dumber than the rest of the robots. Have a look at this google search. It is the world as seen through the eyes of the Bobby accessibility checker, and the googlebot really went for this one. I have no idea why the Bobby checker rewrites the URLs in hyperlinks so that they also filter through Bobby - I don't really see the use (comic or otherwise).

Next question: What would be a useful heuristic to identify bot traps like this? I doubt there really is one. Most likely a filter would just be a long list of known cases, and there are probably too many filter services around to make that worthwhile. Presumably most serious filter services implement the robot exclusion standard to save bandwidth and clock cycles.
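For what it's worth, the robot exclusion side of this is simple enough that Python ships it in the standard library. Here is a small sketch of what a well-behaved crawler would do: the robots.txt rules and the `/bobby` proxy path are hypothetical, standing in for whatever prefix a filter service like Bobby would use for its rewritten URLs.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a filter service might serve to keep crawlers
# away from its rewritten proxy URLs.
robots_txt = """\
User-agent: *
Disallow: /bobby
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler honoring the standard skips the proxied URLs but still
# fetches the rest of the site.
print(parser.can_fetch("googlebot", "http://example.com/bobby?url=http://example.org/"))  # False
print(parser.can_fetch("googlebot", "http://example.com/index.html"))  # True
```

So any bot that ends up indexing Bobby-filtered pages either never read the filter's robots.txt, or the filter never bothered to publish one.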