Paul Graham highlighted an interesting concept for fighting off spammers: make anti-spam tools counterstrike against the sites promoted by spammers. A blacklist would be created to track repeat offenders. When a spam message is seen, the server would check whether the advertised site is on the blacklist. If it is, the tool would crawl the site, generating useless traffic for the spammer's source and thereby increasing the cost of sending out spam.

On its face, the argument seems to work. Some more thoughts on it: high-volume auto-retrieval would only be practical for users on high-bandwidth connections, but there are enough of those to cause spammers serious trouble. This part could be handled by having the mail servers themselves take care of it. In most cases, mail servers sit on broadband lines, since they need to be always on to receive mail. If such a counterstrike is to work, it has to come from those mail servers.

A refinement to the system would be to also include a whitelist. The reason for a whitelist is that it would allow publishers to register with the…
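The blacklist check described above can be sketched in a few lines. This is a minimal illustration, not a real implementation: the domain names are invented, the blacklist is a hard-coded set, and the actual crawling step (which a mail server would perform over the network) is deliberately left out.

```python
import re

# Hypothetical blacklist of repeat-offender domains (names are illustrative).
BLACKLIST = {"spam-pills.example", "cheap-meds.example"}

# Matches http/https URLs and captures just the domain part.
URL_RE = re.compile(r"https?://([A-Za-z0-9.-]+)")

def extract_domains(message: str) -> set:
    """Pull the domains of all links advertised in a message."""
    return set(URL_RE.findall(message))

def counterstrike_targets(message: str, blacklist: set) -> set:
    """Return only the advertised domains that are known repeat offenders.

    A mail server running this scheme would then crawl these sites,
    costing the spammer bandwidth for every message sent out.
    """
    return extract_domains(message) & blacklist

spam = "Buy now! http://spam-pills.example/order or http://innocent.example/info"
print(sorted(counterstrike_targets(spam, BLACKLIST)))
# → ['spam-pills.example']
```

The point of filtering against the blacklist first is exactly the one the scheme depends on: only confirmed repeat offenders get crawled, so a forged link to an innocent site does not turn the counterstrike into a denial-of-service tool against third parties.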