spamdexing (idea)

The first form of spamdexing is done by adding a lot of keywords to
the page's "keywords" meta tag, or to the page body (hiding them there
by setting the text color to match the page background color). This is
annoying because it increases the load time of the page itself.
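A toy illustration of the hiding trick (nothing any real engine actually runs, and the sample page is made up): a naive check that flags old-style HTML text whose font color matches the body's bgcolor:

```python
import re

def find_hidden_text(html):
    """Return contents of <font> tags whose color matches the body bgcolor."""
    body = re.search(r'<body[^>]*bgcolor="([^"]+)"', html, re.I)
    if not body:
        return []
    bg = body.group(1).lower()
    hidden = []
    for m in re.finditer(r'<font[^>]*color="([^"]+)"[^>]*>(.*?)</font>',
                         html, re.I | re.S):
        # same color as the background => invisible to the reader,
        # but still indexed by a naive keyword counter
        if m.group(1).lower() == bg:
            hidden.append(m.group(2).strip())
    return hidden

page = '''<body bgcolor="#ffffff">
Real content here.
<font color="#ffffff">sex sex sex sex sex</font>
</body>'''
print(find_hidden_text(page))   # -> ['sex sex sex sex sex']
```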

It's also infuriating when completely unrelated keywords are added,
so that searching for something sends you to a page that has nothing to
do with the matter. (At one time, I was looking for information about
sex (Honest!!!! I gave up pr0n-surfing ages ago!)
and found tons of porn pages, and only after a LONG search a
page about the matter at hand. I guess the 1000 "sex" keywords that the
spamdexed pr0n pages had made them rank higher than the few that
the real pages I was looking for had... =)

These days, search engines have systems that detect and ban pages that stuff keywords. Also, some "smart" newer search engines rank their results by criteria other than a simple "score by keyword appearance on parts of the page" relevance count; Google ranks sites by links from other sites, for example (and to get links from other sites, you've got to be less annoying =)
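The link-based idea can be sketched with a toy power iteration in the spirit of PageRank (the three-page "web" below is made up, and real engines are far more elaborate):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small baseline, plus a share of the rank
        # of each page that links to it
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:            # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "real" and "other" link to each other; the keyword-stuffed "spam"
# page gets no inbound links at all.
web = {"real": ["other"], "other": ["real"], "spam": ["real"]}
ranks = pagerank(web)
print(sorted(ranks, key=ranks.get, reverse=True))   # -> ['real', 'other', 'spam']
```

The stuffed page can repeat "sex" a thousand times, but with no inbound links its rank never rises above the baseline.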

Another form of spamdexing is getting the same page indexed many times by a search engine, by placing the page under many different, publicly accessible domain names, with dynamic content that varies slightly based on the domain (so that the search engine won't rank the copies "close enough" to be considered the same page).
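One way an engine can catch such near-duplicates is shingling: compare the sets of overlapping word n-grams from two pages with Jaccard similarity. A toy sketch (the pages are made up, and real systems use fancier schemes such as MinHash to do this at scale):

```python
def shingles(text, k=3):
    """Set of overlapping k-word sequences ("shingles") from the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the two pages' shingle sets (0.0 - 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Two copies of the same templated page, differing only in the domain
# name spliced into the text.
page1 = ("welcome to widgets-r-us.example your number one source for "
         "widget news reviews and prices we cover every widget on the "
         "market updated daily just for you")
page2 = page1.replace("widgets-r-us.example", "widget-world.example")

print(round(jaccard(page1, page2), 2))   # -> 0.77
```

A score that high, against a modest threshold, lets the engine collapse the copies into one result even though the raw pages are not byte-identical.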

For example, the "Church" of Scientology had what the critics called a "spam page generator" - a program that created "customized webpages", with the obvious intent that the sheer number of similar pages by believers, made with this program, would outnumber the pages of Scientology critics. (Google returns both scientology.org and xenu.net on the first page of a search for "Scientology", so I guess this tactic won't work too well for smart searchers, either...)