February 27, 2009

In the mid-1980s, Mark V. Shaney posted messages such as this to the Usenet group net.singles:

I seem to be important. For me, it would have agreed with the
technical insight that is dear to me. Because of this, I have no
advice for someone in that situation!

Joining Mensa was something I did him one better. I wore a dress skirt
a day for one week. I did him one better. I wore a dress skirt a day
for a 2 year relationship. I’m wondering if anyone else out there has
ever experienced this phenomena, whether it was actually your
contention that this is true for me.

I suppose it depends how you felt about someone before you became
emotionally attached and therefore “safer” – not to sporting events,
but to opera.

I lost 90 lbs a few months during my “flower child” days in high school
where, due to her high academic standings, was shunned by many of the
tube. The experience really screwed them up — if not their heads,
their knees. Why does one have to be the prime measurement of
manhood. No?

He was a scrawny, spastic nerd in high school, and I fantasized about
such a thing. It all depends on the sidelines, listening to what makes
the rest of the guys around her – suddenly finds herself in a situation
where guys are asking them out!? But this can result in members of
either the person of your dreams (in a larger number of males to
females studying the field of engineering), the ratio of males to
females is somewhere in the past. And, per the other person.

I find it hard to reconcile the notion that something or someone isn’t
theirs anymore. I have a date with the woman. Subjectively, I have
also acted in this weekend.

However, Shaney wasn’t a person. Shaney was a bot created by three Bell Labs researchers — Bruce Ellis, Rob Pike, and Don Mitchell — that analyzed Usenet postings and then composed its own postings. Shaney’s writings were quirky, nonsensical, and beloved by many.

Shaney worked by reading a training text and saving each triple of consecutive words that appears in the training text in a large table. Then it generated text using a Markov chain: starting with two words that appear in the training text, it repeatedly wrote out the third word of a matching triple, sliding the output window along from word to word. The genius of the method is that any two words may appear in the training text with multiple following words, and the generator is free to choose any of them; thus, short fragments of text make sense, but the text as a whole frequently veers from one train of thought to another. A word includes its surrounding punctuation, so that sentence structure and, indirectly, grammar are built into the output.
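The method described above fits in a few lines of Python. This is a minimal sketch of the trigram technique, not the original Bell Labs code; the function names are my own. Splitting on whitespace keeps punctuation attached to words, just as Shaney did.

```python
import random
from collections import defaultdict

def build_table(text):
    """Map each pair of consecutive words to the list of words
    that follow that pair somewhere in the training text."""
    words = text.split()  # punctuation stays attached to each word
    table = defaultdict(list)
    for w1, w2, w3 in zip(words, words[1:], words[2:]):
        table[(w1, w2)].append(w3)
    return table

def generate(table, n):
    """Walk the Markov chain: start from a random word pair and
    repeatedly emit one of the recorded third words, sliding the
    two-word window forward after each step."""
    w1, w2 = random.choice(list(table.keys()))
    out = [w1, w2]
    for _ in range(n - 2):
        followers = table.get((w1, w2))
        if not followers:
            break  # this pair only occurs at the end of the text
        w3 = random.choice(followers)
        out.append(w3)
        w1, w2 = w2, w3
    return " ".join(out)
```

Because a pair like ("the", "same") may have been followed by many different words in the training text, each `random.choice` can send the output down a different sentence than the one it started in — which is exactly where Shaney's sudden swerves of topic come from.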