March 31, 2007

In the comments to the post on genius below, the topic of the stagnation of progress came up. A post is more appropriate for a response, so here it is. I confess that I haven't studied the work of historians / philosophers of science, so the outline of what I'm going to say may well have been suggested already. In fact, it's pretty obvious if you devote a little time to thinking about it, so I assume that the gist actually has been proposed before, but as I'm otherwise occupied as far as reading, studying, and thinking go, I'm not going to look into it. (A cursory Google search didn't turn anything up, so it's probably in a book or in the body of a journal article.)

To briefly reiterate my thoughts, I think that while there may be a grain of truth to the idea of progress plateau-ing, it's always premature to suggest that we've approached an asymptote for good in some area -- physics, music, what have you. The reason is simple: you can never tell when a huge revolution is about to happen, so for all we know, yet another one will occur -- when, who knows? -- even if we won't live to see it. So, I'd agree with a weaker version of the complaint that simply noted that we've apparently run out of bold new ideas for the time being, and only time will tell whether our progeny will discover or create something that no one had dreamed of before. I gave some examples of this in the comments.

A separate idea is that some fields don't stagnate as much as others -- for instance, as much progress as mathematicians have made in the past 2500 years, look at how little was done from roughly the ascendancy of the Roman Empire until roughly the proto-Renaissance in the 14th C. Alfred North Whitehead, a 20th C. thinker, quipped that philosophy was "a series of footnotes to Plato" -- but that exactly characterizes the field of geometry up until the beginnings of the 18th C. when Euler sowed the seeds for the generalization of geometry known as topology. It was only in the 19th C., with the discovery of non-Euclidean geometries, that anyone had found a way of doing geometry that didn't obey all of the rules that Euclid had laid down 2000 years before. Moreover, one of the four foundational branches of mathematics -- calculus and analysis -- wasn't even developed until the mid-17th C. An inchoate working out of probability began about the same time, while statistics is almost entirely a 20th C. invention. So, even though big changes do eventually occur once more, it seems to take a very long time for them to happen.

Why don't we put that into picture form to get a better feel for the claim?

Above is (what's supposed to be) an exponential decay function, which is what someone means anytime they use the phrase "diminishing returns." As is plain to see, the effect that the nth discovery has on the maturity of the field -- whether that be development of mathematics, understanding of natural laws, or elaboration of musical forms -- decreases (monotonically) as n increases. At the beginning, Euclid writes The Elements, and each work on geometry after that contributes increasingly less understanding to the field. Bach fleshes out most of the potential of the fugue, and each composer's contribution to the form afterward fills in increasingly smaller gaps left by the pioneer. Eventually, another Euclid or Bach comes along and charts out previously unseen areas, and the process repeats -- perhaps it ultimately will grind to a halt as science and art begin to place too strenuous a demand on human cognition, but I don't believe that either (more on that at the end).
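To make the curve concrete, here's a toy numerical version of the diminishing-returns idea. The decay rate r and the "maturity ceiling" of 1.0 are arbitrary assumptions, chosen purely for illustration, not fitted to anything:

```python
# Toy model of diminishing returns: the nth discovery adds a fixed
# fraction r of whatever gap remains between the field's current
# maturity and its (assumed) ceiling of 1.0.
def maturity_after(n, r=0.5):
    """Cumulative maturity of a field after n discoveries."""
    return 1.0 - (1.0 - r) ** n

# Marginal contribution of each successive discovery.
contributions = [maturity_after(n) - maturity_after(n - 1) for n in range(1, 6)]
# Each discovery contributes half of what remains -- a geometric,
# i.e. exponentially decaying, series of gains: 0.5, 0.25, 0.125, ...
```

Euclid or Bach supplies the first big jump; everyone afterward fills in the geometrically shrinking remainder, until a new pioneer resets the gap.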

Now, here's how I think technological progress is different:

The axes have the same scale, so I didn't just "zoom out" from the graph above. Essentially, the logic is the same as above, but the cycle just repeats far more quickly, such that it is hardly noticeable that progress ever stagnates. Now, some people do complain about technological progress stagnating, such as those who whine that the nth incarnation of their iPod hasn't changed all that much over the past five years, but such "stagnation" is below the detection threshold of those who are not profoundly afflicted by ADHD. The reason for a faster cycle is pretty simple: most technology is designed to out-perform someone else's technology, as summarized by the common phrase "arms race." As an aside, most groundbreaking technological innovation is done at the level of monopolies (Bell Labs before AT&T was broken up), national governments (Department of Defense), or state-funded "private" bodies (MIT), and not between competitive firms, so don't read too much into the analogy. The basic point is that geometers and composers aren't threatened by an imminent menace, such as an invading army, so their urge to out-do others must derive from less reliable qualities such as personality trait competitiveness, spite, and so on.

In other words, while all geniuses have shown an indefatigable work ethic, some of the lesser figures may have had roughly genius levels of general intelligence and nuttiness, but simply lacked the time commitment required to make Newtonian contributions. Surely if it were a matter of life and death, though, as in the case of those charged with technological progress, even someone who might not have an extremely high intrinsic work ethic would have their feet put to the fire by the prospect of being conquered, for example.

Those familiar with the lingo of evolutionary biology will have noticed that I've hinted at an analogy, so I might as well make it explicit. In the more "pure" fields -- the arts and sciences -- progress reflects a spreading of something to all of those in the field: an understanding of nature, the conventions of the fugue, and so on. This is like a group of alleles spreading toward fixation in a population, such as lactose tolerance among peoples who have raised dairy-producing animals. The exponential decay model is also borrowed from this work, especially that of H.A. Orr, who has argued convincingly that adaptation of a population to its environment obeys such a model. If you don't have access to his journal articles, here's an intuitive argument I came up with to help me remember the gist of his more formal statements:

Imagine that you're a little kid passing the time dangling by your arms from a tree branch (so you're at an equilibrium state), and that a freak environmental change like an earthquake or gale-force wind knocks you off your branch. Also imagine that your flexible hand is the population, and that it is trying to latch onto an irregularly shaped branch to keep from falling into oblivion. Just getting your palm in the right place does most of the job, and then two flexings of your fingers do most of the rest, although you still need to carry out many more minute adjustments to get it to contour the branch (nearly) perfectly. In this analogy, a discrete movement of your hand is like a favored allele being substituted at some locus. (I hope you'll forgive the digression I've made from my central point here, since this is one of the more fascinating ideas I've read about recently.)

The case of technological progress, however, is more like frequency-dependent selection: your hand is making movements not only to fit a static branch, but in reaction to the series of movements made by a nearby person's hand -- maybe there are lots of people in the tree, and one or more of them is using their hand to try to shove you out to make room for them. Now you're engaged in a vicious cycle that is a matter of life and death. If you left a video camera recording the events in these two different trees, the results of the former would look pretty boring since the action would only be dictated by whether a hurricane or earthquake chanced to pass through the area during the time period, and many would complain that "nothing's happening." The latter's movie, however, would be so action-packed that, again, only the incurably jaded would lose interest.*
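The two trees can also be put in toy-model form. To be clear, this is not Orr's actual model, just a deterministic caricature I'm assuming for illustration: each step the "hand" closes half of its remaining gap to the optimum, and in the arms-race case a rival's move re-opens part of the gap every step (the one-half fraction and the drift value of 0.3 are arbitrary):

```python
def walk(steps, rival_drift=0.0):
    """Track the distance to the optimum over time. Each step the
    population closes half the remaining gap; a nonzero rival_drift
    models an opponent re-opening part of the gap every step."""
    gap = 1.0
    history = []
    for _ in range(steps):
        gap = gap / 2.0 + rival_drift
        history.append(gap)
    return history

static = walk(20)                      # fixed branch: gap decays toward 0
arms_race = walk(20, rival_drift=0.3)  # moving target: gap settles near 0.6
```

In the static tree the gap decays geometrically to zero and the movie gets boring; in the frequency-dependent tree the gap equilibrates at a nonzero value (here 0.6, the fixed point of g = g/2 + 0.3), so there is always something left to do.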

That's what must be causing people to complain that one type of progress seems to have ground to a halt -- the human mind is incapable of handling large stretches of time (a special case of the difficulty with big numbers in general), as any society's pre-scientific historical texts amply demonstrate. So, it's just a matter of having to wait longer for "the next big thing" in the arts and sciences.

There is another complication, namely that you can always out-perform your enemy, but you may or may not be able to understand more and more of nature or invent more and more original musical forms. Since that's a bit outside the scope of this musing, I'll just link to a previous post I wrote on this topic last year. Briefly, the original objection can be stated as an analogy to visual perception: the human eye is only so discerning when it comes to color, so even if every once in a great while a freak were born with 10 types of cone cells (as opposed to the typical 3), that person could still perceive only so much of the electromagnetic spectrum. But that's fine, since we've designed instruments like spectroscopes that supersede our eyes. The recent proof of the four color map theorem and elaboration of the properties of some 248-dimensional symmetrical thingie that I don't understand were both done by computer. This isn't as satisfying on a gut level, since the computer's proof just exhausted a huge number of mutually exclusive cases rather than presenting a conceptually elegant solution. But hey, more knowledge is more knowledge, right? So don't worry.

*This is distinct from Stephen Jay Gould's notion of punctuated equilibrium in two ways: first, the basis for Orr's claim is pretty rigorous, while Gould was making touchy-feely verbal arguments only. Second, Gould was making claims about the entirety of a species' evolution, while as I read it, Orr's claim is that adaptation to correct for a particular environmental disruption follows an exponential decay curve -- surely it's conceivable that many such disruptions operate independently of others, such that there is always pioneering adaptation going on, rather than all disruptions being concentrated into a much narrower window of time.

March 30, 2007

It's easy: work with teenagers. Many college graduates insulate themselves among people their own age or older for the remainder of their existence, and since they are unlikely to have children at all, let alone anytime soon after graduation, they'll never be forced to consider the reality that they're old -- that is, until their bedraggled biology refuses to comply with the outlandish demands they continue to place on it. Before considering an example of this in my next post, allow me to list some of the events that have clued me in to my official status as a graying geezer*, courtesy of the teenagers I tutor:

- One referred to the Will Ferrell era of Saturday Night Live as "the old SNL," despite my view of this cast as the "new" SNL which superseded the Chris Farley / Adam Sandler / Dana Carvey cast. This same student referred to an MTV t-shirt from 1981, which she got from her mother, as "retro." That didn't sound right, but then I remembered that when I was 17, a t-shirt from 1971 would probably have qualified as "retro" in my mind.

- Another said she might miss a session since she was going to a concert by Guster. I remarked that the name sounded familiar, unlike that of most bands my students follow, to which she replied, "Yeah, they are kinda old." In reality, they're an alternative band from the mid-1990s.

- I thought that most people who enjoy Family Guy would also appreciate The Simpsons (at least the episodes from the pre-suckathon era that began around 1998), although apparently the less in-your-face, not so raunchy quality of The Simpsons leads teenagers today to refer to it as "old school" and not worth watching.

- When reviewing the word "gauntlet" for SAT prep, a student told herself that the meaning made sense given the nature of an MTV sports show of the same name, remarking half-contemptuously, "Oh, but you probably wouldn't know what that is." I'd normally take pride in not knowing what passing fad dominated the teenage TV market, but I took a little offense here at the suggestion that I was a withered old goat. In fact, I did know the show she was talking about; I watched a few episodes a couple years ago after being drawn in by the physique and attitude of the choleric, Cuban cutie Veronica Portillo.

- Above all, though, the easiest way to spot someone from a different generation is the slang they use. It's frightening how quickly your lingo goes out of style (my guess is that current teenagers wouldn't even immediately recognize "lingo" as a synonym of "slang"). This goes both ways. I get funny looks pretty frequently for using words like "gung ho," "fogey," "wimp," and others that I never would've suspected had already become démodé. Conversely, I had to embarrass myself by asking my students outright what the word "fried" meant in their argot -- when I first heard that so-and-so "got fried," I assumed they were talking about someone getting stoned. In fact, it means that so-and-so "got dissed really bad" or "put in their place."

At least that one I could've pieced together after hearing it a lot -- not so with "cise." One of my students (the pretty, popular, trendsetter type) used this word about ten times within an hour, despite never having used it before. "Oh great," I thought. "This word's just been born, so you can never say it too many times." She tried to explain it, but it has too many uses to be concisely defined. Reading over the entries at Urban Dictionary (most of which were entered by people in the DC metro area, so this word may not have spread to your region yet), and having heard the word many more times since then, I'd say the closest definition is "to hit someone up with" or "hook someone up with." (Yes, I realize the irony of using my generation's own impenetrable slang here.) For example, "hit me up with that pen over there" would be "cise me with that pen over there." Or when she wanted to be quizzed on vocabulary: "OK, cise me with some vocab words." I think someone needs to cise me with a few extra IQ points so I can keep all the usages straight.

So, that's a hint of what I deal with in my daily routine, although I could surely think of more examples if I had the time. The point is that I never would have discovered what a fogey I'd become if I'd chosen another line of work. A lack of contact with youngsters, such as that which characterizes the lives of most educated people, can seduce a person into believing that they're never going to grow old, thereby quieting their anxiety over their drifter's lifestyle and allowing them to persist in their adolescent view that "growing up" simply consists of getting a job and filing taxes.

*Although I am "only" 26, that is still older than most estimates of human generation time, which are typically between 20 and 25 years. My father started dating my mother in college when he was probably 20, married her when he was 22, and was 26 when I was born (my mother is one year younger than he). Perhaps this, too, has made it more difficult for me to escape the reality of how behind schedule I am -- most educated people's parents were probably 10 years older at each stage of the courtship and childbirth cycle than mine were.

March 23, 2007

Having been born in 1980, I remember when MTV used to broadcast music on television; now its programming consists mostly of bad reality shows. There are two such shows I actually find somewhat interesting, though, depending on the focus of the episode: True Life essentially brings the freak show tent into your living room, each episode spotlighting a handful of individuals who share the same unusual quality (at least by the standards of TV stars) -- for example, those who have OCD. Made tracks the progress a single person makes in achieving some outlandish transformation over a brief time period, such as a tomboy who wishes to blossom into a girly girl, or a 100-pound runt who wants to play on the varsity football team. Typically, the individual falls far short of their goal; e.g., usually they are given a token spot on a sports team, one assumes only because to do otherwise would crush the ego of the under-a-magnifying-glass subject. Where the person does achieve success, it is frequently a case of a "hot girl with glasses" transformation of the type parodied in Not Another Teen Movie -- that is, where the person has already been blessed by good genes and good luck, and so simply needs a brief intervention to smooth out their few remaining wrinkles. These shows tie together two important perspectives on human variation: namely, that it is ubiquitous and wide in scope (True Life) and that a lot of it has more to do with fortune than effort (Made).

Not being a TV junkie, I don't always care to watch the new episodes, but the summary of a recent True Life installment caught my eye: the "I'm a Genius" episode would profile three young individuals deemed geniuses. As I didn't recognize any of their names -- which you would expect given that two were in high school and one in his early 20s -- I knew immediately that "genius" didn't mean "genius" but "gifted" or "talented," since there is no such thing as an unrecognized genius.

The 20-something guy was skilled at chess, an immensely overrated means of detecting "genius" -- although it clearly places high IQ demands on the participants, it's difficult to construe playing chess as a creative endeavor rather than a display of cerebral prowess. Not knowing much about the subject, I'm sure there are bona fide geniuses, such as those whose insight opened up unexplored possibilities of attacks, defenses, and so on. Still, it's a bit of a stretch to claim that top chess players will leave an indelible stamp on the human record after their interment, unlike eminent poets or physicists. So, this person's profile is interesting mostly as a showcase of our culture's irrational reverence of chess masters as geniuses.

A more plausible candidate emerges in the profile of a 14-year-old boy who has published several novels, attained virtuoso status as a violinist (he has played Carnegie Hall), and participated in laboratory research. From the vignettes in the lab, it seemed he was more of an assistant than a member contributing original insight, but again, he's only 14; he's clearly capable of conducting novel research in adulthood if he wants to. If I recall correctly, he got a perfect or near-perfect score on the PSAT. He displays the personality traits typically associated with a child prodigy: an unrelenting work ethic (including after-school tutor sessions to study AP subjects that he couldn't squeeze into his regular school schedule), a broad curiosity about many areas of culture, and a social life befitting an outsider (though not necessarily that of a pariah or recluse). Tellingly, he scarcely used the term "genius" to describe himself: his accomplishments speak for themselves.

In marked contrast, the final subject, a 16-year-old boy, dropped the "g-bomb" in a manner as profligate as the swearing of a weakling who confronts a brawny bully: the goal is to puff himself up in the face of a clearly superior contender. Unlike the other two subjects, he had no accomplishments to speak of -- as in, zero. He leads the social life of a common teenager, his only intellectual pursuit consisting of bubbling in responses to online IQ quizzes. Aware of his lack of achievement, he made a last-ditch effort to impress college admissions officials by trying out for Teen Jeopardy!; and though he easily qualified, he was smashed in his TV appearance. Despite this, he applied early-decision to Stanford, certain that the admissions committee would be awed by his "genius." In fact, he was so sure that he hadn't even bothered to start his applications to any other schools yet -- that would only happen in the (to his mind) rare event that he wasn't admitted early. Not surprisingly, the admissions committee must have been unimpressed and denied his application. Stewing in rancor, he muttered something to the effect of "I guess being a genius just isn't enough to get into Stanford." Replace the word "genius" with "cognitively gifted," and you'd be right.

No doubt this last individual will make a fine long-term employee at Starbucks or Barnes and Noble, where other smart yet directionless individuals eke out a living. Sometimes I wonder why, with so many smart people in the world, we don't enjoy a level of cultural production like that of the Renaissance or century of the Scientific Revolution, proportionate to the size of the population. Charles Murray wrestles with this question in Human Accomplishment, and I have my own peculiar hypothesis -- namely, that positive changes in health conditions, long a significant environmental source of variance in traits, have resulted in more individuals being bunched around the average, with fewer "deviants," including geniuses of the stature of Newton or Beethoven. (This applies only to variance in personality and other traits unrelated to intelligence, since general intelligence has steadily increased over the past 200 or so years.)

Yet another, not mutually exclusive explanation, which Murray considers only in a footnote, and which I'm becoming more convinced of as time goes by, is that our modern world offers too many opportunities for gifted individuals to earn a comfortable living from doing work that is not particularly culturally productive, as well as offering too many quick-fix distractions such as video games and TV -- and now, maintaining, reading, and commenting on blogs. As with all vices, they are all well and good in moderation, but we also inhabit a world in which self-restraint has become the watchword mostly of culture-skeptical fuddy-duddies. Although I wouldn't count myself among them, many paleoconservatives will tell you how embattled they feel in both the political and cultural arenas, Burke having been replaced by Bush as the most recognized conservative political figure. This loss of heterodox thinkers has surely retarded advances in the arts and sciences.

As an illustration, consider the differences in the lives of two of the greatest mathematicians -- Gauss, whom many regard as the greatest math genius of all time, and John von Neumann, whom many view as the greatest math genius of the mid-20th Century. Both were deeply conservative, yet Gauss was more of a traditional, religious conservative who so valued restraint that his credo on publishing his ideas was Pauca sed matura -- "few, but ripe." Von Neumann, on the other hand, was a conservative mostly due to his anti-communism and militarism. In his personal life, he indulged in such a rakish lifestyle that one marvels that he didn't die earlier in a car crash. He was more of a neoconservative, then, and he also fit another aspect of the neocon profile in that he was an Ashkenazi Jewish intellectual. Now, clearly von Neumann inhabited an entirely distinct region of the intellectual galaxy than the 16 year-old soi disant genius from the MTV program, but imagine what more he would have produced had he not pursued such a thoroughly hedonistic path.

And for sure, most undergraduates at elite universities couldn't hope to be as imaginative as either of these figures -- but that doesn't excuse complacency. They can always worry "what if" their IQ were 15 or 30 points higher than their already gifted level, or "what if" they were possessed by a tireless passion rather than being "merely" highly motivated. But there are factors that they do have control over: e.g., how much leisure time to allow and what to do with it, or whether or not to sell out and join a profession rather than a culturally productive field. In fairness, I'm not talking about those who would enjoy a career in law, but rather those who are suited to culturally productive work but who wimp out and choose even more comfort and money than would be accorded a professor or composer (who surely don't want for human necessities). Nor am I poo-poo-ing those who hold down a day job to pay the bills while they churn out creative products, as with Einstein when he worked in a patent office.

But with the explosion of middle-class and professional niches since the Industrial Revolution, there has been no shortage of temptation to lead an intellectually indolent lifestyle. I certainly don't believe that individuals capable of cultural work should be forced into it, since they are intelligent enough to make their own decisions, unlike small children, but I still wish there were a stronger push to emphasize restraint in enjoying one's vices. Make no mistake: it is sheer vice for a mathematically gifted person to join the ranks of "quants" on Wall Street in order to be buried beneath a pile of money once they're gone, or for a singularly dexterous wordsmith to apply their craft in public relations or some other apparatus of the Ministry of Disinformation. Making some money on the side, or helping out some organization in wording a press release -- OK, but not devoting your life to it.

In short, when you look at how many gifted individuals expend their time in fields whose present and future contributions could be flushed down the toilet without much affecting the state of things, it's almost enough to make you long for harsher times when they could not have survived so easily as Lotus-Eaters. Fortunately, though, our modern world does allow us to enjoy much longer lives than before, so although none of us "is getting any younger," at least we have a more forgiving window of opportunity in which to correct the errors of our misspent youth. And on that note, I'm off to bury myself in some textbooks on math that I should have learned in college.

March 14, 2007

March 7, 2007

Oh, why not do what everyone else is doing? After all, the internet is the only place where inveterate introverts can become exhibitionists. As readers may have guessed, it will be difficult for me to limit myself to just 10 things that are weird about me.

1. In 8th grade, I dyed my hair plum / purple, and after that rose red (almost pink), then after that it was bleached with plum bangs. Dyeing your hair entails a lot of upkeep, so I shaved most of it off and sported a Taxi Driver mohawk (complemented by the obligatory blue jeans and olive field jacket -- no cowboy boots, though!). That also required a lot of maintenance, so I shaved it all off and grew my hair back normally in 10th grade.

2. My grandmother is Japanese (from Hakodate). I only classify this as "weird" since you wouldn't suspect it -- unless, I guess, you saw me play Tetris, or consume even a slight amount of alcohol.

3. I can hum and whistle at the same time, producing a drone somewhat similar to throat singing.

5. After eating, I can easily belch so loud that it shakes the walls and produces an echo within the ventilation ducts. I chalk this up to having unusual gut flora, or perhaps having normal gut flora but an unusual combination of genes involved in the digestive system.

6. Along with my two brothers, I was almost swept out to sea by a rip-tide during spring break in 2nd grade. We were only saved by some college guys who swam out to snag us.

7. Although I mastered typing fairly early (by practicing with the computer games Paws and Number / Word Munchers), I never read that much until 9th or 10th grade, so I still read pretty slowly for my intelligence level -- for literary fiction, probably 1 page every 4 minutes. For more straightforward science articles, obviously less time. Caffeine also helps!

8. In middle school, I used to trim my toenails improperly (making them fairly rounded instead of straight across), so over the next couple of years I had to have 3 ingrown toenails removed: the left side of one big toe, and both sides of the other big toe. The roots were permanently killed with some sort of acid so they wouldn't regrow. They don't look freaky, but not typical either.

9. Speaking of which, I found out that I have a very high tolerance for needle-related pain -- to remove an ingrown toenail, the doctor injects a local anaesthetic into THREE places of your toe at the joint below the toenail (once at the center-top, and two at the base, to form a triangle). Try having that done three times within a few years. It hurt like a bitch, but I can apparently steel myself pretty well for this type of pain at least.

10. Even if having a vindictive streak isn't unusual, for most people it doesn't extend to torturing video games. The original Nintendo system was notoriously unreliable after it had been played for a while, and ditto the games for it. After so many game freezes, buttons that wouldn't respond, and so on, I eventually took a sledge hammer to the system itself (it was mine, not the family's). If it was just a particular game that screwed me, a couple of times I got so angry that I bored holes through the plastic cartridge with a screwdriver that I'd heated up over an open flame on the stove. (This was also during middle school -- what a tumultuous time!)