Clearly, the math folks have the last laugh: rated number one, they get the money, the working conditions and no stress:

She [Jennifer Courter] telecommutes from her home and rarely works overtime or feels stressed out. “Problem-solving involves a lot of thinking,” says Ms. Courter. “I find that calming.”

Astute readers might be guessing some of the ways this study may be less useful to folks thinking about switching careers than one might hope…but this kind of silliness is typical post holiday, pre-spring recreation in the news business.

(The worst jobs, by the way, are what you would expect: low-paid, physically demanding, and dangerous. The three at the bottom of the 200-entry list are taxi driver, dairy farmer, and, dead last, lumberjack. Break into song here if you must.)

And anyway, finding out about a list like this leads directly to the inevitable question: how does your own daily grind rate?

For me, it depends on how I think of myself. If I call myself a historian — and why not, with two books of history of science/culture out and one more coming this June — then I’m in clover, baby. Historians, whose capsule job description reads, “analyzes and records historical information from a specific era or according to a particular area of expertise,” rank seventh on the list of desirable jobs, one ahead of sociologists (hah! take that, sister-in-law!).

That it is one behind computer systems analyst takes me down a peg, I guess, and my biologist friends will condescend from their perch at number four. (Though this description seems a little limiting: “Studies the relationship of plants and animals to their environment.”)

Still, in my guise as historian, I hold bragging rights over eleventh-ranked economists (so, Brad DeLong, you might be a sought-after authority, a player in national policy making, and an utterly, dauntingly prolific blogger, but I like my job better…or something); philosophers at number twelve (Hilary Putnam respectfully disagrees, and provides a quite telling data point to the contrary); and, at thirteen, physicists (so, Sean Carroll, are you just going to sit there and take it?).

All well and good, until I decide to think of myself as an author, dragging in at 93.

Hmmm.

(And as a lagniappe — what do you make of the fact that clergy come in at 70, one behind federal judges at 69? Hanging out there between God and man carries some stress, I guess.)

Much nonsense. Fun on a holiday morning. Now for the French Freedom Toast.*

*Just a little trip down memory lane here to remind us of how wonderful it is to trade idiocy for at least the possibility of engaged intelligence, starting tomorrow at noon.

Steven Pinker has made something of a splash with his account of confronting his personal genome, published in the NY Times magazine last week. The article is interesting, though Pinker’s hint of nervousness about just how much he wants to know of himself genetically gives it a slightly odd list to port.

There was also a problem, IMHO (humble, and without professional expertise, too) with the presentation of the article. Though Pinker was careful to undermine from time to time what he recognized as one of the fascinations of personal genomics — that “the human mind is prone to essentialism — the intuition that living things house some hidden substance that gives them their form and determines their powers” — the piece still teetered on a kind of 1980s “We’ve discovered the gene for X!” hoopla.

Much of that impression was conveyed by the photos that accompanied the print version of the article, with headshots of Pinker captioned with the trait identified within his genome. Partly, though, it derived from Pinker’s own ambivalence, as he acknowledged the pitfalls of essentialism in a genome in which so much of the information is not devoted to protein coding, and yet wrote sentences like this:

For some conditions, like Huntington’s disease, genetic determinism is simply correct: everyone with the defective gene who lives long enough will develop the condition.

This is true, of course, and yet…the genetic signature of Huntington’s disease involves the number of repeats of a short section of the genetic code, just three bases or genetic “letters,” associated with the Huntingtin gene. There is a number of repeats below which someone is not at risk for the disease — fewer than 27 copies — and a number above which disease essentially always occurs — 39 repeats and up. In the middle, the issue is more ambiguous, and a repeat total in that range may result in late onset of the disease, or even a progression to overt symptoms that is so slow that the affected individual dies of some unrelated cause before the production of the damaging form of the Huntingtin protein actually does enough harm to notice.
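The threshold logic described above can be sketched in a few lines of code — purely as an illustration of the classification, not a clinical tool; the cutoffs are the ones quoted in this post, and the function name is my own invention:

```python
def huntingtons_risk(cag_repeats: int) -> str:
    """Classify a CAG-repeat count in the Huntingtin gene,
    using the thresholds quoted in the post (illustrative only)."""
    if cag_repeats < 27:
        # Below 27 repeats: not at risk for the disease
        return "not at risk"
    elif cag_repeats >= 39:
        # 39 and up: disease essentially always occurs
        return "disease essentially always occurs"
    else:
        # The intermediate range: ambiguous, possibly late or very slow onset
        return "ambiguous: possible late or very slow onset"

print(huntingtons_risk(26))  # not at risk
print(huntingtons_risk(40))  # disease essentially always occurs
print(huntingtons_risk(33))  # ambiguous: possible late or very slow onset
```

Notice that even this tidy picture has a fuzzy middle band, which is exactly the point: genetic determinism holds cleanly only at the extremes.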

What governs the number of repeats is unclear; it is not, seemingly, a matter of pure inheritance. Masha Gessen in her excellent Blood Matters tells the story of two brothers at risk for the Huntington gene. One develops symptoms early, gets tested, and receives confirmation that he possesses the gene with sufficient repeats to account for his relatively early onset of the disease. The other brother, who presumably inherited the same gene from the same parent, possesses an intermediate number, and may or may not end up with symptomatic Huntington’s at some later point in his life.

What does this all mean? That even in cases where the overwhelming effect of heritable genes is obvious, where possession of a given form of genetic information directly correlates with a particular observable trait, there are processes involved in the replication and inheritance of that information that produce variation.

I am no biologist, so I’ll defer here to John Maynard Smith, with whom I had the good fortune to have a conversation the one time we met, a few years before he died. He emphasized what I don’t think has seeped deeply enough into the popular understanding of modern genomics. In his phrase (from memory), the environment for a gene begins at the chromosome.

That is, the genes that actually code for a protein do not do their work or move from generation to generation in a vacuum. Rather they exist in a physical environment that begins with its most immediate context — the DNA surrounding coding regions — and then extends outward through the structure of DNA and other organic material that makes up the chromosome; the nucleus of a cell; the cell as a whole; and so on and on and on. Things happen at each level of organization, and between them, that can affect what happens when the rubber hits the road and a protein gets made.

All of which is to say that even though Pinker certainly did not claim that genes are destiny in any crude way, his article still falls into a tradition that I do not think has fully caught up with the richness and the complexity of modern genetics and cell and organismic biology.

That said, the other matter that made my antennae twitch in Pinker’s article came in this paragraph:

Though the 20th century saw horrific genocides inspired by Nazi pseudoscience about genetics and race, it also saw horrific genocides inspired by Marxist pseudoscience about the malleability of human nature. The real threat to humanity comes from totalizing ideologies and the denial of human rights, rather than a curiosity about nature and nurture.

I agree with the last sentence (though I’d hardly say that it covered the sum of threats to humankind), but the claim that the genocides perpetrated by Marxist regimes are an example of blank-slate ideology gone very wrong is problematical on two levels.

First, it is simply wrong. For example, Stalin’s war on the Kulaks — well-off peasants/farmers — treated Kulak resistance to collectivization as a symptom of an inherent, non-malleable quality: the class identity of the offending farmers. Similarly, Mao’s campaign against landlords (and others) immediately after the 1949 victory of the Chinese Communists identified class and/or occupation as a kind of original sin from which there could be no return. The same basic notion underlies the horrors inflicted on the basis of class or educational level by other regimes.

Of course, in China and the Soviet Union, exterminations justified by the identification of a human stain that needed to be eradicated to open the possibility of forging a new Communist humanity had roots that have nothing to do with a real commitment to either essentialism or a blank slate view of humanity — though at different stages of the process, both ideas were invoked. Rather they were all about power and resistance.

But at the same time, any finer-grained look at what happened in the state massacres of the 20th century does not support the simple-minded notion that as much or more harm was done to human beings through a commitment to a false perfectibility of humankind as was done through a commitment to a false notion of ineradicable genetic defects in particular groups. Essentialism was an integral part of both Nazi and Communist murders.

And that leads to my second objection to what I see as Pinker’s false equivalence of two evils. It isn’t just that he admits no complexity to the history; it is that the moral argument he seems to be making is itself highly suspect.

The real question Pinker avoids here isn’t whether evil comes to the world down multiple avenues. It is whether or not evil flows from a given cause, and if so, what can be done about it.

That dictators have used many justifications to treat other human beings as things rather than moral ends in themselves does not let you — or Pinker — off the hook on the specific issue of the misapplication of genetic ideas to divide humanity into those worth keeping and those it is permissible to destroy.

It is therefore also true that the pursuit of genetic knowledge, of that part of the human condition that is genuinely in ourselves, and not in our circumstances, needs to be concerned about the moral and ethical hazards raised by the research.

Of course, the field(s) are in fact acutely aware of this, as is Pinker himself, no doubt. But that he dredged up the old shibboleth that the Commies did it as his first response to the anticipated objection against the spectre of genetic determinism betrays to me a kind of weariness with the argument.

I can understand that too — plenty of heat and not much light has been poured on this argument often enough. But it is still a bit of rhetorical sleight of hand, and it’s been popping up a bit in defenses of the new genomics. And that can’t be a good thing.

Update: I omitted thanks due to Abel Pharmboy and Janet Stemwedel, each of whom looked over sections of the post above to help preserve me from my own ignorance. Any errors that remain are, of course, all mine.

Just a quick post before heading out to contemplate much more interesting minds than that of Mr. Brooks at Science Online ’09, but reading today’s column, I hit a number of howlers. I’ll try to get to the meat of them in posts between conference sessions tomorrow, but to begin at the beginning — check this out:

Once there was just Newtonian physics and the world seemed neat and mechanical. Then quantum physics came along and revealed that deep down things are much weirder than they seem.

Reading Brooks say anything about science produces the sensation of watching a kid play with a whole box of kitchen matches. Nothing good can come of this.

Between Newton and the quantum revolution you had this and this and this, just for starters.

I admit that this has nothing to do with the main argument of his piece, which possesses its own follies to be ridiculed in due course. And maybe it’s pedantry to demand a bit of rigor in all those intellectual glittery bits Brooks wants to toss off so casually so that we may bow down (and suspend our critical judgment) before his transcendent wisdom on all other matters.

But I’ve found it a pretty good guide that if someone b.s.’s you on the small stuff, he or she is probably not what you would call reliable on anything of more import. Brooks doesn’t disappoint in the rest of the piece — but that exegesis is for another post. Here just ponder his latest monument to what very clever people worked very hard to understand over a span of centuries.

Image: nonexistent due to painfully slow internet connection at the hotel within which I type this. Sorry. To be adjusted if the wireless at the meeting site can take the strain.

A broad look at Darwin and the implications of the ideas he set in motion will be taking place at MIT next week.

The program looks great; it begins at the beginning, with a session on the origins of the physical substrate on which biological origins would take place, then works through Darwin’s contribution and specific milestones in the development of the modern reconstruction of evolutionary history, and moves on to current and past science-and-society issues thrown up over the last 150 years.

All this conveyed through a strong slate of speakers. In other words, it’s worth what time you can spare, should you happen to be somewhere near the 02139 zip code next week.

You can register here (scroll down to the bottom of the program). The conference is free and open to the public, but registration is requested.

Just in case anyone hasn’t caught up with this meeting yet, check out the website for the science blogging event of the year, Science Online ’09, being held this weekend (Jan 16-18) at the Sigma Xi Center in Research Triangle Park, North Carolina.

Registration is now, sadly, closed (though the wait list may still be open), but the wiki is available for those who either are registered and want to get an advance taste, or those who aren’t and want to get a sense of what they might be missing (so as to be ready to sign up pronto next year).

A couple of notes. First: this year, with all of thirteen months of blogging under my belt, I’ll be co-moderating a couple of sessions (or perhaps un-moderating them, [radicalizing them, perhaps?] in keeping with the un-conference theme of the weekend).

One of those, with Rebecca Skloot, will be on the theme of how to move from blogging to paid science journalism — that’s in the 4:30 slot Saturday afternoon.

The other, with Dave Munger, will come on Saturday morning at the ungodly hour of 9 a.m., and we’ll talk about making the leap from blogging to book writing. For a little light background reading for those of you who might be coming (or for those interested in the form, I guess), I’ve posted to the conference wiki a sample book proposal — the one I wrote to sell this book: Newton and the Counterfeiter, coming this June (in the US, by Houghton Mifflin Harcourt; would-be readers in the UK will have to wait for Faber & Faber to get their edition out in August).

Second: this will be my second year at the conference, so I know (a) how smoothly the operation runs, and how well its attendees are taken care of, and (b) how much hard work Bora Zivkovic and Anton Zuiker put in to make it so. (I’m sure there are others involved; I just don’t know yet who they are — and so they are to be acknowledged only when I do.) Not only the attendees, but everyone who reads science blogs and/or cares about connecting science and the public, owe Anton and Bora a debt of thanks.

“talk among the landed gentry was almost entirely about shooting unless it happened to be about hunting: between them, he [Cobbett] said, the two topics accounted for more than 90 percent of the words exchanged in the English provinces.” (Janet Browne, Charles Darwin: Voyaging, p. 111)

Darwin was an exceptionally prolific slaughterer of birds in his Cambridge days, to the point that he felt the need to justify the pursuit to himself. In his autobiography he wrote:

“How I did enjoy shooting, but I think that I must have been half consciously ashamed of my zeal, for I tried to persuade myself that shooting was almost an intellectual employment; it required so much skill to judge where to find most game and to hunt the dogs well.”

Without being able (or really wanting) to test Cobbett’s data, I can say that I would have loved to see how swiftly Darwin and friends would have dealt with the reckless, feckless shot who will soon be released back into full-time sporting pursuits. Contempt and disdain would sum it up, I think. As for those of us who must share the time-space continuum with the offender…well, if I lived in Wyoming, I’d be afraid. Very afraid.