I’ve got this friend, known her since we were both grad students back in the eighties: well-regarded in her profession, well-published, even coauthored a few texts on evolutionary ecology. Keeps getting best-teacher awards for her work in the classroom. Occasionally she appears as the resident expert at the local Café Scientifique’s Valentine’s Day edition, where she talks about the myths and science of sex and reproduction. Staunch feminist (although granted, it would have been pretty tough to look around at the faculty gender ratio in mid-eighties UBC and not be one of those). James Tiptree, Jr. fan; I like to think I had something to do with that. The last person you’d expect to catch with a pair of gender blinkers — but of course the very reason I’m bringing all of this up is because she caught herself wearing a pair, back when she was working on her M.Sc.

She couldn’t figure out why there weren’t any females in the wild stickleback populations she kept sampling, you see. It didn’t make sense. She seined and seined, and all she ever got were males. She knew they were males, because every last one of them changed color during mating season, which is what male sticklebacks of that species do.

Except as it turns out, it’s also what females of that species do. It’s just that over however many years and however many research projects, nobody had actually noticed that before. It even came as a surprise to her supervisor, who was regarded throughout the department and beyond as Doctor Stickleback.

I knew this other guy too, back around the same time. Not so much of a friend; the kind who’d turn up his nose when he discovered you were designing an AD&D campaign instead of polishing your laundry lists for eventual publication in Nature. (Debbie, in contrast, was a kick-ass cleric with a deathly fear of zombie flesh-eaters). This other guy did his doctorate on damselflies, I think. Maybe crane flies. One of those insects with veined transparent wings anyway, little patches of which are sometimes blotched with dark pigment. He was doing a dispersal study as part of his thesis; how many flies returned to their pond of origin, how many headed out into the wild blue yonder never to return, that sort of thing. And in order to keep track of individuals, he dabbed teensy dots of paint in unique patterns onto the wings of each fly.

In the course of his defense, one of his examiners asked about the functional significance of those little blotches of pigment — the natural pre-existing ones, not the spots that had been applied experimentally. Did he know what purpose those spots served? (He did not; in fact, he hadn’t really even thought about it.) Turns out those patches act as a combination ballast/trim-tab system, to keep the insect stabilized in flight. By applying extra pigment with no thought for that system, the candidate had doomed some of his subjects to lopsided flight in endless circles, or weighed them down to the point that they might well have dropped from exhaustion shortly after escaping custody. Bottom line, any attempt to draw conclusions about “natural dispersal patterns” would be about as valid as those you could draw by letting a bunch of people out of prison after amputating their left legs below the knee. (Notwithstanding which, the dude passed. Last I heard, he even had tenure.)

So when my attention is drawn to this paper by Gowaty et al. (good popsci summary in ScienceDaily — thanks for that, Cate), my response is not OMG the entire field of behavioral ecology is undone. Rather, it’s more like Geez, you guys give scientists way too much credit. And you don’t give science nearly enough.

Illo credit: Kim DeRose

The short version: back in the forties, in the days before molecular genetics, some dude named Bateman needed to track fruit fly lineages in the course of a study on reproductive success. He settled on a set of mutations that were highly visible (good), but which fucked up the reproductive success of afflicted individuals (bad): curly wings (which made certain mating behaviors impossible), deformed eyes (which sailed right over the “beer goggles” phase directly into “functional blindness”). Bateman ultimately concluded that males were more promiscuous than females, based on counts of offspring so hideously deformed that a lot of them probably died before being enumerated.

Believe it or not, this is not bad science. It’s bad experimental design, which is a whole other thing — and to be fair, there weren’t a lot of methodological options back in the forties. Bateman’s technique was pretty cutting-edge given the state of the art at the time. We don’t insist on perfect experimental design.

So it’s not bad science that he designed a flawed experiment. It’s not even bad science that his cultural blinkers apparently blinded him (and others) to those flaws; the scientific method is designed to compensate for the inevitable biases that accompany any field of human endeavor. No, the bad science in this story can be found in the fact that Bateman’s study appeared in 1948, and no one got around to replicating it, and failing to confirm its findings, until sixty-four years later. In the meantime it became a classic in its field, racking up somewhere in the neighborhood of two thousand citations.

Setting aside its classic-paper status, though, we’re really just talking about another iteration of my wing-pigment anecdote. Stuff like this happens all the time; usually you hope it gets caught before you go to defense. (If you want an example of science chugging along at a more reasonable pace, check out the recent dust-up about the possibility of arsenic-based life. The paper reporting those findings came out in 2010, and the one rebutting them came out just last week — not bad, given the experimental work involved.)

Anyway. As Gowaty herself says, Bateman’s study should’ve been replicated decades ago, as soon as molecular techniques were an option. Its limitations should have been obvious to anyone who read the paper, but apparently the original results were so unsurprising — of course males fuck around more than females! — that nobody saw the point in beating such an obviously-dead horse. (By the flip side of the same token, the arsenic-microbe paper ran into such furious and immediate resistance because its findings were so completely at odds with conventional wisdom.)

That said, though, tossing out one bad study hardly destroys the field of behavioral ecology (much less threatens the Darwinian evolutionary model, as those idiots over at the Discovery Institute would like to claim). In fact, we really can’t toss out the study, not in its entirety: as Tang-Martinez and Ryder point out in their intro to a symposium on this very issue:

“Bateman’s basic theoretical insight relating mating success to RS, and predicting that the sex which has greater variance in RS will be the sex that experiences stronger sexual selection, is undeniably correct.”

What Gowaty’s paper (and a whole shitload of others) has destroyed is the justification for overgeneralization. Different species have different reproductive strategies suited to their own traits, and the relevant energetics have not changed; differential cost of reproduction still leads to different behavioral strategies. What has changed is the assumptions we made in applying those energetics. Turns out a lot of them were unwarranted. Sperm (or rather, ejaculate) isn’t as trivially inexpensive to produce as everyone once thought. Females across a whole range of species exert way more control over who they mate with, and how often, than anyone ever gave them credit for in the days before molecular genetics. Choosy-female-Indiscriminate-male was a decent first-cut approximation (and still applies in some cases), but the real picture is far more nuanced and complex. If you’re interested, I’d recommend Tang-Martinez and Ryder’s paper as a concise, readable précis of all the stuff I got taught in grad school that turned out to be dead wrong by the time I’d graduated. The principles survive, but the presuppositions we applied to them were all over the map.
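For what it’s worth, the variance principle in the Tang-Martinez and Ryder quote is easy to make concrete with a toy simulation. This is just a sketch with made-up mating distributions (nothing from Bateman’s actual data): whichever sex draws from the wider spread of mating success ends up with the wider spread of reproductive success, and hence the greater opportunity for sexual selection.

```python
import random
import statistics

random.seed(42)

def simulate_sex(n, mate_count_pool, offspring_per_mating):
    """Return reproductive success (offspring counts) for n individuals,
    each drawing its number of mates from mate_count_pool."""
    return [random.choice(mate_count_pool) * offspring_per_mating
            for _ in range(n)]

# Hypothetical numbers: one sex ranges from zero to four mates,
# the other nearly always mates exactly once or twice.
high_variance_sex = simulate_sex(1000, [0, 0, 1, 2, 3, 4],
                                 offspring_per_mating=10)
low_variance_sex = simulate_sex(1000, [1, 1, 1, 2],
                                offspring_per_mating=10)

var_hi = statistics.pvariance(high_variance_sex)
var_lo = statistics.pvariance(low_variance_sex)
print(f"variance in RS, promiscuous-strategy sex: {var_hi:.1f}")
print(f"variance in RS, choosy-strategy sex:      {var_lo:.1f}")

# The sex with greater variance in reproductive success is the one
# sexual selection bears down on harder -- that part of Bateman holds.
assert var_hi > var_lo
```

The point of the toy isn’t which sex goes in which column — that’s exactly the overgeneralization Gowaty et al. demolish — it’s that the variance relationship itself is sex-agnostic arithmetic.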

In fact, I’m a little disappointed that the basic principles have survived — because wouldn’t it be amazing if our understanding of the whole process did turn out to be completely ass-backwards? The most exciting scientific breakthrough I can imagine would be one that shows that we were Just. Dead. Wrong. I’d even go so far as to suggest that most practicing scientists would agree with me.

Sure, people get invested. Nobody wants to throw away the findings and theories that made their careers. The history of science is jam-packed with entrenched viewpoints, bitter rivalries, and vicious attacks on those who dared challenge the approved paradigm. The astronomer Fred Hoyle went to his grave grumbling about what an idiotic theory the “Big Bang” was. (In fact, he was the one who coined that term; he meant it as an expression of ridicule). But science is a huge honking tent, and the number of people personally involved in any given feud is bound to be pretty small. The rest of us watch from the sidelines, munching popcorn, waiting to see how it all shakes out. I submit this quote from a story about the recent discovery of the Higgs:

De Roeck said he would find it a “little boring at the end if it turns out that this is just the Standard Model Higgs.” Instead, he was hoping it would be a “gateway or a portal to new physics, to new theories which are actually running nature” …

Maybe I’m naïve, but I’d like to think that this is more typical of the scientists’ attitude. Ask any kid who wants to grow up to be a scientist (caveat to my American readers: if you can find any). They all want to make discoveries. They all want to find out new stuff. When was the last time any budding science-fair contestant ever said “I want to be a marine biologist so I can confirm Ford’s findings on orca vocalization”?

Discovering something new is way cooler than confirming something old. And discovery, by definition, involves the unexpected; it involves surprise. You can’t be surprised if all you ever learn is that you were right all along.

And this, I think, is the essence of the fuckuppedness of institutional science. Because while I like to think that most people go into science with that attitude, they learn pretty quickly to shut up about it. Being consistently wrong is no way to forge an academic career. The road to success is paved with papers that refine the current model: fill in a gap here and there, stick another brick in the wall, don’t do anything to piss off the architects (even if they don’t sit on all the funding and review boards, they sure as shit hand-picked the guys who do). The road to success is studded with traffic cops and safety rails and Maximum 30kph signs. It is risk-averse[1].

It is also boring as hell. Because a career in science is not the same thing as the thrill of science; and while your career may hinge on being right, the rush comes from discovery. And you can’t truly discover anything unless you start out by being wrong.

Maybe that’s idealistic. Maybe it’s downright naïve. So here’s a more pragmatic take: paradigms don’t last forever, no matter who lines up in their defense. Steady State lost to the Big Bang. Lamarck lost to Darwin. Conventional wisdom is like a climax forest: it is vast, and it reduces visibility, and it starves new growth in the shade of an upper canopy that grabs all the light. But eventually the deadwood accumulates past some critical threshold. Inevitably, lightning strikes.

Someday, maybe, someone will discover arsenic-based life, and the data will be solid. Someday, maybe, we’ll prove that God exists. Some day — if we’re very, very lucky — we’ll discover that we were wrong about everything. Think of the opportunities that open up when a dominant ecosystem collapses. Think of all the sunlight and space and nutrients available for new growth in the wake of a devastating forest fire. Think of the grant potential, the opportunities for professional advancement, in a field where all the established authorities have just had their asses handed to them in pieces.

Why, even someone who’s been out of the field for decades might have a shot at tenure.

[1] There are also, admittedly, those very rare cases where the mystery is so grand, the question so open-ended, that there’s been neither the time nor the data for truly entrenched views to develop. Nobody shat on Watson and Crick when they published the structure of DNA, for example (although they did belatedly shit on them for marginalizing the contributions of Rosalind Franklin).

23 Responses to “Fruit Flies, Forest Fires, and the Ecstasy of Being Wrong.”

This is one of those posts I know I’m going to end up bookmarking after linking to it a dozen times. Nothing irritates me more than gender norms reinforced by misinterpreted (or overinterpreted) biology.

This is an excellent post, Peter; thank you for writing it. As you point out, the great value of science is precisely the fact that there is a sifting procedure in place for sorting the sheep from the goats — even if it sometimes works imperfectly. At the same time, however, this is also why a little part of me died while reading above the line. I work in the humanities, where I research the intersections between cultural analysis and cognitive science. Just try doing this in a traditional humanities milieu — a milieu in which everybody is allowed to be right, all of the time; where a given paradigm can fly in the face of an empirical result of 5-sigma significance and still be taken seriously. Worse, to even query this happy state of affairs is enough to identify you as an enemy of all that is good, bright and noble in the human spirit. Fuck it man, science may not reach the bar it has set for itself, but at least it knows that there is a bar.

Michelle Amaral wanted to be a brain scientist to help cure diseases. She planned a traditional academic science career: PhD, university professorship and, eventually, her own lab.

But three years after earning a doctorate in neuroscience, she gave up trying to find a permanent job in her field.

Dropping her dream, she took an administrative position at her university, experiencing firsthand an economic reality that, at first look, is counterintuitive: There are too many laboratory scientists for too few jobs.
[ … ]

That might be one reason that not all that many American kids want to be scientists… so many did, that there aren’t jobs enough for them. See the article, it’s even worse for those who thought to pursue the academic track.

Even more evidence that the States are on a slide into whichever Circle of Hell is the most lukewarm and insipid.

Thanks as always for your interesting and excellent summary of the knife edge of science! And especially for the citation to the Tang-Martinez & Ryder paper. I will read with interest.

I’d like to quote an exchange we had back in 2008, and pat us on the back for our foresight on the whole “ejaculate not being trivially expensive” energy investment thing:

Me: I wanted to mention quickly that I think we can’t measure energy expended for reproduction by pitting the amount needed for ejaculation and ejaculate against creating eggs and pregnancy and lactation, and decide, oh, males have such a tiny energy investment. We may be artificially drawing the lines too fine around the male’s behavior. If we want to start putting in things such as scoping out and defending a territory, the energy investment starts to look more even. I’m not sure we can draw the line anywhere with a totally straight face, since males are the gene-mixers for lots of species, we could say everything the male does is energy invested in reproduction. I’m being hyperbolic, but you see the dilemma?

Great article, Peter. The attraction of science to kids has always been the wow factor, the desire to discover something new, or even to disprove something that is well established. Plate tectonics, dinosaur-killing asteroids, a better mousetrap. But I would also argue that some discoveries are made because the scientist wasn’t fully aware of, and therefore not biased by, the existing theories.

There was a story about a palaeontology prof at Guelph (probably a myth, but I always wished that it was true). He was doing his thesis on an old shell pile that was generally believed to be a midden. But this guy enjoyed his partying more than the reading and went into the field before reading all of the papers on the site. He didn’t even know that the scientists that had researched this site and others like it had concluded that they were the ancient version of a garbage dump. On site, he noticed that all of the shells were disarticulated and that only the left hand (top, bottom? I have forgotten the terminology) of the shells were represented. He concluded, correctly, that the site was the result of the differential transportation and sorting of the dead shells by water.

Would he have come to the same conclusion if he had gone in with the preconceived belief that it was a midden?

Peter Watts wrote, in-part: […] Think of the opportunities that open up when a dominant ecosystem collapses. Think of all the sunlight and space and nutrients available for new growth in the wake of a devastating forest fire. Think of the grant potential, the opportunities for professional advancement, in a field where all the established authorities have just had their asses handed to them in pieces.
[ … ]

Think of how the devastation of the Black Death toppled hierarchies and especially how it made freedmen of the serfs on a widespread basis, and how it created such a labor shortage that it was now more profitable to pursue technical innovations (“labor-saving devices”) than to just use up slaves because they cost far less than crafted metal-work. In the modern day, I would suggest that if we had fewer people, we’d have more robots. Case in point, the much-discussed demographic “crisis” in Japan. Not surprisingly, they lead the world in development and deployment of robots.

Lest anyone misread, and think that I’m suggesting that a pox on the tenured would cure all other ills, I hasten to add that I was just chiming in with another supportive metaphor.

On the other hand, people often remark about cats, watching some of the antics they go through at dinner-time, “how is it that something that seems to be spending most of its time lost in deep thought seems to never learn much of anything?” — and it’s true: so long as whatever they are doing will end with them getting food in their mouths, they’ll keep doing the same thing over and over, even if a better way should be easy to figure out. Stupid, lazy, or just totally creatures of habit? Yet another endless debate.

Yet, is the modern system too cat-like, cheerfully doing whatever works, without much concerning itself with how it might be done much better?

Private Eye quoted De Roeck saying the same thing you did. In a wonderful example of getting absolutely everything ass-backwards, they thought this meant he was hoping for the existence of new science because that meant he could get a grant forevermore.

Of course that’s not the way it works. One doesn’t do science to get grants (they’re far too piddly for the work to be worth it). One does grants to stay just barely alive and to fund oneself while doing the science. The science is what matters — and with what they pay, nobody would go into scientific research for the money.

(The Eye’s bias is understandable, though — it spends most of its time reporting on the shenanigans of the interactions of politicians, big business, and the press, where you definitely *do* go into it for the money, or for the influence that’ll let you get the money later. This bias has led it astray in the past, e.g. on MMR, when it backed Wakefield for years because the government said opposite and governments always lie.)

Julian Morrison: Have you seen “Evolution’s Rainbow” by Joan Roughgarden, and her follow-up “The Genial Gene”? I’m curious what you think of her thesis.

I read Evolution’s Rainbow and found some of it dubious, but in other parts it was articulating what I had been thinking for years was wrong with the paradigm but felt like a fool articulating. How was The Genial Gene? Is it radically different, or an elaboration on the first book?

I find myself thinking of the lengthy discursion on why silicon-based life is impossible in “The Disappearing Spoon: And Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements”. Most of the argument hinges on the difficulty of doing respiration with something that really prefers to be a solid.

I will try to hunt it down, Bastien, as it is a good quote and because you’ve asked, but won’t make any promises – I’m two weeks into a post-candidacy sleep cycle disruption while simultaneously dealing with all of the (important, urgent) things that didn’t get done prior to the candidacy.

That might be one reason that not all that many American kids want to be scientists… so many did, that there aren’t jobs enough for them. See the article, it’s even worse for those who thought to pursue the academic track.

Also a lot about slashing in there. Wonder how much of that stuff has been exported as well.

Even more evidence that the States are on a slide into whichever Circle of Hell is the most lukewarm and insipid.

On that note, there is apparently a bio-engineered bacteria that is supposed to end cavities on the way, so we can continue to be/become the most overweight country in the world without those painful reminders in the mouth.