Treasures that scientists keep on ice

Science, Humanity… and Spock?

I am sure someone, in the vast literature on science communication out there, has written about this much better than I can, but I want to share my perspective on an issue I think about a lot: the tension between being a human, full of biases and faults and emotions, and doing science, which at its core seems inimical to these human attributes.

Stomach-Churning Rating: 1/10; nothing but banal meme pics ahead…

This is not a rant; it is an introspective discourse, and I hope that you join in at the end in the Comments with your own reflections. But it fits into my blog’s category of rant-like perambulations, which tend to share an ancestral trait of being about something broader than freezer-based anatomical research. As such, it is far from a well-thought-out product. It is very much a thought-in-progress; ideal for a blog post.

(Dr./Mr.) Spock of the Star Trek series is often portrayed as an enviably ideal scientific mind, especially for his Vulcan trait of being mostly logical– except for occasional outbreaks of humanity that serve as nice plot devices and character quirks. Yet I have to wonder: what kind of scientist would he really be, in modern terms? It wasn’t Spock-fanboying that got me to write this post (I am no Trekkie), but he does serve as a useful straw-man benchmark for some of my main points.

The first ingredient of the tension I refer to above is a core theme in science communication: revealing that scientists are human beings (gasp!) with all the same attributes as other people, and that these human traits may make the story more personable or (perhaps in the best stories) reveal something wonderful, or troubling, about how science works.

The second ingredient is simply the scientific process and its components, such as logic, objectivity, parsimony, repeatability, openness and working for the greater good of science and/or humankind.

There is a maxim in critical thinking that quite a few scientists hold: One’s beliefs (small “B”– i.e. that which we provisionally accept as reality) should be no stronger than the evidence that supports them. A corollary is that one should be swift, or at least able, to change one’s beliefs if the evidence shifts in favour of a better (e.g. more parsimonious/comprehensive) one.

It is a pretty damn good maxim, overall. But in observing, or imagining (as “what-if?” scenarios), the reactions of some scientists to their beliefs/opinions/ideas — especially regarding conclusions that their own research has reached — you may find that they occasionally violate this principle. Such violations are almost always caused by some concoction of human traits working against the maxim and its corollary.

For example (and this is how I got thinking about this issue this week; I started writing the post on 5 December, then paused while awaiting further inspiration/getting normal work done/fucking around), what if Richard Dawkins was confronted with strong evidence that The Selfish Gene’s main precepts were wrong? This is a mere heuristic example, although I was thinking about it because David Dobbs wrote a piece that seemed to be claiming that the balance of scientific evidence was shifting against selfish genes (and he later shifted/clarified his views as part of a very interesting and often confusing discussion, especially with Jerry Coyne– here). It doesn’t matter if it’s Dawkins (or Dobbs) or some other famous scientist and their best or most famous idea. But would they quickly follow the aforementioned maxim and shift their beliefs, discarding all their prior hard work and acclaim? (a later, palaeontological, event in December caused me to reflect on a possibly better example, but it’s so controversial, messy and drenched in human-ness that I won’t discuss it here… sorry. If you really want a palaeo-example, insert Alan Feduccia and “birds aren’t dinosaurs” here as an old one.)

I’d say they’d be reluctant to quickly discard their prior work, and so might I, and to a degree that’s a good, proper thing. A second maxim comes into play here, but it is a tricky one: “Extraordinary claims require extraordinary evidence.” For a big scientific idea to be discarded, one would want extraordinary scientific evidence to the contrary. Additionally, one might not want to shift one’s views too quickly to accommodate that new evidence: a hasty rush to a new paradigm/hypothesis could be very risky if the “extraordinary” evidence later turned out itself to be bunk, or just misinterpreted. Here, basic scientific practice might hold up well.

But, but… that “extraordinary evidence” could be very hard to interpret– this is the tricky bit. What is “extraordinary”? Often in science, evidence isn’t as stark and crisp as p<0.05 (a statistical threshold of significance). Much evidence requires a judgement call– a human judgement call– at some step in its scrutiny, often as a provisional crutch pending more evidence. Therein lies a predicament for any scientist changing any views they cherish. How good are the methods used to accumulate contrary evidence? Does that evidence and its favoured conclusion pass the “straight-face test” of plausibility?

All this weighing of diverse evidence can lead to subjectivity… but that’s not such a bad thing perhaps. It’s a very human thing. And it weighs heavily in how we perceive the strength of scientific methods and evidence. Much as we strive as scientists to minimize subjectivity, it is there in many areas of scientific inquiry, because we are there doing the science, and because subjectivity can be a practical tool. Sometimes subjectivity is needed to move on past a quagmire of complex science. For example, in my own work, reconstructing the soft tissue anatomy of extinct dinosaurs and other critters is needed, despite some varying degrees of subjectivity, to test hypotheses about their behaviour or physiology. I’ve written at length about that subjectivity in my own research and it’s something I think about constantly. It bugs me, but it is there to stay for some time.

One might look at this kind of situation and say “Aha! The problem is humans! We’re too subjective and illogical and other things that spit in the face of science! What we need is a Dr. Spock. Or better yet, turn the science over to computers or robots. Let amoral, strictly logical machines do our science for us.” And to a degree, that is true; computers help enormously and it is often good to use them as research tools. Evolutionary biology has profited enormously from turning over the critical task of making phylogenetic trees largely to computers (after the very human and often subjective task of character analysis to codify the data put into a computer– but I’d best not go off on this precipitous tangent now, much as I find it interesting!). This has shrugged off (some of) the chains of the too-subjective, too-authority-driven Linnaean/evolutionary taxonomy.

But I opine that Spock would be a miserable scientist, and much as it is inevitable that computers and robots will increasingly come to dominate key procedures in science, it is vital that humans remain in the driver’s seat. Yes, stupid, biased, selfish, egocentric, socially awkward, meatbag humans. Gotta love ‘em. But we love science partly because we love our fellow meatbags, and we love the passion that a good scientist shares with a good appreciator of science– this is the lifeblood of science communication itself. Science is one of the loftier things that humans do– it jostles our deeper emotions of awe and wonder, fear and anxiety. Without human scientists doing science, making human mistakes that make fantastic stories about science and humanity, and without those scientists promoting science as a fundamentally human endeavour, much of that joy and wonder would be leached out of science along with the uncomfortable bits.

Spock represents the boring– but necessary– face of science. Sure, Spock as a half-human could still have watered-down, plot-convenient levels of the same emotions that fuel human scientists, and he had to have them to be an enjoyable character (as did his later analogue, Data; to me, emotion chip or not, Data still had some emotions).

But I wouldn’t want to have Spock running my academic department, chairing a funding body, or working in my lab.

Spock might be a good lab technician (or not), but could he lead a research team, inspiring and mentoring them to new heights of achievement? Science is great because we humans get to do it. We get to discover stuff that makes us feel like superheroes, and we get to share the joy of those discoveries with others, to celebrate another achievement of humanity in comprehending the universe.

And science is great because it involves this tension between the recklessly irrational human side of our nature and our capacity to be ruthlessly logical. I hear a lot of scientists complaining about aspects of being a scientist that are more about aspects of being human. Yes, academic job hiring, and departmental politics, and grant funding councils, and the peer review/publishing system, and early career development, and so many other (all?) aspects of being a scientist have fundamental flaws that can make them very aggravating and leave people despondent (or worse). And there are ways that we can improve these flaws and make the system work better. We need to discuss those ways; we need to subject science itself to peer review.

But science, like any human endeavour, might never be fair. As long as humans do science, science will be full of imbalance and error. I am not trying to excuse our naughty species for those faults! We need to remain vigilant for them both in ourselves and in others! However, I embrace them, like I might an embarrassingly inept relative, as part of a greater whole; a sloppy symptom of our meatbaggy excellence. To rid ourselves of the bad elements of human-driven science, to some degree, would require us to hand over science to some other agency. In the process, we’d be robbing ourselves of a big, steamy, stinky, glorious, effervescent, staggeringly beautiful chunk of our humanity.

Spock isn’t coming to take over science anytime soon, and I celebrate that. To err is human, and to do science is to err, from time to time. But science, messy self-correcting process that it is, will untangle that thicket of biases and cockups over time. If we inspect it closely it will always be full of things we don’t like, and weeding those undesirables out is the job of every scientist of any stripe. Self-reflection and doubt are important weed-plucking tools in our arsenal for this task, because every scientist should keep their own garden tidy while they scrutinize others’. This is a task that I, as a scientist, try to take seriously but I admit my own failures (e.g. being an overly harsh, competitive, demanding reviewer in my younger years… I have mellowed).

So here’s to human-driven science. Live long and publish!

Up next: FREEZERMAS!!! A week-long extravaganza of all that this blog is really about, centred around Darwin’s birthday. Starts Sunday!


7 Responses

John, it would be easier to write a book in response to this blog, than a comment.

You seem to argue two points at once:

The first is about subjectivity in science.

Even highbrow Thomas Kuhn spoke of “inspired guesses”. Middlebrow Steven Johnson, in Where Good Ideas Come From, argues that, among other things, we explore the “adjacent possible”. Where you as a scientist are located in the field of force of science determines what adjacent possible you may explore, and how far outside this boundary you may go. Subjectivity is inevitable, and nothing to worry about. A termite shifting grains of sand to fix the temperature in the termite mound does not worry about its subjectivity in the choice of grains. In the end, all subjectivities coalesce into scientific progress.

Second: Dawkins.

Like Spencer, Dawkins will probably be remembered as a child of his times rather than a great scientist. The “selfish gene” metaphor is fundamentally wrong because it slyly substitutes a “why” for a “how”. Teleology and determinism sneak in. When Dawkins wrote his book, the Zeitgeist favoured personal autonomy as against collective intentionality (remember the totalitarian ideologies?). We all lapped up his milk-and-honey metaphor. Of course, all this was unconscious– exactly: Zeitgeist. It was defining the unspoken “adjacent possible”. Beware of what is hidden in plain view.

Dawkins’ other distractive and destructive contribution is the “meme”– an analogy to the gene. Sorry Richard, wrong again. While genes are particulate, “memes” would be “blending”. Fascinating that Darwin thought of heredity as blending (he was proven wrong), while Dawkins thinks ideas are particulate (the “begat” theory of ideas) when they are blending– a symmetry of errors. Dawkins was, of course, following a tradition going back to Plato. In Asia, life is seen as silent transformation– no memes there. Again, Zeitgeist, this time regional.

Thanks Aldo– yes, I agree subjectivity is inevitable, although as individuals we seek to minimize it, or we end up just making stuff up all the time. The Dawkins example was just to describe what initiated my post; I don’t want to debate whether he is right or wrong, as the answer is probably somewhere in between, as science often tends to be, and very open to how one interprets an individual’s contributions to a huge field over a long time period.

Very interesting blog, John! I think another related point is egotism in science… I think it would be easier to change your mind about an idea if you accept it is not truly yours (as no piece of science is) but a culmination of ideas from many influences and sources. Much more progress can be made by working together than with a territorial, me-vs-you approach. Egotism is what makes us place our worth as scientists on one particular finding or idea instead of our contribution to the community as a whole. It also makes it scary if that finding is called into question. However, in many cases starting down the wrong path may generate discussions that lead to amazing discoveries!

Thanks Katrina– absolutely, good point. One needs a certain amount of ego to weather the storm of criticism that is part of peer review in science, and to confidently present one’s work at conferences or to grant agencies in order to get more work done, as in the end science is still done by individuals (not worker drones). Moderation in all things; too much ego is bad. Yet today the emphasis is increasingly upon collaborative, multi-disciplinary teams rather than individuals, so one’s ego has to slot in there comfortably, like you say.
