That’s not all there is to morality, but lack the might, and you lack the ability to determine the right.

(And as for God(s) as the basis of morality? What is he/she/they/it but Mighty? The Mightiest of the Mighty?)

Might makes right is also the basis of knowledge. Of course, what counts as “might” varies considerably across time and space: might could mean “ability to summon spirits” or “to discern the secrets of nature” or, of course, to point a sword or an axe or a gun at a person’s head and say “believe” or “recant”; it could also refer to people or resources or the production of results.

Thomas Kuhn referred, famously, to paradigms: scientists operate within a particular paradigm or set of theories of how the world works, and new scientists are inculcated with and succeed according to their ability to produce new knowledge based on elaboration of those theories. Over time, however, those elaborations may run into trouble: the theory leads to x result, but y is what is witnessed. There may be some way to accommodate these anomalies, but eventually the anomalies will overwhelm the paradigm; upon the presentation of a new theory which can account not only for the old knowledge, but also the anomalies, the paradigm will shift.

(Imre Lakatos attempted to ameliorate the harshness of this shift (and to mediate between Kuhn and Karl Popper’s strict falsificationism) with a notion of “research programmes” and whether they are “progressive” or “degenerative”, but he, too, allows that new research programmes may emerge.)

Older or established members of a field may not accept a new paradigm or research program, but, as Max Planck famously observed, “science advances one funeral at a time”. Einstein, one of the most intelligent men of the 20th century, perhaps ever, just as famously never accepted quantum theory (“God does not play dice with the universe”), but he couldn’t foil it; he is dead, and the theory lives.

What, then, is the paradigm or research program but a form of might? It declares what counts as true and false, what is considered evidence and how to make sense of that evidence, what counts as science—and thus knowledge—at all.

None of this is meant to be argumentative, but axiomatic. This doesn’t mean there is no knowledge or no true knowledge, but that what counts as knowledge and truth is bound up in the conditions of the production of said knowledge and truth. Knowledge depends upon what we say knowledge is (“intersubjective agreement”), and there are a lot of ways to say it.

I’m a fan of science, and consider its methods to be powerful in eliciting knowledge about the natural world. I don’t think it can tell me much about poetry, but if I want to understand how a fertilized egg can turn into a person, then I’ll turn to a biology textbook rather than, say, a book of poetry.

Even the most potent forms of knowledge—the mightiest of the mighty—have their limits (see: embryology won’t teach you much about rhyme and meter), and potency itself is no guarantee against the loss or overthrow of a particular form of knowledge, an insight long known by tyrants, torturers, and con men alike.

Knowledge, for all of its power (Bacon), is also fragile: because there is nothing necessary or autonomous about any one form of knowledge, it can be lost or shattered or tossed away—which means it must be tended, and, when conditions dictate, defended.

All of which is a very long way of saying that the notion of “Let the public decide what’s the truth” with regard to the existence of climate change is a terrible, terrible idea, and as an attack on science itself, deserves to be driven back to the gaseous bog from whence it came.

The website from which I got this image, Strange and Wonderful Things (a title after me own little heart), compares these funky little flowers to “little orange penguins marching over the rocks”—and yeah, I can see that.

But I see a bunch of old aunties in wide hats toting their bins back from the fields, or maybe the market.

Clouds are masses of liquid droplets or ice crystals suspended in the atmosphere, and one can use SCIENCE to determine how they form and what their shapes say about conditions in the atmosphere and that’s all for the good. Similarly, one can use the tools of SCIENCE to discover that c. uniflora is “distantly related to Foxglove and Gesneriads”, and that the flower is pollinated by birds who eat the white bits of the bloom.

But sometimes clouds are castles or armies or profiles of Abe Lincoln, and sometimes flowers are little orange penguins or bin-toting old aunties in wide hats.

This conundrum is one of the ways I’ve come to interpret various situations in life, big and small. I don’t know that there is ever a correct decision (tho’ I’ll probably make the wrong one), but one chooses, nonetheless.

Which is to say: I choose to hang on to the “science” in political science.

I didn’t always feel this way, and years ago used to emphasize that I was a political theorist, not a political scientist. This was partly due to honesty—I am trained in political theory—and partly to snobbery: I thought political theorists were somehow better than political scientists, what with their grubbing after data and trying to hide their “brute empiricism” behind incomprehensible statistical models.

Physics envy, I sniffed.

After a while the sniffiness faded, and as I drifted into bioethics, the intradisciplinary disputes faded as well. And as I drifted away from academia, it didn’t much matter anymore.

So why does it matter now?

Dmf dropped this comment after a recent post—

well “science” without repeatable results, falsifiability, and some ability to predict is what, social? lot’s of other good way to experiment/interact with the world other than science…

—and my first reaction was NO!

As I’ve previously mentioned, I don’t trust my first reactions precisely because they are so reactive, but in this case, with second thought, I’ma stick with it.

What dmf offers is the basic Popperian understanding of science, rooted in falsifiability and prediction, and requiring some sort of nomological-deductivism. It is widespread in physics, and hewed to more or less in the other natural and biological sciences.

It’s a great model, powerful for understanding the regularities of non-quantum physics and, properly adjusted, for the biosciences, as well.

But do you see the problem?

What dmf describes is a method, one of a set of interpretations within the overall practice of science. It is not science itself.

There is a bit of risk in stating this, insofar as young-earth creationists, intelligent designers, and sundry other woo-sters like to claim the mantle of science as well. If I loose science from its most powerful method, aren’t I setting it up to be overrun by cranks and supernaturalists?

No.

The key to dealing with them is to point out what they’re doing is bad science, which deserves neither respect in general nor class-time in particular. Let them aspire to be scientists; until they actually produce a knowledge which is recognizable as such by those in the field, let them be called failures.

Doing so allows one to get past the no-good-Scotsman problem (as, say, with the Utah chemists who insisted they produced cold fusion in a test tube: not not-scientists, but bad scientists), as well as to recognize that there is a history to science, and that what was good science in one time and place is not good in another.

That might create too much wriggle room for those who hold to Platonic notions of science, and, again, to those who worry that this could be used to argue for an “alternative” physics or chemistry or whatever. But arguing that x science is a practice with a history allows the practitioners of that science to state that those alternatives are bunk.

But back to me (always back to me. . . ).

I hold to the old notion of science as a particular kind of search for knowledge, and as knowledge itself. Because of that, I’m not willing to give up “science” to the natural scientists because those of us in the social sciences are also engaged in a particular kind of search for knowledge. That it is not the same kind of search for the same kind of knowledge does not make it not-knowledge, or not-science.

I can’t remember if it was Peter Winch or Roger Trigg who pointed out that the key to good science was to match the method to the subject: what works best in physics won’t necessarily work best in politics. The problem we in the social sciences have had is that our methods are neither as unified nor as powerful as those in the natural sciences, and that, yes, physics envy has meant that we’ve tried to import methods and ends which can be unsuitable for learning about our subjects.

So, yes, dmf, there are more ways of interacting with the world than with science. But there are also more ways of practicing science itself.

This is not a “why I am not a creationist” piece. Oh no. Even though I’m not.

This is a hit on a “why I am a creationist” piece.

Virginia Heffernan, who can be an engaging writer, has apparently decided to disengage from thinking. In a widely commented-upon piece for Yahoo, the tech and culture writer outed herself as a creationist. It is a spectacularly bad piece of . . . well, I guess it’s a species of argumentation, but as she kind of flits and floats from the pretty to the happy and fleetly flees from sweet reason, it might be best to consider this a kind of (bad) performance art.

My beef with her is less about the God-ish conclusion than that flitting and floating: she rejects science because it’s boring and sad and aren’t stories about God sooooo much better?

You think I’m exaggerating? I am not. To wit:

I assume that other people love science and technology, since the fields are often lumped together, but I rarely meet people like that. Technology people are trippy; our minds are blown by the romance of telecom. At the same time, the people I know who consider themselves scientists by nature seem to be super-skeptical types who can be counted on to denigrate religion, fear climate change and think most people—most Americans—are dopey sheep who believe in angels and know nothing about all the gross carbon they trail, like “Pig-Pen.”

I like most people. I don’t fear environmental apocalypse. And I don’t hate religion. Those scientists no doubt see me as a dopey sheep who believes in angels and is carbon-ignorant. I have to say that they may be right.

Uh-huh.

Later she mentions that she’s just not moved by the Big Bang or evolution, and that since evo-psych is sketchy science (which it is), all of science must be sketchy (which it is not).

And then this stirring conclusion:

All the while, the first books of the Bible are still hanging around. I guess I don’t “believe” that the world was created in a few days, but what do I know? Seems as plausible (to me) as theoretical astrophysics, and it’s certainly a livelier tale. As “Life of Pi” author Yann Martel once put it, summarizing his page-turner novel: “1) Life is a story. 2) You can choose your story. 3) A story with God is the better story.”

(Would it be fair to mention at this point that I hated Life of Pi? Too beside-the-point?)

To summarize, she likes technology—because it’s trippy—but she doesn’t like knowing how and why technology actually works, i.e., the science.

This would be fine—after all, there are all kinds of things I like without necessarily being interested in how and why they came to be—were it not for the fact that she’s a technology writer.

Perhaps she’s a closet Juggalo, or maybe she thought Bill O’Reilly waxed profound on the movement of tides, or maybe she just ate a shitload of shrooms and floated down to her keyboard, but I’d be very—excuse me, super-skeptical of the views of a tech writer who apparently thinks angels make iPhones.

~~~

I have to admit, I was more amused by her piece than anything, and her Twitter exchange with Carl Zimmer left me gasping; to the extent I can make out any kind of coherent line at all, it seems to be “I like stories more than theories—so there!”

As someone who likes both stories and theories—yes, Virginia, we can have both—I hate her feeding into the Two Cultures divide, not least because dopey angel-mongering tends to diminish the humanities even further.

I am a science enthusiast, but I am also a critic of some of the more imperial epistemological claims made by some scientists (what often gets branded as “scientism”). To note that the methods of science (methodological naturalism, nomological-deductivism—take yer pick) and the knowledge produced from those methods are bounded is often taken as an attack on science itself.

And, to be fair, sometimes—as in the Storified Twitter spat, when Heffernan (big fat honking sigh) pulls Foucault out her nose to fling at Zimmer—it is.

But it ain’t necessarily so. It is simply the observation that science is one kind of practice, that it hasn’t escaped the conditionality and history of practice into some kind of absolute beyond.

Now, there’s a lot more behind that observation than I’m willing to go into at this late hour, so allow me to skip ahead to my ire at Heffernan: her dipshit argument makes it harder for those of us who’d prefer our critiques both dip- and shit-free.

So, thanks Virginia, thanks for stuffing your face with shrooms or replacing your neurons with helium or whatever the hell it was that led you to declare the moon is made of cheese.

I’m a big fan of science, and an increasingly big fan of science fiction.

I do, however, prefer that, on a practical level, we note the difference between the two.

There’s a lot to be said for speculation—one of the roots of political science is an extended speculation on the construction of a just society—but while I am not opposed to speculation informing practice, the substitution of what-if thinking for practical thought (phronēsis) in politics results in farce, disaster, or farcical disaster.

So too in science.

Wondering about a clean and inexhaustible source of energy can lead to experiments which point the way to cleaner and longer-lasting energy sources; it can also lead to non-replicable claims about desktop cold fusion. The difference between the two is the work.

You have to do the work, work which includes observation, experimentation, and rigorous theorizing. You don’t have to know everything at the outset—that’s one of the uses of experimentation—but to go from brain-storm to science you have to test your ideas.

Biologist George Church thinks synthesizing a Neandertal would be a good idea, mainly because it would diversify the “monoculture” of Homo sapiens.

My first response is: this is just dumb. The genome of H. sapiens is syncretic, containing DNA from, yes, Neandertals, Denisovans, and possibly other archaic species, as well as microbial species. Given all of the varieties of life on this planet, I guess you could make the case for a lack of variety among humans, but calling us a “monoculture” seems rather to stretch the meaning of the term.

My second response is: this is just dumb. Church assumes a greater efficiency for cloning complex species than currently exists. Yes, cows and dogs and cats and frogs have all been cloned, but over 90 percent of all cloning attempts fail. Human pregnancy is notably inefficient—only 20-40% of all fertilized eggs result in a live birth—so it is tough to see why one would trumpet a lab process which is even more scattershot than what happens in nature.

Furthermore, those clones which are successfully produced nonetheless tend to be less healthy than the results of sexual reproduction.

Finally, all cloned animals require a surrogate mother in which to gestate. Given the low success rates of clones birthed by members of their own species, what are the chances that an H. sapiens woman would be able to bring a Neandertal clone to term—and without harming herself in the process?

I’m not against cloning, for the record. The replication of DNA segments and microbial life forms is a standard part of lab practice, and replicated tissues and organs could conceivably have a role in regenerative medicine.

But—and this is my third response—advocating human and near-human cloning is at this point scientifically irresponsible. The furthest cloning has advanced in primates is the cloning of monkey embryos; that is, there has been no successful reproductive cloning of a primate.

To repeat: there has been no successful reproductive cloning of our closest genetic relatives. And Church thinks we could clone a Neandertal, easy-peasy?

No.

There are all kinds of ethical questions about cloning, of course, but in the form of bio-ethics I practice, one undergirded by the necessity of phronēsis, the first question I ask is: Is this already happening? Is this close to happening?

If the answer is No, then I turn my attention to those practices for which the answer is Yes.

Cloning is in-between: It is already happening in some species, but the process is so fraught that the inefficiencies themselves should warn scientists off of any attempts on humans. Still, as an in-between practice, it is worth considering the ethics of human cloning.

But Neandertal cloning? Not even close.

None of this means that Church can’t speculate away on the possibilities. He just shouldn’t kid himself that he’s engaging in science rather than science fiction.

Astronomers searching for the building blocks of life in a giant dust cloud at the heart of the Milky Way have concluded that it would taste vaguely of raspberries.

Ian Sample of the Guardian reports that after years of pointing their telescope into the nether regions of the ‘verse,

astronomers sifted through thousands of signals from Sagittarius B2, a vast dust cloud at the centre of our galaxy. While they failed to find evidence for amino acids, they did find a substance called ethyl formate, the chemical responsible for the flavour of raspberries.

“It does happen to give raspberries their flavour, but there are many other molecules that are needed to make space raspberries,” Arnaud Belloche, an astronomer at the Max Planck Institute for Radio Astronomy in Bonn, told the Guardian.

Curiously, ethyl formate has another distinguishing characteristic: it also smells of rum.

No, I haven’t given up on my attempt to make sense of the outer reaches of modernity by looking at the [European] origins of modernity, but I haven’t made much headway, either.

Oh, I been readin’, oh yeah, but have I done anything with all that reading? Not really. Beyond the most basic fact that modernity and secularism two-stepped across the centuries, as well as the sense that medievalism lasted into the 20th century, I have information, history, ideas—but no theory.

Peter Gay’s two-volume essay on the Enlightenment (called, handily enough, The Enlightenment) has been helpful in understanding how the ideas of the early modern period were cemented in intellectual thought, but precisely because these men were already modern, they are of less help in understanding those who became modern, or who were medieval-moderns.

Newton, for example, was a kind of medieval-modern. His work in physics, optics, and calculus encompasses a large portion of the foundation of modern science, but he also conducted experiments in alchemy; the founding of a new kind of knowledge had not yet erased the old.

Other, scattered thoughts: The Crusades were crucial in re-introducing into Europe the ideas of the ancient Greeks. . . although, even here, al-Andalus also provided an entree for Muslim knowledge of and elaboration on Levantine thought into a Christian worldview. Also, I haven’t read much on the impact of westward exploration and colonization on European thought. Hm.

Evolution in war strategy and armaments—I’m thinking here of the recruitment and consolidation of armies—undoubtedly played a role, as did consequences of those wars, especially the Thirty Years War. (The Treaty of Westphalia is commonly considered an origin in the development of the concept of state sovereignty. Which reminds me: Foucault.)

What else. I haven’t read much in terms of everyday life during this period, although I do have Braudel and Natalie Zemon Davis on my reading lists. I’m still not sure where to put the on-the-ground stuff, interested as I am in intellectual history. Still, a concentration on thoughts untethered from practice yields shallow history.

I have developed an abiding hatred for the Spanish Empire. This may be unfair to the Spaniards, but they turn up again and again as the bad guys. (How’s that for subtle interpretation?) I’ve got a big-ass book on the history of the Dutch Republic that I’m looking forward to, not least because of the central role of the Dutch in the development of capitalism.

Still, I really want to try to nail down the emergence of the individual as a political subject: there is no modernity without this emergence. The Reformation and the wars of religion are crucial, of course, but I want to understand precisely how the connection was made between the individual and his relationship to God and the emergence of the concept of the individual citizen’s relationship to the state. (I say concept because it took a while for the walk to catch up to the talk.)

I have no plans to abandon this project, but if I can’t get it together, I may have to abandon my hopes for this project.

Maybe I should do that sooner rather than later: I’m always better after I’ve left hope behind.