Monday, March 28, 2011

If doing biology is impossible without assuming biological evolution, as some people (including at least one commenter on this blog) contend, and if it is also true that biological evolution can only be proven by an appeal to biology, then how does the evolutionary biologist avoid assuming what he's trying to prove?

Sometimes it seems like your worldview comes down to a simple attitude. Here is Richard Dawkins, being interviewed by Germany's Spiegel Online about how viewing the world as poetic and beautiful is perfectly consistent with its being explicable solely in scientific terms:

SPIEGEL ONLINE: What emphasis did you have in mind for the book [The Greatest Show on Earth]?

Dawkins: A positive, almost romantic view of life as something that is beautiful and explicable and beautiful because it is explicable. But there is the negative side, as well. It is an attempt to disabuse people, especially in America, but also in other parts of the world, who have become influenced by fundamentalist religion into thinking that life can be and should be explained as all designed. I regard that as a lazy and unhelpful explanation as well as an untrue one.

SPIEGEL ONLINE: You never experienced a religious phase in your life?

Dawkins: Of course. I was a child, wasn't I?

SPIEGEL ONLINE: You think religion is something we should move beyond as we enter adulthood?

Dawkins: You know what St. Paul said: When I was a child, I spoke as a child. But when I became a man, I put away childish things.

Contrast Dawkins's emphasis on giving up the child's view of the world, which attributes personality and design to it, with Chesterton, who views the same world but believes there must be more to it than bland physical mechanism for it to be poetic and beautiful, that the world's being physically explicable does not mean it is merely mechanism, and that there is nothing wrong with viewing it as a child does:

But when I came to ask them I found they had really no proof of this unavoidable repetition in things except the fact that the things were repeated. Now, the mere repetition made the things to me rather more weird than more rational. It was as if, having seen a curiously shaped nose in the street and dismissed it as an accident, I had then seen six other noses of the same astonishing shape. I should have fancied for a moment that it must be some local secret society. So one elephant having a trunk was odd; but all elephants having trunks looked like a plot. I speak here only of an emotion, and of an emotion at once stubborn and subtle. But the repetition in Nature seemed sometimes to be an excited repetition, like that of an angry schoolmaster saying the same thing over and over again. The grass seemed signalling to me with all its fingers at once; the crowded stars seemed bent upon being understood. The sun would make me see him if he rose a thousand times. The recurrences of the universe rose to the maddening rhythm of an incantation, and I began to see an idea.

All the towering materialism which dominates the modern mind rests ultimately upon one assumption; a false assumption. It is supposed that if a thing goes on repeating itself it is probably dead; a piece of clockwork. People feel that if the universe was personal it would vary; if the sun were alive it would dance. This is a fallacy even in relation to known fact. For the variation in human affairs is generally brought into them, not by life, but by death; by the dying down or breaking off of their strength or desire. A man varies his movements because of some slight element of failure or fatigue. He gets into an omnibus because he is tired of walking; or he walks because he is tired of sitting still. But if his life and joy were so gigantic that he never tired of going to Islington, he might go to Islington as regularly as the Thames goes to Sheerness. The very speed and ecstacy of his life would have the stillness of death. The sun rises every morning. I do not rise every morning; but the variation is due not to my activity, but to my inaction. Now, to put the matter in a popular phrase, it might be true that the sun rises regularly because he never gets tired of rising. His routine might be due, not to a lifelessness, but to a rush of life. The thing I mean can be seen, for instance, in children, when they find some game or joke that they specially enjoy. A child kicks his legs rhythmically through excess, not absence, of life. Because children have abounding vitality, because they are in spirit fierce and free, therefore they want things repeated and unchanged. They always say, "Do it again"; and the grown-up person does it again until he is nearly dead. For grown-up people are not strong enough to exult in monotony. But perhaps God is strong enough to exult in monotony. It is possible that God says every morning, "Do it again" to the sun; and every evening, "Do it again" to the moon. 
It may not be automatic necessity that makes all daisies alike; it may be that God makes every daisy separately, but has never got tired of making them. It may be that He has the eternal appetite of infancy; for we have sinned and grown old, and our Father is younger than we. The repetition in Nature may not be a mere recurrence; it may be a theatrical encore. Heaven may encore the bird who laid an egg. If the human being conceives and brings forth a human child instead of bringing forth a fish, or a bat, or a griffin, the reason may not be that we are fixed in an animal fate without life or purpose. It may be that our little tragedy has touched the gods, that they admire it from their starry galleries, and that at the end of every human drama man is called again and again before the curtain. Repetition may go on for millions of years, by mere choice, and at any instant it may stop. Man may stand on the earth generation after generation, and yet each birth be his positively last appearance.

This was my first conviction; made by the shock of my childish emotions meeting the modern creed in mid-career. I had always vaguely felt facts to be miracles in the sense that they are wonderful: now I began to think them miracles in the stricter sense that they were wilful.

Saturday, March 26, 2011

In my previous post on the rational shortcomings I thought were displayed by the group of regular Darwinist commenters on my site, I used the term "Peanut Gallery" to refer to this group, and the term "intellectual inbreeding" to describe the cause of their logical troubles.

But "Ozziejoe," a new commenter as far as I can tell, took the exact wording of a criticism I had previously made of another blogger and turned it around on me:

To call those with whom you disagree "Peanut Gallery" and "intellectual equivalent of the (inbred) Hapsburgs" is a clear example of the ad hominem fallacy, which involves a personal attack on the person you disagree with. I don't think Martin means this in any hostile way (in fact, I think he is partly just poking fun), but it is still a logical mistake--and one that has little to do with the merit of whatever position someone holds.

This was clever, I'll have to admit. My hat is off to him (her?).

But he should probably know that I do not use the term "Peanut Gallery" as a term of derision. In fact, I don't know that it was ever much used that way. The "Peanut Gallery" here is a group of hardy secularist souls who stop in almost daily so they can heckle, hoot, and catcall, and I give them pretty wide latitude to do so. I think it is essential to the roisterous ambiance here at Vital Remnants. I use it with a sense of affection.

I don't know what I'd do without them.

These are all people who, if they lived closer, I would be glad to invite over to smoke cigars and argue--something I would enjoy even more, I admit, if they were nonsmokers.

Also, the term has no logical role in my argument. I did not say that they were wrong because they were members of the Peanut Gallery. I merely used it to identify the group to which I was referring. So it could hardly be considered a logical fallacy.

I also said that I thought they suffered, like Darwinists in general, from the ill effects of "intellectual inbreeding," and compared them, metaphorically, to the Hapsburgs, the European royal house whose Spanish line died out with Charles II in 1700 and which, because of marrying cousins and other close relations, accumulated a number of interesting genetic defects, including hemophilia, sterility, and the "Hapsburg lip," a deformity that prevented Charles II from chewing.

I don't have anything against the Hapsburgs. I'm sure they were very nice people. It's just that, when you bumped into them, they bled all over you.

I was not referring to any moral shortcomings on the part of the august members of the group of Regulars here, but merely to the intellectual habit Darwinists generally seem to possess whereby they dismiss out of hand the views of those who aren't intellectually close to them, and refuse to seriously consider the nonconsanguineous ideas of those outside their immediate intellectual relations. It is a habit that results in a strange kind of intellectual sterility. As far as I know, they are all still able to chew, in an intellectual sense, but their intellectual systems have a hard time digesting unfamiliar ideas.

This was in evidence in the post I referred to, when they refused (with the possible exception of Art, who stepped entirely out of his scientistic zone and gave me a syllogism which, however, led nowhere) to offer clear reasons why anyone who held to a creationist position could possibly be considered a critical thinker, other than that they thought creationists were mistaken in their positions.

Again, I was offering this as a metaphorical description of the situation; it was not part of the main argument of my post.

Friday, March 25, 2011

Every week seems to have a theme on this blog. If the big debate in the comments section of my post on Ken Ham was any indication, the theme for this week was whether anyone who holds the belief that the world was created rather than simply having developed through random chance over time can be said to have practiced critical thinking in doing so.

Of course, what constitutes "critical thinking" can sometimes be in the eye of the beholder, but what I mean by it, and what most people seem to mean by it, is that you are logical.

Now I'll say again, as I have said many times before on this blog, that I do not take a position on the age of the earth, largely because I am not a scientist and am not terribly familiar with the scientific issues involved. But because I have made lots of fun of Darwinists who make philosophical assertions disguised as scientific statements, the members of this blog's Peanut Gallery, made up exclusively of people who hold to the dogma of scientific materialism, seem irate that I don't hold a position. They desperately want me to be a creationist, and it really ticks them off that they can't pin it on me.

They want to set up a false dichotomy under which you are either a young earth creationist or a Darwinist. And I have to remind them repeatedly that I am trained primarily in philosophy and I am a Thomist, and, consequently, the issue of the age of the earth does not concern me too terribly much. The only question that really matters to me (other than the fact that there was an initial creation of the world) is whether organic things in this world--and the world itself--have intrinsic natures and purposes--purposes that may or may not have worked themselves out over a long period of time.

But, still, the fact that I don't condemn creationists really gets their goat. They charge that no one who holds to a creationist position can possibly be a critical thinker, and to say that they might be is a mortal intellectual sin.

But the funny thing is that their attempts to prove this fail as proofs: their efforts to substantiate the position that no believer in creation can be a critical thinker lack all the hallmarks of critical thinking.

Their attempts to prove their point are plagued by an inability to make simple distinctions (an ability which, along with the ability to see resemblances, is an essential part of critical thinking), a failure to see beyond their own prejudices, and a penchant for viewing anyone who differs with them as automatically stupid.

Their views bear all the marks of intellectual inbreeding. They are the intellectual equivalent of the Hapsburgs.

I have asked repeatedly for reasons why someone who holds to creationism cannot be logical and I keep getting the same answer: they're wrong. They cannot make the simple distinction between someone who is wrong and someone who is illogical. Now this is pretty basic. It is entirely possible for someone to be both wrong and logical--or for that matter, to be right and illogical. Happens all the time. And anyone who doesn't see this has undermined his own claim to be logical.

Anyone who knows anything about logic knows that you can have a perfectly logical reason for believing in something and be completely wrong. The process of reasoning whereby you get to a conclusion may very well be flawless, but the conclusion can still be wrong. All it takes is a false premise.
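To illustrate (with a made-up example of my own, not anything my critics have said), here is a syllogism in the perfectly valid form Barbara whose conclusion is nonetheless false, because one of its premises is false:

```latex
\begin{array}{ll}
\text{P1:} & \text{All fish can fly.} \qquad \textit{(false premise)} \\
\text{P2:} & \text{All trout are fish.} \\
\hline
\text{C:}  & \text{All trout can fly.} \qquad \textit{(validly drawn, yet false)}
\end{array}
```

The reasoning is flawless; the defect lies entirely in the first premise. Validity concerns the form of the argument, truth the content of its premises.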

This is why a lot of really smart, logical people come down on different sides of the same issues. But people like Singring, One Brow, and Art seem to be having a hard time wrapping their minds around this simple concept, one that anyone with pretensions to rationality ought to be able to grasp fairly easily.

In fact, until Art, a science professor at the University of Kentucky, surprised me the other day and actually produced a logical syllogism, I had concluded that no one in the Peanut Gallery was even capable of one. And Singring can't even tell the difference between contrary statements and contradictory statements. I submit that the champions of scientific materialism in the comments section of this blog who claim to be so rational could not pass a basic, high-school-level, first-year logic exam. And if they think they can, I've got one for them. We can do it right now.

Step right up. See if you can pass a test that my high school home school students ace on a regular basis.
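Since the contrary/contradictory distinction came up, here it is in a nutshell, as any first-year logic text presents it from the traditional square of opposition:

```latex
\begin{array}{lll}
\textbf{Contraries:} & \text{All S is P} \;/\; \text{No S is P} & \text{(cannot both be true; can both be false)} \\
\textbf{Contradictories:} & \text{All S is P} \;/\; \text{Some S is not P} & \text{(exactly one true, the other false)}
\end{array}
```

"All swans are white" and "No swans are white" are contraries: both are in fact false. "All swans are white" and "Some swans are not white" are contradictories: precisely one of them is true.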

This inability (or refusal) to admit that the fact that you disagree with someone is insufficient reason to conclude that they are illogical does have one advantage: it dispenses with the trouble of actually having to engage in any kind of critical thinking yourself.

A strange thing for people who criticize other people for lacking critical thinking skills to do.

Thursday, March 24, 2011

Susan Evans at the Homeschool Channel has written a post entitled "Don't Study Latin," an admittedly provocative title, particularly to those of us who teach it. The post is a representative example of what many people think about Latin, and for that reason alone is worth responding to.

First, she says, "At the risk of ticking people off (and the more you're ticked off, the more it's probably true), I would like to say that people who study Latin are snobs." She invites her reader to throw tomatoes if they disagree with her. I will not stoop to hurling vegetables, but I will point out a few logical problems with her argument.

First, to call those with whom you disagree "snobs" is a clear example of the ad hominem fallacy, which involves a personal attack on the person you disagree with. I don't think Evans means this in any hostile way (in fact, I think she is partly just poking fun), but it is still a logical mistake--and one that has little to do with the merit of whatever position someone holds.

If we were to argue, for example, that all mathematicians were snobs, would that say anything substantive about the truth or usefulness of mathematical constants, differential equations, or the Pythagorean Theorem? Does the personal state of the practitioner of some discipline qualify that discipline in any meaningful way? If Einstein had been caught shoplifting, would that mean that E no longer equaled mc²?

Second, she says, "All I'm saying is that the study of Latin is dreadfully boring. You're punishing your children. Are you just checking off the boxes of what you should do for a classical education just to say you did it? Or worse, to boast about your children?"

The assumption here seems to be that we should not study subjects that are boring, and that the fact that some people might find a particular subject boring is the fault, not of the person studying it, but of the subject itself. Does that mean that if our children find math boring then we should not teach it to them? Or literature or history? Does the merit of the subject depend on how exciting our children happen to find it? And then, of course, there is the matter of whether it really is boring and, if so, why? Is the teaching of English grammar punishing your children? There are many students who find English grammar terribly boring.

Latin is like any other subject in this respect: if taught poorly it is boring; if taught well it is not. But, again, this has little to do with the merit of the subject itself.

In regard to people studying Latin to "check off boxes," Evans does not give us any evidence that this is in fact the reason people study Latin. It is certainly not a reason anyone who promotes the study of Latin would give. Who are these people who are studying Latin to check off boxes? And are there other subjects Evans would find valuable that are also taught in order to check off boxes? If there are, is that a sufficient reason to denigrate those subjects? And again, is Latin any different from any other subject in this regard?

If you are going to criticize something, you need to be careful not to mischaracterize it in the process. This is commonly called a "straw man" argument. It is much easier to attack a mischaracterization of a position than the real position. You owe it to your opponent to deal with the best arguments for his opinion, not the worst ones you can find.

The arguments for studying Latin are:

1. Latin is a better way to understand the grammar of your own language--and the system of grammar in general. It is always better to study grammar in a language other than your own, since you tend to see right through the grammar of the language you already know. It is particularly good to study grammar in an inflected language, since it has an organized noun and adjective system in addition to the organized verb system all languages have. Familiarity with an inflected language is almost the only way to really understand grammatical cases, since, in non-inflected languages like English, Spanish, and French, the cases are essentially invisible. And it is even better to study an inflected foreign language that is also regular. Evans says that if you are going to study a classical language, you should study Greek. Greek would work for the purpose of learning grammar better, but the problem with Greek (and with German and some other inflected languages) is that it has many exceptions to its grammar rules. Latin is the most regular of all inflected languages, and is therefore easier to study.

2. The study of Latin also assists greatly in the development of critical thinking skills. The first reason has to do with its sheer complexity. As with any foreign language studied using a grammar-based approach, it requires the understanding and use of important linguistic distinctions such as person and number. As an inflected language it also involves distinctions of case, gender, and number. There are other sophisticated distinctions involved in Latin such as those between quality and quantity in adjectives, those between the different kinds of ablative cases, and those between the different noun declensions and verb conjugations. Someone has observed that just matching a Latin adjective to the noun it modifies in case, gender, and number involves 17 mental steps. Making distinctions and seeing resemblances are the two basic thinking skills, and Latin is full of them. This may be why there is a high correlation between the study of Latin and high performance on college entrance exams.

3. It is the mother tongue of Western civilization, being the basis for English academic vocabulary in general, and the vocabulary of the sciences in particular. Fully 60 percent of academic vocabulary is Latinate. One- and two-syllable English words are largely Anglo-Saxon, but longer words are mostly derived from Latin. Consequently, a student who knows Latin has a much easier time negotiating academic English. He will know the meaning of words he has never seen before--and more deeply understand the ones he is already familiar with.
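The kind of mental matching described in the second argument above can even be sketched in a few lines of code. This is a toy sketch of my own, with a handful of hypothetical dictionary entries, not real Latin software: to modify a noun, an adjective form must agree with it in case, gender, and number.

```python
# Toy illustration of Latin adjective-noun agreement.
# Each form is tagged with the three features that must match.

NOUNS = {
    "puella":  {"case": "nominative", "gender": "feminine", "number": "singular"},
    "puellas": {"case": "accusative", "gender": "feminine", "number": "plural"},
}

ADJECTIVES = {
    "bona":  {"case": "nominative", "gender": "feminine", "number": "singular"},
    "bonas": {"case": "accusative", "gender": "feminine", "number": "plural"},
    "bonus": {"case": "nominative", "gender": "masculine", "number": "singular"},
}

def agrees(adj_form, noun_form):
    """An adjective modifies a noun only if all three features match."""
    adj, noun = ADJECTIVES[adj_form], NOUNS[noun_form]
    return all(adj[f] == noun[f] for f in ("case", "gender", "number"))

print(agrees("bona", "puella"))   # True: both nominative feminine singular
print(agrees("bonus", "puella"))  # False: gender mismatch
```

Even this toy version forces the student-programmer to make the same distinctions the Latin student makes; the real exercise, done mentally across whole paradigms, is considerably richer.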

Anyone who challenges the wisdom or utility of teaching Latin needs to deal with these arguments.

"Plus," says Evans, "the people of Greece actually speak Greek. I've been to Greece, and I've heard Greek being spoken. It's definitely a live language." This is true. The problem is that the Greek spoken in ancient times is quite different from that spoken today. I remember a Jehovah's Witness coming to my house, who I invited in. His Greek accent was thick enough so that he was hard to understand. When we began discussing the New Testament, I got my Greek New Testament out and we looked at it. He was clearly unable to understand it.Evans says, "The bottom line is this: our time is precious and limited. Don't you want the greatest amount of good done in the least amount of time? If you can actually learn the Latin roots while at the same time learning a real live language that is the second language of our country, why not do it?"

For one thing, simply learning Latin roots outside of the language is an inferior way of learning the roots. You will know the roots much better if you actually study the language: learning something in context always results in learning it better. I agree that our time is precious and limited. So why not study the language that teaches you grammar better, that, because of the mental training, is the best preparation for the later study of logic, and that is the root of the very language we speak--not to mention the foundation for the languages that Evans says we should study instead?

Latin primarily, and Greek secondarily, were once staples of any good school and ceased being so only when they were pushed out by the progressivism and pragmatism that have driven our schools into the ground. They were taught because they were an essential part of the old, classical liberal arts curriculum--a curriculum that prepared students intellectually for any other subject because it trained them in the arts, or skills, common to all subjects. They were part of what we did to pass on the culture of the Christian West to each successive generation.

Wednesday, March 23, 2011

The price tag for Obamacare just went up. Here's the Wall Street Journal on the Congressional Budget Office's new report:

What's $2.3 trillion among friends? That's the canyon between the Congressional Budget Office's estimate of a $9.5 trillion federal budget deficit over the next decade under White House proposals, and the White House's own estimate of $7.2 trillion. The discrepancy emerged in a CBO analysis released Friday, not that it got much media attention.

Tuesday, March 22, 2011

I have propounded my own theory of human development on this blog many times. I believe, as I have said, that the evidence suggests that human beings are evolving into creationists. There are many indications that creationism is on the rise, and that Darwinists are headed for extinction, the victim of their own theory that it is only the fit that survive. I have called their fitness into question on the basis of a disorder known as "Darwinian Intolerance" (DI), a malady which I attribute to close intellectual inbreeding and whose acronym, I am amused to note, is the same as that of the Discovery Institute.

But a monkey wrench, so to speak, has been thrown into the legitimacy of my theory by a prominent creationist, who appears to have contracted a bad case of a related malady. Ken Ham, the proprietor of the Creation Museum in Petersburg, Kentucky, a facility whose right to exist I have defended on a number of occasions against the Darwin Police who have sought to shut the place down, appears to have descended to the same level of intellectual intolerance to which he himself has been subjected by his Darwinist detractors. He has gone after a fellow speaker at a home school convention for not being sufficiently closed-minded about the issue of creation.

Jay Wile, with whom I had the great pleasure of having dinner a couple of times last week at the Southwest Home School Convention, is a science teacher and the author of a set of widely used science textbooks--and a creationist. He blogged about this at Proslogion, defending Peter Enns, a well-regarded Biblical scholar, against Ham's charge that, because Enns does not believe in a literal account of creation, he is a "compromiser" and a theological "liberal."

One of the things you find out pretty quickly from talking with Wile is that he is a practitioner of the arts of critical thinking, logical analysis, and open and objective discussion, things which the New Atheists and other militant Darwinists officially oppose. And he had the temerity to actually apply them to statements which didn't really deserve them.

And to top it off, Wile's defense of Enns itself was attacked by the Hamites, as evidenced by the now 141 comments on this post.

One of the points Wile makes is that, if you are going to categorize everyone who does not believe in a literal 6-day creation as a compromiser and theological liberal, then you're going to have to deal with the consequences, which consist of calling people theological liberals who clearly are not.

Here's the argument:

All people who reject a literal 6-day creation are compromisers and theological liberals.

C. S. Lewis (and Norman Geisler, and Gleason Archer, and William Lane Craig, [provide your own name from the list of prominent orthodox thinkers throughout the history of the Church]) rejects a literal 6-day creation.

Therefore, C. S. Lewis (etc.) is a compromiser and a theological liberal.

The major premise here, which is clearly the one on which the Hamites are operating, leads to obvious absurdity. It's enough to cause a retrograde kink in my theory of the linear evolutionary ascent of creationists over time.

I have a quick and easy way to determine whether someone is orthodox: Go over each point in the Nicene Creed. If they affirm them all (with the possible exception of the "filioque" clause, which was added after the Council and which, on that plausible ground, the Eastern Orthodox reject), then they're orthodox. That was the Church's way of doing it. Might as well get used to it.

[And we note that a literal six-day creation was not one of the things the early Church chose to make a criterion of orthodoxy]

And I have an equally quick way of determining whether they really believe in a literal reading of the Bible: Offer them a bottle of fine French wine. If they refuse on Biblical grounds (despite all those pesky positive portrayals of "wine" that cannot be explained away by tortuous cultural, moral, sociological, and theological reasoning), then they really don't believe in the literal interpretation they profess.

Biofuel policies are hurting the environment and rural people, says the BBC.

European Union policies promoting biofuels are resulting in the displacement of native peoples, the cutting down of forests, a rise in food costs, and an increase in carbon emissions. In Malindi, Kenya, the bulldozers have moved in to clear local people from land that has been handed over to an Italian company to grow Jatropha, a plant that, according to some reports, actually increases carbon emissions.

But it's all okay. The important thing is not that environmental policies actually improve the environment. The most important thing is that liberals feel good about themselves, and this can be accomplished simply by passing laws and implementing policies with really good-sounding titles that make them seem as if they will actually do something.

Then everyone can go home and pretend they really did something, even though they really didn't.

Wednesday, March 16, 2011

The American Constitution Society for Law and Policy seems to think so.

I am quite far behind in my reading of First Things, and so didn't notice Robert George's article late last year on "God and Gettysburg," in which he recounts his surprise when, sitting at a conference and reading a pamphlet put out by the American Censorshi ... er, I mean the American Constitution Society for Law and Policy (ACS), he noticed that two words had been dropped from the Gettysburg Address printed in the pamphlet: "under God."

George first noted that the pamphlet contained a page saying "The printing of this copy of the U.S. Constitution and of the nation’s two other founding texts, the Declaration of Independence and the Gettysburg Address, was made possible through the generosity of Laurence and Carolyn Tribe"--despite the fact that the Gettysburg Address was not one of the "founding documents" of this nation, being written some 87 years after the nation's founding.

But George's most telling remarks concerned the missing words: under God.

After George made the criticism, Caroline Fredrickson, propaganda minister of the ACS, fired back on their blog that George's criticism was a "distraction." A distraction, apparently, is worse than a distortion in the ACS's eyes:

The truth is, five drafts of Lincoln's Gettysburg Address exist, and historians are uncertain about which one Lincoln actually read on the battlefield. Three included references to God and two did not. Which one was the most accurate is not and cannot be known for certain.

The ACS uses the "Hay draft" of the speech, a fact that was only made clear on their website after George made the criticism. George explains each of the five drafts and a bit of their history, and then observes:

Of course, none of these copies is actually the Gettysburg Address. The Gettysburg Address is the set of words actually spoken by Lincoln at Gettysburg. And, as it happens, we know what those words are. (The Bliss copy nearly perfectly reproduces them.) Three entirely independent reporters, including a reporter for the Associated Press, telegraphed their transcriptions of Lincoln’s remarks to their editors immediately after the president spoke. All three transcriptions include the words “under God,” and no contemporaneous report omits them. There isn’t really room for equivocation or evasion: Abraham Lincoln’s Gettysburg Address—one of the founding texts of the American republic—expressly characterizes the United States as a nation under God.

Fredrickson's answer?

George cites the recollections of several reporters of the time who stated that the president included the words "under God" in his remarks. Did President Lincoln improvise and add those words as he spoke? Perhaps! I wasn't at Gettysburg, so I can't be sure that George wasn't. As for the journalists' accounts, it would be interesting to read a history of the Civil War based solely on contemporaneous reports of journalists of the time, which would include countless conflicts, distortions, and inaccuracies. At the very least, honest scholars must acknowledge that wise people have differing views based on the available facts.

Oh, brother.

So you've got a speech that has traditionally included the two words in question, three of the five drafts contain them, and all of the independent contemporary accounts contain them. And you choose the one version that doesn't?

No telling what these people are doing to the Constitution and Declaration.

Tuesday, March 15, 2011

This debate (apparently on William F. Buckley's "Firing Line") features William Lane Craig debating Peter Atkins. Craig makes some of the points I have made on this blog about things that cannot be proven by science. The video does not show Atkins' response, which would be instructive.

Saturday, March 12, 2011

David Bentley Hart comments on Snyder v. Phelps, the Court decision that found in favor of Fred Phelps and the Westboro Baptist Church, whose congregation terrorized, at his funeral, the family of a soldier who died in the line of duty. These are the people who show up at various places with "God hates fags" signs (among others).

The Court ruled 8-1 that the First Amendment protected their behavior. Hart begs to differ:

No competent historian of American jurisprudence could believe for an instant that the authors of the Constitution of the United States ever envisaged an age in which enumerated liberties would be mistaken as writs of absolute license. The guarantee of free speech was certainly never intended as a shelter for any abuse whatsoever of the liberty it granted; it certainly was not meant to protect behaviors other than the unhindered expression of political or philosophical opinion; and it certainly was not meant to prevent the application of decent public prudence in determining what is or is not an intrinsically offensive manner of expressing that opinion.

Of course, the intention of the Founders is not necessarily dispositive in Constitutional interpretation, and Hart acknowledges that, and then goes on to wonder what would happen if we interpreted the 2nd Amendment the way that the Court has interpreted the 1st:

I mean, one can believe that the Constitution truly guarantees an individual right to keep and bear arms while still believing that a person who likes to spend his days on his porch with a hunting rifle menacingly aimed at his neighbor’s children can be restrained from doing so without his constitutional liberty being thereby abridged. But I may be wrong.

He then makes an interesting observation about freedom, based on a Hegelian principle:

What I am quite certain of, however, is that Hegel was essentially right when he pointed out that freedom is a concrete and practical condition. That is, we are free not merely when our wills are subject to no restraint, but when we inhabit a civil society that places quite inviolable boundaries around the areas in which the will operates.

... More to the point, though, freedom is also a communal condition. The measure of how free we truly are is how free we are to live together in communities of shared moral expectation and responsibility, as long as these are just and lawful communities, without fearing the intrusions of those who have no regard for us.

In other words, not only is freedom not inconsistent with restraint, but it requires it.

And can we note (in keeping with this week's "Barbara Forrest is not a logician" theme) that, according to the Barbara Forrest Fallacy, which dictates that anyone who thinks that a position is legally permissible also must believe in that position, we can conclude from the Court's decision that eight of the justices "hate fags"?

Wednesday, March 09, 2011

I don't think that in the six years this blog has been in operation I have ever had to delete a comment for reasons of content. But today I deleted two (by the same author). They were comments on my post about the misinformed and sloppy article by Barbara Forrest in the journal Synthese, an otherwise reputable periodical, claiming that Beckwith is a "creationist."

Forrest is the self-appointed head of the Academic Committee on Unscientific Activities whose wild accusations of scientific subversion have garnered attention in the debate over evolution. And woe be unto the hapless academic who finds himself in the wrong place when she begins pointing fingers, identifying the creationists in our midst.

"I have here a list of 205 names of known creationists in academic departments," she seems to say. Or was it 57? It's hard for her to remember.

In her fevered tirade in Synthese, she identified Beckwith as a creationist enemy of science. Trouble is, not only is Beckwith not a creationist, he isn't even a proponent of Intelligent Design. In fact, his break with the Intelligent Design position, the result of a philosophical conversion to Aristotelian Thomism several years ago (which accompanied his religious conversion to Catholicism at about the same time), is one of the more interesting and widely publicized intellectual stories of the last several years.

Forrest, however, writes as if the whole conversion thing really never happened. In fact, so bad was Forrest's handling of the whole matter in her article that the editors of the journal took the unprecedented step of distancing themselves from the article, publishing a disclaimer. Beckwith calls Forrest's charges "a professional embarrassment" and "philosophical malfeasance." That, quite frankly, is a charitable assessment.

In any respectable movement she would be intellectually shunned.

In any case, one of the more excitable commenters on this blog, "Human Ape" (a self-characterization the accuracy of which I will not challenge) joined in the conspiracy theory excitement and repeated Forrest's demonstrably false charges. Technically speaking, he also knowingly and brazenly (he even announced he was doing it) violated the posting rules. That, I might have let go.

But character assassination based on charges that are publicly known to be false seems to me to be a little over the top.

And speaking of over the top, just go look at what passes for intelligent commentary at Human Ape's blog. It's really something to behold. This is what we're being asked to accept at the cost of forsaking a 2,500-year-old intellectual and cultural tradition. It's not fundamentally different from Forrest's approach--just a little more straightforward.

There comes a point at which your blunders begin to embarrass even your friends. Has it come to that with Forrest? Or are the intellectual standards among the Darwinist alarmists really that low?

Tuesday, March 08, 2011

There are some questions so basic they seem superfluous. The question “What is nature?” is one of these. We never ask it because we think the answer is self-evident.

But is it?

What many of us do not realize is that the nature of nature is far from a settled question. We think it is settled because we live in a time dominated by the physical sciences, which are commonly attended with certain mechanistic assumptions about the natural world which we imbibe by osmosis from our educational and cultural surroundings. We catch them, to use the words of Samuel Johnson, like we catch the common cold: by contagion. We are unfamiliar with how these assumptions came to be and with the ideas they replaced. We know little about the reasons the older assumptions were abandoned or why the new ones took their place. In fact, the older view of nature has, among most of us, been completely forgotten.

Did the understanding of nature change because the new idea was better, or because it better fit with the cultural presuppositions of the time? Was the old idea of nature refuted or did it simply fall out of fashion?

Two Senses of the Word ‘Nature’
The early 20th century British philosopher R. G. Collingwood pointed out in his book, The Idea of Nature, that there are two senses of the word ‘nature.’ The meaning of the word with which we are most familiar is that which signifies the cosmos or the external world: the sum total or aggregate of natural things. The other, older meaning is that which originates with the Ionian Greek philosophers—Thales, Anaximander, and Anaximenes—and signifies the essence or intrinsic principle of a thing: the intrinsic source of its behavior. Alexander Pope writes,

Here Pope is using this newer sense of the word 'nature.' Then we have the anonymous author of a nursery rhyme, who advises:

Dogs delight to bark and bite …
for ‘tis their nature to.

In this case, the word 'nature' is being used in its older sense. The older sense of the word—nature as essence—started with the Greeks, who considered it the primary sense of the word:

This [intrinsic principle of nature] is the only sense it ever bears in the earlier Greek authors, and remains throughout the history of Greek literature its normal sense. But very rarely, and relatively late, it also bears the secondary sense … (Collingwood, The Idea of Nature, p. 44)

And even when the sense of nature as cosmos came into use, the earlier sense informed the Greeks' notion of it. We might call the older sense of the word the philosophical sense, and the newer the scientific sense. This older, classical view of nature tended not so much to ask how nature worked as why it worked the way it did.

What has happened in modern times is the philosophical sense has been subordinated to the scientific sense—if not eliminated entirely. The shift in terminology—and in world view—is easily visible in the hindsight of history.

What is the Classical View of Nature?

Every belief operates on the basis of some basic metaphor or analogy. For the Greeks the analogy by which they viewed nature was the analogy of an organism: a living whole with a purpose, each of whose parts contained within it a purpose of its own. Things in nature, whether they were living or not, were like a heart, or a kidney, or a set of lungs: they all served some purpose in the whole, and functioned in a way commensurate with that purpose.

Motion, for example, was the result of something intrinsic to a thing making it move; it was never imposed from without. It was natural motion. When a thing was dropped, it fell to the earth—not because of any "law of nature" (a phrase characterizing the modern view of nature), but because it was the nature (used in the older sense of essence) of the thing to fall downward, and it would continue to move that way unless and until something blocked its natural tendency.

Furthermore, this purpose in nature was like a mind. In fact, nature was permeated by Mind, which was evident through its regularity and orderliness. The repetitions in nature were not the effects of the dead clockwork which the modern mechanistic view articulated through its "laws of nature," but of some seemingly living personality.

Behind nature was not a law, but a will.

The Four Causes

When questions were asked about nature, they were viewed in the light of four questions, the answers to which were called the “four causes.” The four questions were:

What kind of thing is it? (the “formal” cause)

What is it made of? (the “material” cause)

What brought it about? (the “efficient” cause)

What is it for? (the “final” cause)

If you could answer these four questions, then you knew what something was.

Some people are old enough to remember their mother sewing their clothes. My mother would buy a pattern. She would lay it on top of the cloth, and cut the cloth out in the shape of the pattern to produce the dress or the shirt she was making so that I or my sister would be properly clothed.

We can see the four causes at work in this process: the pattern, which determines what kind of thing it is, is the formal cause; the cloth is the material cause—what the dress or shirt is made of; the efficient cause—what brought it about—was my mother; and the final cause—what the dress or shirt was for—was to clothe me and my sister.

The things of nature are viewed much the same way in the classical view. Each thing has an eternal pattern, and it is made of something, by something, and for something. The chief difference, however, between the process of making a piece of clothing—or any other human artifact—and God making the universe is that when my mother made the garment, the only thing intrinsic in the garment was the cloth. Everything else was imposed on the cloth from without. The formal cause—the pattern, the efficient cause—the maker, and the final cause—the purpose, were outside the garment. They were extrinsic to it. But when God created the world, according to this older view, he was able to place all of these things in the world itself.

Unlike my mother, who can only impose these things extrinsically, God can put the eternal patterns of things into them. He can also put a bit of himself into them—he is immanent in them, a theologian would say. Finally, he can put the purpose of a thing into the thing itself.

Aristotle used the example of a shipbuilder. A human shipbuilder imposes his design on the ship from without. The material of the human-designed ship has no inherent desire to be part of the ship: the wood bends into the form of the ship because someone (the designer outside the ship) forces it to. The wood itself has rather the opposite tendency; left to its own ways, it decays as wood does and returns to the earth. It doesn't maintain itself as, say, a tree does, but requires a craftsman to repair it by constantly reimposing the form of the ship on the wood.

An ancient Greek story tells of the construction of the Argo, the ship that would convey Jason and the Argonauts on their quest for the golden fleece. It was made of wood, but its keel was fashioned by Pallas Athene from the Singing Oak of Dodona, whose song, it was said, related the prophecies of Zeus. The Greek poet Apollonius Rhodius says of the launching of the Argo, as its sails caught the first burst of wind:

What if, like the prophecies of Zeus, the design of the shipbuilder could itself be put into the ship? What if, asked Aristotle, the wood was not just fashioned from without? What if the design of the shipbuilder could actually be put into the very wood itself, which willingly conformed itself to the purpose of the shipbuilder? What if, in some sense, the shipbuilder was in the ship?

It is within this pre-modern philosophical framework that Christian theologians once talked about God being both transcendent and immanent—both outside the world, as Creator, and in it, since he put his very purpose and design into it.

But the modern mechanistic view of nature sees this process very differently. Instead of seeing nature through the analogy of an organism, the modern view sees it in terms of a machine. In this mechanistic world view, there are only two causes, and even one of these must be redefined. Only the material and efficient causes—what a thing is made of and what brought it about—survive the onslaught of secular scientism in the 17th and 18th centuries. Formal and final causes are rejected completely, and even the efficient cause is changed to such an extent that it bears no resemblance to what Aristotle conceived.

The design of nature’s ship is not in the ship itself.

The fault lines between the classical and modern mechanistic views of nature can be seen in three questions to which these two views give completely different answers. The older, classical view asked three important questions: “What is nature?” “How is nature metaphysically ordered?” And, perhaps most importantly, “What is nature for?” The first two questions have to do with a thing's formal cause, its pattern, its ontology; the third with its final cause, its purpose, its teleology.

What is Nature?

In regard to the first question—"What is nature?"—the classical view assumed first that things had natures or essences. A man had a human nature, and a dog a dog nature, and trees had tree natures, and so on. In the thought that derived from Plato, the things in this world—men and dogs and trees, for example—were replicas or imitations of natures or essences, the perfect forms of which existed in heaven. Here in this world there were men and dogs and trees, but in heaven was manness, and dogness, and treeness. The forms in heaven were the perfect models of all the imperfect replicas of them here on earth.

In later, Aristotelian thought, the natures, or essences, were not in some heavenly realm but existed in the things themselves. Human nature was in every human, dog nature was in every dog, and tree nature was in every tree. For both Plato and Aristotle, if every human were to die and none were left, there would still exist a human nature, like an eternal pattern, timeless and immaterial.

The modern mechanistic view rejects this idea. Modern thought is nominalistic: it rejects the idea of eternal natures or essences. When a classical thinker says, “Man is a creature of God,” he is referring to all the individual incarnations of humanity—the beings who have been infused with a human nature that is common to all men. When a modern mechanistic thinker refers to men, he is simply referring to all featherless bipeds, those vertebrate creatures who happen to share certain characteristics, like having two arms and two legs, ten fingers and toes, two eyes, a nose, and a mouth, and whose cranial capacity enables them to outthink their mammalian rivals—and who, anyway, are on their way developmentally to being something else.

Many of our modern debates find their origin in this modern rejection of natures and essences. When debates arise over abortion or cloning, for example, the issue inevitably settles on the question of whether the unborn child or the cloned human is a human person. This is simply the old debate about the classical, or “realist” view about natures, and the newer, nominalistic view in another guise. What one is really asking is, “Does the fetus—or the cloned baby—have a human nature?” But the argument takes place between people whose views of nature are worlds apart, and so the opponents argue completely past each other.

How is Nature Ordered?

In regard to the second question—“How is nature metaphysically ordered?”—the classical belief was that nature is hierarchical: there were some things that were more important than other things in terms of their metaphysical significance. Minerals were at the bottom, then plants, then animals, and then, at the top of the natural hierarchy, was man.

In his book The Beginning of Wisdom, Leon Kass points out that the creation story—whatever you may believe about what it says concerning the temporal order of the creation—says something very definite about the metaphysical order of nature. “The order of the cosmos is not only supremely intelligible,” says Kass, “it also appears to be hierarchic.” It is not the biology of events that is important to the classical view (since it is philosophical rather than “scientific”), but their cosmology: “In the cosmology of Genesis,” says Kass, “human beings clearly stand at the peak of creation.”

This view was universally accepted by the Greeks. “Numberless are the world’s wonders,” says the Greek playwright Sophocles, “but none more wonderful than man.”

The idea of metaphysical order is a hard one for some people to comprehend. I once announced to one of my classes that the earth was the center of the universe. The declaration was met with indignant guffaws. How, my students asked, could I make such an ignorant statement? Copernicus had shown, hadn’t he, that the earth revolved around the sun? And hadn’t later scientists proved that the earth is just one astronomical body among billions in an endless cosmos? On what grounds could I hold that the earth was the center of the universe?

“It’s simple,” I said. “The earth is the center of the universe because …” (I paused for dramatic effect) “because that’s where everybody is.”

This caused quite a commotion. When I said that the earth was the center of the cosmos, I was making a metaphysical, not a scientific, statement. I was saying that the most important place—the place where dwelt those creatures created in the image of the eternal God—was earth. My students, on the other hand, thought I was making a “scientific” statement—which is unanswerable after Einstein’s dismissal of absolute space anyway.

The idea that there are certain things that are more metaphysically significant than others runs into trouble under the modern view. The only way one thing can be discriminated from another is its material composition—that and the arrangement of its components. But since everything is made from fundamentally the same sorts of things—those things on the periodic table of elements, for example—there is no basis upon which we can say that one thing is “higher” or “lower” than another.

Charles Darwin once scrawled in the margin of one of his notebooks, “Never say ‘higher’ or ‘lower.’” Why? Because to Darwin, a modern mechanistic thinker, the only difference between the creatures referred to as “higher,” like humans, and “lower,” like ants, is their level of biological complexity. The trouble is that once you accept the idea that the only distinction between biological creatures is their physical complexity, you completely undercut any basis for human rights. Why should we value one creature more than another simply because it is more complex? What is there in the idea of complexity that confers any value at all?

The irony is that, while we talk more and more about things like human rights, we cling more and more to a world view in which human rights make no sense. In his book, The Abolition of Man, C. S. Lewis remarks:

[W]e continue to clamour for those very qualities we are rendering impossible….In a sort of ghastly simplicity we remove the organ and demand the function … We laugh at honour and are shocked to find traitors in our midst. We castrate and bid the geldings be fruitful…

What is Nature For?

The third, or teleological, question—“What is nature (and what are natural things) for?”—has to do with the purpose of the universe and the things that make it up—a thing's final cause. In the classical view, things not only have a purpose, but a purpose intrinsic to them. It is not imposed from without, but informs each thing. The intrinsic purpose of an acorn is to become an oak tree. The intrinsic purpose of a puppy is to become a full-grown dog. The intrinsic purpose of an infant—or a fetus—is to become an adult human being.

But the modern mechanistic view rejects this too. Things have no inherent purpose. The crucial blow against final causes came during the scientific revolution, when early scientists such as Galileo, Kepler, and Newton became party to the expulsion of final causes from things themselves. Instead of purpose residing in things, it was placed in the hands of some outside intelligence.

Early in the scientific revolution, when the scientists were still self-consciously Christian, this outside purpose was seen to derive from God Himself. This is why the Enlightenment saw the rise of Deism which, in effect, saw God as the divine mechanic. He set the machine of nature in motion, and might even occasionally tweak it. The purpose of an acorn to become an oak tree was no longer in the acorn; the purpose of a dog was no longer in the dog; the purpose of the man was no longer in the man—but was imposed from without by God.

The second blow falls with Darwin. Still operating generally within the mechanistic view, Darwin places an emphasis on process. The purpose now is even taken out of God’s hands. There is no purpose in a thing, and there is no purpose imparted by God. The only “purpose” is the process itself. There is no purpose for Darwin outside natural selection. Natural selection provides its own purpose. As the evolutionary scientist Kenneth Miller says, “the design is the process.” But such a “purpose” is very different from the purpose of the old view. Purpose in the old view was an intelligent purpose, buried in nature, but originating in a mind—for the Christian, the personal intelligence of God himself. The purpose of Darwinism does not come from an intelligence—much less a personal intelligence.

The rejection of purpose, like the rejection of essences in nature and of the metaphysical ordering of nature, has ramifications. Under the classical view of nature, a thing could be judged according to whether it accomplished its purpose. If an acorn failed to fully accomplish the purpose of an oak—for shade or for wood or for simply propagating other oaks—it was judged deficient in being an oak. If a puppy failed to fully accomplish the purpose of a dog—to herd, to protect, or to provide companionship—it was judged deficient in being a dog. If a human never accomplished his purpose—say, to glorify God—it was deficient in being a human.

Under the modern view however, the only purpose a thing has is the one we individual humans give it. The prow of our ship no longer sings, but is now silent, and we are set adrift in a world devoid of meaning and purpose.

Or are we?

The irony is that the modern view grew out of the older, classical view. The advances in science which grew out of the Renaissance would have been impossible outside of the view of the world as inherently meaningful, orderly, and purposive. The modern view in this regard is like an ungrateful and rebellious child—denying his ancestry and taking sole credit for all his successes.

And, in fact, not much has changed. The modern view, in denying its origins, undermines even itself.

The historical rejection of the classical view was, said the philosopher Alfred North Whitehead, “through and through an anti-intellectualist movement.” As Edward Feser points out in his book The Last Superstition:

Indeed one comes to realize that the very possibility of reason and morality is deeply problematic at best on a modern naturalistic conception of the world, but perfectly intelligible on the classical philosophical worldview and the religious vision it sustains.

Abandoning this older view, says Feser, “was the single greatest mistake in the history of Western thought.” And he may be right.

Francis Beckwith has posted several excerpts from his response to Barbara Forrest in the journal Synthese. Forrest, the crusader against Intelligent Design and logically-challenged scourge of imaginary creationists everywhere, is bound and determined to prove that Beckwith is a creationist, despite the fact that, like, he's not.

One of her arguments is that, since Beckwith thinks that constitutionally-based arguments against teaching about Intelligent Design in schools are unsound, he therefore must agree with Intelligent Design. Of course, that doesn't logically follow, but Forrest somehow finds it compelling.

I also have a response to Forrest in the works, focused, of course, on her sloppy reasoning. I'm sorry, but it's just hard for me to believe, given the really bad reasoning that characterizes virtually everything she writes, that the woman actually got a Ph.D. in philosophy. This kind of thing wouldn't have passed muster in an undergraduate paper where I come from.

But her non-sequiturs are apparently real crowd pleasers at places like Panda's Thumb, don't you know.

Beckwith's blog article on his response is here. Edward Feser's comment is here.