The Left has traditionally assumed that human nature is so malleable, so perfectible, that it can be shaped in almost any direction. By contrast, a Darwinian science of human nature supports traditionalist conservatives and classical liberals in their realist view of human imperfectibility, and in their commitment to ordered liberty as rooted in natural desires, cultural traditions, and prudential judgments.

Tuesday, November 22, 2016

Good health is a precondition for a good life. If we are crushed by disease, injury, or undernourishment, we cannot live a happy and flourishing life. Throughout most of human history, most human beings were disabled in their lives or died prematurely from poor health. Although this is still true today for many people around the world, over the past two hundred years, human life has become healthier on average than it ever was before 1800. This has been caused by the increasing freedom, knowledge, and technology coming from the Liberal Enlightenment. (See Max Roser's survey of the data for "Global Health.")

One reminder of the catastrophic effects of epidemic diseases in human history is Thucydides' account of the plague in Athens during the Peloponnesian War.

The Peloponnesian War between Athens and Sparta and their allied cities began in the summer of 431 BCE. Athens was led politically and militarily by Pericles. In the winter of 431, he delivered the Funeral Oration for those Athenians who died in the first year of the war. As reported by Thucydides, in his history of the war, Pericles' speech was a celebration of the power and the virtues of the Athenians as shaped by the freedom they enjoyed in their democracy.

As I have indicated in a previous post, ancient Athens manifested some of the economic, social, and political success of the Liberal Enlightenment, although it did not achieve the uniquely self-sustained and accelerating growth of the Great Enrichment in northwestern Europe and North America at the end of the 19th century.

In the first days of summer in 430 BCE, the Spartans and their allies invaded Attica once again, initiating the second year of the war. But Thucydides passes over this quickly in one sentence, so that he can turn immediately to a careful and dramatic description of the plague that began to appear in Athens that summer (2.47-58). The juxtaposition jolts the reader: Pericles' beautiful funeral speech is followed immediately by an ugly account of a plague in which bodies were left unburied. The implied question is the human meaning of death: death in war can be glorious, but death in a plague is not.

The plague killed Pericles, and for that reason alone it might have turned the course of the war against Athens. Thucydides himself had the disease, but he was one of the lucky ones who recovered, which gave him immunity and allowed him to study the disease in others.

His scientific attitude in his careful observation and recording of the disease might show the influence of Hippocrates, who stressed the importance of recording clinical histories of diseases and of looking for natural causes rather than superstitiously assuming that there are divine causes. Hippocratic medical science was limited, however, by the ancient Greek taboo against the dissection and autopsy of human bodies, so that Greek inferences about human anatomy came from the dissection of other animals. It was also limited by the absence of microscopes for seeing microorganisms, so that the bacterial and viral causes of disease were invisible.

Thucydides says that physicians were completely ignorant of the causes of the disease, and they knew no way to treat it. In fact, the physicians commonly died from the disease because they had contact with the sick. Not only did all human arts fail to stop or slow the disease, even supplications in the temples of the gods and divinations failed, and eventually people stopped appealing to the gods.

Thucydides traces the path of the plague from sub-Saharan Africa to Egypt and Libya and then through Persia to Athens. Leaving to others any speculations about the causes of the disease, he proposes only to lay out the symptoms of the disease so that it can be recognized if it ever breaks out again.

The mortality rate was high, and victims generally died on the seventh or eighth day after first contracting the disease. But those who survived were protected from reinfection.

People in good health suddenly developed headaches, along with inflammation and redness of the eyes, bleeding from the mouth, small pustules and ulcers over the body, vomiting, diarrhea, stomach pain, violent spasms, and unquenchable thirst. They were never able to rest or sleep. Many lost their fingers, their toes, or their genitals to the violent swelling and ulceration. Some lost their memory, so that even if they survived, they did not know themselves or their friends.

People died alone, because their family and friends were afraid to care for them for fear of contracting the disease themselves. The few who were good enough to care for the victims often died as a result. The most caring and compassionate were those who had recovered from the disease, since they knew they could tend the sick without fear of being attacked by it again.

Bodies were thrown into piles to be burned without ceremony. Many lay unburied.

When the sick thought they were dying, they became utterly lawless: since they saw themselves as already under a sentence of death, they had no fear of either human law or divine law. They ceased worshipping the gods, because they saw no benefit in doing so.

Some people thought this plague was the fulfillment of ancient prophecies and oracles foreseeing that such a pestilence would be inflicted on the Athenians to give victory to the Spartans. But Thucydides was skeptical about this.

Thucydides says that the plague was "too much for human nature to endure" and "stronger than reason" or "beyond rational explanation" (kreisson logou).

The dark, pessimistic mood of this description of the plague in Athens was recreated by Lucretius at the end of his De Rerum Natura, where he recounted the story of the Athenian plague as told by Thucydides. Many readers have found it strange that Lucretius chose to end his book this way, with a sad rather than a happy ending, since the argument of the book is that the Epicurean philosophic teaching allows us to avoid any fear of death that would ruin our happiness in life. It is odd, then, that Lucretius does not suggest that Epicurus or an Epicurean would have withstood the horrible circumstances of the plague any better than anyone else.

Lucretius presents the plague at the end of his book as if it were the end of the world caused by natural causes. As I have indicated in previous posts (with links here), Lucretius taught that since the cosmos was not designed by providential gods who care for human beings, the cosmic conditions necessary for human life are not eternal, and thus the human world must someday come to an end, and human life on Earth will be extinguished. For Leo Strauss and the Straussians, this is "the most terrible truth."

While Thucydides had no rational explanation for the plague, modern historians and scientists have offered a wide variety of diagnoses: epidemic typhus, anthrax, typhoid fever, bubonic plague, smallpox, measles, or toxic shock syndrome. Most recently, some researchers have argued that the clinical and epidemiologic features of the disease as described by Thucydides conform best to Ebola, which was first recognized in humans in 1976, and which appeared recently in an outbreak in sub-Saharan Africa in 2014-2016. (See Powel Kazanjian, "Ebola in Antiquity?", Clinical Infectious Diseases 61 [2015]: 963-68.) Ebola is a deadly viral disease that kills about 50% of those who contract it. The ancient Greeks could not have understood such a disease, since the virus cannot be seen with the naked eye. But while modern science can explain the disease, there is so far no known cure. Understanding the disease and how it spreads through contact with bodily fluids does at least allow for containing it, and the recent epidemic was brought under control by the spring of this year.

Other epidemic diseases that have ravaged human life throughout history have been brought under control or even largely eliminated through modern scientific technology. For example, the bubonic plague is now understood to be caused by the bacterium Yersinia pestis, which is transmitted by fleas carried by rodents. Originating in China, the bubonic plague was responsible for three long-lasting epidemics in Europe: the Justinian Plague (the 6th through the 8th centuries), the Black Death (from the mid-14th century to the Great Plague of London in 1665), and the third pandemic at the end of the 19th century. The Black Death killed 30%-60% of Europe's total population. Although it could still become a major health threat, the bubonic plague has been controlled by insecticides, antibiotics, and a plague vaccine. Some epidemic diseases that claimed hundreds of millions of lives over human history have been completely eradicated, such as smallpox; others, such as malaria, have been greatly reduced, though not yet eliminated.

The Liberal Enlightenment has promoted the knowledge and the technology that have made this human progress in health possible. It has also promoted the statistical knowledge that makes it possible to measure this progress precisely.

As Max Roser indicates in his article on the "Burden of Disease," the Global Burden of Disease Project (GBD) of the Institute for Health Metrics and Evaluation measures the Disability Adjusted Life Years (DALYs) lost per 100,000 people. A DALY is the sum of years of life lost to premature mortality and years of healthy life lost to disability.

So, for example, the DALYs for Ebola in 2015 were zero for almost all nations, but they were 2,734.33 for Sierra Leone, 1,424.39 for Liberia, and 432.91 for Guinea. The reduction in the damage from typhoid fever over the past 25 years can be measured by the DALYs for this disease. In 1990, the DALYs for India were 928.45 and for Burkina Faso 1,232.25. In 2015, the DALYs were 436.68 for India and 587.97 for Burkina Faso. The DALYs for the United States, Great Britain, and many other countries were almost zero.
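As a rough illustration of that definition, here is a minimal Python sketch of the DALY arithmetic. Every input number below is hypothetical, chosen only to show the calculation; the sketch ignores the age-weighting and severity details of the GBD's actual methodology:

```python
# DALYs = YLL (years of life lost to premature death)
#       + YLD (years of healthy life lost to disability).
# All inputs below are hypothetical, for illustration only.

def dalys_per_100k(deaths, years_lost_per_death,
                   cases, disability_weight, years_disabled,
                   population):
    """Return DALYs lost per 100,000 people."""
    yll = deaths * years_lost_per_death
    yld = cases * disability_weight * years_disabled
    return (yll + yld) / population * 100_000

# A hypothetical outbreak in a population of 5 million:
# 2,000 deaths, each cutting a life short by 30 years, plus
# 8,000 survivors with a 0.2 disability weight for 1 year.
print(dalys_per_100k(2_000, 30, 8_000, 0.2, 1, 5_000_000))  # 1232.0
```

A burden of that magnitude is comparable to the Sierra Leone figure cited above, which gives a feel for how severe the 2014-2016 Ebola epidemic was there.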

Friday, November 18, 2016

If evolutionary success is measured by high rates of survival and reproduction, leading to a growing population, then the human species has been amazingly successful over the last two hundred years. (See Max Roser's article on "World Population Growth.")

Some historians have estimated that in 10,000 BCE the world population was 2.4 million. By 1000 CE, it was 295 million. By 1800, it was 900 million. So, for thousands of years, the human population grew, but very slowly: the annual growth rate was probably never more than 0.5%.

But after 1800, the annual growth rate increased to 0.8% by 1900 and to 2.2% in 1962, the highest rate of growth in human history. Since 1962, population has continued to grow, but at a declining rate. World population grew from 1.5 billion in 1900 to 6.1 billion in 2000, and then to 7.5 billion at the end of 2016.
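Those rates can be sanity-checked with the standard compound-growth formula. A minimal sketch, using only the population totals quoted above:

```python
# Average annual growth rate implied by two population totals:
#   rate = (end / start) ** (1 / years) - 1

def annual_growth_rate(start, end, years):
    return (end / start) ** (1 / years) - 1

# World population: 1.5 billion in 1900, 6.1 billion in 2000.
rate = annual_growth_rate(1.5e9, 6.1e9, 100)
print(f"{rate:.2%}")  # roughly 1.4% per year on average
```

An average of about 1.4% per year over the 20th century is consistent with a rate that rose from 0.8% to a 2.2% peak in 1962 and then declined.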

Growth in population depends on the combination of two factors--the rate of fertility and the rate of mortality. Prior to 1800, the rate of fertility was often high, but the rate of mortality was also high. Women gave birth to many children with the expectation that only a few would survive to adulthood. In the 19th century, beginning in northwestern Europe and North America, fertility remained high, but mortality declined, because improved standards of health and sanitation based on improved knowledge in medical science and public health lowered the rate of infant mortality and lengthened life expectancy. Consequently, population began to grow faster than ever before in human history. No country in the world today has a lower life expectancy than the countries with the highest life expectancies in 1800.

Beginning in the first half of the 20th century, the most developed nations began to show a drop in the rate of fertility. Social scientists have called this the "demographic transition." In the more developed countries, women delay the age of their first pregnancy and choose to have fewer children. As parents invest more in the education of their children, and as women have more opportunities for investing in careers outside the home, parents choose to have fewer children. By the 1960s, some countries saw fertility rates fall below the replacement level (2.1 children per woman), which brought an ageing population and, eventually, population decline. (See my previous post on the demographic transition.)

In recent years, however, there has been some evidence that as societies move into the very highest levels of human development--as measured by long life expectancy, great wealth, and high levels of education--the declining trend in fertility is being reversed. By 2005, Sweden and some other highly developed societies were showing this, although the increase in fertility was still not yet up to replacement levels. (See Mikko Myrskyla et al., "Advances in Development Reverse Fertility Declines," Nature 460 [6 August 2009]: 741-43.) For me, this shows that the natural human desire for children will always assert itself, although parents in the socioeconomic circumstances of modern liberal societies will often prefer to invest heavily in fewer children.

Even if the demographic transition has slowed the rate of growth in world population, the stupendous growth in population has continued. Is this a sign of human progress or not? Many thinkers of the Liberal Enlightenment have said yes. David Hume, for example, in his long essay "Of the Populousness of Ancient Nations," criticized ancient nations for having a lower growth in population than modern nations, and he argued: "every wise, just, and mild government, by rendering the condition of its subjects easy and secure, will always abound most in people, as well as in commodities and riches. . . . if every thing else be equal, it seems natural to expect, that, wherever there are most happiness and virtue, and the wisest institutions, there will also be most people" (Essays, Liberty Fund, p. 382). Hume believed that population was growing faster in modern nations than in ancient nations because there was more liberty in modern nations: "human nature, in general, really enjoys more liberty at present, in the most arbitrary government of Europe, than it ever did during the most flourishing period of ancient times" (383). After all, the primary difference between the economic life of the ancients and that of the moderns was the practice of slavery among the ancients.

Like Hume, Etienne Damilaville, in his article on "Population" in the French Encyclopedia, edited by Diderot and d'Alembert, claimed that liberty fosters a growing population, because "it is under mild, limited governments, where the rights of humanity are respected, that men will become numerous" (Encyclopedic Liberty, Liberty Fund, p. 502).

This belief that a growing population was a sign of human progress in a free society was challenged by Thomas Malthus, who warned in his Essay on the Principle of Population (1798) that since population tends to increase faster than the production of food, restricting the number of births was the only way to avoid famine and starvation.

This Malthusian pessimism has been adopted by many modern environmentalists, who insist that the modern growth in human population is unsustainable and must soon lead to a catastrophic collapse of human civilization. In 1968, Paul Ehrlich began his best-selling book The Population Bomb by declaring:

"The battle to feed all of humanity is over. In the 1970's the world will undergo famines--hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate. . . . We must have population control at home, hopefully through a system of incentives and penalties, but by compulsion if voluntary methods fail. . . . The birth rate must be brought into balance with the death rate or mankind will breed itself into oblivion" (11).

Ehrlich wrote that he first knew "the feel of overpopulation" during "one stinking hot night in Delhi":

"My wife and daughter and I were returning to our hotel in an ancient taxi. The seats were hopping with fleas. The only functional gear was third. As we crawled through the city, we entered a crowded slum area. The temperature was well over 100, and the air was a haze of dust and smoke. The streets seemed alive with people. People eating, people washing, people sleeping. People visiting, arguing, and screaming. People thrusting their hands through the taxi window, begging. People defecating and urinating. People clinging to buses. People herding animals. People, people, people, people. As we moved slowly through the mob, hand horn squawking, the dust, noise, heat, and cooking fires gave the scene a hellish aspect" (15).

For an environmentalist like Ehrlich, Hell is "people, people, people, people"--too many people!

Ehrlich's prediction of massive famines in the 1970s in overpopulated countries like India proved false because of people like Norman Borlaug. Borlaug, an agronomist from Iowa, spent his life developing high-yield hybrid crops that would solve the problem of global hunger. His success in doing this was called the Green Revolution. After working for many years helping farmers in Mexico, Borlaug moved in 1963 to India and Pakistan, where he showed farmers that they could have better crops with bigger yields. He also advised governments that farmers should be paid market prices for their crops instead of imposing price controls to subsidize food for urban people, because such price controls would reduce the supply of food. Today, India and Pakistan produce seven times more wheat than they did before Borlaug arrived. In 1970, Borlaug won the Nobel Peace Prize for his work increasing the global food supply and thus averting the famines predicted by Ehrlich in 1968.

Major famines have largely disappeared from the world. The great famines of the 20th century were mostly man-made in illiberal regimes like the Soviet Union, China, Cambodia, Ethiopia, and North Korea. In the 21st century, socialist regimes like that in Venezuela continue to produce food shortages. Mao's "Great Leap Forward" famine in China (1958-1962) killed 30 million people, making it perhaps the greatest single catastrophe in human history. Once the collectivized farms in China were abolished, and farming was privatized, food production increased, and now China produces a surplus of food for world markets. The freedom for people to choose their own work, and to reap the rewards, has made this most populous nation on Earth prosperous.

For Julian Simon, an economist at the University of Maryland, people like the Chinese farmers and Norman Borlaug illustrate the point that a growing human population is good, because people are so productive and inventive in solving problems that human beings are the "ultimate resource," and so having lots of them is good for us. Simon's classical liberal approach to population made him Ehrlich's greatest adversary.

Simon argued that there are no resources for human life without the human effort to discover and use them. So, for example, petroleum is not inherently a resource. The Native Americans had no use for it. It became useful only after human beings discovered how it could be used to satisfy human desires and then found efficient ways to extract it and sell it.

Of course, human beings are consumers of resources as well as producers, and people like Ehrlich assume that in general people consume more than they produce, so that population growth is bad. But Simon argued that growing human populations in free societies, where people are free to be inventively productive, will produce net increases in resources.

Tuesday, November 15, 2016

Human life today is better than it has ever been in human history, because we enjoy the benefits of two centuries of human progress through the Liberal Enlightenment. Our time is the best of all times that human beings have ever known.

And yet most human beings around the world deny this. As the above chart indicates, in surveys asking people whether the world is getting better, most people (94% in the United States, 96% in Great Britain and Germany) say no. Many of those Americans who believe everything is getting worse voted for Donald Trump, because he appealed to their fear that America and the whole world are in decline, and because he persuaded them that only the leadership of a strongman can save them.

This popular pessimism is contradicted by empirical data that shows more human progress in the past two hundred years than at any time in previous human history. For example, in public opinion surveys, most people around the world say that world poverty has been increasing. In the United States, only 8% of the people believe that over the last 30 years the proportion of the world population living in extreme poverty has decreased. But these 8% are correct.

In this chart, the top line shows the proportion of the world's population living in extreme poverty, defined as living on less than $2 a day (measured in international dollars according to prices of 2011), from 1820 to 1990. In 1820, 94% of the world's population lived in extreme poverty. In 1990, it was down to 52%. For the second line, poverty is defined as living on less than $1 a day. In 1820, 84% of the world's population lived in such poverty. In 1990, it was down to 24%. The shorter line to the right is based on World Bank data for 1980 to 2015 showing the level of poverty defined as living on less than $1.90 a day. In 1980, 44% of the world population was living in such severe poverty. By 2015, this was down to 9.6%. This shows a steady decrease in extreme poverty over the last two hundred years, with the decrease accelerating over the past 30 years. Moreover, economic historians have found evidence that for all of human history prior to about 1800 over 95% of human beings lived in such extreme poverty.
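The claim that the decline has accelerated can be checked by annualizing the rates of decline implied by the figures in this paragraph. A minimal sketch:

```python
# Annualized rate of decline in the extreme-poverty share:
#   decline = 1 - (end_share / start_share) ** (1 / years)

def annual_decline(start_share, end_share, years):
    return 1 - (end_share / start_share) ** (1 / years)

# $1.90-a-day line (World Bank): 44% in 1980 -> 9.6% in 2015.
recent = annual_decline(0.44, 0.096, 35)
# $1-a-day line: 84% in 1820 -> 24% in 1990.
long_run = annual_decline(0.84, 0.24, 170)
print(f"recent: {recent:.1%}/yr, long run: {long_run:.1%}/yr")
```

The poverty share has been shrinking several times faster per year since 1980 than it did over the long run since 1820, which is what "accelerating" means here (the two series use different poverty lines, so the comparison is rough).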

Perhaps the best place to find this kind of data is "Our World in Data" -- an online publication by Max Roser (an economist at Oxford University) that surveys the data on the development of human living conditions at a global scale. Most of the data is presented in visual charts and global maps that show historical trends across time.

This data shows that human life in general is better today in at least ten ways:

(1) There are more lives and longer lives.
(2) Life is healthier.
(3) Life is richer and less impoverished.
(4) Life shows more equality of opportunity.
(5) Life is more peaceful.
(6) Life is freer.
(7) Family life is better.
(8) Life is more environmentally sustainable.
(9) Life is more enlightened.
(10) Life is more virtuous.

Relying largely on the data collected by Roser, I will be writing a series of blog posts presenting some of the evidence for these ten progressive trends as caused by the norms and institutions of the Liberal Enlightenment that promote increasing freedom, knowledge, and technology.

I can foresee, however, that most readers will not be convinced by this evidence. The reason for this is that the human mind has evolved cognitive biases that make it hard for us to believe that human life is improving and easy for us to feel worried and dissatisfied.

One such bias is what psychologists Daniel Kahneman and Amos Tversky call the "availability heuristic." The more memorable an event is, because it's horribly shocking, the more probable we think it is.

So, for example, when we see shocking reports about terrorist attacks, we assume that such attacks are highly probable, and so we become terrified. I remember a friend of mine in Manhattan, Kansas, telling me that when she first saw the television coverage of the 9/11 terrorist attacks in New York City and Washington, DC, she left her office, picked up her son at his elementary school, and returned home where she hugged him for hours. Of course, she knew intellectually that her son in Kansas was in no danger of being attacked by terrorists. But emotionally she felt as if her family was under attack. Many Americans and many people around the world felt the same way that day.

In fact, that's the whole aim of terrorists--to throw an entire community of people into a state of panic. After the killing of 14 people in a terrorist attack by radical Muslims in San Bernardino, California, in December of 2015, Trump proposed that all Muslims should be banned from entering the United States. But very few Muslims become terrorists. And the statistical possibility of being killed by a terrorist in the United States is extremely low. Ordinary homicide is more likely. But even homicidal violence in the United States has been declining since a peak in the early 1990s. Nevertheless, the reports of terrorist attacks have such a shocking impact on us that we are inclined to be thrown into a panic that distorts our judgment about what should be done to protect ourselves.

The great danger here is that if we do not recognize the human progress brought to us by liberal norms and institutions, we might become so fearful and angry that we will turn away from our liberal principles and embrace the illiberal rhetoric of a demagogue who promises to save us.

Although the human progress achieved by the Liberal Enlightenment is real, it is not inevitable, as should be clear from the catastrophic suffering brought into the middle of the 20th century by the illiberal regimes of Stalin, Hitler, and Mao. The recent resurgence of ethnic nationalism in its attack on liberal globalism suggests that this could happen again.

Tuesday, November 08, 2016

This is a video from 2001 of Karoha Langwane, a /Gwi tracker from Lone Tree in the central Kalahari, Botswana, running an eight-hour-long persistence hunt of a kudu bull (a species of antelope), which requires the cognitive skill for tracking animals. Karoha might be one of the last traditional hunters practicing the persistence hunt--chasing the hunted animal in the mid-day sun for hours until it collapses from overheating and exhaustion--which was probably the earliest form of human hunting, going back two million years among our hominid ancestors. Louis Liebenberg has argued that the speculative tracking required for such hunting involves the hypothetico-deductive reasoning that underlies the cognitive abilities for science: our hunter-gatherer ancestors evolved an innate ability to use scientific reasoning when they interpreted tracks and signs and made testable predictions about animal behavior.

This is a video of Daniel Lieberman lecturing on "Brains, Brawn, and the Evolution of the Human Body." Lieberman is a proponent of the "endurance running hypothesis"--the idea that our human bodies and brains are evolved for running long distances, because this was required for persistence hunting millions of years ago before the invention of hunting technology like bows and arrows.

The arguments of Liebenberg and Lieberman help to resolve a paradox about human evolution first identified by Alfred Russel Wallace. Although Wallace was a co-discoverer, along with Darwin, of the theory of evolution by natural selection, he did not believe that natural selection could fully explain the evolution of human beings in their high moral and mental capacities. In his essay "The Limits of Natural Selection as Applied to Man," Wallace argued that the human brain was larger than it needed to be for survival as a primitive hunter-gatherer. Natural selection, he observed, "has no power to produce absolute perfection but only relative perfection, no power to advance any being much beyond his fellow beings, but only so much beyond them as to enable it to survive them in the struggle for existence." And yet human beings have mental powers for abstract thought, as expressed in art, science, mathematics, philosophy, and religion, that would have been useless for the survival and reproduction of our Paleolithic ancestors; such powers, Wallace concluded, could not have evolved by natural selection.

Wallace declared:

"The mental requirements of savages, and the faculties actually exercised by them, are very little above those of animals. The higher feelings of pure morality and refined emotion, and the power of abstract reasoning and ideal conception, are useless to them, are rarely if ever manifested, and have no important relations to their habits, wants, desires, or well-being. They possess a mental organ beyond their needs. Natural Selection could only have endowed savage man with a brain a little superior to that of an ape, whereas he actually possesses one very little inferior to that of a philosopher."

Since the evolution of such a brain could not be the work of natural selection, Wallace inferred, it must be the work of artificial selection by "some higher intelligence." Just as human beings have artificially selected plants and animals to be bred for special traits, so this "higher intelligence" must have guided human evolution to achieve a high mental capacity. Many readers assumed that this "higher intelligence" must be God. But Wallace said this was a misconception, because this higher intelligence could be some kind of spiritual mind other than God.

Creationists and intelligent design theorists have seen Wallace as agreeing with their claim that natural science can see evidence of creative intelligence in the natural world, and particularly in the cognitive and moral capacities of the human mind that show evidence of supernatural design.

Similar to Wallace's argument is the argument of theistic evolutionists like C. S. Lewis and Alvin Plantinga that an evolutionary naturalism becomes self-refuting if it denies the supernatural origin of the human mind. The reasoning is that the theistic doctrine of the human mind as created by God in His image provides the necessary support for the validity of human thought, including the validity of modern science. If we embrace metaphysical naturalism--the view that nothing exists except nature, and so there is no God nor anything like God--we are caught in self-contradiction: if human thought originated not from a divine Mind but from the irrational causes of nature, then we cannot trust our minds as reliable, and thus we cannot trust our belief in naturalism, or anything else. Insofar as science--including evolutionary science--depends on the validity of human thought, and insofar as theism is the indispensable support for trusting in the validity of human thought, science is not only compatible with theism; science depends upon theism.

In my posts on Plantinga's argument (here and here), I have pointed out that the weak link in Plantinga's reasoning for metaphysical naturalism as self-defeating is his assumption that adaptive behavior is completely unrelated to true belief. Plantinga asks us to imagine that we could have been naturally evolved for a state of complete and perpetual delusion. Having taken this step of radical Cartesian skepticism, he then tells us--as Descartes did--that the only escape from such skepticism is to assume that God would never allow this to happen. But, as is always the case for the Cartesian skeptic, this all depends on imagining scenarios that are utterly implausible and unsupported by even a shred of evidence. The evidence of evolutionary history suggests that evolution produces cognitive faculties that are reliable but fallible. The mental abilities of animals, including human beings, are fallible because evolution produces adaptations that are good enough but not perfect, and this results in the mental fallibility that is familiar to us.

But despite this fallibility, the mental faculties cannot be absolutely unreliable or delusional. Even Plantinga concedes that in the evolution of animals, "adaptive behavior requires accurate indicators." So, for example, a frog must have sensory equipment that allows it to detect flies accurately so that it can catch them with its tongue. And the honeybee waggle dance is a dramatic example of how evolution by natural selection favors adaptive behavior that tracks the truth about the world.

Similarly, evolution by natural selection has given human beings mental capacities that are reliable, even if fallible, in tracking the world. If Liebenberg is right, the distinctively human mental capacities arose originally among prehistoric hunter-gatherers for the literal tracking of wild game, which created the capacity for the abstract hypothetical reasoning of modern science.

Archaeological evidence indicates that our hominid ancestors were hunting about two million years ago. Without weapons such as bows and arrows, the only effective form of hunting was probably persistence hunting--chasing an animal during the hottest time of the day until it overheated and dropped from exhaustion. Anatomical evidence indicates that human beings are the only primates that are designed for the endurance running required for persistence hunting.

In easy tracking terrain, hunters could follow an animal's trail by looking for one sign after another. But in difficult terrain, the hunters had to imagine the likely route the animal might take so that they might reconstruct the animal's behavior and decide in advance where they might find signs. This would require what Liebenberg calls "speculative tracking" that uses "hypothetico-deductive reasoning." Based on their knowledge of animal behavior and of the physical environment, hunters had to interpret the visible signs of an animal's path in terms of some hypothesis as to how and where the animal was moving.

In modern science, the visible world is explained by a postulated invisible world. Thus, for example, physicists use particle colliders to create visible particle tracks that are explained by hypotheses about invisible structures (atoms and subatomic particles) and forces (such as gravity). Similarly, ancient hunters tracking an antelope had to interpret visible tracks as signs to be explained by hypotheses about the invisible movements of the antelope. This abstract mental capacity for hypothetical reasoning could evolve by natural selection because those hunters who were good at this were more likely to have antelope for dinner.

It has been observed, however, that only a few people in hunter-gatherer societies are successful at this, because only the most intelligent members of the society will have the capacity for such scientific reasoning. Similarly, we know that only a few people--an Aristotle, an Isaac Newton, or an Albert Einstein--will have the intellectual capacity for the deepest scientific or philosophic inquiries. And thus the philosophic or scientific life will be most fully expressed by only a few people, even though the capacity for philosophic and scientific reasoning is latent in evolved human nature.

This same capacity for imaginative, hypothetical reasoning that generates scientific and philosophic knowledge can also generate mythic fiction and superstition. Knowledge is valuable, because if we can follow the tracks of the antelope, we find the antelope, and we can eat. We can also derive some satisfaction in telling stories about the antelope deity, although we will never find it.

Tuesday, November 01, 2016

J. Budziszewski (pronounced "Boojee-shef-skee") is a professor in the Departments of Government and Philosophy at the University of Texas. He is a prolific author best known for his writings on his Christian interpretation of natural moral law, which he finds in the work of Thomas Aquinas. His most recent book is a commentary on Aquinas's "Treatise on Law."

In a series of papers, Budziszewski has criticized my defense of natural law or natural right as grounded in a Darwinian account of human nature. He criticizes me for my "determined attempt to make natural law safe for atheists." For "natural law," he argues, one must "regard nature as the design of a supernatural intelligence." By contrast, for "naturalism," one must "regard nature (in a physical or material sense) as all there is." What I defend, he says, is not "natural law" but "naturalism," and thus atheism.

I was reminded of this debate while attending the recent meeting (October 28-29) of the Society of Catholic Social Scientists at Aquinas College in Grand Rapids, Michigan. There was a special panel on Budziszewski's interpretation of natural law.

As some of the panelists indicated, one of Budziszewski's main ideas is to oppose what he calls "the Second Table Project." It is said that Moses brought down from Mount Sinai Ten Commandments on two tablets of stone. Traditionally, the first four commandments are identified as the first tablet or table, and they concern the worship of God; the last six commandments (beginning with honoring father and mother) are identified as the second table, and they concern moral laws. Some Christians (Roger Williams, for example) have seen here a separation of Church and State, in that the Church enforces the first table of theological law, while the State enforces only the second table of moral law. The first table requires religious faith. But the second table can be known by natural reason. The first table corresponds to divine law that can be known only by those who are believers in the Bible as divine revelation. The second table corresponds to natural law that can be known by all human beings, even those who are not biblical believers, because it depends on natural human experience. The second table can stand on its own natural ground without any necessary dependence on the supernatural. But this is exactly what Budziszewski denies, because, he insists, there cannot be a natural law if there is no divine lawgiver.

I have argued that if we see what Aquinas calls natural law as corresponding to what Darwin calls the natural moral sense rooted in evolved human nature, then those people who have been infused with religious faith can understand that evolved human nature as the product of God's creative design working through the natural history of evolution, while those people who lack such faith can understand that evolved human nature as the product of an unguided natural history of evolution.

Darwin leaves open the possibility of theistic evolution by employing Aquinas's idea of "dual causality"--the religious believer can see natural causes as secondary causes, as distinguished from divine causes as primary causes (the subject of a previous post). (I have also written posts on the Catholic Church's acceptance of Darwinian evolution.)

Whether we have faith or not, whether we are on the side of revelation or on the side of reason, we can all recognize the common morality of natural law or natural right. Religious belief can reinforce that natural morality for those who are religious believers. But those who lack any religious belief can still recognize that natural morality as dictated by our natural experience and natural reason.

Some of my critics--not only Budziszewski, but also Craig Boyd, C. Stephen Evans, John Hare, Carson Holloway, Matthew Levering, Stephen Pope, Richard Sherlock, John West, and Benjamin Wiker--have complained that this distorts Aquinas's teaching by ignoring the ways in which Aquinas makes natural law dependent on God as the creator of that law. After all, Aquinas indicates that natural law belongs to God's eternal law, because a human being as a rational creature "has a natural inclination to his proper act and end, and this participation of the eternal law in the rational creature is called the natural law" (ST, I-II, q. 91, a. 2). Moreover, Aquinas indicates that human beings are directed to eternal happiness in Heaven as their final end, and for this they need divine law--the divinely revealed Biblical law of the Old and New Testaments--to instruct them how to achieve that eternal happiness (ST, I-II, q. 91, aa. 4-5).

But doesn't this confirm my claim about the autonomy of natural law as separated from divine law? By natural law, Aquinas says, human beings are directed to their natural end of earthly happiness, which is "in proportion to the capacity of human nature." Human beings cannot recognize their supernatural end--eternal happiness in Heaven--unless they believe in the divine law of the Bible (ST, I-II, q. 91, a. 4, ad 1). The need for divine law to reveal supernatural ends shows that natural law by itself is directed to purely natural ends that can be known by natural experience without any belief in God or His commands.

Moreover, it is only by recognizing the autonomy of natural law that we can use natural law to correct the divine law of the Bible. Budziszewski's denial of natural law's autonomy makes this impossible. Consider three examples of how natural law can correct the moral mistakes in the Bible.

First, Budziszewski says that recognizing "the wrong of deliberately taking innocent human life" is part of the natural law. And yet, according to the Bible (Genesis 22), Abraham showed his faith in God by being willing to obey God's commandment to murder his innocent son Isaac. Some Christians, like Soren Kierkegaard, have seen this Biblical story as teaching us "the suspension of the ethical" in our faith in God: we must obey God's commands even when they are unethical. But most people see this Biblical teaching as wrong, because we recognize the wrongness of killing innocent children, and thus our natural moral sense corrects the Bible.

Aquinas explains: "that which is done miraculously by the Divine power is not contrary to nature, though it be contrary to the usual course of nature. Therefore, . . . Abraham did not sin in being willing to slay his innocent son, because he obeyed God, although considered in itself, it was contrary to right human reason" (ST, II-II, q. 154, a. 2, ad 2). Here Aquinas shows us a direct contradiction between reason and revelation, natural law and divine law, and if we take the side of revelation and divine law, we must allow--even honor--the killing of innocent people whenever we think God has commanded it, even though this is "contrary to the usual course of nature" and "contrary to right human reason."

For Aquinas, there is no way of escaping this shocking contradiction between natural moral law and arbitrary divine command, because if he were to appeal to natural law to correct the Biblical story, he would be exposed to persecution from church authorities. In fact, the Bishop of Paris had condemned faculty at the University of Paris who were accused of teaching pagan natural philosophy contrary to the Christian faith, and Aquinas was suspected of belonging to that group. We might consider the possibility that this forced Aquinas to engage in esoteric writing.

A second example of how natural law might correct the Bible is in correcting the religious violence of the Old Testament. The last three popes--John Paul II, Benedict XVI, and Francis--have all acknowledged that the Church needs to ask forgiveness for the religious violence practiced by the Church and endorsed by the Bible, including violence against heretics and apostates. As Cardinal Ratzinger, Prefect of the Congregation for the Doctrine of the Faith, the future Pope Benedict XVI endorsed a remarkable statement of the International Theological Commission in 1999 on "The Church and the Faults of the Past." This statement indicates that we must recognize that the Bible is mistaken when it reports God as commanding unjust violence. This is said to require a "paradigm change"--"a transition from a sacral society to a pluralist society, or, as occurred in a few cases, to a secular society."

So now, it seems that the Catholic Church has embraced liberalism in accepting the move from a premodern "sacral society," in which violence could be used to enforce religion, to a "pluralist society" or "secular society," based on religious toleration and nonviolence. (This has been the subject of a previous post.)

At the conference at Aquinas College, conservative Catholics argued that the only escape from the morally corrupting relativism of America's liberal culture was to restore faithfulness to the moral teaching of the Catholic Church's Magisterium. But they were largely silent about how the Church (beginning with Vatican II) has accepted the liberalism of toleration and pluralism as a correction of the illiberal religious violence endorsed by the Bible and by the premodern Church. Until recently, the Church saw Protestant Christians as heretics who could be properly persecuted and even executed (see Robert Bellarmine, On Temporal and Spiritual Authority [Liberty Fund, 2012], 79-120). Aquinas endorsed the Inquisition (ST, II-II, q. 10, aa. 8, 11; q. 11, a. 3). Now, even conservative Catholics like Budziszewski recognize that this was wrong in violating natural law.

A third example of natural law correcting the Bible is recognizing the wrongness of the Bible's endorsement of slavery. While the Bible sanctions slavery (see my post here, which includes links to other posts), Budziszewski knows by natural law that this is wrong, and therefore he looks for some way to correct the Bible to conform to his natural moral knowledge that slavery is wrong. He writes: "Consider how many centuries it took natural law thinkers even in the Christian tradition to work out the implications of the brotherhood of master and slave. At least they did eventually. Outside of the biblical orbit, no one ever did--not spontaneously" (The Line Through the Heart, 36). The explicit teaching of the Bible is that the "brotherhood of master and slave" is consistent with preserving slavery as a moral good, and this was the understanding of many Christians in the American South before the Civil War. But Budziszewski rightly judges that Christians had to correct the Bible by seeing that human brotherhood demands the abolition of slavery as a great moral wrong.

As I have argued in Darwinian Natural Right, Darwin and others were able to see the wrongness of slavery as a violation of evolved human nature, and particularly of the natural desire for justice as reciprocity.