Sunday, November 30, 2014

The trigger for his breakdown was when he saw a horse being beaten. He threw his arms around it, sobbing. He never recovered, and ended his days in an insane asylum.

For all of his attempts to portray himself as a bad boy, Nietzsche (a pencil-necked mouse of a man whose one true love, Lou Salome, refused to sleep with him even once) was in real life anything but. For one thing, he was far too sensitive for his own good, even though he tried to pretend he wasn't sensitive at all. As hard as he tried not to, he identified with victims, and that's why the horse being beaten broke him.

In one of his writings, "Dionysus vs. the Crucified," Nietzsche wrote about two totally different religions — one based on taking the point of view of the victimizer, and the other that takes the point of view of the victim.

The first religion he correctly identified as pagan, and it has nothing to do with the silly "kinder, gentler" faux-paganism that those repulsed by what Christianity has become are today trying to create (or in their minds, recreate).

The second religion Nietzsche identified as Christian. Although an atheist, he was in some ways more Christian than those who today profess to be. He could at least identify with the victimized, something many people today cannot do.

The pagan god Dionysus, Nietzsche pointed out, was not the god of drinking and partying and having a good time. He was the god of drunken rioting, destruction, and at times tearing people to shreds. And although it sounds counter-intuitive, he was also a fertility god.

We've all seen Dionysus. Every time you see a mob of people rioting and destroying things, and breaking into businesses and carrying off the merchandise and attacking innocent people, that's Dionysus at his worst.

There are many different myths about Dionysus — apparently each Greek town had its own version — but all of them employed the same concept: a god who is slain — in fact dismembered — and then restored to life. That's one of the reasons he was a fertility god — he died and then was reborn, just as the crops were every year.

Some of the ancient Greeks did engage in drunken, destructive festivals, which brought the disapproval of the authorities, who feared revolution. A government afraid of revolution? The Dionysian slaughter of the French Revolution shows what they feared (if you want to understand ancient myths, look for the modern equivalent).

These drunken, destructive, orgiastic rites were finally tamed by being turned into plays, such as the ones about Oedipus and Agamemnon. In the original communal festivals, people felt "cleansed" after their rioting; later, after the plays took the festivals' place, the same catharsis sent them home rid of what Aristotle famously called "pity and fear."

One of the most ominous things about these festivals is that there was always a scapegoat, one onto which the sins and frustrations of the community were projected. Often the scapegoat was killed. Later, in the theater, the characters were the scapegoats, only this time they were fictional and died imaginary deaths.

Scapegoating is why today in destructive rioting there are always people—the "oppressors"—who are targeted for attack (sometimes these scapegoats have been dead for hundreds of years, such as the infamous "Dead White Males" responsible for every problem in the U.S., and, indeed, the world).

After the rioting and attacks are over, those involved — however temporarily — feel renewed and rejuvenated, because they have "cleansed" themselves, not of their pity and fear, but their resentment and hate.

This scapegoating is the main thing Nietzsche noticed about Dionysus. All pagan religions, he told us, are Dionysian. They take the point of view of the victimizer; the scapegoats are always guilty and were killed for the utilitarian "greater good."

Christianity, on the other hand, for the first time in history took the point of view of the victim. As the Gospels show, Jesus was the innocent victim, although the religious leaders of the time considered him guilty ("It is expedient that one man should die for the sake of the people").

The Russian writer Dmitri Merejkowski saw the same division that Nietzsche did: he believed all religions could be divided into two basic ones: in the first, Man sacrifices Man to Man. In the second, God sacrifices Himself to Man.

Today, the French philosopher and theologian Rene Girard, author of Violence and the Sacred and Things Hidden Since the Foundation of the World, is probably the best-known writer on scapegoating. Not surprisingly, he was influenced by Nietzsche, whom he considered a prophet. A crazed one, but still a prophet.

Girard thought the function of a scapegoat was to renew society, however imperfectly, and another theologian, Walter Wink, agreed with him, calling it "the myth of redemptive violence," i.e., the world can be reborn through violence.

Girard has suggested scapegoating should have ended with Jesus' sacrifice, because it was the first time in history the scapegoat was considered innocent. Before that, he tells us, people always thought the scapegoats deserved exactly what they got.

The psychiatrist M. Scott Peck called scapegoating "the genesis of human evil," because in doing it people ignore their own guilt and other flaws and project them onto other people, who they believe must be destroyed to rid the world of what they have defined as evil.

We have now, and have had for thousands of years, ideologies in which one group is scapegoated, so an attempt is made to sacrifice that group.
Each of those ideologies, as Merejkowski wrote, sacrificed Man to Man. And, as Nietzsche predicted, each was a worshiper of Dionysus and his destructive frenzies. His observations allowed him to predict the carnage of the 20th century — and, in his opinion, beyond.

I've read estimates of 177 million to 200 million people killed in various wars in the 20th century. All, ultimately, were scapegoats; all, ultimately, were sacrificed to Dionysus.

Dionysus belongs to what Mircea Eliade called "the myth of the eternal return." This myth has roots in non-Christian classical civilization, and in it the creation of society is followed by the degeneration of it and then by regeneration.

This notion helped the ancients deal with the uncertainty of the future, which is an eternal problem.

The writers I've quoted are telling us that when certain groups of people believe society (or the world) is degenerating, a scapegoat must be found and destroyed. There is no pity for the scapegoat, since in the minds of the sacrificers the sacrificed brought it on themselves.

Jacques Barzun, in his book, From Dawn to Decadence, wrote, "When people see futility and the absurd as normal, the culture is decadent. The term is not a slur; it is a technical label. A decadent culture offers opportunities chiefly to the satirist... "

I close with something Girard wrote in Violence and the Sacred: "Men can dispose of their violence more efficiently if they regard the process not as something emanating from within themselves, but as a necessity imposed from without..."

If democracy is feminine - an unwitting slavery, the belief in an impossible safety and equality, without competition, with weak, soft manginas, then the masculine is the opposite: freedom, harder men willing take risks, competition, and an aristocracy, hopefully through competition and talent.

In reproductive terms, the "masculine" is K and the "feminine" is r.

If democracy is feminine, you'll never be able to vote in the masculine unless the vote is severely restricted. For one thing, women shouldn't be allowed to vote, which has never been allowed in any half-sane society.

The Manosphere, in its typically great confusion, splits the two above types into Alpha and Beta - and gets the definitions wrong. For one thing, putting cowardice and promiscuity as "Alpha" behaviors, without knowing cowardice has been placed there. For another, no grown man uses those terms. And the "Beta" support of civilization is actually "Alpha."

The following article is from Free Speech and was written by William Pierce.

I always have been very fond of women -- perhaps too much sometimes. I always have enjoyed their company greatly. I have really worshiped feminine beauty. I have admired and respected women when they have served their purpose in the life of our people, as much as I have admired and respected men who have served their purpose.

Having said this I must tell you now that I believe that a great part of the present pathology of our society can be ascribed properly to its feminization over the past century or two, to its loss of its former masculine spirit and masculine character.

This came to mind most recently when I saw and heard the reaction to Timothy McVeigh's statement to the court on August 14, at the time he was sentenced to die. What McVeigh said was very relevant, very pertinent. He said that the government teaches its citizens by its example. When the government breaks the law, then its citizens will not respect the law.

But the spectators almost uniformly were disappointed by this statement. They complained that they wanted to hear him say that he was sorry for what he had done, that he was sorry for the innocent victims of the Oklahoma City bombing. They weren't even interested in hearing about the much larger issue of government lawlessness that Mr. McVeigh raised. They only wanted an apology for the suffering of individual victims. This is a feminine attitude, this focusing on personal and individual feelings rather than on the larger, impersonal context. It is a feminine attitude, despite the fact that it was expressed by grown men.

Many other people besides me have come to similar conclusions, although not all of them have wanted to come right and out and say so, because that would be the height of Political Incorrectness, the height of "insensitivity." As far back as the 1960s some perceptive commentators were remarking on the generally unmasculine character of the young men they encountered in our universities. Male university students even then tended to be too timid; too soft; too lacking in boldness, pride, and independence; too whiny in adversity; insufficiently willing to endure hardship or to challenge obstacles.

We have always had both soft, dependent men and hard, proud men in our society, but the commentators were comparing the relative numbers of masculine and non-masculine men they saw in our universities in the 1960s with what they had seen in the 1930s and 1940s. The 1960s, of course, were a time when the whinier men were making extraordinary efforts to remain in the universities in order to avoid military service, while many of the more masculine men were off in Vietnam, but this isn't enough to account for the change these commentators noticed.

Something written by the American historian Henry Adams back in 1913 was recently called to my attention. Adams wrote "Our age has lost much of its ear for poetry, as it has its eye for color and line and its taste for war and worship, wine and women." Now, Henry Adams was a man who had much more than a passing interest in such matters -- he was a lifelong student of these things and also was a professor of history at Harvard back in the days when the professors at that university were expected to know what they were talking about -- so we ought to pay some attention to his observation of the state of affairs in America in 1913. Incidentally, he was a member of one of America's most distinguished families. He was a great grandson of the founding father and second President of the country, John Adams, and a grandson of the sixth President, John Quincy Adams.

Henry's brother, Brooks Adams, had written a book 18 years earlier, in 1895, on the subject commented on by Henry. It was The Law of Civilization and Decay, and in it Brooks made an even more general observation than that stated later by Henry. Brooks saw two types of man: the type he described as spiritual man, typified by the farmer-warrior-poet-priest; and the type he called economic man, typified by the merchant and the bureaucrat. I believe that Brooks must have known a different breed of priests than those I am familiar with. He was thinking of Martin Luther and Giordano Bruno, not Billy Graham and John Paul II.

He saw spiritual man as having the leading role in the building of a civilization, with the economic men coming out of the woodwork and assuming the dominant role after the civilization had peaked and was in the process of decay. Spiritual men are those with vision and daring and a close connection to their roots, their spiritual sources. Economic men are those who know how to calculate the odds and evaluate an opportunity, but who have cut themselves loose from their spiritual roots and become cosmopolitans, to the extent that that offers an economic advantage. The spirit of adventure and the current of idealism run strong in spiritual men; economic men, on the other hand, are materialists. And Brooks was referring only to European men, to White men. He was not even considering the Jews or Chinese.

Most of us are a mixture of the two types, and it's difficult to find examples of purely spiritual or purely economic men. Michelangelo and Charles Lindbergh tended toward the type of spiritual man. Pick almost any prominent politician today -- Bill Clinton or Newt Gingrich, say -- and you have a good example of economic man. Which is not to say that all economic men are politicians, by any means: just that, since they are not likely to be distinguished in the arts, scholarship, or exploration, politics is where economic men are most likely to find fame.

So what does this have to do with the feminization of our society and the preponderance of whiny young men at our universities today? Actually, these things are very closely interrelated. They also are related to the things which caught the attention of Henry Adams: the loss of our aesthetic sense, our warrior spirit, and our feeling for what is divine, along with our masculinity.

When I say "loss," I am using this word only in its relative sense. Our society still has masculine elements, masculine characteristics; it's just that they are weaker now than they were 200 years ago. And 200 years ago there were some effeminate tendencies to be found; tendencies which today have become much more pronounced. It would be an error, I believe, to attribute this shift in balance solely to the machinations of feminists, homosexuals, or even Jews. They are responsible for the condition of our society today primarily in the sense that the pus in a ripe boil is to be blamed for the boil. The feminists, homosexuals and Jews characterize our society in large part today -- they are symptoms of the pathology afflicting our society -- but we must look deeper for the cause of our decay.

Let me repeat Henry Adams' observation. He wrote: "Our age has lost much of its ear for poetry, as it has its eye for color and line and its taste for war and worship, wine and women."

If he were writing today, he might note that the immortal lyrics of his contemporary, Tennyson, have given way in favor to the pretentious drivel of Maya Angelou; that the Western tradition in art, which had culminated in the 19th century in the paintings of Caspar David Friedrich and John Constable, has been shoved aside in the 20th century by the trash-art of Picasso, Chagall, and Pollock; that the profession of arms, which was still a more or less honorable profession in the 19th century, a profession in which gentlemen and even scholars still could be found, has become at the end of the 20th century a vocation for bureaucrats and lickspittles, for men without honor or spirit; that worship, once taken seriously even by many intelligent and sophisticated men, is now the business of Christian democrats, with their egalitarian social gospel, and of vulgarians of the Jim and Tammy Faye Bakker stripe, with their television congregations of superstitious, amen-shouting dimwits.

Can we properly describe this change noted by Henry Adams as the feminization of our society? Or should it be thought of as the replacement of aristocratic values by democratic values, a general vulgarization of standards and tastes? Actually, these two ways of looking at the change are related. But let me take Brooks Adams' position now and say that the change can be attributed most fundamentally to the growing materialism in our society, to the replacement of spiritual values by economic values. What does that have to do with feminism or with democracy?

Actually, a great deal. In a very broad sense, aristocratic values are masculine values, and democratic values -- egalitarian values -- are feminine values. It is also true that, in a very broad sense, materialism is a feminine way of looking at the world. It is a way which puts emphasis on safety, security, and comfort, and on tangible things at the expense of intangibles. It is not concerned with concepts such as honor, and very little with beauty, tradition, and roots. It is a way with a limited horizon, with the home and hearth very much in sight, but not distant frontiers. Reverence and awe for Nature's majesty are unknown to the materialist.

As spiritual man gives way to economic man, when one historical era merges into another -- as idealism gives way to materialism -- society gives a freer play to the feminine spirit while it restricts the masculine spirit. Words gain over deeds; action gives way to talk. Quantity is valued over quality. All of God's children are loved equally. Pickaninnies are considered "cute" or even "adorable." The role of the government shifts from that of a father, who maintains an orderly and lawful environment in which men are free to strive for success as little or as much as suits them, to that of a mother, who wants to insure that all of her children will be supplied with whatever they need.

It is not just society which changes, not just government, not just public policy; individual attitudes and behavior also change. The way in which children are raised changes. Girls no longer are raised to be mothers and homemakers but rather to be self-indulgent careerists. Boys no longer are raised to be strong-willed, independent, and resourceful. That requires hardness and self-denial; it requires masculine rule during the formative years. A disciplined environment gives way to a permissive one, and so the child does not learn self-discipline. Spanking becomes a criminal offense. The child is not punished for disobedience, nor is he given the opportunity to fail and to learn from this the penalties that the real world holds for those who are not strong enough to succeed. And so boys grow up to be whiny and ineffective young men, who believe that a plausible excuse is an acceptable substitute for performance and who never can understand why the gratification they seek eludes them.

The move from masculine idealism to feminine materialism leads inevitably to hedonism, egoism, and eventually narcissism. Henry Adams also claimed that we have lost our taste for wine and women. Well, certainly not in the sense that we have become less interested in alcohol or sex. What he meant is that we have lost the keen edge of our appreciation for civilization's refinements, for the finest and most subtle things in life: that our appetites have become grosser as they have become less disciplined. Our interest now is in alcohol for its ability to give us a momentary buzz, not in fine wine for its inherent artistry.

A similar consideration applies to the way in which our taste for women has changed. And is this not to be expected? It is the masculine spirit which appreciates woman, which appreciates feminine qualities, and as this spirit declines, our taste for women loses its edge and becomes coarser. We move from an age in which women were not only appreciated but also treasured and protected into an age in which homosexuality is open, tolerated, and increasingly common; Madonna is a celebrated symbol of American womanhood; and feminine beauty is a mere commodity, like soybeans or crude oil: an age in which parents dump their daughters into the multiracial cesspool that America's schools and cities have become to let them fend for themselves. In an age in which materialism and feminism are ascendant, this is the only way it can be. To attempt to make it otherwise -- to attempt to decommercialize sex, for example -- would be a blow against the economy, against the materialist spirit. And to elevate women again to the protected status they had in a more masculine era would be fought tooth and nail by the feminists as a limitation on women's freedom.

This subject is a little fuzzy, and I've been speaking qualitatively rather than quantitatively. For almost everything I've said, an opponent could produce a counterexample. And that's because I'm talking about very large-scale phenomena, involving many people, many institutions, and many types of interactions. Even during periods of history which I would characterize as masculine or as dominated by the masculine spirit, one can find examples of feminine tendencies and of institutions with a feminine spirit, just as one can find masculine tendencies in our society today. For example, while I claim that our society is becoming more effeminate today, someone can attempt to counter that by noting that masculinized women are more prominent today -- female lawyers, female executives, female military officers -- and one can attribute that to masculine influences in our society. I would counter that by saying that when men become less masculine, women become less feminine.

Likewise, when I relate materialism and feminism, or when I say that the rise of the economic spirit is associated with a decline in masculinity, someone else can find plenty of men with no shortage of testosterone -- strong, aggressive capitalists -- who are epitomes of what Brooks Adams called "economic man."

What it really amounts to is that the masculine character, like the feminine character, has many components. The component I have emphasized today is the spiritual component -- and there are other components. It is a complex subject. But I still believe that we can meaningfully describe what has happened to our society and our civilization during the past couple of centuries as a decline in masculinity. I believe that such a description sheds a useful light on one aspect of what has happened to us. And I believe that Henry Adams' comment on our society's loss of its artistic sense and of its sense of reverence, along with its warrior spirit, is a generally true statement which has value in helping us to understand our predicament. Adams, to be sure, was a scholar of considerable depth, and he wrote a great deal of carefully reasoned material to support the one-sentence summary which I quoted.

By the way, one subject with which Henry Adams -- and his brother Brooks too -- were familiar in this regard was the role of the Jew in undermining civilization. Henry made a number of comments about the destructive role of the Jews in the economic and cultural aspects of European civilization. His observations on this subject are perhaps best summed up by something he wrote in a letter to a friend in 1896: "The Jew," he wrote, "has got into the soul . . . and wherever he . . . [that is, the Jew] goes, there must remain a taint in the blood forever." How much worse that taint has become during the century since Henry Adams made that observation!

I apologize for being so abstract in my own comments today. But I believe that it's useful to back off every now and then and try to see the big picture, to try to develop an intuitive sort of understanding of our situation, even if it means talking about things which are by their nature somewhat fuzzy.

Saturday, November 29, 2014

Those who think the world would be better if we got rid of religion are sadly mistaken. When it comes to economics, I often think of the mentally-ill Christianity-hating Jewish atheist "Ayn Rand" (real name: Alice Rosenbaum) who some of the more naive libertarians thinks makes sense, even though she considered certain certain ethnic groups (American Indians and Palestinians) worthy of only extermination. Same with free-market hating leftists who wear Che' shirts and think Marx (another Jewish atheist) makes sense.

The same applies to the "I Fucking Love Science" retards, none of whom, as far as I can tell, understand science at all (try as further examples the deluded pothead Carl Sagan and Stephen Jay Gould...again, both Jewish atheists, with the latter a self-proclaimed Marxist who destroyed his reputation through his lies).

Robert Sirico, a Catholic priest and co-founder of the Acton Institute, is perhaps one of the most economically literate clergymen you will find among America’s public intellectuals. While most seminaries do not train future pastors and lay leaders to think theologically about economics, Sirico says understanding questions about economics is necessary if Christian leaders want to rightly seek the good of society and train others to do the same. Joseph Gorra, founder and director of the Veritas Life Center, talked with Sirico about economic life and human flourishing.

At this year's Acton University conference, you spoke on how love is an indispensable basis for economic life. To some, that might seem odd if economic life is viewed as the maximization of utility and material well-being.

We can’t enter the marketplace as something other than what we really are, and real human love demonstrates the impossibility of being merely homo economicus (“the economic man”), which is essentially a thesis that reduces human beings to their materiality.

Humans are simultaneously material and transcendent, individual and social. We are not merely individual entities, though we are uniquely and unrepeatably that, even from the first moment of our conception. Yet the whole of our lives we are social and individual, material and spiritual. If we ignore this existential reality, then we fail to understand what it means to be human.

Love—authentic human love—helps us understand this anthropological reality. Even conjugal love offers more than physicality. In this act of love, we offer our whole selves, including our ideals, dreams, and indeed our future to one another—none of which exists in material reality. Love, especially in the biblical sense, is not merely what one wants for oneself, but is a free decision that wills the good of the person one loves. And this transcendent act, this non-material dimension of human anthropology—when open to new life—normatively results in other human persons who are made from the dust of the earth and the breath of life.

What is the most pervasive problem shaping our thinking related to economics and society?

In addition to the anthropological problem, people don’t think about economics much at all. This may be because we live in the most advanced economic society in history—and that includes people in the developing world, because even though their lives are much more economically difficult than those in the developed world, they still live better than their ancestors did. Many people think we can just live off the legacy of past prosperity, that we can live at the expense of everyone else. But this illusion can last only so long, and if we don’t attend to the dismal science of economics, we will all be in trouble.

Your book "Defending the Free Market" is less an unbridled endorsement of capitalism than a moral case for a free economy. What can we gain by distinguishing the free market from capitalism, and then from crony capitalism?

I learned from my experiences in Latin America and Europe that the word capitalism is fraught with misconceptions. It is, in the first place, a Marxist word and is thus very narrow in its emphasis on the material. It refers only to stuff in the economy. A free market—or better, a free economy—refers to human action, where people make choices based on subjective needs, both as producers and consumers.

Crony capitalism, or State capitalism, is the antithesis of a free economy: it depends on favors from politicians to include or exclude people from the circle of exchange. Even Adam Smith acknowledged this in his Wealth of Nations, where he warned against the tendency.

Political analyst Yuval Levin said, “A capitalist system requires the kind of citizen that it does not produce.” How would you respond?

I would agree, but only if by “capitalist system” Mr. Levin means one without a moral understanding of enterprise, the rule of law, clear rights to property and contract, and all those elements that went into the development of the West. The problem is that today we are losing the moral sensibility and the transcendent reference point, that which yields not merely material prosperity but moral purpose and significance.

I am not promoting merely a free society, but one that is both free and virtuous. Only this is worthy of human beings.

There is a longstanding Main Street vs. Wall Street attitude that tends to occupy the American popular imagination of pundits, politicians, populists, and some pastors. Given your theology of work, vocation, and economics, what is your take on that attitude?

The formation of a moral conscience within an entire industry requires examining two aspects. The first is to recognize the hard work and creative endeavors of the many men and women who work in these businesses. Their work has improved the entrepreneur’s capacity to raise capital and allows common people to participate, by way of the stock market, in some of history's most successful enterprises. We must also see their work with full knowledge of the particulars. One could compare a quantitative strategist creating a new financial product to an automotive engineer devising a new vehicle safety system: both exhibit a creative human contribution to the service of others through trade.

The second aspect is to examine the ethics of the business models that make up this industry. Aside from debates over interest rates and profits, few theologians today would see a moral impediment to profitably lending money to your neighbor. The real moral questions arise when our system of fractional reserve banking allows the financing industry to operate with money under such exceptional economic rules. In many contexts, we have less of a Wall Street investor problem and more of a regulatory capture problem. It’s appropriate that people regard some Wall Street activities as immoral, and I think the most beneficial thing to do going forward is to help people understand the nature of sound money and its important role in our economy.

In recent years, high frequency trading has often dominated criticisms of Wall Street. What are the challenges we face in understanding this convergence of information technology advances, markets, and sometimes predatory activities?

The complexity of the trading activity creates situations that are sometimes difficult to distinguish as predatory or simply highly competitive.

Much high frequency trading is an outgrowth of a more antiquated market structure, in which brokers and market makers operated a standard share distribution model intended to assist firms in raising capital, as well as to provide investors liquidity and reduce counterparty risk. High frequency trading can become predatory if relationships from the old market model prevent new participants from competing in a transparent and freely accessed marketplace, which can occur in a variety of circumstances—not least when a distortion of free markets occurs because political favoritism plays a significant role, through bailouts, for instance. Contrary to the implication of many news headlines, it’s not the speed of trading so much as the quality, transparency, and freedom of the marketplace that will allow us to determine the moral fruits of this endeavor.

What blind spots do those on either the political right or left have in understanding inequality and poverty as a problem to be solved?

Both sides tend to leave out an aspect of reality. Those on the right labor under the illusion that all that matters is economic efficiency, as though human beings do not matter. This fails because humans are also consumers who need to be able to purchase the goods and services being offered—which Henry Ford seemed to understand. And more importantly, workers are human beings; they have certain rights simply because they are humans, even if those rights aren’t the sole responsibility of employers.

Those on the left have a utopian dream in which everyone can be the same. But the very fact of human individuality shows its impossibility.

Underlying the call for equality in the material sense is the moral desire that people be able to live at a certain basic level of material dignity. It’s probably better to call this equity. Hence, we must speak about the floor in an economic sense, not the gap—much less the ceiling.

Regarding poverty, the image that comes to mind is a pie. If we think the world of riches is static, then we will see the normative solution as dividing it up—or redistribution. If, on the other hand, we see the pie as capable of being grown—that is, production—then the normative solution is economic liberty and initiative, which produce prosperity. Acton’s curriculum Poverty Cure, designed especially for churches, makes the case that at the international level, trade should be preferred, both morally and economically, over aid.

What is a moral case against raising a federal minimum wage as a way to alleviate poverty?

This is a well-debated topic in the economic literature, and I have no doubt that most people on all sides have the best of intentions. But there are several prudential reasons for shunning federally mandated minimum wages.

First, salaries are not arbitrary. They reflect what the business owner knows it takes to cover the costs of production, usually under tight margins. If employers are forced by law to inflate wages above the market rate, then either the costs are passed on to consumers who also work for a living, or the least productive employees—most often minorities and teenagers—or those hired last are the first to be fired, lest the whole enterprise collapse. As a result, entrance into the world of work will be more difficult for those starting out, and they may never acquire habits of enterprise.

Additionally, if a minimum wage is a moral requirement of society as a whole, why should the burden fall only on employers, as opposed to something more akin to an earned income tax credit, whereby people receive from society—at whatever level of government deemed prudent—a kind of negative income tax refund? I think this latter policy has its own problems, but it appears more just than singling out employers. Finally, why would such a thing need to be federally imposed when wage differentials vary from region to region?

So does paying a living wage meet the same objections as raising the minimum wage, if the living wage is not regulated but left to a company's discretion to determine what is a just wage?

I do think that a living wage—an idea developed in the mid-16th century, which the Scholastics believed was best achieved by the market wage—is what people are looking for. I think it’s more prudent because a market wage will more closely approximate a living wage, generally because there’s more information available to all parties—employers, workers, and consumers—if the pricing structure is free from obfuscating governmental regulations.

Christians who are well formed in their moral obligations may well choose to pay above the federal minimum. When hiring in my parish, even for simple and temporary work, we haven’t paid the minimum wage in years. Probably the most dramatic example, though not much reported on, is Hobby Lobby, which has paid its lowest-paid employees multiples of the minimum wage for quite some time.

What can local churches and pastors do to help address short-term and long-term unemployment issues? Or is this something only policymakers and the business community can address?

Historically, churches have played a critical role in addressing these sorts of questions, and I believe they still can, though not merely through direct charitable initiatives—which should be a given in any Christian society, in addition to the moral formation we’re called to offer people. To do this, we must think like business leaders, but ones whose bottom line need not be monetary.

Specific kinds of work-training efforts imbued with moral sensibility would be one way to do this, something businesses operating under all kinds of regulations might find difficult. Imagine what it would look like if, after two or three years of operation, a church got the reputation of turning out workers from its vocation training school who were not only highly proficient in their respective fields, but who also carried themselves with a sense of professionalism, honesty, and respect that made them desirable in the market for their habits of punctuality, politeness, and giving a real day’s work for a real day’s wage.

And what if our organizations developed from within the congregations and constituencies the kinds of insurance programs for health care, unemployment insurance, small loans for work-related costs, and even supplemental wage assistance for disadvantaged people just starting a job, which some undeniably need? To the extent that our organizations become politicized and secularized, they will dissolve and lose their Christian identity.

I can hear the question forming in some minds: Well, isn’t that the job of government, and don’t these programs exist through various welfare schemes? There are two critical differences. First, the people creating these programs and holding participants to account within them would be personally known to those they serve. This accounts for much of the success of micro-loan efforts in India and Latin America.

Second—and here I call the church to particular account—we would need to ground such programs in a clear moral foundation. As Christians, we should seek to form men and women who are not merely workers, but prepare them to be evangelists as well, since they’ve had a profound personal encounter with the living Christ. Try getting that kind of program funded by the present welfare establishment.

“When you’ve got your health, you’ve got just about everything,” ran the tagline of a famous Geritol commercial from the 1970s, and the guys we most have reason to be grateful for are undoubtedly those who’ve developed the medical practices and the drugs and devices that have transformed our lives over the past hundred and fifty years.

Before the turkey gets carved, it’s worth taking a moment to remember a few of these brilliant, persistent, and lucky men, and recall their accomplishments. Even when they’ve won Nobel Prizes in Medicine, their names are virtually unknown. They’re not mentioned in the Core Curriculum or celebrated by Google on their birthdays.

Pain

If you ever had surgery, did you opt for anesthesia?

If so, thank a few more white males, beginning with William Clarke in New York and Crawford Long in Georgia, who both used ether in minor surgeries in 1842. A paper published four years later by William Morton, after his own work in Boston, spread the word. Chloroform competed with ether during the next decade. There are now scores of general and regional anesthetics and sedatives and muscle relaxants, administered in tandem. The first local anesthetic has also been superseded. It was cocaine, pioneered by a Viennese ophthalmologist, Carl Koller, in 1884.

Ever take an analgesic?

Next time you pop an aspirin, remember Felix Hoffmann of Bayer. In 1897, he converted salicylic acid to acetylsalicylic acid, much easier on the stomach. Aspirin remains the most popular and arguably the most effective drug on the market. In 1948 two New York biochemists, Bernard Brodie and Julius Axelrod, documented the effect that acetaminophen (Tylenol), synthesized by Harmon Morse in 1878, had on pain and fever. Gastroenterologist James Roth persuaded McNeil Labs to market the analgesic in 1953.

Infectious Diseases

Most Americans today die of heart disease or cancer, but before the twentieth century it was infectious diseases that struck people down, and children were the primary victims. In pre-industrial England, still the most developed economy in the world in the late 17th century, 50% of all children didn’t survive to age 15. With the phenomenal growth of cities during the 19th century, cholera, typhoid fever, and tuberculosis became the leading killers.

In 1854, a London medical inspector, John Snow, proved that a cholera epidemic in Soho was caused by infected sewage seeping into the water supply. Until then it was thought the disease spread through the air. The sanitary disposal of sewage and the provision of clean water, possible thanks to mostly anonymous metallurgists and engineers -- an exception is the famous Thomas Crapper, who pioneered the u-shaped trap and improved, though he didn’t invent, the flush toilet -- have saved more lives than any drug or surgical innovation.

Dramatic improvements in food supply have also had an incalculable effect on health. Agricultural innovations, beginning with those introduced in England in the 18th century, were disseminated globally by the end of the 20th century -- the “Green Revolution.” Famines struck Europe as recently as the late 1860s. (The man-made famines of the 20th century are another story.) A transportation revolution made possible the provision of more than sufficient protein, calories, and nutrients worldwide. Needless to say, it was white males who designed and built the roads, canals, railroads, ports, and airports, and the ships, trains, planes, and trucks that used them, and the mines, wells, pipelines, and tankers that supplied the fuel they ran on.

Whatever the merits of taking vitamins and supplements today, no one has to take vitamin C to prevent scurvy, or vitamin B to prevent pellagra, or vitamin D and calcium to prevent rickets. And, for the time being, we all live in a post-Malthusian world. The global population was about 800 million to 1 billion when the gloomy parson wrote his famous book in 1798. It’s now over 7 billion.

Dr. Snow had no idea what was actually causing cholera. It was Louis Pasteur who gave the world the germ theory of disease, as every schoolchild once knew. Studying the fermentation of wine, he concluded that it was caused by the metabolic activity of microorganisms, as was the souring of milk. The critters were responsible for disease, too, he recognized, and identified three killer bacteria: staphylococcus, streptococcus, and pneumococcus. Nasty microorganisms could be killed or rendered harmless by heat and oxygenation, Pasteur discovered, and would then prevent the disease in those who were inoculated. He went on to develop vaccines for chicken cholera, anthrax, and rabies. Edward Jenner had demonstrated in the late 1790s that the dreaded smallpox could be prevented by injecting patients with material from the pustules of cowpox victims, a much milder disease. (The word vaccine comes from vacca, the Latin word for cow.) Pasteur, however, was the first to immunize patients by modifying bacteria rather than through cross-vaccination.

A parade of vaccines followed. People in their mid-60s and older can remember two of the most famous: the Salk and Sabin vaccines against poliomyelitis, a paralyzing disease that had panicked American parents in the late ‘40s and early ‘50s. Children preferred Albert Sabin’s 1962 version: the attenuated virus was administered on a sugar cube. Jonas Salk’s inactivated vaccine, available in 1955, was injected.

In 1847, more than a decade before Pasteur disclosed his germ theory, the Viennese obstetrician Ignaz Semmelweis documented the effectiveness of hand washing with chlorinated water before entering a maternity ward. He brought mortality rates from puerperal fever down from 8% to 1.3%. Two decades later, having read a paper by Pasteur, Joseph Lister demonstrated the effectiveness of carbolic acid to sterilize wounds and surgical instruments. Mortality rates fell from around 50% to about 15%. The efforts of both men, especially Semmelweis, were met with ridicule and disdain.

Pasteur’s German rivals Robert Koch and Paul Ehrlich made monumental contributions to biochemistry, bacteriology, and hematology, but left the world no “magic bullet” (Ehrlich’s term). Koch identified the organism causing tuberculosis, the leading killer of the 19th century, but his attempts at finding a vaccine failed. His purified protein derivative from the bacteria, tuberculin, could be used to diagnose the disease, however. It was two French researchers, Albert Calmette and Camille Guerin, who developed a successful vaccine, first administered in 1921, though it was not widely used until after World War II.

Ehrlich joined the search for antibacterial drugs that were not denatured bacteria or viruses. He synthesized neoarsphenamine (Neo-Salvarsan), effective against syphilis, a scourge since the late 15th century, but which had toxic side effects. It was not until the 1930s that the first generation of antibiotics appeared. These were the sulfa drugs, derived from dyes with sulfa-nitrogen chains. The first was a red dye synthesized by Joseph Klarer and Fritz Mietzsch. In 1935, Gerhard Domagk at I. G. Farben demonstrated its effectiveness in cases of blood poisoning.

The anti-bacterial properties of Penicillium had already been discovered at this point by Alexander Fleming. The Scottish bacteriologist had famously left a window open in his lab when he went on vacation in 1928, and returned to find that a mold had destroyed the staphylococcus colony in one of his petri dishes. But it’s one thing to make a fortuitous discovery and another thing to cultivate and purify a promising organic compound and conduct persuasive trials. This was not done until 1941. Thank Oxford biochemists Howard Florey and Ernst Chain. A Pfizer chemist, Joseph Kane, figured out how to mass-produce penicillin and by 1943 it was available to American troops. The wonder drug of the 20th century, penicillin killed the Gram-positive bacteria that caused meningitis, diphtheria, rheumatic fever, tonsillitis, syphilis, and gonorrhea. New generations of antibiotics followed, as bacteria rapidly developed resistance: among them, streptomycin in 1943 (thank Selman Waksman), tetracycline in 1955 (thank Lloyd Conover), and, the most widely prescribed today, amoxicillin.

Diagnostic technologies

Microscope: While the Delft draper Antonie van Leeuwenhoek didn’t invent the compound microscope, he improved it, beginning in the 1660s, increasing the curvature of the lenses, and so became the first person to see and describe blood corpuscles, bacteria, protozoa, and sperm.

Electron microscope: Physicist Ernst Ruska and electrical engineer Max Knoll constructed the prototype in Berlin in 1933, drawing on lens work by Hans Busch. Eventually, electron microscopes would be designed with two-million-power magnification. Leeuwenhoek’s had about two hundred.

Stethoscope: Thank the French physician René Laennec, who introduced the instrument in 1816. British nephrologist Golding Bird substituted a flexible tube for Laennec’s wooden cylinder in 1840, and the Irish physician Arthur Leared added a second earpiece in 1851. Notable improvements were made by Americans Howard Sprague, a cardiologist, and electrical engineer Maurice Rappaport (a double-sided head), and by Harvard cardiologist David Littmann (enhanced acoustics) in the 1960s. The device undoubtedly transformed medicine, and with good reason became the symbol of the health care professional.

Sphygmograph: The first machine to measure blood pressure was created by a German physiologist, Karl von Vierordt in 1854.

X-rays: Discovered by Wilhelm Conrad Röntgen at Würzburg in 1895, this was probably the single most important diagnostic breakthrough in medical history. Before Röntgen noticed that cathode rays, electrons emitted from a cathode tube, traveled through objects and created images on a fluorescent screen, physicians could only listen, palpate, examine stools, and drink urine.

PET scans: James Robertson designed the first machine in 1961, based on the work of a number of American men at Penn, Wash U., and Mass General. The scanner provides an image from the positron emissions of a radioactive isotope injected into the patient, and is particularly useful for mapping activity in the brain.

CAT scans: The first model was developed by electrical engineer Godfrey Hounsfield in London in 1972, drawing on the work of South African physicist Alan Cormack in the mid-1960s. It generates three-dimensional and cross-sectional images using computers and X-rays.

MRI: Raymond Damadian, a SUNY professor of medicine with a degree in math, performed the first full-body scan in 1977. His design was anticipated by theoretical work by Felix Bloch and Edward Purcell in the 1940s, and, later, Paul Lauterbur. MRIs map the radio waves given off by hydrogen atoms exposed to energy from magnets, and are particularly useful in imaging soft tissue -- and without exposing the patient to ionizing radiation.

Ultrasound: Ian Donald, a Glasgow obstetrician, in the mid-1950s adopted a device already used in industry that generated inaudible, high frequency sound waves. The machine quickly and cheaply displays images of soft tissue, and now provides most American parents with the first photo of their baby.

Endoscopes: Georg Wolf produced the first flexible gastroscope in Berlin in 1911, and this was improved by Karl Storz in the late ‘40s. The first fiber optic endoscope was introduced in 1957 by Basil Hirschowitz, a South African gastroenterologist, drawing on the work of British physicist Harold Hopkins. The scope is indispensable in diagnosing GI abnormalities.

Angiogram: Werner Forssmann performed the first cardiac catheterisation -- on himself -- in Eberswalde in 1929. He inserted a catheter into his lower left arm, walked downstairs to a fluoroscope, threaded the catheter to his right atrium, and injected a radiopaque dye. The technique was further developed by Dickinson Richards and André Cournand at Columbia in the ‘40s, and then extended to coronary arteries, initially accidentally, by Mason Sones at the Cleveland Clinic in 1958.

X-rays and scopes were quickly used in treatment as well as diagnosis. Röntgen himself used his machines to burn off warts. Similarly, in 1964, Charles Dotter and Melvin Judkins used a catheter to open a blocked artery, improving the technique in 1967. Andreas Gruentzig then introduced balloon angioplasty in 1975, an inflated balloon opening the narrowed or blocked artery. In 1986, Jacques Puel implanted the first coronary stent at the University of Toulouse, and soon afterwards a Swiss cardiologist, Ulrich Sigwart, developed the first drug-eluting stent.

The men who developed five of the most dramatically effective and widely used drugs in internal medicine deserve mention.

In the late ‘30s, two Mayo Clinic biochemists hoping to cure rheumatoid arthritis, Philip Hench and Edward Kendall, isolated four steroids extracted from the cortex of the adrenal gland atop the kidneys. The fourth, “E,” was very difficult to synthesize, but Merck chemist Lewis Sarett succeeded, and in 1948 the hormone was injected into fourteen patients crippled by arthritis. Cortisone relieved the symptoms. Mass produced, with much difficulty, by Upjohn chemists in 1952, it was refined by their rivals at Schering three years later into a compound five times as strong, prednisone. In addition to arthritis, corticosteroids are used in the treatment of other inflammatory diseases, like colitis and Crohn’s, and in dermatitis, asthma, hepatitis, and lupus.

Anyone over fifty can remember peptic ulcers, extremely painful lesions on the stomach wall or duodenum. They were thought to be brought on by stress. “You’re giving me an ulcer!” was a common expression. Women were especially affected, and a bland diet was the only treatment, other than surgery. The lesions were caused by gastric acid, and two British pharmacologists and a biochemist, George Paget, James Black, and William Duncan, investigated compounds that would block the stomach’s histamine receptors, reducing the secretion of acid. There were endless difficulties. Over 200 compounds were synthesized, and the most promising, metiamide, proved toxic. Tweaking the molecule, replacing a sulfur atom with two nitrogen atoms, yielded cimetidine in 1976. As Tagamet, it revolutionized gastroenterology. It was also the first drug to generate over $1 billion in annual sales. Its successors, the proton pump inhibitors Prilosec and its near-twin Nexium, which more than double the acid reduction, have also been blockbuster drugs.

Cimetidine was the culmination of one line of research that began in 1910, when a London physiologist, Henry Dale, isolated a uterine stimulant he called “histamine.” Unfortunately, when it was given to patients, it caused something like anaphylactic shock. The search began for an “antagonist” that would block its production, even before it was recognized as the culprit in hay fever (allergic rhinitis). The most successful antagonist was one developed in 1943 by a young chemist in Cincinnati, George Rieveschl: diphenhydramine, marketed as Benadryl. Ten to thirty percent of the world’s population suffers from seasonal allergies, so this was hailed as a miracle drug. In the early ‘80s a second generation of antihistamines appeared that didn’t cross the blood-brain barrier and thus didn’t sedate the user. Loratadine (Claritin), the first, was generating over $2 billion in annual sales before it went generic.

Diabetes, resulting in high blood glucose levels (hyperglycemia), has been known for two millennia. It was a deadly disease: type 1 rapidly fatal; type 2, adult onset, debilitating and eventually lethal. By the end of the 19th century, the islets of Langerhans in the pancreas had been identified as the source of a substance that prevented it, insulin, but this turned out to be a fragile peptide hormone, broken down by an enzyme in the pancreas during attempts to extract it. In 1921, Canadian surgeon Frederick Banting and medical student Charles Best determined a way to disable the production of the enzyme, trypsin. Injected into a teenager with type 1 diabetes, insulin was immediately effective. There is still no cure for diabetes, but today the 380 million sufferers globally can live normal lives thanks to Banting and Best.

Finally, millions of men and their wives and girlfriends owe a big debt to British chemists Peter Dunn and Albert Wood, and Americans Andrew Bell, David Brown, and Nicholas Terrett. They developed sildenafil, intended to treat angina. It works by suppressing an enzyme that degrades a molecule that relaxes smooth muscle tissue, increasing blood flow. Ian Osterloh, running the clinical trials for Pfizer, observed that the drug induced erections, and it was marketed for ED. Viagra made the cover of Time Magazine after it was approved in March 1998. The blue pill still generates about $2 billion annually in sales, despite competition, and is prescribed for 11 million men.

Two incredible machines built in the mid-20th century revolutionized the practice of medicine. Both remove blood from the body.

During World War II, the Dutch physician Willem Kolff constructed a machine to cleanse the blood of patients suffering from renal failure by filtering out urea and creatinine. Over 400,000 Americans are on dialysis today.

In 1953, after 18 years of work, John Gibbon, a surgeon at the University of Pennsylvania, produced a machine that oxygenated blood and pumped it around the body, permitting operations on the heart, like those performed a decade later by Michael DeBakey in Houston and René Favaloro in Cleveland. The two surgeons pioneered coronary bypass grafts, using a blood vessel from the leg or chest to re-route blood around a blocked artery. About 200,000 operations are performed each year, down from about 400,000 at the turn of the century, thanks to stents. Gibbon’s machine enabled the most widely covered operation in history, the heart transplant, first performed by South African surgeon Christiaan Barnard in 1967, based on research by Norman Shumway and others. Over 2,000 Americans receive heart transplants each year.

The cardiac device Americans are most likely to encounter is the defibrillator, now in airports, stadiums, supermarkets, and other public places. Thank two Swiss professors, Louis Prévost and Frédéric Batelli, who, in 1899, induced ventricular fibrillation, an abnormal heart rhythm, in dogs with a small electrical shock, and restored normal rhythm with a larger one. It was not until the 1940s that a defibrillator was used in heart surgery, by Claude Beck in Cleveland. A Russian researcher during World War II, Naum Gurvich, discovered that biphasic waves, a large positive jolt followed by a small negative pulse, were more effective, and a machine was constructed on this basis by an American cardiologist, Bernard Lown. Improvements by electrical engineers William Kouwenhoven and Guy Knickerbocker, and cardiologist James Jude at Hopkins in 1957, and subsequently by Karl and Mark Kroll, and Byron Gilman in the ‘90s made the device much smaller and portable.

Over three million people worldwide don’t have to worry about defibrillators or face open-heart surgery. These are the recipients of pacemakers, and can thank a Canadian electrical engineer, John Hopps. Predecessors were deterred by negative publicity about their experiments, which were believed to be machines to revive the dead. Gurvich had faced this as well. Hopps’ 1950 device used a vacuum tube. With the invention of the transistor, a wearable pacemaker became possible, and Earl Bakken designed one in 1958. Not long afterward, two Swedish engineers, Rune Elmquist and Åke Senning created an implantable pacemaker. The first recipient eventually received 26 and lived to age 86. Lithium batteries, introduced in 1976, enabled the creation of devices with a much longer life.

Cardiac Drugs

Cardiac stimulants have been around since the late 18th century. Thank William Withering, who published his experiments with the folk-remedy digitalis (from foxglove) in 1785.

Anti-anginal drugs were introduced a century later, also in Britain: amyl nitrite in the mid-1860s and nitroglycerin a decade later. Both compounds had been synthesized by French chemists. Thank Thomas Lauder Brunton and William Murrell.

The first diuretics, to reduce edema (swelling) and lower blood pressure, were alkaloids derived from coffee and tea. These were not very effective, but better than leeches. Mercury compounds were pioneered by the Viennese physician Arthur Vogel in 1919. These worked, but were tough on the kidneys and liver. The first modern diuretics, carbonic anhydrase inhibitors, were developed in the 1940s, with the American Karl Beyer playing a leading role.

The first anti-coagulants date from the ‘20s. A Johns Hopkins physiologist, William Howell, extracted a phospholipid from dog livers that he called heparin and that appeared to prevent blood clots. The first modern anti-coagulant, and still the most widely prescribed, was warfarin (Coumadin), developed as a rat-poison by Karl Link in Wisconsin in 1948. Its effectiveness, and lack of toxicity, was revealed when an army recruit took it in a suicide attempt.

Anti-arrhythmic drugs, to stabilize the heartbeat, were introduced in the opening decade of the 20th century. The first was derived from quinine. The big breakthrough occurred in 1962. Thank, once again, the Scotsman James Black, who synthesized propranolol in that year, the first beta-blocker. What they block are the receptors of epinephrine and norepinephrine. These two chemicals (catecholamines) increase the heart rate, blood pressure, and blood glucose levels, useful for many purposes, but not a good thing in patients with cardiac arrhythmia, irregular heartbeats. Beta-blockers are also prescribed to lower blood pressure.

ACE inhibitors lower the levels of an enzyme secreted by the kidneys and lungs that constricts blood vessels. The unpromising source for the first inhibitor was the venom of the Brazilian pit-viper. It was extracted, purified, and tested by three Squibb scientists in 1975, David Cushman, Miguel Ondetti, and Bernard Rubin. It’s still widely prescribed, though many other ACE inhibitors have since been designed. They are used for patients with congestive heart failure or who have had a heart attack, as well as those with hypertension.

Finally, mention must be made of the statins, which, though over-hyped and over-prescribed, lower serum cholesterol and reduce the risks of a second heart attack. A Japanese microbiologist, Akira Endo, derived from a species of Penicillium a substance that inhibited the synthesis of cholesterol, but it was too toxic to use on humans. In 1978, a team at Merck under Alfred Alberts had better luck with another fungus, and called the compound lovastatin. Statins work by inhibiting the activity of an enzyme called HMG-CoA reductase (HMGR).

Cancer Drugs

In the forty-three years since Richard Nixon’s “war on cancer” was launched, the disease has received the lion’s share of government, foundation, and pharmaceutical industry funding, though heart disease kills more people -- 596,577 Americans last year to 576,691 for cancer, according to the most recent data. This makes it particularly difficult, and invidious, to single out individual researchers.

There is still, of course, nothing close to a magic bullet, though cancer deaths have dropped about 20% since their peak in 1991. Around 27% of cancer deaths this year will be from lung cancer, so the rate will continue to fall as more people stop smoking.

The originators of a few therapies with good five-year survival rates ought to be singled out and thanked.

Seattle oncologist Donnall Thomas performed the first successful bone marrow transplant in 1956. The donor was an identical twin of the leukemia patient. With the development of drugs to suppress the immune system’s response to foreign marrow, Thomas was able to perform a successful transplant from a non-twin relative in 1969. About 18,000 are now performed each year.

One of the more notable successes of chemotherapy has been in the treatment of the childhood cancer acute lymphoblastic leukemia (ALL). Sidney Farber in the late ‘40s carried out clinical trials with the antifolate aminopterin, synthesized at Lederle by the Indian biochemist Yellapragada Subbarow. This proved the first effective compound in treating the disease. It was superseded by methotrexate, and now, as in all chemo treatments, a combination of agents is used. The five-year survival rate for ALL has jumped from near zero to 85%.

Early detection is the key to successful treatment in all cancers, and survivors of breast cancer can thank at least four men who pioneered and popularized mammography over a fifty-year period beginning in 1913: Albert Salomon, Stafford Warren, Raul Leborgne, and Jacob Gershon-Cohen.

A second key to the comparatively high survival rates for women with breast cancer is tamoxifen. First produced in the late ‘50s by British endocrinologist Arthur Walpole, it was intended as a “morning-after” birth control pill because it blocked the effects of estrogen. However, it failed to terminate pregnancy. Researchers had meanwhile discovered that some, though not all, women with breast cancer recovered when their ovaries were removed. Walpole thought tamoxifen might block breast cancer estrogen receptor cells, inhibiting their reproduction, and persuaded a professor of pharmacology, Craig Jordan, to conduct experiments. These demonstrated the drug’s efficacy, and after clinical trials it was approved and marketed in 1973. Think of Arthur W. the next time you see one of those ubiquitous pink ribbons.

Most chemo agents are cytotoxic compounds (some of them metal-based, like the platinum drugs) that do not distinguish between abnormal cells and healthy cells that also divide rapidly. The nasty side effects range from hair loss and nausea to decreased production of red blood cells, nerve and organ damage, osteoporosis and bone fusion, and loss of memory and cognition. More selective drugs, monoclonal antibodies, have been used for some time. These were first produced by Georges Köhler and César Milstein in 1975 and “humanized” by Greg Winter in 1988, that is, re-engineered so that most of their sequence is human rather than mouse, making them more effective and less likely to provoke an immune reaction. Over 30 “mab” drugs have been approved, about half for cancer.

Research has also been underway for years into delivery systems using “nano-particles” that will target tumors exclusively. Another approach, pioneered by Judah Folkman, has been to find drugs that will attack the blood supply of tumors, angiogenesis inhibitors. This turned out not to be the magic bullet Folkman hoped for, but more than fifty of these drugs are in clinical trials, and a number that work through other mechanisms are already in use.

Psychiatric medicine

Drugs have revolutionized the practice of psychiatry since the 1950s, and brought relief to millions suffering from depression, anxiety, and psychoses. For obvious reasons, these are some of the most highly addictive and widely abused drugs.

Bernard Ludwig and Frank Berger: meprobamate, the tranquilizer Miltown. By the end of the ‘50s, a third of all prescriptions in America were for this drug.

Leo Sternbach: the anxiolytic (anti-anxiety) benzodiazepines, first synthesized in 1955. The most successful initially was diazepam, Valium, marketed in 1963. The most widely prescribed benzodiazepine today is alprazolam, Xanax. It’s also the most widely prescribed psychiatric drug, with nearly 50 million prescriptions. It enhances the effect of GABA, the brain’s main inhibitory neurotransmitter, damping the stress response.

Leandro Panizzon: methylphenidate (Ritalin). The Swiss chemist developed it in 1944 as a stimulant, and named it after his wife, whose tennis game it helped improve. Until the early ‘60s amphetamines were used, counter-intuitively, to treat hyperactive children. Thirty years after its patent expired, the controversial dopamine re-uptake inhibitor is still the most widely prescribed medication for the 11% of children who’ve been diagnosed with ADHD.

Klaus Schmiegel and Bryan Molloy: the anti-depressant fluoxetine, the first SSRI (selective serotonin reuptake inhibitor), which increases serotonin levels. Marketed as Prozac in 1988, it made the cover of Newsweek and is still prescribed for over 25 million patients.

Paul Janssen: risperidone (Risperdal), the most widely prescribed antipsychotic drug worldwide. The Belgian researcher developed many other drugs as well, including loperamide HCL (Imodium). When commenters on web articles advise trolls to take their meds, they might want to specify risperidone.

Seiji Sato, Yasuo Oshiro, and Nobuyuki Kurahashi: aripiprazole (Abilify), a partial agonist at dopamine receptors, and the top-selling drug at the end of 2013, grossing $1.6 billion in Q4.

A few observations.

Japanese and Indian researchers will make important contributions to future drugs, as the trio responsible for Abilify reminds us.

And, naturally, some women have played roles in the advances that have been summarized. Mary Gibbon, a technician, assisted her husband on the heart-lung machine. Lina Stern did important research on the blood-brain barrier, and it was in her lab that Gurvich improved the defibrillator. Jane Wright conducted early trials of methotrexate that helped demonstrate its efficacy. Lucy Wills did pioneering work on anemia in India. Rosalyn Yalow helped develop radioimmunoassay, which measures concentrations of antigens in the blood. Anne-Marie Staub did interesting work on antihistamines, though her compounds proved toxic.

They are exceptions. Our benefactors have not only been overwhelmingly European males, but are mostly from England and Scotland, Germany, France, Switzerland, and the Netherlands, as well as Americans and Canadians whose families emigrated from those countries. And, of course, Jews, who’ve won 28% of the Nobel Prizes in Medicine.

Some of the beneficiaries in particular might want to think about this.

Muslims boast that their faith has over 2 billion followers throughout the world. If this number is accurate, it has less to do with the appeal of Islam or with Arab or Turkish conquests than with the work of some Northern Europeans and Jews, along with the “imperialists” who built roads, canals, and ports and the vehicles that use them, as well as schools and hospitals -- like the traveling eye clinics in Egypt funded by the Jewish banker Ernest Cassel, which nearly eliminated blinding trachoma, then endemic.

The fact that we in the U.S. idolize our entertainers as no society has before is not going to cut off the supply of outstanding medical researchers. Very bright and inquisitive people usually don’t pay much attention to popular culture. But it diminishes us.

It’s the ingratitude, though, not the indifference, that’s more troubling.

Biting the hand that feeds is a core principle of Leftists. For 150 years, they’ve sought to concentrate power in their own hands by exploiting the resentment of ignorant people against a system that has finally enabled mankind to spring the Malthusian trap.

Multiculturalism, with its simple-minded relativism, has broadened the scope of the party line. Not only shadowy “capitalists” are vilified, but whites and males. Ignorant people can now think well of themselves by opposing “racism” and “the patriarchy” -- and by voting for an unqualified and deceptive poseur because, though a male, he is not white.

The first step parents can take to help spare America from being “fundamentally transformed” is to insist that history be properly taught. This means, among other things, recognizing the accomplishments of a few men who’ve found cures for or relieved the symptoms of diseases that have killed and tortured humans for millennia.

Wednesday, November 26, 2014

I owned a taxi for five years and got to know a lot of hookers. I even got to know the woman who ran the escort service and sometimes went to dinner with her.

It was an interesting job, and an enlightening one.

I've recently read some articles about how some men are seeing hookers for "emotional fulfillment." I knew that 25 years ago, when I found many of the men didn't have any sort of sexual contact with the girls. They sometimes took them on dates and sometimes stayed home and watched movies. That astonished me, that a man would pay a girl $100 to keep him company for two hours.

In sad fact, some of the guys wanted to see the girls again just for those reasons - emotional fulfillment.

I've been an antenna my entire life. Since I was about six, really. Had I thought about it - and I did a bit - I would have realized that men paying women for some sort of brief emotional connection presaged some big problems later on. Which it has. Sex is not the problem - being driven nuts by being alone is.

One of the worst things to do to a prisoner is solitary confinement, with no human contact. Because of what I experienced, I can guarantee you these guys have some sick sexual fantasies.

There were also married men - and there weren't that many - who saw the girls on business trips. It made me wonder what their home life was like. I figured, not too good, because the guys always called for the same girl. Emotionally connected to her in some way, you might say.

Other men were the type who couldn't get any girl because of some physical defect, and those were the ones who had been ostracized for so long they had gotten sexually weird and wanted to humiliate the girls. Revenge - power, domination and control. Sort of a hate. I once saw a girl come out of one of those men's houses and then start crying.

I only did it for three months, I think, and after that I was so sickened I quit. I wasn't so much sickened by the guys as by the girls. Whores are a weird, weird bunch, and none of them are very smart and none of them have hearts of gold. I thought they were mentally ill, more than anything else.

I mean, what kind of girl becomes a whore at 15? I have seen that.

The guys I met who paid for the girls to keep them company weren't bad-looking at all. I guess, for some unknown reason, they couldn't find a woman, so instead of sitting alone on a Saturday night they decided to call Rent-a-Woman. Better two hours of company than staring at a TV screen by yourself.

I have found this loneliness affects women who have hit the Wall, only they oftentimes turn into hostile, hateful shrikes. It never occurs to them they are experiencing what many men experienced when younger.

I'm not sure about this Neanderthal/Sapiens split (the theory may have been forbidden at one time, but not anymore), but overall this is a good introduction to r/K theory.

Personally I see a huge split between stupid extroverts and smart introverts. I also consider leftists to be mentally-ill children.

Intro of the r vs. K concept

r/K selection theory is an incredibly powerful paradigm, capable of flaying a liberal’s mind faster than any other concept on the planet.

r stands for reproductive focus – the rabbit strategy of pumping out lots of cheap offspring and fucking whatever moves. It’s a response to overwhelming predation.

K stands for competitive focus – the wolf strategy of mating for life and raising pups together, teaching them to hunt and integrate socially with the pack. It’s a response to selective pressure for individual excellence.
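For what it's worth, the letters themselves come from ecology's logistic growth model, dN/dt = rN(1 - N/K), where r is a population's intrinsic growth rate and K is the environment's carrying capacity. A minimal Python sketch (the parameter values here are mine, chosen purely for illustration) shows the two strategies side by side: the high-r "rabbit" population explodes toward capacity almost immediately, while the low-r "wolf" population gets there slowly.

```python
# Where the labels come from: the logistic growth model
#   dN/dt = r * N * (1 - N / K)
# r = intrinsic growth rate, K = carrying capacity of the environment.

def simulate(n0, r, k, steps, dt=0.1):
    """Euler-integrate logistic growth and return the population history."""
    n = n0
    history = [n]
    for _ in range(steps):
        n = n + r * n * (1.0 - n / k) * dt
        history.append(n)
    return history

# Illustrative numbers only: an r-strategist ("rabbit") population with a
# high growth rate, and a K-strategist ("wolf") population with a low one,
# both in an environment that supports at most 1000 individuals.
rabbits = simulate(n0=10, r=1.5, k=1000, steps=200)
wolves = simulate(n0=10, r=0.3, k=1000, steps=200)

# Early on the rabbits are far ahead; after any crash, the same gap
# reopens, because a high r repopulates to carrying capacity fast.
print(round(rabbits[50]), round(wolves[50]))
```

This is the mechanical content behind "fecundity rapidly repopulates to carrying capacity": it's just the steep part of the high-r curve.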

A racist example

Since I’m a straight to the point kind of guy, I’ll start with the most racist example possible.

Negro Sub-Saharan Africa is r-selective. Selection pressures overwhelm individual merit, taking random individuals and population swathes. Fecundity rapidly repopulates to carrying capacity. A farmer who attempts to build or save will be attacked by his neighbors because he threatens their long-term r investment.

Also, this entire post is written in reaction to the free content written by Anonymous Conservative at http://www.anonymousconservative.com. If you’re unable to access his site, and you get a “missing index file” error, try using this proxy: http://www.freewebproxy.asia.

Mechanisms of r/K expression shifts in society

There are three primary mechanisms by which the ratio of r/K expression shifts in a society:

1. Environmental cues alter psychology and biology during childhood development
2. Changes in the population’s underlying genetic ratio
3. Changes in social promotion

Together, these produce the cycle of history, in which hardship creates a superior K selected group, which conquers, prospers, becomes r-selected, and collapses.

r/K theory also fully explains the American political bipolar spectrum of left-liberal vs. right-conservative. However, the story is a little more complex, because America is a decadent empire in decline, so both sides are mostly r.

Let’s start by talking about some of the simpler implications of r/K theory first, before we tackle hairy topics like contemporary America.

Sex

Simple enough for ya?

Homosexuality is a dishonorable mating strategy

In one of the pdfs, AC (Anonymous Conservative) describes the complex and chivalrous mating combat of the Australian Giant Cuttlefish. A subset of males cheat at this competition, adopting the coloration and appearance of a female in order to skip the combat, mate quickly, and sneak away.

This example finally provides an evolutionary justification for homosexuality. Mimicking a female gives the Anticompetitive cuttlefish access to females, which he would otherwise never acquire. Likewise, almost all human fags are bisexual, and many men become gay only after failing with women. Being gay permits the occasional “experimental” bang with a girlfriend. Hence the K male’s aversion to fags and fag hags. To quote:

“the Anticompetitor is designed to avoid engaging in these competitions, while still seeking advantage within the competitive environment through violations of the rules of honor which govern the competition.”

Hatred for liberals is genetic

These quotes are all coming from AC’s website:

“…Competitors who evolved to revile those who violated the Competitive strategy. These groups would easily dominate such a group competition. Individuals that were imbued with a fierce contempt for cowardice, a hatred for selfishness, and aversions towards such behaviors as interference in free competitions between men, opportunistic advantage taking, rule breaking, sexual sneaking and disloyalty to the group would form, and function within, successful groups unusually well.”

In other words, homophobia is not latent homosexuality, it is a rational genetic strategy. As is hating hippies.

Feminism is r-type

“In r-type populations, females exhibit more male traits, such as increased size, aggression, and competitiveness. In this milieu, this is an effective aspect of an r-strategy, as r-females need to both provide for their offspring, and fend off threats, due to male abandonment.

It is interesting that modern feminism, so often associated with the left, exhibits a denigrating view of the rewards offered by offspring rearing, an embrace of sexual liberation for women (ie promiscuity), a denigrating view of men which would facilitate short-term mating relationships, as well as an increased drive to compete aggressively alongside males, in traditionally male endeavors.”

Likewise, r-type males will become more effeminate.

A deeper look at r-selected psychology and childhood development

Next let’s look at those twisted little minds.

Liberalism is brain damage

“Amygdala damage in monkeys produces total loss of threat aversion, hyper-sexuality with inappropriate partners, and diminished child rearing investment.” (All of which are r-type strategies.)

An unstable childhood makes for a liberal adult

“an Anticompetitor is likely to be an individual who has received cues in childhood indicating that as an adult, they will prove uncompetitive with Competitor peers.”

Whereas, if the environment was more hospitable,

“Individuals would then adopt a psychology geared towards an adherence to social rules, diminished personal selfishness, monogamy, formation of stronger, more loyal pair bonds between mates (itself an exhibition of competitive intent), and they would actually delay the onset of puberty, possibly in an effort to optimally increase maturity and ability prior to competing for a mate”

So a liberal is someone who,

“…during the earliest years of an individual’s life, would detect cues within their environment that survival was going to prove difficult, and their life would likely be short and harsh. These individuals would develop a psychology geared towards opportunistic advantage taking, rule breaking, promiscuity, depersonalization of mates, and they would also enter puberty earlier. It was proposed that the adoption of this psychological and biochemical path was an attempt to simply mate and reproduce as quickly as possible, with as little investment in childrearing as possible”

In other words, a born loser.

Danger sign

Younger brothers tend to be more r-type.

The older brother or sister dominates him in childhood, signaling that he will be outcompeted.

Thus, the first child should be a son to maximize K.

(Take that, little bro.)

Liberals are fattie-fucking losers.

A summary of the liberal mind:

“One strategy will pessimistically avoid the fear of a competition that they feel destined to lose, while mating desperately with any mate available.”

Liberals cannot face the world without government protection.

“What is described in the paper is a desire to restrict individual actions through rules, so as to eliminate uncertainty in interpersonal outcomes.”

T. gondii exacerbates this – cat ownership. Ouch.

Meat eating is conservative, vegetarianism is liberal

Herding or hunting are high skill, high investment competitive activities for apex predators. Foraging isn’t. The brain responds to the diet by shifting in either direction. Neanderthals were herders; Sapiens foragers.

“Watership Down” understands liberalism perfectly

If you’ve read “Watership Down”, you’ll remember the episode of the warren of the snares. The protagonists were instantly welcomed, though they were outsiders. The vivid sickness of the warren’s mad poet. The wrongness and easy sex of the place – all driven by overwhelming predation. The horror of the natives when the snare was defeated – liberals don’t want to win.

For a perfect example of the sickness of our contemporary poets, see the film “The Wall” by Pink Floyd. He wallows in his agonized pleasure as his one-night-stand bangs another r-type. He despises and caricatures fascist K values. He’s obsessed with the mass warrior culling of WWI, and with creating a socialist society, and with rebelling against social rules. A perfect journey through a crippled and pathetic psyche.

The Neanderthal female

If all that was too depressing, here’s a lighter note. Once upon a time, women judged a man for his chivalric grace in defeat, as much as his prowess in victory. With the return of K-selection, that time will come again.

“a Competitor male’s willingness to shoulder the risk of failure in competition likely arose as a sacrifice, wisely demanded by the females looking to maximize their own competitive advantage.”

Who’s more warlike: K’s or r’s?

K’s get a bad rap because they’re warriors. But warriors do less fighting than cowards, at the end of the day. Here’s why.

The psychopathic leader problem of r-societies

One of the main reasons r societies tend to commit so much government violence is that they are dominated by psychopaths. These psychopaths must stay in power, despite ruling an incompetent population that demands perpetual abundance. The solution is always to kill a bunch of people.

AC writes of Communism/Socialism/Marxism:

“…every such movement has its leaders chosen by individuals with deficient amygdalae, who cannot see the threat until it confronts them openly, and are programmed to appease any such threat once it is perceived.”

Hello, Stalin/Mao/Pol Pot/ *snort* Kim Il Jong.

K’s are not the war party

AC identifies K’s as the War Party. This is not quite correct. K’s are not AFRAID of war. r’s are. However, r’s will tend to engage in far more war than K’s. This is because K’s are fundamentally isolationist, while r’s are fundamentally universalist.

When you combine r’s universalism, tendency to promote predatory leaders, and general incompetence, you get a recipe for psychopaths misdirecting attention from home misrule by blaming and demonizing the Other and engaging in endless wars of aggression or internal democide.

The same pattern simply does not occur in K-dominated societies. A rational K is unlikely to engage in wars of aggression unless he possesses an overwhelming military advantage that renders cost negligible. An r will engage in wars regardless of cost. And a K will never commit democide because it violates chivalry and sanctity of life, while an r values neither.

The classic example of a supposed K-driven war is the Third Reich’s blitzkrieg, eventual overreach and collapse. One could also point to the Napoleonic wars. However, these were actually patinas of K-rhetoric painted over heavily r-driven societies. Both were broken, weak, heavily socialist, and needed misdirection to permit the psychopaths at the top to maintain power over the r masses. For comparison, the Soviet Union was far bloodier, despite employing a purely r rhetoric.

More accurate examples of K driven wars would be European colonization of the territories of the low-IQ races, and the Mongol conquest of Asia. These were rationally justified wars supported by the rhetoric of competitive superiority. Likewise, we can see in Sparta a strong K-selective pattern, while Athens in its fall was seduced to stupidity by r dynamics which led to the rise of a foolish and predatory leadership.

The mainstream conservative movement in America employs a mixed r- and K-rhetoric to justify a mostly r platform, that is only K by comparison to the American leftist movement, which employs purely r-rhetoric to justify a more extreme version of the same thing. Contemporary America does not have a mainstream K movement, because it is a decadent empire.

Liberals practice total war, Neanderthals practice chivalry

Amongst themselves, K’s engage in ritual limited combat and abide by the results without envy, to permit the fittest to breed.

Liberals are programmed to break such rules. Thus their wars are total, and they frequently practice both genocide and democide on unspeakable scales.

r’s do not value life, since it is easily replaceable by promiscuous breeding. K’s value life, since they want to keep their group strong to defeat other groups.

The exception – K’s are willing to genocide inferior populations, as in colonialism and the Mongol hordes. r’s are psychologically horrified by genocide for inferiority’s sake (except when under the extreme duress characterized by modern socialist fascism), but will eagerly genocide in the name of righteousness, fairness, equality, etc.

Darwin was right, Dickens was wrong

“Conservatism will produce a society in which there will be a certain constant, low level of discomfort among those who lack ability, effort and determination. Those who fail to succeed will receive little help from the government, and will be forced to endure privation, while exerting themselves mightily.”

This is the only effective form of eugenic birth control – the constant low-grade stress of privation. Let the poor reproduce until they cannot stand the scarcity any longer.

The alternative?

“By contrast, Liberalism will tend to produce periods devoid of discomfort, as they redistribute resources to the more r-type psychologies, and other less successful individuals within the society. However, as the r-type individuals increase in number, you will see a gradual reduction of the population’s abilities, and a concomitant diminution in production. If history is a guide, this will produce a sudden collapse of the society, and sudden massive increase in discomfort for all. Thus where Conservatism would produce a small level of constant discomfort, Liberalism will produce alternating waves of greater comfort, and enormous misery.”

This is the true intended purpose of the Federal Reserve – to create a perpetual “period devoid of discomfort,” while preventing “a sudden collapse.”

“In short, Reagan was right. There is no left or right, there really is only an up or down. One strategy promotes pro-socialty, (K-type) morality, success, and Darwinian advancement, through simply granting men freedom. While a complementary ideology supports anti-sociality, (r-type) immorality, failure, and Darwinian devolution, through the imposition of uniform government oppression.”

And of course, instability and collapse guarantee war.

How genetic shifts in r/K ratio work

So how fast can a population shift its genetic ratio of r vs K genes?

Pro-K genetic shifts in r/K ratio happen slowly

AC assumes that a society can shift its genetic r/K ratio rapidly. I disagree. While it’s possible that major events like wars or class purges can decimate the K population, the r population is essentially ineradicable.

In fact, the only way to move the needle on r is to subject an insular breeding population to several hundred years of intense selection, à la the Ashkenazim.

Why? Because the population ratio is not 50/50. It’s 80/20, favoring r. And since K doesn’t do fecundity blooms, this means that it’s possible to LOSE K quickly, but not possible to GAIN it quickly.
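That asymmetry can be made concrete with a toy selection model. The assumptions here are mine and purely illustrative: the population starts at the 80/20 r/K split claimed above, a shock (war, purge) can halve the K share in a single generation, and under harsh conditions K types enjoy a small relative fitness edge per generation.

```python
# Toy sketch of the claim that K can be lost quickly but only regained
# slowly. Assumptions (mine, for illustration): an 80/20 r/K split, and a
# small per-generation relative fitness advantage (1 + s) for K types.

def generations_to_recover(k_frac, target, s):
    """Generations of steady selection needed for the K fraction to climb
    from k_frac back up to target, given K's per-generation edge (1 + s)."""
    gens = 0
    while k_frac < target:
        k_weight = k_frac * (1 + s)   # K share grows by its fitness edge
        r_weight = 1 - k_frac         # r share holds steady
        k_frac = k_weight / (k_weight + r_weight)
        gens += 1
    return gens

k_after_purge = 0.20 / 2   # a war or purge: 20% -> 10% in a single stroke

# Breeding the share back at a 2% per-generation edge takes roughly forty
# generations -- about a millennium at 25 years per generation.
print(generations_to_recover(k_after_purge, 0.20, s=0.02))
```

The loss is a one-generation event; the recovery is a compound-interest process, which is the whole point.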

So aside from genocide-level events and the rare accidental evolutionary hothouse, we need a different explanation for the shifts in societal tenor that occur so regularly throughout history.

We can arguably see permanent K-loss in many previously successful empires that seemingly exhaust themselves and sink into oblivion, never reviving, but instead being absorbed by other, more vigorous peoples. However, this explanation only applies to a small percentage of cases.

The major explanation for the “phases of empire” that Sir John Glubb has so astutely expounded, is not changes in genetic r/K ratio, but changes in social promotion. Early rising empires promote K’s to power and influence; late decadent ones promote r’s.

r societies never fully eliminate K’s

Even with the r’s in charge, things rarely get bad enough that the K’s can’t reproduce. r’s mostly fight amongst themselves – or prey on each other. Thus the genetic population ratio remains stable at around 80/20.

How to permanently defeat the r’s

Getting pretty damn tired of the r’s by now? Me too. So how do we get rid of those pesky cock-a-roaches?

Space might be the solution

If humanity ever manages to GET to space, we might enter a phase of permanently K-driven evolution. Space is a lot like glacial Europe – hostile to life.

How many think Obama is going to put a man on Mars?

Better start looking for plan B.

r’s are better at internal competition in large groups

The problem is that r’s have one natural superiority over K’s – a greater aptitude for navigating large dynamic social groupings. Thus, the more a society leaves environmental K pressure behind, the greater the probability that the r’s will take over.

K’s can neutralize r’s by becoming the Distant Other

r’s are psychological ninjas when it comes to defeating the Near Other, who does not detect the threat. But they are hardwired to appease the Distant Other.

When K’s forcefully assert their absolute distinctness from r’s, and begin to mercilessly ostracize and antagonize them, the r’s will flip to appeasement.

The best argument against Liberalism is that Neanderthals are a separate species. By underlining our implacable enmity and difference, we become sympathetic and non-threatening.

Makes no sense? Well, that just means your amygdala is working.

Hence the name of this post – “Death to the Extroverts.”

I’m dead serious. If it worked for the Black Panthers, it can work for us.

Tuesday, November 25, 2014

"A foolish faith in authority is the worst enemy of the truth." - Albert Einstein

I've had enough jobs to know most of my "bosses" were morons, and that especially includes MBAs from Harvard and Yale. I came to the conclusion years ago that much of college is worthless and sometimes even dangerous.

College and most jobs are based on hierarchy. There is nothing especially wrong with hierarchy, as long as those in charge are competent. That is, it is a legitimate hierarchy based on competent authority. College doesn't particularly lend itself to that. And that's the rub.

Everything is in reality a web or network. No one is in charge of any of it. Those networks work best when decentralized. And although there are various hierarchies in that network, the people at the tops of those hierarchies have to be the best ones.

I have found the Old Boys' Network really does exist, and in that hierarchy the stupid are promoted. That's an incompetent hierarchy within a network that will disappear. More competent, better networks will erase it - and this is how it should be.

Those networks or webs are not static but dynamic. The existence of those webs is static, but the fact that they change is dynamic. We just want them to change for the better, which they don't always do.

Bureaucracies are as static as can be.

These days, those people aren't the ones coming out of those creaky, decrepit institutions known as colleges. Look at what Khan Academy has done.

There are more women than men in college. The ignorant howl. Ninety percent of my time in college was worthless. All of high school was worthless. For all practical purposes I could have dropped out of school in the first grade, after I learned to read and write and do arithmetic. Nearly everything I learned, I taught myself.

Everything that can be decentralized will be decentralized, which is why I expect the U.S. to break up. It should have broken up before we ended up with the War between the States.

People should start at the bottom and work their way up. Like Jimmy Olsen, cub reporter in the old Superman series. My father, who was a high-school dropout and became a general contractor, started as a cub carpenter. Beginners were really called that - cubs.

People want to associate with people like themselves. Smart with smart. So I expect, with these decentralized networks, that the U.S. is going to split into a few or more nations, and one of them will be mostly composed of the smartest people. Now as to where this will be, I don't yet know.

Monday, November 24, 2014

Everything is an interconnected web and no one is in charge of it. Which makes Marxists and other statists pretty much insane.

Since everything is that web, we abstract from it "cause-and-effect" and sometimes in pretty simplistic ways.

Steven Johnson is best-known today for tracing historical connections. Gutenberg allowed everyone to read books. Then, since most readers past middle age were far-sighted, came glasses, which allowed close reading. So books led to glasses, which themselves only happened because of the invention of glass.

Before Johnson there was James Burke, with his book and TV series, Connections - a '70s version of Johnson.

But before either one was Leonard Read's "I, Pencil."

I am a lead pencil—the ordinary wooden pencil familiar to all boys and girls and adults who can read and write.

Writing is both my vocation and my avocation; that's all I do.

You may wonder why I should write a genealogy. Well, to begin with, my story is interesting. And, next, I am a mystery—more so than a tree or a sunset or even a flash of lightning. But, sadly, I am taken for granted by those who use me, as if I were a mere incident and without background. This supercilious attitude relegates me to the level of the commonplace. This is a species of the grievous error in which mankind cannot too long persist without peril. For, as the wise G. K. Chesterton observed, "We are perishing for want of wonder, not for want of wonders."

I, Pencil, simple though I appear to be, merit your wonder and awe, a claim I shall attempt to prove. In fact, if you can understand me—no, that's too much to ask of anyone—if you can become aware of the miraculousness which I symbolize, you can help save the freedom mankind is so unhappily losing. I have a profound lesson to teach. And I can teach this lesson better than can an automobile or an airplane or a mechanical dishwasher because—well, because I am seemingly so simple.

Simple? Yet, not a single person on the face of this earth knows how to make me. This sounds fantastic, doesn't it? Especially when it is realized that there are about one and one-half billion of my kind produced in the U.S.A. each year.

Pick me up and look me over. What do you see? Not much meets the eye—there's some wood, lacquer, the printed labeling, graphite lead, a bit of metal, and an eraser.

Innumerable Antecedents

Just as you cannot trace your family tree back very far, so is it impossible for me to name and explain all my antecedents. But I would like to suggest enough of them to impress upon you the richness and complexity of my background.

My family tree begins with what in fact is a tree, a cedar of straight grain that grows in Northern California and Oregon. Now contemplate all the saws and trucks and rope and the countless other gear used in harvesting and carting the cedar logs to the railroad siding. Think of all the persons and the numberless skills that went into their fabrication: the mining of ore, the making of steel and its refinement into saws, axes, motors; the growing of hemp and bringing it through all the stages to heavy and strong rope; the logging camps with their beds and mess halls, the cookery and the raising of all the foods. Why, untold thousands of persons had a hand in every cup of coffee the loggers drink!

The logs are shipped to a mill in San Leandro, California. Can you imagine the individuals who make flat cars and rails and railroad engines and who construct and install the communication systems incidental thereto? These legions are among my antecedents.

Consider the millwork in San Leandro. The cedar logs are cut into small, pencil-length slats less than one-fourth of an inch in thickness. These are kiln dried and then tinted for the same reason women put rouge on their faces. People prefer that I look pretty, not a pallid white. The slats are waxed and kiln dried again. How many skills went into the making of the tint and the kilns, into supplying the heat, the light and power, the belts, motors, and all the other things a mill requires? Sweepers in the mill among my ancestors? Yes, and included are the men who poured the concrete for the dam of a Pacific Gas & Electric Company hydroplant which supplies the mill's power!

Don't overlook the ancestors present and distant who have a hand in transporting sixty carloads of slats across the nation.

Once in the pencil factory—$4,000,000 in machinery and building, all capital accumulated by thrifty and saving parents of mine—each slat is given eight grooves by a complex machine, after which another machine lays leads in every other slat, applies glue, and places another slat atop—a lead sandwich, so to speak. Seven brothers and I are mechanically carved from this "wood-clinched" sandwich.

My "lead" itself—it contains no lead at all—is complex. The graphite is mined in Ceylon. Consider these miners and those who make their many tools and the makers of the paper sacks in which the graphite is shipped and those who make the string that ties the sacks and those who put them aboard ships and those who make the ships. Even the lighthouse keepers along the way assisted in my birth—and the harbor pilots.

The graphite is mixed with clay from Mississippi in which ammonium hydroxide is used in the refining process. Then wetting agents are added such as sulfonated tallow—animal fats chemically reacted with sulfuric acid. After passing through numerous machines, the mixture finally appears as endless extrusions—as from a sausage grinder—cut to size, dried, and baked for several hours at 1,850 degrees Fahrenheit. To increase their strength and smoothness the leads are then treated with a hot mixture which includes candelilla wax from Mexico, paraffin wax, and hydrogenated natural fats.

My cedar receives six coats of lacquer. Do you know all the ingredients of lacquer? Who would think that the growers of castor beans and the refiners of castor oil are a part of it? They are. Why, even the processes by which the lacquer is made a beautiful yellow involve the skills of more persons than one can enumerate!

Observe the labeling. That's a film formed by applying heat to carbon black mixed with resins. How do you make resins and what, pray, is carbon black?

My bit of metal—the ferrule—is brass. Think of all the persons who mine zinc and copper and those who have the skills to make shiny sheet brass from these products of nature. Those black rings on my ferrule are black nickel. What is black nickel and how is it applied? The complete story of why the center of my ferrule has no black nickel on it would take pages to explain.

Then there's my crowning glory, inelegantly referred to in the trade as "the plug," the part man uses to erase the errors he makes with me. An ingredient called "factice" is what does the erasing. It is a rubber-like product made by reacting rape-seed oil from the Dutch East Indies with sulfur chloride. Rubber, contrary to the common notion, is only for binding purposes. Then, too, there are numerous vulcanizing and accelerating agents. The pumice comes from Italy; and the pigment which gives "the plug" its color is cadmium sulfide.

No One Knows

Does anyone wish to challenge my earlier assertion that no single person on the face of this earth knows how to make me?

Actually, millions of human beings have had a hand in my creation, no one of whom even knows more than a very few of the others. Now, you may say that I go too far in relating the picker of a coffee berry in far-off Brazil and food growers elsewhere to my creation; that this is an extreme position. I shall stand by my claim. There isn't a single person in all these millions, including the president of the pencil company, who contributes more than a tiny, infinitesimal bit of know-how. From the standpoint of know-how the only difference between the miner of graphite in Ceylon and the logger in Oregon is in the type of know-how. Neither the miner nor the logger can be dispensed with, any more than can the chemist at the factory or the worker in the oil field—paraffin being a by-product of petroleum.

Here is an astounding fact: Neither the worker in the oil field nor the chemist nor the digger of graphite or clay nor any who mans or makes the ships or trains or trucks nor the one who runs the machine that does the knurling on my bit of metal nor the president of the company performs his singular task because he wants me. Each one wants me less, perhaps, than does a child in the first grade. Indeed, there are some among this vast multitude who never saw a pencil nor would they know how to use one. Their motivation is other than me. Perhaps it is something like this: Each of these millions sees that he can thus exchange his tiny know-how for the goods and services he needs or wants. I may or may not be among these items.

No Master Mind

There is a fact still more astounding: the absence of a master mind, of anyone dictating or forcibly directing these countless actions which bring me into being. No trace of such a person can be found. Instead, we find the Invisible Hand at work. This is the mystery to which I earlier referred.

It has been said that "only God can make a tree." Why do we agree with this? Isn't it because we realize that we ourselves could not make one? Indeed, can we even describe a tree? We cannot, except in superficial terms. We can say, for instance, that a certain molecular configuration manifests itself as a tree. But what mind is there among men that could even record, let alone direct, the constant changes in molecules that transpire in the life span of a tree? Such a feat is utterly unthinkable!

I, Pencil, am a complex combination of miracles: a tree, zinc, copper, graphite, and so on. But to these miracles which manifest themselves in Nature an even more extraordinary miracle has been added: the configuration of creative human energies—millions of tiny know-hows configurating naturally and spontaneously in response to human necessity and desire and in the absence of any human master-minding! Since only God can make a tree, I insist that only God could make me. Man can no more direct these millions of know-hows to bring me into being than he can put molecules together to create a tree.

The above is what I meant when writing, "If you can become aware of the miraculousness which I symbolize, you can help save the freedom mankind is so unhappily losing." For, if one is aware that these know-hows will naturally, yes, automatically, arrange themselves into creative and productive patterns in response to human necessity and demand—that is, in the absence of governmental or any other coercive masterminding—then one will possess an absolutely essential ingredient for freedom: a faith in free people. Freedom is impossible without this faith.

Once government has had a monopoly of a creative activity such, for instance, as the delivery of the mails, most individuals will believe that the mails could not be efficiently delivered by men acting freely. And here is the reason: Each one acknowledges that he himself doesn't know how to do all the things incident to mail delivery. He also recognizes that no other individual could do it. These assumptions are correct. No individual possesses enough know-how to perform a nation's mail delivery any more than any individual possesses enough know-how to make a pencil. Now, in the absence of faith in free people—in the unawareness that millions of tiny know-hows would naturally and miraculously form and cooperate to satisfy this necessity—the individual cannot help but reach the erroneous conclusion that mail can be delivered only by governmental "master-minding."

Testimony Galore

If I, Pencil, were the only item that could offer testimony on what men and women can accomplish when free to try, then those with little faith would have a fair case. However, there is testimony galore; it's all about us and on every hand. Mail delivery is exceedingly simple when compared, for instance, to the making of an automobile or a calculating machine or a grain combine or a milling machine or to tens of thousands of other things. Delivery? Why, in this area where men have been left free to try, they deliver the human voice around the world in less than one second; they deliver an event visually and in motion to any person's home when it is happening; they deliver 150 passengers from Seattle to Baltimore in less than four hours; they deliver gas from Texas to one's range or furnace in New York at unbelievably low rates and without subsidy; they deliver each four pounds of oil from the Persian Gulf to our Eastern Seaboard—halfway around the world—for less money than the government charges for delivering a one-ounce letter across the street!

The lesson I have to teach is this: Leave all creative energies uninhibited. Merely organize society to act in harmony with this lesson. Let society's legal apparatus remove all obstacles the best it can. Permit these creative know-hows freely to flow. Have faith that free men and women will respond to the Invisible Hand. This faith will be confirmed. I, Pencil, seemingly simple though I am, offer the miracle of my creation as testimony that this is a practical faith, as practical as the sun, the rain, a cedar tree, the good earth.