Art Diamond's web log

July 31, 2015

George Bailey Wanted to Make Money, But He Wanted to Do More than Just Make Money

(p. 219) Actually, it's not so strange. The norm for bankers was never just moneymaking, any more than it was for doctors or lawyers. Bankers made a livelihood, often quite a good one, by serving their clients-- the depositors and borrowers-- and the communities in which they worked. But traditionally, the aim of banking-- even if sometimes honored only in the breach-- was service, not just moneymaking.

In the movie It's a Wonderful Life, James Stewart plays George Bailey, a small-town banker faced with a run on the bank-- a liquidity crisis. When the townspeople rush into the bank to withdraw their money, Bailey tells them, "You're thinking of this place all wrong. As if I had the money back in a safe. The money's not here." He goes on. "Your money's in Joe's house. Right next to yours. And in the Kennedy house, and Mrs. Backlin's house, and a hundred others. Why, you're lending them the money to build, and they're going to pay you back, as best they can.... What are you going to do, foreclose on them?"
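George Bailey's point -- that the deposits are out in Joe's house and a hundred others, not sitting in the safe -- is the essence of fractional-reserve banking, and a toy balance sheet shows why a run is a liquidity problem rather than (necessarily) a solvency one. Every figure below is invented purely for illustration:

```python
# Toy fractional-reserve balance sheet. All numbers are made up.
deposits = 100_000          # what the bank owes its depositors
loans_outstanding = 90_000  # money "in Joe's house" and a hundred others
cash_reserves = deposits - loans_outstanding  # only 10,000 in the safe

withdrawal_demand = 40_000  # a panic: depositors want their cash now

# Solvent: total assets (loans plus cash) cover total deposits...
assert loans_outstanding + cash_reserves >= deposits

# ...but illiquid: the cash in the safe cannot meet the sudden demand.
shortfall = withdrawal_demand - cash_reserves
print(f"cash on hand: {cash_reserves}, demanded: {withdrawal_demand}, "
      f"shortfall: {shortfall}")
```

On these invented numbers the bank is fully solvent yet 30,000 short of the cash demanded, which is exactly the gap Bailey tried to bridge with his own money.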

No, says George Bailey, "we've got to stick together. We've got to have faith in one another." Fail to stick together, and the community will be ruined. Bailey took all the money he could get his hands on and gave it to his depositors to help see them through the crisis. Of course, George Bailey was interested in making money, but money was not the only point of what Bailey did.

Relying on a Hollywood script to provide evidence of good bankers is at some level absurd, but it does indicate something valuable about society's expectations regarding the role of bankers. The norm for a "good banker" throughout most of the twentieth century was in fact someone who was trustworthy and who served the community, who was responsible to clients, and who took an interest in them.

Source:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

July 23, 2015

Some Learn in Order to Gain Competence, Others Learn to Gain Direct Rewards

(p. 184) Think about two different tennis pros giving you tennis lessons. The first pro says things like "good shot" and "good swing" all the time, to encourage you. The second one says "good swing" only when you make a good swing. If hearing "good swing" gives you a hedonic charge, then you will prefer the first instructor to the second (more gold stars, more encouragement). But if what gives you the charge is getting better at tennis, you will prefer the second instructor to the first. That's because the second instructor's feedback to you is much more informative than the first one's. You're not after "good swing" gold stars; you're after a better tennis game. So feedback is essential to the development of a complex skill-- whether it be empathy or a strong forehand. But he-(p. 185)donic feedback, in the form of incentives, is not. It may even be counterproductive, as in the case of instructor number one.

In schools, tests provide an extremely important source of feedback-- of information-- to the teacher and the student-- about how things are going. Tests, or something like them, often offer the best way to diagnose problems and correct them. So tests as a source of information are good and important. The problem is that in addition to providing information, tests provide outcomes that students, and their parents, and their teachers, want and like-- outcomes like approval, prizes, awards, honors, special privileges, and school ratings. The hedonic character of these outcomes is what gets students and teachers to orient their work to passing the tests, and to regard what they do in the classroom as merely instrumental, as merely a means to various rewarding ends.

There are important differences between children oriented to getting A's and children oriented to learning from their mistakes. Psychologist Carol Dweck and her associates have spent thirty years studying the incentive systems that govern the learning of children throughout the educational process. They have uncovered two fundamentally different approaches to learning in kids that can often lead to profound differences in how well kids learn. One group of kids has what Dweck has called performance goals; the other group has what she has called mastery goals. Children with performance goals are primarily interested in gaining favorable judgments of their competence. They want to do well on tests. They want social approval. They want awards. Children with mastery goals are primarily interested in increasing their competence rather than in demonstrating it. They want to encounter things that they can't do and to learn from their failures. As Dweck puts it, performance-oriented children want to prove their ability, while mastery-oriented children want to improve their ability. Children with performance goals avoid challenges. They prefer tasks that are well within the range of their ability. Children with mastery goals seek challenges. They prefer tasks that strain the limits of their ability. Children with performance goals respond to failure by giving up. Children (p. 186) with mastery goals respond to failure by working harder. Children with performance goals take failure as a sign of their inadequacy and come to view the tasks at which they fail with a mixture of anxiety, boredom, and anger. Children with mastery goals take failure as a sign that their efforts, and not they, are inadequate, and they often come to view the tasks at which they fail with the kind of relish that comes when you encounter a worthy challenge.

Source:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

July 19, 2015

Should Students Read to Learn, or to Get Gold Stars?

(p. 181) When a consultant tells teachers to concentrate on the bubble kids and ignore the kids who are most in need of help, something has gone wrong. And if gold stars turn reading from an adventure into a job, something has gone wrong. But what? The typical response to examples like these is not to blame incentives but to blame "dumb" incentives. The presumption is that "smart" incentives, or at least "smarter" incentives, will do the job.

This is a mistake. In many situations, for many activities, no incentives are smart enough. Teachers like Deborah Ball and Mrs. Dewey spend their day figuring out how much time to spend with each student and how to tailor what they teach to each student's particular strengths and weaknesses. They are continually balancing conflicting aims-- to treat all students equally, to give the struggling students more time, to energize and inspire the gifted students. Along comes the incentive to bring up the school's test scores, and all the nuance and subtlety of Mrs. Dewey's moment-by-moment decisions go out the window. And what "smarter" incentive is going to replace judgment in making sensitive choices in a complex and changing context like a classroom?

Or what, exactly, would you incentivize to encourage hospital custodian Luke to seek the kind and empathetic response to the distraught father who wanted his son's room cleaned? Incentives are always based on meeting some specific, measurable criterion: read more books; raise more test scores; wash more floors. Left to his own devices, Luke asks himself, "What can I do to be caring?" and because he has moral skill, he comes up with a good answer. With "caring" incentivized, Luke (p. 182) might ask, "What do I have to do to get a raise or a bonus?" "Reclean the room" might be a right answer. "Look sympathetic" might be a right answer. "Be caring" surely is not. Aristotle thought that good
people do the right thing because it is the right thing. Doing the right thing because it's the right thing unleashes the nuance, flexibility, and improvisation that moral challenges demand and moral skill enables. Doing the right thing for pay shuts down the nuance and flexibility.

Source:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

July 18, 2015

A Cure for 'Conflict of Interest' Mania

(p. A15) The reality of modern medicine, Dr. Stossel argues, is that private industry is the engine of innovation, with productivity and new advances dependent on relationships between commercial interests and academic and research medicine. Companies, not universities or research with federal funding, run 85% of the medical-products pipeline. "We all inevitably have conflicts all the time. You only stop having conflicts when you're dead. The only conflict-free situation is the grave," he says.

The pursuit of the illusion "to be pure, to be priestly, to be supposedly uncorrupted by the profit motive," Dr. Stossel says, often has the effect of banishing or else discounting the expertise of the people who know the most but whose integrity and objectivity are allegedly compromised by industry ties. What ought to matter more, he adds, is simply "Results. Competence. LeBron James--it's putting the ball in the basket."

. . .

Zero-tolerance conflict-of-interest editorial policies, Dr. Stossel says, suppress and distort debate by withholding positions of authority. "If you have an industry connection, if you really understand the topic, you can't say anything," he notes. "If you're an editor, and you have an ideological predilection, you have all this power and you can say anything you want."

Dr. Stossel is equally scorching about the drug and device companies and their trade organizations, which he says drift around like Rodney Dangerfield, complaining they don't get no respect. They prefer not to be confrontational; they rarely fight back against the conflict-of-interest scolds. "They're laying responsibility by default to the patients, the people who actually have a first-hand connection to whatever the disease is: 'Goddammit, I want a cure.' "

Which is the larger point: The to-and-fro between publications not meant for lay readers can seem arcane, but the product of conflict-of-interest politics is fewer cures and new therapies. The predisposition against selling out to industry is pervasive, while reputations can be ruined overnight when researchers find themselves in a page-one exposé or hauled before Congress, even if there is no evidence of misconduct or bias.

Better, then, to conform in the cloisters than risk offending the conflict-of-interest orthodoxy--or translating some basic-research insight into a new treatment for patients. Dr. Rosenbaum reports: "The result is a stifling of honest discourse and potential discouragement of productive collaborations. . . . More strikingly, some of the young, talented physician-investigators I spoke with expressed worry about how any industry relationship would affect their careers."

. . .

"Pharmaphobia"--part polemic, part analytic investigation, a history of medicine and a memoir--deserves a wide readership. . . . "I'd rather get a conversation started with people who are smarter than I am about how complicated and granular and nuanced and unpredictable discovery is. Let's not slow it down."

(Note: the online version of the interview has the date June 26, 2015, and has the title "A Cure for 'Conflict of Interest' Mania; A crusading physician says medical progress is hampered by a holier-than-thou 'moralistic bullying.'")

July 11, 2015

Canny Outlaws in Education and at Hogwarts

(p. 174) Interestingly, the union members in some of the schools run by Green Dot Public Schools, a charter school group with a solid educational track record, did not boycott the benchmark tests. The reason that they refused is revealing. Green Dot's exams are created by a panel of teachers from its schools and are regularly reviewed for effectiveness and modified by the teachers. The tests have more credibility with the teachers than the tests for the rest of the district's schools, which are written by an outside company, imposed from above, and don't mesh with year-round schedules.

The quiet resistance of canny outlaws and the vocal protests of others are signs that teachers dedicated to preserving and encouraging discretion and wise judgment are not going quietly into the night. These teachers are not people who simply rebel at rules or who are just committed to their own ways of doing things. They are committed to the aims of teaching, a practice whose purpose is to educate students to be knowledgeable, thoughtful, reasonable, reflective, and humane. And they are brave enough to act on these commitments, taking the risks necessary to find ways around the rules. We suspect that many of our readers are canny outlaws themselves or know people who are: practitioners who have the know-how and courage to bend or sidestep for-(p. 175)mulaic procedures or rigid scripts or bureaucratic requirements in order to accomplish the aims of their practice. We admire canny outlaws in the stories we tell ourselves about such people and even in some of our children's stories. We read the Harry Potter tales to them because Harry, Ron, and Hermione are canny outlaws who gain the guts and skill to break school rules and stand up to illegitimate power in order to do the right thing to achieve the aims of wizardry, indeed to save the practice itself.

Source:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

July 9, 2015

Physicists Accepting Theories Based on Elegance Rather than Evidence

A few months ago in the journal Nature, two leading researchers, George Ellis and Joseph Silk, published a controversial piece called "Scientific Method: Defend the Integrity of Physics." They criticized a newfound willingness among some scientists to explicitly set aside the need for experimental confirmation of today's most ambitious cosmic theories -- so long as those theories are "sufficiently elegant and explanatory." Despite working at the cutting edge of knowledge, such scientists are, for Professors Ellis and Silk, "breaking with centuries of philosophical tradition of defining scientific knowledge as empirical."

Whether or not you agree with them, the professors have identified a mounting concern in fundamental physics: Today, our most ambitious science can seem at odds with the empirical methodology that has historically given the field its credibility.

July 5, 2015

"You Can't Get Married if You're Dead"

(p. A15) On Friday my phone was blowing up with messages, asking if I'd seen the news. Some expressed disbelief at the headlines. Many said they were crying.

None of them were talking about the dozens of people gunned down in Sousse, Tunisia, by a man who, dressed as a tourist, had hidden his Kalashnikov inside a beach umbrella. Not one was crying over the beheading in a terrorist attack at a chemical factory near Lyon, France. The victim's head was found on a pike near the factory, his body covered with Arabic inscriptions. And no Facebook friends mentioned the first suicide bombing in Kuwait in more than two decades, in which 27 people were murdered in one of the oldest Shiite mosques in the country.

They were talking about the only news that mattered: gay marriage.

. . .

The barbarians are at our gates. But inside our offices, schools, churches, synagogues and homes, we are posting photos of rainbows on Twitter. It's easier to Photoshop images of Justice Scalia as Voldemort than it is to stare evil in the face.

July 3, 2015

Officers Used to Learn from Trial and Error in Training Their Units

(p. 156) In the army, wartime experience is considered the best possible teacher, at least for those who survive the first weeks. Wong found another good one--the practice junior officers get while training their units. The decisions these officers have to make as teachers help develop the capacity for the judgment they will need on the battlefield. But Wong discovered that in the 1980s, the army had begun to restructure training in ways that had the opposite results.

Traditionally, company commanders had the opportunity to plan, (p. 157) execute, and assess the training they gave their units. "Innovation," Wong explained, "develops when an officer is given a minimal number of parameters (e.g., task, condition, and standards) and the requisite time to plan and execute the training. Giving the commanders time to create their own training develops confidence in operating within the boundaries of a higher commander's intent without constant supervision." The junior officers develop practical wisdom through their teaching of trainees, but only if their teaching allows them discretion and flexibility. Just as psychologist Karl Weick found studying firefighters, experience applying a limited number of guidelines teaches soldiers how to improvise in dangerous situations.

Wong's research showed that the responsibility for training at the company level was being taken away from junior officers. First, the time they needed was being eaten away by "cascading requirements" placed on company commanders from above. There was, Wong explained, such a "rush by higher headquarters to incorporate every good idea into training" that "the total number of training days required by all mandatory training directives literally exceeds the number of training days available to company commanders. Company commanders somehow have to fit 297 days of mandatory requirements into 256 available training days." On top of this, there were administrative requirements to track data on as many as 125 items, including sexual responsibility training, family care packets, community volunteer hours, and even soldiers who had vehicles with Firestone tires.

Second, headquarters increasingly dictated what would be trained and how it would be trained, essentially requiring commanders "to follow a script." Commanders lost the opportunity to analyze their units' weaknesses and plan the training accordingly. Worse, headquarters took away the "assessment function" from battalion commanders. Certifying units as "ready" was now done from the top.

The learning through trial and error that taught officers how to improvise, Wong found, happens when officers try to plan an action, (p. 158) then actually execute it and reflect on what worked and what didn't. Officers who did not have to adhere to strict training protocols were in an excellent position to learn because they could immediately see results, make adjustments, and assess how well their training regimens were working. And most important, it was this kind of experience that taught the commanders how to improvise, which helped them learn to be flexible, adaptive, and creative on the battlefield. Wong was concerned about changes in the training program because they squeezed out these learning experiences; they prevented officers from experiencing the wisdom-nurturing cycle of planning, executing the plan, assessing what worked and didn't, reevaluating the original plan, and trying again.

Source:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

June 29, 2015

Common Sense "Rules" Often Contradict Each Other

(p. 43) The world we face is too complex and varied to be handled by rules, and wise people understand this. Yet there is a strange and troubling disconnect between the way we make our moral decisions and the way we talk about them.

From ethics textbooks to professional association codes to our everyday life, any discussion of moral choices is dominated by Rules Talk. If we're asked to explain why we decided to tell the painful, unvarnished truth to a friend, we might say, "Honesty is the best policy." But if we're asked why we decided to shade the truth we might say, "If you can't say anything nice, don't say anything at all." It's clearly not a rule that is telling us what to do. Both maxims are good rules of thumb, but we don't talk about why we picked one and not the other in any particular case. "Better safe than sorry." But "He who hesitates is lost." "A penny saved is a penny earned." But "Don't be penny wise and pound foolish." When we hear the maxim, we nod. End of story. It's as if stating the rule is sufficient to explain why we did what we did.

Source:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

June 21, 2015

Empathy for the Absent

In Practical Wisdom the authors argue for empathy and against rules. There is something to be said for their argument.

But we tend to empathize with those who are present and not those we do not see or even know.

For example, in academic tenure and promotion decisions, slack is often cut for colleagues who already have a foot in the door. We know them, their troubles and challenges. So they are tenured and promoted and given salary increases and perks, even though there are others outside the door who may have greater productivity and even greater troubles and challenges.

Charlie Munger, in an interview at the University of Michigan, spoke of how hard it is for physicians to hold their peers responsible when they are incompetent or negligent. They have empathy for their peers, knowing their troubles and challenges. Munger also says few physicians are willing to suffer the long-lasting "ill will" of peers who have been held accountable. They do not know so well the patients who suffer, and one way or another, the patients are soon out of sight.

Likewise, in academics we do not know so well the students who suffer, or the able scholars who suffer, standing outside the door.

Following rules seems unsympathetic and lacking in empathy. But it may be the best way to show empathy for the absent.

The book mentioned is:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

June 18, 2015

Under Perverse Institutions, It Takes "Canny Outlaws" to Do What Is Right

Practical Wisdom is a hard book to categorize. It is part philosophy, and one of the co-authors is an academic philosopher. But most of the book consists of often fascinating, concrete examples. The examples are usually of perverse institutions and policies that create incentives and constraints that reward those who do bad and punish those who do good. The authors' main lesson is that we all should become stoical "canny outlaws" by finding crafty ways to do what is right, while trying to avoid or survive the perverse incentives and constraints.

Maybe. But for me the main lesson is that we all should get busy reforming those institutions and policies. Whichever lesson is the better one, their book is filled with many great examples that are worth pondering.

In the next few weeks, I will be quoting several of the more useful or thought-provoking passages.

The book discussed is:

Schwartz, Barry, and Kenneth Sharpe. Practical Wisdom: The Right Way to Do the Right Thing. New York: Riverhead Books, 2010.

June 11, 2015

My nephew has been downloading music and movies illegally from the Internet. Without sounding self-righteous, how can I get him to respect intellectual-property rights?

--Patricia

My own view on illegal downloads was deeply modified the day that my book on dishonesty was published--when I learned that it had been illegally downloaded more than 20,000 times from one overseas website. (The irony did not escape me.) My advice? Get your nephew to create something and then, without his knowing, put it online and download it many, many times. I suspect that will make it much harder for him to keep up his blithe attitude toward piracy.

For the full advice column by Dan Ariely, professor of behavioral economics at Duke, see:

June 2, 2015

Hamburger Grown in Lab from Cow Stem Cells

(p. D5) A hamburger made from cow muscle grown in a laboratory was fried, served and eaten in London on Monday in an odd demonstration of one view of the future of food.

. . .

The two-year project to make the one burger, plus extra tissue for testing, cost $325,000. On Monday it was revealed that Sergey Brin, one of the founders of Google, paid for the project. Dr. Post said Mr. Brin got involved because "he basically shares the same concerns about the sustainability of meat production and animal welfare."

The meat was produced using stem cells -- basic cells that can turn into tissue-specific cells -- from cow shoulder muscle from a slaughterhouse. The cells were multiplied in a nutrient solution and put into small petri dishes, where they became muscle cells and formed tiny strips of muscle fiber. About 20,000 strips were used to make the five-ounce burger, which contained breadcrumbs, salt, and some natural colorings as well.

May 12, 2015

Aaron Burr Gave Jeremy Bentham a Copy of The Federalist Papers

(p. 720) For four years, the disgraced Burr traveled in Europe, resorting occasionally to the pseudonym H. E. Edwards to keep creditors at bay. Sometimes he lived in opulence with fancy friends and at other times languished in drab single rooms. This aging roué sampled opium and seduced willing noblewomen and chambermaids with a fine impartiality. All the while, he cultivated self-pity. "I find that among the great number of Americans here and there all are hostile to A.B.-- All-- What a lot of rascals they must be to make war on one whom they do not know, on one who never did harm or wished harm to a human being," he recorded in his diary. He befriended the English utilitarian philosopher Jeremy Bentham and spoke to him with remarkable candor. "He really meant to make himself emperor of Mexico," Bentham recalled. "He told me I should be the legislator and he would send a ship of war for me. He gave me an account of his duel with Hamilton. He was sure of being able to kill him, so I thought it little better than murder." Always capable of irreverent surprises, Burr gave Bentham a copy of The Federalist. The shade of Alexander Hamilton rose up to haunt Burr at unexpected moments. In Paris, he called upon Talleyrand, who instructed his secretary to deliver this message to the uninvited caller: "I shall be glad to see Colonel Burr, but please tell him that a portrait of Alexander Hamilton always hangs in my study where all may see it." Burr got the message and left.

May 11, 2015

"Animals Have Complex Minds and Rich Emotional Lives"

(p. D6) We now know that species from magpies to elephants can recognize themselves in the mirror, which some scientists consider a sign of self-awareness. Rats emit a form of laughter when they're tickled. And dolphins, parrots and dogs show clear signs of distress when their companions die. Together, these and many other findings demonstrate what any devoted pet owner has probably already concluded: that animals have complex minds and rich emotional lives.

May 2, 2015

Fongoli Chimps, Where Prey Is Scarce, Show "Respect of Ownership"

(p. A10) The Fongoli chimpanzees live in a mix of savanna and woodlands where prey is not as abundant as in rain forests. There are no red colobus monkeys, and although the chimps do hunt young vervet monkeys and baboons, the much smaller bush babies are their main prey.

Dr. Pruetz argues that less food may have prompted both technological and social innovation, resulting in new ways to hunt and new social interactions as well. Humans evolved in a similar environment, and, as she and her colleagues write in Royal Society Open Science, "tool-assisted hunting could have similarly been important for early hominins."

. . .

By and large, said Dr. Pruetz, the adult males, which could take away a kill, show a "respect of ownership." Theft rates are only about 5 percent. The chimps she studies also have more mixed-sex social groups than chimp bands in East Africa.

Travis Pickering, an anthropologist at the University of Wisconsin, said that with less food available it seems that the Fongoli chimps, "have to be more inventive" and that "these hunting weapons even the playing field for non-adults and females."

April 11, 2015

Perceptual Diversity Puzzle: Is It White-and-Gold or Blue-and-Black?

"The dress in a photo from Caitlin McNeill's Tumblr site." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. B1) The mother of the bride wore white and gold. Or was it blue and black?

From a photograph of the dress the bride posted online, there was broad disagreement. A few days after the wedding last weekend on the Scottish island of Colonsay, a member of the wedding band was so frustrated by the lack of consensus that she posted a picture of the dress on Tumblr, and asked her followers for feedback.

"I was just looking for an answer because it was messing with my head," said Caitlin McNeill, a 21-year-old singer and guitarist.

. . .

Less than a half-hour after Ms. McNeill's original Tumblr post, Buzzfeed posted a poll: "What Colors Are This Dress?" As of Friday afternoon, it had (p. B5) been viewed more than 28 million times. (White and gold was winning handily.)

. . .

Politicians were eager to stake out their positions. "I know three things," wrote Senator Christopher Murphy, a Connecticut Democrat, on Twitter. "1) the ACA works; 2) climate change is real; 3) that dress is gold and white."

Sorry, senator. The dress, as we all now know, is blue and black. It goes for 50 pounds at Roman Originals, a British retailer.

. . .

Various theories were floated about why the dress looks different to different people. (No, if you see the darker hues of blue and black it doesn't mean that you are depressed.)

Duje Tadin, associate professor for brain and cognitive sciences at the University of Rochester, says it may be because of variations in the number of photoreceptors called cones in the retina that perceive the color blue. The human eye has about six million cones that are sensitive to green, red or blue. Signals from the cones go to the brain, which interprets them as color.

"It's puzzling," conceded Dr. Tadin. "When it comes to color, blue is always the weird one. We have the fewest number of blue cones." He added, "If you don't have very many blue cones, you may see it as white, or if you have plenty of blue cones, you may see more blue."
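Dr. Tadin's hypothesis -- that a weaker blue-cone signal tips perception toward white, and a stronger one toward blue -- can be sketched as a toy model. Everything below is an invented illustration: the pixel value, the "blue-cone weight" parameter, and the decision rule are assumptions for demonstration, not anything from the research.

```python
# Toy illustration of the quoted hypothesis: the same ambiguous pixel can be
# judged "white/gold" or "blue/black" depending on how strongly the blue
# signal is weighted. Pixel values and weights here are made up.

def judge_dress(rgb, blue_cone_weight):
    """Classify an ambiguous pixel as 'white/gold' or 'blue/black'.

    rgb: (red, green, blue) intensities in 0..255
    blue_cone_weight: assumed relative strength of the blue signal
    """
    r, g, b = rgb
    effective_blue = b * blue_cone_weight
    # If the weighted blue signal dominates the average of red and green,
    # the pixel reads as bluish; otherwise it reads as washed-out white.
    return "blue/black" if effective_blue > (r + g) / 2 else "white/gold"

ambiguous_pixel = (140, 130, 150)  # a washed-out bluish gray (invented)

print(judge_dress(ambiguous_pixel, blue_cone_weight=0.8))  # -> white/gold
print(judge_dress(ambiguous_pixel, blue_cone_weight=1.2))  # -> blue/black
```

The same input flips category as the weight crosses a threshold, which is the shape of the explanation Dr. Tadin offers: identical light, different weighting, different percept.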

. . .

The one thing scientists could agree on was that this is a very unusual illusion. People who see the dress one way do not eventually begin to see it the other way, as is common with many optical illusions. "This clearly has to do with individual differences in how we perceive the world," said Dr. Tadin. "There's something about this particular image that just captures those differences in a remarkable way."

April 4, 2015

Heckman Thinks that Economists Who Are Only Economists May Be Dangerous

The Journal of Political Economy, edited by the University of Chicago economics department, is one of the three or four most prestigious journals in the economics profession. For the last 20 years or so (if memory serves), the back cover of each issue has had a funny quote or an interesting or unusual anecdote related to some aspect of economics.

I was surprised to see that the quote on the back cover of the October 2014 issue was "suggested by James J. Heckman." Heckman is a Nobel Prize winner known mainly for developing new econometric techniques in the area of labor economics. When I was a graduate student at Chicago, his graduate students tended to be among those most oriented to formalism and technique. So I was surprised to see that he had suggested the following quote from neo-Austrian economist and fellow Nobel Prize winner F. A. Hayek:

(p. 463) But nobody can be a great economist who is only an economist--and I am even tempted to add that the economist who is only an economist is likely to become a nuisance if not a positive danger.

Source:

Hayek, F. A. "The Dilemma of Specialization." In The State of the Social Sciences, edited by Leonard D. White. Chicago: University of Chicago Press, 1956.

(Note: I do not have the book, and cannot find the page range of Hayek's article in the book.)

March 20, 2015

Moral Progress Accelerated in the 18th Century

(p. A11) For hundreds of years, people flocked to public hangings as a form of entertainment. Onlookers crowded into town squares and brought their families, reveling in the carnival atmosphere. Today most people are sickened at the idea of merriment at an execution. (Many are disturbed that executions take place at all.) We recoil from other once-common practices, too: slavery, the mistreatment of children, animal cruelty. Such shifts in attitude or belief surely constitute a form of moral progress and suggest, for once, that civilization is advancing and not receding.

. . .

Mr. Shermer defines moral progress as an "increase in the survival and flourishing of sentient beings," which he illustrates with graphs and charts that reveal, among other things, a decline in war-related deaths, the expansion of the food supply, the reduction in major epidemics, the growth of world GDP and the spread of democracy.

Humanitarian achievements in the West, Mr. Shermer notes, began in earnest [in] the 18th century. Yet the ability to reason ethically is not a product of the Enlightenment. A moral instinct seems to be present at birth: Even infants possess innate intuitions about fairness and reciprocity, as Mr. Shermer explains. All societies punish free riders. The Golden Rule and Babylon's Code of Hammurabi (advocating proportionate punishment) predate the ancient Greeks. So why did we need an Enlightenment to jump-start our moral progress?

March 17, 2015

Wealth Can Be Used for Self-Improvement, Not Just Trivial Pursuits

Hamilton, in a letter to his future wife:

(p. 145) I do not, my love, affect modesty. I am conscious of [the] advantages I possess. I know I have talents and a good heart, but why am I not handsome? Why have I not every acquirement that can embellish human nature? Why have I not fortune, that I might hereafter have more leisure than I shall have to cultivate those improvements for which I am not entirely unfit?

February 12, 2015

Former Nebraskan Writes that Football Breaks the Soul

(p. C1) The poet Erin Belieu was born in Nebraska. It's a place where, she once wrote,

football is to life what sleep deprivation is

to Amnesty International, that is,

the best researched and the most effective method

of breaking a soul.

Ms. Belieu got out, soul entirely unbroken. She's spent the past two decades composing smart and nettling books of poems, beginning with "Infanta" (1995), which was chosen for the National Poetry Series by Hayden Carruth. I've admired her three previous books, but her new one, "Slant Six," seems to me better by an order of magnitude. It's got more smoke, more confidence, more wit and less tolerance for obscurity. Her crisp free verse has as many subcurrents as a magnetic field.

(Note: the online version of the interview has the date DEC. 9, 2014, and has the title "From a Slim Book, Many Observations." The name of the interviewer, presumably the author of the italicized passage above, is not given in either the online or print versions.)

February 7, 2015

(p. D8) This week [the week starting Sun. January 25, 2015], China's ideological drive against Western liberal ideas broadened to take in a new target: foreign textbooks.

Meeting in Beijing with the leaders of several prominent universities, Education Minister Yuan Guiren laid out new rules restricting the use of Western textbooks and banning those sowing "Western values."

"Strengthen management of the use of original Western teaching materials," Mr. Yuan said at a meeting with university officials, according to Xinhua, the state news agency. "By no means allow teaching materials that disseminate Western values in our classrooms."

The strictures on textbooks are the latest of a succession of measures to strengthen the Communist Party's control of intellectual life and eradicate avenues for spreading ideas about rule of law, liberal democracy and civil society that it regards as dangerous contagions, which could undermine its hold on power.

On Jan. 19, the leadership issued guidelines demanding that universities make a priority of ideological loyalty to the party, Marxism and Mr. Xi's ideas.

Mr. Yuan's message this week spelled out how universities should do that.

"Never allow statements that attack and slander party leaders and malign socialism to be heard in classrooms," he said, according to the Xinhua report. "Never allow teachers to grumble and vent in the classroom, passing on their unhealthy emotions to students."

January 23, 2015

"It Is the Individual Who Is the Agent of the Action"

(p. C6) Mr. Mischel begins by describing how, in the late 1960s, he and his colleagues devised a straightforward experiment to measure self-control at the Bing Nursery School at Stanford University. In its simplest form, children between the ages of 4 and 6 were given a choice between one marshmallow now or two marshmallows if they waited 15 minutes. Some kids ate the marshmallow right away, but most would engage in unintentionally hilarious attempts to overcome temptation.

. . . About a third of the original subjects, the researchers reported, deferred gratification long enough to get the second treat.

. . . in 2006, . . . Mr. Mischel published a new paper in the prestigious journal Psychological Science. The researchers had done a follow-up study with the students they had tested 40 years before, examining the sort of adults they had grown into. They found that the children who were able to delay gratification had higher SAT scores entering college, higher grade-point averages at the end of college and made more money after college. Perhaps not surprisingly, they also tended to have a lower body-mass index.

. . .

In his commencement address, Adm. McRaven explained his final life lesson with an anecdote: "In SEAL training there is a bell," he explained. "A brass bell that hangs in the center of the compound for all the students to see. All you have to do to quit--is ring the bell. Ring the bell and you no longer have to wake up at 5 o'clock. Ring the bell and you no longer have to do the freezing cold swims. Ring the bell and you no longer have to do the runs, the obstacle course, the PT--and you no longer have to endure the hardships of training. Just ring the bell." To ring the bell is to give up.

Interestingly, one of Mr. Mischel's lesser-known marshmallow experiments had a similar setup, with a bell that the children could ring to call back the experimenter and save them from themselves. For the children, though, ringing the bell was not giving up but calling in the cavalry. His book is an encouraging reminder that, despite all the factors that urge us to indulge, "at the end of that causal chain, it is the individual who is the agent of the action and decides when to ring the bell." You are ultimately in control of your self.

(Note: the online version of the review has the date Sept. 19, 2014, and has the title "Book Review: 'The Marshmallow Test' by Walter Mischel; To resist the tempting treat, kids looked away, squirmed, sang or simply pretended to take a bite.")

October 31, 2014

Declaration and Constitution Built Upon Philosophical Radicals Locke, Spinoza, Epicurus and Lucretius

(p. C7) In Mr. Stewart's telling, the central tenets of "philosophical radicalism" worked their way into the Declaration of Independence and the Constitution by a kind of ideological stealth. When, for example, Jefferson referred in the first paragraph of the Declaration to "the separate and equal station to which the Laws of Nature and of Nature's God entitle" a nation, he wasn't just offering a palatable conception of deity to his religious or nominally religious readers. He was drawing on a radical tradition stretching back to John Locke and especially to the Dutch rationalist Baruch Spinoza, who himself had drawn on the ancient Greek philosophers Epicurus and Lucretius.

(Note: the online version of the review has the date July 25, 2014, and has the title "Book Review: 'Nature's God' by Matthew Stewart & 'Independence' by Thomas P. Slaughter; Was America's revolution driven by political philosophers, or practical men reacting to events?")

October 15, 2014

We Feel Safer When We Have More Personal Control

(p. C3) So how should we approach risk? The numbers can help, especially if we simplify them. For acute risks, a good measure is the MicroMort, devised by Stanford's Ronald A. Howard in the 1970s. One MicroMort (1 MM) is equal to a one-in-a-million chance of death.

. . .

In truth, "Don't do that, it's dangerous!" is about much more than the numbers. We must also reflect on the full basis for our preferences--such as, to take one small psychological characteristic among many, what we value in life, as well as what we fear.

. . .

In fact, the numbers tend to have the effect of highlighting the psychological factors. Take traveling. For 1 MM, you can drive 240 miles in the U.S., fly 7,500 miles in a commercial aircraft or fly just 12 miles in a light aircraft. We tend to feel safer if we feel more personal control, but we have no control whatsoever in a passenger jet, the safest of all (notwithstanding last week's terrible tragedy). You could take that as evidence of human irrationality. We take it as evidence that human motives matter more than the pure odds allow.
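The MicroMort arithmetic above is simple enough to sketch in a few lines. This is a minimal illustration using only the per-activity rates quoted in the excerpt (240 miles by car, 7,500 miles by commercial jet, 12 miles by light aircraft per MicroMort); the helper function and the 1,200-mile example trip are my own illustrative additions, not from Howard's work.

```python
# One MicroMort (1 MM) = a one-in-a-million chance of death.
# Miles of travel per 1 MM of acute risk, as quoted in the excerpt above.
MILES_PER_MICROMORT = {
    "car": 240,             # driving in the U.S.
    "jet": 7500,            # commercial aircraft
    "light_aircraft": 12,   # light aircraft
}

def micromorts(mode, miles):
    """Acute risk of a trip of the given length, in MicroMorts."""
    return miles / MILES_PER_MICROMORT[mode]

# Compare the same hypothetical 1,200-mile journey by each mode.
for mode in MILES_PER_MICROMORT:
    print(f"{mode}: {micromorts(mode, 1200):.1f} MM")
```

The point the authors make falls out of the numbers: the mode that offers the least personal control (the passenger jet) carries the smallest acute risk by a wide margin.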

September 14, 2014

Similarities Between Lucretius and Galileo

(p. 254) Like Lucretius, Galileo defended the oneness of the celestial and terrestrial world: there was no essential difference, he claimed, between the nature of the sun and the planets and the nature of the earth and its inhabitants. Like Lucretius, he believed that everything in the universe could be understood through the same disciplined use of observation and reason. Like Lucretius, he insisted on the testimony of the senses, against, if necessary, the orthodox claims of authority. Like Lucretius, he sought to work through this testimony toward a rational comprehension of the hidden structures of all things. And like Lucretius, he was convinced that these structures were by nature constituted by what he called "minims" or minimal particles, that is, constituted by a limited repertory of atoms combined in innumerable ways.

Source:

Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.

August 24, 2014

U.S. Constitution Reflects Lockean Natural Rights

(p. A13) Over the past three decades, Richard A. Epstein has repeatedly argued--with analytical rigor and astonishing erudition--that governments govern best when they limit their actions to protecting liberty and property. He is perhaps best known for "Takings," his 1985 book on the losses that regulations impose on property owners. Of late, he has exposed the flaws of a government-administered health system.

In "The Classical Liberal Constitution," Mr. Epstein takes up the political logic of our fundamental law. The Constitution, he says, reflects above all John Locke's insistence on protecting natural rights--rights that we possess simply by virtue of our humanity. Their protection takes concrete form in the Constitution by restricting the federal government to specific, freedom-advancing and property-protecting tasks, such as establishing a procedurally fair justice system, minting money as a stable repository of value, preserving a national trade zone among the states, and, not least, guarding the rights listed in the Bill of Rights.

(Note: the online version of the review has the date March 23, 2014, and has the title "BOOKSHELF; Book Review: 'The Classical Liberal Constitution,' by Richard A. Epstein; Our understanding of the Constitution lost its way when we embraced the idea that rights are created by a benevolent state.")

August 5, 2014

"A Unique Moment in History . . . When Man Stood Alone"

(p. 71) . . . , something noted in one of his letters by the French novelist Gustave Flaubert: "Just when the gods had ceased to be, and the Christ had not yet come, there was a unique moment in history, between Cicero and Marcus Aurelius, when man stood alone." No doubt one could quibble with this claim. For many Romans at least, the gods had not actually ceased to be--even the Epicureans, sometimes reputed to be atheists, thought that gods existed, though at a far remove from the affairs of mortals--and the "unique moment" to which Flaubert gestures, from Cicero (106-43 BCE) to Marcus Aurelius (121-180 CE), may have been longer or shorter than the time frame he suggests. But the core perception is eloquently borne out by Cicero's dialogues and by the works found in the library of Herculaneum. Many of the early readers of those works evidently lacked a fixed repertory of beliefs and practices reinforced by what was said to be the divine will. They were men and women whose lives were unusually free of the dictates of the gods (or their priests). Standing alone, as Flaubert puts it, they found themselves in the peculiar position of choosing among sharply divergent visions of the nature of things and competing strategies for living.

Source:

Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.

August 3, 2014

Locke and Smith Showed How Economic Life Has Moral Value

(p. 241) Andrzej Rapaczynski discusses "The Moral Significance of Economic Life" in the most recent issue of Capitalism and Society. His abstract summarizes the argument (p. 242) compactly: "Much of the modern perception of the role of economic production in human life--whether on the Left or on the Right of the political spectrum--views it as an inferior, instrumental activity oriented toward self-preservation, self-interest, or profit, and thus as essentially distinct from the truly human action concerned with moral values, justice, and various forms of self-fulfillment. This widely shared worldview is rooted, on the one hand, in the Aristotelian tradition that sees labor as a badge of slavery, and freedom as lying in the domain of politics and pure (not technical) knowledge, and, on the other hand, in the aristocratic medieval Christian outlook, which--partly under Aristotle's influence--sees nature as always already adapted (by divine design) to serving human bodily needs, and the purpose of life as directed toward higher, spiritual reality. . . . As against this, liberal thinkers, above all Locke, have developed an elaborate alternative to the Aristotelian worldview, reinterpreting the production process as a moral activity par excellence consisting in a gradual transformation of the alien nature into a genuinely human environment reflecting human design and providing the basis of human autonomy. Adam Smith completed Locke's thought by explaining how production is essentially a form of cooperation among free individuals whose self-interested labor serves the best interest of all. The greatest 'culture war' in history is to re-establish the moral significance of economic activity in the consciousness of modern political and cultural elites."

Source:

Rapaczynski, Andrzej. "The Moral Significance of Economic Life." Capitalism and Society 8, no. 2 (December 2013), http://capitalism.columbia.edu/volume-8-issue-2.

August 1, 2014

The Unintended Consequences of Requiring Monks to Read

(p. 28) The high walls that hedged about the mental life of the monks--the imposition of silence, the prohibition of questioning, the punishing of debate with slaps or blows of the whip--were all meant to affirm unambiguously that these pious communities were the opposite of the philosophical academies of Greece or Rome, places that had thrived upon the spirit of contradiction and cultivated a restless, wide-ranging curiosity.

All the same, monastic rules did require reading, and that was enough to set in motion an extraordinary chain of consequences. Reading was not optional or desirable or recommended; in a community that took its obligations with deadly seriousness, reading was obligatory. And reading required books. Books that were opened again and again eventually fell apart, however carefully they were handled. Therefore, almost inadvertently, monastic rules necessitated that monks repeatedly purchase or acquire books. In the course of the vicious Gothic Wars of the mid-sixth century and their still more miserable aftermath, the last commercial workshops of book production folded, and the vestiges of the book market fell apart. Therefore, again almost inadvertently, monastic rules necessitated that monks carefully preserve and copy those books that they already possessed.

Source:

Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.

July 24, 2014

An "Entrepreneurial" Scriptor for the Pope Could Earn 300 Florins a Year

Poggio was a scriptor for a pope who was deposed. Now jobless, Poggio sought classical manuscripts in obscure monasteries, and found De Rerum Natura.

(p. 21) Scriptors received no fixed stipend, but they were permitted to charge fees for executing documents and obtaining what were called "concessions of grace," that is, legal favors in matters that required some technical correction or exception granted orally or in writing by the pope. And, of course, there were other, less official fees that would privately flow to someone who had the pope's ear. In the mid-fifteenth century, the income for a secretary was 250 to 300 florins annually, and an entrepreneurial spirit could make much more. At the end of a twelve-year period in this office, Poggio's colleague George of Trebizond had salted away over 4,000 florins in Roman banks, along with handsome investments in real estate.

Source:

Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.

July 21, 2014

How De Rerum Natura Aided the Early Italian Renaissance

I am interested in how the dominant ideas in a culture change. Greenblatt's The Swerve discusses how some early Renaissance Italians sought lost and forgotten works from antiquity to broaden their ideas. In particular it emphasizes the rediscovery of Lucretius's De Rerum Natura.

I am not as unreservedly enthusiastic about Lucretius as Greenblatt is, but The Swerve includes much that is thought-provoking about a place and time that I need to better understand.

In the next few weeks I will quote a few of the passages that were especially memorable, important or amusing.

Book discussed:

Greenblatt, Stephen. The Swerve: How the World Became Modern. New York: W. W. Norton & Company, 2011.

May 11, 2014

Fair Use Doctrine Allows Copying for Educational Purposes

(p. 23) I am a public-school teacher with a limited budget for supplies. Is it unethical to illegally download copyrighted instructional materials for use in my class? BEN L., BROOKLYN

It is not. In fact, it's sometimes not even illegal. In 1976, Congress created copyright exceptions for educational purposes. Copyright law allows "face-to-face" exhibition and presentation of a copyrighted work, assuming the purpose is academic. There is also the doctrine of fair use, which states that copies "for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship or research, is not an infringement of copyright."

Now, it's worth acknowledging that these guidelines were implemented before downloading a textbook was even possible. And even in an educational setting, using an entire copyrighted work, and thereby diminishing its market potential, might constitute a violation of fair use. But in my opinion, the principles are the same, even if you do violate copyright law: If your sole motive for downloading material is educational (and there is no free or low-cost equivalent that serves your purposes equally well), there should be no problem.

May 2, 2014

"If You Do the Right Thing and Lose, You Still Did the Right Thing"

Senator Tom Coburn. Source of photo: online version of the NYT interview quoted and cited below.

(p. 12) You recently learned you have prostate cancer and announced that you'll be leaving the Senate next January, two years before the scheduled end of your term. How are you feeling? I'm feeling good. I'm not cured of the disease, but I'm on my way to marked improvement. And they may potentially have a cure. But I've got 5 or 10 years in front of me even if they don't cure it.

. . .

Do you really think the problem in Washington is that people don't listen to one another? My philosophy is different than most of the people up here. I think if you do the right thing and lose, you still did the right thing. I think if you do less than the right thing and win, it's morally reprehensible.

May 1, 2014

Edison's Goal Was Not Philanthropy, But to Make Useful Inventions that Sold

(p. 163) . . . , Edison had declared publicly that his inventions should be judged only on the basis of commercial success. This had come about when a reporter for the New York World had asked him a battery of questions that threw him off balance: "What is your object in life? What are you living for? (p. 164) What do you want?" Edison reacted as if he'd been punched in the stomach, or so the writer described the effect with exaggerated drama. First, Edison scanned the ceiling of the room for answers, then looked out the window through the rain. Finally, he said he had never thought of these questions "just that way." He paused again, then said he could not give an exact answer other than this: "I guess all I want now is to have a big laboratory" for making useful inventions. "There isn't a bit of philanthropy in it," he explained. "Anything that won't sell I don't want to invent, because anything that won't sell hasn't reached the acme of success. Its sale is proof of its utility, and utility is success."

He had been put on the spot by the reporter, and had reflexively given the marketplace the power to define the meaning of his own life.

April 10, 2014

Deconstruction Theory as an "Elaborate Cover for Past Sins"

Source of book image: http://www.evelynbarish.com/uploads/1/8/2/7/18270381/847645.jpg?478

(p. 14) Barish, a retired professor of English at the City University of New York Graduate Center, has devoted many years to tracking the elusive trail of the noted literary scholar who made headlines posthumously in 1988, after a researcher in Belgium discovered the trove of literary criticism he had published in that country's leading pro-Nazi newspaper during World War II. De Man, who had emigrated to the United States in 1948, earned a doctorate at Harvard in 1960 and went on to a dazzling academic career, forming a generation of devoted disciples. When he died in 1983 at age 64, he was a revered figure. The author of brilliant if difficult essays on modern literature, he had been among the first to embrace deconstruction, the influential theory elaborated by the French philosopher Jacques Derrida. Deconstruction focused on linguistic ambiguity, infuriating critics who viewed it as a dangerous relativism.

. . .

Detractors maintained that despite obvious differences, the two were cut from the same intellectual cloth: The ideas about "undecidability" in language were an elaborate cover-up for past sins. The most hostile critics seized the opportunity to strike a decisive blow against deconstruction, as a doctrine with unavowable antecedents in Nazism.

Now, almost 30 years later, when the theoretical avant-garde has moved on, "The Double Life of Paul de Man" revives the man and his fall. This time, we get a story of the professor not just as a young collaborator, but as a scheming careerist, an embezzler and forger who fled Belgium in order to avoid prison, a bigamist who abandoned his first three children, a deadbeat who left many rents and hotel bills unpaid, a liar who wormed his way into Harvard by falsifying records, a cynic who used people shamelessly. Some of these accusations have been made before (and documented), but Barish develops them and adds new ones. Her conclusion is somber: She places de Man not among the charming scoundrels but among the false "new messiahs" of history.

March 25, 2014

"Babies Are Smarter than You Think"

Source of book image: http://www.washingtonpost.com/rf/image_296w/2010-2019/WashingtonPost/2013/12/19/Outlook/Images/booksonbooks0031387485124.jpg

Harvard psychologist Steven Pinker discusses a favorite book of 2013:

(p. C11) . . . , babies are smarter than you think, and their cognitive and moral lives, revealed by ingenious experimental techniques, show that fairness, empathy and punitive sentiments have deep roots in human development. Paul Bloom's "Just Babies" illuminates this research with intellectual rigor and a graceful, easygoing style.

March 21, 2014

Hope for "a Morality that Maximizes Human Flourishing"

Source of book image: http://2.bp.blogspot.com/-6zEBTa23QDo/UtsQ6rZTkoI/AAAAAAAACdI/lAdUEZDMyaQ/s1600/Moral+Tribes.png

Harvard psychologist Steven Pinker discusses a favorite book of 2013:

(p. C11) "Moral Tribes," by Joshua Greene, explains the fascinating new field of moral neuroscience: what happens in our brains when we make moral judgments and how ancient impulses can warp our ethical intuitions. With the help of the parts of the brain that can engage in careful reasoning, the world's peoples can find common ethical ground in a morality that maximizes human flourishing and minimizes suffering.

February 14, 2014

(p. 294) "I certainly feel more in harmony with all the world after having been in communion with you, my Prince of Peace. I say this reverently, dear, for truly that is what you are to me, and I am so glad the world knows you as the Great Peacemaker." "What ideal lives we shall lead, giving all our best efforts to high and noble ends, while the drudgery of life is attended to by others. Without high ideals, it would be enervating and sinful. With them, it is glorious, and you are my prince among men, my own love."

Source:

Nasaw, David. Andrew Carnegie. New York: Penguin Press, 2006.

(Note: underline in original.)

(Note: the pagination of the hardback and paperback editions of Nasaw's book are the same.)

February 10, 2014

Carnegie Said "Socialism Is the Grandest Theory Ever Presented"

More on why Andrew Carnegie is not my favorite innovative entrepreneur:

(p. 257) "But are you a Socialist?" the reporter asked.

Carnegie did not answer directly. "I believe socialism is the grandest theory ever presented, and I am sure some day it will rule the world. Then we will have obtained the millennium.... That is the state we are drifting into. Then men will be content to work for the general welfare and share their riches with their neighbors."

"'Are you prepared now to divide your wealth' [he] was asked, and Mr. Carnegie smiled. 'No, not at present, but I do not spend much on myself. I give away every year seven or eight times as much as I spend for personal comforts and pleasures.'"

Source:

Nasaw, David. Andrew Carnegie. New York: Penguin Press, 2006.

(Note: ellipsis, and bracketed pronoun, in original.)

(Note: the pagination of the hardback and paperback editions of Nasaw's book are the same.)

February 5, 2014

Evidence Babies Are Born with a Sense of Fairness

Source of book image: http://news.yale.edu/sites/default/files/imce/main-bloom.jpg

(p. 15) Is morality innate? In his new book, "Just Babies," the psychologist Paul Bloom draws from his research at the Yale Infant Cognition Center to argue that "certain moral foundations are not acquired through learning. . . . They are instead the products of biological evolution." Infants may be notoriously difficult to study (rats and pigeons "can at least run mazes or peck at levers"), but according to Bloom, they are, in fact, "moral creatures."

He describes a study in which 1-year-olds watched a puppet show where a ball is passed to a "nice" puppet (who passes it back) or to a "naughty" puppet (who steals it). Invited to reward or punish the puppets, children took treats away from the "naughty" one. These 1-year-olds seem to be making moral judgments, but is this an inborn ability? They have certainly had opportunities in the last 12 months to learn good from bad. However, Bloom has found that infants as young as 3 months old reach for and prefer looking at a "helper" rather than a "hinderer," which he interprets as evidence of moral sense, that babies are "drawn to the nice guy and repelled by the mean guy." He may be right, but he hasn't proved innateness.

Proving innateness requires much harder evidence -- that the behavior has existed from Day 1, say, or that it has a clear genetic basis. Bloom presents no such evidence. His approach to establishing innateness is to argue from universalism: If a behavior occurs across cultures, then surely it can't be the result of culture.

January 29, 2014

Spencer Justified Carnegie as an Agent of Progress

(p. 229) Whether they read Spencer for themselves, as Carnegie had, or absorbed his teachings secondhand, his evolutionary philosophy provided the Gilded Age multimillionaires with a framework for rationalizing and justifying their outsized material success. In the Spencerian universe, Carnegie and his fellow millionaires were agents of progress who were contributing to the forward march of history into the industrial epoch. Carnegie was not exaggerating when he proclaimed himself a disciple of Spencer and referred to him, in almost idolatrous terms, as his master, his teacher, one of "our greatest benefactors," and the "great thinker of our age."

Source:

Nasaw, David. Andrew Carnegie. New York: Penguin Press, 2006.

(Note: the pagination of the hardback and paperback editions of Nasaw's book are the same.)

January 19, 2014

Do You Have to Be a Human to Have a Soul?

I cannot prove it to the skeptical, but after observing and interacting with our dachshund Willy almost every day for about 10 years, I strongly believe that he thinks and feels in ways that show he has a soul.

And I have no trouble believing that if a dachshund has a soul, then an elephant has one too.

(p. A21) Caitrin Nicol had an absorbing essay in The New Atlantis called "Do Elephants Have Souls?" Nicol quotes testimony from those who study elephant behavior. Here's one elephant greeting a 51-year-old newcomer to her sanctuary:

"Everyone watched in joy and amazement as Tarra and Shirley intertwined trunks and made 'purring' noises at each other. Shirley very deliberately showed Tarra each injury she had sustained at the circus, and Tarra then gently moved her trunk over each injured part."

Nicol not only asks whether this behavior suggests that elephants do have souls, she also illuminates what a soul is. The word is hard to define for many these days, but, Nicol notes, "when we talk about it, we all mean more or less the same thing: what it means for someone to bare it, for music to have it, for eyes to be the window to it, for it to be uplifted or depraved."

October 14, 2013

Brazilian Entrepreneur Inspired by "The Men Who Built America"

The co-founder of the Havan chain, Luciano Hang, arrives at the chain's flagship store, which is in Brusque, Brazil. Source of photo: online version of the NYT article quoted and cited below.

(p. 6) "My philosophy is pro-capitalism, so of course the best symbols for this come from the United States," said Mr. Hang, who flies around Brazil on a Learjet to visit the nearly 60 stores in his chain, called Havan. "I tell people that we're about freedom: the freedom to stay open when we choose, the freedom to work for us and the freedom to shop," he added. "I know this can be controversial, but I think those who disagree with my approach are few and far between."

. . .

The son of textile factory workers, descended from German and Italian immigrants, Mr. Hang said he admired European culture but preferred the United States. He said he was inspired by a show on the History Channel, "The Men Who Built America," about industrial titans like John D. Rockefeller and Cornelius Vanderbilt.

"I couldn't sleep after I saw that program," he said.

His business model is partly based on Walmart, whose small-town origins he admires, as well as its method of turning economies of scale into low prices.

August 29, 2013

Philosopher Herbert Spencer Defended Capitalism in America

Source of book image: online version of the WSJ review quoted and cited below.

Spencer was sometimes a much better philosopher than the modern caricature portrays, a caricature exemplified by the review quoted below and, perhaps, by the book reviewed. I would like to look at this book sometime, because there may be some interesting history in it--though I am not optimistic about the book's economic assumptions, or its account of Spencer's philosophy.

(p. A11) Herbert Spencer, the 19th-century British philosopher, is remembered today as the forbidding -- almost forbidden -- father of "Social Darwinism," a school of thought declaring that the fittest prosper in a free marketplace and the human race is gradually improved because only the strong survive. In Barry Werth's satisfying "Banquet at Delmonico's," Spencer is also a querulous 62-year-old celibate whose 1882 American tour culminates in a feast to which are invited the "mostly Republican men of science, religion, business, and government" who shared and spread the Spencerian creed.

Applying Darwinian insights about evolution to political, economic and social life -- though he did not himself use the term "Social Darwinism" -- Spencer concluded that vigorous competition and unfettered capitalism conduced to the betterment of society. He predicted that the American, raised in liberty, would evolve into "a finer type of man than has hitherto existed," dazzling the world with "the highest form of government" and "a civilization grander than any the world has known."

. . .

The public clamor over the visit of a dyspeptic foreign philosopher to these shores was partly due to the indefatigable promotion of Edward Livingston Youmans, Spencer's chief American proselytizer, who called his beau ideal the most original thinker in the history of mankind. Youmans is among the several critics and apostles of Spencer and Darwin whose profiles Mr. Werth skillfully interweaves in this Gilded Age tapestry.

August 3, 2013

Wittgenstein Heirs Lost Family Wealth and "Found Little Happiness"

Source of book image: online version of the WSJ review quoted and cited below.

(p. W10) As he lay dying during Christmas 1912 -- from a gruesome throat cancer -- the Viennese industrialist Karl Wittgenstein no doubt took some comfort in the fact that he was leaving to his heirs one of the largest fortunes in Europe. He had acquired his wealth in just 30 years, the period during which Wittgenstein, an engineer, transformed a small steel mill into Europe's largest steel cartel through a combination of hard work, luck and ruthlessness. As der österreichische Eisenkönig (the "Austrian iron king"), he was the chief executive, principal shareholder or director of dozens of industrial companies and banks that provided the ore, manufacturing and financing for most of the steel products of the Habsburg Empire.

In his spare time, Wittgenstein acquired a spectacular house in Vienna, grandly styled as the family's Palais Wittgenstein.

. . .

Today, though, the Wittgenstein millions are gone and the Palais replaced by a hideous concrete apartment block. "Riches," Adam Smith wrote, ". . . very seldom remain long in the same family." Alexander Waugh's grimly amusing "The House of Wittgenstein" shows how the family fortune was lost and how the family members themselves, despite instances of prodigious talent and accomplishment, found little happiness in their own lives or pleasure in their sibling relations.

May 2, 2013

Cultural Impact of Industrial Design Is Greater than Cultural Impact of Fine Arts

(p. C3) Capitalism has its weaknesses. But it is capitalism that ended the stranglehold of the hereditary aristocracies, raised the standard of living for most of the world and enabled the emancipation of women. The routine defamation of capitalism by armchair leftists in academe and the mainstream media has cut young artists and thinkers off from the authentic cultural energies of our time.

Over the past century, industrial design has steadily gained on the fine arts and has now surpassed them in cultural impact. In the age of travel and speed that began just before World War I, machines became smaller and sleeker. Streamlining, developed for race cars, trains, airplanes and ocean liners, was extended in the 1920s to appliances like vacuum cleaners and washing machines. The smooth white towers of electric refrigerators (replacing clunky iceboxes) embodied the elegant new minimalism.

"Form ever follows function," said Louis Sullivan, the visionary Chicago architect who was a forefather of the Bauhaus. That maxim was a rubric for the boom in stylish interior décor, office machines and electronics following World War II: Olivetti typewriters, hi-fi amplifiers, portable transistor radios, space-age TVs, baby-blue Princess telephones. With the digital revolution came miniaturization. The Apple desktop computer bore no resemblance to the gigantic mainframes that once took up whole rooms. Hand-held cellphones became pocket-size.

April 9, 2013

Marx's Contradictions Due to His Being a Reactive Journalist Instead of a Philosopher

Source of book image: http://s-usih.org/wp-content/uploads/2013/04/marx.jpg

(p. 14) Plenty of scholars sweated through the 20th century trying to reconcile inconsistencies across the great sweep of Marx's writing, seeking to shape a coherent Marxism out of Marx. Sperber's approach is more pragmatic. He accepts that Marx was not a body of ideas, but a human being responding to events. In this context, it's telling that Marx's prime vocation was not as an academic but as a campaigning journalist: Sperber suggests Marx's two stints at the helm of a radical paper in Cologne represented his greatest periods of professional fulfillment. Accordingly, much of what the scholars have tried to brand as Marxist philosophy was instead contemporary commentary, reactive and therefore full of contradiction.

March 6, 2013

Entrepreneur Ping Fu Learned the Resilience of Bamboo

Source of book image: online version of the WSJ review quoted and cited below.

(p. A11) The history of American business is full of immigrant success stories--of men and women who flee poverty and oppression in their home countries, arrive on our shores with only pennies in their pockets, and go on to build companies that generate wealth, create jobs, and provide innovative products and services.

Count among them Ping Fu, the Chinese-born chief executive of the high-tech company Geomagic, which provides 3D-imaging for such modern-day miracles as customized prosthetic limbs. If your child wears orthodontic braces, chances are that they were designed for his teeth with the help of Geomagic technology. Ms. Fu founded the company in 1997, 13 years after arriving in San Francisco with $80 in her purse and three English phrases in her vocabulary: "hello," "thank you" and "help."

. . .

In the U.S., Ms. Fu worked as a maid, a waitress and a baby sitter while learning English and studying computer science. She eventually landed at Bell Labs in Illinois before striking out on her own. "I was a reluctant and unlikely entrepreneur," she writes. In China, "I had been hardwired to think that money was evil, and traumatized as a child because of my family's success." Encouraged by her Shanghai Papa to follow in the family's entrepreneurial tradition, she and her then-husband launched Geomagic. In her book, she traces the challenges she faced in building a company--obtaining funding, winning customers, managing a growing staff of professionals.

Ms. Fu's life story raises a core question about the development of the human psyche: Why is it that, confronted with the kind of horrors that Ms. Fu experienced as a child, some survivors succeed in later life while others fail, overcome by the trials they endured?

Ms. Fu credits the tranquil, happy childhood she experienced for the first eight years of her life. She also points to the Taoist teachings of her Shanghai Papa, who taught her to admire the flexible nature of the bamboo trees that grew in the family garden. Bamboo, he told her, "suggests resilience, meaning that we have the ability to bounce back from even the most difficult times."

February 10, 2013

Is America Moving Toward a Less Upwardly Mobile Future?

Source of book image: http://catholicexchange.com/wp-content/uploads/2012/07/Coming-Apart.jpg

(p. C6) The future as described by Charles Murray in "Coming Apart'' is bleak enough to have been imagined by George Orwell. Unfortunately, "Coming Apart" is nonfiction, meticulously documented and depressingly real. Mr. Murray examines America as it moves away from an upwardly mobile, socially mobile country with shared purpose and shared identities to a country dividing into two isolated and disparate camps.

For the full review essay, see:

Jeb Bush (author of passage quoted above, one of 50 contributors to whole article). "Twelve Months of Reading; We asked 50 of our friends to tell us what books they enjoyed in 2012--from Judd Apatow's big plans to Bruce Wagner's addictions. See pages C10 and C11 for the Journal's own Top Ten lists." The Wall Street Journal (Sat., December 15, 2012): passim (Bush's contribution is on p. C6).

(Note: the online version of the review essay has the date December 14, 2012.)

February 4, 2013

Social Scientists Prefer Articles that Contain Bogus Math

Source of graphic: online version of the WSJ article quoted and cited below.

(p. A2) . . . research has shown that even those who should be especially clear-sighted about numbers--scientific researchers, for example, and those who review their work for publication--are often uncomfortable with, and credulous about, mathematical material. As a result, some research that finds its way into respected journals--and ends up being reported in the popular press--is flawed.

In the latest study, Kimmo Eriksson, a mathematician and researcher of social psychology at Sweden's Mälardalen University, chose two abstracts from papers published in research journals, one in evolutionary anthropology and one in sociology. He gave them to 200 people to rate for quality--with one twist. At random, one of the two abstracts received an additional sentence, the one above with the math equation, which he pulled from an unrelated paper in psychology. The study's 200 participants all had master's or doctoral degrees. Those with degrees in math, science or technology rated the abstract with the tacked-on sentence as slightly lower-quality than the other. But participants with degrees in humanities, social science or other fields preferred the one with the bogus math, with some rating it much more highly on a scale of 0 to 100.

"Math makes a research paper look solid, but the real science lies not in math but in trying one's utmost to understand the real workings of the world," Prof. Eriksson said.

January 29, 2013

Fragile Governments Cling to Failed Foreign Aid

Source of book image: http://si.wsj.net/public/resources/images/OB-VL312_bkrvta_DV_20121122124330.jpg

(p. C12) Nassim Nicholas Taleb's "Antifragile" argues that some people, organizations and systems are resilient in the face of stress because they are able to alter themselves by adapting and learning. The converse is fragility, embodied in entities that are immovable even when faced with shocks or adversity. To my mind, an obvious example is how numerous governments and international agencies have clung to foreign aid as a tool to combat poverty even though aid has failed to deliver sustainable growth and meaningfully reduce indigence. And nation-states, which rest on one unifying vision of the nation, tend to be fragile, while city-states that adjust, adapt and constantly evolve tend to be antifragile. Mr. Taleb's lesson: Embrace, rather than try to avoid, the shocks.

For the full review essay, see:

Dambisa Moyo (author of passage quoted above, one of 50 contributors to whole article). "Twelve Months of Reading; We asked 50 of our friends to tell us what books they enjoyed in 2012--from Judd Apatow's big plans to Bruce Wagner's addictions. See pages C10 and C11 for the Journal's own Top Ten lists." The Wall Street Journal (Sat., December 15, 2012): passim (Moyo's contribution is on p. C12).

(Note: the online version of the review essay has the date December 14, 2012.)

January 16, 2013

Descartes Saw that a Great City Is "an Inventory of the Possible"

(p. 226) Joel Kotkin writes about "The Broken Ladder: The Threat to Upward Mobility in the Global City." "A great city, wrote René Descartes in the 17th Century, represented 'an inventory of the possible,' a place where people could create their own futures and lift up their families. In the 21st Century--the first in which the majority of people will live in cities--this unique link between urbanism and upward mobility will become ever more critical."

December 21, 2012

Ellison and Jobs on Money

(p. 299) . . . Jobs and his family went to Hawaii for Christmas vacation. Larry Ellison was also there, as he had been the year (p. 300) before. "You know, Larry, I think I've found a way for me to get back into Apple and get control of it without you having to buy it," Jobs said as they walked along the shore. Ellison recalled, "He explained his strategy, which was getting Apple to buy NeXT, then he would go on the board and be one step away from being CEO." Ellison thought that Jobs was missing a key point. "But Steve, there's one thing I don't understand," he said. "If we don't buy the company, how can we make any money?" It was a reminder of how different their desires were. Jobs put his hand on Ellison's left shoulder, pulled him so close that their noses almost touched, and said, "Larry, this is why it's really important that I'm your friend. You don't need any more money."

Ellison recalled that his own answer was almost a whine: "Well, I may not need the money, but why should some fund manager at Fidelity get the money? Why should someone else get it? Why shouldn't it be us?"

"I think if I went back to Apple, and I didn't own any of Apple, and you didn't own any of Apple, I'd have the moral high ground," Jobs replied.

December 8, 2012

"It Isn't What You Know that Counts--It Is How Efficiently You Can Refresh"

Source of book image: online version of the WSJ review quoted and cited below.

(p. A17) Knowledge, then, is less a canon than a consensus in a state of constant disruption. Part of the disruption has to do with error and its correction, but another part with simple newness--outright discoveries or new modes of classification and analysis, often enabled by technology.

. . .

In some cases, the facts themselves are variable. . . .

. . .

More commonly, however, changes in scientific facts reflect the way that science is done. Mr. Arbesman describes the "Decline Effect"--the tendency of an original scientific publication to present results that seem far more compelling than those of later studies. Such a tendency has been documented in the medical literature over the past decade by John Ioannidis, a researcher at Stanford, in areas as diverse as HIV therapy, angioplasty and stroke treatment. The cause of the decline may well be a potent combination of random chance (generating an excessively impressive result) and publication bias (leading positive results to get preferentially published).
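The mechanism the review describes--random chance producing an occasional impressive result, and publication bias selecting exactly those results for print--can be made concrete with a small simulation. This is my own illustrative sketch, not anything from Mr. Arbesman's book: the effect size, noise level, and "impressiveness" threshold are all made-up numbers.

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 0.2   # hypothetical true effect size
NOISE_SD = 1.0      # per-study sampling noise
N_STUDIES = 10000

# Each study observes the true effect plus random noise.
observed = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_STUDIES)]

# Publication bias: only results above an "impressiveness" threshold
# get published early (a crude stand-in for selective publication).
published_first = [x for x in observed if x > 1.0]

# Later replications: assume every result eventually gets reported.
replications = observed

# The early published literature looks far more compelling than the
# full record -- the "Decline Effect" without any change in reality.
print(round(statistics.mean(published_first), 2))  # well above the true effect
print(round(statistics.mean(replications), 2))     # close to the true effect
```

The inflated early average comes purely from selection on noise; no study here is fraudulent, yet the first wave of results overstates the effect.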

If shaky claims enter the realm of science too quickly, firmer ones often meet resistance. As Mr. Arbesman notes, scientists struggle to let go of long-held beliefs, something that Daniel Kahneman has described as "theory-induced blindness." Had the Austrian medical community in the 1840s accepted the controversial conclusions of Dr. Ignaz Semmelweis that physicians were responsible for the spread of childbed fever--and heeded his hand-washing recommendations--a devastating outbreak of the disease might have been averted.

Science, Mr. Arbesman observes, is a "terribly human endeavor." Knowledge grows but carries with it uncertainty and error; today's scientific doctrine may become tomorrow's cautionary tale. What is to be done? The right response, according to Mr. Arbesman, is to embrace change rather than fight it. "Far better than learning facts is learning how to adapt to changing facts," he says. "Stop memorizing things . . . memories can be outsourced to the cloud." In other words: In a world of information flux, it isn't what you know that counts--it is how efficiently you can refresh.

November 23, 2012

Econometrician Leamer Argues for Methodological Pluralism

(p. 44) Ignorance is a formidable foe, and to have hope of even modest victories, we economists need to use every resource and every weapon we can muster, including thought experiments (theory), and the analysis of data from nonexperiments, accidental experiments, and designed experiments. We should be celebrating the small genuine victories of the economists who use their tools most effectively, and we should dial back our adoration of those who can carry the biggest and brightest and least-understood weapons. We would benefit from some serious humility, and from burning our "Mission Accomplished" banners. It's never gonna happen.

November 19, 2012

Econometric "Priests" Sell Their New "Gimmicks" as the "Latest Euphoria Drug"

The American Economic Association's Journal of Economic Perspectives published a symposium focused on the thought-provoking views of the distinguished econometrician Edward Leamer.

I quote below some of Leamer's comments in his own contribution to the symposium.

(p. 31) We economists trudge relentlessly toward Asymptopia, where data are unlimited and estimates are consistent, where the laws of large numbers apply perfectly and where the full intricacies of the economy are completely revealed. But it's a frustrating journey, since, no matter how far we travel, Asymptopia remains infinitely far away. Worst of all, when we feel pumped up with our progress, a tectonic shift can occur, like the Panic of 2008, making it seem as though our long journey has left us disappointingly close to the State of Complete Ignorance whence we began.

The pointlessness of much of our daily activity makes us receptive when the Priests of our tribe ring the bells and announce a shortened path to Asymptopia. (Remember the Cowles Foundation offering asymptotic properties of simultaneous equations estimates and structural parameters?) We may listen, but we don't hear, when the Priests warn that the new direction is only for those with Faith, those with complete belief in the Assumptions of the Path. It often takes years down the Path, but sooner or later, someone articulates the concerns that gnaw away in each of (p. 32) us and asks if the Assumptions are valid. (T. C. Liu (1960) and Christopher Sims (1980) were the ones who proclaimed that the Cowles Emperor had no clothes.) Small seeds of doubt in each of us inevitably turn to despair and we abandon that direction and seek another.

Two of the latest products-to-end-all-suffering are nonparametric estimation and consistent standard errors, which promise results without assumptions, as if we were already in Asymptopia where data are so plentiful that no assumptions are needed. But like procedures that rely explicitly on assumptions, these new methods work well in the circumstances in which explicit or hidden assumptions hold tolerably well and poorly otherwise. By disguising the assumptions on which nonparametric methods and consistent standard errors rely, the purveyors of these methods have made it impossible to have an intelligible conversation about the circumstances in which their gimmicks do not work well and ought not to be used. As for me, I prefer to carry parameters on my journey so I know where I am and where I am going, not travel stoned on the latest euphoria drug.

This is a story of Tantalus, grasping for knowledge that remains always beyond reach. In Greek mythology Tantalus was favored among all mortals by being asked to dine with the gods. But he misbehaved--some say by trying to take divine food back to the mortals, some say by inviting the gods to a dinner for which Tantalus boiled his son and served him as the main dish. Whatever the etiquette faux pas, Tantalus was punished by being immersed up to his neck in water. When he bowed his head to drink, the water drained away, and when he stretched up to eat the fruit hanging above him, wind would blow it out of reach. It would be much healthier for all of us if we could accept our fate, recognize that perfect knowledge will be forever beyond our reach and find happiness with what we have. If we stopped grasping for the apple of Asymptopia, we would discover that our pool of Tantalus is full of small but enjoyable insights and wisdom.

October 25, 2012

Reality Is Not Always "Elegant"

Source of book image: http://images.betterworldbooks.com/067/Ordinary-Geniuses-Segre-Gino-9780670022762.jpg

(p. C9) In the summer of 1953, while visiting Berkeley, Gamow was shown a copy of the article in Nature where Watson and Crick spelled out some of the genetic implications of their discovery that DNA is structured as a double helix. He immediately realized what was missing. Each helix is a linear sequence of four molecules known as bases. The sequence contains all the information that guides the manufacture of the proteins from which living things are made. Proteins are assembled from 20 different amino acids. What is the code that takes you from the string of bases to the amino acids? Gamow seems to have been the first to look at the problem in quite this way.

But he made a physicist's mistake: He thought that the code would be "elegant"--that each amino acid would be specified by only one string of bases. (These strings were dubbed "codons.") He produced a wonderfully clever code in which each codon consisted of three bases. That was the only part that was right. In the actual code sometimes three different codons correspond to the same amino acid, while some codons do not code for an amino acid at all. These irregularities are the results of evolutionary stops and starts, and no amount of cleverness could predict them.
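Gamow's three-base guess was right, but the "elegant" one-codon-per-amino-acid mapping was not. A few real entries from the standard genetic code (this is an illustrative subset only; the full table has 64 codons) show the degeneracy the review describes:

```python
from collections import Counter

# A handful of entries from the standard genetic code
# (RNA codons -> amino acids), chosen to show its irregularity.
GENETIC_CODE = {
    # Leucine is specified by six different codons:
    "UUA": "Leu", "UUG": "Leu",
    "CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    # Methionine has exactly one:
    "AUG": "Met",
    # And three codons code for no amino acid at all (stop signals):
    "UAA": None, "UAG": None, "UGA": None,
}

# Count how many codons map to each amino acid.
counts = Counter(aa for aa in GENETIC_CODE.values() if aa is not None)
print(counts["Leu"])  # 6 -- several codons, one amino acid
print(counts["Met"])  # 1 -- Gamow's "elegant" case, but the exception
```

The many-to-one entries and the stop codons are exactly the evolutionary "stops and starts" that no amount of a priori cleverness could have predicted.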

October 15, 2012

"The New Upper Class Must Start Preaching What It Practices"

Source of book image: http://si.wsj.net/public/resources/images/OB-RO889_bkrvmu_DV_20120130124608.jpg

(p. C2) There remains a core of civic virtue and involvement in working-class America that could make headway against its problems if the people who are trying to do the right things get the reinforcement they need--not in the form of government assistance, but in validation of the values and standards they continue to uphold. The best thing that the new upper class can do to provide that reinforcement is to drop its condescending "nonjudgmentalism." Married, educated people who work hard and conscientiously raise their kids shouldn't hesitate to voice their disapproval of those who defy these norms. When it comes to marriage and the work ethic, the new upper class must start preaching what it practices.

September 30, 2012

A True Tall Tale: Mankiw Lays a Reductio Ad Absurdum on the Egalitarians

(p. 155) Should the income tax system include a tax credit for short taxpayers and a tax surcharge for tall ones? This paper shows that the standard utilitarian framework for tax policy analysis answers this question in the affirmative. This result has two possible interpretations. One interpretation is that individual attributes correlated with wages, such as height, should be considered more widely for determining tax liabilities. Alternatively, if policies such as a tax on height are rejected, then the standard utilitarian framework must in some way fail to capture our intuitive notions of distributive justice.

September 29, 2012

How a Group of "Natural Philosophers" Created Science in a London "Full of Thieves, Murderers and Human Waste"

(p. 19) London before the mid-1600s was a general calamity. The streets were full of thieves, murderers and human waste. Death was everywhere: doctors were hapless, adults lived to about age 30, children died like flies. In 1665, plague moved into the city, killing sometimes 6,000 people a week. In 1666, an unstoppable fire burned the city to the ground; the bells of St. Paul's melted. Londoners thought that the terrible voice of God was "roaring in the City," one witness wrote, and they would do best to accept the horror, calculate their sins, pray for guidance and await retribution.

In the midst of it all, a group of men whose names we still learn in school formed the Royal Society of London for the Improvement of Natural Knowledge. They thought that God, while an unforgiving judge, was also a mathematician. As such, he had organized the universe according to discernible, mathematical law, which, if they tried, they could figure out. They called themselves "natural philosophers," and their motto was "Nullius in verba": roughly, take no one's word for anything. You have an idea? Demonstrate it, do an experiment, prove it. The ideas behind the Royal Society would flower into the Enlightenment, the political, cultural, scientific and educational revolution that gave rise to the modern West.

This little history begins Edward Dolnick's "Clockwork Universe," so the reader might think the book is about the Royal Society and its effects. But the Royal Society is dispatched in the first third of the book, and thereafter, the subject is how the attempt to find the mathematics governing the universe played out in the life of Isaac Newton.

. . .

To go from sinful "curiositas" to productive "curiosity," from blind acceptance to open-eyed inquiry, from asking, "Why?" to answering, "How?" -- this change, of all the world's revolutions, must surely be the most remarkable.

July 18, 2012

Neglecting Valid Stereotypes Has Costs

(p. 169) The social norm against stereotyping, including the opposition to profiling, has been highly beneficial in creating a more civilized and more equal society. It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong. The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible.

March 23, 2012

Faraday and Einstein Were Visual and Physical Thinkers, Not Mathematicians

Source of book image: http://www.rsc.org/images/Faraday_Chemical_History-of-a-Candle_180_tcm18-210390.jpg

(p. C6) Michael Faraday is one of the most beguiling and lovable figures in the history of science. Though he could not understand a single equation, he deduced the essential structure of the laws of electromagnetism through visualization and physical intuition. (James Clerk Maxwell would later give them mathematical form.) Albert Einstein kept a picture of Faraday over his desk, for Einstein also thought of himself primarily as a visual and physical thinker, not an abstract mathematician.

. . .

Faraday's text is still charming and rich, a judgment that few popular works on science could sustain after so many years. Though he addresses himself to an "auditory of juveniles," he calls for his audience to follow a close chain of reasoning presented through a series of experiments and deductions.

. . .

. . . : "In every one of us there is a living process of combustion going on very similar to that of a candle," as Faraday illustrates in his experiments.

In his closing, he turns from our metabolic resemblance to a candle to his deeper wish that "you may, like it, shine as lights to those about you."

March 13, 2012

Upper Class "Have Lost the Confidence to Preach What They Practice"

Source of book image: http://4.bp.blogspot.com/-K9jKNHD0vwE/Tzn4yKgEtII/AAAAAAAAC8Q/2wZqk1Hl1V4/s1600/murray-coming-apart.jpg

(p. 9) The problem, Murray argues, is not that members of the new upper class eat French cheese or vote for Barack Obama. It is that they have lost the confidence to preach what they practice, adopting instead a creed of "ecumenical niceness." They work, marry and raise children, but they refuse to insist that the rest of the country do so, too. "The belief that being a good American involved behaving in certain kinds of ways, and that the nation itself relied upon a certain kind of people in order to succeed, had begun to fade and has not revived," Murray writes.

March 7, 2012

Hero Was Oblivious to What Others Thought

Author Eyal Press. Source of photo: online version of the NYT review quoted and cited below.

(p. C26) Maybe the refined intellectual, engaged with ideas, manages to think herself above petty concerns like nationalism? That was what Mr. Press suspected he would find in Aleksander Jevtic, the Serb who pulled many Croatians from a line of men destined to be tortured or killed in 1991.

"Aleksander Jevtic had somehow avoided internalizing this us-versus-them thinking," Mr. Press writes, "which I assumed had something do with his education and intellect, a rare skepticism and levelheadedness that enabled him to see past the blinding passions and compellingly simple ideas that drove the logic of hate."

But when Mr. Press at last meets Mr. Jevtic, he finds not a Balkan Isaiah Berlin, nor a soldier-philosopher like Orwell. This lifesaver, this ethical prince among men, turns out to be a slovenly couch potato living off rents he collects from a building he owns: "He also liked sleeping late, hanging out with friends, and watching sports" on his "giant flat-screen television."

Mr. Press surveys the findings of social scientists and neuroscientists, but none of them have entirely figured out where bravery comes from. Every beautiful soul is different.

Mr. Jevtic's wife is Croatian, which certainly helped him think of the enemy as human. But Mr. Jevtic is also a misanthrope, and his natural social isolation helped him hear the call of an instinctive decency; he didn't care what his fellow Serbians, including his commanding officers, might think.

He "wasn't in the business of making good impressions," Mr. Press writes. "His obliviousness to what others thought wasn't necessarily his most becoming feature. But it had served him well in 1991."

January 16, 2012

What We Eat Affects Our Feelings and Choices?

But since we choose what we eat, we have the power to control how food affects our feelings and choices?

(p. C12) As the neuroscientist Antonio Damasio writes, "The mind is embodied, not just embrained."

The latest evidence comes from a new study of probiotic bacteria, the microorganisms typically found in yogurt and dairy products. While most investigations of probiotics have focused on their gastrointestinal benefits--the bacteria reduce the symptoms of diarrhea and irritable bowel syndrome--this new research explored the effect of probiotics on the brain.

The experiment, led by Javier Bravo at University College Cork in Ireland, was straightforward. First, he fed normal lab mice a diet full of probiotics. Then, Mr. Bravo's team tested for behavioral changes, which were significant: When probiotic-fed animals were put in stressful conditions, such as being dropped into a pool of water, they were less anxious and released less stress hormone.

How did the food induce these changes? The answer involves GABA, a neurotransmitter that reduces the activity of neurons. When Mr. Bravo looked at the brains of the mice, he found that those fed probiotics had more GABA receptors in areas associated with memory and the regulation of emotions. (This change mimics the effects of popular antianxiety medications in humans.)

November 22, 2011

The Costs of Altruism

(p. D1) On entering the patient's room with spinal tap tray portentously agleam, Dr. Burton encountered the patient's family members. They begged him not to proceed. The frail, bedridden patient begged him not to proceed. Dr. Burton conveyed their pleas to the oncologist, but the oncologist continued to lobby for a spinal tap, and the exhausted family finally gave in.

. . .

(p. D2) . . . , Dr. Burton is a contributor to a scholarly yet surprisingly sprightly volume called "Pathological Altruism," to be published this fall by Oxford University Press. . . .

As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one's liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous "how can I help you?" behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.

. . .

David Brin, a physicist and science fiction writer, argues in one chapter that sanctimony can be as physically addictive as any recreational drug, and as destabilizing. "A relentless addiction to indignation may be one of the chief drivers of obstinate dogmatism," he writes. . . .

Barbara Oakley, an associate professor of engineering at Oakland University in Michigan and an editor of the new volume, said in an interview that when she first began talking about its theme at medical or social science conferences, "people looked at me as though I'd just grown goat horns. They said, 'But altruism by definition can never be pathological.' "

To Dr. Oakley, the resistance was telling. "It epitomized the idea 'I know how to do the right thing, and when I decide to do the right thing it can never be called pathological,' " she said.

. . .

Yet given her professional background, Dr. Oakley couldn't help doubting altruism's exalted reputation. "I'm not looking at altruism as a sacred thing from on high," she said. "I'm looking at it as an engineer."

November 4, 2011

"Whatsoever a Man Soweth, That Shall He Also Reap"

"A gardener's recipe for vengeance at the Sixth Street and Avenue B Community Garden in Manhattan." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 20) At the 700 community gardens sprinkled through the city like little Edens, the first commandment should be obvious: Thou shalt not covet, much less steal, thy neighbor's tomatoes, cucumbers or peppers. But people do.

"This was an inside job," Holland Haiis-Aguirre, a key-holder at the West Side Community Garden, said after she arrived at her plot on July 24 to pick a "big, beautiful, full-sized cucumber" that she and her husband had tended from infancy. Instead, she found a denuded vine; her prize cuke apparently was in someone else's salad. "So frustrating," she wailed.

. . .

Sally Young shrouds her 18 heirloom tomato plants in bird netting, but it is not birds she is trying to outwit. Claude Bastide, who grows aromatic herbs, had his spearmint and rosemary plants stolen early in the season. He responded with a sign: "Dear Plant Thief: If I catch you stealing my plants, I will boil you alive in a cauldron filled with poison ivy and stinging nettles until your flesh falls off your bones!"

October 11, 2011

Confirmation Bias (aka "Pigheadedness") in Science

(p. 12) In a classic psychology experiment, people for and against the death penalty were asked to evaluate the different research designs of two studies of its deterrent effect on crime. One study showed that the death penalty was an effective deterrent; the other showed that it was not. Which of the two research designs the participants deemed the most scientifically valid depended mostly on whether the study supported their views on the death penalty.

In the laboratory, this is labeled confirmation bias; observed in the real world, it's known as pigheadedness.

Scientists are not immune. In another experiment, psychologists were asked to review a paper submitted for journal publication in their field. They rated the paper's methodology, data presentation and scientific contribution significantly more favorably when the paper happened to offer results consistent with their own theoretical stance. Identical research methods prompted a very different response in those whose scientific opinion was challenged.

October 7, 2011

Another Nod to Planck's "Cynical View of Science"

The Max Planck view expressed in the quote below has been called "Planck's Principle" and has been empirically tested in three papers cited at the end of the entry.

(p. 12) How's this for a cynical view of science? "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."

Scientific truth, according to this view, is established less by the noble use of reason than by the stubborn exertion of will. One hopes that the Nobel Prize-winning physicist Max Planck, the author of the quotation above, was writing in an unusually dark moment.

And yet a large body of psychological data supports Planck's view: we humans quickly develop an irrational loyalty to our beliefs, and work hard to find evidence that supports those opinions and to discredit, discount or avoid information that does not.

September 28, 2011

We Tend to Ignore Information that Contradicts Our Beliefs

Source of book image: online version of the WSJ review quoted and cited below.

We learn the most when our priors are contradicted. But the dissonance between evidence and beliefs is painful. So we often do not see, or soon forget, evidence that does not fit with our beliefs.

The innovative entrepreneur is often a person who sees, and forces herself to remember, the dissonant fact, storing it away to make sense of, or make use of, later. At the start, she may be alone in what she sees and what she remembers. So if we are to benefit from her ability and willingness to bear the pain of dissonance, she must have the freedom to differ, and she must have the financial wherewithal to support herself until her vision is more widely shared, better understood, and more fruitfully applied.

(p. A13) Beliefs come first; reasons second. That's the insightful message of "The Believing Brain," by Michael Shermer, the founder of Skeptic magazine. In the book, he brilliantly lays out what modern cognitive research has to tell us about his subject--namely, that our brains are "belief engines" that naturally "look for and find patterns" and then infuse them with meaning. These meaningful patterns form beliefs that shape our understanding of reality. Our brains tend to seek out information that confirms our beliefs, ignoring information that contradicts them. Mr. Shermer calls this "belief-dependent reality." The well-worn phrase "seeing is believing" has it backward: Our believing dictates what we're seeing.

. . .

One of the book's most enjoyable discussions concerns the politics of belief. Mr. Shermer takes an entertaining look at academic research claiming to prove that conservative beliefs largely result from psychopathologies. He drolly cites survey results showing that 80% of professors in the humanities and social sciences describe themselves as liberals. Could these findings about psychopathological conservative political beliefs possibly be the result of the researchers' confirmation bias?

As for his own political bias, Mr. Shermer says that he's "a fiscally conservative civil libertarian." He is a fan of old-style liberalism, as in liberality of outlook, and cites "The Science of Liberty" author Timothy Ferris's splendid formulation: "Liberalism and science are methods, not ideologies." The "scientific solution to the political problem of oppressive governments," Mr. Shermer says, "is the tried-and-true method of spreading liberal democracy and market capitalism through the open exchange of information, products, and services across porous economic borders."

But it is science itself that Mr. Shermer most heartily embraces. "The Believing Brain" ends with an engaging history of astronomy that illustrates how the scientific method developed as the only reliable way for us to discover true patterns and true agents at work. Seeing through a telescope, it seems, is believing of the best kind.

August 19, 2011

"A Brilliant and Exhilarating and Profoundly Eccentric Book"

"David Deutsch." Source of caption and photo: online version of the NYT review quoted and cited below.

(p. 16) David Deutsch's "Beginning of Infinity" is a brilliant and exhilarating and profoundly eccentric book. It's about everything: art, science, philosophy, history, politics, evil, death, the future, infinity, bugs, thumbs, what have you. And the business of giving it anything like the attention it deserves, in the small space allotted here, is out of the question. But I will do what I can.

. . .

The thought to which Deutsch's conversation most often returns is that the European Enlightenment of the 17th and 18th centuries, or something like it, may turn out to have been the pivotal event not merely of the history of the West, or of human beings, or of the earth, but (literally, physically) of the universe as a whole.

. . .

(p. 17) Deutsch's enthusiasm for the scientific and technological transformation of the totality of existence naturally brings with it a radical impatience with the pieties of environmentalism, and cultural relativism, and even procedural democracy -- and this is sometimes exhilarating and sometimes creepy. He attacks these pieties, with spectacular clarity and intelligence, as small-­minded and cowardly and boring. The metaphor of the earth as a spaceship or life-­support system, he writes, "is quite perverse. . . . To the extent that we are on a 'spaceship,' we have never merely been its passengers, nor (as is often said) its stewards, nor even its maintenance crew: we are its designers and builders. Before the designs created by humans, it was not a vehicle, but only a heap of dangerous raw materials." But it's hard to get to the end of this book without feeling that Deutsch is too little moved by actual contemporary human suffering. What moves him is the grand Darwinian competition among ideas. What he adores, what he is convinced contains the salvation of the world, is, in every sense of the word, The Market.

August 17, 2011

A Case for Epistemic and Technological Optimism

Source of book image: http://us.penguingroup.com/static/covers/all/5/5/9780670022755H.jpg

Horgan is well-known for writing a pessimistic book about the future of science. For him to write such a positive review of a book that reaches the opposite conclusion is impressive (both about him and the book he is reviewing).

From Horgan's review and the reviews on Amazon as of 8/7/11, I view the Deutsch book as potentially important and profound. (I will write more when I have read it.)

(p. 17) . . . Mr. Deutsch knocks my 1996 book, "The End of Science," for proposing that the glory days of science--especially pure science, the effort to map out and understand reality--may be over. Mr. Deutsch equates my thesis with "dogmatism, stagnation and tyranny," all of which, for the record, I oppose. But he makes the case for infinite progress with such passion, imagination and quirky brilliance that I couldn't help enjoying his argument. More often than not I found myself agreeing with him--or at least hoping that he is right.

. . .

If we acknowledge our imperfections, Mr. Deutsch observes, then, paradoxically, there is no problem that we cannot tackle. Death, for instance. Or the apparent incompatibility between the two pillars of modern physics, quantum theory and general relativity. Or global warming, which Mr. Deutsch believes we can overcome through innovation rather than drastic cutbacks in consumption. He gores the sacred cow of "sustainability": Societies are healthiest, he declares, not when they achieve equilibrium but when they are rapidly evolving.

May 28, 2011

"A Lonely Ghost Uttering a Truth that Nobody Would Ever Hear"

(p. 26) He was a lonely ghost uttering a truth that nobody would ever hear. But so long as he uttered it, in some obscure way the continuity was not broken. It was not by making yourself heard but by staying sane that you carried on the human heritage.

May 5, 2011

"When We Get 'Out of Book,' We Are at Our Most Human"

Source of book image: http://www.turingfilm.com/wp-content/uploads/2011/03/11-3-18-The-Most-Human-Human.jpg

To be an innovative entrepreneur is to "get out of book" in the language well-expressed below.

(p. A17) In chess, computers are strongest in the parts of the game in which human players rely most on memory: the opening and closing sequences. (Serious players learn strategies by rote, and the early stages of even grandmaster games contain few surprises for the cognoscenti.) Knowledge of these tried and tested moves is called "the book." By the middle section of a game, however, the number of permutations of moves is too vast for memorization to help. Here players need to get "out of book" and act unexpectedly, which is why computers--even Deep Blue--can struggle.

Mr. Christian elaborates on this distinction and applies it to human intelligence in general. For isn't it precisely when people refuse to get "out of book"--just following orders or playing their role--that we find them least human? Likewise, when we get "out of book," we are at our most human. Think of the difference between the waiter who runs through the usual routine and the one who responds to your order with a witticism. Remaining alive to what is mechanical or original in our own behavior can preserve a sense of human difference.

April 1, 2011

Autos Give Us Autonomy

The open road. Source of photo: online version of the NYT article quoted and cited below.

(p. 60) I've been converted by a renegade school of thinkers you might call the autonomists, because they extol the autonomy made possible by automobiles. Their school includes engineers and philosophers, political scientists like James Q. Wilson and number-crunching economists like Randal O'Toole, the author of the 540-page manifesto ''The Vanishing Automobile and Other Urban Myths.'' These thinkers acknowledge the social and environmental problems caused by the car but argue that these would not be solved -- in fact, would be mostly made worse -- by the proposals coming from the car's critics. They call smart growth a dumb idea, the result not of rational planning but of class snobbery and intellectual arrogance. They prefer to promote smart driving, which means more tolls, more roads and, yes, more cars.

. . .

(p. 65) . . . Macaulay . . . observed in the 19th century that ''every improvement of the means of locomotion benefits mankind morally and intellectually, as well as materially.''

. . .

In an essay called ''Autonomy and Automobility,'' Loren E. Lomasky, a professor of political philosophy at the University of Virginia, invokes Aristotle's concept of the ''self-mover'' to argue that the ability to move about and see the world is the crucial distinction between higher and lower forms of life and is ultimately the source of what Kant would later call humans' moral autonomy. ''The automobile is, arguably, rivaled only by the printing press (and perhaps within a few more years by the microchip) as an autonomy-enhancing contrivance of technology,'' he writes. The planners determined to tame sprawl, Lomasky argues, are the intellectual heirs of Plato and his concept of the philosopher-king who would impose order on the unenlightened masses.

February 24, 2011

Grammar Mavens Are "Guilty of Turning Superstitions into Rules"

Source of book image: http://static.letsbuyit.com/filer/images/uk/products/original/132/76/the-lexicographer-s-dilemma-the-evolution-of-proper-english-from-shakespeare-to-south-park-13276063.jpeg

(p. C29) It's getting harder to make a living as an editor of the printed word, what with newspapers and other publications cutting staff. And it will be harder still now that Jack Lynch has published "The Lexicographer's Dilemma," an entertaining tour of the English language in which he shows that many of the rules that editors and other grammatical zealots wave about like cudgels are arbitrary and destined to be swept aside as words and usage evolve.

. . .

"Too often," he writes, "the mavens and pundits are talking through their hats. They're guilty of turning superstitions into rules, and often their proclamations are nothing more than prejudice representing itself as principle."

And, as he notes in his final chapter, the grammatical doomsayers had better find themselves some chill pills fast, because the crimes-against-the-language rate is going to skyrocket here in the electronic age. There is already much whining about the goofy truncated vocabulary of e-mail and text messaging (a phenomenon Mr. Lynch sees as good news, not bad; to mangle the rules of grammar, you first have to know the rules). And the Internet means that English is increasingly a global language.

January 8, 2011

Longfellow Created a "Hero Whose Bravery Can Inspire"

(p. C13) When it comes to the galloping meter of a narrative poem with a message, Longfellow has no equal.

Unfortunately, this poetic tradition has fallen on hard times. Academics have come to prefer different forms--mainly lyrical verse on personal topics more suited to the tastes of intellectuals than the masses. In recent years, many of Longfellow's works have fallen out of literary anthologies. The reputations of his contemporaries Emily Dickinson and Walt Whitman have eclipsed his own.

In his day, however, Longfellow was America's most widely read poet--and his most widely read poem was interpreted as both a warning cry and a call to action on the eve of the Civil War. Yet Longfellow achieved a larger purpose, creating a national hero whose bravery can inspire his fellow citizens down the generations: "For, borne on the night wind of the past / Through all our history, to the last / In the hour of darkness and peril and need / The people will waken and listen to hear / The hurrying hoofbeats of that steed / And the midnight message of Paul Revere."

December 26, 2010

Alex Was No Birdbrain: "Wanna Go Back"

Alex on left, Irene Pepperberg on right. Source of photo: online version of the NYT review quoted and cited below.

(p. 8) "Alex & Me," Irene Pepperberg's memoir of her 30-year scientific collaboration with an African gray parrot, was written for the legions of Alex's fans, the (probably) millions whose lives he and she touched with their groundbreaking work on nonhuman communication.

. . .

Alex, . . . , is a delight -- a one-pound, three-dimensional force of nature. Mischievous and cocky, he also gets bored and frustrated. (And who wouldn't, when asked to repeat tasks 60 times to ensure statistical significance?) He shouts out correct answers when his colleagues (other birds) fail to produce them. If Pepperberg inadvertently greets another bird first in the morning, Alex sulks all day and refuses to cooperate. He demands food, toys, showers, a transfer to his gym.

This ornery reviewer tried to resist Alex's charms on principle (the principle that says any author who keeps telling us how remarkable her subject is cannot possibly be right). But his achievements got the better of me. During one training session, Alex repeatedly asked for a nut, a request that Pepperberg refused (work comes first). Finally, Alex looked at her and said, slowly, "Want a nut. Nnn . . . uh . . . tuh."

"I was stunned," Pepperberg writes. "It was as if he were saying, 'Hey, stupid, do I have to spell it out for you?' " Alex had leaped from phonemes to sound out a complete word -- a major leap in cognitive processing. Perching near a harried accountant, Alex asks over and over if she wants a nut, wants corn, wants water. Frustrated by the noes, he asks, "Well, what do you want?" Mimicry? Maybe. Still, it made me laugh.

After performing major surgery on Alex, a doctor hands him, wrapped in a towel, to an overwrought Pepperberg. Alex "opened an eye, blinked, and said in a tremulous voice, 'Wanna go back.' " It's a phrase Alex routinely used to mean "I'm done with this, take me back to my cage." The scene is both wrenching -- Alex had been near death -- and creepy, evoking the talking bundle in "Eraserhead."

Pepperberg frames her story with Alex's death: the sudden shock of it, and the emotional abyss into which she fell. Ever the scientist, she wonders why she felt so strongly. The answer she comes up with is both simple -- her friend was dead -- and complex. At long last, and buoyed by the outpouring of support from people around the world, she could express the emotions she'd kept in check for 30 years, the better to convince the scientific establishment that she was a serious researcher generating valid and groundbreaking data (some had called her claims about animal minds "vacuous"). When Alex died, that weight lifted.

(Note: the online version of the review has the date November 7, 2010.)

(p. A21) Even up through last week, Alex was working with Dr. Pepperberg on compound words and hard-to-pronounce words. As she put him into his cage for the night last Thursday, she recalled, Alex looked at her and said: "You be good, see you tomorrow. I love you."

I asked him why more researchers weren't working with African grays, trying to replicate Pepperberg's achievements with Alex. "The problem with these animals is that they are the opposite of fruit flies," he said, meaning that parrots live a long time--often, fifty to sixty years in captivity. "Alex was still learning when he died, and he was thirty." He later elaborated: "Irene's work could not really have been planned ahead, as nobody knew what was possible. . . . Alex's development as a unique animal accompanied Irene's as a unique scientist. Hers is not a career trajectory one would advise to young scientists--it's too risky."

December 5, 2010

A Key to Scientific Truth: Nullius in Verba ("On No One's Word")

(p. 68) . . . scientific understanding didn't progress by looking for truth; it did so by looking for mistakes.

This was new. In the cartoon version of the Scientific Revolution, science made its great advances in opposition to a heavy-handed Roman Catholic Church; but an even larger obstacle to progress in the understanding and manipulation of nature was the belief that Aristotle had already figured out all of physics and had observed all that biology had to offer, or that Galen was the last word in medicine. By this standard, the real revolutionary manifesto of the day was written not by Descartes, or Galileo, but by the seventeenth-century Italian poet and physician Francesco Redi, in his Experiments on the Generation of Insects, who wrote (as one of a hundred examples), "Aristotle asserts that cabbages produce caterpillars daily, but I have not been able to witness this remarkable reproduction, though I have seen many eggs laid by butterflies on the cabbage-stalks. . . ." Not for nothing was the motto of the Royal Society nullius in verba: "on no one's word."

Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

(Note: first ellipsis added; italics and second ellipsis, in original.)

November 7, 2010

How Scientific Progress Was Slowed By Too Much Respect for Aristotelian Theory

William Rosen has a wonderful early example of how too much respect for theory can keep us from making the observations that would eventually prove the theory to be wrong:

(p. 7) Aristotle argued against the existence of a vacuum with unerring, though curiously inelegant, logic. His primary argument ran something like this:

1. If empty space can be measured, then it must have dimension.
2. If it has dimension, then it must be a body (this is something of a tautology: by Aristotelian definition, bodies are things that have dimension).
3. Therefore, anything moving into such a previously empty space would be occupying the same space simultaneously, and two bodies cannot do so.

More persuasive was the argument that a void is unnecessary, that since the fundamental character of an object consists of those measurable dimensions, then a void with the same dimensions as the cup, or horse, or ship occupying it is no different from the object. One, therefore, is redundant, and since the object cannot be superfluous, the void must be.

It takes millennia to recover from that sort of unassailable logic, temptingly similar to that used in Monty Python and the Holy Grail to demonstrate that if a woman weighs as much as a duck, she is a witch. Aristotle's blind spot regarding the existence of a void would be inherited by a hundred generations of his adherents. Those who read the work of Heron did so through an Aristotelian scrim on which was printed, in metaphorical letters twenty feet high: NATURE ABHORS A VACUUM.

Source:

Rosen, William. The Most Powerful Idea in the World: A Story of Steam, Industry, and Invention. New York: Random House, 2010.

September 19, 2010

Harry Frankfurt's Critique of Postmodernist "Bullshit"

"Harry G. Frankfurt." Source of caption and photo: online version of the NYT article quoted and cited below.

(p. 29) Q: Your new book, "On Truth," is a sequel to "On Bull--," a slim philosophical tract published by Princeton University Press that became an accidental best seller last year.
What do you mean by accidental? People didn't know they were buying it?

. . .

In your new book, you are especially critical of academics and their theories of postmodernism, which treat all truth as an artificial construction as opposed to an independent reality.
I used to teach at Yale, which was at one time a center of postmodernist literary theory. Derrida was there. Paul de Man was there. I originally wrote the bull-- essay at Yale, and a physics professor told me that it was appropriate that this essay should have been written at Yale, because, after all, he said, Yale is the bull-- capital of the world.

But there is probably far more bull-- in politics and entertainment than in academia.
I hope so!

What about in philosophy, which you still teach?
I think there is a certain amount of bull-- in philosophy -- people pretending to have important ideas when they don't and obscuring the fact by using a lot of impenetrable language.

July 5, 2010

Life Is Too Short to Waste on Hypercomplex Music and Literature

(p. W14) Are certain kinds of modern art too complex for anybody to understand? Fred Lerdahl thinks so, at least as far as his chosen art form is concerned. In 1988 Mr. Lerdahl, who teaches musical composition at Columbia University, published a paper called "Cognitive Constraints on Compositional Systems," in which he argued that the hypercomplex music of atonal composers like Messrs. Boulez and Carter betrays "a huge gap between compositional system and cognized result." He distinguishes between pieces of modern music that are "complex" but intelligible and others that are excessively "complicated"--containing too many "non-redundant events per unit [of] time" for the brain to process. "Much contemporary music," he says, "pursues complicatedness as compensation for a lack of complexity." (To read his paper online, go to: http://www.bussigel.com/lerdahl/pdf/Cognitive%20Constraints%20on%20Compositional%20Systems.pdf)

. . .

Mr. Lerdahl is on to something, and it is applicable to the other arts, too. Can there be any doubt that "Finnegans Wake" is "complicated" in precisely the same way that Mr. Lerdahl has in mind when he says that a piece of hypercomplex music like Mr. Boulez's "Le marteau sans maître" suffers from a "lack of redundancy" that "overwhelms the listener's processing capacities"?

. . .

"You have turned your back on common men, on their elementary needs and their restricted time and intelligence," H.G. Wells complained to Joyce after reading "Finnegans Wake." That didn't faze him. "The demand that I make of my reader," Joyce said, "is that he should devote his whole life to reading my works." To which the obvious retort is: Life's too short.

April 4, 2010

Philosopher Duped by Hoax Because He Failed to Consult Wikipedia

(p. A4) PARIS -- For the debut of his latest weighty title, "On War in Philosophy," the French philosopher Bernard-Henri Lévy made the glossy spreads of French magazines with his trademark panache: crisp, unbuttoned white Charvet shirts, golden tan and a windswept silvery mane of hair.

But this glamorous literary campaign was suddenly marred by an absolute philosophical truth: Mr. Lévy backed up the book's theories by citing the thought of a fake philosopher. In fact, the sham philosopher has never been a secret, and even has his own Wikipedia entry.

In the uproar that followed over the rigors of his research, Mr. Lévy on Tuesday summed up his situation with one e-mailed sentence: "My source of information is books, not Wikipedia."

April 2, 2010

"Expert Scholarship" Versus "People of Dubious Background"

(p. 71) The acknowledgment, by name, of volunteers in the preface sections of the OED is akin to Wikipedia's edit history, where one can inspect who contributed to each article. Some Oxford contributors were professors, some royalty, but most were ordinary folks who answered the call. Winchester, in The Professor and the Madman: A Tale of Murder, Insanity, and the Making of the Oxford English Dictionary, tells the story of the "madman" William Chester Minor, a U.S. Civil War survivor whose "strange and erratic behavior" resulted in him shooting an "innocent working man" to death in the street in Lambeth. He was sent to Broadmoor asylum for criminal lunatics. He discovered the OED as a project around 1881, when he saw the "Appeal for Readers" in the library, and worked for the next twenty-one years contributing to the project, receiving notoriety as a contributor "second only to the contributions of Dr. Fitzedward Hall in enhancing our illustration of the literary history of individual words, phrases and constructions." Minor did something unusual in not just sending submissions, but having his own cataloging system such that the dictionary editors could send a postcard and "out the details flowed, in abundance and always with unerring accuracy." Until Minor and Murray met in January 1891, no one working with (p. 72) the OED knew their prolific contributor was a madman and murderer housed at Broadmoor.

As we will see in later chapters, a common question of the wiki method is whether one can trust information created by strangers and people of dubious background. But the example of the OED shows that using contributors rather than original expert scholarship is not a new phenomenon, and that projects built as a compendium of primary sources are well suited for harnessing the power of distributed volunteers.

Source:

Lih, Andrew. The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia. New York: Hyperion, 2009.

January 24, 2010

"Better to Be Socrates Dissatisfied than a Fool Satisfied"

(p. 10) Happiness is, . . . , a complex concept and difficult to measure, and John Stuart Mill had a point when he suggested: "It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied."

January 14, 2010

(p. 182) . . . , Poincaré's elegant math prevailed over Boltzmann's practical findings. For some thirty years, Boltzmann struggled to get his ideas across. But he failed. He had the word, but he could not find a way to gain its acceptance in the world. For long decades, the establishment held firm.

So in the year 1906, Poincaré became president of the French (p. 183) Académie des Sciences and Boltzmann committed suicide. As Mead debatably puts it, "Boltzmann died because of Poincaré." At least, as Boltzmann's friends attest, this pioneer of the modern era killed himself in an apparent fit of despair, deepened by the widespread official resistance to his views.

He died, however, at the very historic moment when all over Europe physicists were preparing to vindicate the Boltzmann vision. He died just before the findings of Max Planck, largely derived from Boltzmann's probability concepts, finally gained widespread acceptance. He died several months after an obscure twenty-one-year-old student in Geneva named Albert Einstein used his theories in proving the existence of the atom and demonstrating the particle nature of light. In retrospect, Boltzmann can be seen as a near-tragic protagonist in the greatest intellectual drama of the twentieth century: the overthrow of matter.

November 28, 2009

Source of book images: online version of the NYT review quoted and cited below.

(p. C6) Ayn Rand poses theatrically in her signature cape and gold dollar-sign pin on the cover of a groundbreaking new biography. Rand also poses theatrically in this same Halloween-ready costume (Rand impersonators have been known to wear it) on the cover of another groundbreaking new biography. The two books are being published a week apart. And both have gray covers that make them look even more interchangeable. Yet Rand, whose Objectivist philosophy is enjoying one of its periodic resurgences, loathed the very idea of grayness. She preferred dichotomies that were strictly black and white.

. . .

Ms. Heller's book is worth its $35 price, which is not the kind of detail that Rand herself would have been shy about trumpeting. When Russian Bolshevik soldiers commandeered and closed the St. Petersburg pharmacy run by Zinovy Rosenbaum, they made a lifelong capitalist of his 12-year-old daughter, Alissa, who would wind up fusing the subversive power of the Russian political novel with glittering Hollywood-fueled visions of the American dream.

. . .

Crucially, both authors understand the reasons that Rand's popularity has endured, not only among college students dazzled (and thronged into packs) by her triumphant individualism but also by entrepreneurs. From the young Ted Turner, who rented billboards to promote the "Who is John Galt?" slogan from "Atlas Shrugged," to the founders of Craigslist and Wikipedia, who have found self-contradictory new ways to mix populism with individual enterprise, it is clear that (in Ms. Burns's words) "reports of Ayn Rand's death are greatly exaggerated."

(p. A21) Luigi Zingales points out that the legitimacy of American capitalism has rested on the fact that many people, like Warren Buffett and Bill Gates, got rich on the basis of what they did, not on the basis of government connections. But over the years, business and government have become more intertwined. The results have been bad for both capitalism and government. The banks' growing political clout led to the rule changes that helped create the financial crisis.

October 20, 2009

Scientist Huxley: "The Great End of Life is Not Knowledge But Action"

John Barry calls our attention to the views of Thomas Huxley who gave the keynote address at the founding of the Johns Hopkins University:

(p. 13) A brilliant scientist, later president of the Royal Society, he advised investigators, "Sit down before a fact as a little child, be prepared to give up every preconceived notion. Follow humbly wherever and to whatever abysses nature leads, or you shall learn nothing." He also believed that learning had purpose, stating, "The great end of life is not knowledge but action."

Source:

Barry, John M. The Great Influenza: The Story of the Deadliest Pandemic in History. Revised ed. New York: Penguin Books, 2005.

(Note: from the context in Barry, I am not certain whether the Huxley quotes are from the keynote address, or from elsewhere in Huxley's writings.)

July 25, 2009

The Epistemological Implications of Wikipedia

Source of book image: online version of the WSJ review quoted and cited below.

I think the crucial feature of Wikipedia is in its being quick (what "wiki" means in Hawaiian), rather than in its current open source model. Academic knowledge arises in a slow, vetted process. Publication depends on refereeing and revision. On Wikipedia (and the web more generally) knowledge is posted first, and corrected later.

In actual fact, Wikipedia's coverage is vast, and its accuracy is high.

(Chris Anderson has a nice discussion of Wikipedia in The Long Tail, starting on p. 65.)

(p. A13) Until just a couple of years ago, the largest reference work ever published was something called the Yongle Encyclopedia. A vast project consisting of thousands of volumes, it brought together the knowledge of some 2,000 scholars and was published, in China, in 1408. Roughly 600 years later, Wikipedia surpassed its size and scope with fewer than 25 employees and no official editor.

In "The Wikipedia Revolution," Andrew Lih, a new-media academic and former Wikipedia insider, tells the story of how a free, Web-based encyclopedia -- edited by its user base and overseen by a small group of dedicated volunteers -- came to be so large and so popular, to the point of overshadowing the Encyclopedia Britannica and many other classic reference works. As Mr. Lih makes clear, it wasn't Wikipedia that finished off print encyclopedias; it was the proliferation of the personal computer itself.

. . .

By 2000, both Britannica and Microsoft had subscription-based online encyclopedias. But by then Jimmy Wales, a former options trader in Chicago, was already at work on what he called "Nupedia" -- an "open source, collaborative encyclopedia, using volunteers on the Internet." Mr. Wales hoped that his project, without subscribers, would generate its revenue by selling advertising. Nupedia was not an immediate success. What turned it around was its conversion from a conventionally edited document into a wiki (Hawaiian for "fast") -- that is, a site that allowed anyone browsing it to edit its pages or contribute to its content. Wikipedia was born.

The site grew quickly. By 2003, according to Mr. Lih, "the English edition had more than 100,000 articles, putting it on par with commercial online encyclopedias. It was clear Wikipedia had joined the big leagues." Plans to sell advertising, though, fell through: The user community -- Wikipedia's core constituency -- objected to the whole idea of the site being used for commercial purposes. Thus Wikipedia came to be run as a not-for-profit foundation, funded through donations.

. . .

It is clear by the end of "The Wikipedia Revolution" that the site, for all its faults, stands as an extraordinary demonstration of the power of the open-source content model and of the supremacy of search traffic. Mr. Lih observes that when "dominant encyclopedias" were still hiding behind "paid fire walls" -- and some still are -- Wikipedia was freely available and thus easily crawled by search engines. Not surprisingly, more than half of Wikipedia's traffic comes from Google.

July 1, 2009

RIP Marjorie Grene, Who Helped Polanyi with Personal Knowledge

"Marjorie Grene in 2003." Source of photo and caption: online version of the NYT obituary quoted and cited below.

The NYT reported, in the obituary quoted below, that philosopher Marjorie Grene died on March 16, 2009, at the age of 93.

Although I studied philosophy at the University of Chicago, my time there did not overlap with Marjorie Grene's and I don't believe that I ever met her, or ever even heard her speak (though I did occasionally walk past her former husband David Grene, on my way to talk to Stephen Toulmin).

I am increasingly appreciating Michael Polanyi's book Personal Knowledge in which he introduced his view of what he called "tacit knowledge." In particular, I am coming to believe that tacit knowledge is very important in understanding the role and importance of the entrepreneur.

So if Marjorie Grene was crucial to Personal Knowledge, as is indicated in the obituary quoted below, then she is deserving of serious consideration, and high regard.

(p. 23) In Chicago, she had met Michael Polanyi, a distinguished physical chemist turned philosopher; she ended up helping him research and develop his important book "Personal Knowledge" (1958). The book proposed a far more nuanced, personal idea of knowledge, and directly addressed approaches to science.

"There is hardly a page that has not benefited from her criticism," Dr. Polanyi wrote in his acknowledgments. "She has a share in anything I may have achieved here."

. . .

Her sense of humor sparkled when she was asked about being the first woman to have an edition of the Library of Living Philosophers devoted to her -- Volume 29 in 2002. Previous honorees included Bertrand Russell and Einstein. "I thought they must be looking desperately for a woman," Dr. Grene said.

June 6, 2009

The Ascent of Science Led to Belief that the World Could Improve

I believe the following paragraph expresses the central message of Steven Johnson's book The Invention of Air:

(p. 211) In the popular folklore of American History, there is a sense in which the founders' various achievements in natural philosophy---Franklin's electrical experiments, Jefferson's botany---serve as a (p. 212) kind of sanctified extracurricular activity. They were statesmen and political visionaries who just happened to be hobbyists in science, albeit amazingly successful ones. Their great passions were liberty and freedom and democracy; the experiments were a side project. But the Priestley view suggests that the story has it backward. Yes, they were hobbyists and amateurs at natural philosophy, but so were all the great minds of Enlightenment-era science. What they shared was a fundamental belief that the world could change---that it could improve--- if the light of reason was allowed to shine upon it. And that belief emanated from the great ascent of science over the past century, the upward trajectory that Priestley had so powerfully conveyed in his History and Present State of Electricity. The political possibilities for change were modeled after the change they had all experienced through the advancements in natural philosophy. With Priestley, they grasped the political power of the air pump and the electrical machine.

Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

June 2, 2009

Adams, as a Point of Honor, Defended the Innovations of Science

(p. 211) It is no accident that, despite the long litany of injuries Adams felt had been dealt him in Jefferson's letters to Priestley, he chose to begin his counterassault by denying, as a point of honor, that he had ever publicly taken a position as president that was resistant to the innovations of science. Remember that Jefferson had also insinuated that Adams had betrayed the Constitution with his "libel on legislation." But Adams lashed out first at the accusation that he was anti-science. That alone tells us something about the gap that separates the current political climate from that of the founders.

Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

May 29, 2009

"The American Experiment Was, Literally, an Experiment"

(p. 199) This is politics seen through the eyes of an Enlightened rationalist. The American experiment was, literally, an experiment, like one of Priestley's elaborate concoctions in the Fair Hill lab: a system of causes and effects, checks and balances, that could only be truly tested by running the experiment with live subjects. The political order was to be celebrated not because it had the force of law, or divine right, or a standing army behind it. Its strength came from its internal balance, or homeostasis, its ability to rein in and subdue efforts to destabilize it.

Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

May 25, 2009

In the United States "Innovation" Became a Positive Word

(p. 198) "All advances in science were proscribed as innovations." Jefferson is using the older, negative sense of the word "innovation" here: a new development that threatened the existing order in a detrimental way. (The change in the valence of the word over the next century is one measure of society's shifting relationship to progress.) But that regressive age was now over, and Priestley--the most forward-thinking mind of his generation--could now consider himself fully at home:

Our countrymen have recovered from the alarm into which art and industry had thrown them: science and honesty are replaced on their high ground, and you, my dear Sir, as their great apostle, are on its pinnacle. It is with heartfelt satisfaction that in the first moments of my public action, I can hail you with welcome to our land, tender to you the homage of its respect and esteem, cover you under the protection of those laws which were made for the wise and good like you, and disdain the legitimacy of that libel on legislation which under the form of a law was for some time placed among them.

Perhaps inspired by the legendary optimism of Priestley himself, Jefferson then added some of the most stirringly hopeful words that he ever put to paper:

(p. 199) As the storm is now subsiding, and the horizon becoming serene, it is pleasant to consider the phenomenon with attention. We can no longer say there is nothing new under the sun. For this whole chapter in the history of man is new. The great extent of our Republic is new. Its sparse habitation is new. The mighty wave of public opinion which has rolled over it is new. But the most pleasing novelty is, it's so quietly subsiding over such an extent of surface to its true level again. The order and good sense displayed in this recovery from delusion, and in the momentous crisis which lately arose, really bespeak a strength of character in our nation which augurs well for the duration of our Republic; and I am much better satisfied now of it's stability than I was before it was tried.

Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

May 21, 2009

Mary Priestley Praises the Middle Class

(p. 86) Joseph and Mary had not exactly entered English high society, but for the first time in their lives, they were down the hall from it. Mary was largely unimpressed by her firsthand view of the upper classes. One story has Shelburne arriving to welcome them at their new house in Calne, and finding Mary on a ladder, industriously papering the walls. Joseph apologized for their not providing a more gracious welcome, but Mary quickly dismissed her husband's proprieties. "Lord Shelburne is a statesman," she said, "and knows that people are best employed in doing their duty." Later she would observe candidly to (p. 87) Shelburne, "I find the conduct of the upper so exactly like that of the lower classes that I am thankful I was born in the middle."

Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

May 19, 2009

Bacon Died Experimenting and Hegel Died Contradicting Himself

(p. C32) The philosopher Francis Bacon, that great champion of the empirical method, died of his own philosophy: in an effort to observe the effects of refrigeration, on a freezing cold day he stuffed a chicken with snow and caught pneumonia.

As a philosopher dies, so he has lived and believed. And from the manner of his dying we can understand his thinking, or so the philosopher Simon Critchley seems to be saying in his cheekily titled "Book of Dead Philosophers."

. . .

Mr. Critchley recounts that Voltaire, after decades of denouncing the Roman Catholic Church, announced on his deathbed that he wanted to die a Catholic. But the shocked parish priest kept asking him, "Do you believe in the divinity of Christ?" Voltaire begged, "In the name of God, Monsieur, don't speak to me any more of that man and let me die in peace."

Hegel, who as much as any philosopher, Mr. Critchley says, saw philosophy as an abstraction, moaned while he was dying of cholera, "Only one man ever understood me ... and he didn't understand me."

May 17, 2009

Joe Biden's "First Principle of Life": "Get Up!"

(p. xxii) To me this is the first principle of life, the foundational principle, and a lesson you can't learn at the feet of any wise man: Get up! The art of living is simply getting up after you've been knocked down. It's a lesson taught by example and learned in the doing. I got that lesson every day while growing up in a nondescript split-level house in the suburbs of Wilmington, Delaware. My dad, Joseph Robinette Biden Sr., was a man of few words. What I learned from him, I learned from watching. He'd been knocked down hard as a young man, lost something he knew he could never get back. But he never stopped trying. He was the first one up in our house every morning, clean-shaven, elegantly dressed, putting on the coffee, getting ready to go to the car dealership, to a job he never really liked. My brother Jim said most mornings he could hear our dad singing in the kitchen. My dad had grace. He never, ever gave up, and he never complained. "The world doesn't owe you a living, Joey," he used to say, but without rancor. He had no time for self-pity. He didn't judge a man by how many times he got knocked down but by how fast he got up.

Get up! That was his phrase, and it has echoed through my life. The world dropped you on your head? My dad would say, Get up! You're lying in bed feeling sorry for yourself? Get up! You got knocked on your ass on the football field? Get up! Bad grade? Get up! The girl's parents won't let her go out with a Catholic boy? Get up!

Source:

Biden, Joe. Promises to Keep: On Life and Politics. New York: Random House, 2007.

April 21, 2009

An Intellectual Collaboration Beyond the Grave

There is something touchingly noble in this:

(p. 11) There is no direct evidence in the historical record, but it is entirely probable that it was the waterspout sighting that sent Priestley off on his quest to measure the temperature of the sea, trying to marshal supporting evidence for a passing conjecture his friend had made a decade before. Franklin had been dead for nearly four years, but their intellectual collaboration continued, undeterred by war, distance, even death.

Source:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

April 19, 2009

Why Disney Was a Better Artist than Picasso

Source of book image: http://ebooks-imgs.connect.com/ebooks/product/400/000/000/000/000/035/806/400000000000000035806_s4.jpg

(p. 275) The popularity of the creative arts, and the influence they exert, will depend ultimately on their quality and allure, on the delight and excitement they generate, and on demotic choices. Picasso set his faith against nature, and burrowed within himself. Disney worked with nature, stylizing it, anthropomorphizing it, and surrealizing it, but ultimately reinforcing it. That is why his ideas form so many powerful palimpsests in the visual vocabulary of the world in the early twenty-first century, and will continue to shine through, while the ideas of Picasso, powerful though they were for much of the twentieth century, will gradually fade and seem outmoded, as representational art returns to favor. In the end nature is the strongest force of all.

Source:

Johnson, Paul M. Creators: From Chaucer and Durer to Picasso and Disney. New York: HarperCollins, 2006.

(Note: I am grateful to John Devereux for telling me about Paul Johnson's views on Picasso and Disney.)

April 14, 2009

Steven Johnson's The Invention of Air

Source of book image: http://stevenberlinjohnson.typepad.com/photos/uncategorized/2008/09/10/invention_final_81908.jpg

Steven Johnson's The Ghost Map, about the determined entrepreneurial detective work that uncovered the cause of cholera, is one of my all-time favorite books, so I am now in the mode of reading everything else that Steven Johnson has written, or will write.

His most recent book, The Invention of Air, is not as spectacular as The Ghost Map, but it is well written, on a thought-provoking topic. It focuses on Joseph Priestley's role in the American Revolution. Priestley is best known as an early chemist, but Johnson paints him as a polymath whose science was of a piece with his philosophy, politics, and religion.

Johnson's broader point is that for many of the founding fathers, science was not a compartment of their lives, but part of the whole cloth (hey, it's my blog, so I can mix as many metaphors as I want to).

And the neat bottom line is that Priestley's method of science (and polity) is the same broadly empirical/experimental/entrepreneurial method that usually leads to truth and progress.

Along the way, Johnson makes many amusing and thought-provoking observations, such as the paragraphs devoted to his coffee-house theory of the enlightenment. (You see, coffee makes for clearer thinking than beer.)

The book:

Johnson, Steven. The Invention of Air: A Story of Science, Faith, Revolution, and the Birth of America. New York: Riverhead Books, 2008.

April 9, 2009

How Ayn Rand Matters Today

(p. A7) Ayn Rand died more than a quarter of a century ago, yet her name appears regularly in discussions of our current economic turmoil. Pundits including Rush Limbaugh and Rick Santelli urge listeners to read her books, and her magnum opus, "Atlas Shrugged," is selling at a faster rate today than at any time during its 51-year history.

. . .

Rand . . . noted that only an ethic of rational selfishness can justify the pursuit of profit that is the basis of capitalism -- and that so long as self-interest is tainted by moral suspicion, the profit motive will continue to take the rap for every imaginable (or imagined) social ill and economic disaster. Just look how our present crisis has been attributed to the free market instead of government intervention -- and how proposed solutions inevitably involve yet more government intervention to rein in the pursuit of self-interest.

Rand offered us a way out -- to fight for a morality of rational self-interest, and for capitalism, the system which is its expression. And that is the source of her relevance today.

March 21, 2009

The Values of the Belgian Diamond Market

"Orthodox Jews have been at the center of Antwerp's diamond trade since the late 19th century, when they fled Eastern Europe." Source of the caption and photo: online version of the NYT article quoted and cited below.

Markets will work better when a critical mass of participants hold certain core values, including those of tolerance and honesty.

(p. A11) ANTWERP, Belgium -- Teetering on their bicycles or strolling amiably while chattering into cellphones in Yiddish, Dutch, French, Hebrew or English, the Orthodox Jews of this Belgian port city have set the tone of its lively diamond market for more than a century.

Hoveniersstraat, or Gardener's Street, is the backbone of the market, where four-fifths of the world's uncut diamonds are traded. It winds past the L & A Jewelry Factory and the office of Brinks, the armored car company, and on to the World Diamond Center just opposite the little Sephardic synagogue. On any given day but Friday, it is sprinkled liberally with Orthodox Jewish diamond traders, many of them Hasidim.

. . .

Ari Epstein, 33, is the son of a diamond trader, whose father emigrated from a village in Romania in the 1960s. "It's a typical shtetl environment," he said, wearing the yarmulke with a business suit. "It's live and let live. Most important is to do business together and to be honorable."

February 22, 2009

The Future Is "a Whirlpool of Uncertainty"

(p. B1) Nearly all of us try forecasting the market as if each of the past returns of every year in history had been written on a separate slip of paper and tossed into a hat. Before we reach into the hat, we imagine which return we are most likely to pluck out. Because the long-term average annual gain is about 10%, we "anchor" on that number, then adjust it up or down a bit for our own bullishness or bearishness.

But the future isn't a hat full of little shredded pieces of the past. It is, instead, a whirlpool of uncertainty populated by what the trader and philosopher Nassim Nicholas Taleb calls "black swans" -- events that are hugely important, rare and unpredictable, and explicable only after the fact.

January 28, 2009

Even Dogs "Have a Sense of Fairness"

"This series of photos from the National Academy of Sciences shows a dog being asked for its paw and obeying, left. In the second photo, the dog watches its partner in the experiment receive a food reward that it didn't receive. In the third photo, the dog refuses to give its paw and avoids looking at the experimenter." Source of caption and photos: online version of the Omaha World-Herald article quoted and cited below.

(p. 2A) Ask them to do a trick, and they'll give it a try. For a reward, they'll happily keep at it.

But if one dog gets no reward and then sees another dog get a treat for doing the same trick, just try to get the first one to do it again.

Indeed, the animal may turn away and refuse to look at you.

Dogs, like people and monkeys, seem to have a sense of fairness.

. . .

In the experiments described in today's edition of Proceedings of the National Academy of Sciences, Range and colleagues experimented with dogs that understood the command "paw" to place a paw in the hand of a researcher. It's the same game as teaching a dog to "shake hands."

. . .

The dogs sat side by side with an experimenter in front of them. In front of the experimenter was a divided food bowl with pieces of sausage on one side and brown bread on the other.

The dogs were asked to shake hands and could see what reward the other dog received.

When one dog got a reward and the other didn't, the unrewarded animal stopped playing.

September 13, 2008

Do Not Apologize for Your Pursuit of Happiness

(p. A17) There is a whiff of hypocrisy here. Mr. Obama, who made $4.2 million last year and lives in a $1.65 million house bought with the help of the indicted Tony Rezko - and whose "elegant suits" and "impeccable ties" made him one of Esquire's Best-Dressed Men in the World - disdains college students who might want to "chase after the big house and the nice suits." Mr. McCain, who with his wife earned more than $6 million last year and who owns at least seven homes, ridicules Mr. Romney for having built businesses.

But hypocrisy is not the biggest issue. The real issue is that Messrs. Obama and McCain are telling us Americans that our normal lives are not good enough, that pursuing our own happiness is "self-indulgence," that building a business is "chasing after our money culture," that working to provide a better life for our families is a "narrow concern."

They're wrong. Every human life counts. Your life counts. You have a right to live it as you choose, to follow your bliss. You have a right to seek satisfaction in accomplishment. And if you chase after the almighty dollar, you just might find that you are led, as if by an invisible hand, to do things that improve the lives of others.

September 10, 2008

Americans Happy with Work if Advancement is Possible

Source of book image: http://www.arthurbrooks.net/images/book-2.gif

(p. A13) In "Gross National Happiness," Mr. Brooks has assembled an array of statistics to measure the mood of America's citizens and to discover the reasons they feel as they do. Most often he cites polls that ask for self-described happiness levels, matching up the answers with various beliefs, habits, life choices or experiences.

And what exactly is happiness? Who knows? The term might refer to joy or contentment or moral self-approval or material well-being or appetitive pleasure - or some combination of them all. Mr. Brooks is aware of the problem. He says that Potter Stewart, the Supreme Court justice, could have been describing happiness when he said, of pornography, "I know it when I see it."

. . .

He challenges those partial to tales about long-suffering Wal-Mart workers and surly burger flippers to rethink their victimology creed. The woe is not nearly as widespread as rumored: 89% of Americans who work more than 10 hours a week are very satisfied or somewhat satisfied with their jobs while only 11% are not very satisfied or not at all satisfied. Most surprisingly, Mr. Brooks writes, there "is no difference at all in job satisfaction between those with below-average and above-average incomes."

What really makes Americans hate their jobs is a perception that advancement is impossible. And while Mr. Brooks agrees that the nation's income gap is growing, the national happiness level is steady. Just under one-third of American adults say that they are "very happy"; up to 15% are not too happy; and everyone else is somewhere in the middle. Those numbers have been roughly true since the early 1970s. More government spending doesn't seem to raise happiness levels, though direct government assistance may diminish it. Charitable giving, Mr. Brooks adds, generally lifts the spirits; Americans do a lot of it.

August 9, 2008

Blacklisting of Voight Urged in Display of Liberal Hollywood McCarthyism

Source of the images: screen captures from the CNN report cited below.

With self-righteous indignation, the left often accuses the right of "McCarthyism."

But many on the left are happy to limit free speech when what is spoken is not to their liking.

Jon Voight's column in the Washington Times has ignited a firestorm, and caused at least one Hollywood insider to openly advocate blacklisting Voight from the movie business. The CNN story cited and linked below, gives some of the details.

Another example is from my own personal experience as a young scholar many decades ago. I had applied to three or four top PhD programs in philosophy and was initially rejected from every one of them, even though I had a nearly perfect GPA, and very high test scores.

I was especially surprised by the rejection from Chicago, because an Associate Dean had visited the Wabash campus the year before and talked with me about applying to Chicago. He had looked at my record and said, 'with your record, if you score X or above on the GREs, it is almost certain that you will be accepted.' (I don't remember the exact number he said.) Well, I scored above X, but was rejected. So I wrote to the Associate Dean, saying I was disappointed and asking if he had any insight about the rejection. He told me that he was dumbfounded and that he would look into it.

A while later, I received a letter reversing the decision of the University of Chicago Department of Philosophy. I never learned all the details, but apparently the Dean of Humanities had overruled the Department of Philosophy. (This is fairly unusual in academia, and though I do not remember her name, I salute that Dean for taking a stand.)

Years later, the episode came up in a conversation with a member of the philosophy faculty. He said that he had been on the admissions committee the year that I had applied, and that I had been rejected because I had mentioned Ayn Rand in my essay about how I had become interested in philosophy.

March 23, 2008

A Little Optimism Goes a Long Way

(p. B1) Far from deforming our view of the future, this penchant for life's silver lining shapes our decisions about family, health, work and finances in surprisingly prudent ways, concluded economists at Duke University in a new study published in the Journal of Financial Economics. "Economists have focused on optimism as a miscalibration, as a distorted view of the future," said Duke finance scholar David T. Robinson. "A little bit of optimism is associated with a lot of positive economic choices."

. . .

Optimists, the Duke finance scholars discovered, worked longer hours every week, expected to retire later in life, were less likely to smoke and, when they divorced, were more likely to remarry. They also saved more, had more of their wealth in liquid assets, invested more in individual stocks and paid credit-card bills more promptly.

Yet those who saw the future too brightly -- people who in the survey overestimated their own likely lifespan by 20 years or more -- behaved in just the opposite way, the researchers discovered.

Rather than save, they squandered. They postponed bill-paying. Instead of taking the long view, they barely looked past tomorrow. Statistically, they were more likely to be day traders. "Optimism is a little like red wine," said Duke finance professor and study co-author Manju Puri. "In moderation, it is good for you; but no one would suggest you drink two bottles a day."

January 8, 2008

"Working Families in France Want to Be Richer"

(p. 1) In proposing a tax-cut law last week, Finance Minister Christine Lagarde bluntly advised the French people to abandon their “old national habit.”

“France is a country that thinks,” she told the National Assembly. “There is hardly an ideology that we haven’t turned into a theory. We have in our libraries enough to talk about for centuries to come. This is why I would like to tell you: Enough thinking, already. Roll up your sleeves.”

Citing Alexis de Tocqueville’s “Democracy in America,” she said the French should work harder, earn more and be rewarded with lower taxes if they get rich.

. . .

(p. 9) The government’s call to work is crucial to its ambitious campaign to revitalize the French economy by increasing both employment and consumer buying power. Somehow Mr. Sarkozy and his team hope to persuade the French that it is in their interest to abandon what some commentators call a nationwide “laziness” and to work longer and harder, and maybe even get rich.

France’s legally mandated 35-hour work week gives workers a lot of leisure time but not necessarily the means to enjoy it. Taxes on high-wage earners are so burdensome that hordes have fled abroad. (Mr. Sarkozy cites the case of one of his stepdaughters, who works in an investment-banking firm in London.)

In her National Assembly speech, Ms. Lagarde said that there should be no shame in personal wealth and that the country needed tax breaks to lure the rich back.

. . .

“We are seeing an important cultural change,” said Eric Chaney, chief economist for Europe for Morgan Stanley. “Working families in France want to be richer. Wealth is no longer a taboo. There’s a strong sentiment in France that people think prices are too high and need more money. It’s not a question of thinking or not thinking.”

December 21, 2007

"People Giddy on Hope and Thrilled to Be Changing"

"Emily Prager at her lane house in Shanghai." Source of caption and photo: online version of the NYT article quoted and cited below.

The centers of dynamism are not set in stone. I once asked the philosopher Alan Donagan why the Scottish Enlightenment had occurred where (Edinburgh) and when (the mid-to-late 18th century) it did. With his usual good humor he told me that I was asking a bad question--that my question assumed that enlightenments were determined. He instead believed that they were chance occurrences resulting from the free-will choices of individuals.

I think that there was something to what he said. But I also believe that some institutions, and some policies of government, can greatly increase the probability that fruitful dynamism will occur. For instance, free markets tend to tolerate diversity and experimentation, and to reward initiative.

In the past, locations of economic dynamism were often also locations of intellectual dynamism. I wonder whether the connection still holds today, and if not, why not?

Among past centers of dynamism were Miletus, Athens, Florence, Amsterdam, Edinburgh, and New York City. Today, centers of economic dynamism include Las Vegas, Dubai and Shanghai. The article quoted below paints a generally appealing picture of Shanghai.

(p. D1) I decided to move myself and my 12-year-old daughter, Lulu — whom I had adopted as a baby in China — from the old capital of the world to the new: to make a home in Shanghai, a city of the future.

I knew something about Shanghai, having been here on trips several times in the last few years. The city was always so excited it could hardly contain itself. It is a microcosm of the Asian boom, stuffed with people giddy on hope and thrilled to be changing. It recalls the greatness of New York in the early ’70s, except for one thing: Like the rest of China, Shanghai was largely closed to the outside world, and real economic growth, for nearly 50 years after World War II. It is a place where every car on the road is brand new and every pet recently acquired, but the person you just met might trace his family back 70 generations. The modernity and polish that Manhattan learned between 1945 and 1995, Shanghai is cramming for as fast as it can, and it’s fascinating to watch.

. . .

(p. D6) Pets are new to Chinese people and they don’t know very much about them. Dogs are not neutered and they are walked without leashes. Many people are terrified of dogs, particularly given the country’s serious rabies problem.

Twice when I was walking Skippy, young women caught sight of him and screamed in terror at the top of their lungs. Because having a pet is so new, there is a video showing how to pick up after a dog and wash his paws after his walk, which appears many times a day on a huge video screen on Huaihai, the city’s other main shopping street.

"On the streets of Shanghai, the author's injured foot attracts less attention than her pet dog, still a rare sight in the city." Source of caption and photo: online version of the NYT article quoted and cited above.

December 17, 2007

Life Lesson #1: When Facing a Hungry Bear, the Fence is Your Friend

The photo was taken by Art Diamond at Omaha's Henry Doorly Zoo at about 6:00 PM on Wednesday, July 18, 2007. (It was 'Member's Day' and there were signs posted that 6:00 PM was a feeding time for the bears.)

December 15, 2007

Nozick (and Bush) Think it is Fair for You to Keep More of What You Earn

Source of table: online version of the NYT commentary quoted and cited below.

(p. 4) DO the rich pay their fair share in taxes? This is likely to become a defining question during the presidential campaign.

. . .

Fairness is not an economic concept. If you want to talk fairness, you have to leave the department of economics and head over to philosophy.

. . .

In his 1974 book, “Anarchy, State, and Utopia,” Professor Nozick wrote: “We are not in the position of children who have been given portions of pie by someone who now makes last-minute adjustments to rectify careless cutting. There is no central distribution, no person or group entitled to control all the resources, jointly deciding how they are to be doled out. What each person gets, he gets from others who give to him in exchange for something, or as a gift. In a free society, diverse persons control different resources, and new holdings arise out of the voluntary exchanges and actions of persons.”

To libertarians like Professor Nozick, requiring the rich to pay more just because they are rich is little more than officially sanctioned theft.

There is no easy way to bridge this philosophical divide, but the political process will, inevitably, try to forge a practical compromise among those with wildly divergent views. At the 2000 Republican National Convention, the candidate George W. Bush made clear where he stood: “On principle, no one in America should have to pay more than a third of their income to the federal government.” As judged by the C.B.O. data, he has accomplished his goal.

A question for any political candidate today is whether he or she agrees with the Bush tax ceiling. If not, how high above a third is he or she willing to go?

I believe that all art should be private art. But if the government is going to force art on us, it should at least commission art that most people find enjoyable to look at.

Tom Wolfe in The Painted Word skewered the pretension of modern "artists" whose "art" is not intended to please, but is intended to make some obscure philosophical point.

If somebody wants to privately finance such activity, fine. But don't force the rest of us to finance it through taxation.

(p. 1B) “So is this where they’re putting Stonehenge?”

Stan Hille was walking his dog through Elmwood Park when he stopped to ask me the question. He thought I was a city employee.

“Yes, it is,” I said as I stood near one of the gravel pads awaiting Leslie Iwai’s gigantic five-piece sculpture “Sounding Stones.” “But you don’t sound excited.”

“Well, I guess I’m not,” he said, stopping to contemplate art. “This thing just reminds me of that old question: ‘When is art not art?’ ”

Hmm. Great question. Ancient question. I suggested it might not be art until people, especially a commission of people, tells you it’s art. Or, if it’s big, it’s art. Or, if you make something and then say there’s some meaning to it, then maybe it’s art.

As we pondered, Hille’s dog defecated.

“Perhaps if I can find some meaning in this poop, then maybe it’s art,” I told him as I rubbed my chin.

The retired UNO professor and Dundee resident absorbed my genius. “Perhaps,” he responded, rubbing his chin also.

But, alas, I could find no meaning. “Perhaps its lack of meaning is its meaning,” I then argued, sounding not unlike French philosopher Jacques Derrida. “It’s post-postmodern ironic poop.”

. . .

For Hille and others around Elmwood Park, the bigger question seems to be aesthetics. Elmwood Park is a quiet forest setting. Is this really the place for large chunks of concrete, no matter what they mean? “It just doesn’t seem to fit,” Hille said. I’m with him on that. “Sounding Stones” might make more sense, or at least be better received, in the midst of, say, modern architecture, not nature. You know, perhaps put it downtown, where it could look like it fell off the old Union Pacific building. But if “Sounding Stones” does end up in Elmwood Park, which it most likely will, I’m guessing it still will end up being a positive move. Because, as with Hille and me, it’s going to get people thinking and talking about art. And even when you’re looking at dog droppings, taking time out of the day to contemplate art can’t be a completely bad thing.

Yes, Robert, it can be "a completely bad thing" if you have alternative uses for your time.

November 3, 2007

Not All World Views Can Be Accommodated

I remember in a philosophy class back in the 1970s, the philosopher Stephen Toulmin mentioning that he had once attended a conference with the famed anthropologist Claude Lévi-Strauss. For a long while Lévi-Strauss sat in silence. Finally he stirred himself to speak, and Toulmin wondered what wisdom the great man would pronounce.

His comment was something like: "It is hot in here. Will someone open a window?"

News of the death of the philosopher Richard Rorty on June 8 came as I was reading about a small Brazilian tribe that the French anthropologist Claude Lévi-Strauss studied in the 1930s. A strange accident, a haphazard juxtaposition — but for a moment this pragmatist philosopher and a fading tribal culture glanced against each other, revealing something unusual about the contemporary scene.

. . .

For Mr. Rorty, the importance of democracy is that it creates a liberal society in which rival truth claims can compete and accommodate each other. His pragmatism was postmodern, tolerant to a fault, its moral and progressive conclusions never appealing to a higher authority.

. . .

The Caduveo founding myth recounts that, lacking other gifts at the moment of creation, the tribe was given the divine right to exploit and dominate others.

. . .

But there was also something else about this tribe that drew Mr. Lévi-Strauss’s attention: “It was a society remarkably adverse to feelings that we consider as being natural.” Its members disliked having children. Abortion and infanticide were so common that the only way the tribe itself could continue was by adoption, and adoption — more properly called abduction — was traditionally implemented through warfare. The tribal disdain for nature extended into its active denigration of hair, agriculture, childbirth and even, perhaps, representational art.

. . .

In reasoning one’s way into pragmatism, in minimizing the importance of natural constraints and in dismissing the notion of some larger truth, the tendency is to assume that as different as we all are, we are at least prepared to accommodate ourselves to one another. But this is not something the Caduveo would necessarily have gone along with. Mr. Rorty’s outline of what he called “the utopian possibilities of the future” doesn’t leave much room for the kind of threat the Caduveo might pose, let alone other threats, still active in the world.

One tendency of pragmatism might be to so focus on the ways in which one’s own worldview is flawed that trauma is more readily attributed to internal failure than to external challenges. In one of his last interviews Mr. Rorty recalled the events of 9/11: “When I heard the news about the twin towers, my first thought was: ‘Oh, God. Bush will use this the way Hitler used the Reichstag fire.’ ”

If that really was his first thought, it reflects a certain amount of reluctance to comprehend forces lying beyond the boundaries of his familiar world, an inability fully to imagine what confrontations over truth might look like, possibly even a resistance to stepping outside of one’s skin or mental habits.

But in this too the Caduveo example may be suggestive. As Mr. Lévi-Strauss points out, neighboring Brazilian tribes were as hierarchical as the Caduveo but lacked the tribe’s sweeping “fanaticism” in rejecting the natural world. They reached differing forms of accommodation with their surroundings. The Caduveo, refusing even to procreate, didn’t have a chance. They survive now as sedentary farmers. Such a fate of denatured inconsequence may eventually be shared by absolutist postmodernism. The Caduveo’s ideas weren’t useful, perhaps. Some weren’t even true.

October 19, 2007

Business Should Stop Apologizing for Creating Wealth

David Kelley's op-ed piece, excerpted below, was published in the WSJ on October 10, 2007, the 50th anniversary of the publication of Ayn Rand's greatest novel.

Fifty years ago today Ayn Rand published her magnum opus, "Atlas Shrugged." It's an enduringly popular novel -- all 1,168 pages of it -- with some 150,000 new copies still sold each year in bookstores alone. And it's always had a special appeal for people in business. The reasons, at least on the surface, are obvious enough.

Businessmen are favorite villains in popular media, routinely featured as polluters, crooks and murderers in network TV dramas and first-run movies, not to mention novels. Oil company CEOs are hauled before congressional committees whenever fuel prices rise, to be harangued and publicly shamed for the sin of high profits. Genuine cases of wrongdoing like Enron set off witch hunts that drag in prominent achievers like Frank Quattrone and Martha Stewart.

By contrast, the heroes in "Atlas Shrugged" are businessmen -- and women. Rand imbues them with heroic, larger-than-life stature in the Romantic mold, for their courage, integrity and ability to create wealth. They are not the exploiters but the exploited: victims of parasites and predators who want to wrap the producers in regulatory chains and expropriate their wealth.

. . .

. . . At a crucial point in the novel, the industrialist Hank Rearden is on trial for violating an arbitrary economic regulation. Instead of apologizing for his pursuit of profit or seeking mercy on the basis of philanthropy, he says, "I work for nothing but my own profit -- which I make by selling a product they need to men who are willing and able to buy it. I do not produce it for their benefit at the expense of mine, and they do not buy it for my benefit at the expense of theirs; I do not sacrifice my interests to them nor do they sacrifice theirs to me; we deal as equals by mutual consent to mutual advantage -- and I am proud of every penny that I have earned in this manner…"

We will know the lesson of "Atlas Shrugged" has been learned when business people, facing accusers in Congress or the media, stand up like Rearden for their right to produce and trade freely, when they take pride in their profits and stop apologizing for creating wealth.

October 7, 2007

Thales of Miletus Lives

This is part entertaining rant and part serious epistemology. I've finished 9 of 19 chapters so far--and almost all of my reading time has been spent smiling.

Historians of Greek philosophy used to tell a story about one of the first philosophers, Thales of Miletus: once, while watching the stars, he fell into a well. The citizens of Miletus made fun of him for being an impractical philosopher. To prove them wrong, he used his knowledge to corner the market in something, and made a fortune.

Not a very plausible story, but appealing to us philosophers. (Like Thales, we like to think we could all be rich if we didn't have higher goals.)

Well, apparently Taleb is the real Thales. He wanted to be a philosopher, got rich on Wall Street using his epistemological insights, and is now using his wealth to finance his musings on whatever he cares to muse about.

Beautiful!

Here's an amusing sentence that broadened my grin. (It was even more amusing, and profound, in context, but I don't have time to type in the context for you.)

(p. 87) If you are a researcher, you will have to publish inconsequential articles in "prestigious" publications so that others say hello to you once in a while when you run into them at conferences.

July 31, 2006

"Capitalism has Not Corrupted Our Souls; It has Improved Them"

Deirdre McCloskey's unfashionable, contrarian and compelling manifesto in favor of what she calls the bourgeois virtues starts with an uncompromising "apology" for how private property, free labor, free trade and prudent calculation are the fount of most ethical good in modern society, not a moral threat to it.

The intelligentsia -- in thrall for centuries to religion and now to socialism -- has for a long time snobbishly despised the bourgeoisie that practices capitalism. Ms. McCloskey calls such people the "clerisy." Their values and virtues, like those of the proletariat and the aristocracy, are widely admired. But almost nobody admires the bourgeoisie. Yet it was for anti-bourgeois ideologies, she notes, that "the twentieth century paid the butcher's bill."

As Ms. McCloskey explains: "Anyone who after the twentieth century still thinks that thoroughgoing socialism, nationalism, imperialism, mobilization, central planning, regulation, zoning, price controls, tax policy, labor unions, business cartels, government spending, intrusive policing, adventurism in foreign policy, faith in entangling religion and politics, or most of the other thoroughgoing nineteenth-century proposals for government action are still neat, harmless ideas for improving our lives is not paying attention." By contrast, she argues, "capitalism has not corrupted our souls. It has improved them."