Friday, September 30, 2011

Daniel Kahneman and Nassim Taleb discuss biases, the illusion of patterns, as well as the perception of risk and denial

These guys are very good, and it is very surprising to me that they know each other well and talk all the time.
The key points here are:

Rare events occur.

It is generally easier to model common events than rare events.

People have two predictive systems, an intuitive one and an analytic system.

People have limited rationality.

If you provide a simple model that works almost all of the time, and you don't have a model that predicts both the common and the rare events, then people will use the simple model.

The intuitive predictive system will tend to expect things to continue; thus you will intuitively feel that a simple model that has worked successfully many times in the past will continue to work.

(Apparently) your intuitive system's feeling that the future will continue as the past tends to overpower your analytic system, which tells you that a rare event may occur.

If you build fragile systems, when the rare event does occur things will break.

This analysis explains why people live in beach communities subject to hurricanes, why Tokyo is built in a place where it may be destroyed by earthquake and typhoon, and why the financial system crashed in 2008.

Nobel Laureate Kahneman made the point that it is wrong to assume that firms are agents with independent will. The decisions made for firms are made by people in those firms. The incentives, risks and time frames of firm executives are not those of "the firm", its investors, or the general public. The financial meltdown was in part due to executives making decisions that paid off in terms of their own objectives and time frames but bankrupted their firms. Taleb points out that the result was that a lot of these firms were capitalist while the simple predictions were paying off (and executives went home with big pay envelopes) but became socialist when the rare event of the financial crash occurred (and the state was asked to pay the costs).

While the context of this talk given two and a half years ago was the financial crisis, the deep structure applies to many situations. I very much like Taleb's point that one should build structures and systems where possible so that rare events do not cause human catastrophes. Build earthquake resistant houses, hurricane resistant buildings, and bubble safe financial institutions.

The problem of course occurs in Haiti where day to day survival is so tough, where people are living in a continuing state of disaster, that you can't really expect them to move to places safe from earthquakes and tsunamis nor build safe houses rather than the shacks that they can barely afford.

Visionaries see a future of telecommuting workers, interactive libraries and multimedia classrooms. They speak of electronic town meetings and virtual communities. Commerce and business will shift from offices and malls to networks and modems. And the freedom of digital networks will make government more democratic.

Baloney. Do our computer pundits lack all common sense? The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works.

This is one of many examples of smart people making mistakes in prognostication of the impact of a new technology by assuming that the technology will not become better with time and effort, that the technological system will not spawn flocks of killer apps, and that the technology and its social construction will not evolve to produce something more useful and more used than what exists at the time of the forecast.

On the other hand, I think it useful to consider the dark side of the force. Are there things to worry about with respect to the growth of the Internet? Sure! We have become very dependent on the Internet, and so exposed to threats of cyber war, cyber terrorism, and cyber crime. The Internet is full of threats to public morality, and many of the ways that culture can evolve in the Internet era will be less than optimum. We are seeing systems of curation (e.g. librarians and expert booksellers providing advice), news gathering and the editing of news deteriorating as their old financial models are undermined by the Internet.

The question is how to worry about the realistic threats from the dark side without worrying unduly about current failures in the technology that will likely be corrected in time by the normal processes of cultural and technological evolution.

I was again watching the TV broadcast of Justice, the Harvard course taught by Michael Sandel. I have a few simple thoughts.

It seems to me that since we think with our brains, our concept of justice must come back to what or who we are. If a proposition does not correspond to what we intuitively feel to be just, then there is something deficient in it. As I consider theories of justice it seems to me that ethicists return to justifications based on what "feels right".

But what feels right is also a function of culture, what we learned at our mother's knee. I have always thought that Kant's deontological ethics owe something to his Germanic culture. That actually seems useful to me in that we have developed societies that seem much better at doing justice than those of the past. (See The Better Angels of Our Nature by Steven Pinker.) On the other hand, German culture brought us militarism and people who claimed they were only doing their duty when they conducted the Holocaust.

I also wonder whether it makes sense socially to go through all this complex analysis. In the famous example of the trolley car (which will either kill five people if allowed to continue on its course, or can be stopped by an action that would drop someone on the track and derail the trolley), going into a long, exhaustive analysis to make the right decision would leave the decision in abeyance. We recognize the heroism of the soldier who instantaneously throws himself on a grenade to save his buddies, doing so without any reflection. More generally, it would seem that the world is a better place if people choose lots of courses of action according to what feels just without much reflection, rather than spending a lot of time and effort reviewing the ethics of every decision.

Which brings me to the point that maybe we should be going towards an explicit theory of "situational ethics", not in the sense of Joseph Fletcher, but rather in the sense of the role of the person making a decision as to the just course of action. Thus the just course of action for a soldier may well be dependent on his/her role as a soldier in battle; the just course of action may depend on whether a person is in the role of a policeman, a doctor, a parent, or a bystander with none of those roles.

Utilitarian ethics seem to make a lot more sense to me for a legislator analyzing proposed legislation than for a parent deciding what to do with no money and a hungry child. If one is actually involved in a decision that may affect millions of people, with strong benefits and risks associated with several alternatives, a lengthy, detailed and expensive analysis may well be required.

The more I think about the U.S. Constitution, the more impressed I am by the authors who more than 200 years ago conceived of a system that would evolve, that would be based on negotiation among many conflicting interests, that would work to assure the rule of law rather than the rule of man, that would provide checks and balances between those who set policies, those charged with implementing them, and those charged with assuring the rule of law, and that would protect basic human rights against the will of a majority that would infringe upon those rights. Getting justice in the real world is hard, and the results of the Constitution have often been unjust, but could you have done better?

Tuesday, September 27, 2011

Historically, rights of diplomats and diplomatic missions evolved over time. Diplomats came to have rights of diplomatic immunity from the laws of the country to which they were accredited, and embassies and consulates came to be regarded as the territory of the country to which they belong.

When the United Nations was created, it was obvious that the United Nations would need a legal personality to enter into contracts, etc., and that its functionaries would need similar rights to those of diplomats. The United Nations created the Convention on the Privileges and Immunities of the United Nations to secure those rights, and the United States acceded to that Convention in 1970.

The United Nations also decided that the specialized agencies of the UN system (ILO, FAO, UNESCO, ICAO, IMF, IBRD, WHO, UPU, ITU and similar agencies) would also need legal personalities and employee rights recognized by the nations in which they were located and worked. The United Nations therefore created the Convention on the Privileges and Immunities of the Specialized Agencies. The United States has never acceded to that Convention.

A United Nations convention is essentially a multinational treaty, and conventions are regarded in U.S. law as treaties; accession is handled as a treaty ratification by the Government.

This is but one of many United Nations conventions that the United States has failed to ratify. Important examples exist in the field of human rights. The United States, in the person of Eleanor Roosevelt, chaired the UN committee that drafted the Universal Declaration of Human Rights. Declarations are passed by the UN General Assembly, but since they do not have the force of treaties, they do not require ratification. The member states of the United Nations therefore began to negotiate a number of conventions to give teeth to the protection of human rights. Several such conventions have been drafted, and have even gone into effect internationally when enough countries had acceded to them. The United States has not ratified conventions on the rights of women, rights of children, the International Criminal Court, the rights of workers, and rights of migrants. The United States and Somalia are the only two nations that have not ratified the Convention on the Rights of the Child!

Clearly, the United States should now ratify the Convention on the Privileges and Immunities of the Specialized Agencies.

There was a long decline of the GDPs of China and India as a percentage of the world total GDP from the beginning of the graphs until the second half of the 20th century when those countries started to regain GDP share.

Europe's share grew during the 19th century and declined in the 20th.

Japan's share declined through most of the 19th century, stabilized, and then grew until World War II. In the post war years, Japan's GDP grew very rapidly as a share of the global GDP, and then peaked and began to decline in the last decade or two.

The U.S. share of global GDP grew very strongly until the Great Depression, when it declined precipitously. It grew very rapidly during World War II, primarily because of the rapid growth in U.S. economic output, but also due to the reduction of output in other industrialized nations. A long decline in the second half of the 20th century has been reversed in recent decades.

The Great Depression and World War II both depressed the economies of the rest of the world.

Be a little careful with the data, since it is not clear exactly how they were developed or what measure of GDP is being used (nominal or PPP). The trends, however, are clear. The Depression and the world wars were bad for the global economy. Europe, which dominated the global economy in 1900, has seen its share in a long decline through the 20th century, and its influence in world affairs suffer correspondingly.

India and China, now with almost a third of global population, failed to grow with the developed nations for a very long time, but are now growing more rapidly than the global economy as a whole, beginning to recapture some of their past importance.

The persistence of these trends over many decades would tend to support Niall Ferguson's analysis that we are dealing with major institutional innovations and the time needed for their implantation and diffusion.

One suspects that governments do not take well to a long term reduction in international influence related to a long term reduction in comparative economic power of their nations. On the other hand, governments of nations that are experiencing a decades long increase in relative economic power seem likely to flex their political muscle.

Ferguson's major point is that the elements institutionalized in Western Society a few centuries ago that led to the great economic divergence are now diffusing to other continents and the process of divergence has been replaced by one of convergence. His graphs are very convincing.

His six killer apps, which he gets to almost 10 minutes into the presentation, include things like competition, the work ethic and consumerism, but number 2 is the Scientific Revolution. I suspect he is right in that that revolution affected both what we think and the way we think, and was very important in the rise of the West.

I would add the Industrial Revolution. The Scientific Revolution seems to me to deal with how we think about the world around us and how it works. The Industrial Revolution deals with technology, how we think about the ways we do and make things. I believe that there was a technological revolution that complements and was complemented by the Scientific Revolution.

An interesting question is whether we (civilization) have made, or will soon make, another killer app. After all, the six named by Ferguson occurred over a few hundred years and were not quickly recognized for their huge potential. Why should the institutional infrastructure of modern society not continue to evolve by the invention and institutionalization of new apps? Indeed, perhaps the Information Revolution will prove to be such a killer app for society.

In the background as I have been blogging, the television has been playing Justice: What's the Right Thing to Do?, the televised lectures on philosophy from Harvard University. A couple of things occurred to me in the process.

Students were asked whether it was right to sacrifice one person's life to save several, with various examples illustrating that the decision often depends on the circumstances. In some of the circumstances, the decision maker could in principle have sacrificed his/her own life to save those of others rather than sacrifice the life of an innocent bystander. None recognized the alternative of self-sacrifice. Yet we honor soldiers in battle who make just that sacrifice, sometimes with the Congressional Medal of Honor.

There was also a discussion of utilitarian theories of justice. It occurs to me that the way to determine the just course of action may well depend on the institutional location of the decision maker.

People in the White House and the Congress may well find cost-benefit analysis, seeking the greatest benefit for the greatest number to be an important tool in the search for justice. Of course, cost-benefit analysis is often done badly, and there is little moral justification for badly done analysis.

Google has a policy: "Don't be evil". Not a bad approach to justice for a corporation, except of course it is often difficult to determine whether a course of action is evil. Still, corporations that do cost-benefit calculations based on corporate profits and risks rather than the benefits and risks to their public are likely to do evil.

An individual seems well advised to follow his/her conscience and avoid the merely expedient. We admire those who rise to courageous self-sacrifice, but we seldom criticize those who fail that test, especially in the heat of the moment.

I suspect that ethics depend fundamentally on our evolved being. Ethics for a tiger must be different from those for a rabbit, and both from those for a man. We are evolved to make snap decisions in times of immediate peril, and we must recognize that such decisions may well appear less than ethical when reviewed at leisure. I find it hard to think that it is "right" to act in a way that we feel is morally wrong because of an abstract argument about the right. Indeed, I think a lot of ethical arguments are based in bringing principles into line with feelings.

In this respect, we are social animals, evolved to live in communities. Even other primates seem to have feelings that the right thing to do is in some cases to promote the common good. The extreme libertarian arguments that seem based on maximizing one's private good run into the reality that we have evolved to value our families and our communities.

(T)wo researchers gave 208 undergraduates a battery of trolleyological tests and measured, on a four-point scale, how utilitarian their responses were. Participants were also asked to respond to a series of statements intended to get a sense of their individual psychologies. These statements included, “I like to see fist fights”, “The best way to handle people is to tell them what they want to hear”, and “When you really think about it, life is not worth the effort of getting up in the morning”. Each was asked to indicate, for each statement, where his views lay on a continuum that had “strongly agree” at one end and “strongly disagree” at the other. These statements, and others like them, were designed to measure, respectively, psychopathy, Machiavellianism and a person’s sense of how meaningful life is.

Dr Bartels and Dr Pizarro then correlated the results from the trolleyology with those from the personality tests. They found a strong link between utilitarian answers to moral dilemmas (push the fat guy off the bridge) and personalities that were psychopathic, Machiavellian or tended to view life as meaningless. Utilitarians, this suggests, may add to the sum of human happiness, but they are not very happy people themselves.

Of course, as Machiavelli suggested, and I did above, people who make public policy may be well advised to base it on utilitarian principles. Perhaps the Machiavellian undergraduates will be those who in the future will be "the princes" of their time, making public policy.

Wednesday, September 21, 2011

Your Medical Mind: How to Decide What Is Right for You by Jerome Groopman and Pamela Hartzband, as its title indicates, discusses medical decision making, and how individuals should make decisions with respect to their treatment, incorporating information and advice from medical professionals. Following the discussion of a study in which patients and doctors made different decisions about preferred treatments because the patients on average were more risk averse than the doctors, the authors write:

Patients should be aware that there can be differing views among specialists about who should be treated for various conditions. For example, expert committees in Europe and the United States crafted different guidelines about when to treat high blood pressure. The group of American experts believed that the benefits outweighed the risks from treatment for mild elevation of blood pressure and wrote guidelines that advise medication for patients like Alex Miller. But in Europe, an expert committee with access to the same scientific data formulated different guidelines that don't advise treatment for mild elevation of blood pressure. In Europe, Alex and others like him would not be encouraged to take medication. Different groups of experts can disagree significantly about what is "best practice." Dr. Rodney Hayward, a widely respected researcher on health care at the University of Michigan, recently wrote in the New England Journal of Medicine that "the assessment of whether the benefit is great enough to warrant the risk of harm — i.e., the decision of where the threshold for intervention should lie — is necessarily a value judgment."

Why is it subjective, a value judgment, rather than a matter of a clear black-and-white answer? Because, Hayward continues, for many treatments there exists a substantial "gray area of indeterminate net benefit."

Hayward mentions cholesterol levels as one example of such a gray area. We examined the "net benefit" of treatment in Susan Powell's deliberation about taking a statin medication. "Net benefit" means the potential gains from the treatment minus the downsides. After seeing all the data, particularly the "number needed to treat," she didn't believe the net benefit was worth it, given the risks statins entail. In effect, Susan set a different cutoff for herself from the one some experts would apply, not because she was "health illiterate" or "irrational," but because she has a different subjective assessment from that of the experts who wrote the recommendations. We agree strongly with Hayward that within the substantial gray area of indeterminate net benefit, "physicians should defer to an individual patient's preferences in choosing whether or not to intervene."

How do recommendations for "best practice" come about? Committees of specialists are convened to draw up guidelines that aim to identify "best practice" for a certain medical condition. The principle is that guidelines should be drawn from the "best" evidence and crafted by the "best" scientific experts in the field. These guidelines are a key component of so-called evidence-based medicine, the idea that clinical practice should be based solely on the results of scientific studies. The recommendations are presented not only to physicians, but directly to patients, in informational brochures, on the Internet, and in the media. Guidelines therefore have become one of the most powerful forces on patient decisions, since the very language used to describe their content is "best" practice. Advocates of guidelines assert that both doctors and patients should accept their recommendations as the default option. Some physicians and health policy planners conclude that patients who deviate from expert recommendations aren't adequately informed or are "irrational."

Doctors and patients certainly should consult guidelines since they provide considerable background information about disorders and treatment options. But, it's important to recognize that guidelines aren't strictly "scientific." They incorporate biases and subjective judgments. Experts select which clinical studies to use and which to discard when they formulate their recommendations. Further, all studies have limitations. They provide results from statistical averages of selected groups of study subjects. These averages may not be applicable to a particular patient. Even the most rigorous, inclusive studies cannot address all the variables of age, gender, genetics, lifestyle, diet, and concurrent medical conditions that make us individuals and often influence how effective a particular treatment will be or what sorts of side effects we might experience. Many studies exclude the elderly or those who have coexisting common medical problems. When making their final recommendations about the need for treatment, experts also apply their subjective judgment about how much risk is worth taking in order to obtain a certain benefit. Concerns have also been raised by the Institute of Medicine about potential conflicts of interest, since some experts who write guidelines are consultants to drug and device companies or private insurers. Finally, guideline committees have an imperative for consensus and present their recommendations with one voice. As a result, their conclusions usually fail to mention dissenting opinions that may have arisen among committee members.

It's also important for patients to realize that guidelines aren't engraved in stone; they can change quickly. A survey of one hundred recommendations from expert committees found that within a year 14 percent were reversed, within two years 23 percent were changed, and fully half were overturned at five and a half years. The American College of Physicians, representing internists in the United States, stated in 2010 that all of its guidelines, if not rewritten, should be automatically suspended after five years. This isn't only because new and better data become available, but also because the composition of expert committees may change, and with this change, subjective judgments of "utility" or value may shift. Consider the guidelines that recommended the use of estrogen in virtually all postmenopausal women to prevent heart disease and dementia. These guidelines were overturned by new information from the Women's Health Initiative trial. Yet some experts remain critical of this study and still endorse parts of the earlier guidelines, believing that for some women the "value" of hormone replacement may be enough to risk the downsides.

Compare "best practices" in medical practice with those in international development. In medicine, new drugs or new techniques are accepted only after large-scale case-control studies showing efficacy. There are peer reviewed medical journals that publish individual studies and review articles summarizing many studies. There is the Cochrane Library that maps the status of research on key medical topics. Physicians undergo long formal education, internships and residencies before being allowed to specialize, and are subjected to peer review when in practice and required to participate in continuing education and periodic re-certification. "Best practice" judgments are based on thousands of cases.

In international development, individual practitioners usually learn on the job and have severely limited experience with development cases as compared to physicians with medical cases. There are few serious statistical studies of development interventions, and arguably there are greater differences among the situations in which development projects are undertaken than there are among a doctor's patients suffering from the same medical problem. If recommended "best medical practices" should be taken with a grain of salt, then "best development practices" should be regarded with a jaundiced eye.

Tuesday, September 20, 2011

It seems clear that "Arab Spring" is an example of the contagion of an idea, reminiscent of the spread of revolutionary ideas in Europe in 1848. It appears that there is a spirit of the times in a large area with hundreds of millions of people. Clearly people in one city saw demonstrations occurring in other cities with which they could relate and began their own demonstrations; people in one country saw uprisings in other countries with which they could relate and began their own uprisings.

It also seems that the countries in which the movement has been successful share socio-economic commonalities. They have populations with high proportions of young people, more educated than earlier generations, who have faced limited opportunities for economic success, and who have felt that their governments were unrepresentative, unresponsive to the demands of youth, and dependent on coercion for continuing in power.

We have been describing this year's political contagion as "Arab Spring" and it is true that many of the countries in which demonstrations, uprisings and insurgencies have been taking place are countries in which people speak Arabic. I had a debate on this blog in 2009 as to whether Arab nations were culturally uniform. Based on my limited experience in Egypt, Jordan and Morocco and my reading I felt that there were important differences among these countries, and I continue to feel so. Still, language is an important aspect of culture, an important index of similarity among peoples and countries, of the feelings of commonality among countries and peoples.

As the map above shows, not all countries in which Arabic is spoken have Arabic as the only official language. Indeed, there are minorities in many Arab countries that speak other languages and there are differences in the Arabic spoken in different Arab countries.

It is not surprising therefore that the course of politics has been quite different this year in Tunisia, Egypt, Syria, Yemen, Bahrain, Lebanon, Morocco, Jordan, Sudan, Saudi Arabia, Somalia, and other Arab countries. Obviously the Palestinian peoples are facing very different problems than the Sudanese, the people of the Maghreb or the peoples of the oil-rich Gulf states.

Religion is also an important aspect of culture, and the countries most involved in the Arab Spring movements have been Islamic. As the map below shows, there are significant differences in the fraction of people in different "Muslim countries" who profess the Muslim religion. Comparing this map with that of the Arabic-speaking world, it is clear that while the 22 Arab countries are all Islamic, not all of the 54 Islamic nations are Arab. Indeed, there seems to be a significant contest for influence in the Islamic world among Arab Egypt and Saudi Arabia, Turkic Turkey, and Persian Iran. Moreover, as experience in Iraq has shown with considerable violence, there are peoples who profess different Muslim faiths who not only differ in culture but have killed each other over those cultural differences.

Geography also counts, in the sense that natural resources, climate, neighboring countries, access to foreign markets, etc. all influence culture and have repercussions on the likelihood of violence and insurrection.

So too does history, and the history of the Arab countries is very long and very different. Egypt has been a regional cultural leader for thousands of years, influenced (in my opinion) by its Pharaonic, Greek and Roman past as well as by its Arab and Muslim roots; Syria has an equally deep past, but one with different aspects than that of Egypt. The Arab countries had different experiences in the 20th century, in part related to the differences between British, French and Italian colonial policies, not to mention differences in the earlier experience with the Ottoman Empire. History leaves its mark on culture and thus on the response to the contagion of Arab Spring, as well as on the underlying economic and political conditions that foster or discourage contagion.

I have spent decades working in Latin America. I began the experience with the assumption that the Latin American countries, especially the Ibero-American and Spanish-American countries were rather homogeneous. I emerged from those decades with the realization that countries "south of the border" are very different one from another. Indeed even countries very close to each other (Costa Rica and Nicaragua, Colombia and Ecuador) can be very different one from another. I think the same is likely to be true of the Arab countries and of the Islamic countries.

Monday, September 19, 2011

ABSTRACT
Scientific information looks a lot like the news because it's printed on paper and built with sentences we believe to be true. But treating scientific papers as if they were a constant stream of news is dangerous because it gives even the most crucial discoveries about Earth's ecosystems only one day of public attention. This confuses the public by obscuring true scientific consensus and allows policymakers to avoid tough decisions.

In my talk, I'll explain why we should be organizing scientific discoveries as if they were products, using informal peer ratings to keep the most important work at the top of the list for policymakers and the public to see. Such an effort requires us to consider factors like subscription walls, data access, political agendas, the nature of scientific debate and the deeply entrenched habits of old academia. Despite the challenges of ranking a product as peculiar as the world's scientific information, I'll argue that a new approach is crucial if we are to make the tough decisions that will protect Mother Earth for the long term.

Comment: I think this talk has relevance to Zunia, the development portal. Currently the portal separates "news" and "events" from other "posts" with one filter. Its default is to present results of searches in order of "popularity", and while I don't fully understand the algorithm, that seems to be a measure of frequency of recent page views of the full post. It can also sort posts by date and time posted, listing the most recent first. There is also a listing by the time of the most recent comments.

Of course, the site allows searches by key words, and it also allows searches by tags, including fixed geographic and categorical tags.

But would it be useful to have the ability to provide responses to a search in order of credibility, comprehensiveness, importance, clarity or other concepts (based on ratings from users)? Sometimes you don't want something that is either popular or recent, but rather something that is a comprehensive summary of information.
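A rating-based sort of the kind suggested above can be sketched in a few lines. The post records and field names here (views, rating, posted) are hypothetical illustrations, not Zunia's actual data model:

```python
# Hypothetical posts: a popular news item, an old but highly rated survey,
# and a recent event announcement.
posts = [
    {"title": "News item",            "views": 950, "rating": 3.1, "posted": "2011-09-19"},
    {"title": "Comprehensive survey", "views": 120, "rating": 4.8, "posted": "2010-02-03"},
    {"title": "Recent event",         "views": 400, "rating": 2.5, "posted": "2011-09-18"},
]

# The same result set, ordered three different ways.
by_popularity  = sorted(posts, key=lambda p: p["views"],  reverse=True)
by_recency     = sorted(posts, key=lambda p: p["posted"], reverse=True)
by_credibility = sorted(posts, key=lambda p: p["rating"], reverse=True)

print([p["title"] for p in by_credibility])
# The comprehensive survey ranks first by user rating even though it is
# neither the most popular nor the most recent post.
```

The point of the sketch is that a credibility or comprehensiveness sort surfaces different material than popularity or recency sorts do, which is exactly the case for adding user ratings as a sort criterion.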

I still like the idea for a portal of dividing it into a library of information and a directory of organizations, programs and projects (which would require alternative criteria for sorting resources).

The last 15 minutes of Charlie Rose's recent interview with Lisa Randall focus on the value of the way scientists think as applied to other aspects of life. She emphasizes that scientists consider it important not only to recognize the uncertainty in their observations but also to try to quantify that uncertainty. Politicians tend to hide uncertainty, regarding it as a point of weakness, while scientists see acknowledging uncertainty as a point of strength.

The interview was occasioned by the publication of her new book, Knocking on Heaven's Door: How Physics and Scientific Thinking Illuminate the Universe and the Modern World. If I understood her correctly, the purpose of the book is to help people understand the nature of scientific thinking as a process, hoping that they will emulate scientists in the future by responding to assertions of importance with questions such as "how did the speaker arrive at that assertion?", "what is the basis of that assertion?", "what is the evidence in support of that assertion?", and "how credible is that assertion?" A distinguished physicist, Randall draws on her own work and on modern theories from physics to motivate and move forward the discussion, in the process showing how different the world is in the mind of a physicist than in the mind of the average middle American.

Randall talks about the most important questions in science today and answers in terms of her own research interests, asking about the fundamental nature of space and time, why subatomic particles have mass, why they have the mass that they do, etc. A scientist interested in the neurobiological basis of consciousness and thought might have answered from his or her own research interests, focusing on the complexity of the brain and the emergent nature of the mind.

I was thinking about a colleague who said that he seldom had time to focus on the important stuff because he always had too much urgent stuff he had to focus on immediately. A fundamental understanding of space and time may be of huge importance, but I kind of feel that mankind can wait a little longer for its development. On the other hand, it would be very nice now to have a cure for HIV/AIDS, effective and affordable vaccines for malaria and HIV, and a solution to anthropogenic global warming.

At the time of the American Revolution, Virginia was by far the most populous state, followed by Massachusetts (which at the time included what are now Maine and Vermont), Pennsylvania, North Carolina, New York and Connecticut in that order.

Presidents Washington (1), Jefferson (3), Madison (4) and Monroe (5) were from Virginia and each served for two terms or eight years. Presidents John Adams (2) and John Quincy Adams (6) were from Massachusetts and served 8 years between them. Thus the presidency was held by people from the two most populated states at the time of the Revolution for the first 40 years after the Constitution went into effect.

Of course, the two Adams presidents were father and son.

James Madison was married to the sister of George Washington's nephew's wife.

William Henry Harrison, the 9th president, was elected from Ohio but born in Virginia. He was the son of a Virginia signer of the Declaration of Independence. Harrison's maternal uncle was married to Martha Washington's sister. Thomas Jefferson was his second cousin. His grandson, Benjamin Harrison, was the 23rd president. William Henry Harrison was also a distant cousin of John Tyler (10), who was likewise from Virginia.

Zachary Taylor (12) was born in Virginia (although elected from Louisiana after a military career). He was a third cousin of George Washington, and related to Franklin Delano Roosevelt (and Robert E. Lee).

Martin Van Buren (8) was from New York. He was related to later presidents from New York, Theodore Roosevelt (26) and Franklin Delano Roosevelt (32).

Washington is also related, directly or indirectly, to Zachary Taylor, William Howard Taft, Grover Cleveland, Franklin Delano Roosevelt, George W. Bush, and Hillary Rodham Clinton.
The Adams family was related to Millard Fillmore, Calvin Coolidge and Franklin Delano Roosevelt.

Not surprisingly, the most populous states have tended to produce presidents. We think of the United States as having a democratic government rather than an aristocratic one, but in the time of our founding fathers the stratum of potential presidents was very thin and very connected. Not only did many of the early presidents know each other well as colleagues and neighbors, but they also had family ties.

Of course in our time we have seen Franklin Delano Roosevelt married to Theodore Roosevelt's niece who was also a distant cousin, and the Bush father and son presidents. There are many more links between presidents and well known political leaders such as senators.

Sunday, September 18, 2011

The World Bank funded a $30 million project in Uganda five years ago to support science and technology in that country. I was involved on the team that helped to determine whether such a project made sense and then helped to design it. The intent was to have peer review of proposals for much smaller subgrants to Ugandan scientists that would fund research and development on topics important for Ugandan development; scientists and their institutions would build capacity while doing that research and development.

David Dickson, the editor of SciDev.Net, has an editorial in this week's edition indicating that the Government of Uganda is not seeking a second tranche of that funding. Official spokespersons for the government are quoted as stating that the government simply prefers to use other sources to fund the continuation of the efforts. Scientists are described as expressing skepticism about that claim and concern for the stability of funding of the scientific enterprise. I share their concern.

Sorry, but I should have posted an appreciation for the Constitution yesterday. I took a class in an adult learning program this summer on the Constitution, so I practice what I preach when I say that we all should study and pay attention to the Constitution.

R. Shah's powerful editorial “Breakthroughs for development” (22 July, p. 385) underscores the proud history of America's scientific and engineering contributions to development around the world. Many were privately funded and led; many were stimulated by the United States Agency for International Development (USAID). Yet USAID has drifted from its past strengths. Shah bluntly states that “budget cuts and shifting mandates pulled the agency's focus away from emphasizing science and technology.” He implicitly refers to blizzards of congressional earmarks and to USAID's deliberate and consistent de-emphasis on science over the past 30 years.

Observers have repeatedly criticized these trends and recommended exactly what Shah now sees as a priority. For example, 20 years ago, in 1992, the Carnegie Commission on Science, Technology, and Government argued for a new strategy for USAID and advocated “critical roles for science and technology” (1). Just 5 years ago, in 2007, the Bipartisan Congressional-Presidential HELP Commission called for a new unit in USAID, similar to the creative projects of the Defense Department's Advanced Research Projects Agency, that would invest $50 million per year of “patient capital”—i.e., federal government funding for innovative long-range research (dubbed “patient” because it may not yield immediate results) (2). The reports sat on shelves. No administration took the initiative. Little changed, and the defects Shah cites became worse.

One objection to vigorous U.S. science and technology cooperation is that developing countries such as China and India become competitors as they flourish with economic growth powered by science. However, such countries also become larger markets for U.S. exports and more capable partners in global goals, such as protecting public health. As the Congress weighs paths to prudent austerity in the overall federal budget, the scientific, medical, and engineering foundations of programs in foreign assistance are as important as such foundations are in defense. Let us move USAID out of its late-20th-century ruts and into the 21st century's frontiers. Shah deserves our help.

Rajiv Shah is the Administrator of the U.S. Agency for International Development. His administration has been marked by a serious effort to return science and technology to an important place in USAID programs. Rod Nichols is not only the former president of the New York Academy of Sciences but a longtime advocate of science and technology in U.S. foreign assistance policy. He knows whereof he speaks.

I served in the science and technology portions of USAID for many years, including as the Deputy Director of the USAID Office of Science and Technology, as special assistant to the Deputy Assistant Administrator for the Bureau of Science and Technology, and directing the USAID Office of Research.

The modest portion of the USAID program devoted to science and technology was important for developing countries, yielding great benefits linked to the Green Revolution, the eradication of smallpox, and the development of low-cost public health interventions that saved millions of lives. Yet it was frequently under fire, often from USAID staff who had little background in or understanding of science and technology, and who were fighting for resources for their own meritorious programs.

For those who agree with Shah, Nichols and me that science and technology cooperation should be an important part of U.S. foreign aid, now is indeed the time to support Shah in his effort to make it so!

The Canadian lynx (Lynx canadensis) is thriving in Canada but is a threatened species in the United States. The chain of events that led to the mysterious decline of lynxes in the United States, scientists now say, may have begun with the extirpation of another species: the gray wolf (Canis lupus), which was hunted to near extinction in the United States during the 20th century. Today, wolf populations are growing in parts of the west and Minnesota.

The loss of the wolf may have set in motion an "ecological cascade," William Ripple, an ecologist at Oregon State University, Corvallis, and his co-authors write in the 30 August issue of the Wildlife Society Bulletin. Without wolves, populations of coyotes and herbivores (such as elk and deer) have soared, a double whammy for the lynx's primary prey, the snowshoe hare (Lepus americanus). First, there are more coyotes to hunt them; second, elk and deer consume the shrubby cover that hares eat and use for protection from predators. The result: fewer snowshoe hares for the lynx to hunt. Climate change may be another factor; snowshoe hares and lynxes thrive at high elevations with deep snowpacks, but milder winters open these areas to coyotes.

Since their reintroduction to Yellowstone National Park in 1995, wolves have sharply curtailed the coyote population, altered the behavior of both coyotes and herbivores, upped the number of snowshoe hare, and helped restore overall ecosystem health, the authors say. So wildlife managers should consider wolves' “ecological role”—and value as top dog—when deciding their fate.

I recently posted on the decision of the U.S. government to allow hunting of wolves again in the Rockies, including in Wyoming. Here is more information on research that shows how complex is the web of life, and how sensitive that web is to the top predators in an ecosystem, as the wolves are in Yellowstone.

I think all Americans (with the exception of some cattle ranchers and western politicians) want our national parks to remain pristine, and especially want Yellowstone to remain as a reminder of what this continent was like before 1492. Moreover, the United States committed itself internationally to preserve Yellowstone as a natural World Heritage site for our descendants and our neighbors.

So let's stop hunting wolves, at least until they become more common in Yellowstone than they were in 1492, and thereby help maintain the Yellowstone ecosystem more as it once was!

The showing of the United States in the latest PISA results is unsatisfactory. Clearly kids need to learn reading, arithmetic and science. I suspect that teachers are important; in many nations teachers have high status and are given the freedom and resources to help kids learn. I think our parents have to take a stronger stand, assuring that their kids are motivated to learn and making sure that the schools and teachers that serve their kids are held accountable for the quality of their services.

Data show that social class is a predictor of American student performance. I suspect that in part that is due to the fact that the United States has become a place with little social mobility. If you are a lower class kid and perceive little chance of rising out of the lower class, how well are you going to perform in school and on these kinds of tests? If you are a teacher with a class of lower class kids, kids who see little social mobility in their future, how hard are you going to try to help them learn? I could go on.

I suspect that there are further issues. What do kids have to learn now for their future, and do the PISA tests do a good job of testing those accomplishments? I suspect that kids will be facing rapid social, economic and technological change, and will need the skills to adapt to that change and to acquire new skills, knowledge and understanding well and efficiently. I doubt that the PISA tests do much to test those abilities.

I wonder too whether we are thinking enough about how to help kids learn. To what degree can we find technology to help the schools? To what degree can we improve out-of-school learning opportunities? Indeed, to what degree can we adjust the demands of the economy and the society to better fit what kids can learn, and adapt what we ask kids to learn to better meet the demands of the economy and society?

Ease with which someone can find the relevant materials on the corporate website;

What policies, procedures, and corporate governance structures are in place and disclosed; and

What the corporation says about who and what it gives to, and how those donations are made.

The graph above from The Economist shows that many of our industries are categorized as opaque and many more fall in the weak classification. Since the Supreme Court in its wisdom has decided that corporations have the same rights as individual people to make donations to political parties and politicians, the lack of transparency is really worrisome!

Friday, September 16, 2011

The graph indicates that the longer one's education, the less likely one is to be unemployed. The effect is found in all the countries of the OECD. It suggests that schooling pays off in later employment, which seems reasonable.

I wonder if there are some hidden variables. In the advanced economies, who are the people above the age of 25 who have less than a high school education? I suspect that they include older people, who while still in the workforce may experience age discrimination. They may be immigrants, who again may not be fully employed in the formal economy for several reasons. They may also be people who, due to some disability, found schooling difficult, and who thus may also find suitable employment difficult to find.

All these things would make it difficult to infer the value of schooling, disentangling it from the values of being young, healthy and native born.
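The confounding worry can be made concrete with a toy simulation of my own construction (not based on the OECD data): if some hidden factor lowers both the chance of finishing school and the chance of being employed, graduates will show better employment rates even in a model where schooling itself has no effect at all.

```python
import random

random.seed(1)
people = []
for _ in range(10_000):
    disadvantaged = random.random() < 0.3   # hidden variable (age, health, etc.)
    # The hidden factor lowers the chance of finishing school...
    finished_school = random.random() < (0.5 if disadvantaged else 0.9)
    # ...and independently lowers the chance of employment.
    # Note: schooling plays no role in employment in this model.
    employed = random.random() < (0.6 if disadvantaged else 0.95)
    people.append((finished_school, employed))

def employment_rate(group):
    return sum(e for _, e in group) / len(group)

grads    = [p for p in people if p[0]]
nongrads = [p for p in people if not p[0]]
print(round(employment_rate(grads), 2), round(employment_rate(nongrads), 2))
```

Running this, graduates show a markedly higher employment rate than non-graduates even though the model gives schooling no causal effect, which is exactly why the raw education-unemployment graph cannot by itself establish the value of schooling.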

Thursday, September 15, 2011

There are peaks of unemployment associated with Ford, Reagan, Bush I and Bush II, all Republicans, and decreases in unemployment over the terms of Clinton and (so far) Obama. The long-term reduction of unemployment rates during the Carter administration was interrupted by the oil shock at the end of his term, but generally Democrats have been associated with lower unemployment.

I think we are in a precarious situation, and if the United States' and European governments do not take effective action, we might go into a depression. It is unfortunate that there is already distrust that the U.S. Congress has the guts and will to do the right thing, and doubt that Congress can find a route to fight the current recession while also reducing the debt-to-GDP ratio over the next decade or so by cutting the growth of spending, growing the economy, and allowing modest inflation. I hear that some Democratic legislators are already thinking of abandoning the Obama jobs bill, putting other concerns ahead of the needs of the nation while deserting their leader in a time of difficulty.

The Economist has an article in its Technology Quarterly focusing on analysts who simulate economic and political processes. They apparently create a virtual world in which the agents make decisions based on game theory. In some cases real political leaders have been simulated, based on expert estimates of the values that they hold, simplified for the game-theoretic simulation. The military are using the approach for strategic planning, as are private corporations for economic planning.

Could software-based mediation spread from divorce settlements and utility pricing to resolving political and military disputes? Game theorists, who consider all these to be variations of the same kind of problem, have developed an intriguing conceptual model of war. The “principle of convergence”, as it is known, holds that armed conflict is, in essence, an information-gathering exercise. Belligerents fight to determine the military strength and political resolve of their opponents; when all sides have “converged” on accurate and identical assessments, a surrender or peace deal can be hammered out. Each belligerent has a strong motivation to hit the enemy hard to show that it values victory very highly. Such a model might be said to reflect poorly on human nature. But some game theorists believe that the model could be harnessed to make diplomatic negotiations a more viable substitute for armed conflict.

Today’s game-theory software is not yet sufficiently advanced to mediate between warring countries. But one day opponents on the brink of war might be tempted to use it to exchange information without having to kill and die for it. They could learn how a war would turn out, skip the fighting and strike a deal, Mr Bueno de Mesquita suggests. Over-optimistic, perhaps—but he does have rather an impressive track record when it comes to predicting the future.
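The "principle of convergence" quoted above can be caricatured in a few lines of code. The numbers and the updating rule here are invented for illustration and are not drawn from Bueno de Mesquita's actual models: each round of fighting is a costly, noisy observation of the true balance of power, and once both sides' assessments agree closely enough, a settlement becomes possible.

```python
import random

random.seed(42)
TRUE_BALANCE = 0.7        # side A actually holds 70% of the power (unknown to either side)
est_a, est_b = 0.9, 0.4   # each side starts with a self-serving estimate

rounds = 0
while abs(est_a - est_b) > 0.05:   # keep fighting until assessments converge
    # One costly battle: a noisy glimpse of the true balance of power.
    observation = TRUE_BALANCE + random.uniform(-0.1, 0.1)
    # Both sides update their estimates toward the shared evidence.
    est_a = 0.7 * est_a + 0.3 * observation
    est_b = 0.7 * est_b + 0.3 * observation
    rounds += 1

print(f"settled after {rounds} rounds, assessments near {est_a:.2f} and {est_b:.2f}")
```

Because both sides observe the same battles, the gap between their estimates shrinks geometrically, which captures the model's point: fighting is information-gathering, and if the same information could be exchanged credibly without battles, the settlement could in principle be reached without the war.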

I was just looking at the Academic Ranking of World Universities, 2011. I was pleased to see that the university that granted my BS degree was ranked as high as 13th in the world, the one that granted my MSEE was ranked second, and the one that granted my PhD came in at number 46. In engineering, the schools that granted my engineering degrees were ranked 3rd and 28th. All five of the U.S. universities in which I have taught are in the top 100 in the Americas.

Of course these are reputational ratings for universities in their entirety, and my experience may not compare as favorably as the numbers indicate. Still, with some 4000 institutions of higher education in the United States, those ranked in the top 100 are all probably pretty good.

The term "terrorist sympathizer" has quite a different connotation than "love the sinner, hate the sin". Perhaps we should reconsider the term. Are we not really opposed to supporters of terrorist acts or terrorism?

It seems to me that we will be better off fighting terrorism if we understand the terrorists, and that we will better understand them if we can empathize with them. You are seen as "sympatico" if you share a mental connection or bond with a person.

I think that the phrase "war on terrorism" has led to some sloppy thinking. As many others have said, the way to prevent terrorist acts is not to use military force against nation states. The words we use affect the thoughts we come up with.

We were impressed that it is possible to write a biography at all of a person who lived 2,000 years ago. On the other hand, Everitt guessed a lot as to what Augustus was thinking or intending. (Indeed, we still have to guess a lot about what a modern politician really intends or really thinks.) There was agreement that the biography enlivened the historical narrative, making it more interesting.

Augustus was not only the survivor but the victor of decades of civil war, the man who began the empire and saw the end of the Roman republic. He had the advantage of being the adopted son of Julius Caesar, and importantly of a partnership with Agrippa, the winning general of many key battles. Augustus was clearly lucky (even to have survived to old age at a time when many died young, not only in battle but from disease), but one must assume that he was also smart and a capable political tactician and strategist.
Everitt writes that Augustus planned and intended a great deal of what occurred, while I would tend to assume that his plans often went awry but that he was good at muddling through. Everitt suggests that Augustus may not have been personally courageous in battle, and was often ill at moments of crisis, but was very brave in the public confrontations necessary in his time (facing down crowds of citizens, Senators and soldiers). There was considerable discussion of the degree to which Augustus benefited from the myth that accumulates around the surviving chief of the victorious party of a civil war, especially when he then presides over a long period of relative peace and prosperity. Augustus was also the beneficiary of the propaganda of his supporters and government.

Rome, in Augustus' time, reached from the Atlantic well into the Middle East. It stabilized its northern borders at the Rhine and the Danube while consolidating control of North Africa. It continued and elaborated Greek culture, leaving a huge legacy for Europe, North America, Australia and New Zealand, and indeed all of the modern world.

Everitt spends time explaining a number of things, if only briefly, from Roman foods and dining customs, to clothing, to the political organization of the Roman legislative and administrative systems. I found it interesting to think about a culture so different from ours that was still successful in its time at uniting tens of millions of subjects in an empire so broad in extent, in spite of not having invented effective systems of taxation and bureaucratic administration, nor having separated government from an excessively superstitious religion in rapid flux.

I was not alone in the club in wondering about Roman communications. How were the roads built and maintained, and how were ships built and shipping managed? How did messages travel from the ends of empire to its capitals?

We discussed the sources of silver, the inflation that occurred when Rome acquired the accumulated riches of Egypt, and the extent of commerce. We discussed the slave trade and the acquisition of slaves through war (but not the institutions of slavery, nor the situation of freedmen in the class society of Rome), but we did not discuss where barter ruled, where there were free distributions of bread, and where the money economy functioned well.

I would have welcomed a better discussion of the Roman military and naval power. There is a hint of the role of cavalry and auxiliaries to the legions in the book, but no real discussion of why Roman armies and navies were so successful. Indeed, how did Rome support a huge military establishment with a primitive system of taxation and bureaucratic administration?

On the other hand, Everitt and his publishers are to be commended for providing a chart of Augustus' family tree, a timeline, an explanation of the naming customs of Roman society (all girls given the same name, names changing over the life of leaders to mark significant achievements), and maps of the empire and the sites of significant battles.

The book club discussed this book for a couple of hours, which I found especially useful as members not only brought out aspects of the book that I had failed to fully appreciate, but also complemented the information in the book with that drawn from previous readings and study. We touched on modern parallels to Roman times.

All in all, I found the book to be worthwhile, and I continue to find the history book club a great contribution to my community. Augustus was a man who lived an eventful life, leading a major transformation of a great society, a society whose legacy is still with us for good and ill. We will read more Roman history to learn about the rise of the culture in the centuries before Augustus and the fall of the empire in the centuries following his death.