Sunday, October 31, 2010

This week's election occurs 150 years after the nation elected Abraham Lincoln president. Lincoln gained national prominence through his opposition to the Supreme Court's Dred Scott decision. That decision is thought to have had some major implications:

That people of African descent were not citizens of the United States and therefore did not have the protection afforded to citizens by the constitution and the courts.

That slaves were the property of their owners and could not be taken from those owners without due process of law.

The Chief Justice held that the Congress could not prohibit slavery in the territories. Essentially, the five slave owners on the Supreme Court who decided the Dred Scott case sought to ensure that the territories west of the Mississippi would be open to slavery, confident that this would keep the United States a country in which slavery was accepted.

Lincoln traveled over much of the north speaking against this decision. His opposition took him from being a retired Congressman to having the prestige and visibility to win the nomination of the Republican party for President. The Republican party abolitionists were energized -- galvanized -- in their opposition to the Dred Scott decision, and Lincoln was elected President.

The southern states recognized the election as a turning point as to what kind of nation the United States would be. They seceded from the Union to preserve their social system, and Lincoln declared war to preserve the Union. The Civil War led to the abolition of slavery and began a process, still continuing, by which African-Americans are gaining the full rights of citizenship and equality in our nation.

Strangely, there has been a reversal in the roles of the parties, with the Democrats now including the progressive wing of American politics, the Civil Rights Act of 1964 having led the white population of the southern states to shift en masse to the Republican Party.

It would be ironic if in this election, 150 years after the election of Lincoln, the nation voted against the party of our first African American President and in favor of the party that opposed the extension of health insurance to the marginalized portions of our society and opposed taxes that would have the rich among us pay their fair share of the bill for government services.

But then it was ironic that people who neither owned nor benefited from slaves fought for the Confederacy in the Civil War, and that the south was so successful for so long after emancipation in keeping the ex-slaves and their descendants in a state of subservience.

Saturday, October 30, 2010

If you have not noticed, in the important Congressional votes over the past two years, the members of both houses voted overwhelmingly with their parties. Party discipline reigned and individual judgement went out the window.

Connie Morella, a Republican, was the Representative of my district in the House of Representatives in the 1990s. I had met her, and I regarded her as moderate, very smart, and careful to help her constituents. She had a record of occasionally voting with the Democrats and against Republicans. I felt well represented by her until 1995. In that year there was a Democratic President and a Republican majority House of Representatives. The budget legislation stalled in the Congress. In theory the fiscal year began on October 1, but the budget for the new year had not been passed. Rather than pass a continuing resolution to allow the government to continue functioning, the Congress twice let the government shut down all but the most essential functions, first for a few days and then for a few weeks. I was furloughed from my government job.

I did not believe for a moment that Ms. Morella believed that it was better for the United States to shut down the government than for the government to work under temporary budget legislation. I did not believe for a moment that she was representing the large portion of her constituency that worked for the government in voting for the stoppage of the government, nor indeed did I believe that the majority of all her constituents (who were by a large majority Democrats) wanted the government closed down. She went along with her party, and the government was left in a budgetless limbo.

As I sat at home, unable to do a job that I thought important, I came to understand that I wanted people representing me in Congress who belonged to the party that best represented my political views, not merely people who seemed very reasonable in their personal views.

The primary elections are your opportunity to assure that the representative of your party is in fact a good person capable of doing a good job as your representative. Unfortunately, the more senior members of Congress are more effective not only because they understand better how to work in the Congress, but also because the processes of the Congress give more senior members more power; I would rather see the more effective members given more power. While it is tempting to support the person in the office for reelection, after 1995 I became more and more convinced that it was more important to support the candidate of my party.

Thursday, October 28, 2010

I watched President Obama on the Daily Show, and thought I might share my opinion. It was one of the best interviews I have seen on the show, with Jon Stewart asking good questions and pressing for serious answers. The President came across as articulate, in command of the message he wished to deliver, good-humored and smart. He made the case that in a year and a half his administration has achieved some real accomplishments -- preventing a depression, passing a serious first step in the reform of the health care system, and regulating the financial industry. He also made the case that he believes major reforms in the political system are still needed, and that they are possible but will take time.

I am convinced that Barack Obama really cares about the people on "Main Street". He says so frequently. His career from law school, to the tough neighborhoods in Chicago, to the legislatures in Illinois and Washington, and to the White House are totally consistent with that concern for real people who are having a tough time dealing with the economy. With his parents and his grandparents, how could he not feel for the ordinary people?

He is also someone who is quite realistic. He is unlikely to accept defeat on an important matter by insisting that his way is the only way to get it done. I suspect that he won the presidency by firing up the electorate, realizing that it would be hard to satisfy all the hopes he was raising during the campaign. He is a former Senator and he not only knows that you have to compromise to get legislation passed, but he has shown he is willing to do so for important legislation.

Thanks to both Jon Stewart for hosting the President and to President Obama for appearing!

I have been reading Mike Rapport's 1848: Year of Revolution. The political disturbances that broke out in the first months of 1848 in Italy and France were noted all over Europe, helping to trigger outbreaks of liberal and radical opposition to the reigning conservative (often monarchist) regimes.

It seems likely that the transportation and communication infrastructures that had been so recently improved helped spread the word of the outbreaks.

More fundamentally, the revolution in communications and transportation, together with the growth of the industrial production of textiles and the growth of coal and steel industries (related to the other technologies) promoted the conditions conducive to the destruction of the old political order and the creation of a new order, one in which political power was more widely shared.

It is striking that in 1848 there were important efforts to combine the various smaller provinces in which Italian was spoken into an Italian nation state, just as there were efforts to combine the various smaller units in which German was spoken into a German nation state. The geographically expanding markets were surely related to the success of the efforts to build larger states on the basis of a common language. I would say that the further geographic expansion of markets in the second half of the 20th century was similarly related to the success of the effort to build a European Union transcending the boundaries of traditional language groupings, as well as the success of efforts to build common markets transcending language differences in other regions of the world.

This suggestion is a form of "technological determinism", implying that changes in technology lead to changes not only in economic but also political (and social) organization. I have also recently done three postings on this blog suggesting that weather and climate led to the American Civil War (posting one, posting two, posting three); this might be seen as a theory of climatic determinism.

How can I simultaneously propose two alternative theories of determinism? Of course the answer is that each is a theory of partial determinism. Technological change and climate change both affect political and economic systems. So too do other factors, especially perhaps cultural factors. The world is a complex place!

The nation was initially divided into slave and free states largely because of the weather. The problem with owning people is that you have to house and feed them, whether they are working or not. This is not such a conflict in an area with a long growing season. But in the chilly North, the summers were not long enough for the profits from slave labor to outweigh the high cost of maintaining indentured servants.

Thus most slaves ended up in Brazil, the Caribbean islands, and the American South, where the tropical temperatures and long growing seasons made slave ownership profitable. By the Revolutionary War period, 40 percent of African Americans in the northern states were free, compared to only 4 percent in the southern states. A little-known fact is that African slavery was such a vibrant trade, the forced migration of Africans exceeded that of Europeans to the New World until the 1830s. The cumulative total of African migrants continued to exceed that of Europeans until the 1880s.

The book goes on to point out that the health conditions in the hot, humid south were bad; remember, these were the days before control of vector-borne diseases, and malaria and yellow fever were common in the southern states. The argument was made that people of African ancestry were needed because those of European ancestry could not stand the health conditions; one might wonder whether the reason was that free people would not stay and do the backbreaking work without economic incentives, but slaves could be forced to do so cheaply.

The excerpt from the book also mentions that after the slave trade was abolished in the first decade of the 19th century

the value of existing slaves began to skyrocket. This meant that even in temperate southern states, it was no longer economically feasible to support a slave year-round. To make up the difference, plantation owners would rent out slave labor during slow seasons, and increasingly young slaves were taught trades to increase their value. Unlike the previous generation of slaves, these skilled tradesmen learned about life off the farm. They interacted with city people and slaves from other plantations. This would prove to be quite dangerous to the institution of slavery.

There was a debate in the Virginia legislature about the abolition of slavery in the state after the Nat Turner slave rebellion in 1831. I have read that tobacco production in the eastern part of Virginia, where most of Virginia's African Americans lived, had become less profitable as the crop and cropping practices exhausted the soil, and that those plantations increasingly made their profits by selling young slaves to plantations further south. As migration continued further west, slavery was less and less a part of the local system.

So differences in climate were crucial factors behind the fact that the southernmost states were those that developed highly stratified societies based on plantation slavery and the export of primary products, while the northernmost states were those that developed more egalitarian societies based on small, privately-owned farms, manufacturing and trade. Recall that South Carolina was the secessionist leader, and it was the northern states that were the most willing to go to civil war to maintain the Union. Border states were far less enthusiastic about going to war, with Maryland and West Virginia opting for the Union and Virginia having a debate in 1861 as to whether to join the Confederacy or remain with the Union.

It is debated whether the Civil War was about slavery. I suspect that the direct cause was the irreconcilable differences between the southern and northern societies and the foundations on which those in power in those societies based their power. Southern aristocrats thought, probably correctly, that if they continued in the Union, the Union would act in such a way as to destroy the basis of their wealth and power; the primary way in which this would have happened would be through the abolition of slavery. Thus slavery might be seen as an indirect cause of the Civil War. However, it may well be that the difference in climate between the southern and the northern states was the reason for both the difference in the distribution of slavery and the differences in the social systems of north and south, and thus a still more fundamental cause of the Civil War!

Tuesday, October 26, 2010

I was a member of the team that evaluated the Initiative. Here is a link to the report:

"The Jordan Education Initiative (JEI) was created in 2003, with the assistance of the World Economic Forum (WEF), to leverage public-private partnerships to improve the application of information and communication technology (ICT) in grades 1–12 in Jordanian schools.......

"The familiarity gained from interviews, surveys, visits and observations have helped us understand where the JEI is on the path to integrating ICT into the Discovery Schools and promoting the MoE’s reforms at the classroom level. The JEI has been able to mobilize its network of partners to provide the Discovery Schools a broad range of ICT resources, including hardware and software as well as development of e-content. While the student-computer ratio is still higher than that of wealthier nations and there are on-going challenges to providing stable connectivity, the data show that teachers are able to use the ICT resources they have. However, the most frequently observed uses of these new resources do not yet align with the vision of use desired by the JEI and MoE. The data indicate that most educators still have a traditional view of their role as teachers. Teacher-centered practices still predominated among most of the teachers we interviewed and observed. Innovation requires taking risks, and many teachers expressed concern that the highly centralized education system, and the emphasis on coverage in the curriculum and preparing for the tawjihi exams, limited their flexibility to experiment with new practices.

"Nevertheless, there are some bright spots that suggest where the JEI should focus its work going forward to promote deeper changes in the use of e-learning resources in Jordanian classrooms. Within the ten case study schools chosen from among the nearly 100 Discovery Schools there are a number of capable principals and teachers who have embraced the new approaches, integrating active learning strategies and student-centered practices, and are using the e-learning resources in more innovative ways."

Over 317,000 waiters and waitresses have college degrees (over 8,000 of them have doctoral or professional degrees), along with over 80,000 bartenders, and over 18,000 parking lot attendants. All told, some 17,000,000 Americans with college degrees are doing jobs that the BLS says require less than the skill levels associated with a bachelor’s degree.

Last month, Sarah Kaufman published an article in the Washington Post suggesting that for many students college was not a good investment. The cost of university tuition and related expenses has grown more rapidly than overall inflation, suggesting that college is generally not as good an investment as it was in the past. And clearly some students waste the opportunity provided by a university education.

Kaufman's article points out:

In 2008, the median annual earnings of young adults with bachelor's degrees was $46,000; it was $30,000 for those with high school diplomas or equivalencies. This means that, for those with a bachelor's degree, the middle range of earnings was about 53 percent more than for those holding only a high school diploma.

But a lot of college graduates fall outside the middle range -- and many stand to make considerably less.

Similarly, some workers without college degrees make more than the average college graduate. A college graduate with a major in Norse mythology may earn less than a carpenter with a GED who has learned to do an important and useful job well.
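The arithmetic behind the quoted 53 percent premium can be checked directly from the two medians; a minimal Python sketch (the variable names are mine, not from the article):

```python
# Median annual earnings for young adults in 2008, from the quoted statistics
bachelors_median = 46_000    # with a bachelor's degree
high_school_median = 30_000  # with a high school diploma or equivalency

# Earnings premium of the bachelor's median over the high school median
premium = (bachelors_median - high_school_median) / high_school_median
print(f"College earnings premium: {premium:.1%}")  # prints "College earnings premium: 53.3%"
```

Rounded, that is the "about 53 percent" the article cites, though of course it compares only the medians and says nothing about the spread around them, which is exactly the article's point.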

In my previous posting I suggested that Americans are going to have to cut back on consumption. It seems obvious to me that the cost of a university education can be conceptually divided into investment and consumption. A lot of what one does in college is fun and interesting, but not likely to yield future benefits. Perhaps the consumption aspect of college is one area in which we will have to make savings in the future.

My son divided his college education between Ireland and the United States. His time in Ireland was before the Celtic Tiger days. He was struck by the fact that his Irish friends were much more career oriented than his American ones. In a hard economy, the Irish had learned to invest in university education that would pay off in future jobs and income. It may have been less fun, but it was more prudent. American students may have to learn the same lesson!

Fareed Zakaria has an interesting article titled "How to Restore the American Dream" in Time magazine. He notes that U.S. chartered multinational corporations have gone truly global, cutting back on business in stagnant markets and building business in expanding markets, moving production to the places where it is most profitable to produce.

The multinationals are good at capturing profits, but their investors who in turn capture a good part of those profits are increasingly international, as stock markets are increasingly international. The corporations that are successful in the globalized markets provide remunerative jobs for technical and managerial staff, many of whom are in the headquarters country.

A portion of his argument that interested me greatly was the following:

People who get paid a decent wage for skilled but routine work in manufacturing or services are getting squeezed by a pincer movement of technology and globalization. David Autor, an MIT economist, has done an important study on what he calls "the polarization of job opportunities" in America. Autor finds that job growth divides neatly into three categories. On one side are managerial, professional and technical occupations, held by highly educated workers who are comfortable in the global economy. Jobs have been plentiful in this segment for the past three decades. On the other end are service occupations, those that involve "helping, caring for or assisting others," such as security guard, cook and waiter. Most of these workers have no college education and get hourly wages that are on the low end of the scale. Jobs in this segment too have been growing robustly.

In between are the skilled manual workers and those in white collar operations like sales and office management. These jobs represent the beating heart of the middle class. Those in them make a decent living, usually above the median family income ($49,777), and they mostly did fine in the two decades before 2000. But since then, employment growth has lagged the economy in general. And in the Great Recession, it has been these middle-class folks who have been hammered. Why? Autor is cautious and tentative, but it would seem that technology, followed by global competition, has played the largest role in making less valuable the routine tasks that once epitomized middle-class work.

He also writes:

During the early 1950s, personal consumer expenditures made up 60% to 65% of the U.S.'s GDP. But starting in the early 1980s, facing slower growth, we increased our personal spending substantially, giving rise to new economic activity in the country. Consumption grew to 70% of GDP by 2001 and has stayed there ever since. Unfortunately, this rise in consumption was not triggered by a rise in income. Wages have been largely stagnant. It was facilitated, rather, by an increase in credit, so that now the average American family has no fewer than 13 credit cards. Household debt rose from $680 billion in 1974 to $14 trillion in 2008. This pattern repeated itself in government, except on a much larger scale. People everywhere — from California to New Jersey — wanted less taxes but more government. Local, state and federal governments obliged, taking on massive debts. A generation's worth of economic growth has been generated by an unsustainable expansion of borrowing.

Zakaria is well aware of the fact that the United States enjoyed a huge advantage in the decades after World War II, before the European and Japanese economies were rebuilt and before so many countries that had been on the periphery developed economically and joined the increasingly global markets. His point is that to recapture the American dream, Americans are going to have to change our culture.

He stresses that we will have to borrow less, save more, and invest wisely. I suspect that a part of the effort to spend less should involve cutting our spending on health care, a form of spending in which we exceed all the other countries in the world. The example suggests that cutting spending may involve not only belt tightening, but also institutional changes such as reforms of our health care financing system.

Investing in technological innovation to keep our corporations competitive in the global economy will help keep our entrepreneurial, managerial and engineering cadres fat and happy. So too will investment in infrastructure, both in new infrastructure such as broadband and in the upkeep of existing infrastructure such as roads and bridges, help to keep us efficient and competitive.

How do we invest so as to keep the middle class productive and comfortable? I suppose that we will have to both create middle class jobs that the nation can sustain in the face of global competition and prepare people to fill those jobs. Perhaps that means we have to create more middle class jobs like those of the cadres that have been doing well -- entrepreneurs, managers and technical workers running globally competitive enterprises. I suppose we will have to train people not for the repetitive jobs that were held by many middle class people in the past, but for intellectually demanding jobs in the global economy. The implication is that we will have to change our middle class culture so that kids will not assume that a middle class life is theirs by right, but will understand that they have to work very hard to prepare to compete for that middle class life.

Will we continue to create lots of low-paid service jobs for a poorly educated workforce? Or will our belt tightening start to cut into these jobs? Certainly, as Zakaria suggests, many of these jobs are non-tradable; you won't get your hair done by a hairdresser in India if you live in California. On the other hand, I can easily imagine automation cutting into the number of jobs of this kind, much as household automation in the past century freed women to work outside the home. The costs of automation are going down, and as labor costs go up there are more possibilities and incentives to automate the lower wage jobs in the economy.

In previous technological revolutions there were major disruptions, and indeed we can see huge internal disruptions such as mass urbanization as China and India industrialize. We may see the disruption in the United States from the Information Revolution and Globalization continue for another generation or more. Zakaria is right that we have to change our culture to respond well to the challenge of overcoming that disruption. Probably a key element of that change will be to change our political institutions, making them more responsive to the needs of the population and less influenced by the demands of the affluent and the corporate elite. Changing culture and changing political institutions will be neither fun nor easy.

I recently posted a thought on the possible effect of climate and weather in creating conditions leading to the Civil War. The graph posted there indicated that the period from 1810 to 1860 was "a little ice age" in the northern hemisphere. The figure above shows that immigration to the United States grew greatly during that period. It would seem likely that one of the driving forces for that immigration was the poor agricultural performance in Europe during the cold weather (and the potato blight that hit in the 1840s, which may also have been worsened by the cold weather).

Might the climate change have contributed to the pressures helping to cause the Civil War by promoting this immigration? The immigrants went primarily to the northern states, and they may have been a source of cheap labor in competition with slave labor. (Indeed, I have read that Irish immigrants were used in especially dangerous occupations in the south because the financial risk of employing a poor immigrant was less than that of risking a slave!) The immigrants also were very concerned with the maintenance of the Union. And of course, the immigrants were an important source of the soldiers who fought the war.

Monday, October 25, 2010

I have been reading 1848: Year of Revolution by Mike Rapport. It points out that in the early part of the 19th century the Industrial Revolution was disrupting existing patterns of production all over Europe, throwing people out of work, and creating large areas of rural and urban poverty.

The solution to the problem in Europe was in part the increase in production that took place in the second half of the 19th century and later, and the attention to the "social problem" that percolated through society.

I suppose we can term the changes that came in transportation with the internal combustion engine and in manufacturing and commerce with electrification as a Second Industrial Revolution.

Today we see different technological revolutions reaching different parts of the world. The Information Revolution is driving change in the developed world while large areas of the developing world are seeing industrialization and changes in energy and communications patterns that mark their experience of the Industrial Revolution and the Second Industrial Revolution. The overall pattern is Globalization.

On a global scale urbanization is taking place at an unprecedented rate, with massive slums in the developing world. The gap between rich and poor is greater than ever before. Perhaps, however, we are beginning to see the solution, as China and India make economic and social progress based on massive technological innovation, and as more productive technologies spread worldwide as part of Globalization.

The papers released from Iraq and Afghanistan on Wikileaks seem to indicate:

that the direct collateral damage to civilians from military activities, in the form of people wounded and killed, has been very severe. This does not, of course, count the indirect collateral damage: people who have gotten sick who would not have, sick people who did not receive care, people who have lost jobs they would otherwise have had, etc.

that there have been severe infringements on human rights, not only those at Abu Ghraib but others committed by our allies.

that the role of contractors in the wars has been greater than had been realized, and that the behavior of some of those contractors has been more problematic than had been made public.

We know that the wars have gone on for a very long time, that they have been very expensive, and that they have carried major costs in terms of the reputation of the United States abroad and the debts we owe in our foreign policy. We also know that the "nation building" efforts have failed to install effective and honest governments, nor have they successfully installed democratic institutions. The economies of Iraq and Afghanistan are a mess.

I conclude that President Obama's policy of withdrawing from the two wars with all deliberate haste is the right one, and of course that withdrawal will be a major step in putting our economic house in order. That policy may be the greatest achievement of the Obama administration.

Sunday, October 24, 2010

The early chapters of The Company: A Short History of a Revolutionary Idea by John Micklethwait and Adrian Wooldridge trace the history of business organization. While there were partnerships and other forms of business organization into the distant past, the early "corporations" were typically chartered by European monarchical governments giving monopolistic rights to groups to encourage the investment needed to exploit emerging commercial opportunities with high risk such as the exploitation of overseas markets in Asia and America, or the construction of canals.

Early bubbles led to general distrust and fear of corporations and draconian legislation to reduce the powers of corporations.

I was struck by the comment that corporations were accorded juridical personality to protect against government capriciously ending privileges that had been granted in corporation charters. Legislatures felt that what they had given they could take away at will; investors felt that they needed more surety of charter commitments to justify investments in the face of risk. The idea that corporations have rights to free speech, from which we are now suffering, seems very recent indeed!

The idea of limited liability for corporate investors was quite controversial when first introduced, and was not embodied in law until the 19th century. It had been felt by many before that time that the management of business organizations by their owners was necessary for efficient operation and that the stake of the partners in their organization was needed to assure prudent management. The need for regulation to assure good management and prudent risk management by business organizations is obviously still with us.

It seems that the birth of the modern corporation really dates to the middle of the 19th century, although many of the characteristics of the modern corporation draw on earlier institutional inventions.

Although scientists have warned of possible social perils resulting from climate change, the impacts of long-term climate change on social unrest and population collapse have not been quantitatively investigated. In this study, high-resolution paleo-climatic data have been used to explore at a macroscale the effects of climate change on the outbreak of war and population decline in the preindustrial era. We show that long-term fluctuations of war frequency and population changes followed the cycles of temperature change. Further analyses show that cooling impeded agricultural production, which brought about a series of serious social problems, including price inflation, then successively war outbreak, famine, and population decline. The findings suggest that worldwide and synchronistic war–peace, population, and price cycles in recent centuries have been driven mainly by long-term climate change. The findings also imply that social mechanisms that might mitigate the impact of climate change were not significantly effective during the study period. Climate change may thus have played a more important role and imposed a wider ranging effect on human civilization than has so far been suggested. Findings of this research may lend an additional dimension to the classic concepts of Malthusianism and Darwinism.

The following graph shows average temperature in the northern hemisphere, from that paper.

The data indicate higher than average temperatures in the northern hemisphere in the 18th century, peaking about the time of the creation of the United States, followed by a drop in average temperatures through the first part of the 19th century.

I am not a meteorologist and the following are obtained from some brief surfing on the Internet. Indeed, 1860 was the very beginning of modern meteorology and the publication of weather records.

1859, the year before Lincoln was elected president, seems to have been an unusual year. For example, here is one event, perhaps the hottest event in U.S. history:

"June 17, 1859 - The only 'simoon' ever to occur in the United States is reported by a United States Coast Survey vessel off Goleta (California). A northwest wind brings scorching temperatures of 133 degrees between 1:00 and 2:00 that afternoon. Birds fall from the sky, crops shrivel and cattle die under the shade of oak trees."

Just before dawn the next day, skies all over planet Earth erupted in red, green, and purple auroras so brilliant that newspapers could be read as easily as in daylight. Indeed, stunning auroras pulsated even at near tropical latitudes over Cuba, the Bahamas, Jamaica, El Salvador, and Hawaii.

Even more disconcerting, telegraph systems worldwide went haywire. Spark discharges shocked telegraph operators and set the telegraph paper on fire. Even when telegraphers disconnected the batteries powering the lines, aurora-induced electric currents in the wires still allowed messages to be transmitted.

A major hurricane struck the southern United States in August 1860, moving from the Atlantic coast to the Gulf coast. Maximum wind speed was estimated as high as 130 miles per hour. A huge outbreak of tornadoes occurred in June 1860 in what is now the Midwest, killing many in what was then a sparsely populated land and causing great damage to a number of towns.

Could the low temperatures of the early 19th century have created conditions that exacerbated the disagreements that led to the Civil War? Could the anomalies of 1859 and 1860 have had a psychological impact? I don't know, but the question might be worth someone investigating.

The new data indicate that in the period 2006–08 about 22% of the manufacturing companies introduced product innovations (one or more new or significantly improved good or service) and about 22% introduced process innovations (one or more new or significantly improved method for manufacturing or production; logistics, delivery, or distribution; support activities). In comparison, about 8% of companies in the nonmanufacturing sector were product innovators and 8% were process innovators. Nonetheless, much higher innovation incidences are observed in the manufacturing subsectors of chemicals, computer/electronic products, and electrical equipment/appliances/components, and in some parts of the nonmanufacturing sectors of information and professional/scientific/technical services. Further, the BRDIS data indicate that companies that perform and/or fund R&D have a far higher incidence of innovation than do companies without any R&D activity.

Jared Diamond has an article in Science magazine with this title. He cites research "that suggests that bilingualism offers some protection against symptoms of Alzheimer's dementia in old people." It is good to know that I may get some of the mental time back that I invested in the past in learning other languages.

He tells an anecdote from his time in New Guinea when he learned that his Highlander companions spoke from 5 to 15 languages each. I recall chatting with a teenage street vendor in Cairo, and realizing that not only was he conversing with me in English, but with others in French and Arabic. When I have asked people from Africa or the subcontinent, I have often found that they speak several "local" languages from the region where they grew up as well as one or more "metropolitan" languages. (The priest who married my wife and me spoke eight languages fluently, and I recall lunching with him at the university where he was conversing with four of us at the same time in four languages.) I live in a county in which one in five people was born in another country, yet the majority of my fellow Americans are monolingual and fear or otherwise resist learning another language. Why is that?

Diamond does not mention the benefits of being able to learn what other people think by talking with them or reading what they write in their own languages. That seems an odd omission.

I have read others of his works, and he is an entertaining author with intriguing ideas.

He suggests that really innovative ideas usually develop over time and through discussions and interactions among people. One of his extrapolations from the stories of innovation that he has researched is that while it may be very useful to have (online) discussions of ideas in the search for important new concepts, it is important to do so in a way that the ideas presented are saved in an accessible form and can be revisited and added to over time (weeks or months or even years).

Mr. Johnson's engaging writing style guides us through seven key areas that must be understood in order to maximize our creativity, the key areas being:

1. The adjacent possible - the principle that at any given moment, extraordinary change is possible but that only certain changes can occur (this describes those who create ideas that are ahead of their time and whose ideas reach their ultimate potential years later).

2. Liquid networks - the nature of the connections that enable ideas to be born, to be nurtured and to blossom and how these networks are formed and grown.

3. The slow hunch - the acceptance that creativity doesn't guarantee an instant flash of insight but rather, germinates over time before manifesting.

4. Serendipity - the notion that while happy accidents help allow creativity to flourish, it is the nature of how our ideas are freely shared, how they connect with other ideas and how we perceive the connection at a specific moment that creates profound results.

5. Error - the realization that some of our greatest ideas came not from a flash of insight following a string of brilliant successes but rather from one or more spectacular failures that produced a brilliant result.

6. Exaptation - the principle of seizing existing components or ideas and repurposing them for a completely different use (for example, using a GPS unit to find your way to a reunion with a long-lost friend when GPS technology was originally created to help us accurately bomb another country into oblivion).

7. Platforms - adapting many layers of existing knowledge, components, delivery mechanisms and such that in themselves may not be unique but which can be recombined or leveraged into something new that is unique or novel.

I believe I just heard Chief Justice of the Supreme Court John Roberts say that he believed the current system works: it is the job of the lawyers advocating for the competing interests in a case to explain the science in a way that he, a non-scientist judge, can understand, just as it is his responsibility as a judge to write decisions that explain the evidentiary basis for his position on a case in a manner that others can understand.

I don't like that position. I have spent a lot of time in peer review meetings watching scientists debate the credibility of claims made by other scientists in their research proposals. The advocates in a case seek out scientists who support the position they advocate, and the judge may sometimes be faced by testimony of scientists, apparently equally credible, who testify to opposing views of the meaning of the evidence. It seems to me that having access to an independent source of expert advice should help the judge to evaluate that testimony.

If Chief Justice Roberts's position were taken to its extreme, one might conclude that one need not be trained in the law to be a good judge, because the advocates in a case have the responsibility not only to present the legal argument in support of their case, but to do so in a way that anyone can understand. However, law school and legal experience equip a judge to assess the quality of the legal arguments made by each side more accurately and validly than a lay person could.

Most people want a physician rather than a bureaucrat (neither a government bureaucrat nor an insurance company bureaucrat) deciding which medical treatment is called for to treat a specific patient with that patient's specific needs. This is in spite of the fact that we have regulatory oversight that requires manufacturers to fairly describe the efficacy of their pharmaceutical products. Training and experience count!

Thursday, October 21, 2010

Land-locked Bolivia is getting a tiny sliver of the Pacific – a dock, a free-trade zone and the right to run some naval vessels, although the agreement signed with Peru falls far short of what Bolivians have dreamed of for 126 years – a coastline of their own.

Peru's president Alan Garcia announced the pact during a ceremony at the southern Peruvian port of Ilo. It is part of a longstanding crusade by both Peru and Bolivia to prod neighbouring Chile into giving back some of the territory it seized in the 19th-century War of the Pacific.

I recall that, working in Bolivia, one could get a round of applause in any meeting by injecting "¡Hacia el mar!" ("To the sea!"). Not only was access to the sea a matter of national pride, but it should help Bolivia in its economic development, a development that is very much needed in this very poor nation.

Tuesday, October 19, 2010

This graph was published with a letter to the editor of Science magazine. The letter objects to a previous article in Science which suggested that money should be redirected from antiretroviral therapy for AIDS patients to other persons. The graph indicates that AIDS spending grows more slowly than does AIDS burden. One might think that countries should allocate funds to diseases according to the burden the diseases impose on the nation. That is clearly not true. Think about effective preventive programs such as childhood immunizations. If you can successfully immunize all the kids then you can effectively reduce the burden of the prevented diseases to zero.

we believe that health spending should not be allocated in any strict proportion to disease burden, but rather in proportion to the marginal return in terms of reducing disease burden. We advocate allocating incremental resources to the interventions that save the most life-years per dollar spent.

This is better, but still a simplification. DALYs (disability adjusted life years) are only one measure of the burden of disease, and in my opinion that measure does not fully capture the burden of disease. (Indeed, there are different ways to calculate DALYs.) Even if you accept that early death is equally unacceptable no matter who dies (and certainly we do not insure all members of a family equally), the degree to which a person's activities are limited by a physical disability depends on who that person is and what he or she does. Think of Stephen Hawking, who contributes enormously to world culture and knowledge in spite of almost complete physical immobilization!

An alternative measure is willingness to pay for health services to prevent death and to prevent disability and discomfort. Such a measure might be able, in part, to get at how much a health problem distresses the public. A well-known example was polio, which people felt was more important than the deaths and disability it caused would have implied. Think of the public willingness to support efforts to save the Chilean miners who were trapped underground, as compared with the concern for the large number of kids who die each day from malnutrition and preventable diseases.
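The marginal-return principle quoted above can be sketched in a few lines of code. This is only a toy illustration: the interventions, costs, and life-year figures below are entirely hypothetical, and a simple greedy allocation is just one way to approximate "fund the interventions that save the most life-years per dollar."

```python
# Toy sketch of allocating a fixed health budget by marginal return
# (life-years saved per dollar), per the principle quoted above.
# All interventions, costs, and effect sizes are hypothetical.

def allocate(budget, interventions):
    """Greedily fund interventions in descending order of life-years per dollar."""
    ranked = sorted(interventions,
                    key=lambda i: i["life_years"] / i["cost"],
                    reverse=True)
    funded, total_life_years = [], 0.0
    for item in ranked:
        if item["cost"] <= budget:       # fund it only if it fits the remaining budget
            budget -= item["cost"]
            funded.append(item["name"])
            total_life_years += item["life_years"]
    return funded, total_life_years

interventions = [
    {"name": "childhood immunization", "cost": 1_000_000, "life_years": 50_000},
    {"name": "antiretroviral therapy", "cost": 5_000_000, "life_years": 40_000},
    {"name": "cancer screening",       "cost": 3_000_000, "life_years": 6_000},
]

funded, saved = allocate(6_000_000, interventions)
print(funded, saved)  # immunization and ART fit the budget; screening does not
```

Note that this allocation differs from one made in strict proportion to disease burden: the cheap, highly effective preventive program ranks first even though the disease it prevents may impose a smaller current burden, which is exactly the letter writers' point.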

Kristen Minogue and Eliot Marshall have written an article in the October 8, 2010 issue of Science titled "BIOMEDICAL ETHICS: Guatemala Study From 1940s Reflects A 'Dark Chapter' in Medicine". The news article focuses on the epidemiological research on syphilis and other sexually transmitted diseases jointly supported by the governments of the United States and Guatemala. I have previously posted on the case, and would make the point that standards for the ethical conduct of medical research have evolved (for the better) since the 1940s when the research in question was conducted.

Some of my problems with the Science article:

While the article mentions John Cutler, implying that he was the key official in the program, it does not mention that he went on to have a distinguished career in public health, inventing a key diagnostic test for syphilis, rising to the highest level in the U.S. uniformed Public Health Service, serving as Deputy Director of the Pan American Health Organization and Chair of the Department of Health Administration at the University of Pittsburgh. I knew John Cutler and saw no indication of lack of ethics in his behavior.

The lead paragraph of the Science article describes the study in question as "a stunningly unethical U.S. medical study that was conducted in Guatemala 64 years ago." The study seems clearly to have been a collaboration between Guatemalan and U.S. researchers, not a U.S. study conducted in Guatemala.

The judgement that the research was "stunningly unethical" would seem to depend in part on the risks to which the subjects of the research were exposed. The article does not provide the reader with information on those risks, and most of the readers of Science probably do not know much about the diseases involved. The article quotes a CDC report that there were high levels of infection, but it occurs to me that the prevalence of those diseases might have been high in prisoners and military personnel in Guatemala in the 1940s. The diseases in their early phases are quite responsive to antibiotics, and the protocol was for infected subjects to be treated.

The article states that "as far as is known, most were treated, but records suggest that only 76% of those directly inoculated and infected with syphilis received "adequate" amounts of penicillin." Was the problem a failure of the record keeping, a rejection of treatment by the subjects, departure of subjects from the experiment (such as by completion of jail sentences or military duties), cure with less than the optimal dose of penicillin, or some other cause? Is there any reason to infer that people were in fact infected and not cured?

The article states that "the doctors who ran it (the research project) for the U.S. Public Health Service (PHS) never published results." It does not mention that the Guatemalan principal investigator did publish the results in the Bulletin of the Pan American Sanitary Bureau. This omission implies a failing on the part of Dr. Cutler, which may not have been merited. Moreover, it would usually be the principal investigator who would publish the results of an experiment. Dr. Juan Funes, the author of the report on the research, was at the time Chief of the venereal disease control division of the Guatemalan Ministry of Public Health, a physician who had received postdoctoral training in the United States. Was he in fact the principal investigator in the research? The article does not even raise the possibility that Cutler, at the time a very junior officer recently out of medical school, was the junior partner in the project.

I agree that the syphilis study would not pass modern ethical standards and that it should stand as a warning against the inappropriate conduct of research funded even in part by the United States in other countries. On the other hand, Science should give a more balanced and complete account of such controversial events, and should be very careful to refrain from inappropriately blackening the reputation of scientists (especially those who have died and are thus unable to defend themselves).

Officials of the Government of the United States have hastened to apologize publicly that the United States funded and participated in the research. I recall that the Government of the United States was very much involved in the overthrow of the Guzman government of Guatemala in 1954, even though that government was democratically elected. The Dulles brothers, who headed the Department of State and the CIA at the time, were shareholders in United Fruit and that company's lands were threatened by expropriation by the Guatemalan land reform program at the time. If the Obama administration wishes to apologize to the world for unethical behavior in the past, perhaps the overthrow of the elected government of Guatemala would be a more important event than the syphilis study.

Monday, October 18, 2010

I came across a good article by Ben Wildavsky in the New York Academy of Sciences Magazine on the internationalization of higher education. I quote a revealing statistic:

Perhaps some of the anxiety over the new global academic enterprise is understandable, particularly in a period of massive economic uncertainty. But setting up protectionist obstacles is a big mistake. The globalization of higher education should be embraced, not feared—including in the U.S. In the near term, it's worth remembering that, despite the alarmism often heard about the global academic wars, U.S. dominance of the research world remains near-complete. A RAND report found that almost two-thirds of highly cited articles in science and technology come from the U.S. Seventy percent of Nobel Prize winners are employed by U.S. universities, which lead global college rankings. And Yale president Richard Levin notes that the U.S. accounts for 40 percent of global spending on higher education.

and:

Indeed, the economic benefits of a global academic culture are significant. In a recent essay, Harvard economist Richard Freeman says these gains should accrue both to the U.S. and the rest of the world. The globalization of higher education, he writes, "by accelerating the rate of technological advance associated with science and engineering and by speeding the adoption of best practices around the world ... will lower the costs of production and prices of goods."

It may be worthwhile to differentiate "science" from "technology" in considering the globalization of higher education in these fields.

Science is primarily focused on the development of knowledge about the physical and social world. That knowledge is not protected by intellectual property rights and new scientific knowledge becomes part of our world heritage. There doesn't seem to be any disadvantage to any nation of other nations contributing more and more to the amount of scientific knowledge and understanding in the global knowledge commons.

Technological knowledge is where the money is. Some of it can be protected by intellectual property rights for some time, and thus gives a competitive advantage to the firms that own those rights and the countries in which those companies place their production. If the global rate of technological innovation is increased by the globalization of S&T higher education and the dissemination of research intensive universities into more nations, then people everywhere should benefit from new products and more productive industries. Moreover, technological invention should not be a zero-sum game. If the United States continues to emphasize research and development, technological innovation and higher education in the sciences and technology then its rate of invention and commercialization of inventions should not be reduced.

Note, however, that higher education in technology produces engineers and others whose function is not so much invention as development, maintenance and operation of the infrastructure (including the productive plant as well as the civil works and energy infrastructure). These folks contribute to increased economic productivity where they work, and again it is hard to see why an improvement of the infrastructure in other countries should be a disadvantage to the United States as long as we continue to develop and maintain our own.

The economic dominance of the United States was in part the result of Europe's wars, which were so destructive to the economic systems of the European-based empires that existed in the first decade of the 20th century. The standard of living in the United States should not be maintained by an effort to maintain an economic imperialism to exploit the people of other nations, but rather by an economy that innovates to increase productivity and a society that trades globally to take advantage of comparative advantages wherever they exist.

There are implications for the United States of the globalization of invention and S&T higher education. Technology transfer from abroad will become more important as a means of increasing productivity of U.S. industry (as it was in earlier centuries). The international pattern of comparative advantage will change more and more rapidly, so the United States will have to be more agile in changing domestic production and international trade to respond to changes in the international pattern of comparative advantage. It will also need a workforce prepared to live and compete in a more global economy that is growing and changing ever more rapidly.

Saturday, October 16, 2010

Thomas Cahill, the author of the Hinges of History series of books likens state execution to human sacrifice. Perhaps the analogy works for me since I am ashamed that my country is the only one among developed nations that still practices execution, and like Cahill I am Irish-American dealing with the problem that the ancient Celts practiced human sacrifice.

Like the practice of human sacrifice, the practice of state execution is thought justified by the superstitious belief that it somehow makes the world a better place for those left living. One aspect of that superstition is that executing the very occasional person deters others from committing murders and other capital crimes, but those of us who believe in statistics understand that there is no evidence that such is true. Another aspect of the superstition is the belief that it is cheaper to execute people than to keep them in prison, but again we know that that is not true.

One might also think about the choice of people to be executed and compare it with the choice of people to be sacrificed. In both cases there is a ritual. Note that our legal system does not generally consider innocence to be a reason to overturn the guilty verdict of a legal process that condemns someone to execution. Sociologists might suggest that those chosen for execution have things in common. Not only are they in the power of the state, unable to procure good legal representation, but they also almost always come from the fringes of society.

Kwame Anthony Appiah says that societies change their ethical views when they become ashamed of a position once considered ethically acceptable. Perhaps we should make an effort to shame those who foolishly think state execution is an acceptable practice, comparing them with those primitives who practice human sacrifice in the hope it will please the gods. Certainly there is no honor in the practice of state execution.

The UNESCO Executive Board voted a couple of years ago to accept a donation of $3.5 million to establish a science prize in the name of President Obiang of Equatorial Guinea. A significant portion of the money would go to UNESCO, which is strapped for money and does a lot of good things. If well administered, the prize might recognize important scientific achievements and thus stimulate people to engage in important scientific efforts.

The announcement of the prize raised a storm of criticism. Many non-governmental organizations protested that UNESCO should not honor the dictatorial and repressive Obiang government by attaching the Obiang name to a prestigious international prize. There have been petitions circulated in the intellectual communities of Africa and Latin America to oppose the prize, and these have been signed by many distinguished people, including a significant number of Nobel Prize winners from those continents.

UNESCO Director General Bokova, elected last year, postponed the process for the awarding of the first prize, referring the policy question back to the Executive Board, which is meeting now. I understand that a group of countries, including major donors to UNESCO are seeking to reverse the earlier Executive Board Decision, reject the donation, and return the funding; a second group of countries, led by African delegations, is seeking to reaffirm the earlier decision. A committee of members of the Executive Board is apparently working to draft a resolution that can be accepted by the whole by the end of the week.

I don't know enough about the pros and cons to have a serious opinion on the merits of the matter. My instinct is that UNESCO should not dignify Obiang by attaching his name to a UNESCO activity.

UNESCO was created in the aftermath of World War II to engage the intellectuals of the world in a global effort to promote peace through the promotion of education, cultural exchanges, communications and science. Certainly the allied powers that worked to create UNESCO and the countries that continue to provide most of its funding see the promotion of democracy and human rights as central to UNESCO's mission. The Obiang case may be seen as a marker as to whether UNESCO will continue to focus on peace, democracy and human rights or will increasingly accommodate to a more limited mission.

Last Year's Election of the Director General

Last year, at the end of four ballots, Irina Bokova and Farouk Hosny were tied 29 to 29 in the Executive Board voting to choose the next Director-General of UNESCO. The election was very contentious, with several diplomats refusing to accept their government's instructions on voting and being replaced. In the final ballot, Ms. Bokova was elected by 31 votes to 27. By all reports she is proving to be an exceptionally able leader, who has selected a good executive team, is implementing important reforms in UNESCO's programs and methods of work, and whose diplomatic skills are improving the reputation of the organization. That improvement is much needed!

When he failed in his election bid, it is reported, Farouk Hosny charged that he had been the victim of an international Jewish conspiracy. He reneged on his commitment to resign his position as Egyptian Minister of Culture if he lost the UNESCO election and has continued in that position. He was recently again in the international news in connection with the theft of a Van Gogh painting from an Egyptian museum. Even though the painting had been previously stolen from the museum (and later recovered), it was concluded that the security measures of the museum were dismally inadequate. There are charges that the security problems were due to inadequate funding from the Minister of Culture. Rather than take responsibility for a failure of his ministry on his watch, Minister Hosny saw his Vice Minister and several other subordinates imprisoned to be tried for dereliction of duty. He himself has been quoted as saying the theft was "no big deal".

The governance structure of UNESCO is such that Mr. Hosny was almost elected instead of Ms. Bokova. I do not feel confident that that governance structure will find the right solution to the current Obiang controversy. I suspect that if UNESCO is to regain the support it had among its member nations and the international community, the governance has to be improved. Not only is the current structure subject to pressures which produce poor decisions, it is expensive and cumbersome, requiring major investments of time by the leadership team of the Organization.

Thursday, October 14, 2010

Two years ago the Obama administration passed legislation that turned an economic crisis in some ways worse than the stock market crash of 1929 into a fairly acceptable recession with unemployment rates comparable to those seen in the 1980s in the Reagan administration. While some economists continue to complain that the stimulus package was not large enough, others are complaining that it was too large -- suggesting that the Obama team (composed of very accomplished economists) got it about right.

President Obama has proposed not to extend the Bush administration tax cuts for the very rich, a policy that would help cut the deficit without affecting the poor or the middle class. The Republicans are opposing this policy, suggesting that the tax cuts for the rich should be continued even while protesting the large deficits.

The Obama administration managed to pass the most comprehensive reform of our health care legislation in a generation. Not only will the reform extend coverage, but it will according to the politically neutral Congressional Budget Office cut the deficit.

The Obama administration is moving to end the U.S. involvement in the wars in Iraq and Afghanistan, wars that are unpopular with the general population, and in so doing reduce the hemorrhage of government funds which has totaled a trillion dollars so far.

The Obama administration has taken some important steps to promote peace between Israel and Palestine, recognizing that theirs is a conflict that has complicated the accomplishments of U.S. foreign policy goals all over the world.

The Obama administration has passed legislation to improve oversight of the financial industry, making any repeat of the crisis of the last few years less likely (and probably less severe if a repeat does occur). In this respect, the Obama administration has reversed deregulation policies of previous Republican administrations that contributed to the recent crisis.

Wednesday, October 13, 2010

My history book club met tonight to discuss Margaret MacMillan's book, Dangerous Games: The Uses and Abuses of History. It was well received, perhaps because of its plethora of examples of uses and abuses of historical information and/or its conversational style.

MacMillan adds evidence to that which should be obvious -- poor historical analogies or badly used historical analogies have led governments into very bad decisions. I also assume that analysis drawing on good analogies with proper caution can be very helpful in avoiding bad policy decisions. Unfortunately, MacMillan does not provide much guidance on how to choose good historical analogies or how to use them effectively and avoid their misuse.

MacMillan seems to be trying to make the case that one should study history because knowledge of history can be useful (if substantial and well used rather than abused). I feel, and I think most of my friends in the book club feel, that there is no need to justify the study of history for its instrumental value. History is interesting. We are curious creatures, and history tells us about people and what they have done and presumably may do again.

One of the points made in the discussion is that different peoples understand history in different ways. If you want to understand a foreign culture, it is important not only to understand the history of the people who share that culture, but also to understand the way in which they view their own history.

The idea of history was invented. If you go back far enough in time there seem to be no histories. Herodotus is often described as the first to write a true history. However, his innovation was built on other previous innovations. There were epics that mixed accounts of events with mythological materials, records of the family trees of rulers, land ownership records, etc. Over time other forms of history have been invented, such as military history, economic history, social history, etc.

I was thinking of my own experience as a development planner. Only once in my career did I contract with a local historian to produce a history of the sector in the country in which I was developing a program. He produced a very interesting paper, providing historical precedents for innovations we were considering, as well as explanations of why related innovations had failed and short summaries of politically motivated development policies that had serious sectoral consequences. My counterparts, citizens of that country, were very enthusiastic about the utility of the study, saying that it provided them with a lot of ammunition that they could use in defending the program as well as a road map as to how to proceed politically to get the support the project would need. They also refused to publish the study as too sensitive -- the only time I recall having such a pre-project study censored. The project was eventually funded, my counterparts in the planning were put in charge, and the project was extremely successful. A part of the success was attributed to the historical groundwork that had been done. Yet the example was never, to my knowledge, replicated.

Indeed, I wonder how much most professionals involved in foreign aid actually know and understand of the history of the countries in which they work!

A news article in Science magazine, based on a research report in the same issue, suggests that there may be a generalized ability of small nominal groups to solve problems, analogous to the general intelligence attributed to individuals by "intelligence tests". Interestingly, this group intelligence seems only weakly predicted by the intelligence of the smartest member of the group and/or by the average intelligence of group members; the social ability of the group members to work together is also predictive of group problem-solving ability.

It occurs to me that the research was conducted using nominal groups -- that is, putting people together for the purpose of the experiments in problem solving, which were rather short term. That situation might be different from the situation in a formal organization, such as a research laboratory or corporate executive suite, in which people work for extended periods on a limited class of problems with feedback based on the success or failure of past approaches. Certainly my own experience in small-group professional decision making suggests that such groups are far more effective with increased experience, and in that sense do "learn" to solve problems better.

Thus, less socially adept groups might still find ways to work together effectively over time, increasing the importance of individual intelligence in group problem solving. Still, it might well be the case that the average research group would do well to spend some time on the social aspects of planning and evaluating its research agenda.
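The distinction between "weakly predicted" and "also predictive" is just a statement about correlations. As a minimal sketch, using made-up toy numbers (not data from the Science study), one could compare how strongly two candidate predictors correlate with group performance:

```python
# Illustration of weak vs. strong prediction via Pearson correlation.
# All data below are hypothetical toy numbers, not from the study.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

group_score   = [2, 3, 3, 5, 5]   # toy group problem-solving scores
social_skill  = [1, 2, 3, 4, 5]   # toy average social sensitivity
avg_member_iq = [3, 1, 5, 2, 4]   # toy average member intelligence

r_social = pearson(social_skill, group_score)   # strong association
r_iq     = pearson(avg_member_iq, group_score)  # essentially none here
```

In this contrived example the social-skill measure tracks group performance closely while average member intelligence tells you almost nothing, which is the pattern (in exaggerated form) that the article describes.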

In her talk at Canada's International Development Research Centre, Margaret MacMillan (the author of Dangerous Games: The Uses and Abuses of History) states that reasoning from historical analogies can be very helpful in making policy decisions, but that reasoning from the wrong analogy has sometimes led policy makers into very bad decisions. She gives an example (the Vietnam decisions of the Kennedy administration) in which vigorous debate on the appropriate analogy nevertheless led to the use of the wrong analogy over what in retrospect seems a better one.

How, then, should one use historical analogies in making policy decisions? Here are some suggestions:

Make a conscious effort to search for a significant number of alternative analogous situations; don't simply accept the first analogy that comes to mind (avoid an "availability bias").

Define explicitly the relevant aspects of the current and historical situation which are analogous and those which are different.

Recognize that the first step mentioned above requires a serious understanding of history.

Recognize that the second step mentioned above requires a serious understanding of the nature of the current problem and of each and every one of the analogous situations.

Consider in depth several alternative analogous situations and the lessons that they bring. Ideally, seek to compare analogies that offer conflicting lessons.

Use the lessons of decision making to assure that there is a full discussion and that analogies favored by low status members of the policy team are as fully considered as those favored by high status members.

Recognize that policy is incremental and that policy decisions can be adjusted or reversed with experience; therefore reexamine the analogies used in initial policy decisions, updating them with experience with the current policy.
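The steps above amount to a screening procedure: enumerate candidate analogies, make their shared and differing aspects explicit, and look for conflicting lessons among the strongest candidates. A toy sketch of that bookkeeping, with hypothetical example entries of my own invention (not drawn from MacMillan), might look like:

```python
# Toy sketch of analogy screening: enumerate candidates, make shared
# and differing aspects explicit, rank, and flag conflicting lessons.
# The candidate entries are hypothetical illustrations only.
from dataclasses import dataclass

@dataclass
class Analogy:
    name: str
    shared: set       # aspects analogous to the current situation
    different: set    # aspects that do not carry over
    lesson: str

def rank(analogies):
    """Order candidates by shared-minus-different aspect counts, so the
    first analogy that comes to mind is not accepted by default."""
    return sorted(analogies,
                  key=lambda a: len(a.shared) - len(a.different),
                  reverse=True)

candidates = [
    Analogy("Munich 1938", {"aggressor state"},
            {"great-power parity", "alliance structure"},
            "early firmness deters"),
    Analogy("Korea 1950",
            {"limited war", "proxy conflict", "domestic constraints"},
            {"terrain"},
            "limited aims, negotiated end"),
]

ordered = rank(candidates)
# Conflicting lessons among candidates signal a debate worth having:
conflicts = len({a.lesson for a in ordered}) > 1
```

The point is not that analogy choice can be automated, but that writing down the shared and differing aspects forces the explicit comparison the second and fifth steps call for.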

Sunday, October 10, 2010

In my last posting, beginning a discussion of The Company: A Short History of a Revolutionary Idea by Micklethwait and Wooldridge, I described the invention of the modern limited-liability stock company in British legislation of 1862. I think this might be described as the invention of a social technology or institutional technology. That is, the modern corporation is a way of organizing people to utilize resources to achieve a human purpose, but its invention was not the invention of a physical device or material, but rather of a socio-economic institution.

In Brian Arthur's book, The Nature of Technology: What It Is and How It Evolves, he considers the growth of technology as a cumulative process in which inventions build on previous inventions. The modern large-scale corporation shares that property. There is a history of organization of men and resources to carry out trade or manufacturing that goes back thousands of years. Importantly, lessons learned in the management of large scale military and religious organizations could be applied to the management of corporations involved in service or manufacturing industries.

As Micklethwait and Wooldridge point out, the corporation became a widely used institutional form because changes in manufacturing technology and transportation technology allowed great benefits to be achieved by scaling up production and distribution of products if and only if the larger enterprises could be managed effectively. The corporation was more efficient in dealing with the information requirements of large scale manufacturing and distribution of goods and services than the market. (With the new information and communications technologies of the latter part of the 20th century, markets are increasingly competitive for many information processes and we are seeing restructuring with some firms downsizing and focusing on core competencies while using Internet mediated markets to outsource many non-core activities.)

One of the questions of economic analysis is whether to regard innovation as exogenous or endogenous to the economy. The question is whether to include technological invention and innovation within models of the economy or to leave them outside as perceived time changing boundary conditions.

The Xerox copier, in my opinion, might be seen as an exogenous innovation, driven by an inventor who recognized a physical property of materials and saw how it might be exploited to perform a useful purpose with real economic value that could be appropriated from the users to build a company.

Semiconductors, in my opinion, might be seen as endogenous. There was a need to amplify telephone signals in long distance lines, especially undersea cables, which was only poorly met by vacuum tube electronics, which failed frequently and required costly replacement. It was recognized that solid state devices might well be longer lasting and more reliable if they could be developed. Thus Bell Labs and other research laboratories were chartered to do basic research on solid state physics, leading eventually to the invention of solid state electronic devices.

I suppose that the same analytic distinction might be applied to the invention of institutions. The creation of the legislation allowing modern corporations would seem to be an induced innovation in which Parliament sought a new institutional model that would serve to expand the industrial revolution. The earlier effort by Alexander Hamilton to create a National Bank for the United States might be seen as a similar induced innovation.

On the other hand, a lot of Internet based institutional inventions -- eBay, Amazon.com, social networking -- might be seen as exogenous, in that they were not sought by the major pre-existing institutions, but rather invented by "outsiders" and developed into large institutions in response to the benefits that they conferred.