Tuesday, October 28, 2008

After reaching 38% of GDP during World War II, defense spending dropped below 5% of GDP in the years immediately following the war. It rose to nearly 15% at the start of the Cold War as the US rearmed and built up its stockpile of atomic weapons, then declined through the late '50s and early '60s to 7%. It jumped again during the Vietnam War to 10%, declined throughout the '70s and early '80s, and headed up again during the Reagan era of rearmament as we spent the Russians into state failure. It dropped again in the late '80s and early '90s until Desert Storm, then declined until 9/11, after which it rose steadily to almost 5% of GDP in 2008.

It looks like credit woes are pretty much the same around the world. The Indian middle-class, enjoying rising prosperity, has gotten caught up in fast-rising revolving credit card debt that it seems to understand poorly and has difficulty managing.

Unsecured loans and credit card receivables more than three months overdue: 7%-9% of total loans outstanding this year; expected to rise to 15% according to ratings agency Crisil Ltd. in Mumbai.

Number of credit cards in India: 30 million, up 3x in the last 5 years.

At the end of FY '08 (ended March 31) Indians charged more than $14 billion on their credit cards, over 3x the amount charged four years ago.

Indians have little experience with handling revolving credit, and more people are turning up desperate for help with their credit card payments, according to V.N. Kulkarni, chief counselor at Mumbai's Abhay Credit Counseling Center, which advises borrowers.

The conventional definition of a bear market is one in which securities prices drop 20% over a period of time. In today's Wall Street Journal Arthur B. Laffer reminds us what a real bear market is:

I saw up close and personal Presidents Gerald Ford and George H.W. Bush succumb to panicked decisions to raise taxes, as well as Jimmy Carter's emergency energy plan, which included wellhead price controls, excess profits taxes on oil companies, and gasoline price controls at the pump.

The consequences of these actions were disastrous. Just look at the stock market from the post-Kennedy high in early 1966 to the pre-Reagan low in August of 1982. The average annual real return for U.S. assets compounded annually was -6% per year for 16 years. That, ladies and gentlemen, is a bear market. And it is something that you may well experience again. Yikes!
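To put Laffer's -6% figure in perspective, here is a quick compounding check (my arithmetic, not his):

```python
# Cumulative effect of a -6% average annual real return over 16 years
# (the post-Kennedy high of 1966 to the pre-Reagan low of 1982)
annual_real_return = -0.06
years = 16
cumulative = (1 + annual_real_return) ** years
print(f"Each real dollar invested in 1966 was worth about ${cumulative:.2f} by 1982")
# a cumulative real loss of roughly 63%
```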

Laffer has a new book out on the subject: "The End of Prosperity: How Higher Taxes Will Doom the Economy--If We Let It Happen."

Laffer says we are making the same mistakes as previous generations of politicians, whether Republican or Democratic, with hasty solutions cooked up under panic conditions that will set the stage for "the end of prosperity."

The main problem with the modern financial system based on widespread use of derivatives and securitization is that while financial specialists understand how individual assets function, even they have little understanding of how the whole incredibly complex financial system operates when exposed to various types of stress.

Like many exotic financial products, credit derivatives are extremely complex and profitable in times of easy credit. But when markets reverse, as has been the case since August 2007, they not only spread risk but also amplify it considerably.

Now the other shoe is about to drop in the $62 trillion CDS market, due to rising junk bond defaults by US corporations as the recession deepens. That market has long been a disaster in the making. An estimated $1.2 trillion could be at risk of the nominal $62 trillion in CDSs outstanding, making the potential losses far larger than those of the sub-prime market.

No regulation

A chain reaction of failures in the CDS market could trigger the next global financial crisis. The market is entirely unregulated, and there are no public records showing whether sellers have the assets to pay out if a bond defaults. This so-called counterparty risk is a ticking time bomb. The US Federal Reserve under its ultra-permissive chairman, Alan Greenspan, and the US government's financial regulators allowed the CDS market to develop entirely without supervision. Greenspan repeatedly testified to skeptical Congressmen that banks were better regulators of risk than government bureaucrats.

Use of Credit Derivatives to Transfer Risk outside the Banking System

Perhaps the most significant development in financial markets over the past ten years has been the rapid development of credit derivatives. Although the first credit derivatives transactions occurred in the early 1990s, a liquid market did not emerge until the International Swaps and Derivatives Association succeeded in standardizing documentation of these transactions in 1999. According to the BIS, the notional value of credit derivatives outstanding increased sixfold between 2001 and 2004, reaching $4.5 trillion in June of last year. Moreover, this growth has been accompanied by significant product innovation, notably the development of synthetic collateralized debt obligations (CDOs), which allow the credit risk of a portfolio of underlying exposures to be divided or "tranched" into different segments, each with different risk and return characteristics. Recent growth of credit derivatives has been concentrated in these more-complex structured products.

As is generally acknowledged, the development of credit derivatives has contributed to the stability of the banking system by allowing banks, especially the largest, systemically important banks, to measure and manage their credit risks more effectively. In particular, the largest banks have found single-name credit default swaps a highly attractive mechanism for reducing exposure concentrations in their loan books while allowing them to meet the needs of their largest corporate customers. But some observers argue that what is good for the banking system may not be good for the financial system as a whole. They are concerned that banks' efforts to lay off risk using credit derivatives may be creating concentrations of risk outside the banking system that could prove a threat to financial stability. A particular concern has been that, as credit spreads widen appreciably at some point from the extraordinarily low levels that have prevailed in recent years, losses to nonbank risk-takers could force them to liquidate their positions in credit markets and thereby magnify and accelerate the widening of credit spreads.

A definitive evaluation of these concerns about nonbank risk-takers would require information on the extent of credit risk transfer outside the banking system and on the identities and risk-management capabilities of the entities to which the risk has been transferred. Unfortunately, available data do not provide this information.

ZenithOptimedia today lowered its forecast for advertising growth both in the US and worldwide.

Zenith, whose forecasts are closely followed by the industry, said it expects ad spending in the U.S. to grow just 1.6% this year and by less than 1% in 2009. In June, the ad-buying firm, a unit of Publicis Groupe, predicted growth of 3.4% and 2.6%, respectively, for this year and next.

World-wide, Zenith says it now expects ad spending to grow 4.3% to $506.3 billion this year and 4% in 2009. In June it predicted 6.6% growth for 2008 and 6% growth for 2009.

Monday, October 6, 2008

According to the government, there are 35 working public ambulances with life-saving equipment to serve a population of over 14 million in New Delhi, India. This greatly contributes to the loss of life during terrorist attacks, when the "golden hour" turns into the "golden four hours".

Thursday, October 2, 2008

I was amazed to discover to what extent world-wide demand for commodities is driven by growth in China. Deutsche Bank forecasts that in 2009 almost 100 per cent of the growth in demand for aluminum, around 80 per cent of the growth in demand for iron ore, oil, and steel, and 60+ per cent of the growth in demand for copper will come from China.

Forecast of China's Share of the Growth in Demand for Commodities Worldwide in 2009

Although growth in the Asian manufacturing economies of India and China has slowed in response to slowing global markets and the recent credit crash, commodity prices remain high. Economist Jeffrey Frankel believes that low real interest rates have been the cause of the continued strength of commodity prices:

One wouldn’t want to try to reduce commodity markets to a single factor, nor to claim proof of any theory by a single data point. Nevertheless, the developments of the last six months provided added support for a theory I have long favoured: real interest rates are an important determinant of real commodity prices.

High interest rates reduce the demand for storable commodities, or increase the supply, through a variety of channels:

by increasing the incentive for extraction today rather than tomorrow (think of the rates at which oil is pumped, gold mined, forests logged, or livestock herds culled)

by decreasing firms’ desire to carry inventories (think of oil inventories held in tanks)

by encouraging speculators to shift out of spot commodity contracts, and into treasury bills.

All three mechanisms work to reduce the market price of commodities, as happened when real interest rates were high in the early 1980s. A decrease in real interest rates has the opposite effect, lowering the cost of carrying inventories, and raising commodity prices, as happened in the 1970s, and again during 2001-2004. It’s the original “carry trade.” (http://www.voxeu.org/index.php?q=node/1002)

U.S. auto sales reached a 15-year low in September 2008, as sales of cars and light trucks fell 27% to 964,873 units, down from 1.31 million a year earlier, according to Autodata Corp. Tightening credit, a financial system in shambles, and consumer fear all contributed to a seasonally adjusted annualized rate of 12.5 million units, down from 16.19 million units in September 2007.

Piracy off the coast of Somalia has more than doubled in 2008; so far over 60 ships have been attacked. Pirates are regularly demanding and receiving million-dollar ransom payments and are becoming more aggressive and assertive.

The international community must be aware of the danger that Somali pirates could become agents of international terrorist networks. Already money from ransoms is helping to pay for the war in Somalia, including funds to the US terror-listed Al-Shabaab.

The high level of piracy is making aid deliveries to drought-stricken Somalia ever more difficult and costly. The World Food Programme has already been forced to temporarily suspend food deliveries. Canada is now escorting WFP deliveries, but there are no plans in place to replace its escort when it finishes later this year.

The danger and cost of piracy (insurance premiums for the Gulf of Aden have increased tenfold) mean that shipping could be forced to avoid the Gulf of Aden/Suez Canal and divert around the Cape of Good Hope. This would add considerably to the costs of manufactured goods and oil from Asia and the Middle East. At a time of high inflationary pressures, this should be of grave concern.

Piracy could cause a major environmental disaster in the Gulf of Aden if a tanker is sunk, run aground, or set on fire. The use of ever more powerful weaponry makes this increasingly likely.

There are a number of options for the international community but ignoring the problem is not one of them. It must ensure that WFP deliveries are protected and that gaps in supply do not occur.

Update, November 18, 2008: Somali pirates have hijacked a Saudi supertanker carrying a cargo of $100 million in oil. The capture of Sirius Star 450 nautical miles southeast of Kenya's Mombasa port, and way beyond the Gulf of Aden where most attacks have taken place this year, is their boldest attack and the culmination of several years' increasing activity.

Sunday, September 28, 2008

Circumcision rates of newborns worldwide have dropped dramatically since World War II. The US is the exception: there the rate has declined only modestly, from 64 percent of newborn males to 57 percent. Doctors fear the worldwide decline is a serious public health issue, since circumcision protects against many sexually transmitted diseases, including HIV/AIDS. Arguments against the procedure include fear of desensitization of the penis, an effect doctors say has never been demonstrated clinically.

Rates of Newborn Circumcision Have Dropped Dramatically Since WWII, Except in the US

Source: NewScientist, "Cut!", July 19-25, 2008

At Ghana's War Memorial Hospital, female genital circumcision (mutilation) dropped from 35% to 20% between 1995 and 2003.

The objective of this study was to evaluate the prevalence of female genital cutting (FGC) in Upper Egypt, after 6 years of putting prohibition law into action. A total number of 3730 girls between the ages of 10-14 years were recruited to participate in this study. They were mainly preparatory school students (three urban and three rural areas). Social workers interviewed them as to whether they had undergone circumcision within the last 6 years or not. Subsequently, a questionnaire was sent to parents of girls who were positive for circumcision as to the circumstances surrounding the procedure. The prohibition law of FGC seems not to have altered the prevalence of this procedure. The majority of girls (84.9%) had had circumcision within the last 6 years with high prevalence in rural areas (92.5%). Circumcision was done for a combination of reasons, according to parents, with high rates of non-medical personnel participation (64.15%). This study's results indicate that the practice of FGC in Upper Egypt remains high despite enforcement of law. Extensive efforts are needed both to revise public awareness and to change attitudes regarding FGC.

Employer health care insurance cost trends continued their upward march in 2008, rising an average of 5% from 2007. This is a far cry from the 13% to 14% increases of five years ago, but it still makes employer-paid health care one of the greatest benefits to employees in the US. The costs for family coverage in 1999 were an average of $5,900. In 2008 family health care insurance coverage averaged $13,000, an increase of about 120% in nine years. But employers and employees are paying more for less, because today's insurance packages come with much higher deductibles, in some cases as much as $1,000 or more per year.
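A quick check of those cost figures (my arithmetic, not the source's; the nine-year rise works out to roughly 120% total, or about 9% per year):

```python
# Back-of-the-envelope growth check on employer family health coverage costs
cost_1999 = 5_900
cost_2008 = 13_000
years = 9
total_growth = cost_2008 / cost_1999 - 1                     # ~1.20, i.e. +120%
annual_growth = (cost_2008 / cost_1999) ** (1 / years) - 1   # ~9.2% per year
print(f"Total increase: {total_growth:.0%}, annualized: {annual_growth:.1%}")
```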

Interest rates in the US remain 2 1/4 points lower than those in the EU, despite greater consumer price inflation in the US. The Federal Reserve in the US is charged with both keeping inflation under control and stimulating the economy, while the EU central bank's mandate is only to keep inflation in check.

Over 3,000 banks failed during the savings and loan crisis of the '80s and '90s and the run-up to it. Bank failures since the S&L crisis have been minuscule in number by comparison, but the recent ones have been gigantic in size.

Tuesday, September 23, 2008

At the core of the Wall Street crisis lie unresolved issues with the US housing market. This is the rot at the core of the investment banks' holdings. Unfortunately, the data still does not paint a pretty picture. The inventory of unsold, previously owned homes stands at a high approaching 12 months, meaning there is a pipeline of 12 months' worth of sales in the current back inventory of unsold homes. Since large numbers of people are probably postponing their home sales if they can, due to the lousy housing market, this backlog is probably a dam behind which lie many more months of unsold inventory just waiting for a sign of improvement before it gets dumped onto the market, depressing housing prices once again.

There was a brief uptick in the median price of homes in the first quarter of '08, but the trend resumed its downward course in the second quarter of '08. Meanwhile, the delinquency rate shot up in the second quarter of '08, now above 5 percent on first mortgages, and the default rate on first mortgages approached 2% in the second quarter of '08.

None of this reflects added woes that may come from job losses related to the Wall Street meltdown. For example, economist James Hughes, dean of Rutgers' Edward J. Bloustein School of Planning and Public Policy, said he expects New Jersey to suffer economic difficulties and job losses well in excess of the 16,000 the state has already lost since the beginning of the year. In a worst-case scenario, he fears, 100,000 jobs could be lost.

Morgan Stanley and Goldman Sachs Group, the remaining independent Wall Street brokerages, threw in the towel today and converted to traditional bank holding companies. Instead of being overseen just by the Securities and Exchange Commission, Goldman Sachs and Morgan Stanley will now face much stricter oversight from numerous federal agencies: the Federal Reserve will regulate the parent companies, the Comptroller of the Currency will oversee the national bank charters, and the Federal Deposit Insurance Corp. will likely play a bigger role, because the companies are expected to seek much higher volumes of federally backed customer deposits instead of depending on short-term borrowing to finance their investments.

The firms will reduce their leverage ratios -- a measure of a firm's risk in relation to the equity on its balance sheet -- over the next two years from current levels to something more in line with those at commercial banks. Investment bank ratios now stand above 20, while commercial banks are closer to 10. Easy money, interest rates held below the real inflation rate by the Fed after the dot-com bust, and Fannie Mae and Freddie Mac incentives to make housing affordable to more Americans together inflated the housing bubble at the core of the problem; when the bubble burst and the underlying assets crashed in value, the investment banks were forced into rapid deleveraging.

Morgan Stanley and Lehman Brothers had leverage ratios above 30 in Q1 2008, while Goldman Sachs had a leverage ratio of 27. For comparison, Bank of America has a leverage ratio of 11, giving it far more equity to back up any troubled investments it makes. That cushion allowed the commercial banks to ride out the storm: they had more resources to fall back on as their investments declined in value, without having to raise more capital. Much of the investment banks' outsized profits came from this leverage - investing borrowed money in the hopes of getting a great return before having to pay it back. This fueled the outsized returns of the investment banks during their salad days, and fueled their death pyres during the housing crash. They will be safer in the future, but will also earn smaller returns, as their ability to bet large amounts of someone else's money has now come a cropper.
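A minimal sketch of why those leverage numbers mattered so much: if leverage is assets divided by equity, then an asset decline of 1/leverage wipes out the equity entirely.

```python
# How leverage magnifies losses: with leverage L = assets / equity,
# a fractional fall of 1/L in asset values erases all equity.
def wipeout_decline(leverage: float) -> float:
    """Fractional fall in asset value that leaves zero equity."""
    return 1 / leverage

for name, lev in [("Morgan Stanley (Q1 '08)", 30),
                  ("Goldman Sachs (Q1 '08)", 27),
                  ("Bank of America", 11)]:
    print(f"{name}: leverage {lev} -> insolvent after a "
          f"{wipeout_decline(lev):.1%} fall in asset values")
```

At leverage 30, a roughly 3% drop in asset values is fatal; at leverage 11, it takes about a 9% drop, which is why the commercial banks could absorb losses the investment banks could not.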

The Newark Star-Ledger, New Jersey's largest newspaper, expects to lose $30 million to $40 million this year, while its sister newspaper, the Trenton Times, expects to lose $5 million. Both papers are properties of Advance Publications, owned by the Newhouse family. The Star-Ledger is attempting to negotiate voluntary buy-outs of 225 non-union workers, which would leave it with a workforce of 500 employees. The Trenton Times is attempting to achieve 25 voluntary buy-outs. The Star-Ledger's circulation has shrunk to fewer than 300,000 subscribers.

The Pew Research Center for the People and the Press has released its Biennial News Consumption Survey. The survey shows that viewership and readership of all old news media have been steadily losing ground since 1993.

This morning (Sept. 22, 2008) Internet entrepreneur Mark Cuban told newspaper executives to give up and declare bankruptcy. It's not that people won't continue reading newspapers; it's that they won't continue reading them in the numbers they have. The handwriting is on the wall, and newspapers need to start over again. Some, such as the Wall Street Journal, have made yeoman efforts to reinvent themselves online. A few, with such herculean efforts, will probably survive. But for the most part, newspaper online sites are disappointing at best, hard to use, and merely repurpose their print editions into an online edition.

Friday, September 19, 2008

Google has delayed its search advertising partnership with Yahoo for three months to give government regulators a chance to examine the deal, on concerns that it might lead to anti-competitive developments in search engine marketing (SEM). Google and Yahoo together control over 80% of the lucrative and fast-growing search advertising market.

But now Google CEO Eric Schmidt says the companies cannot wait any longer and will move forward with their partnership. Search marketing company SearchIgnite has published a whitepaper, Potential Impact of Google-Yahoo! Partnership & Cost to Marketers, which purports to show that Google's keyword prices are as much as 22% higher on average than Yahoo's. This could result, SearchIgnite says, in advertisers paying more for keywords on Yahoo than they otherwise would, if Yahoo chooses a revenue maximization strategy for its Google partnership. Google claims that its search keyword prices are set by auction, as are Yahoo's, and that no collusion between the companies on setting prices can result. Google also claims that there are methodological errors in the SearchIgnite research that make its results questionable.

While no one outside the companies has seen the agreement, Yahoo has described it as providing supplemental results to its own search advertising. As Cnet reported Hilary Schneider, Executive Vice President of Yahoo, explaining:

She [Schneider] showed a specific example to bolster her case. A search for "red roses in Birmingham Alabama" yields no advertisements on Yahoo's search engine and 11 on Google's. Under the deal, Yahoo can show Google's ads when it chooses, sharing the resulting revenue.

This is a long-time and established relationship in the search and online advertising industry referred to as "backfill". It is quite common for a search site A to have an arrangement with a partner B to display B's results, either search results or search ads or both, as a supplement to A's. This provides A with more paying inventory and better results to display, B with greater distribution of its results, B's advertisers with broader reach, and visitors to the site with better results. It is generally held to be an all-around win for all parties.

In this case "A" is Yahoo and "B" is Google. It would seem that this arrangement benefits all parties concerned. Users of Yahoo's search engine will get more and better ads. Advertisers on Google will get broader reach to another audience of highly qualified searchers (Yahoo's search engine users). Google will derive additional revenue from displaying its ads to a broader audience (Yahoo's search engine users), and Google advertisers will have broader reach to another audience without the trouble of running multiple search advertising campaigns on multiple search engines.

Yahoo has said that it expects $800 million in revenue and $250 million to $450 million in incremental cash flow from the first year of the deal. Google offered the deal to Yahoo to keep the company and its search engine out of the hands of Microsoft, who had made a bid to buy Yahoo for as much as $47.5 billion, or $33 per share. Yahoo shares fell 44 cents to $18.82 on Wednesday, September 17, 2008. The fear by advertisers, who have been the most vocal opponents of the deal, is that this arrangement will somehow boost their advertising costs. But Google and Yahoo will not be commingling their keyword auctions in this arrangement, nor will either company be able to see the price of the other's keywords, although Yahoo will know it after the fact, that is, after a user clicks on one of Google's keyword ads. Yahoo could use this after-the-fact knowledge to cherry-pick higher-paying Google ads that perform better than its own lower-paying ads.

With respect to SearchIgnite's claim that keyword prices could rise as much as 22% as a result of this deal, this seems unlikely. Assume that it is true that Google's keywords are, on average, 22% more expensive than Yahoo's. Even if Yahoo could see the Google price on each keyword (which it can after the fact), and even if it decided to pursue a profit maximization strategy by replacing each of its cheaper keywords with a more expensive keyword (which it says isn't the nature of the agreement), Yahoo would still have to pay the affiliate fee for each Google PPC ad so used. This could range from a Yahoo/Google revenue split of 70%/30% to 90%/10%. While Yahoo has a lot of clout due to the size and quality of its search traffic, 70/30 seems on the low end while 90/10 seems on the high end.

Based on the agreements I have been privy to, I would speculate that the revenue split between Yahoo and Google is on the order of 80%/20%. With just a hypothetical 22% premium on Google search ads, Yahoo essentially breaks even on this profit maximization strategy. The only way Yahoo makes money from this deal is if it uses Google PPC ads to supplement its own, not replace its own.
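A quick worked check of that break-even claim, using the hypothetical 80/20 split and 22% premium discussed above (both assumptions, not figures from the actual agreement):

```python
# Hypothetical break-even arithmetic for the Yahoo/Google backfill split.
# Assumptions (not from the agreement): Google CPCs average 22% above
# Yahoo's, and Yahoo keeps 80% of the revenue on a backfilled Google ad.
google_premium = 1.22   # Google CPC as a multiple of Yahoo's own CPC
yahoo_share = 0.80      # Yahoo's share of revenue on a backfilled ad
effective = google_premium * yahoo_share  # vs. 1.0 for serving its own ad
print(f"Yahoo nets {effective:.1%} of what its own ad would have earned")
# ~97.6%: replacing its own ads with Google's roughly breaks even
```

Only at a split well above 80/20, or a premium well above 22%, would wholesale replacement pay off, which is why supplementing rather than replacing is the profitable strategy.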

There is one other area of concern that none of the above addresses, and that is the fear of the concentration of power in the hands of just two players of over 80% of the search advertising on the Internet. This advertising is extremely desirable and likely to continue growing strongly in the future. The idea of just two players controlling so much of this market has to be unsettling, despite any demurrals by the principals and admonishments to "do no evil" from Google. This is an issue that cannot be addressed by the facts of the deal. Internet advertising technology is likely to continue to change rapidly and the partnership between Google and Yahoo may grow tighter. If it is any consolation, historically such deals have usually ended with the partners at odds with each other, rather than closer to each other. The interests of the two parties eventually diverge so that they cannot sustain the relationship. The Internet is the most dynamic environment, technologically and business-wise, we have ever created. What seems like a good idea and a threat today can just as quickly go sour tomorrow.

Google now provides its advertisers control over which of its affiliate networks, even down to the particular web site, they want their ads to appear on. I would hope that Google extends this control to its advertisers for the Yahoo partnership. If advertisers fear being charged too much when their ads appear as backfill on Yahoo, or don't like the resulting ROI, they should be able to opt out of having their ads appear on Yahoo. It is something advertisers can already do on Google anyway, and it should ease concerns over rising advertising prices for those who worry, even theoretically, about such things.

Wednesday, September 17, 2008

Total retail sales and total e-commerce sales have been leveling off, since Q2 2006 for straight retail and since Q4 2007 for e-commerce. So the e-commerce slowdown has lagged the retail sales slowdown by six quarters, but it is finally here. Q1 2008 e-commerce retail sales were up only 0.3% from the previous quarter, and retail sales were up only 0.2% quarter over quarter. In Q2 2008 total retail sales were up just 0.9% from Q1 2008, but e-commerce sales did better, up 2.9% from Q1 2008. Still, e-commerce sales growth has sunk to its lowest level since the recession of 2001, when sales fell 0.5% in Q2 2001.

The Commerce Department reported that retail sales in August were weak across the board, falling 0.3% from July. This despite lower gas prices, which left more discretionary income in consumers' pockets. The continued crisis in financial markets, the housing bust, disappearing credit, and higher unemployment with attendant uncertainties continue to be a drag on consumers' spirits.

Tuesday, September 16, 2008

Leica, always very conservative in its design approach, was very late to the digital camera market. Leica even resisted incorporating a light meter into its film cameras until the M6 was introduced in 1986, over a decade behind its Japanese competitors. Despite this, the quality and feel of its cameras have created a devoted and fanatical following. Leica introduced its first digital rangefinder, the M8, in 2006, at a high price compared to other digital cameras: $4,500 then, $5,400 now. The M8's successor, the M8.2, was announced in September 2008 at $6,500. The main new features were resized finder lines for more accurate framing, a "point and shoot" mode, and a quieter shutter. These mean something to some rangefinder aficionados, but on the whole they are a disappointment after a nearly two-year wait for the successor to the M8.

There is always a lot of speculation on Internet photography forums about Leica's finances, which are hard to come by because the company was taken private by Mr. Andreas Kaufmann, the current acting CEO, in 2005. However, Mr. Kaufmann owns just 96.5% of Leica Camera; the remainder is thinly traded on the Frankfurt Exchange. An article in today's WSJ reported on Leica's finances, something worth keeping for future reference on Leica's business.

Annual revenue for its last fiscal year was 150 million euros, or $213 million. Sales for its first fiscal quarter, ended June 30, were 26.999 million euros, less than half of the previous year's first quarter. The company had a loss of 3.85 million euros for FY 2008, ended March 2008. It anticipates a loss approaching 10 million euros for the fiscal year ending March 2009. Mr. Kaufmann estimates that sales have to grow by about 66 percent, to 250 million euros, to finance the R&D spending Leica needs to stay competitive in digital markets.

It is hard to see this happening based on the new products and prices Leica has announced. Without the money to finance R&D not enough new products can be introduced to reverse Leica's death spiral. This is why I believe we see such a disappointing set of features in the M8.2: not enough money to invest in anything more spectacular. And the fall in Leica's quarterly revenue shows that it is not immune to weakening retail markets around the world, nor that its cachet is enough to carry it through lean times.

Because of the short back-focus distance of the rangefinder camera design, Leica needs a custom digital sensor for its digital rangefinder camera. This must be quite a large R&D expense for a camera maker as small as Leica, with its small unit volume. The current sensor has a 1.33x crop factor, as it is referred to in digital circles, meaning a lens made for a standard 35mm camera behaves as if its focal length were 1.33x longer. This upsets a lot of photographers, especially the film-loving Leica aficionados. But creating a new "full frame" sensor, as it is called, with a 1.0x crop factor, is a tall order for Leica, both because of the technical challenge (it might not be possible yet) and because of the high R&D cost of creating such a beast that would sell in small volumes. The resulting camera price might be prohibitive even for Leica collectors who are used to nose-bleed prices.
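A minimal illustration of what the 1.33x crop factor means in practice (the focal lengths here are examples, not a list of Leica products):

```python
# Effective field of view under the M8's 1.33x crop factor:
# multiply the lens's focal length by the crop factor to get
# the focal length that would frame the same view on 35mm film.
crop = 1.33
for focal_mm in (21, 28, 35, 50):
    print(f"A {focal_mm}mm lens frames like a "
          f"{focal_mm * crop:.1f}mm lens on 35mm film")
```

So a classic 35mm "street" lens frames like a short telephoto, which is exactly why Leica must sell new, shorter lenses to crop-sensor buyers.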

There may also be a business case for Leica not to make a full-frame sensor. The 1.33x crop factor sensor requires new, shorter focal length lenses to give the same field of view as the older lenses designed for 35mm film. Leica is introducing a whole slew of these new lenses at astonishingly high prices. They are also designs that have never been seen before, such as very fast and very wide lenses (21mm @ f/1.4). One of Leica's big problems is the cannibalization of its sales by its own older equipment, some of it 50+ years old, that is still perfectly usable on its cameras. But no such older lenses exist for the digital M8's cropped sensor. Hence, if buyers want them they have to get them new from Leica - they can't turn to the used market to buy them cheaper there. For this reason alone, I believe that it will be a long time before Leica introduces a full frame M8 successor. It serves Leica's business interests better to keep the 1.33x crop factor M8 and keep making money selling lenses than to take the one-time revenue from a full frame camera body that would rekindle the cannibalization of its new lenses by old ones.

Wednesday, September 10, 2008

Major movie theater chains and independent movie houses are trying to make the jump to 3D pictures to entice couch potatoes out of their 50-inch-screen, THX-powered homes. The leader in this technology area is RealD Corp. of Beverly Hills, California. To run 3D pictures, a theater first has to upgrade to digital projection, then add the RealD 3D system on top of it.

Major theater chains such as Regal Entertainment Group and Cinemark USA Inc. have signed deals to outfit 1,500 screens with RealD 3D projection technology. The Cinema Buying Group LLC, which represents 643 small theater owners in the US and Canada, has said it plans to add 3D equipment to 1,000 of the 8,000 screens it represents.

Digital 3D Theater Factoids

Cost to upgrade reel-based projector to digital projector: $50,000 - $70,000 per screen
Cost to license RealD 3D technology: $20,000 per screen over 10 years
Theaters with RealD screens in North America: 1,200
Theaters with RealD screens in North America by the end of 2008: 3,200

Premium charged for 3D ticket: $2 - $5

Movie tickets sold in US in 2002: 1.6 billion
Movie tickets sold in US in 2007: 1.4 billion
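The factoids above invite a back-of-the-envelope payback estimate. The midpoint figures and the break-even framing below are my own hypothetical assumptions, not numbers from the article:

```python
# Hypothetical break-even sketch using the factoid figures above.
upgrade_cost = 60_000   # midpoint of the $50,000 - $70,000 digital upgrade
license_cost = 20_000   # RealD license per screen over 10 years
premium = 3.50          # midpoint of the $2 - $5 3D ticket premium

# Premium admissions needed for one screen to recoup the equipment outlay:
breakeven_tickets = (upgrade_cost + license_cost) / premium
print(round(breakeven_tickets))  # 22857
```

Roughly 23,000 premium admissions per screen, spread over the life of the equipment, which helps explain why only a fraction of screens are being converted.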

Bab el-Mandab, the strait connecting the Red Sea with the Gulf of Aden and the Indian Ocean, has come under attack by pirate vessels. Ships have been captured and their crews held for ransom. This threatens some 3.3 million barrels of crude per day, or 3.9% of daily world supply, that move through the strait headed for the Suez Canal.

The Congressional Budget Office (CBO) has issued its September 2008 budget and economic update. The CBO estimates that the deficit in 2008 will be over twice the shortfall of 2007, rising from $161 billion in 2007 (or 1.2% of GDP) to $407 billion in 2008 (about 2.9 percent of GDP). The CBO says:

The significant expansion in the deficit is the result of a substantial increase in spending and a halt in revenue growth. In 2008, CBO estimates, federal spending will be 8.3 percent higher than it was in 2007; at the same time, total revenues will be less than they were in 2007.

The only time in recent history the Federal budget has run a surplus was for four years (1998 - 2001) during the Clinton administration. Although the economy has soured due to stagnating wages, layoffs, the credit crunch, and the housing debacle, the Federal deficit is nowhere near as bad a shape as it was during the stagflation era of the '70s and early '80s.

We will have to wait and see about the CBO's prognostications. As Yogi Berra said, "Prediction is very hard, especially about the future."

Defense outlays comprise nearly 66 percent of Federal discretionary outlays. It costs a lot to be the policeman of the world:

Contributing to the run-up in the record price of oil, the CBO report shows that demand for oil has far outpaced supply since 2005, with year-over-year demand growth outstripping supply growth over the same period.

Data shows that the great oil price run-up of the '00s, not seen since the early 1980s, dates from the beginning of this mismatch in oil supplies and oil demand. Tell me again why Congress needed to hold hearings on this issue? Couldn't they just have read their own report?

Monday, September 8, 2008

In 1770 the then 14-year-old Wolfgang Amadeus Mozart came to Rome from Salzburg with his father during Holy Week. He attended concerts of the Sistine Chapel Choir, whose signature masterpiece was Allegri's Miserere, sung on only three days during Holy Week, a piece so sublime and coveted, with its soaring voices, that the Holy See forbade publishing the music, lest it fall into hands other than those of the Sistine Choir. Mozart heard it once, maybe twice, went back to his hotel room and wrote it down from memory.

Word got out that the music was now available, and the guards of the Holy See showed up at Mozart's hotel room demanding to know how he had gotten hold of it. When he told them, the guards didn't believe him, so he sat down and wrote out the first twelve measures again from memory. They finally believed him. The secret was out. And so began one of the earliest chapters of musical piracy.

What once required a once-in-a-millennium genius can now be done by anyone. This is how technology becomes the great leveler.

Cost of shipping a standard 40-foot container of goods from China to the US:

2000 -- $3,000
2008 -- ~$8,000

Increase: roughly 167% over eight years (costs nearly tripled), caused largely by the rise in the price of oil from around $20/barrel to triple digits today. The result has been a dimming of manufacturers' appetite for outsourcing production to China and other Asian countries. The price of oil is making the world round, again.
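The percentage arithmetic is worth checking, since a jump from $3,000 to $8,000 is often misread as a "266% increase" when 2.66x the old price is actually a 166.7% increase:

```python
def percent_increase(old: float, new: float) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

cost_2000, cost_2008 = 3_000, 8_000
print(round(percent_increase(cost_2000, cost_2008), 1))  # 166.7
print(round(cost_2008 / cost_2000, 2))                   # 2.67 (a ratio, not the increase)
```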

Friday, September 5, 2008

The Newspaper Association of America (NAA) today reported Q2 2008 advertising expenditures for newspapers. The results were not good. Total print expenditures were down 16.07% to $8.8 billion from $10.5 billion in Q2 2007. Total online expenditures, after showing declining growth for the past 5 quarters, shrank for the first time year over year, from $795 million in Q2 2007 to $776 million in Q2 2008, a growth rate of -2.4 percent. The combined year-over-year growth rate for newspaper advertising expenditures is now -18.47%.

Classified advertising expenditures continued showing the greatest loss, down 27% from Q2 2007. Classified advertising has shown the deepest losses for newspapers starting in Q2 2006. Retail and national advertising expenditures were down 9.56% and 13.90%, respectively. The Internet has done a job on newspaper classified advertising, with sites like eBay (revenue model: auction) and Craigslist (revenue model: free) having taken a big bite out of this important newspaper advertising category. The undeclared recession of 2007-2008 has also not been kind to advertising expenditures.

Wednesday, September 3, 2008

Google's introduction of its new browser "Chrome" is the first serious challenge to Microsoft's Internet Explorer since the end of the Netscape browser wars in, what was it, 1999? The culmination of those wars was ultimately the break-up and sale of Netscape: the software division going to Sun Microsystems, the Netscape browser and Netscape portal going to AOL and eventually AOL Time Warner, and the creation of the Mozilla Foundation to shepherd the open-source version of the next-generation Netscape browser. Of all the outcomes, perhaps only the Mozilla Foundation's Firefox browser has had any staying power, now representing almost 20 percent of the browser market.

Today Internet Explorer has the dominant market position, with about 72% share of the desktop browser market. There seems little doubt that Microsoft's bundling of its browser with its operating system produced this dominant position. Microsoft was taken to court for monopolistic behavior, but any outcome was moot as to the destiny of the desktop browser; the wheels of justice grind too slowly for Internet time. Google's introduction of a new browser will have to take this into account. It seems hard to imagine that Google can gain a market share greater than that of Firefox, the most popular alternative browser today, with almost 20% market share.

That acknowledged, one of the reasons given for Google's development of a browser, to be free of Microsoft control, seems difficult to believe. All browsers are pretty much compatible with one another. They have to be, since no one browser is used to the exclusion of the others, no matter how dominant one of them has become. The Internet really is, pretty much, application independent. So for Google to go its own way and create any large incompatibility between Chrome and IE would be difficult to imagine. On the other hand, being the proprietor of its own browser does give Google the opportunity to have a large say and stake in discussions around browser standards. Perhaps it hopes to be able to throw its weight around in such venues. But with browsers, as with other software, the de facto reality of the marketplace can count more than anything else, and even a 20 percent share would be hard-pressed to move the market. It seems to me that Microsoft has been in the position of setting browser standards since it vanquished Netscape. Google will need all of its mojo if it hopes to get more than Firefox's 20% share and create de facto standards on the desktop.

The best time to have been on the minimum wage was the mid-'60s, when the minimum wage had its greatest buying power - $10.11 in 2008 inflation-adjusted dollars. Since the mid-'60s, the inflation-adjusted value of the minimum wage has been dropping steadily, with an uptick in 2008 from its legislated increase to $6.55. Another increase, to $7.25, is scheduled for 2009, but that still keeps its buying power far below historical levels.

Minimum Wage

1938 -- $0.25
2008 -- $6.55
2009 -- $7.25

Highest value of the minimum wage, in 2008 inflation-adjusted dollars, was in 1968
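The inflation adjustment behind these figures is a simple CPI ratio. The sketch below uses approximate annual-average CPI-U levels as illustrative stand-ins (not figures from the post), applied to the 1968 minimum wage of $1.60:

```python
def in_2008_dollars(nominal: float, cpi_then: float, cpi_2008: float) -> float:
    """Convert a historical dollar amount to 2008 purchasing power via the CPI ratio."""
    return nominal * (cpi_2008 / cpi_then)

# 1968 minimum wage of $1.60, with assumed CPI levels (1968 ~34.8, 2008 ~215.3):
print(round(in_2008_dollars(1.60, 34.8, 215.3), 2))  # roughly $9.90
```

Small differences in which CPI series and which months are used explain why published estimates of the 1968 peak vary around the $10 mark.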

Thursday, August 28, 2008

This graph was used by Louis Uchitelle of the New York Times to illustrate the "missing productivity" that the information revolution was supposed to bring to business. It cleverly shows the increase in productivity in the US economy from 1870 to the mid-'90s, with major technological revolutions highlighted along the way. The Internet Revolution was just getting underway when Mr. Uchitelle first penned his thoughts; it would later make visible the pent-up productivity that the computer was about to unleash.

Growth of the US economy 1870 - 1995 tied to major industrial innovations

Here is one of Mr. Uchitelle's essays on the subject from December 1996.

By LOUIS UCHITELLE

At the end of the 19th century, railroads and electric motors were expected to transform America, making a young industrial economy far more productive than any seen before. And they did.

At the end of the 20th century, computers were supposed to perform the same miracle. They haven't.

Computers do wonderful things. But in purely economic terms, their contribution has been less than a transforming force: they have failed to bring back the strong growth that characterized so many decades of the American Century. By that standard, they have been a disappointment.

"It is a pipe dream to think that computers will lead us back to a promised land," said Alan Krueger, a Princeton University economist.

The issue is productivity. Those who look to computers for economic miracles, and there are many, insist that measuring their contribution only in dollars misses the less tangible improvement in quality that computers have made possible. But quality is often in the eyes of the beholders rather than in their wallets.

Through decades of invention and change, productivity has been measured as the amount of "output," in dollars, that comes from an hour of labor.

A worker who makes 100 pencils in an hour, each valued at 50 cents, produces $50 of output. And the more output from each of the nation's workers, the greater the national wealth.

Or, put more broadly, productivity is the amount of output in dollars that comes from various "inputs," not only a worker's labor, but the tools he or she uses to carry out that labor: a machine or a computer or a wrench or an air conditioner that makes work more comfortable in summer.
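The productivity measure Mr. Uchitelle describes reduces to a simple ratio, which his pencil example illustrates. A minimal sketch:

```python
def labor_productivity(units: int, price_per_unit: float, hours: float) -> float:
    """Dollar output produced per hour of labor."""
    return units * price_per_unit / hours

# 100 pencils at 50 cents each, made in one hour -> $50 of output per hour:
print(labor_productivity(100, 0.50, 1.0))  # 50.0
```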

People work faster or concentrate better, and that shows up quickly in tangible output.

By this definition, the output resulting from the computer revolution of the last 25 years has been disappointing.

Computers have, of course, contributed to productivity and economic growth. But that contribution has failed to register in government statistics as the kind of robust catalyst that made the 1950s and 1960s such prosperous years.

If computers have fallen short of expectations, that would help explain an apparent paradox that has puzzled economists and policy makers for two decades: how rapid technological progress and a booming stock market took place during a period of sluggish economic performance -- sluggish, that is, relative to earlier decades.

One possibility is that the statistics are wrong. A panel of economists came to this conclusion in a report to Congress last week, suggesting that growth has actually been quite robust but that this fact has been obscured by overstating the amount of output lost to inflation.

This happened, the panel hinted, partly because the beneficial economic role of computers was not correctly taken into account. Some price increases that registered as inflation should really have registered as increases in output from computers.

But there is another explanation. Perhaps the computer is one of those inventions, like the light bulb early in the century, that makes life much better without adding as much to tangible national wealth as appearances might suggest.

That is because, while the light bulb allowed factories to operate night shifts and students to study more easily, the measurable result was less impressive than the great improvement in the quality of life that the electric light bulb made possible.

Given the computer's ubiquity and convenience, should the calculation of productivity and wealth be changed to give more dollar value to the conveniences the computer has wrought?

That kind of recalculation has not been done over generations of technological change, largely because convenience is too hard to quantify and translate into dollars. Too often, convenience increases consumption more than production. With computers, "most of the recent use has been on the consumption side," said Zvi Griliches, a Harvard economist. "The time you waste surfing the Internet is not an output."

Others take a broader view. Children using home computers for schoolwork -- gathering data from the Internet, for example -- become better students, they say.

In time, that will translate into rising workplace skills and greater measurable output. But it hasn't yet, and standard practice dictates that the nation wait until it shows up in the numbers before proclaiming the computer's great contribution to productivity.

"People have high expectations of this happening overnight," said Nathan Rosenberg, an economic historian at Stanford University. "Computers are a major innovation, but absorbing so great an innovation involves many changes in work practices and behavior."

Right now, much of a personal computer's power goes untapped, or is employed in low-output tasks like sending and sorting through junk E-mail, compiling electronic Rolodexes and playing solitaire in the office.

Harnessing a computer's spectacular ability to deliver and manipulate information is not easy. Edward McKelvey, a senior economist at Goldman Sachs, offers a hypothetical illustration:

A consultant who charged $50 an hour 10 years ago to forecast trends in the economy now has a powerful desktop computer at his fingertips, feeding him information that in theory should make his forecasts more accurate. But he still charges clients $50 an hour because the forecasts, despite the computer, are not more accurate.

Perhaps the consultant might never get that good at forecasting, even with a computer, or perhaps he will become so adept at extracting data from its depths that his forecasts will begin to hit the bull's eye. And that accuracy would allow him to raise his hourly fee, or "output," to $70 an hour, a handsome improvement in his productivity.

There are other problems. The automated teller machine, for example, illustrates how measurable productivity has failed to respond fully to computer investment. A half-dozen machines installed in a bank's lobby permit the bank to cut its teller staff by half. That is clearly measurable productivity.

The bank's income, or output, from bank transactions remains unchanged, but the input in teller hours goes down. The idled tellers can shift to other income-producing activities, perhaps becoming loan officers.
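The ATM illustration can be put in the same output-over-input terms. The dollar figures below are hypothetical, chosen only to show the mechanism the article describes:

```python
def productivity(output_dollars: float, input_hours: float) -> float:
    """Measured productivity: dollar output per hour of input."""
    return output_dollars / input_hours

# Same transaction output, but ATMs let the bank halve its teller hours:
print(productivity(10_000, 100))  # 100.0 dollars per hour, before ATMs
print(productivity(10_000, 50))   # 200.0 dollars per hour, after -> doubled
```

The catch, as the article goes on to say, is that the next round of ATMs adds convenience without cutting hours further, so measured productivity stalls.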

To make the productivity rate continue rising, however, the bank must continue cutting teller hours as it installs more ATMs. Instead, the next machines go to a dozen outlying neighborhoods, so that customers can bank at odd hours, almost at their doorsteps, or verify the balances in their checking accounts, something they did not bother to do very often before ATMs.

That is convenience. Most banks don't charge extra fees for this convenience. If they had no neighborhood ATMs, then customers would have found themselves forced to use the machines already installed in the lobbies of their banks.

"The question is, how much would you have been willing to pay in fees for the convenience of having that neighborhood ATM if the banks refused to furnish them otherwise?" said Erich Brynjolfsson, an economist at the Massachusetts Institute of Technology's Sloan School of Business. "That would then enter into measurable output."

Through a survey, Brynjolfsson tried to calculate what additional amounts Americans would pay for hundreds of conveniences that computers make possible. He came up with a total of $70 billion in additional output.

That would add only one-tenth of one percent to the national wealth, which is the value of all the goods and services produced in the United States in a year -- hardly enough to get economic growth back to the rates (at least 3 percent a year) that were characteristic of the 1950s and 1960s.

Still, computers and software in all their various forms make an important contribution. The national wealth -- also known as the gross domestic product -- has risen at an annual rate of less than 2.5 percent, on average, in recent years.

That includes a contribution of roughly four-tenths of a percentage point from computers and their trappings, according to the calculations of two Federal Reserve economists, Stephen D. Oliner and Daniel E. Sichel. Manufacturing and the telecommunications industry have benefited especially from computerization.

But why haven't computers lifted the overall economy the rest of the way back to 3 percent growth? One reason is that they represent only 2 percent of the nation's capital stock, which is all the existing machinery, equipment, factories and buildings that business uses to produce goods and services.

By comparison, railroads in their heyday represented more than 12 percent. And they became the tool for opening up frontier lands to agriculture, and to new cities and industries.

At the same time, electric motors, replacing steam, gave the nation a much more flexible and efficient source of power, and made possible the assembly line. The output resulting from railroads and electric motors became enormous.

Perhaps there is some set of conditions, having no direct connection to computers, that must develop before American productivity and economic growth can return to the old levels -- conditions like greater demand for the potential output from computers, or hegemony again in the global economy.