I was mulling over some conversations and circumstances from the last few days this morning, and decided to put my thoughts into a post titled “Why Telecom Sucks.” Telecom sucks because the companies in it are dominated by two different motivations, both of which suck for the customer. The incumbent phone and cable companies’ core motivation is to leverage their monopolistic power to extract as much profit from existing investments as possible, and never, ever exit. Technical innovation in these companies usually happens only in pursuit of new monopolies, not new customers. Customer experience is treated as nothing more than a necessary expense.

On the other hand, competitive telecom is almost always driven by the exit strategy. Price is the main differentiator, and the main motivation to acquire customers is for the purpose of an exit. You don’t need to hold onto that customer for life, you need to hold onto him just long enough for someone to come in and pay you more for that customer than you paid to get him. Technical innovation and customer experience need to be slightly better than the incumbent’s, but only just far enough to get that customer to sign up. Even mature, publicly held competitive telecom companies continue to think this way well past the exit of the people that built the business…long-term investors like Warren Buffett are nowhere near these boardrooms (although he has made one exception).

What has developed is a universal expectation from any customer that does enough business with a telecom company: the telecom company will say it is great, but break that commitment early and often, and that is just the nature of telecom…the best you can get is a provider that sucks less. When a company puts on the hat of “I want to be good enough that this customer chooses to pay me for the rest of his life” it changes everything. For both the company and the customer.

The National Cable & Telecommunications Association (NCTA), the cable television industry’s top lobbying group, announced recently that Michael Powell, Chairman of the FCC from 2001 to 2005, had been hired as CEO of the organization. My initial reaction was a cynical “hmmph” that another public servant in charge of regulating an industry was going to work for the companies he had been charged with regulating on behalf of consumers. It happens all the time, right? But Michael Powell was not just an average bureaucrat that quietly

Former FCC Chair Michael Powell

collected a paycheck for a few years before heading back off into private industry. Powell successfully led the most significant effort to destroy telecommunications and broadband competition in the U.S. in modern history through the FCC’s Triennial Review Order in 2003 and subsequent Triennial Review and Remand Order in 2004, essentially cementing the cable/Bell Telephone duopoly that controls the residential telecom market in the U.S. It wouldn’t have been a big deal if he had gone directly into the group when he left the FCC in 2005, because history hadn’t yet separated the spin on his actions from reality. Now that we have seven years of history to look at, seeing Michael Powell going to work for the NCTA looks a lot like seeing Sheriff Rosco P. Coltrane report in to the nefarious Boss Hogg after a day spent harassing them good ol’ Dukes of Hazzard.

According to the FCC’s site “The competitive framework for communications services should foster innovation and offer consumers reliable, meaningful choice in affordable services.” Powell’s FCC released “deregulations” that rolled back hugely significant portions of the local competition provisions of the Telecommunications Act of 1996 under the guise of removing the barriers that were preventing

Boss Hogg

the local Bell telephone monopolies (Verizon and ATT here in California) from making large investments in new, competitive, innovative services. The idea pitched by Powell and the other two Republican appointees on the FCC was that if we removed the pesky competitors from the equation, the Bells would be freed from providing discounted wholesale services and would go on building and hiring sprees. Among the most significant deregulations:

Regulatory Change: Forbid competitive DSL carriers from splitting a customer’s existing phone line to provide DSL while the phone company continued to provide phone service.

Effects: This forced competitive providers to purchase an entirely separate phone line to provide DSL, artificially inflating the cost. The competitive broadband providers attempting to provide residential service stopped providing their own DSL circuits to the customer premises and were forced to buy the Bell company’s ADSL lines at essentially the same price the Bell company charged consumers directly, and then provide their services across them. A few brave souls stuck it out on a regional basis and continued to provide residential DSL services under that mechanism, but the change effectively reduced the field to the cable monopoly and the Bell monopoly in the U.S. Oh, and that massive network build-out we were told would happen? Residential DSL availability and speeds are pretty close to exactly the same as they were in 2005.

Regulatory Change: Allowed the incumbent Bell telephone companies to stop selling switched access lines to competitors. The original rule was designed to offset the 100-year advantage the original monopolies had and allow competitive carriers to resell complete telephone services from the incumbent monopoly until they built big enough customer bases to justify purchasing their own switching.

Effects: According to a fascinating presentation by David Brevitz, in June 2004 there were 17.1 million resold switched access lines (known as UNE-P). That’s a small chunk of the U.S. market, but it was getting into some real numbers that would almost signify competition. ATT and Worldcom, both long distance companies, were the largest providers of UNE-P in 2004. Powell’s theory was that they had had enough of a chance to gain market share and would suck it up and install their own switches if he pulled the rug out from under them (more of the promised building spree the rule changes were going to create). Both companies were forced to abandon providing residential local telephone service, and since the Bells had been allowed to get into the long distance business and compete for their customers, the two long distance companies were suddenly left with dim prospects for survival. ATT was subsequently purchased by the incumbent Bell company SBC, with the overall company taking the ATT name. Worldcom (MCI) was acquired by incumbent Bell company Verizon. And just like that, the two biggest competitors to the local Bell monopolies were gone. No building spree, no enhanced competition, just round after round, year after year of layoffs.

Regulatory Change: Prevented competitors from being able to provide services across the incumbent phone company’s fiber to the home.

Effects: You already know this one. You don’t have fiber (other than what’s in your cereal), and you aren’t going to get it any time soon. If you are one of the few that does have a fiber option, the chances of you having an option to buy it from a company other than one of Powell’s duopolies is somewhere between slim and none. In fact, odds are you’re paying a portion of Powell’s salary now, just like the good old days when he was running the FCC.

This time around, Rosco P. Coltrane caught those Dukes, locked ‘em up and threw away the key. Dissenting Democratic Commissioner Michael Copps’ statement in 2004 has turned out to be amazingly prescient:

Sheriff Rosco P. Coltrane

“What we have in front of us effectively dismantles wireline competition. Brick-by-brick, this process has been underway for some time. But today’s Order accomplishes the same feat with all the grace and finality of a wrecking ball. No amount of rhetoric about judicially sustainable rules and economically efficient competitors can hide the blockbuster job this Commission has done on competition. During its tenure, the largest long distance carriers have abandoned the residential market. And as a result of today’s decision, other carriers will follow suit. In their wake we will face bankruptcies, job losses and customer outages. Billions of dollars of investment capital will be stranded. And down the road consumers will face less competition, higher rates and fewer service choices.”

I migrated to the ultra-convenience of e-books when I received a Kindle for Christmas of ’08 and progressed from carrying around the Kindle to reading solely on my iPhone for over a year. Always having a few books with you without having to actually carry books is really an amazing thing, and the convenience of being able to shop for, purchase and be reading a new book in 2 minutes from bed on a lazy Saturday morning is magical. I didn’t have much of a problem reading for hours on the iPhone’s 2″x4″ screen; it’s not ideal but it’s a reasonable trade-off for the benefits of always having your personal library with you. After I got an iPad last Christmas and downloaded the Kindle app, my Kindle experience was just about perfect (ironically, without an actual Kindle). My remaining problem, though, is that I read a lot of fiction and I don’t see any reason to own most fiction books after I read them once. My pre-Kindle routine was to head to the library, check over the new books, usually winding up with a couple, and then browse the shelves and pull a few more books. I usually wouldn’t read every book I borrowed, but I’d use the library’s web site to renew a few times and generally get through 2-3 books every month. Now that I was buying every book through Amazon I became much more targeted with my reading, and since I no longer had the stack of books with the ticking clock of the library loan sitting on my nightstand, my tongue-in-cheek summary of my Kindle experience is that now I read less and spend more money on books.

I recently discovered that my local library system was now lending ebooks using Adobe’s Digital Rights Management platform, and after a few Google searches I learned that Bluefire made an iPhone and iPad ebook reader that was compatible with Adobe DRM. I set about downloading it and going through the steps of getting it to work, and quickly discovered again why the Kindle and the accompanying Amazon Kindle store have been so successful: it’s a lot of work to get a book onto your iPhone compared to the Kindle. With Bluefire, first you have to get the app, then you have to open an Adobe account, then download the book to a computer running both Adobe Digital Editions and iTunes, move the file into iTunes and then sync it to your iPhone or iPad to get the file transferred. The quality of the reader itself is great, but since the book sits locally on the reader, beyond the complications of getting your ebook onto the device there is no syncing between devices, so when you switch between your iPhone and iPad you have to manually catch up to your latest page. It is a way to accomplish reading library books on your iPad, but the complicated process made it only complementary to my Kindle app and not a replacement. There is also a much larger problem with transitioning to primarily reading library books on your iPad that I will discuss later in this post.

In February I learned that Overdrive, the company whose software powers my library’s ebook system, had released a reader for the iPhone and iPad that could natively handle the browsing and checkout process from the library’s site and download the book directly to my device. Their first iteration of the software simply stretched the iPhone reader display to the size of the iPad at the same resolution, making it very hard on the eyes, but in March they released an update with native iPad support that solved that problem. So now I have the convenience of getting library books directly to my device on a lazy Saturday morning, and technically the only thing I’m giving up is automatic syncing between devices, so all should be well… but it’s not.

The biggest problem now is that finding available books at the library is an absolute bear. In my library system as of today there are approximately 2100 EPUB books in inventory, but only about half of them are available for checkout. Finding available titles for popular authors is nearly impossible. For example, there are 5 Michael Connelly books in inventory, but all are checked out. There are 10 Lee Child titles but none are available. You can place a hold on an unavailable book and the system will notify you when it’s available and hold it for a period of time for you to check it out, but that process runs counter to the ultra-convenience I want from eBooks.

My overall conclusion is that Amazon has done a phenomenal job with their Kindle system, and that while the Overdrive Media Console app almost replicates the Kindle experience, the scarcity of eBook content from the public library is a big enough hurdle to keep me using my Kindle app on my iPad. Except for the occasions I feel up to battling it out with my fellow digital library patrons for the right to read about Jack Reacher’s oh-so-satisfying vigilante justice for free.

The results of the latest quarter’s YPO GlobalPulse survey of economic sentiment were released to the public last week, and CEO confidence in the U.S. rose in the fourth quarter to its highest level since YPO began measuring CEO sentiment in July 2009. Globally, confidence either rose or remained high in every region. The U.S., European Union and Australasia regions were the only three regions below the average, and the E.U.’s score was the lowest among all regions, with Greece, Ireland, Spain and Portugal weighing on the region’s score. Asia, the Middle East and North Africa (MENA) region, and Latin America were the most optimistic regions.

CEO confidence in the U.S. was elevated by rising expectations about sales, hiring and capital spending.
All three of the graphs to the right reflect a dip in confidence in the second quarter and modest increases thereafter, but it’s important to note that in the first survey done in July of ’09 all three of these were at or below 50, and in the January 2010 results they were all at least 3 points below the April 2010 results. So when you look further back you see a long steady climb with a hiccup in the second quarter of 2010 that appears to be solidly in the rear view mirror now. The fact that the market’s reaction to the Egypt crisis was limited to a single day provides some anecdotal evidence that we’ll continue to see this progression in sentiment in the next set of results and the “ripples” I discussed in last quarter’s post have flattened out in a healthy way.

Stephen Slifer, Chief Economist of Numbernomics, former Chief U.S. Economist for Lehman Brothers in New York City from 1980 until his retirement in 2003, and former senior economist at the Board of Governors of the Federal Reserve in Washington, D.C., presented his view of the current state of the economy on a conference call to discuss the GlobalPulse results.

Source: Numbernomics.com

Mr. Slifer’s statistics supported the rise in CEO sentiment: he expects GDP growth of 4.3% in 2011, inflation at 1.7%, the unemployment rate getting down to 8.4% and the Fed funds rate ending the year at a mere 0.13% as the Fed waits for substantial improvement in the employment picture before making any significant adjustments. Consumers have reduced their debt and employment is picking up (although there is still some slack in hiring to work through because of the increases in productivity we continue to see), which will unleash the consumer to start spending.

Source: Numbernomics.com

Cash sitting in corporate coffers is at unprecedented levels and needs to be put to work, and likely will be as confidence in the economy increases. According to Mr. Slifer, “Once the Fed starts to tighten they will need to go a long way. If the Fed wants GDP growth of 3.0% and inflation of 2.0%, then a neutral funds rate would be about 5.0% and it will take the Fed about two years to get there,” supporting his belief that the chances of another recession prior to 2015 at the earliest are extremely remote.

Visit www.numbernomics.com to see a host of other slides that paint a complete picture of Stephen Slifer’s view, and www.ypo.com for more GlobalPulse results.

I lamented in a previous post that it appeared we were going to forgo a vitally necessary green bubble in pursuit of a second dot-com bubble. Rest assured though that Gordon Gekko is alive and well and not just focused on re-living 1999.

Gevo (NASDAQ: GEVO), a biofuels technology company, went public last week, and according to their prospectus they sold 29% of their shares for about $120M. They had under $2M in revenue for the first nine months of 2010, and acquired an ethanol plant in September 2010 that did $32M in ethanol sales over the same period. They intend to convert the ethanol plant to an isobutanol plant and continue to sell ethanol until the plant is converted.

They have some great intellectual property and an exciting story to tell, although they are losing money and the prospects of making the kind of profits that would justify a $400M valuation are uncertain at best. This is a company that would not have been able to use the public markets to fund their venture in past years, so I wish them and their investors well and hope they go on to change the world.
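That $400M figure follows from the IPO terms above: if 29% of the company sold for about $120M, the whole company is implicitly worth a bit over $400M. A quick sketch of the back-of-the-envelope arithmetic:

```python
proceeds = 120_000_000   # ~$120M raised for 29% of the shares (per the prospectus)
fraction_sold = 0.29

# Implied valuation of the whole company if 29% of it fetched $120M
implied_valuation = proceeds / fraction_sold
print(round(implied_valuation / 1e6))  # ~414 (in $M), i.e. roughly $400M
```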

Meanwhile, over in the tech sector, Pandora, the Internet radio service, filed their S-1, and although they don’t set an IPO price they do disclose a recent offer for the company that led them to set the value of their subsequent stock option grants at $3.14/share, after an unidentified party bought 2.5M shares from employees at that price. After the IPO there will be 150M shares outstanding, so it’s fair to assume they will be valued at no less than $450M. This against:

“Our revenue was $55.2 million and $90.1 million in fiscal 2010 and the nine months ended October 31, 2010, respectively. Our net loss was $16.8 million and $0.3 million in fiscal 2010 and the nine months ended October 31, 2010, respectively.”

Their revenue growth is phenomenal and they have definitely redefined streaming music – I am listening to Pandora radio instead of my own iTunes library as I write this. Prior to the share purchase by the third party they had valued their shares internally at a very reasonable $0.94 as recently as June 2010, so it appears that even they were surprised by what investors were willing to pay.
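The $450M floor is just the multiplication of the two figures above, rounded down a touch:

```python
shares_outstanding = 150_000_000  # post-IPO share count from the S-1
price_per_share = 3.14            # price the unidentified party paid employees

# Implied value if every share is worth what the last buyer paid
implied_value = shares_outstanding * price_per_share
print(round(implied_value / 1e6))  # 471 (in $M), comfortably above the $450M floor
```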

Netflix hit 20 million subscribers last year, and with reportedly over 60% of their subscribers streaming video over the Internet, the pressure they are putting on the Internet as a delivery system for television shows and movies has taken center stage lately. Netflix draws the most notoriety as the leader of the pack, but Hulu and even DirecTV, through its on-demand service delivered over your broadband connection, have been at it for years.

The heart of the problem is that a single movie viewed in a day can consume orders of magnitude more bandwidth than a traditional Internet user will consume in a day through typical Internet browsing, email and social network use. Consider this example graphing a day of “traditional” Internet use for a single home in which a total of 150 Megabytes of data is downloaded in short bursts of use:

Graph of Normal Internet Use: 150MB Downloaded

Then consider the graph with a Netflix movie on a Netflix rated “Medium Quality” connection where nearly 1500 Megabytes is consumed:

Graph of Internet Use with Netflix Movie on Medium Rated Connection: 1500MB Downloaded

And then the network operator’s nightmare customer that runs a webcam uploading a nearly constant 700 Kbps, and downloads Season 3 of Dora the Explorer on iTunes, for a whopping 10 Gigabytes of data transferred.
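The arithmetic behind these usage profiles is just sustained bitrate times duration. A minimal sketch, assuming a “Medium Quality” Netflix stream runs at roughly 1.6 Mbps for a two-hour movie (that bitrate is my assumption for illustration, not a published Netflix figure):

```python
def megabytes(kbps, hours):
    """Convert a sustained bitrate in kilobits/sec into megabytes transferred."""
    kilobits = kbps * 3600 * hours
    return kilobits / 8 / 1000  # kilobits -> kilobytes -> megabytes

movie_mb = megabytes(1600, hours=2)   # ~1440 MB, close to the ~1500 MB graphed above
webcam_mb = megabytes(700, hours=24)  # ~7560 MB uploaded per day by the webcam alone
```

The always-on webcam moves about 7.5 GB a day by itself, which is why a handful of customers like this can dominate a neighborhood node.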

Consumer Internet access in its current state is designed to support the usage in the first example, and there is generally enough flexibility and capacity to support a small proportion of heavy Internet users without slowing everybody down. A parallel example is the sewer system: only a few people in a particular neighborhood happen to flush their toilets at the same time, and the sewer pipes are generally big enough to support the anomalous times of heavy simultaneous usage like half-time during the Super Bowl. Now imagine if everyone in your neighborhood replaced their 1.6 gallon toilet tanks with 16 gallon tanks, with a few of the geekier neighbors dropping in 160 gallon tanks. There would be a whole lot of wet bathroom floors pretty quickly all over your neighborhood as the sewer system backed up. In the case of an Internet access network clogged up with video, we face the prospect of crawling speeds as computers on either end of the connection wait their turn to send packets through the network.

On the bright side, expanding the capacity of the Internet to accommodate video distribution is a lot more feasible than retrofitting all the sewer systems in the world. The big question is who will pay for it? The cost to deliver large amounts of one-way video has traditionally been paid for by the consumer in the form of a cable TV or satellite bill. In the case of nearly defunct broadcast TV it was paid by the advertisers. Streaming video providers like Netflix are simply poaching the Internet connection of the consumer and shifting a large portion of the delivery cost onto the Internet access provider. Internet access providers like Comcast are chafing at bearing the cost of delivering Netflix’s movies across their Internet networks, and to add insult to injury are also faced with the long-term prospect of losing their cable TV customers to streaming content providers. Last November Comcast threw a challenge flag onto the playing field in the form of a well-publicized dispute with Level 3, a deliverer of Netflix content, and has been battling along with Verizon against the FCC over its net neutrality efforts that would forbid network providers from discriminating against any content. Video doesn’t belong on the Internet in its current form, and without a mechanism to ensure providers and consumers of video pay their share of the delivery costs, we may very well wind up back in the days of dial-up speeds. In the meantime the Internet creaks a little bit each time Netflix adds a new streaming subscriber.

Author’s Note: I am a huge fan of Netflix and believe they are one of the most transformational companies of the Internet age. I also can’t argue with the convenience of streaming video, but as a network operator I acutely feel the stress it puts on the current model of Internet access.

Santa Monica-based Demand Media (NYSE: DMD) completed the first “tech” IPO of the year this week, and perhaps the first IPO of the impending Facebook bubble. Demand Media owns web properties including eHow.com, Trails.com, and LIVESTRONG.COM that draw over 100 million monthly visitors to the 3 million articles and 200,000 videos that their network of over 13,000 freelance writers has produced.

They make money on the “Content & Media” side of their business by selling advertising and renting their underlying platform to other web sites. The company also owns the world’s second-largest domain registrar, after GoDaddy.com, with over 10 million domains under management. Those numbers tell a great story, and along with $179M in revenue over the first 9 months of 2010 they have a multi-year track record of double-digit revenue growth according to their prospectus. They lose money though – about $6M in the first nine months of 2010 and $20M in 2009.

Demand Media targeted the offering at $14-16/share, came out on 1/26 a buck higher than the top of the range at $17/share, and shot up to $23.85 shortly after the open. The stock ended the second trading day at $21.85/share, giving the company a valuation of approximately $2B, or about 8-9 times revenue. Quite a high valuation, and according to Kevin Berk a sign that “The Bubble Is Back.”

The company is somewhat unusual, and you could make a case that since it is a non-traditional company, some measure of its cash generating capability other than a multiple of annual Net Income should be used as a basis to measure its true value. In their prospectus Demand Media explains that it uses adjusted operating income before depreciation and amortization expense to manage its business since:

“the exclusion of certain expenses in calculating Adjusted OIBDA can provide a useful measure for period to period comparisons of our business’ underlying recurring revenue and operating costs which is focused more closely on the current costs necessary to utilize previously acquired long-lived assets. In addition, we believe that it can be useful to exclude certain non-cash charges because the amount of such expenses is the result of long-term investment decisions in previous periods rather than day-to-day operating decisions. For example, due to the long-lived nature of our media content, revenue generated from our content assets in a given period bears little relationship to the amount of our investment in content in that same period. Accordingly, we believe that content acquisition costs represent a discretionary long-term capital investment decision undertaken by management at a point in time. This investment decision is clearly distinguishable from other ongoing business activities, and its discretionary nature and long-term impact differentiate it from specific period transactions, decisions regarding day-to-day operations, and activities that would have immediate performance consequences if materially changed, deferred or terminated.”

Demand Media reports Adjusted OIBDA of $42M for the nine months ended September 30th. The value of the company to an outside acquirer would likely be based on OIBDA, since that would be the most accurate picture of ongoing cash flow a potential suitor could expect to generate by acquiring them. Assuming they finish the year at about $55M in adjusted OIBDA, they are currently valued at 36 times annual OIBDA, which stretches rationality.
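Annualizing the nine-month OIBDA and dividing it into the market cap gives that multiple (the straight-line annualization here is my simplification, not the company's guidance):

```python
market_cap = 2_000_000_000   # ~$2B valuation after the second trading day
oibda_9mo = 42_000_000       # adjusted OIBDA, nine months ended September 30

# Straight-line annualization of nine months to twelve
oibda_full_year = oibda_9mo / 9 * 12   # $56M, close to the ~$55M year-end estimate

multiple = market_cap / 55_000_000     # using the ~$55M year-end figure
print(round(multiple))  # 36
```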

The company marketed itself prior to the IPO in the $1.2B range, so they certainly had more reasonable views on their own valuation, but apparently the market is on the fast track to irrational exuberance about tech stocks. Again.