Commentaries on Directions That Will Impact the Future of Technology

Archive for the month “January, 2012”

CBS syndicated columnist Dave Ross recently said “The best thing about the Internet is that there is no paper. The worst thing about the Internet is that there is no paper.” I think Mr. Ross has succinctly captured the essence of the dichotomy.

Several years ago, the Aetna Insurance Company hired IBM to do an extensive $3 million study with the objective of reducing paper handling throughout Aetna by 10%. When the study was completed, and recommendations made, Aetna management rejected it, concluding that the trauma caused by doing things differently would cost the company far more than the anticipated savings!

The US Post Office is on the ropes because its paper shipping business has declined to the point where the existing infrastructure is bigger than it needs to be.

The Kindle, the Nook and the iPad have obviated the need to read books on paper.

Several years ago, I visited the Head of HP’s Workstation Division. When I asked him how they were making any money given the current competitive situation, he replied: “Every morning, before office hours, we all gather in the parking lot, face Boise, bow down and utter a prayer of thanks.” If you didn’t already know it, Boise is the HQ of HP’s Printer Division!

The newspaper business is on its last legs, given that an increasing number of people get their news over the Internet or on TV.

Who among you men can imagine waiting in a barber shop reading Playboy on a tablet? Who among you women can imagine reading Vogue on a tablet while waiting for your manicure?

Although paper consumption in North America has declined by 25% in recent years, demand is on the upswing due to improvement in the economy, the introduction of new products and huge growth from Asia, especially China.

Here are oft-repeated statements: “Finding specific information in a stack of paper can be time consuming and often frustrating. Finding specific information in digital data is quick and easy.” My response to that is a big MAYBE. The statements may be true IF the data has been organized efficiently and the reader knows the right keywords to search on. But suppose you are reading a lengthy document and want to refer back to something you read earlier. If you are sitting in front of a screen, how easy is it to do that? Often, it is much faster to riffle through the pages of a paper document than to do a computer search when you might not even remember searchable keywords.

So what is one to make of these seemingly contradictory observations? I think it is safe to say that paper and electrons will coexist for a very long time. Paper will be replaced where it makes sense, and used where it makes sense.

According to ecology.com, each person in the United States uses 749 pounds of paper per year! (Yes, that includes toilet paper). We are not going to see that consumption decline significantly for a very long time.

Another example, this one truer to life, was the Jeopardy contest between IBM’s Watson supercomputer and the show’s most successful contestants. Questions were asked by host Alex Trebek, and Watson’s answers were given in staccato-sounding, computer-generated English.

Speech recognition and speech synthesis are technologies that have been studied and under development for a long time. IBM was one of the pioneers of this research, and the company continues to pursue it in labs all over the world. IBM groups various voice-related technologies under the umbrella phrase Human Language Technologies. Clicking on this link will bring up a page that will direct you to a layman’s overview of IBM’s many research projects, patents and related information.

It is difficult to pinpoint the level of investment in speech-related technologies, but with big companies and government agencies heavily involved, together with a push from the hugely competitive mobile market, we will see continuing investment and great accomplishments in the years to come.

Eventually, these technologies will lead to nothing short of a computing revolution. Chuck the keyboard, the mouse and the pad. We were all born with the I/O of the future.

I just finished reading about a San Francisco startup called Topsy Labs. This company searches posts on social media websites and uses this information to detect forthcoming trends. For example, it picked up a lot of tweets from people who said they were cancelling their Netflix subscriptions and used that information to predict a drop in Netflix’s stock price. There are other new companies doing similar work, for example, WiseWindow and Derwent Capital Markets, a London-based boutique investment company running a hedge fund.

The work of these companies is based on sentiment analysis, in which the chatter on Facebook, Twitter, blogs and other social media is analyzed and used to predict stock movement, market trends, product acceptance, competition and other factors that are the purview of classical market research. It is a bit early to predict the demise of classical market research, but it will certainly be impacted in a significant way.
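At its simplest, sentiment analysis boils down to scoring text for positive and negative signals. Here is a deliberately naive sketch; the word lists and sample posts are invented for illustration, and real systems use trained statistical models rather than hand-made word lists:

```python
# Toy sentiment scorer: counts positive vs. negative words in each post.
# Word lists and posts are made up; real systems learn these from data.
POSITIVE = {"love", "great", "awesome", "keeping", "happy"}
NEGATIVE = {"cancelling", "hate", "terrible", "quit", "dropping"}

def score(post):
    """Return net word count: > 0 reads as positive sentiment, < 0 as negative."""
    words = post.lower().replace(",", " ").replace(".", " ").split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

posts = [
    "cancelling my netflix subscription, terrible price hike",
    "love the new netflix shows, great value",
    "thinking of dropping netflix",
]
net = sum(score(p) for p in posts)
print(net)  # -1: net negative chatter, a (toy) bearish signal
```

A real pipeline would add tokenization, negation handling, sarcasm detection and a trained classifier, but the aggregate-the-chatter principle is the same.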

Will the Statistics Texts be Rewritten?

I spent more than 30 years working for market-research-based companies. We were called “industry analysts.” Setting aside things like focus groups, the principal data-gathering MO of such companies is interviewing. We did face-to-face interviews, phone interviews and online surveys. We tried to do enough of them to be “statistically significant.” (The statistics books state that 50 interviews of a homogeneous population will produce results with ±10% accuracy at 90% confidence.) Even so, obtaining 50 good interviews wasn’t easy and was very expensive.
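As a sanity check on that rule of thumb: the worst-case margin of error for a proportion estimated from a simple random sample is z·sqrt(p(1−p)/n). With n = 50, p = 0.5 and z ≈ 1.645 for 90% confidence, that works out to roughly ±12 points, in the same ballpark as the textbooks' ±10%:

```python
import math

def margin_of_error(n, z=1.645, p=0.5):
    """Worst-case (p = 0.5) margin of error for a proportion at ~90% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(50), 3))  # 0.116, i.e. about +/-12 points
```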

Realtime Twitter Sentiment

I don’t know what the sentiment analysis companies are charging or going to charge for their services, but, since the process is highly automated, it could be relatively inexpensive. In fact, it could ultimately put the market research firms out of business or force them to change their existing business models. In reply, representatives from those firms are touting the limitations of sentiment analysis, but the quotes I’ve seen aren’t very convincing.

Conclusion: Sentiment Analysis and Spin-offs Will Have a Huge Impact

In conclusion, I predict that data gathering by scanning social media and other web-based information will have a huge impact on the way market research will be done in the future, and will provide a precision unmatched by conventional techniques.

Last night, my wife and I began to watch a rental movie on DVD. The opening screen said (I’m paraphrasing) “This DVD contains only the movie. If you want the best picture and sound plus many bonus features, you should buy this movie on Blu-ray.”

Audio

I checked and found out that the audio on the Blu-ray disk is 5.1 surround, the same audio provided on the DVD. It is true that Blu-ray can provide “lossless” 6.1 or 7.1 surround sound, which adds an additional back channel like you might encounter in a movie theater. However, a competing 6.1/7.1 technology, Dolby Digital EX or THX EX, can also provide 6.1 or 7.1 sound on DVDs. Maybe if you have 15-year-old ears you can hear the difference, but most people won’t notice. In either case the argument is virtually moot, since the number of movie disks that have either Blu-ray or EX 6.1/7.1 sound is minuscule, and the number of people set up for 6.1/7.1 sound reproduction is even smaller. Strike 1 for Blu-ray.

Picture Quality

Now let’s examine the issue of picture quality. Most TV sets sold in the past 5 years are capable of handling HD (High Definition) images with up to 1080p resolution. That means 1080 scan lines, non-interlaced (“p” stands for “progressive,” which means exactly the same thing as non-interlaced). Blu-ray provides 1080p natively, while DVDs offer only 480p resolution. Thus it would seem that Blu-ray has a huge advantage in picture quality. While Blu-ray images are superior, they are not that superior, because the clever folks who design DVD players have provided a feature called “video upscaling” that takes a 480p signal and converts it to a 1080p signal. This technology used to cost $20,000; today it is reduced to a chip that costs less than $5. Therefore, if your DVD player has upscaling (and almost all of them built in the past five years do), your picture will be almost as good as Blu-ray; in fact, 90% of people over the age of 50 can’t tell the difference. Strike 2 for Blu-ray.
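The arithmetic behind that resolution gap is simple. A standard NTSC DVD frame is 720×480 pixels, while a 1080p frame is 1920×1080, so Blu-ray carries six times as many pixels per frame. The upscaler fills in the missing pixels by interpolation; it cannot restore detail that was never on the disk:

```python
dvd = 720 * 480        # pixels in a standard NTSC DVD frame
full_hd = 1920 * 1080  # pixels in a 1080p Blu-ray frame
print(dvd, full_hd, full_hd / dvd)  # 345600 2073600 6.0
```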

Bonus Features

So far, I’ve shown that Blu-ray’s advantages in picture and sound quality are real but meaningless to the average TV viewer. That leaves bonus features as the last significant DVD/Blu-ray differentiator. As it turns out, very few Blu-ray disks have whiz-bang bonus features. Why? The great majority of viewers have very little interest in them, and such features are expensive to produce, so they remain uncommon. Strike 3 for Blu-ray.

Streaming Video Competition

Finally, we have the issue of streaming Internet video, which virtually every pundit has declared is the future of TV, or at least will be a major part of it. It will be a long time, if ever, before streaming video can accommodate Blu-ray quality. There simply isn’t enough network bandwidth. This is especially true in the US, which is practically a third-world country when it comes to providing its citizens with broadband Internet service. In any event, streaming video is not Blu-ray’s friend.

Not Going Away Soon

I don’t mean to say that Blu-ray is dead. Disney, for example, which has the highest ratio of disk to box office sales in the industry, says it will continue to push Blu-ray until it is no longer a viable medium for movies. I also want to emphasize that this discussion is limited to movies. As a data medium, Blu-ray has a lot to offer – unparalleled storage density in a portable format, for example. If you have a lot of data, a Blu-ray disk player/recorder is just the thing for your PC or Mac. Blu-ray is also an important technological piece of the gaming market.

Cost Differential

Back to the movie I saw last night (“Dolphin Tale” – an excellent family movie, by the way). Amazon sells the DVD version for $15 and the Blu-ray version for $24, a 60% premium. I think that price differential is very difficult for Joe Couch Potato to justify. Don’t you?

References

If you are interested in learning more about TV audio and video, I recommend the following websites:

For an excellent layman’s description of the various audio options, go to the Crutchfield website.

This is just the tip of the iceberg. New sites are coming online so frequently, it is nearly impossible to keep up.

To access free radio from anywhere (at home, on foot, in a vehicle or hotel, etc.), all one needs is a smartphone and, unless headphones suffice, a means of transmitting the sound from the phone to a player. Today, that is cheap and easy. Here are some choices:

A headphone-to-RCA cable to play through any amplifier or stereo system that has RCA connector jacks. Cost: $1 – $3.

Bluetooth: Needs to be built into the receiving device. An increasingly popular option in cars.

RF (Radio Frequency) Transceivers. These are a great solution for whole house applications. There are many choices on the market using 900 MHz, 2.4 GHz and other frequencies. $35 and up.

What this means is that the days of SiriusXM Radio are numbered. Unless there are one or more channels exclusive to SiriusXM that you absolutely have to have (e.g., Howard Stern or Martha Stewart), there is no reason to pay for digital radio if you have a smartphone. Considering that SiriusXM charges $180 – $200/year for a single radio subscription and more for multiple radios, there is little economic justification for the service.

I don’t mean to say that SiriusXM will disappear overnight. The company, in league with the car manufacturers, has a racket going that will take years to get rid of. Today, it is almost impossible to buy a car that doesn’t have a Sirius or XM radio in it with a “free” 3-month subscription to suck in the buyer. But the result is inevitable as smartphones and smart people who know how to use them become ubiquitous in society.

Today, it takes approximately 1 million atoms to store a single bit (0 or 1) of information using conventional magnetic storage technology. Researchers at IBM’s Almaden Laboratory in San Jose, California led by Dr. Andreas Heinrich, have accomplished the same feat with only 12 atoms!

Before we get too excited, note that this was done by reducing the temperature to near absolute zero (-458 degrees F), which is a bit impractical for ordinary use. Nevertheless, the researchers think that stable storage can be accomplished with as few as 150 atoms at room temperature.
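Taking the article's figures at face value, the improvement is easy to quantify: going from roughly a million atoms per bit to 12 is about an 83,000-fold reduction in atoms needed per bit.

```python
conventional = 1_000_000  # approx. atoms per bit, conventional magnetic storage
experimental = 12         # atoms per bit in the IBM Almaden experiment
print(round(conventional / experimental))  # roughly an 83,333-fold reduction
```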

If you are interested in the details of the technology, they have been published in Science, the journal of the American Association for the Advancement of Science, one of the world’s top scientific publications. Suffice it to say that this discovery may have enormous implications for the future of computing. Not only will the density of storage be reduced by orders of magnitude, but power requirements will follow suit.

For decades, the computer industry has followed the dictates of Moore’s Law which says that transistor count will double on integrated circuits every two years. If IBM’s research becomes practical reality, Moore’s Law will go the way of the dodo. Atomic-scale memory is 100x denser than hard disk drives, 160x denser than NAND flash chips, 417x denser than DRAM components, and 10,000x denser than SRAM chips. This is truly a game changer.

Practical implementation of this “nanomemory” will require the discovery of new materials that don’t presently exist. IBM researchers think that will happen, but that it could take 5 – 10 years. Fortunately, IBM is making a full-court press. It has been investing upwards of $100 million per year in nanotechnology research, and intends to continue investing at that rate.

IBM has “opened its kimono” a bit on the subject. Besides the Science article, which is geared to scientists, IBM has tried to explain what this is all about in terms most lay persons can understand. If you are interested, go to this website and this one.

This is the first year in many that I have not bothered to attend the annual CES show in Las Vegas. I didn’t go for a couple of reasons. First, virtually everything you might want to see at the show is available online, usually in video; second, there hasn’t been much in the way of exciting new products. The “Best in Show” award this year went to LG for a 55″ OLED TV that won’t be out until mid-year and will cost (an estimated) $8,000 – $10,000. It has about the same picture quality as a plasma selling for a fifth of the price. It is skinnier than a plasma and uses less energy. Yawn.

Maybe this is exciting for you, but not for me. Lemme tell you what I do find exciting.

Last year, a startup company calling itself BlueStacks came up with a piece of software that allows a user to run Android apps on a Windows 7-based PC. The program is called App Player for PC (isn’t that exciting?). The company also produces a program called Cloud Connect, which allows you to transfer Android-based apps from your phone or tablet to App Player for PC. In other words, you get an Android app from Android Market (which does not run on your PC), move it to the cloud and send it to your PC. Even though the program is still in alpha, it appears to work flawlessly on my PC. Very cool indeed.

Back to CES. Microsoft announced that BlueStacks would be built into Windows 8, scheduled for release later this year. Windows 8, if ya didn’t know, is Microsoft’s next major iteration of Windows. It sports an entirely new User Interface called Metro (you will still be able to use the old one if you want), and will run on every platform including phones and tablets as well as PCs. Although European cellphone leader Nokia made a commitment to Windows 8, nobody else in the mobile world has paid much attention to it, in large part because there aren’t many apps or developers who plan to write apps for it.

BlueStacks changes all that. Windows 8 users will immediately have access to all 400,000 Android apps and will be able to run them on a phone, tablet, netbook or PC. This will remove the lack-of-apps barrier to purchasing Windows 8-based mobile platforms. Sure, Microsoft would prefer that all 400,000 of those apps be written to run natively on Windows, but it might have to wait for the return of the dinosaurs for that to happen.

BlueStacks has announced that it is working on a version for iOS (Apple’s operating system), so you will be able to run Android apps on Apple products. This is the kind of strategy that should ultimately lead to hardware price wars. I can hardly wait.

“If you haven’t decided on in what you want in a home theater system then there are some things to consider. First of all, while Blu-ray may be the big thing now, you may not want to pay for the extra expense. For one thing, once you go with a Blu-ray system, you’re locked into only using that system. Further, these systems are usually bigger than standard DVD systems, and they are harder to set up. Plus, all the components must be compatible with Blu-ray.

Blu-ray, however, still has some of the best video quality. While they are a hassle to deal with (especially with the size and proprietary components), you may still like the quality better.”

The above quote comes from an article I found on the website of one of those companies that publishes articles by freelance writers. I won’t mention which one for fear of getting sued. The author of this article published another article in which she stated that, for a home theater system, one needs to buy cables made from “99.99 percent oxygen free copper that resists corrosion and is made for great connectivity.”

The crap that this author writes would be OK if she stated that these conclusions were her opinion and explained the basis of that opinion, but she makes it sound like fact, even though she clearly knows nothing about either Blu-ray or copper. The misinformation published on the web about technology is frightening. Pity the poor consumer.

The picture on the left shows the Bozak Concert Grand Speaker System. Invented by audio pioneer Rudy Bozak and sold from 1951 to 1965, the Concert Grand was considered by almost every reviewer to be the finest production speaker system available. Weighing 250 pounds and costing more than $2,000 (about $20,000 in 2011 dollars), each unit contained four 12″ woofers, two midrange drivers and an array of eight tweeters housed in a gigantic box that was close to an infinite baffle enclosure. If you wanted to hear (feel) the lowest note on a bass viol, the Concert Grand was, arguably, the only system capable of producing those notes distortion-free. Short of very costly custom-built systems, no modern technology has been able to sound as sweet as the Concert Grand, in my less-than-humble opinion. Unfortunately, the Concert Grand demanded a) vast wealth; b) an extremely understanding spouse; and c) a very large room, thus narrowing the market to the point at which the Bozak company could not sell enough to make a profit.

Since those heady “HiFi” days, speaker designers have developed hundreds of systems based on technologies, both esoteric and simple. The speaker designer’s job today is complicated by the need to reproduce both music and movie sounds. Crashing automobiles heard through 6 channels and the strains of Beethoven heard through 2 stereo channels require very different aural profiles. Further, most music content these days is digitally-sourced and digital music sounds a lot different than analog-sourced music. (Although, if you are young enough, you may never have heard analog music, and therefore don’t know the difference!) In short, except for a diminishing number of audiophiles, the Concert Grand and its brethren are no longer hot.

Hot speakers today are likely to be a) small; and b) wireless. Small means that infinite baffles are out, and enclosures tiny or non-existent. To make up for that deficit, designers compensate with electronic trickery, employing sound-modifier circuitry that attempts to create realism. Sometimes the trickery is built into the amplification system and sometimes into electronics embedded in the speaker equipment. Bose pioneered this methodology with great success. (Although I will admit that some Bose systems sound very good, they still ain’t Concert Grands.)

Typical Sound Bar

The technology has progressed to the point where much sound processing circuitry has reached commodity status, enabling speaker and amplifier manufacturers to offer a range of sound processing options at low cost. An outstanding example is the so-called sound bar. A sound bar is a collection of speakers and sound processing electronics housed in a narrow long cabinet designed to fit under or over a TV set. The electronics often try to emulate surround sound. Since the speakers are very small, a separate subwoofer is usually needed to get decent bass response. Sound bars typically sell from $150 to $1500. A few years ago, the Polk Audio company was the only producer of sound bars. Today, virtually every speaker supplier is in the sound bar business. Put them in the very hot category.

Another hot category is speaker systems with tiny satellite speakers.

Typical Satellite Speaker System

Pioneered by Bose under the trademark “Acoustimass”, these systems consist of a subwoofer and 2-7 little satellite speakers, usually coupled with electronics that strive to make the sound realistic. Prices for satellite systems range from $100 to $2000. Spouses tend to like them because they are unobtrusive.

Typical Tower Speaker

The person who would have bought a Concert Grand 50 years ago can now get some very hot speakers that offer great sound in large rooms. The most common form factor for these high-end speakers is the so-called “tower.” Towers look like skinny Concert Grands and usually house several drivers in a single enclosure. They may or may not include sound processing electronics. You can expect to pay from $300 to $2,000 per enclosure for tower systems, so a multichannel setup can set you back big bucks.

I mentioned before that wireless systems are hot. That means that the audio signal can be sent from its source to speakers using either an Internet-based network or a proprietary wireless scheme. This eliminates the need for wires and makes it easy to play music sourced in one room to speakers located in another room. Unfortunately, wireless transmission quality is not as good – yet – as wired transmission, so, if you want high-end sound, you are still stuck with wires.

I close this article with mention of an item that is semi-hot: the high-end DAC (Digital-to-Analog Converter). Audiophiles will tell you that analog music beats digital music hands down. That is why many DJs use vinyl records rather than CDs, and why vinyl media is actually on the increase. Nielsen SoundScan recently reported that, while overall album sales dropped 13% in 2010, sales of vinyl increased by 14% over the previous year, a new record. These DACs take digital input from a CD/DVD player (for example) through a Toslink or digital coax connection and output analog sound to the system receiver or amplifier. Very cool, indeed!

Steve Wozniak was quoted in a January 4 article in USA Today, stating, “I do expect Apple to make an attempt (to get into the TV business) since I expect the living room to remain a center for family entertainment, and that touches on all areas of consumer products that Apple is already making.” In response to that statement, I say “duh.”

It is certainly true that Apple could equip a TV with an iPad/iPhone interface. The article mentioned above cites a Barclays Capital analyst as saying that Apple could sell $19B worth of TVs so equipped in 2013. I think the guy is smoking something illegal, but I was wrong once before.

The fly in the ointment, as it were, is, of course, content. Why should Apple be able to cut better deals for content than any other company? Cable and satellite companies are making more money than ever. What could induce them to share the goodies with Apple, or anyone else for that matter? Of course, Apple could theoretically cut deals directly with the content providers, like the networks and independent production companies, but why would those companies give Apple an edge over other big players like Google TV/Sony, or even Microsoft?

For Apple to compete profitably in the TV business, it will have to offer something truly unique. TV hardware, including network interfaces, is essentially a commodity. Embedded network interface hardware costs around a dime these days. Who wants to play in that game outside of a few crazy Korean and Japanese companies who will probably get knocked off by Chinese competition?

I don’t see much happening to change the TV landscape in any fundamental way UNLESS there is consolidation with the content providers. Given that Google, Apple, Microsoft et al. are swimming in cash these days, perhaps that is not beyond the realm of possibility. As you probably know, Sony owns a bunch of studios, like Columbia and Tri-Star. Suppose you could watch the product of those studios only on Sony TVs, or at least less expensively than on competing products. I think that is called “thinking out of the box.” Apple is pretty good at that.