Professor Peter Nordlander has announced "a universal relationship between the behavior of light and electrons" which "can be exploited to create nanoscale antennae that convert light into broadband electrical signals capable of carrying approximately 1 million times more data than existing interconnects."

This is big. In many ways it's as big as the original buckyball discovery, and more readily exploited.

Wingard runs Euclid Discoveries, which is working on an object-based video compression technology he says will deliver 10 times the performance of MPEG-4, enough to "turn your iPod into a DVD player."

And he's done it all with angel investors, who are best known for backing only early-stage companies. Wingard has rejected the entreaties of venture capital firms, saying their time frames for pay-outs are too short. Yet he has found angels who will wait as many as seven years for a private auction of his technology, and a distribution.

Back in 1985, you would have spent big money to get an Intel 386 chip, over 100 megabytes of storage, and a local network that ran as fast as 1 megabit per second.

I know I didn't have one. The closest I saw to one that year was an entrepreneur 10 miles north of me who had a Digital Equipment PDP-8 minicomputer in his office.

Yet that is just what you see in the picture to the right:

Over on the left is a keycharm given me by the folks at Intel in the late 1980s. Inside the plastic is a 386 chip. Turn it over and you see a 486. These were real chips, discards from production runs, which were given to the press to illustrate what Intel did at the time.

That big round thing in the front-center of the picture is what we now call a stick memory device. This particular unit has 128 Megabytes of storage. Perfect for moving files, like this very picture, from a laptop to a desktop, or for bringing spreadsheets home to work on over the weekend.

Over on the right, in the back, that little blue thing is a Bluetooth dongle. It carried this picture from my cellphone, where it was taken, over to my laptop at 1 Mbps.

Thanks to Moore's Second Law (complexity causes costs to scale exponentially), competition in the semiconductor business is mired in ever-thickening mud -- the cost of building new capacity.

The number of company-owned fabrication plants, or fabs, must decline over time, as their cost rises above even corporate affordability. The decision to build one must be taken with increasing care, with an eye toward a far-off future. It's the opposite of what happens in the product cycle, at the other end of the factory floor, where things are constantly accelerating.
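Moore's Second Law is often stated (the version usually credited to Arthur Rock) as fab cost doubling roughly every four years. A minimal sketch of that arithmetic, with a starting cost and doubling period that are illustrative assumptions rather than figures from the text:

```python
# Assumption: a leading-edge fab costs ~$2 billion today and that cost
# doubles every four years (the Rock's Law reading of Moore's Second Law).
cost_billions = 2.0
for year in range(0, 21, 4):
    print(f"year {year:2d}: ~${cost_billions:.0f} billion per fab")
    cost_billions *= 2   # one four-year doubling
```

Whatever the starting number, the shape is the same: within two decades the price of admission grows by more than an order of magnitude, which is why the fab-building decision keeps getting graver.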

While Intel has played its hand in Asia, AMD has chosen Europe, specifically the former East Germany. More specifically Dresden, firebombed during World War II, left for dead during the Cold War.

In 2003 AMD broke ground for its second Dresden fab, Fab 36. The plant goes into volume production next year, at a point where AMD's designs seem to be surpassing those of Intel.

Market share, in other words, could make a big swing next year.

At the very same time, AMD is advancing in court, forcing Intel to defend an already-fading monopoly. A few years ago Intel had knocked AMD practically out of the ballpark. With Dresden Fab 36 that won't be true, but AMD figures Intel still has a case to answer, because its hyper-competitive marketing department never changed tactics.

Evidence will likely show that Intel did have a near-monopoly under Craig Barrett, and that it did abuse its position in its dealing with big customers. But a court finding for AMD would still be a mistake.

Every decade of computing technology can be summarized fairly simply. (That's an Apple ad to the right.)

The 1950s were the decade of the computer.

The 1960s were the decade of the mini-computer.

The 1970s were the decade of the PC.

The 1980s were the decade of the network.

The 1990s were the decade of the Internet.

The 2000s are the decade of wireless.

It's now clear that wireless technology defines this decade. Mobile phones are opening up Africa as never before. WiFi is making networking truly ubiquitous.

Walk or drive down any street, practically anywhere in the world, and you will find people obsessed by the use of wireless. Behaviors that in previous decades were shocking -- walking around chatting animatedly to the air for instance -- are now commonplace.

What's amazing, as we pass the halfway point, is how far this evolution has to go, and how easy it is to see where it can go:

The question of Wi-Fi and real estate is about to come to a head, at Boston's Logan Airport. (Picture from MIT.)

Declan McCullagh reports that the airport is trying to shut down Continental Airlines' free WiFi service, based in its frequent-flyer lounge, in favor of a paid service from which the airport takes a 20% cut of revenue.

Continental has appealed to the FCC under the 1996 Telecommunications Act. Massport, which runs the airport, is making bogus arguments about security (its paid service uses the same spectrum as Continental's, so if the security argument sinks one service, it sinks both).

If this thing goes to trial it will be a very important case. Here's why.

Even economic and foreign policy issues are, in the end, defined in terms of social issues. This creates identification, and coalitions among people who might not otherwise find common ground -- hedonistic Wall Street investment bankers and small town Kansas preachers, for instance.

I am coming to believe the next political divide will be technological. That is, your politics will be defined by your attitude toward technology.

On one side you will find open source technophiles. On the other you will find proprietary technophobes.

It's a process that will take time to work itself out, just as millions of Southern Democrats initially resisted the pull of Nixon, because there are divisions on this subject within each grand coalition we have today.

On the right you see many people who work in open source, or who worry about their privacy, asking hard questions of security buffs and corporate insiders.

On the left you see many people who consider themselves cyber-libertarians facing off against Hollywood types and those who create proprietary software.

This latter split gets most of the publicity, because more writers are in the cyber-libertarian school than anywhere else.

Initially, the proprietary, security-oriented side of this new political divide has the initiative. It has the government and, if a poll were taken, it probably has a majority on most issues.

But open source advocates have something more powerful on their side, history. You might call it the Moore's Law Dialectic.

Either my wonderful mother (who still walks among us, to my great joy) failed to check the box indicating I was a citizen on my Social Security application, or some clerk failed to do so when the data was entered because there were separate forms then for citizens and non-citizens.

The clerk who put me through this hell blamed "Homeland Security." But I think he was really responding to the reality of how this number is used.

As I've noted many times before, the Social Security Number is an index term. Everybody has one. Everyone's number is different. By indexing databases based on Social Security Numbers (SSNs), government and businesses alike can make certain there's a one-to-one correspondence between records and people.

Stories like this AP feature don't really address this need, this fact about how data is stored. Without the SSN we'd have to create one. Some companies like Acxiom do just that. Every business and individual in their database has their own unique identifier, created by the company. Which also means that the Acxiom indexing scheme is proprietary. The only way toward a non-proprietary indexing scheme, in other words, is for government to provide one. Which gets us back to the need for an SSN.
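The index-term idea can be sketched in a few lines. This is a toy illustration, not anyone's real schema; the table, column names, and the sample number are all hypothetical.

```python
import sqlite3

# One unique identifier per person keeps records and people in strict
# one-to-one correspondence -- the job the SSN does in real databases.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (ssn TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO person VALUES ('123-45-6789', 'A. Reader')")

# A second record under the same index term is rejected outright,
# which is exactly what makes the index term trustworthy.
try:
    conn.execute("INSERT INTO person VALUES ('123-45-6789', 'Someone Else')")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False

print("duplicate allowed?", duplicate_allowed)
```

Acxiom's proprietary identifier works the same way; it just replaces the government-issued key with a private one.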

Since Mark Hurd left NCR to run the mess Carly Fiorina made of Hewlett-Packard in March, he has been fighting to turn the old boat around. The company turned in solid numbers in May, he hired away Dell's CIO, Randy Mott, and now he has the credibility with his board needed to prune the deadwood.

H-P has a lot of deadwood.

In buying Compaq, her signature move, Fiorina took on a lot of old, tired, even worthless brands, like DEC and Tandem. Compaq's latter-day strategy had been to buy these outfits for their book of business, and Fiorina's deal was the apotheosis of this old-line industrial strategy. She insisted at the time there would only be a few survivors of the PC wars, and buying Compaq was the only way to make sure H-P would be one of them.

She was wrong. What works in steel does not work in tech. A book of business is worthless, because computers are short-term capital goods. It's not what you did for me, or even what you did for me lately, but what you're going to do for me tomorrow that counts.

Training, learning, adaptation -- call it what you will -- must happen at its own pace. This is why the productivity boom arising from the 1990s IT spending boom didn't become apparent until this decade.

But there is a way to accelerate Moore's Law of Training (which doesn't exist) -- publicity. If a good idea, an obvious use of existing technology, is heavily publicized, it can spread very, very quickly, and provide real benefits.

Given the direction of antitrust law recently I was surprised to see the recent suits by AMD and (more recently) Broadcom. They left me scratching my head.

But there is an answer to my quandary.

Antitrust has become a process. It's not a goal, but a weapon in the business war.

The idea that Qualcomm has a monopoly in the mobile phone industry is laughable. It may abuse what position it has, charging chip makers like Broadcom the equivalent of an "intellectual property tax" in areas which use CDMA (and its variants). But GSM is the major world standard. It would be like calling the Apple Macintosh a monopoly.

The Broadcom antitrust suit comes right after it filed a patent suit against Qualcomm, accusing it of violating Broadcom patents regarding delivery of content to mobile phones.

The first shot didn't open up the Qualcomm ship, maybe the second will. All lawyers on deck!

When the Comdex show closed its doors a few years ago a lot of people threw up their hands and decided it was some sort of secular turning-point, the lesson being that people didn't do trade shows anymore.

The show just finished, and the reports are still dribbling in. But what's clear is that the same spirit of innovation, the same corporate social mobility, and the same establishment of distribution that marked the Comdex show in its heyday all took place in Taiwan.

This Administration does not look kindly at anti-trust claims. They settled with Microsoft, they gave the cables and Bells a duopoly (leaving America a third-world broadband country), and they seem content to let China monopolize world trade while India gains control of services. All this is in pursuit of an ideology that becomes less-and-less distinguishable from Putinism and Kleptocracy by the day.

He was so much more. Like Hedy Lamarr, who co-invented the frequency-hopping technology underlying WiFi, he led a double life, as an intellectual in the fun house.

For starters he was the first TV star I remember, one of many models for what became The Simpsons' Krusty the Klown. He had a morning show with puppets, more entertaining (I thought) than Captain Kangaroo, with more brain and heart (I thought) than even Fred Rogers. The puppets, which he made himself, were called Jerry Mahoney and Knucklehead Smiff (right).

What I didn't know at the time was that he was also a polymath, with a wide range of interests and a photographic memory. One of his interests was medicine. As an entertainer he maneuvered into the worlds of famous physicians, including Dr. Henry Heimlich (then Arthur Murray's prospective son-in-law), and with his help won the first U.S. patent on an artificial heart.

Irregular readers of this blog may think Gordon Moore invented the microchip.

He didn't. Moore did have a major role. He was part of the Fairchild team, co-founder of Intel, and his Moore's Law article popularized the changes that chips would bring.

But Jack Kilby won the Nobel Prize for the microchip, in 2000. He died today.

The original invention, designing multiple devices on a single piece of substrate, was invented in two places at once. One team, which Kilby headed, worked at Texas Instruments. The other team, with Moore, Robert Noyce, and other key men, worked at Fairchild Semiconductor.

The resulting patent was shared, but it was Kilby's team that created the basic technology. The key contribution from the Noyce team involved manufacturing process.

All this relates to what my book calls Moore's Law of Optical Storage. Instead of storing data on a fairly flat substrate, the Optware design uses all three dimensions. Think of the storage medium as a cube rather than a circle.

There's a long way to go before this threatens the CDs we're used to. Right now, however, the high price of the readers may be an advantage, making this perfect for applications like physical security.

Imagine the depth of personal knowledge that could be input on a 30 GByte substrate for an entry badge. Connect that to a variety of biometric readers so the bad guys can't hide their identities behind, say, phony fingerprints or contact lenses. Add a human guard to the mix and your entry portal could be pretty darned secure (for a time).

But the best news here may be this, the fact that there's competition in this space from Inphase Technologies, a spin-off of Bell Labs. They're looking at issues like the speed of data transfer, issues that could make holograms an alternative for the archiving of Web data.

A decade ago Microsoft reached a tipping point. Maybe this came with its release of Windows 95. It was obvious in its obsession over destroying Netscape.

Before 1995 Microsoft was about creating capabilities for others. Since then its mission has been embracing and extending, bringing the great ideas of others into its own operating system, destroying rather than creating niches.

It all sounds like a Jon Stewart set-up. "Aw, Bill, it used to be about the world domination." But in truth, at some point, people do come to dominate their worlds.

The 1990s were all about the Internet. (The picture is from a great site called i-Learnt, for teachers interested in technology.)

This decade is all about gadgets.

Digital cameras, musical phones, PSPs, iPods -- these are the things that define our time. While they can be connected to networks their functions are mainly those of clients.

In some ways it's a "back to the future" time for technology. We haven't had such a client-driven decade since the 1970s, when it was all about the PC.

In some ways this was inevitable. The major network trend is wireless, so we need a new class of unwired clients.

But in some ways this was not inevitable. If we had more robust local connectivity than the present 1.5 Mbps downloads (that's the normal local speed limit) we would have many more opportunities to create networked applications.

Not only is Apple switching its chip supplier from IBM to Intel, it is moving from PowerPC to Intel's x86 processors in the bargain.

In making the announcement this morning, Steve Jobs said he didn't see how he could continue making great products beyond next year "based on the Power roadmap."

Right after his speech he had a cagey interview with CNBC's Ron Insana. "It's not as dramatic as you're characterizing it," he insisted.

"This is going to be a gradual transition. Hopefully a year from today we'll have Intel-based Macs in the market. It's going to be a two-year transition.

"As we look into the future, where we want to go is different (from IBM's product roadmap). A year or two in the future Intel's processor roadmap aligns with where we want to go.

"I think this will get us where we want to be a year or two down the road." Jobs refused repeated requests by Insana to explain what he meant by that. (Jobs is also shaving even more closely than this picture shows. He's down to tiny stubble around a still-brownish moustache. Hey, Steve, I'm 50 too.)

The folks at RepRap would like you to think they've got something truly revolutionary. But they don't. The technology has been around for some time. You need to input a lot of files to make anything, so there's a lot of intellectual capital involved.

Assuming Apple does switch to Intel chips tomorrow, as News.Com reports, the value of Intel stock will likely rise.

That would be a mistake.

Intel is making a big investment here to gain a very small amount of market share. Meanwhile it's losing far more market share to AMD in what used to be called the Windows world.

WinTel has been broken. That should be the real headline here.

Microsoft is perfectly happy to have AMD supply chips for Windows machines. People are very happy to buy them. And right now AMD has a price-performance advantage there.

This move toward Apple will, if anything, accelerate that shift. Intel should be spending all its time addressing its loss of share in the Windows world, and in the Linux world, instead of wasting energy with a tetchy, demanding Apple, an outfit that even IBM couldn't please.

Despite his ponytail and his sometimes counter-cultural language, despite being what I like to call a Truly Handsome Man (it's a brighter term for bald, people) Ted Waitt was always a follower, not a leader. (The picture is from a 2002 profile in the Sioux Falls, South Dakota Argus-Leader.)

Waitt was Gimbel's to Michael Dell's Macy's. He wanted to be Pepsi to Dell's Coke.

But computing lacks the stability of the retailing or the soda business. So when Waitt announced his resignation today (at 42 it wouldn't sound right to call it a retirement) it wasn't big news.

Waitt and Gateway did well in the 1990s, following Dell into mass customization. He made his big mistake when he tried to out-think Dell, opening a chain of retail stores that caused $2.4 billion in losses, according to The New York Times.

As of today Intel's new direction is better. Better doesn't always mean more. In the case of microprocessors it can mean putting more computers on each chip (multi-core) or running with lower power. In terms of communications it can mean a host of attributes, from security to coverage to throughput.

The best way to understand the future is to look into how chips are changing.

Two transitions are transforming Moore's Law. The original article, in 1965, described only the density of circuits on a silicon substrate.

The rule implied that chips could get better-and-better, faster-and-faster. Doubling bigger numbers means bigger incremental changes in the same time. Over the years chemists and electrical engineers learned to apply this exponential improvement concept to fiber cables, to magnetic storage, to optical storage, even to radios, so that 802.11n radios will transmit data at over 100 Mbps -- roughly twice the 54 Mbps that earlier 802.11g models could deliver.
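The doubling arithmetic is worth seeing in numbers. A minimal sketch (the units are illustrative, not any particular chip's):

```python
# Each doubling adds an absolute gain equal to everything you had before,
# so later generations bring bigger incremental jumps in the same time.
capacity = 1.0
gains = []
for generation in range(1, 6):
    gain = capacity        # the doubling adds an amount equal to the total so far
    capacity *= 2
    gains.append(gain)
    print(f"generation {generation}: capacity {capacity:g}, gain this step {gain:g}")
```

Note that the last step's gain exceeds all the earlier gains combined; that is why a mature exponential trend feels like acceleration even though the doubling period never changes.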

Rimm, you may or may not remember, wrote a paper, published in the Georgetown Law Journal in 1995, claiming 85% of Web traffic was dirty pictures. This was later debunked, but the damage was done and Congress passed the ill-fated Communications Decency Act.

Mike Godwin, the former EFF counsel who fought the Rimm study and is now senior counsel at Public Knowledge, remains skeptical, noting that the Cachelogic study hasn't gone through peer review. He also notes that, since Cachelogic sells systems to control P2P traffic, it has a natural bias.

The Cachelogic claims may have logic behind them, however. Many ISPs do report that over half their traffic is on ports commonly used by P2P applications. Brett Glass of Lariat.Net, near the University of Wyoming, says the claim seems accurate, noting that unless ISPs cut-back capacity to those ports (a process called P2P Mitigation), the applications quickly discover the fat pipe and divert everyone's traffic to it, filling it at the cost of thousands per month.

Intel's role in the development of Always On is crucial, and its strategy today seems muddled. It's not just its support for two different WiMax standards, and its delay in delivering fixed backhaul silicon while it prepares truly mobile solutions.

I'm more concerned with Maloney's failure to articulate a near-and-medium-term wireless platform story, one that tells vendors what they should sell today that will be useful tomorrow.

Hitachi Eyes 1 Terabyte Drives, writes MacWorld, noting new technology the Japanese company says lets it put 4.5 Gigabytes of data on a single square centimeter of hard drive platter.

I'm like, don't the first people read the second paper?

Moore's Law of Storage is rocketing along right now even faster than Moore's other "laws" (as described in The Blankenhorn Effect). Magnetic storage is eliminating the cost of physically maintaining content, any content, with profound implications for everyone.

Dana's Iron Law of Laptops holds that an ounce on the desk is a pound in my hands.

My favorite laptop of all time was a 2-pound Sinclair ZX-81. It had a tiny screen (nearly non-existent) but it had a pliant membrane keyboard that let me write and send stories from a beach. I haven't seen anything so light, rugged and useful since.

Instead, laptops have been desktop analogs. When desktop power increased, so did that of laptops, and they became no lighter in the process. Even today most laptops on the market weigh 7-8 pounds.

The cost of making something good is directly proportional to the complexity of the tools needed to create it. (The picture is from Freeadvice.com.)

This blog item is quite good. The tools needed to create words are very cheap. Even if the tools were more expensive, as they were when I began writing, my cost to create this text would not go up much. And the likelihood of its being of high quality would be just as high.

If I read this on the radio it would not be as good. The tools needed to create a Podcast require knowledge of radio or music production values. Even if Podcasts were as cheap to make as blog items, the proportion of good ones would be smaller than they are for blog items.

And so we come to the latest moves by Microsoft and Sony to deliver consumer video.

I didn't believe this when I started in journalism. I started in Houston, whose economy was based entirely on the concept of money coming out of the ground - Black Gold, Texas Tea.

For most of history, money has mainly come out of the ground. Assets were what you could drill for, what you could mine, or what you could grow. The exceptions to this rule were those of trade. If you sat astride a trade route, if you had a deep water port, if the railroads decided that your location would work for a station, then your land had value.

Moore's Law has changed all that. The Internet has changed that for all time.

Uh-huh. Maybe that's all true. But even if it is, that will take time.

Bluetooth has taken over a half-decade to reach its present level of prominence, and many mobile phones still don't have the capability -- despite cool applications like Hypertag being written for it. (Thanks to point-n-click and Billboard for that link.)

I have headlined this Moore's Law of Market Acceptance because, again, there is none. (It's like Moore's Law of Training.) Market acceptance is a human process, involving many actors.

The rate at which a new technology is accepted and replaces an old one depends on how revolutionary it is, how nimble its sponsors, and how rapid is the replacement within the older market.

Haptics recreates touch and texture artificially. If your kid has a "force-feedback" joystick on their computer game console, they're getting a taste of haptics. Northwestern, USC and MIT are among the universities doing research in the field. (The image is from USC.)

It's vital that something like haptics comes to mobiles because, in a hands-free environment, you can't depend on just sight and sound. Bringing other senses, like touch (or smell) into the mix allows for communication to happen invisibly.

It's also vital for haptics to come to mobiles because this is a huge (in terms of installed base) platform. If the coding and messaging can be delivered in this space, we're talking about billions of users. And we're talking about a universal language.

Many different types of solutions go into creating an Always On world.

Ive talked here often of medical applications for Always On, where you wear a monitor (or have it implanted) that connects to the network and can alert you (or others) to dangerous changes in your physical condition, thus saving your life.

I have also talked of inventory applications for Always On, in which RFID tags or bar codes give you a ready inventory of your stuff. This lets you, for instance, find your keys, or check the fridge to see what you need for tonight's dinner.

But the low-hanging fruit lies in automation applications. CABA (it stands for Continental Automated Buildings Association) is one of the trade groups involved here. They work mainly with landlords who want to save money on utilities, provide security, and keep track of what's happening in lots of space so as to minimize labor costs.

That's right, gang. The old joke from The Graduate is here again, aiming to drive silicon into the ground.

Nanomarkets, a market research outfit with a beat that looks like tons of fun from here (call me) has a $2,000 report out with a hockey stick chart for plastic semiconductors, estimating the market at $5.8 billion in 2009 and $23.5 billion three years after that.

Plastic electronics -- chips built on conductive polymers and flexible substrates -- will be cheaper, take less power, and (obviously) be more flexible than silicon circuits. This makes them perfect for, say, mobile phones.

It will also bring a bunch of new suppliers to the electronics market, names like Dow Chemical, DuPont, Kodak, and Xerox, along with the usual suspects.

One of the nastiest open secrets in the Internet is the switching bottleneck.

Optical fibers move data at, well, light-speed. But electricity moves data much more slowly. Getting between the two is like trying to get onto a freeway from an old cloverleaf junction -- there's not enough of an acceleration lane.

Many companies, including Intel, have been working this problem for a long time. Photonic switching is already a reality. But linking silicon directly to optics remains elusive.

That's the heart of Intel's claimed breakthrough, announced yesterday. Intel managed to produce a continuous Raman laser effect in silicon. This should enable Intel to build lasers just as chips are built.

Right now electronic signals have to be multiplexed, and packaged, before getting into the optical net. It's a very expensive, complex process. It's one of the chief capital costs a telecommunications provider faces.

But if PCs had their own photonics, they could plug directly into fiber and, as their processing speeds increased, take full advantage of what fiber can do. You could even have photonic processing inside silicon chips. Voila -- no bottleneck.

That's the hope, anyway. As Alan Huang, a 20-year veteran of this silicon laser business points out, "it's a neat science experiment" and there's a long way to go before this shows up on your desktop.

The Cato Institute claims to be an advocate of free enterprise, by which we are meant to think free and open competition. (That's the logo from one of their standard online products.)

Nope.

They are, in fact, huge supporters of untrammeled business power, of oligopoly. Hey, where do you think their funding comes from, rabbits?

Here's a great example. It's a blog they call Tech Liberation. It takes a few clicks to learn this is a Cato shop, but they're not really hiding it.

The piece is by Adam Thierer (left), who works full-time at Cato as "director of telecommunication studies." Its theme is the latest round of telecom mergers. Its message is don't worry, be happy.

"We can safely conclude that the communications / broadband networking business can be very competitive with 2 or 3 or even 4 major backbone providers in each region providing some mix of voice, video and data services."

Evidence for this? A Wall Street Journal piece noting that SBC wants to get into cable television. Other than that, a lot of chirping crickets. And some very nasty lies.

Permanent hardware encryption isn't going to happen. (The image, by the way, is from DBC of Germany, a player in this market game.)

This does not mean we should give up on encryption as protection, or on hardware for encryption. It's just that, just as Moore's Law means today's state-of-the-art PC is tomorrow's door stop, so today's RFID lock could become tomorrow's open door.
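The decay is easy to put in numbers. A sketch under a Moore's Law-style assumption that attacker compute doubles every two years; every figure below is invented for illustration:

```python
# Why a fixed hardware lock ages badly: the brute-force time against it
# shrinks exponentially as attacker hardware keeps doubling.
def years_to_crack(initial_years, years_from_now, doubling_period=2.0):
    """Brute-force time remaining after repeated compute doublings."""
    return initial_years / 2 ** (years_from_now / doubling_period)

# A lock needing 1,000 machine-years of brute force today:
for t in (0, 4, 8, 12):
    print(f"in {t:2d} years: ~{years_to_crack(1000, t):,.0f} machine-years")
```

The lock's silicon never changes, but the world around it does; that asymmetry is the whole problem.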

Unfortunately this has major implications for the security industry as it is today.

Of course, I'd need the prescription version. And I really like photograys. And have you got that in a bifocal model?

As you can see there is a way to go before Motorola's Cannes fashion statement turns into a really big market. Yes, there are cool-types who will grab on to this, so they can walk down the street gabbing away, like well-dressed homeless. But how many are there? And are all these fashionistas going to be satisfied with just these Oakley wrap-arounds?

A better solution, to my mind, would mount this user interface on the frame, with the electronics hidden in one of those cool eyeglass retainers 49er coach George Seifert used to wear. (That's George, left and above, and you may be able to make out his retainers. From the Seifertsite on Earthlink.)

In a New Yorker profile of chef Mario Batali (left) there's a wonderful scene of Mario rooting around a waste pail, looking for what the author-turned-prep chef has tossed away.

"Our job is to sell food for more than we paid for it," Mario lectures him. "You're throwing money away."

Apple Computer is the greatest exponent today of what I call Batali's Clue. Your job, as the maker of products, is to get more for your creation than the cost of the electronic "food" that goes into it.

It's a vital Clue because components in the Moore's Law age spoil like dead fish on a wharf.

She's out because her strategy was doomed from the start. She tried to treat computing as a traditional industry, where the pattern is that once growth slows to a modest level you get consolidation, companies merging together until just a few are left and profits are regular.

This doesn't work because Moore's Law prevents it. Moore's Law means the nature of systems is always changing. Companies rise because they know something about the market, and fall when they lose touch. No amount of consolidation can change that. The merger that created Unisys didn't save Univac and Burroughs, merger didn't save Digital Equipment and Compaq, and it didn't save Hewlett-Packard.
Fiorina's key ad campaign, "Invent," implying the company was going back to its roots in the garage, turned out to be just that -- an ad campaign. What has H-P invented under Fiorina, except financial manipulation? Anything?

The digirati are in a fury today over claims by an outfit called i-Mature, which says it has solved the problem of age verification with a $25 device that checks a finger's bone density to determine just how old you are.

The image, by the way, is from Vanderbilt University, which has no affiliation with either Corante, i-Mature, or this blog. It describes x-rays of a finger taken at different power settings. Go Commodores.

RSA announced "a joint research collaboration" with the company. But there is skepticism over exactly how precisely a bone scan can measure age, and the more people investigate, the more questions they raise.

Think of it as a LAN on a chip. Not just the network itself, but the computers on the network and, to some extent, the people behind the computers as well. (The illustration is from the first section of Blatchford's report.)

Software programs on the chip, called apulets, portion work out among the computing sections, then recompile the results, the way an editor does at a newspaper desk. (Only without the coffee and the yelling and the pressure or the beer after work for a job well done.)
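As a rough sketch of that editor-desk pattern (the names here are illustrative; this is ordinary thread-pool code, not the actual Cell instruction set or any real "apulet" API):

```python
from concurrent.futures import ThreadPoolExecutor

def apulet(chunk):
    # Each "apulet" works its own portion of the job...
    return sum(x * x for x in chunk)

def editor_dispatch(data, cells=8):
    # ...while the editor role portions the work out among the cells,
    # then recompiles the partial results into one answer.
    size = max(1, len(data) // cells)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=cells) as pool:
        return sum(pool.map(apulet, chunks))
```

The editor hands out slices, the cells grind through them in parallel, and the answer is the same as if one worker had done it all, only sooner.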

The result is true multi-tasking. As good as some teenagers, who will listen to music, watch TV, and gab on the phone while allegedly doing their homework, and still get As. (You know who you are.)

The best thing, though, is that this thing scales. You have 8 cells on the chip now. You can have more.

I'm no electrical engineer. I just went to school with some fine ones and picked up some of the lingo by osmosis. But it does seem to me that the "dual core" ideas Intel has committed to are merely extended here, in a way very consistent with Moore's Law.

The key point Moore missed (because it wasn't relevant to the paper, hadn't been discovered, and don't you dare criticize Mr. Moore for this) is that the exponential improvements he saw in silicon fabrication apply elsewhere. As I've written many times here, they apply to fiber, they apply to storage, to optical storage, to radios.

It's an anthology series, built around various scientific "principles" that define the Star Trek franchise.

Think of it as Science made into Drama.

Yes, it's an excuse to make science exciting. (Just think of the educational spin-offs we can produce!) And the production costs are low enough to put this on the SciFi channel (where Enterprise should have been all along). Or might I suggest a pitch to Discovery Networks, which has proven talent in making science fun with shows like Mythbusters?

For host, might I recommend Stephen Hawking? Playing the role Alistair Cooke made famous, he opens each show by describing the science (and the Star Trek technology) on which the show will be based. (I might recommend getting several scientists for this role, perhaps one for each specialty. But Hawking is a name. He'll do great for starters.) Or, if you're confident this show will last for decades, how about Lance Armstrong, who's already under contract to Discovery, who knows how to read a cue card, and who owes his life to science?

To all those wishing to bury Moore's Law. There are more tricks left in it than are dreamt of in your philosophy.

We all know about "dual-core" chips. Intel has switched development here, AMD has them in droves. They're basically multiple chips drawn on the same piece of silicon, taking advantage of parallel processing on-the-chip. Great stuff. Makes chips faster, makes processing faster, and keeps Moore's Law going.

Now IBM (with Sony) is rolling out what it calls Cell technology. This extends the dual core philosophy: a single chip that passes instructions to as many as eight processors at once. (Think of it as an editor chip in the "slot" of a computerized editing desk.) IBM says it can handle up to 10 instructions at one time.

All the speculation surrounding the Cell involves where it might go, and what it might do. (They're putting it first into Sony's Playstation 3, but it's listed as a PowerPC advance.)

The printer is in Moto, a Chicago restaurant, and it's programmed by executive chef Homaro Cantu. The paper is the same stuff you see on some birthday cakes, made of soybeans and cornstarch. The ink is edible, and the flavors are powders placed on the paper after it's printed. This means he can create a 10-course "tasting menu" that won't leave you bloated -- just well-read and out several Benjamins.

Cantu is making paper sushi and menus that can be crunched into his gazpacho for "alphabet soup."

The other day a colleague sent me a party invitation. The headline was "HP Plans Retirement Party for Moore's Law." (Real retirement parties, of course, feature lovely cakes like this specimen, from the Carolina Cake Co., Hilliard, Ohio.)

Moore's Law has been buried more often than Dracula, but like Elvis it keeps coming back.

As I've written, the exponential improvements Moore first revealed in silicon have been replicated in optical fiber, in hard drives, in radios, across the technological universe. And it shows no sign of ending.

In fact, the "Retirement Party" was a tongue-in-cheek reference to a new Hewlett-Packard technology that could extend the life of Moore's Law improvements many, many years.

It's called a crossbar latch and in theory it's just a circuit line crossed by two other lines. But it's capable of performing the same functions as a circuit etched in silicon, and when made on nanoscale, it's more efficient.

The key is that the size of the crossbar latch can scale down further than today's circuits. They can be made smaller, thinner, run closer together, and hence, create more circuit density, which is what Moore's Law is all about.

The significance of WiFi-cellular roaming lies in Always On applications.

Think about it. Cellular channels are relatively low in bandwidth, WiFi channels are high in bandwidth.

Now, you're wearing an application, like a heart monitor. When you're at home, or in your office, this thing can be generating, and immediately disgorging, tons and tons of data, detailed stuff that may be fun for your doctor to analyze later.
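A minimal sketch of that store-and-forward idea (the class and method names are hypothetical, not any real monitoring API):

```python
class HeartMonitor:
    """Buffer detailed readings while only a low-bandwidth cellular
    link is up; disgorge the whole backlog the moment a high-bandwidth
    WiFi link appears."""

    def __init__(self):
        self.pending = []

    def sample(self, reading, wifi_available):
        self.pending.append(reading)
        if wifi_available:
            batch, self.pending = self.pending, []
            return batch   # tons and tons of data, sent at WiFi speed
        return []          # over cellular, hold the detail for later
```

Out on the road, the readings just pile up; walk in the door at home or at the office, and they flush all at once for the doctor to analyze later.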

I've been re-reading the last in Harry Turtledove's Worldwar series, called Homeward Bound, and I'm once again struck by the similarities between the U.S. military in Iraq and the Lizards of the story.

The Lizards (not to give the story away) invade Earth in 1942, at the height of World War II. They have the weapons of 2000; Earth has what it had. The overall theme of the piece (which has now run into its seventh 500-page book) is human ingenuity vs. reliance on technology.

I don't know what they're thinking with this latest battle robot. (The picture, which I'm confident betrays no military secrets, is from the BBC.) But I'm pretty certain we're going to have some captured, disabled electronically and then grabbed under covering fire. The wireless link between the operator and the bot is the weak link.

Simply put, Moore's Law makes large productivity gains absolutely necessary. To compete in a Moore's Law world, you have to continually replace people with technology, and move folks' time into more productive tasks, or they fall behind.

This is true for individuals, for business, for government, for nations. It has very profound implications for all of us.

Over the last few weeks I've read a lot of commentary about the recent mobile phone health scares.

Much of it follows the industry line. Even on blogs, the tone seems dismissive. Case not proven, nothing to see here, move on.

But that's the wrong attitude to take. (The ostrich came from a financial planning site.) It's ignorant on how easy it would be to address valid concerns, and even improve the product at the same time.

What seems to matter is the power of the wave hitting your head, the distance between sensitive tissue and high frequency waves, and the duration of exposure. Stick a high-powered microwave brick next to an ear for 10 years or more, it seems, and something's going to fry.

But Moore's Law of Radios shows we don't need that much power. We're better off without it. Frequencies are used most efficiently when you have a lot of very low-power devices -- this lets you put more traffic in less space.
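A toy calculation (the numbers are illustrative, not from any real deployment) of why lots of low-power cells carry more traffic than one high-power tower:

```python
import math

def cells_in_area(area_km2, radius_km):
    # Each transmitter covers a rough disc; smaller discs mean the
    # same frequencies get reused more often in the same area.
    return area_km2 / (math.pi * radius_km ** 2)

one_big_tower = cells_in_area(100, 5.0)   # one high-power footprint
many_small = cells_in_area(100, 0.5)      # low-power, WiFi-scale cells
print(many_small / one_big_tower)         # roughly 100x more spectrum reuse
```

Cut the transmit radius by a factor of ten and the same slice of spectrum does a hundred times the work, with far less power next to anyone's head.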

As I've said before, separating the handset from the headset can also work wonders, not just from a health standpoint but from a user interface standpoint. A close friend of mine has had a Bluetooth headset on his ear for some weeks, and now he's hot to replace his phone with something that has more functionality (and costs more), something more like a PDA. This should be good news for the industry.

But by sticking our heads in the sand, by dismissing reports of health effects out of hand, rather than addressing what we can now, the industry is setting itself up for a nasty fall, and many unhappy jury returns.

Along with all their other implications, the mass adoption of mobile phones represents the first step in the single-chip era.

If you look inside the guts of your phone you are unlikely to find a big honking circuit board. (The circuit board illustration is from Sciencetechnologyresources.com.) Instead you will find one, two or three single chips performing major functions in an integrated way.

This is happening across-the-board in technology. We've gone from circuit boards in the 1980s to modules in the 1990s, to single chips. Just as early IBM PC add-in board producers created "multi-function cards" to assure a price worthy of retail distribution 20 years ago, so chip makers today put multiple functions on many chips, creating entire systems no bigger than a finger-nail.

NOTE: The following was published in this week's edition of my free e-mail newsletter, A-Clue.Com. You can get on the list here.

The Great Race has always been between tyranny and freedom, with order as tyranny's worthy handmaiden, and crime as freedom's ugly stepsister.

The triumph of liberty in the 20th century was basically a technological triumph. It was Moore's Law that did it. Moore's Law, and all its antecedents, changed the rules of the economic game, of the power game, and the balance between rulers and the ruled.

Moore's Law, the idea that things get better-and-better faster-and-faster, means that trained minds are the key to economic growth. Willing hands, the key to economic growth in the industrial age, matter far less than they did. Chains may keep trained hands working. They don't do so well with trained minds.

In America the result, as Dr. Richard Florida (left) wrote, was the rise of a new "Creative Class" that could dominate societies and drive economic growth. These were people, accused of wealth and guilty of education, whose values were intellectual and meritocratic, and (perhaps most important) were capable of economic satiation. Creative people have, on the whole, risen through Maslow's "hierarchy of needs," and are in search of self-actualization, not food or even luxury.

The idea is that you have a wireless network based on a scalable, robust operating system that can power real, extensible applications for home automation, security, medical monitoring, home inventory, and more.

As I wrote I often came back to Motorola and its CEO, Ed Zander. They would be the perfect outfit to do this, I wrote.

Little did I know (until now) but they did. A year ago.

It's called the MS1000.

The product was introduced at last year's CES, and re-introduced at various vertical market shows during the year. It's based on Linux, responds to OSGi standards, and creates an 802.11g network on which applications can then be built.

At this year's CES show, Motorola is pushing a home security solution based on the device, with 10 new peripherals like cameras and motion sensors that can be easily set up with the network in place, along with a service offering called ShellGenie.

Previously the company bought Premise, which has been involved in IP-based home control since 1999, and pushed a version of the same thing called the Media Station for moving entertainment around the home.

What should Motorola do now? Well, the platform is pretty dependent on having a home PC. The MS1000 could use space for slots so needed programs could be added as program modules. They need to look at medical and home inventory markets, not just entertainment and security.

But they've made an excellent start. And from here on out everyone else is playing catch-up.

Reuters today discusses this in terms of New Mexico, home to two Intel plants outside Albuquerque that make Pentium chips. But the problem is industrywide and worldwide. It's baked into the system. The fact is that etching chips requires the use of caustic chemicals that pollute the air and water.

Before you start thinking what's that for, imagine yourself in an airport facing a nasty business flight. Imagine if you could turn on your phone and watch that DVD you got from someone else for Christmas.

As hard drives become cheaper, faster, and capable of handling more files, we leave more files in them. I now have copies of all my music on my PC, and could quickly transfer most of it to an iPod. My son and daughter never clear their Internet cache any more, and my wife keeps e-mails going back months and months.

So what do the good people at the BBC call this? They call it digitally obese.

It's an interview with Ronald Chweng, chairman of Acer Technology Ventures. Acer is based in Taiwan which China calls a renegade province, and the U.S. once called the Republic of China. Chweng's charge is to find U.S. investments. He says there are plenty, but that the focus may be changing.

The idea, which is valid, is that through blogging ordinary communication becomes content. I know this is true because my own newsletter, a-clue.com, has been losing readers ever since I started blogging here. It's not just that readers prefer getting my thoughts through the blog instead of e-mail. It's that the one-week lag between my writing and your reading is eliminated by blogging. You're not just an audience here, you're practically reading over my shoulder as I type.

In watching how people use their devices, I have come to the conclusion that we're witnessing four separate evolutions of the user interface:

The laptop interface is the one most folks are comfortable with. It requires two hands, a lap, and full concentration. I'm using this now. Last weekend at Stanford I watched DeWayne Hendricks play with one of the best such interfaces, on his PowerMac, and while it's powerful, I was struck by how it's that last requirement -- full concentration -- that is its great weakness.

The PDA interface lets you stand up, but it requires two hands, and nearly as much concentration as the laptop interface. This may be why PDAs are so out of fashion right now. Even with a color screen and handwriting recognition you still need to stop and look at it for several seconds to get anything done.

The mobile telephone interface (right, from the BBC) requires just one hand and minimal concentration. Once you learn whatever tricks you need to learn in order to do whatever you most commonly do with a phone, you can keep one hand on the wheel (or on whatever's in your eye) and half a mind on your driving (or getting the dust out -- ouch) while getting full use of it. I do suspect more lives are now being lost on our roads each year to phones than to alcohol, but phones are far more addictive and I don't see how a ban could be enforced.

Michael Thomas launched a company some time ago to push the use of nanoelectronics in data storage. Hence its name: Colossal Storage Corp. (The image is from the company's Web site.)

Al Shugart is on his board, so you know these are serious storage folks.

For months he's been talking about 3.5 inch removable disks storing 10 Terabytes each. Blu-Ray disks, the most effective CD-type technology out there, can currently store, at most, 50 Gigabytes, so we're talking about improvements of more than two orders of magnitude.
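The claimed jump is easy to put in perspective:

```python
import math

colossal = 10 * 10**12   # 10 Terabytes per removable disk, as claimed
blu_ray = 50 * 10**9     # 50 Gigabytes, today's best optical disk

ratio = colossal / blu_ray
print(ratio)                        # 200.0
print(round(math.log10(ratio), 1))  # about 2.3 orders of magnitude
```

A 200-fold jump in one product generation would be astonishing even by Moore's Law standards.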

But it turns out the technology he's worked on can also be applied to displays.

The number of circuits that can be drawn on a given piece of silicon doubles every 18 months or so. (And that's its author, Gordon Moore, to the left. Note the high forehead, a sure sign of fierce intelligence and handsomeness.)

Or, to put it Dana's way, things get better and better faster and faster.

But we also need to remind ourselves of Moore's Second Law, which follows directly from his first and which may (fortunately) apply only to silicon.

The cost of doubling circuit density increases in line with Moore's First Law. In other words, when you go from 1 billion circuits on a chip to 2 billion, the cost of developing the plant to produce that latter chip also doubles.
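Compounded over a few generations, the two laws track each other exactly. A back-of-the-envelope sketch:

```python
# Normalized to today's chip and today's plant.
density, fab_cost = 1.0, 1.0

for generation in range(5):  # five ~18-month steps
    density *= 2    # Moore's First Law: circuits per chip double
    fab_cost *= 2   # Moore's Second Law: so does the plant to make them

print(density, fab_cost)  # 32.0 32.0 -- five doublings each
```

Which is why only a handful of companies can still afford to build leading-edge fabs.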

"You would like to think that public leaders are statesmen and have the country's best interests at heart," Barrett said. "We spend $25 billion on agriculture subsidies a year. Yet we spend $5 billion a year on basic research and engineering. Do you think agriculture is the industry of the future? You would like your government leaders to stand up and say something about that. I would like them to stand up and say something about it."

This is the conclusion of a United Nations study, whose census indicated there were 607,000 domestic robots in use at the end of 2003, 570,000 lawnmowers and 37,000 vacuum cleaners. (The illustration is from the adventures of Hubie and Bertie, two of the lesser-known Warner Brothers characters.)

But the prediction was pretty grand:

By the end of 2007, some 4.1 million domestic robots will likely be in use, the study said. Lawnmowers will still make up the majority, but sales of window-washing and pool-cleaning robots are also set to take off, it predicted.

In other words, general purpose "mechanical men" are still a long way off. We're building a host of small machines geared to specific tasks, something more of a Chuck Jones future than an Isaac Asimov one.

Nothing could be further from the truth, of course. But that's how the SCCM (So-Called Computer Media) or (if you're conservative) the MCM (Mainstream Computer Media) has it. And if Intel were running for something (which it is, namely your money) it has some explaining to do.

Now you may call this nuanced, but it's just the straight facts. There's more to Moore's Law than clock speed. Truth to tell there is more to Moore's Law than chip count. Yes, that was the focus of Moore's 1965 article, but the man was writing about the challenges of his time, he wasn't trying to be Nostradamus. (If he were, he wouldn't have written so plainly.)

Comcast has been tossing me price increases for years, but the latest back-door price increase will send my bill way north of $60/month. If I were using cable broadband I might think differently, but...

So I went to the store, and I learned something about Moore's Law. It works. Mass production and better electronics means you can buy a dish and install it yourself for just $50. A version with a DVR (like a TiVo) on it costs just $50 more.

You may think I credit Moore's Law with just about everything. Maybe I do.

But compute power makes many things possible. It accelerates the pace of all types of change. Even in materials science.

So here we have Integral Technologies and its conductive plastic ElectriPlast antenna. Plastic that conducts like metal means lighter conductors that can be molded into any shape. So we're talking about more than just antennas here.

NOTE: The above paragraph originally said the conference was in England, but Chris Potts corrected me. Also, the folks at Semacode deserve credit for extracting the slides and pointing them out to us.

First was a chart tracking the cost of making a smartphone over time, going back two years and forward six. (These are PDF files.) Even as smartphones get a lot better, they're also getting cheaper -- the bill of materials cost could be cut in half in four years.

One of the great things about Moore's Law is that it's multi-dimensional. It's not just that things get better-and-better, faster-and-faster. It's that you can create new revolutions by combining existing ones. (As in this example, from XTalk.)

When DVDs first came out in the late 1990s they were able to offer about 5 Gigabytes of permanent storage on a disk that could sell for $20, even including the cost of the content. At that time it was a big deal to have a 5 Gigabyte hard drive, and you paid through the nose for it.

Today drive storage prices are down to $1 per Gigabyte. And you can get this storage in any form factor you want. There are even hard drives in some mobile phones.

Not only has the price come down, but today's drives are sturdier than ever. Those dancers on the iPod commercials? Their music isn't skipping, as it might if they were holding CD players in their hands. (That's a subliminal point made in the marketing.) Remember tape back-up? Raise your hands if you still have it, or think you still need it.

The optical disk, meanwhile, has become a floppy. Sure you can buy a blank disk for as little as 60 cents, in quantity. (Fry's has a special, 50 for $30, with rebate.) But what do you get for that? A few hours of a movie, maybe a dozen albums. And it's still not as sturdy or portable as a hard drive device.

The reason hard drives have become cool while optics drool has something to do with Moore's Law, but also the process by which these two technologies march forward.

A team headed by Yang Wang at Boston College has found that an array of aligned, but randomly placed, carbon nanotubes (pictured, from Physics News) can act as an antenna for visible light. (The little scale bar on the right-side of the illustration is one micron in length.)

This could be used to create optical television or (more important I think) convert light directly into electricity. That had been one of the perceived promises of Buckyballs when Rice scientists first found them almost 20 years ago, but no one had come up with a method for making it happen until now.

I took my son to see it Sunday and while we both enjoyed it the film didn't draw applause.

No heart.

One early scene explains it all. The heroine, played by Gwyneth Paltrow, goes to Radio City Music Hall to meet a contact. While she and the contact huddle in the foreground, the background is the Radio City screen showing "The Wizard of Oz." (The film is set in a fictional 1939.) Just look at Judy Garland's face, reacting to the effect of Billie Burke's good witch arriving on a soap bubble, then compare it to what Paltrow is doing in the foreground.

And that's the most emotion Paltrow gives through the whole performance. (I blame the director, by the way. If actors in front of a blue screen aren't given proper instruction, none of them can get it right.)

The scientists were about to give up on their project when they decided to re-check their samples one more time, a month after making them. Turns out the samples took time to settle, and as time went by all their batches showed traces of magnetism.

History buffs will recall that Sun Microsystems began in the 1980s with one goal, to put the power of an engineer's minicomputer onto his desktop. (Picture from The Register.)

As Moore's Law proceeded this ideal seemed to die away. After all, couldn't Intel chips, running Windows, provide all the power the average engineer needed? Over time, in fact, Windows-based machines did indeed reach, and then exceed, the capabilities of SPARC-based Sun designs.

A dual-core chip is actually two chips in one, so products with them can run multiple tasks at the same time.

AMD is pushing the dual core idea, which it's been working on for five years, in part to extend the life of the 90 nm process technology. Dual core lets you boost performance without actually shrinking transistors, much as RISC technology did in an earlier time.

It's yet another example of how Moore's Law doesn't just apply to gate size (although that was what Moore was writing about in 1965). Moore's Law proceeds because exponential improvements can happen in many different directions. Even as we reach the limits of gate size (as we approach the atomic scale) new advances in the design of chips, in the material used to make chips, and in the way we put systems together will keep the improvements coming.

It's said that the creator of Pokemon got the idea from watching insects around his home, creating complex imaginary societies. This sort of thing is common in Japan, where space is tight, and so the eye is drawn to the small in order to imagine big things. (A lot of kids elsewhere do the same thing -- I did as a kid -- but we tend to grow out of it faster.)

Anyway this preoccupation is leading to real advances in robotics, here in the real world. And here's the latest. It's a tiny helicopter -- weighing less than half an ounce -- that can be programmed to take short flights and send back pictures.

The biggest problem we have with Moore's Law is that we think of it linearly. (That's the man himself, from the BBC.)

Chips get faster and faster. That's the short form of Moore's Law, as Moore himself wrote it back in the 1960s.

But there are ways for chips to get better other than by just getting faster. RISC made chips better. Low power technology makes chips better. FPGAs make chips better. Technologies like IBM's new "chip morphing" make chips better.

So I'm going to do something really big here. I'm going to re-state Moore's Law, for the 21st Century, as companies like Intel and IBM now understand it.

The price of stock in Intel fell after the company announced its earnings for the quarter had doubled. (That's not the real Intel symbol, by the way. It's something I found at a nifty Web company in Bristol, England.)

The reason: falling margins, rising inventories, and a prediction growth will slow in the second half.

In this era, Intel will try to move from being "just" a chipmaker to being a standards-setter, a la Microsoft. It will move from enabling the future to trying to define it.

The era starts with the three-chip project code-named Grantsdale. Otellini didn't talk speeds-and-feeds on introducing it. He talked about features: built-in Wi-Fi, support for new fault-tolerant storage, features that will define an Always-On world.

Barrett has a year left to run Intel before turning it over (most likely) to Paul Otellini. It's a reflective time. And in a recent talk with News.Com, he reflects on the "complacency" of America.

As Intel CEO this doesn't matter much to Barrett. The company can grow anywhere. But as an American it must upset him, especially since, before joining the company he was an assistant professor of materials science at Stanford. He's walked the walk of education.

Gordon Moore's 1965 prediction was based on the idea that we could shrink the size of components indefinitely. (Oh, and yes the title of this is a pun.)

If you limit your look at Moore to that one point, last week's announcement by Intel that it will change the way it looks at chips is, indeed, Gordon Moore's Last Sigh. (What, you haven't bought the book yet? What's wrong with you?) That method for increasing chip speeds is, henceforth, inoperative. (Picture of Gordon Moore from CNN.)

But Moore's Law is going to keep on keeping on. Moore's article cited a method for making chips faster, but Moore's Law itself was really a challenge to the industry, to keep those improvements going. Here's how the challenge will be met:

Moore's Law tells the electronics industry what it should hope to do, however it can do it, based on the idea that, in 1965, the goal of 100% improvement every 18 months looked achievable for some time to come.

What most people don't know is that, in many cases, and in many different areas of technology, engineers and scientists have been blowing Moore's timetable to smithereens.

This is a so-called Blu-Ray disc, using a blue laser beam which, because its wavelength is so short, doesn't read below the disc's surface, into its substrate. One way to translate that 25 GByte size, by the way, is to note that it's two hours of High Definition TV. Hi-def movies need Blu-Ray.
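The arithmetic behind that translation is simple enough to check (assuming a sustained stream; real HD bit rates vary with compression):

```python
capacity_bits = 25 * 10**9 * 8   # 25 GBytes, in bits
seconds = 2 * 60 * 60            # two hours of video
bitrate_mbps = capacity_bits / seconds / 10**6
print(round(bitrate_mbps, 1))    # 27.8 -- Mbps, sustained
```

That's far more than a red-laser DVD can deliver, which is the whole case for the blue laser.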

So the breakthrough here isn't just in the paper. Repeat, the breakthrough here isn't just in the paper.

The revolution is being launched by Nvidia, best known for its graphics chips, and a new line of chips it's going to ship soon called GeForce. In the words of Phil Carmack, vice president for handheld products, who spoke at the Mobile Entertainment Summit, we're talking about a thousand-fold improvement in common cell phones, starting next year.

At the CTIA show, of course, all the talk is about enabling content that wasn't really new in the late 1970s. The most popular game on cellphones remains bowling. Yes, there's talk about ring tones (10 second bursts of sound that play when your phone rings), which should be a huge business, and the Hollywood crowd was out in force this year, pawing for revenue. (The illustration, by the way, is from the Domestickers line, number 96 if you're keeping score at home.)