Observers have been waiting for carbon nanotubes, buckyballs, and graphene to transform the world for quite some time, and the wait has been longer than expected. Enthusiasm for these new miracle materials has all but vanished. Is this warranted? Where does innovation really stand in the various forms of carbon that could yield ultra-strong, ultra-light materials and superfast computing?

Ultra-dense computing and storage : Graphene transistors smaller than 1 nanometer have been demonstrated. Carbon allotropes could keep the exponential doubling of both computing and storage capacity going well into the 2030s.

Carbon Fiber Vehicles : This lightweight, ultrastrong material can save vast amounts of fuel by reducing the weight of cars and airplanes. While premium products such as the $6000 Trek Madone bicycle are already made from carbon fiber, greater volume is driving prices down and will soon make the average car much lighter than it is today, increasing fuel efficiency and reducing traffic fatalities.

Energy Storage : Natural gas is not only much cheaper than oil per unit of energy (oil would have to drop to about $30 per barrel to match current natural gas prices), but the supply of natural gas is also more evenly distributed across the world than the oil supply. The US alone has enormous reserves that could ensure total energy independence. The main problem with natural gas is storage, which is the primary reason it is not displacing oil more rapidly. But microporous carbon can act as a sponge for natural gas, enabling safe and easy transport. This could change the entire energy map.
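The $30 figure can be reproduced with simple energy-content arithmetic. Below is a minimal sketch; the 5.8 MMBtu-per-barrel conversion is a standard approximation, but the $5/MMBtu natural gas price is an illustrative assumption, not a figure from this post:

```python
# Energy-price parity between crude oil and natural gas.
# Assumptions (illustrative round numbers):
#   - one barrel of crude oil contains roughly 5.8 MMBtu of energy
#   - natural gas trades near $5 per MMBtu
MMBTU_PER_BARREL = 5.8
ng_price_per_mmbtu = 5.0  # $/MMBtu, assumed

# Oil price at which a barrel delivers energy at the same cost as gas
oil_parity_price = ng_price_per_mmbtu * MMBTU_PER_BARREL
print(f"Oil matches gas on energy cost at about ${oil_parity_price:.0f} per barrel")
```

At an assumed $5/MMBtu, parity comes out near $29 per barrel, consistent with the rough $30 cited above.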

There are other applications beyond these core three, but suffice it to say, the allotropes of carbon can perform a greater variety of functions than any other material available to us today. Watch for carbon allotropes popping up in the strangest of places, and know that each new use drives the cost ever lower.

Each year, I post a roundup of technology breakthroughs for that year from the MIT Technology Review, and I now present the 2008 edition.

2008 was a year of unusually low technological innovation. This is not merely the byproduct of the economic recession, as some forms of innovation actually quicken during a recession. Furthermore, the innovations from 2006 and 2007 (linked below) showed very little additional progress in 2008, except in the field of energy. This also confirms my observation from February 2008 that technology diffusion appears to be in a lull.

What is conspicuously absent is any article titled 'The Year in Nanotechnology'. Both 2006 and 2007 had such articles, but the absence of a 2008 version speaks volumes about how little innovation took place in 2008. The entire field of nanotechnology was lukewarm.

Most of the innovations in the articles above are in the laboratory phase, which means that about half will never progress enough to make it to market, and those that do will take 5 to 15 years to directly affect the lives of average people (remember that the laboratory-to-market transition period itself continues to shorten in most fields).

The Year in Nanotechnology : Stanford University research into nanowires that dramatically increase battery capacity is the most promising breakthrough of 2007, in any discipline. Think 30-hour laptop batteries.

But each one of these breakthroughs has world-changing potential, and the fact that so many fields are advancing simultaneously guarantees a massive new wave of improvements to human lives.

This scorching pace of innovation is entirely predictable, however. To internalize the true rate of technological progress, one merely needs to appreciate :

We are fortunate to live in an age when a single calendar year will invariably yield multiple technological breakthroughs, the details of which are easily accessible to laypeople. In the 18th century, entire decades would pass without any observable technological improvements, and people knew that their children would experience a lifestyle identical to their own. Today, we know with certainty that our lives in 2008 will contain slight but distinct and numerous technological improvements over 2007, just as 2007 improved over 2006.

On September 28, 2006, I made the case that telescopic power is indeed an accelerating technology, set to improve at an estimated rate of 26% a year for the next 30 years. I believe that increasingly powerful telescopes will ensure that we discover the first genuinely Earth-like planet in another star system by 2011, and that by 2025, we will have discovered thousands of such planets.

The mirror is a pool of salt-based liquids that freeze only at very low temperatures, coated with a silver film. While practical usage is at least 20 years away, the details reveal a technology that is brilliantly simple, yet tantalizingly capable of addressing almost all of the problems facing the construction of giant telescopes. Glass mirrors are exceedingly difficult to scale to larger sizes, and even the most minor defect can render a mirror useless. Reflective liquid, by contrast, can be scaled up almost indefinitely, limited only by the perimeter of the enclosure it is placed in. External blows that would crack or scratch a glass mirror have no lasting effect on a liquid, which quickly returns to its original shape.

I don't expect updates on this technology in the near future, but the next logical step would be a demonstration of the technology in a smaller telescope. If that succeeds, the ultimate goal would be, by 2030, a massive telescope more than 200 meters in diameter placed on the Moon, where the sky is free of atmospheric distortion and the ground is free of the tiny seismic tremors that plague Earth-based observatories. This would enable us to observe Earth-like planets at distances of up to 100 light years, as well as individual stars near the center of the Milky Way galaxy (30,000 light years away).

As we have seen before, technological change follows a smooth, exponential curve, with a relatively predictable rate of progress. However, the fruits of this change accrue to those individuals, corporations, and most importantly, those nations which position themselves to benefit from it. This requires funding for even the earliest stages of the process, where the return is going to be highly uncertain.

The US remains the foremost source of scientific breakthroughs and technological innovations, partly due to our willingness to fund basic research at the federal level and to sustain institutions that can productively use these funds. The $140 billion the US spends on federal R&D is greater than the total nominal GDP of all but 35 countries. But are we funding research enough, and in the correct way?

First, let us take note of federal R&D expenditures as a percentage of GDP (which is the only way to accurately measure it).

It appears that President Clinton trimmed R&D each year he was in office, until it fell from 1.2% to just 0.8% of GDP. The brief budget surplus he took credit for near the end of his term was largely achieved at the expense of R&D expenditures. Had he maintained R&D at the same percentage of GDP throughout his Presidency, there would have been no surplus. Now tell me, which outcome would you rather have had?

President Bush increased R&D funding and got it back to historical levels in his first term. However, his second term has brought another slight downward trend, which I hope is reversed in the next few years. He is still keeping it higher than it was for most of the Clinton years, however.

Some have argued that funding should gradually become an increasingly larger percentage of GDP, as technological changes continue to affect a greater share of the US economy. I might agree, but I also feel that since US scientific innovations lead to products and services that benefit every nation in the world, other prosperous nations should also be obligated to fund basic research in their own countries. Europe and China spend much less than the US, as a percentage of GDP, and it is time they contributed more to the advancement of human knowledge, rather than simply benefiting from US resources.

As far as which fields of science are being funded and which are not, the next chart provides the answer. Basic research (the red line in the first chart) is broken out by agency.

It does appear that President Bush has reduced the funding of NASA, but he has increased NIH funding tremendously. Even defense research has not risen over the past several years, contrary to popular belief. Energy research has risen slowly over the past 30 years, but there has been no major boost to the DoE at any point.

There you have it - the state of the seeds that will become the fruits we reap years and even decades hence. Is it enough? How do we measure whether one agency uses a dollar more effectively than another? Should we be spending this much when other countries spend far less and simply wait for our breakthroughs? Can more be done with the same funding dollars?

I happened to come across this post, which displays the author's selections of the top astronomical photographs of 2006. The one I am particularly stunned by is #5, the transit of the Space Shuttle and International Space Station in front of the Sun. The precise timing needed to execute this image is mind-boggling, with odds of success probably less than one in a million. The photographer, Thierry Legault, had to 1) know when the shuttle was approaching the ISS, 2) know when both of them would be in front of the sun relative to his location in France, which was a zone of observation only 7.4km wide, and 3) get this image in the 0.6 seconds of the transit duration.
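The sub-second transit window can be sanity-checked with basic orbital geometry. This is a rough sketch using assumed textbook round numbers for the ISS altitude, orbital speed, and the Sun's apparent diameter (none of these values appear in the post):

```python
import math

# Rough estimate of how long the ISS takes to cross the Sun's disc.
# Assumed round numbers:
altitude_km = 400.0        # typical ISS altitude
orbital_speed_kms = 7.7    # typical ISS orbital speed
sun_diameter_deg = 0.53    # apparent diameter of the Sun from Earth

# Angular speed of the ISS for an observer roughly beneath it, in deg/s
angular_speed_deg = math.degrees(orbital_speed_kms / altitude_km)

transit_seconds = sun_diameter_deg / angular_speed_deg
print(f"Estimated transit duration: {transit_seconds:.1f} seconds")
```

This lands near half a second, in the same ballpark as the 0.6 seconds Legault had to work with; the exact duration depends on the viewing angle and the slant range to the station.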

Not all of these ten photographs are exclusively the result of instruments and technologies that did not exist a few years ago, but 3 to 4 of them are. As we have discussed before, telescopic power is also an accelerating technology, and increasingly impressive images will continue to emerge as new telescopes and supporting resources become operational.

Little can match an astronomical discovery's ability to generate wonder, optimism, and a general good mood. We shall see, within just the next couple of decades, images that even the late Carl Sagan would have been in awe of.

We are fortunate to live in an age when a single calendar year will invariably yield multiple technological breakthroughs, the details of which are easily accessible to laypeople. In the 18th century, entire decades would pass without any observable technological improvements, and people knew that their children would experience a lifestyle identical to their own. Today, we know with certainty that our lives in 2007 will have slight but distinct and numerous improvements in technological usage over 2006.

The first telescope used for astronomical purposes was built by Galileo Galilei in 1609, after which he discovered the 4 large moons of Jupiter. The rings of Saturn were discovered by Christiaan Huygens in 1655, with a telescope more powerful than Galileo's. Consider that the planet Uranus was not detected until 1781, similar-sized Neptune not until 1846, and Pluto not until 1930. That these discoveries were decades apart indicates the rate of progress in the 17th, 18th, 19th, and early 20th centuries.

The first extrasolar planet was not detected until 1995, but since then, hundreds more with varying characteristics have been found. In fact, some of the extrasolar planets detected are as small as Neptune. So while an object of Neptune's size in our own solar system (4 light-hours away) remained undetected from Earth until 1846, we are now finding comparable bodies in star systems 100 light years away. This wonderful, if slightly outdated, chart provides details of extrasolar planet discoveries.

The same goes for observing stars themselves. Many would be surprised to know that humanity had never observed a star (other than the sun) as a disc rather than a mere point of light, until the Hubble Space Telescope imaged Betelgeuse in the mid 1990s. Since then, several other stars have been resolved into discs, with details of their surfaces now apparent.

So is there a way to string these historical examples into a trend that projects the future of what telescopes will be able to observe? The extrasolar planet chart above seems to suggest that in some cases, the next 5 years will have a 10x improvement in this particular capacity - a rate comparable to Moore's Law. But is this just a coincidence or is there some genuine influence exerted on modern telescopes by the Impact of Computing?

Many advanced telescopes, both orbital and ground-based, are in the works as we speak. Among them are the Kepler Space Observatory, the James Webb Space Telescope, and the Giant Magellan Telescope, which all will greatly exceed the power of current instruments. Slightly further in the future is the Overwhelmingly Large Telescope (OWL). The OWL will have the ability to see celestial objects that are 1000 times as dim as what the Hubble Space Telescope (HST) can observe, and 5 trillion times as faint as what the naked eye can see. The HST launched in 1990, and the OWL is destined for completion around 2020 (for the moment, we shall ignore the fact that the OWL actually costs less than the HST). This improvement factor of 1000 over 30 years can be crudely annualized into a 26% compound growth rate. This is much slower than the rate suggested in the extrasolar planet chart, however, indicating that the rate of improvement in one aspect of astronomical observation does not automatically scale to others. Still, approximately 26% a year is hugely faster than progress was when it took 65 years after the discovery of Uranus to find Neptune, a body with half the brightness. 65 years for a doubling is a little over 1% a year improvement between 1781 and 1846. We have gone from having one major discovery per century to having multiple new discoveries per decade - that is quite an accelerating curve.
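The annualization used above is a one-line compound-growth calculation; here is a minimal sketch of the same arithmetic:

```python
def annualized_rate(total_factor, years):
    """Compound annual growth rate implied by a total improvement factor."""
    return total_factor ** (1.0 / years) - 1.0

# OWL vs. HST: roughly 1000x more sensitivity over the ~30 years from 1990 to 2020
print(f"OWL vs. HST: {annualized_rate(1000, 30):.0%} per year")          # ~26%

# Neptune vs. Uranus: an object half as bright, found 65 years later (1781 -> 1846)
print(f"Neptune vs. Uranus era: {annualized_rate(2, 65):.1%} per year")  # ~1.1%
```

The same formula applies to any of the improvement factors discussed here; only the total factor and the time span change.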

We can thus predict with considerable confidence that the first Earth-like planet will make headlines in 2010 or 2011, and by 2023, we will have discovered thousands of such planets. This means that by 2025, a very important question will receive considerable fuel on at least one side of the debate...

The 2006 edition of the Nanotech Report from Lux Research was published recently. This is something I make a point to read every year, even if only a brief summary is available for free.

Some of the key findings worth noting :

1) Nanotechnology R&D reached $9.6 billion in 2005, up 10% from 2004. This headline growth is unremarkable, given that the world economy grew 7-8% in nominal terms in 2005. But within that total, corporate R&D and venture capital grew 18% in 2005, to hit $5 billion. This means that many technologies are finally graduating from basic research laboratories into products, and that investment in nanotechnology is now possible. This also confirms my estimate that the inflection point of commercial nanotechnology was in 2005.

But a deeper concept worth internalizing is how an extension of the Impact of Computing will manifest itself. If the quality of nanotechnology per dollar increases at the same 58% annual rate as Moore's Law (a modest assumption), combining this qualitative improvement with dollar growth of 64% a year yields an effective Impact of Nanotechnology of 1.58 x 1.64 = 2.59, or roughly 160% per year. As the base gets larger, this will become very visible.
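The combined figure is simply the product of the two growth factors; a sketch using the assumed rates from the paragraph above:

```python
quality_rate = 0.58  # Moore's-Law-style price-performance improvement (assumed)
dollar_rate = 0.64   # annual growth in nanotech spending (assumed)

# The total annual multiplier is the product of the two growth factors
combined_factor = (1 + quality_rate) * (1 + dollar_rate)
print(f"Effective annual growth: {combined_factor - 1:.0%}")  # ~159%
```

Growth rates compound multiplicatively, not additively, which is why the combined figure is well above the 58% + 64% = 122% a naive sum would give.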

3) Nanotech-enabled products on the market today command a price premium of 11% over traditional equivalents, even if the nanotechnology is not directly noticed.