Month: February 2007

There are some interesting reflections on the recent Ideas Factory, Software Control of Matter, from the German journalist Niels Boeing, in a piece called Nano-Elvis vs Nano-Beatles. He draws attention to the irony that a research program with such a Drexlerian feel had as its midwife someone like me, who has been such a vocal critic of Drexlerian ideas. The title comes from an analogy which I find very flattering, if not entirely convincing – roughly translated from the German, he says: “It’s intriguingly reminiscent of the history of pop music, which developed by a transatlantic exchange. The American Elvis began things, but it was the British Beatles who really got the epochal phenomenon rolling. The solo artist Drexler launched his vision on the world, but in practice the crucial developments could be made by a British big band of researchers. We have just one wish for the Brits – keep on rocking!” Would that it were so.

In other media, there’s an article by me in the launch issue of the new nanotechnology magazine from the UK’s Institute of Nanotechnology – NanoNow! (PDF, freely downloadable). My article has the strap-line “Only Skin Deep – Cosmetics companies are using nano-products to tart up their face creams and sun lotions. But are they safe? Richard A.L. Jones unmasks the truth.” I certainly wouldn’t claim to unmask the truth about controversial issues like the use of C60 in face-creams, but I hope I managed to shed a little light on a very murky and much discussed subject.

Ball lightning is an odd and obscure phenomenon; reports describe glowing globes the size of footballs, which float along at walking speed, sometimes entering buildings, and whose existence sometimes comes to an end with a small explosion. Observations are generally associated with thunderstorms. I’ve never seen ball lightning myself, though when I was a physics undergraduate at Cambridge in 1982 there was a famous sighting in the Cavendish Laboratory itself. This rather elusive phenomenon has generated a huge range of potential explanations, ranging from the exotic (anti-matter meteorites, tiny black holes) to the frankly occult. But there seems to be growing evidence that ball lightning may in fact be the manifestation of slowly combusting, loose aggregates of nanoparticles formed by the contact of lightning bolts with the ground.

The idea that ball lightning consists of very low density aggregates of finely divided material originates with a group of Russian scientists. A pair of scientists from New Zealand, Abrahamson and Dinnis, showed some fairly convincing electron micrographs of chains of nanoparticles produced by the contact of electrical discharges with the soil, as reported in this 2000 Nature paper (subscription required for full paper). Abrahamson’s theory is also described in this news report from 2002, while a whole special issue of the Royal Society’s journal Philosophical Transactions from that year puts the Abrahamson theory in context with the earlier Russian work and the observational record. The story is brought up to date with some very suggestive looking experimental results reported a couple of weeks ago in the journal Physical Review Letters, in a letter entitled Production of Ball-Lightning-Like Luminous Balls by Electrical Discharges in Silicon (subscription required for full article), by a group from the Universidade Federal de Pernambuco in Brazil. In their very simple experiment, an electric arc was made with a silicon wafer, in ambient conditions. This produced luminous balls, from 1 to 4 cm in diameter, which moved erratically along the ground, sometimes squeezing through gaps, and disappeared after 2 to 5 seconds leaving no apparent trace. Their explanation is that the discharge created silicon nanoparticles which aggregated to form a very open, low density aggregate, and subsequently oxidised to produce the heat that made the balls glow.

The properties of nanoparticles which make this explanation at least plausible are fairly familiar. They have a very high surface area, and so are substantially more reactive than their parent bulk materials. They can aggregate into very loose, fractal, structures whose effective density can be very low (not much greater, it seems in this case, than air itself). And they can be made by a variety of physical processes, some of which are to be found in nature.
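To see just how steeply surface area rises as particles shrink, here is a rough back-of-the-envelope calculation for idealised spherical particles. This is my own illustrative sketch, not taken from any of the papers cited above, and the density used is an assumed, silica-like value:

```python
import math

def specific_surface_area(radius_m, density_kg_m3):
    """Surface area per unit mass for a solid sphere, in m^2 per gram."""
    area = 4 * math.pi * radius_m ** 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    mass_g = volume * density_kg_m3 * 1000  # kg -> g
    return area / mass_g

# Assumed silica-like density of ~2200 kg/m^3, purely for illustration
for radius_nm in (10_000, 100, 10):
    ssa = specific_surface_area(radius_nm * 1e-9, 2200)
    print(f"radius = {radius_nm:>6} nm: {ssa:8.2f} m^2/g")
```

For a sphere the ratio works out as 3/(radius × density), so going from a 10 micron particle to a 10 nm one multiplies the surface area per gram a thousandfold – from a fraction of a square metre per gram to well over a hundred.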

Al Gore visited Sheffield University yesterday, so I joined the growing number of people round the world who have seen his famous Powerpoint presentation on global warming (to be accurate, he did it in Keynote, being a loyal Apple board member). As a presentation it was undoubtedly powerful, slick, sometimes moving, and often very funny. His comic timing has clearly got a lot better since he was a Presidential candidate, even though some of his jokes didn’t cross the Atlantic very effectively. However, it has to be said that they worked better than the efforts of Senator George Mitchell, who introduced him. It is possible that Gore’s rhetorical prowess was even further heightened by the other speakers who preceded him; these included a couple of home-grown politicians, a regional government official and a lawyer, none of whom were exactly riveting. But, it’s nonetheless an interesting signal that this event attracted an audience of this calibre, including one government minister and an unannounced appearance by the Deputy Prime Minister.

Since a plurality of the readers of this blog are from the USA, I need to explain that this is one way in which the politics of our two countries fundamentally differ. None of the major political parties doubts the reality of anthropogenic climate change, and indeed there is currently a bit of an auction between them about who takes it most seriously. The ruling Labour Party commissioned a distinguished economist to write the Stern Report, a detailed assessment of the potential economic costs of climate change and of the cost-effectiveness of taking measures to combat it, and gave Al Gore an official position as an advisor on the subject. Gore’s UK apotheosis has been made complete by the announcement that the government is to issue all schools with a copy of his DVD “An Inconvenient Truth”. This announcement was made, in response to the issue of the latest IPCC summary for policy makers (PDF), by David Miliband, the young and undoubtedly very clever environment minister, who is often spoken of as being destined for great things in the future, and has been recently floating some very radical, even brave, notions about personal carbon allowances. The Conservatives, meanwhile, have demonstrated their commitment to alternative energy by their telegenic young leader David Cameron sticking a wind-turbine on top of his Notting Hill house. It’s gesture politics, of course, but an interesting sign of the times. The minority third party, the Liberal Democrats, believe they invented this issue long ago.

What does this mean for the policy environment, particularly as it affects science policy? The government’s Chief Scientific Advisor, Sir David King, has long been a vocal proponent of the need for urgent action on energy and climate. Famously, he went to the USA a couple of years ago to announce that climate change was a bigger threat than terrorism, to the poorly concealed horror of a flock of diplomats and civil servants. But (oddly, one might think), Sir David doesn’t actually directly control the science budget, so it isn’t quite the case that the entire £3.4 billion (i.e., nearly $7 billion) will be redirected to a combination of renewables research and nuclear (which Sir David is also vocally in favour of). Nonetheless, one does get the impression that a wall of money is just about to be thrown at energy research in general, to the extent that it isn’t entirely obvious that the capacity is there to do the research.

One of the most talked-about near term applications of nanotechnology is in nanosensors – devices which can detect the presence of specific molecules at very low concentrations. There are some obvious applications in medicine; one can imagine tiny sensors implanted in one’s body, which continuously monitor the concentration of critical biochemicals, or the presence of toxins and pathogens, allowing immediate corrective action to be taken. A paper in this week’s edition of Nature (editor’s summary here, subscription required for full article) reports an important step forward – a nanosensor made using a process that is compatible with the standard methods for making integrated circuits (CMOS). This makes it much easier to imagine putting these nanosensors into production and incorporating them in reliable, easy to use systems.

The paper comes from Mark Reed’s group at Yale. The fundamental principle is not new – the idea is that one applies a voltage across a very thin semiconductor nanowire. If molecules adsorb at the interface between the nanowire and the solution, there is a change in electrical charge at the interface. This creates an electric field which has the effect of changing the electrical conductivity of the nanowire; the amount of current flowing through the wire then tells you about how many molecules have stuck to the surface. By coating the surface with molecules that specifically stick to the chemical that one wants to look for, one can make the sensor specific for that chemical. Clearly, the thinner the wire, the more effect the surface has in proportion, hence the need to use nanowires to make very sensitive sensors.
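A crude way to see the size argument is to suppose that adsorbed surface charge depletes carriers in a shell of roughly fixed depth around the exposed surfaces of the wire; the fractional change in conductance then scales with the fraction of the cross-section that shell occupies. This is a toy model of my own, not the analysis in the paper, and the 2 nm depletion depth is an arbitrary illustrative assumption:

```python
def fractional_conductance_change(width_nm, thickness_nm, depletion_nm=2.0):
    """Toy estimate: fraction of a rectangular wire's cross-section taken
    up by a surface depletion layer of assumed fixed depth, counting the
    top face and both side faces as exposed to the solution."""
    area = width_nm * thickness_nm
    shell = depletion_nm * (width_nm + 2 * thickness_nm)
    return min(shell / area, 1.0)

# The paper's small sensor: 50 nm wide, 25 nm thick
print(fractional_conductance_change(50, 25))      # 0.16
# A micron-scale wire, for comparison
print(fractional_conductance_change(1000, 500))   # 0.008
```

Even this cartoon shows why the nanoscale matters: the same adsorbed layer that modulates a 50 nm wire’s conductance by double-digit percentages is lost in the noise for a micron-scale wire.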

In the past, though, such nanowire sensors have been made by chemical processes, and then painstakingly wired up to the necessary micro-circuit. What the Reed group has done is to devise a way of making the nanowire in-situ on the same silicon wafer that is used to make the rest of the circuitry, using the standard techniques that are used to make microprocessors. This makes it possible to envisage scaling up production of these sensors to something like a commercial scale, and integrating them into a complete electronic system.

How sensitive are these devices? In a test case, using a very well known protein-receptor interaction, they were able to detect a specific protein at a concentration of 10 fM – that translates to 6 billion molecules per litre. As expected, small sensors are more sensitive than large ones; a typical small sensor had a nanowire 50 nm wide and 25 nm thick. From the published micrograph, the total size of the sensor is about 20 microns by 45 microns.
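The quoted figure is easy to check with Avogadro’s number – a one-line conversion from molar concentration to molecules per litre:

```python
AVOGADRO = 6.022e23  # molecules per mole

concentration_fM = 10                        # femtomolar, as reported
concentration_M = concentration_fM * 1e-15   # femtomolar -> mol/L

molecules_per_litre = concentration_M * AVOGADRO
print(f"{molecules_per_litre:.2e} molecules per litre")  # ~6.0e+09
```

Ten femtomolar does indeed come out at about 6 billion molecules per litre – which sounds like a lot until you remember that a litre of water contains some 3 × 10²⁵ water molecules.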