Posted
by
Unknown Lamer
on Monday October 07, 2013 @08:12PM
from the wait'll-the-hippies-learn-it's-nuclear dept.

mysqlbytes writes "The BBC is reporting the National Ignition Facility (NIF), based at Livermore in California, has succeeded in breaking even — 'During an experiment in late September, the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel — the first time this had been achieved at any fusion facility in the world.'"

In 2009, NIF officials announced an aim to demonstrate nuclear fusion producing net energy by 30 September 2012. But unexpected technical problems meant the deadline came and went: the fusion output was less than mathematical models had originally predicted.

Soon after, the $3.5bn facility shifted focus, cutting the time spent on fusion in favour of nuclear weapons research, which was part of the lab's original mission.

However, the latest experiments agree well with predictions of energy output, which will provide a welcome boost to ignition research at NIF, as well as encouragement to advocates of fusion energy in general.

Except that the news is not on their website yet (maybe the people who update it are "non-essential government personnel"). The shot they're talking about in your link consumed 1.7 MJ and yielded 8 kJ, which is a far cry from what is claimed on the BBC website. As I understood it, the shot also wasn't aimed at maximizing energy yield.
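For a rough sense of the gap, taking the 1.7 MJ in / 8 kJ out figures quoted above at face value:

```python
# Back-of-the-envelope check of the shot figures quoted above:
# 1.7 MJ of laser energy in, 8 kJ of fusion yield out.
laser_in_j = 1.7e6    # laser energy delivered, joules
fusion_out_j = 8e3    # fusion yield, joules

ratio = fusion_out_j / laser_in_j
print(f"yield/input = {ratio:.3%}")  # well under 1% of the laser energy
```

So measured against the laser energy, that shot returned less than half a percent, which is why the "breakeven" framing hinges entirely on counting only the energy absorbed by the fuel.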

Time flows the same in England as it does in the US, and they get the information at the same instant as the US (barring marginal transmission delays). If it were a case of hours and timezones, I might agree with you somewhat, but as the freakin' summary quotes: "During an experiment in late September" (emphasis mine).

Even assuming that means September 30th, that's 7 days the US press has had to sit on this. At that point, the fact that the UK is 5-7 hours ahead doesn't make an iota of difference (well, technically I guess it makes a 4.1666% difference, but that's hardly the point).
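For anyone checking the joke: the 4.1666% figure falls out of a 7-hour timezone lead over a 7-day window:

```python
# A 7-hour timezone lead as a fraction of the 7 days (168 hours)
# the press supposedly sat on the story.
lead_hours = 7
window_hours = 7 * 24

print(f"{lead_hours / window_hours:.4%}")  # ≈ 4.17%
```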

The *experiment* was in late September. Researchers tend to be rather cautious about announcing significant milestones, especially in high-profile areas such as this, taking time to double-check their numbers and the like beforehand. I can easily see the process taking a few days or weeks before they're ready to make a statement.

As for getting the information the same instant the world over - how exactly do you see that happening? The scientists send a press release to (presumably) a small number of news organizations (the BBC probably being one of them). All other organizations hear about it second-hand, likely meaning at least a fair portion of a day, possibly several days, before it's published, and another delay before anyone else can publish anything more than blatant plagiarism. Repeat that a few times before it hits some other news stream that you watch and...

Sure, the info probably went up on the researchers' website about the same time as the press release, and that is available to everyone everywhere, but I would suspect that very few people routinely check the websites of random researchers on a daily basis. After all, it's not something important like the latest celebrity scandal[/sarcasm]; it won't make any difference to most people if they don't hear about it for a few days.

That was more of an accident than military-funded research. Sulphur got dropped on some hot rubber, and the result was found to be tougher. From there the military stepped in, but the patent came well before the military thought about getting involved (before the Civil War, IIRC).

On the other hand, an institution that is regularly criticized by folks like Dr. Ben Goldacre of http://www.badscience.net/ [badscience.net] and Prof. Mark Liberman of Language Log for the incredibly poor quality of their science reporting may not be the source you really want to trust on this or any other topic.

Granted, few general-purpose news sources are particularly good when it comes to their coverage of science, but the BBC does have a bit of a reputation for being above average, a reputation which seems to be rather undeserved, as far as I can tell.

Tokamaks are far closer to practicality. In 1997, JET achieved 16 MW of fusion power with 24 MW of heating. ITER will almost certainly achieve much greater than breakeven. The goal is Q=10, where Q is fusion power/input power.
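Sketching the Q arithmetic from that JET shot (the numbers are the ones quoted above):

```python
def q_factor(fusion_power_mw: float, heating_power_mw: float) -> float:
    """Fusion gain Q: fusion power out divided by heating power in."""
    return fusion_power_mw / heating_power_mw

# JET's 1997 record: 16 MW of fusion power from 24 MW of heating.
jet_q = q_factor(16, 24)
print(f"JET Q ≈ {jet_q:.2f}")  # ≈ 0.67, still below breakeven (Q = 1)
# ITER's design goal, by comparison, is Q = 10.
```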

It's 5 to 8 hours later in England than it is here. They've had a few more hours to report on it than we have.

Uhh, what Earth do you live on? On mine the sun rises in the East and sets in the West... putting the UK 5 to 8 hours AHEAD of the USA.

Same Earth, you're both correct. The UK is ahead of the US, which makes the time later there. So, 12:00pm in the US is 5:00pm in the UK (except when it's 4:00pm... damn you DST), which is 5 hours later.

When was the NIF project initiated? I found a "funding confirmation" in 1993, but not when the project itself was started. But if it was less than 5 years earlier, then ITER had a head start bureaucracy-wise.

NIF construction itself started in 1997, while ITER's started in 2008. So if you ignore the time spent on bureaucracy, NIF has had an 11-year head start. But I think the most interesting comparison is not planning time or construction time, but results per unit of time after the facility opens. That will have to wait 7 more years, though.

That's why we speak of "physical break-even", which was, according to some definitions, reached here, and "technical break-even", which takes into account the efficiency of the whole system and compares power in with usable electricity out.

You're not delusional. JT-60 in Japan sort of reached breakeven, but with one hell of a caveat: JT-60 only uses D-D fuel, but it achieved conditions in the plasma such that if the D-D fuel was replaced with D-T fuel, it would have achieved Q=1.25.

What's delusional is the notion that ICF can ever be a commercial source of fusion power. Even after you squint and wave your hands and say "We reached break-even, if you count only the energy absorbed by the fuel," you need to realize the huge inefficiencies at every step along the chain. Conversion of electricity into laser energy is really inefficient. The IR lasers are frequency-converted into UV beams, a process which is only 50% efficient. And only about 10% of *that* actually goes into compressing the fuel.
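To put rough numbers on that chain: the 50% UV conversion and ~10% coupling figures are from the text above, while the 1% wall-plug figure for the lasers is an illustrative assumption on my part (the text only says "really inefficient"):

```python
# Rough efficiency chain for the laser path described above.
wall_plug_to_ir = 0.01    # electricity -> IR laser light (assumed, illustrative)
ir_to_uv = 0.50           # IR -> UV frequency conversion (from the text)
uv_to_compression = 0.10  # fraction of UV that actually compresses the fuel

overall = wall_plug_to_ir * ir_to_uv * uv_to_compression
print(f"electricity -> fuel compression: {overall:.3%}")
```

With those numbers, only about 0.05% of the wall-plug energy ever reaches the fuel, which is the gulf between "fuel-only" breakeven and technical breakeven.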

And that fuel is frozen D-T contained within a copper-doped beryllium capsule that needs to be spherical to micron tolerances, and the surfaces of that sphere need to be smooth to *nanometer* tolerances. The beryllium must be precisely 150 microns thick, and a 5-micron hole is laser-drilled through it. The capsule in turn rests within an equally precisely made hohlraum made of a gold/uranium alloy. Each one of these precision assemblies costs tens of thousands of dollars to make, and assembly of the various parts must also be done to micron tolerances. And out of this, if fusion works perfectly and every bit of the fuel is used, you can expect a maximum possible energy output of 45 megajoules. That's 12.5 kilowatt-hours of energy; if you can manage the miraculous feat of converting that back into electricity at 100% efficiency, you could sell that electricity for about $1.25.
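The $1.25 figure checks out, assuming roughly $0.10/kWh for the electricity price (the text doesn't state one):

```python
MJ_PER_KWH = 3.6  # 1 kWh = 3.6 MJ by definition

yield_mj = 45.0       # max possible yield per target (from the text)
price_per_kwh = 0.10  # assumed electricity price, USD/kWh

kwh = yield_mj / MJ_PER_KWH
print(f"{kwh} kWh -> ${kwh * price_per_kwh:.2f} at 100% conversion")
# 12.5 kWh -> $1.25, matching the figure above
```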

For commercial fusion, they'll need to burn 15 of these targets per second, every second, indefinitely. Which means that in addition to needing a fusion gain factor of about *60* (compared to 20 for a tokamak, which will also probably never produce commercial fusion power), they'll need to get the fuel cost down to like 10 cents per target.
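And the per-target economics follow from the numbers above. The ~33% thermal-to-electric efficiency here is my assumption (a typical steam-cycle figure), not from the text:

```python
# Why target cost dominates at 15 shots per second.
kwh_per_target = 12.5        # from the 45 MJ max yield above
thermal_to_electric = 1 / 3  # assumed, typical steam-cycle efficiency
price_per_kwh = 0.10         # assumed electricity price, USD/kWh

revenue_per_target = kwh_per_target * thermal_to_electric * price_per_kwh
print(f"gross revenue per target: ${revenue_per_target:.2f}")
# ≈ $0.42 gross per target, before any other plant costs -- so the
# target itself has to cost on the order of a dime, not tens of
# thousands of dollars.
```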

Meanwhile, fission just works. Figure out how many LFTRs we could build for the cost of the NIF and weep. ICF is a jobs program for engineers who got scared as hell when the Cold War ended and started pimping their bomb-research machines to environmentalists who don't understand physics or economics.