The importance of the US electrical grid and its increasing dilapidation is a theme that first crossed my radar screen in 2000 at the Huber-Mills Powercosm conference. Also, as technology evolves, the quality requirements of the electricity itself ("high nines" -- i.e. smoothing out farts in the flow of the current) evolve as well.

Despite the trillions spent on "shovel-ready jobs investing in America's future," as best as I can tell little to none of it has made it to desperately needed upgrades to the US grid. Furthermore, it appears that our grid may well have been infiltrated by the Chinese, to be disrupted at will should they ever see fit to do so.

It is against that background I open this thread-- unfortunately with a piece from Pravda on the Hudson.

Ideas to Bolster Power Grid Run Up Against the System’s Many Owners
By MATTHEW L. WALD
Published: July 12, 2013

Turbines in central Kansas, where there are few major power lines. (Charlie Riedel/Associated Press)

WASHINGTON — Bill Richardson often denigrated America’s power transmission network as a “third-world grid” when he was President Bill Clinton’s energy secretary, but the more current description of it is “balkanized,” with 500 separate owners. Marc L. Spitzer, a former member of the Federal Energy Regulatory Commission, said even that analogy was not harsh enough.

Bill Richardson, the former energy secretary, who called the American network a “third-world grid.”

“To call the U.S. grid balkanized would insult the Macedonians,” he said.

When President Obama presented his plans last month for executive action that would cut emissions of greenhouse gases, one item on his list was strengthening the power grid. It was on the lists of President George W. Bush and Mr. Clinton, too. But for the most part, experts say the grid is not being changed, at least not on a scale big enough to make much difference.

Their view is reflected in what they say is a largely hypothetical three-year effort by hundreds of engineers to redraw the grid for the eastern two-thirds of the United States. Engineers in the project, which is now drawing to a close, have proposed a basic redesign for beefing up the Eastern Interconnection, the part of the grid that stretches from Nova Scotia to New Orleans.

The redesign would reduce carbon dioxide emissions by replacing coal with wind energy and give the United States something it has never had, a grid designed for shipping bulk amounts of electricity across the continent. The planning, which cost $16 million, shows a substantial carbon emissions reduction.

But the project is covered with footnotes that assert that it does not represent the position of the participants.

“Our work goes into the general knowledge base of the kind of answers you would get when you ask certain policy questions,” said David Whiteley, the executive director of the Eastern Interconnection Planning Collaborative, which carried out the study. Christopher Russo, an energy consultant at Charles River Associates, which helped with the redesign, called it “a technical road map” of thousands of miles of high-capacity transmission lines, and calculations of electricity supply and load and the paths between them.

“We said, ‘Here’s what we could do,’ ” he said. “We haven’t said how we would pay for it.”

Still, drawing a sketch is a step forward. The grid is divided into regions that cover a state or a compact area (like New England) or slightly larger units, like PJM, which once stood for Pennsylvania-Jersey-Maryland but now extends through West Virginia, Ohio and the Chicago area. Almost all planning is done within those regions, as if they were islands. Federal officials say there is not even a regulatory mechanism for planning a line that does more than connect two regions.

“Given the history of this particular industry and its complexity, it is just not going to happen, at least not any time soon,” said James J. Hoecker, a former member of the Federal Energy Regulatory Commission, which has some jurisdiction over transmission lines. One problem, he said, is “resource nationalism,” in which individual states want to use local resources, whether they are coal or yet-to-be-built offshore wind, rather than importing from neighbors in a way that could be more economical.

For now, engineers in the grid redesign project have determined that conducting business as usual between 2010 and 2030 would require $18.5 billion in new transmission lines in the United States, while a system designed to integrate renewables like wind energy on a large scale would cost $115.2 billion. In some places, however, renewables could cut electricity costs by allowing the replacement of high-cost generators with lower-cost ones.

The technology, the engineering skill and even the money are all available, experts say, but the ability to reach agreement on such a grid is not. Dozens of experts said in interviews that there were simply too many players, both commercial and governmental, and too many conflicting interests.

Some of the players have a stake in cleaner or cheaper electricity, but others do not. “There are participants who have a vested interest in the high price of electricity, not the low price of electricity,” said Douglas Gotham, an industry analyst at Purdue University.

At the Illinois Citizens Utility Board, a state-chartered organization that represents consumer interests in regulatory proceedings, David Kolata, the executive director, said new lines could lower costs for customers. But, he said, “for every winner, you get just as many losers, perhaps even more losers.”

The hurdles are particularly acute with wind. Electricity can be made from natural gas almost anywhere, because a superb gas network, built under federal regulation over the last 60 years, will move the gas to wherever it is most convenient to burn it. Energy from coal can also be made almost anywhere. But to make electricity from wind, the generator has to be where the resource is, and for wind, that means places with few major power lines.

In Kansas, for example, sites are available where the wind is so strong that over the course of a year, a wind machine will produce half of its theoretical maximum capacity — an excellent output. But wind machines are more common in eastern locations where energy production is only one-third of the theoretical maximum.

“You could expect 40 or 50 percent more energy” with wind machines in western Kansas, said Michael Skelly, the president of Clean Line Energy Partners, a company that is trying to build, piecemeal, elements of the current plan. In an end run around the traditional regulatory process, Clean Line’s transmission lines would be a bit like private toll roads, financed outside the usual system, and available under contract. The company is planning four large projects but faces significant regulatory hurdles.
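Mr. Skelly's "40 or 50 percent more energy" figure follows directly from the two capacity factors quoted above; a quick sanity check of the arithmetic, using only the article's numbers:

```python
# Capacity factor: the fraction of a turbine's theoretical maximum
# energy it actually delivers over a year (figures from the article).
cf_western_kansas = 0.50  # "half of its theoretical maximum capacity"
cf_eastern_sites = 1 / 3  # "one-third of the theoretical maximum"

# Relative gain from siting the same machine in western Kansas
gain = cf_western_kansas / cf_eastern_sites - 1
print(f"{gain:.0%} more energy")  # prints "50% more energy"
```

which lands at the top of the "40 or 50 percent" range he cites.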

The existing grid also makes it difficult to predict the energy output from wind projects. At a single wind farm, energy production can range from zero to 100 percent. But with hundreds of wind farms networked together, production would almost never be zero. Utility planners could in fact derive a minimum likely capacity, an important statistic as more resources are poured into building wind farms.
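The aggregation argument can be sketched with a toy probability model. The per-farm calm probability and the independence assumption below are mine, not the article's; real wind farms in one region see correlated weather, which weakens the effect and is exactly why wide geographic networking matters:

```python
# Toy model: each farm independently produces zero output at a given
# moment with probability p. The networked total is zero only when
# every farm is becalmed at once, so P(all zero) = p ** n.
p_calm = 0.10  # assumed chance a single farm produces nothing

for n in (1, 10, 100):
    print(f"{n:3d} farms -> P(total output is zero) = {p_calm ** n:.2e}")
```

With 100 farms the probability of zero aggregate output is vanishingly small, which is what lets planners quote a minimum likely capacity.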

However, wind energy works only if it is widely shared. Already, there are times in the Pacific Northwest and the Midwest when wind production exceeds demand in the regions to which it can be easily sent. Electricity is a supply chain with a time lag even shorter than the one for sushi. If the power cannot be sent somewhere instantly, it is useless.

For now, there is simply no momentum for a transmission system that would connect the best sites for renewable energy with the biggest areas of demand. “There’s no overall transmission planning for the entire interconnection,” said Vladimir S. Koritarov, deputy director of the Center for Energy, Environmental and Economic Systems Analysis at Argonne National Laboratory.

There is some hope for individual projects, although experts say they are the equivalent of building Interstate highways one route at a time.

“We’ve found a lot of different ways that transmission will fail to be built,” said David S. Hamilton, the director for clean energy of the Sierra Club’s Beyond Coal Campaign. “This, at least, is one that has not yet failed.”

Back in 2000 the Huber-Mills Powercosm (an offshoot of the Gilder Technology Newsletter) spoke of the importance of "high nines" electricity, i.e. the smoothness of the current, with no "burps," as the level of the technology advanced. Looks like the NSA is learning this lesson the hard way.

Chronic electrical surges at the massive new data-storage facility central to the National Security Agency's spying operation have destroyed hundreds of thousands of dollars worth of machinery and delayed the center's opening for a year, according to project documents and current and former officials.

There have been 10 meltdowns in the past 13 months that have prevented the NSA from using computers at its new Utah data-storage center, slated to be the spy agency's largest, according to project documents reviewed by The Wall Street Journal.

One project official described the electrical troubles—so-called arc fault failures—as "a flash of lightning inside a 2-foot box." These failures create fiery explosions, melt metal and cause circuits to fail, the official said.

The causes remain under investigation, and there is disagreement whether proposed fixes will work, according to officials and project documents. One Utah project official said the NSA planned this week to turn on some of its computers there.


NSA spokeswoman Vanee Vines acknowledged problems but said "the failures that occurred during testing have been mitigated. A project of this magnitude requires stringent management, oversight, and testing before the government accepts any building."


The Utah facility, one of the Pentagon's biggest U.S. construction projects, has become a symbol of the spy agency's surveillance prowess, which gained broad attention in the wake of leaks from NSA contractor Edward Snowden. It spans more than one million square feet, with construction costs pegged at $1.4 billion—not counting the Cray supercomputers that will reside there.

Exactly how much data the NSA will be able to store there is classified. Engineers on the project believe the capacity is bigger than Google's largest data center. Estimates are in a range difficult to imagine, but outside experts believe it will hold exabytes or zettabytes of data. An exabyte is roughly 100,000 times the size of the printed material in the Library of Congress; a zettabyte is 1,000 times larger.
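The scale comparisons are easy to verify. The ~10-terabyte estimate for the Library of Congress's printed collection is a commonly cited figure, not from the article:

```python
TB = 10 ** 12
EB = 10 ** 18
ZB = 10 ** 21

loc_printed = 10 * TB  # ~10 TB, a commonly cited (assumed) estimate

print(EB // loc_printed)  # 100000 Libraries of Congress per exabyte
print(ZB // EB)           # 1000 -- a zettabyte is 1,000 times larger
```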

But without a reliable electrical system to run computers and keep them cool, the NSA's global surveillance data systems can't function. The NSA chose Bluffdale, Utah, to house the data center largely because of the abundance of cheap electricity. It continuously uses 65 megawatts, which could power a small city of at least 20,000, at a cost of more than $1 million a month, according to project officials and documents.
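The power and cost figures above are roughly consistent with cheap Utah industrial electricity; the 30-day month and the implied price per megawatt-hour below are my back-of-envelope arithmetic, not the article's:

```python
power_mw = 65                  # continuous draw, per the article
hours = 30 * 24                # assumed 30-day month
energy_mwh = power_mw * hours  # 46,800 MWh per month

monthly_cost = 1_000_000       # "more than $1 million a month"
implied_price = monthly_cost / energy_mwh

print(f"{energy_mwh} MWh/month -> implied ~${implied_price:.0f}/MWh")
```

An implied rate on the order of $21/MWh (about 2 cents per kilowatt-hour) is indeed very cheap power, consistent with the stated reason for choosing Bluffdale.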

Utah is the largest of several new NSA data centers, including a nearly $900 million facility at its Fort Meade, Md., headquarters and a smaller one in San Antonio. The first of four data facilities at the Utah center was originally scheduled to open in October 2012, according to project documents.

In the wake of the Snowden leaks, the NSA has been criticized for its expansive domestic operations. Through court orders, the NSA collects the phone records of nearly all Americans and has built a system with telecommunications companies that provides coverage of roughly 75% of Internet communications in the U.S.

In another program called Prism, companies including Google, Microsoft, Facebook and Yahoo are under court orders to provide the NSA with account information. The agency said it legally sifts through the collected data to advance its foreign intelligence investigations.

The data-center delays show that the NSA's ability to use its powerful capabilities is undercut by logistical headaches. Documents and interviews paint a picture of a project that cut corners to speed building.

Backup generators have failed numerous tests, according to project documents, and officials disagree about whether the cause is understood. There are also disagreements among government officials and contractors over the adequacy of the electrical control systems, a project official said, and the cooling systems also remain untested.

The Army Corps of Engineers is overseeing the data center's construction. The chief of construction operations, Norbert Suter, said, "the cause of the electrical issues was identified by the team, and is currently being corrected by the contractor." He said the Corps would ensure the center is "completely reliable" before handing it over to the NSA.

But another government assessment concluded the contractor's proposed solutions fall short and the causes of eight of the failures haven't been conclusively determined. "We did not find any indication that the proposed equipment modification measures will be effective in preventing future incidents," said a report last week by special investigators from the Army Corps of Engineers known as a Tiger Team.

The architectural firm KlingStubbins designed the electrical system. The firm is a subcontractor to a joint venture of three companies: Balfour Beatty Construction, DPR Construction and Big-D Construction Corp. A KlingStubbins official referred questions to the Army Corps of Engineers.

The joint venture said in a statement it expected to submit a report on the problems within 10 days: "Problems were discovered with certain parts of the unique and highly complex electrical system. The causes of those problems have been determined and a permanent fix is being implemented."

The first arc fault failure at the Utah plant was on Aug. 9, 2012, according to project documents. Since then, the center has had nine more failures, most recently on Sept. 25. Each incident caused as much as $100,000 in damage, according to a project official.

It took six months for investigators to determine the causes of two of the failures. In the months that followed, the contractors employed more than 30 independent experts who conducted 160 tests over 50,000 man-hours, according to project documents.

This summer, the Army Corps of Engineers dispatched its Tiger Team, officials said. In an initial report, the team said the cause of the failures remained unknown in all but two instances.

The team said the government has incomplete information about the design of the electrical system that could pose new problems if settings need to change on circuit breakers. The report concluded that efforts to "fast track" the Utah project bypassed regular quality controls in design and construction.

Contractors have started installing devices that insulate the power system from a failure and would reduce damage to the electrical machinery. But the fix wouldn't prevent the failures, according to project documents and current and former officials.

Contractor representatives wrote last month to NSA officials to acknowledge the failures and describe their plan to ensure there is reliable electricity for computers. The representatives said they didn't know the true source of the failures but proposed remedies they believed would work. With those measures and others in place, they said, they had "high confidence that the electrical systems will perform as required by the contract."

A couple of weeks later, on Sept. 23, the contractors reported they had uncovered the "root cause" of the electrical failures, citing a "consensus" among 30 investigators, which didn't include government officials. Their proposed solution was the same device they had already begun installing.

The Army Corps of Engineers' Tiger Team said the contractor's explanations were unproven. The causes of the incidents "are not yet sufficiently understood to ensure that [the NSA] can expect to avoid these incidents in the future," their report said.

I think this is part of the argument against ethanol, wind, and solar subsidies. By definition, you willingly pay more for a source other than lowest-cost power, one that will work when the grid is down. Undermining and distorting the free market is not how you bring down those prices.

My experience with a gasoline generator is that you run out of gas very quickly, and stations require electricity to pump gas. Wind and solar tend to be small in output and weather dependent. In this part of the country, where we have severe winters and natural gas pipelines to nearly everyone, a natural gas backup system seems far more useful. Blackouts have tended to happen during the air-conditioning season or as the result of storms. But if a grid attack or failure happened in winter, most people don't seem to realize their natural gas furnace requires electricity to operate.

SAN JOSE, Calif.—The attack began just before 1 a.m. on April 16 last year, when someone slipped into an underground vault not far from a busy freeway and cut telephone cables.

Within half an hour, snipers opened fire on a nearby electrical substation. Shooting for 19 minutes, they surgically knocked out 17 giant transformers that funnel power to Silicon Valley. A minute before a police car arrived, the shooters disappeared into the night.



To avoid a blackout, electric-grid officials rerouted power around the site and asked power plants in Silicon Valley to produce more electricity. But it took utility workers 27 days to make repairs and bring the substation back to life.

Nobody has been arrested or charged in the attack at PG&E Corp.'s Metcalf transmission substation. It is an incident of which few Americans are aware. But one former federal regulator is calling it a terrorist act that, if it were widely replicated across the country, could take down the U.S. electric grid and black out much of the country.

The attack was "the most significant incident of domestic terrorism involving the grid that has ever occurred" in the U.S., said Jon Wellinghoff, who was chairman of the Federal Energy Regulatory Commission at the time.

The Wall Street Journal assembled a chronology of the Metcalf attack from filings PG&E made to state and federal regulators; from other documents including a video released by the Santa Clara County Sheriff's Department; and from interviews, including with Mr. Wellinghoff.


The 64-year-old Nevadan, who was appointed to FERC in 2006 by President George W. Bush and stepped down in November, said he gave closed-door, high-level briefings to federal agencies, Congress and the White House last year. As months have passed without arrests, he said, he has grown increasingly concerned that an even larger attack could be in the works. He said he was going public about the incident out of concern that national security is at risk and critical electric-grid sites aren't adequately protected.

The Federal Bureau of Investigation doesn't think a terrorist organization caused the Metcalf attack, said a spokesman for the FBI in San Francisco. Investigators are "continuing to sift through the evidence," he said.

Some people in the utility industry share Mr. Wellinghoff's concerns, including a former official at PG&E, Metcalf's owner, who told an industry gathering in November he feared the incident could have been a dress rehearsal for a larger event.

"This wasn't an incident where Billy-Bob and Joe decided, after a few brewskis, to come in and shoot up a substation," Mark Johnson, retired vice president of transmission for PG&E, told the utility security conference, according to a video of his presentation. "This was an event that was well thought out, well planned and they targeted certain components." When reached, Mr. Johnson declined to comment further.

A spokesman for PG&E said the company takes all incidents seriously but declined to discuss the Metcalf event in detail for fear of giving information to potential copycats. "We won't speculate about the motives" of the attackers, added the spokesman, Brian Swanson. He said PG&E has increased security measures.

Utility executives and federal energy officials have long worried that the electric grid is vulnerable to sabotage. That is in part because the grid, which is really three systems serving different areas of the U.S., has failed when small problems such as trees hitting transmission lines created cascading blackouts. One in 2003 knocked out power to 50 million people in the Eastern U.S. and Canada for days.

Many of the system's most important components sit out in the open, often in remote locations, protected by little more than cameras and chain-link fences.

Transmission substations are critical links in the grid. They make it possible for electricity to move long distances, and serve as hubs for intersecting power lines.

Within a substation, transformers raise the voltage of electricity so it can travel hundreds of miles on high-voltage lines, or reduce voltages when electricity approaches its destination. The Metcalf substation functions as an off-ramp from power lines for electricity heading to homes and businesses in Silicon Valley.

The country's roughly 2,000 very large transformers are expensive to build, often costing millions of dollars each, and hard to replace. Each is custom made and weighs up to 500,000 pounds, and "I can only build 10 units a month," said Dennis Blake, general manager of Pennsylvania Transformer in Pittsburgh, one of seven U.S. manufacturers. The utility industry keeps some spares on hand.

A 2009 Energy Department report said that "physical damage of certain system components (e.g. extra-high-voltage transformers) on a large scale…could result in prolonged outages, as procurement cycles for these components range from months to years."

Mr. Wellinghoff said a FERC analysis found that if a surprisingly small number of U.S. substations were knocked out at once, that could destabilize the system enough to cause a blackout that could encompass most of the U.S.

Not everyone is so pessimistic. Gerry Cauley, chief executive of the North American Electric Reliability Corp., a standards-setting group that reports to FERC, said he thinks the grid is more resilient than Mr. Wellinghoff fears.

"I don't want to downplay the scenario he describes," Mr. Cauley said. "I'll agree it's possible from a technical assessment." But he said that even if several substations went down, the vast majority of people would have their power back in a few hours.

The utility industry has been focused on Internet attacks, worrying that hackers could take down the grid by disabling communications and important pieces of equipment. Companies have reported 13 cyber incidents in the past three years, according to a Wall Street Journal analysis of emergency reports utilities file with the federal government. There have been no reports of major outages linked to these events, although companies have generally declined to provide details.

"A lot of people in the electric industry have been distracted by cybersecurity threats," said Stephen Berberich, chief executive of the California Independent System Operator, which runs much of the high-voltage transmission system for the utilities. He said that physical attacks pose a "big, if not bigger" menace.

There were 274 significant instances of vandalism or deliberate damage in those three years, and more than 700 weather-related problems, according to the Journal's analysis.

Until the Metcalf incident, attacks on U.S. utility equipment were mostly linked to metal thieves, disgruntled employees or bored hunters, who sometimes took potshots at small transformers on utility poles to see what happens. (Answer: a small explosion followed by an outage.)

Last year, an Arkansas man was charged with multiple attacks on the power grid, including setting fire to a switching station. He has pleaded not guilty and is undergoing a psychiatric evaluation, according to federal court records.

Overseas, terrorist organizations were linked to 2,500 attacks on transmission lines or towers and at least 500 on substations from 1996 to 2006, according to a January report from the Electric Power Research Institute, an industry-funded research group, which cited State Department data.

An attack on a PG&E substation near San Jose, Calif., in April knocked out 17 transformers like this one. Talia Herman for The Wall Street Journal

To some, the Metcalf incident has lifted the discussion of serious U.S. grid attacks beyond the theoretical. "The breadth and depth of the attack was unprecedented" in the U.S., said Rich Lordan, senior technical executive for the Electric Power Research Institute. The motivation, he said, "appears to be preparation for an act of war."

The attack lasted slightly less than an hour, according to the chronology assembled by the Journal.

At 12:58 a.m., AT&T fiber-optic telecommunications cables were cut—in a way that made them hard to repair—in an underground vault near the substation, not far from U.S. Highway 101 just outside south San Jose. It would have taken more than one person to lift the metal vault cover, said people who visited the site.

Nine minutes later, some customers of Level 3 Communications, an Internet service provider, lost service. Cables in its vault near the Metcalf substation were also cut.

At 1:31 a.m., a surveillance camera pointed along a chain-link fence around the substation recorded a streak of light that investigators from the Santa Clara County Sheriff's office think was a signal from a waved flashlight. It was followed by the muzzle flash of rifles and sparks from bullets hitting the fence.

The substation's cameras weren't aimed outside its perimeter, where the attackers were. The shooters appear to have aimed at the transformers' oil-filled cooling systems. These began to bleed oil but didn't explode, as the transformers probably would have done if hit in other areas.

About six minutes after the shooting started, PG&E confirms, it got an alarm from motion sensors at the substation, possibly from bullets grazing the fence, which is shown on video.

Four minutes later, at 1:41 a.m., the sheriff's department received a 911 call about gunfire, sent by an engineer at a nearby power plant that still had phone service.

Riddled with bullet holes, the transformers leaked 52,000 gallons of oil, then overheated. The first bank of them crashed at 1:45 a.m., at which time PG&E's control center about 90 miles north received an equipment-failure alarm.

Five minutes later, another apparent flashlight signal, caught on film, marked the end of the attack. More than 100 shell casings of the sort ejected by AK-47s were later found at the site.

At 1:51 a.m., law-enforcement officers arrived, but found everything quiet. Unable to get past the locked fence and seeing nothing suspicious, they left.

A PG&E worker, awakened by the utility's control center at 2:03 a.m., arrived at 3:15 a.m. to survey the damage.

Grid officials routed some power around the substation to keep the system stable and asked customers in Silicon Valley to conserve electricity.

In a news release, PG&E said the substation had been hit by vandals. It has since confirmed 17 transformers were knocked out.

Mr. Wellinghoff, then chairman of FERC, said that after he heard about the scope of the attack, he flew to California, bringing with him experts from the U.S. Navy's Dahlgren Surface Warfare Center in Virginia, which trains Navy SEALs. After walking the site with PG&E officials and FBI agents, Mr. Wellinghoff said, the military experts told him it looked like a professional job.

In addition to fingerprint-free shell casings, they pointed out small piles of rocks, which they said could have been left by an advance scout to tell the attackers where to get the best shots.

"They said it was a targeting package just like they would put together for an attack," Mr. Wellinghoff said.

Mr. Wellinghoff, now a law partner at Stoel Rives LLP in San Francisco, said he arranged a series of meetings in the following weeks to let other federal agencies, including the Department of Homeland Security, know what happened and to enlist their help. He held a closed-door meeting with utility executives in San Francisco in June and has distributed lists of things utilities should do to strengthen their defenses.

A spokesman for Homeland Security said it is up to utilities to protect the grid. The department's role in an emergency is to connect federal agencies and local police and facilitate information sharing, the spokesman said.

As word of the attack spread through the utility industry, some companies moved swiftly to review their security efforts. "We're looking at things differently now," said Michelle Campanella, an FBI veteran who is director of security for Consolidated Edison Inc. in New York. For example, she said, Con Ed changed the angles of some of its 1,200 security cameras "so we don't have any blind spots."

Some of the legislators Mr. Wellinghoff briefed are calling for action. Rep. Henry Waxman (D., Calif.) mentioned the incident at a FERC oversight hearing in December, saying he was concerned that no one in government can order utilities to improve grid protections or to take charge in an emergency.

As for Mr. Wellinghoff, he said he has made something of a hobby of visiting big substations to look over defenses and see whether he is questioned by security details or local police. He said he typically finds easy access to fence lines that are often close to important equipment.

"What keeps me awake at night is a physical attack that could take down the grid," he said. "This is a huge problem."

The Power Grid: Our Achilles' Heel
Chain-link fencing is all that protects the U.S. from a major disaster.
By L. Gordon Crovitz
Feb. 9, 2014 5:53 p.m. ET

Tens of thousands of cyber attacks on the power grid are troubling, though so far they have rarely caused damage. More alarming is news of an old-fashioned armed attack on a physical location that proved the vulnerability of the grid.

Last April, a nighttime attack destroyed a power substation in San Jose, Calif., the center of Silicon Valley. The attackers had a good understanding of the facility and how to destroy it. They broke into an underground vault off Highway 101 and cut fiber-optic cables. Then they fired on the substation for almost 20 minutes, apparently using AK-47s, and wrecked 17 of 23 transformers. News of the incident was suppressed, with Pacific Gas & Electric Co. blaming vandalism. The damage took a month to repair.

We now have a better understanding of what happened thanks to a page-one article last week in this newspaper. Jon Wellinghoff, who was head of the Federal Energy Regulatory Commission when the incident occurred, says this could be a trial run for attacks to bring down large parts of the electrical grid.

In an interview, Mr. Wellinghoff was careful to say he doesn't know if a terrorist group was responsible. But he called it a "purposeful attack, extremely well planned and executed by professionals who had expert training." He visited the scene with Pentagon experts who train Navy SEALs how to destroy enemy infrastructure. They pointed to the precision of the attack and evidence of its careful preparation. Mr. Wellinghoff said this was the only time Pentagon experts have concluded that damage to the grid in the U.S. has been caused by professionals.

The power substation in San Jose, Calif., that came under attack last April. Reuters

"Coordinated attacks on just a few substations could have a devastating impact," Mr. Wellinghoff warned. Destroying the right targets could knock out power for most of North America. Government agencies keep classified which combination of substations would create the most damage if attacked.

The FBI, which downplayed the likelihood of terrorism, still has no suspects. The bureau recently told the Los Angeles Times: "Until we understand the motives, we won't be 100% sure it's not terrorism."

Former CIA director Jim Woolsey told the Commonwealth Club in San Francisco in October that three or four men operating in a "disciplined military fashion" were responsible for the attack. "This wasn't hooliganism," he said. "This was a systematic attempt to take down the electric grid."

Mr. Wellinghoff, who came to Washington as an advocate for renewable energy, says physical security became a focus for him even before the San Jose attack. "I talked to anyone who would listen in the administration to say that physical security is key to the grid," he recalled. He left office late last year frustrated that few officials seemed to care.

"Terrorism and the Electric Power Delivery System," a National Academy of Sciences report written in 2007 and declassified in 2012, detailed the risks of a physical attack on facilities. "If it were carried out in a carefully planned way, by people who knew what they were doing, it could deny large regions of the country access to bulk system power for weeks or even months," the report said. "Terrorist attacks on multiple-line transmission corridors could cause cascading blackouts."

After Hurricane Sandy in 2012 we saw how much damage can be caused even by short-term, isolated outages. Areas of the Northeast lost access to the Internet, commerce came to a halt, and hospitals soon ran out of power from generators.

The power grid is especially vulnerable because many substations are in rural areas, protected only by chain-link fences. Mr. Wellinghoff urged power companies to take basic steps like building metal or concrete walls.

There are also new tools from the Internet world that can be deployed to protect its source of power: the grid. Wireless digital sensors could alert security services to intruders. Mr. Wellinghoff says a Silicon Valley firm contacted him to offer sensors that can send alerts as soon as gunshots are fired. Sensors could also automatically shut systems down to minimize damage from attacks.

Surveillance drones could be deployed 24/7 around especially sensitive facilities. The need for cheap, reliable drones is another reason the Federal Aviation Administration should legalize commercial uses of drones, which would accelerate their development.

Much of the discussion about surveillance in recent months has focused on the hypothetical risks to privacy from telephone metadata collected by the National Security Agency. Back in the physical world, no government agency is accountable for safeguarding the power grid. Power companies fear legal liability if they change their security systems, even to shore up defenses.

The security of the electrical grid is too important to be left to chain-link fencing. By deploying more Internet security technologies, the power grid can be empowered to help defend itself.

Second Item: Syria Has Non-nuke EMP Bombs
Op-Ed: Syria Has Non-nuke EMP Bombs
Published: Friday, October 11, 2013 8:11 AM
Sometimes you need to spend billions to unearth supreme strategic military secrets. Other times, self-intoxicated leaders let the cat out of the bag.

Mark Langfan
The writer, who writes on security issues, has created an original educational 3D Topographic Map System of Israel to facilitate clear understanding of the dangers facing Israel and its water supply. It has been studied by US lawmakers and can be seen at www.marklangfan.com.

On 3 March 1917, German Foreign Minister Arthur Zimmerman blurted out, "I cannot deny it. It is true." Zimmerman had just admitted that as a German gambit to keep America "busy," Germany had secretly offered Mexico funding to attack America, and regain Texas and virtually America's entire Southwest.

In no small part due to Zimmerman's admission, America entered the war against Germany the next month, April 1917.

On 26 September 2013, Syrian president Bashar Assad said that Syria possesses "more advanced weaponry, which can serve as a deterrent, and blindside Israel within seconds." Loose lips sink ships.

Sometimes you need to spend billions to unearth supreme strategic military secrets. Other times, self-intoxicated leaders let the cat out of the bag, as Zimmerman did in handing America a casus belli against Germany in World War I.

Today, Assad gives a game-changing military secret to Israel more valuable than rubies. By letting slip that Syria could "blindside" Israel, Assad admitted Syria likely possesses non-nuclear, conventional Electro-Magnetic Pulse (EMP) weapons. The ramifications of a conventional EMP attack on Israel apply not only from Syria, but also, most critically, from any Palestinian Arab "demilitarized" state.

As explained in previous articles, a nuclear EMP bomb is merely a regular nuclear weapon detonated about 50 kilometers above the earth, where it kills electronics rather than people, instead of 1-5 kilometers above the ground, where it would kill many people. Non-nuclear EMP bombs are microwave-emitting weapons that electronically fry a much smaller kill radius than a nuclear EMP bomb would.

But, a conventional EMP bomb has the same end result in that it totally destroys any electronics in a given target zone. The reason I hadn't raised this issue previously is because it is the likeliest critical element to any possible strike on Iran. But since Assad let the cat out of the bag, it must now be discussed openly.

No one can describe non-nuclear conventional EMPs better than their maker, the Boeing Company: "A recent weapons flight test in the Utah desert may change future warfare after the missile successfully defeated electronic targets with little to no collateral damage.

Boeing and the U.S. Air Force Research Laboratory (AFRL) Directed Energy Directorate, Kirtland Air Force Base, N.M., successfully tested the Counter-electronics High-powered Microwave Advanced Missile Project (CHAMP) during a flight over the Utah Test and Training Range.

CHAMP, which renders electronic targets useless, is a non-kinetic alternative to traditional explosive weapons that use the energy of motion to defeat a target.

During the test, the CHAMP missile navigated a pre-programmed flight plan and emitted bursts of high-powered energy, effectively knocking out the target's data and electronic subsystems. CHAMP allows for selective high-frequency radio wave strikes against numerous targets during a single mission.

‘This technology marks a new era in modern-day warfare,’ said Keith Coleman, CHAMP program manager for Boeing Phantom Works. ‘In the near future, this technology may be used to render an enemy’s electronic and data systems useless even before the first troops or aircraft arrive.’”

It is entirely feasible, if not highly probable, that Syria possesses such a conventional EMP weapon. How Syria could deliver the weapon to Israel's electronic soft-underbelly is another question. But leaving the question of delivery aside for the moment, the real question is: how does this affect Israel's strategic equation?

Apart from other aspects, the most worrisome effect would be if Israel attempted to strike Iran's nuclear facilities. Such a sophisticated weapon in Syria is likely under the control of Iran's al Quds forces in Syria. And, it would likely be used as a second-strike against Israel if Israel attacked. A non-nuclear counter strike against Israel when Israel is in the midst of attacking Iran could bring Israel a grievous military catastrophe.

Therefore, it now becomes a military necessity that Assad is liquidated before any Israeli attack on Iran proceeds. By Israel's supplying the rebels with critical real-time intelligence of the locations of Assad's weapons' depots, and an unlimited amount of untraceable small caliber bullets (preferably 7.62x39mm Kalash rounds), Assad will crumble and crumble fast.

But the most dramatic threat posed by a non-nuclear conventional EMP bomb is not from Syria, it's from the "demilitarized" Palestinian Arab State that Tzipi Livni is concocting for Israel's future. The reason being that even with absolute Israeli military control of all goods coming into that area and absolute military control over the Samarian mountains facing Tel Aviv, it will be impossible to keep out the electronic parts necessary for the Palestinian Arabs to make such a conventional EMP bomb.

They wouldn't need a "delivery system" because the EMP bomb would have already been "delivered" to striking range of Israel's electronic soft-underbelly, Tel Aviv. The Palestinian Arabs could then conventionally first-strike decapitate Israel's entire military force structure and electronic system as a prelude to a second wave attack by Iranian or Arab missiles or armies. The Kirya (Israel's "Pentagon" in Tel Aviv) would be electronically toasted from the word "go." All of Israel's anti-missile, air defense and mobilization systems would be paralyzed from the very first second of a Muslim war of decimation. It would be lights-out, game-over for Israel before the war against Israel even began.

A Palestinian Arab state no longer threatens Israel with chemical Katyushas; it now threatens Israel with annihilation by EMP bombs. Unless Israel short-circuits its suicidal "peace" process, a Palestinian CHAMP EMP bomb could instantly turn Israel's electronic defense into a clump of burning wiring.
----------------------------------
Hat tip to GM

As the U.S. becomes more reliant on renewable energy like solar and wind, our electricity bills are going to go up. Way up....But the real problem lies in switching those old systems off and getting the renewable systems—wind power, solar farms—online. Many states have a mandate to convert a certain percentage of their energy production to renewable energy by a certain date, but they still haven't figured out exactly how that plan will work. Plus all these renewable systems need to have access to a backup system, adding still more costs.

California, which is seen as a leader for renewable energy, has the most aggressive mandate: 33 percent of its power must be renewable by 2020. But that means the cost of electricity could rise 47 percent over the next 16 years.
--------------------------------------------------------------------

In 16 years, Californians will WISH electricity only went up 47% under their mandates!

Interesting piece on the grid by a qualified author. It doesn't answer all questions or solve all challenges, but makes good sense as far as it goes. Nuclear is not mentioned?

"Demand for reliability is rising faster than demand for kilowatt-hours themselves."

"the entire planet’s annual production of lithium batteries for all purposes can store about five minutes worth of U.S. electric demand."

"the Energy Information Administration (EIA) forecasts a 12 percent rise over the next decade – that will require the United States to add capacity equal to Germany’s entire current grid."

"PV costs [solar] will decline by [only] another 30 percent by 2030 — important but hardly revolutionary."

"EIA sees natural gas and coal, in almost equal shares (coal still dominates in EIA’s forecast), providing about 70 percent of electric supply a decade out. These two fuels are now in a race to the bottom in terms of price, to the benefit of consumers and ensuring a permanent, low-cost electricity future (absent meddling from policymakers) that will confer an enormous economic advantage on U.S. industries."

It was only a little more than ten years ago that a National Academy of Engineering report ranked the invention of the electric grid at the top of a list of the 20 greatest inventions of the 20th century. Not just one of the great engineering achievements, but first amongst them. The Academy ranked the Internet 13th.

Now we hear increasingly that technology is making today’s electric utility model “obsolete” and will put its companies into a “death spiral.” Is it possible that so much has changed so quickly?

Post-utility advocates point to three technologies as disrupters: photovoltaics (PV), batteries, and smart or micro grids. The U.S. Department of Energy (DOE), along with a conga line of venture firms in Silicon Valley, invested tens of billions of dollars in these three domains over the past half-dozen years. Volumes of analyses and claims can be summarized in three paragraphs:

Solar arrays on the roofs of homes and buildings, it is argued, will obviate central power generation, especially much-reviled coal plants, and will do so rapidly, as PV costs decline and approach “grid parity.” The Department of Energy released a report chronicling the progress, titled Solar Revolution, that inspired palpitations from New York Times columnist Paul Krugman, who wrote that “it’s no longer remotely true that we need to keep burning coal to satisfy electricity demand.”

Lithium battery technology, incredibly improved courtesy of the mobile Internet, will now, we're told, migrate into basements of homes and buildings to store PV electricity for nights and cloudy days, obviating the grid as backup. The global proliferation of lithium-powered hybrid-electric cars is just a first step. And when Tesla recently announced plans to build a "gigafactory" that would alone produce more than all of the world's existing lithium battery factories combined, the green-tech media erupted with excitement, claiming such economies-of-scale promise revolution, not just for electric cars, but also for the grid.

Finally, third in the triad, a smart grid, in particular in the form of “micro grids,” connects everything. With far more granular and real-time information about how much, when, and where electricity is used, advocates assert that social and economic behavior will change to radically reduce energy use and further undermine utility revenues.

These three technology forces in combination, the post-utility analysts claim, will “transform the way the utility industry meets energy demand.” It is, we frequently hear, analogous to and as inevitable as the destruction of the Ma Bell landline phone model when cell phones emerged. (Apparently none who offer this analogy notice that AT&T is doing just fine, and is still a huge if differently regulated business.)

The central problem with this post-utility construct is that the physics of information and electricity are profoundly different, and render the Bell analogy meaningless. More on that shortly. First though, it is true that the nation’s electric grid is morphing, but just not quite the way green energy proponents imagine.

The need for a harder grid

Modern society is in much more urgent need of a harder grid, not so much a greener grid. Demand for reliability is rising faster than demand for kilowatt-hours themselves. Two words epitomize this new reality – Metcalf and Sandy.

In the aftermath of Hurricane Sandy’s widespread and persistent outages, federal and state policymakers called for more spending on grid resilience and recovery. And more recently, policymakers and utilities are still reacting to the fallout from learning about a terrorist-like gunfire attack on California’s Metcalf substation last year, an incident that had been kept a secret until this past spring. That attack prompted a flurry of “what if” scenarios about potential blackouts from future, similar attacks on any of the nation’s tens of thousands of substations.

On average though, more mundane events lead to the vast majority of increasingly intolerable blackouts: car accidents, squirrels chewing through cables, and old equipment failing. The average incidence of grid outages has been rising at about 8 percent to 10 percent annually since 1990. And the duration of outages has also been rising by about 14 percent per year. (Eaton Corporation provides revealing state-by-state data and trends in their Blackout Tracker.) And then there are the rising concerns over cyber attacks on the grid – arguably one of the most critical areas, demanding increased spending and attention.
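To see what those compounding rates imply, here is a quick doubling-time calculation (standard rule-of-thumb arithmetic, not from the article):

```python
import math

def doubling_time_years(annual_rate):
    """Years for a quantity growing at a fixed annual rate to double."""
    return math.log(2) / math.log(1 + annual_rate)

print(round(doubling_time_years(0.08), 1))  # 9.0 -- outage incidence at 8%/yr
print(round(doubling_time_years(0.10), 1))  # 7.3 -- outage incidence at 10%/yr
print(round(doubling_time_years(0.14), 1))  # 5.3 -- outage duration at 14%/yr
```

At these rates, outage frequency doubles roughly every 7 to 9 years and outage duration roughly every 5 years, which is what makes the trend "increasingly intolerable."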

All this comes at a time of greater demand for “always on” power to keep our digital and information-centric economy humming. Electricity powers everything people think is modern about our economy, from conventional but indispensable things like lights, motors, refrigerators, and air conditioners, to new technologies like the Internet, electric cars, 3D printing, and gene sequencing.

The share of the U.S. GDP associated with information is three times bigger than the share associated with the transportation sector that moves people and stuff. The former is entirely dependent on electricity and is growing far faster than the latter, which uses oil. (For more on the Cloud’s surprising electricity appetite, see my earlier report.)

It should thus be unsurprising to learn that studies find the cost of outages, measured per kilowatt-hour, is ten to ten thousand times more than the cost of the power itself.

Even as the importance of reliability grows, the consumption of kilowatt-hours also keeps growing, despite billions invested trying to stifle that growth. U.S. electric demand today is 10 percent higher than 2001, perhaps a seemingly modest amount, but for a grid the scale of America’s this increase equals Italy’s entire annual use. For the future, the Energy Information Administration (EIA) forecasts a 12 percent rise over the next decade – that will require the United States to add capacity equal to Germany’s entire current grid.
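As a sanity check on that comparison, the arithmetic is straightforward, using the article's later figure of about 4,000,000 GWh for annual U.S. electricity consumption and my own rough assumption of 500-600 TWh for Germany's annual consumption (the latter is not a figure from the article):

```python
us_annual_gwh = 4_000_000   # U.S. annual electricity use, GWh (article's figure)
eia_forecast_rise = 0.12    # EIA's forecast demand rise over the next decade

added_demand_gwh = us_annual_gwh * eia_forecast_rise
print(added_demand_gwh)     # 480000.0 GWh, i.e. 480 TWh of new annual demand

# Germany's grid serves roughly 500-600 TWh per year (assumed, not from the
# article), so the forecast U.S. growth is indeed about one Germany.
```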

Thus the future will not be dominated by trying to bolt more renewables onto the grid for their own sake, but in using them to meet growing demand and to add resiliency and reliability.

Technological limits

Smart grids. The key to a more resilient, flexible, and useful grid is to operate it like the Internet, which is nodal, interactive, and highly controllable. This is where “smart” meters and microgrids come in, and where solar energy and batteries play a role.

An Internet-like grid will know how much power is needed, when and where, and even what "flavor" of electrons some customers prefer — say, greener or cheaper. It would help moderate variations in peak demand by using software to negotiate in real-time with local and remote power sources, as well as by purchasing "avoided" power (temporarily cycling off air conditioners and refrigerators, but not computers and TVs). It would also reduce outage frequency through predictive analytics that anticipate maintenance before failures. And when failures occur, it would reduce outage duration by more rapidly locating and identifying faults and optimally dispatching repairs.

But thus far, spending on the smart grid has been dominated by smart meters that allow more granular and frequent readings and the transmission of that data to the utility, eliminating the old-fashioned meter reader. But just adding a communications feature to the meters is not deeply game-changing; it is the equivalent of installing a speedometer and gas gauge without a steering wheel and brakes. The game-changer is in controlling power.

Internet-like real-time control of power is mainly found at low power levels inside homes and buildings, not on the grid, and is unimaginatively labeled “building automation.” This is a small part of the smart-grid architecture wherein, to continue the information analogy, it is equivalent to the era of stand-alone mainframe computing before the Internet. But control of megawatt-hours, not megabytes, on big grids is a daunting technology problem.

The difference between the two power levels, controlling traffic on the Internet versus grid-power traffic, is what dictates the physical, material, and safety challenges. That difference is comparable to going from controlling a toy drone to a Boeing 777. Technologies are emerging that make grid-level dynamic switching and control possible, but they'll take some time yet to get deployed. In the future you'll hear a lot more about new classes of power transistors and semiconductors, like gallium nitride and silicon carbide, that can manage weapons-grade flows of electrons.

It’s still early days for such technology, and deployment in smart microgrids has barely begun. The country’s most successful and arguably only operational microgrid to date is on the campus of the University of California at San Diego. That 40 MW microgrid seamlessly exits the local public grid when regional demand (or prices) peak, and keeps the campus and its supercomputer lit with on-site power that includes fuel cells, solar arrays, batteries, and natural gas turbines. Notably it’s natural gas that supplies 75 percent of the on-site power.

Microgrids are a start but not the end game. To continue the information analogies, microgrids no more replace central power plants than WiFi networks replace Google’s central computing.

Photovoltaics. It is with the collapsing cost of PV cells that post-utility advocates assert we are close to the tipping point for grid and central power plant disruption.

The capital cost of PVs has improved by a remarkable 200 percent in the past decade. But that rate of decline is slowing as the underlying technologies mature and physics limits are approached. (This happens to everything: aircraft engines improved more than 200 percent in their early years too, and now get better at single-digit percentages at best.) Going forward, Germany's Fraunhofer Institute recently estimated that PV costs will decline by another 30 percent by 2030 — important but hardly revolutionary. And today, an unsubsidized PV array on homes and buildings, Fraunhofer notes, produces far more expensive electricity than a central power plant.

And, it is argued, the central plant depends on a costly grid to get power to consumers. But solar needs the grid too. In order to ensure the 24-7 electric supply society demands, a PV array today uses the grid as “back-up.” But that raises questions about how to share the cost of the grid’s power plants and infrastructure, an issue regulators are struggling with in many states.

The alternative is to convert episodic on-site solar generation into “always on” power using batteries, or on-site back-up generators. The latter solution, distributing millions of small car-sized engine-driven power plants to every home or office to back up solar arrays, is not economically viable, much less sensible. It is battery technology that post-utility solar advocates hold out as the Holy Grail. Just store the electricity for when the sun’s not shining.

Batteries. Assume for the sake of argument that big batteries are cheap. Even then, a solar-only or solar-dominated system remains economically untenable. Supplying electricity all day, every day with a battery-solar combination requires, on average, buying two to three extra solar panels for every one installed in order to generate and store extra power when the sun is shining, thereby doubling or tripling system costs.
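The oversizing arithmetic can be sketched simply. Assume, for illustration, that panels produce useful output for about 8 hours a day (my assumption; the real figure varies with latitude, season, and weather) and ignore storage losses:

```python
hours_in_day = 24
productive_sun_hours = 8   # assumed; varies with latitude, season, and weather

# To serve load around the clock, daytime output must cover the full day's
# demand (direct use plus charging storage), so nameplate capacity scales by:
oversize_factor = hours_in_day / productive_sun_hours
extra_panels = oversize_factor - 1

print(oversize_factor)  # 3.0 -- triple the panels
print(extra_panels)     # 2.0 -- two extra panels per panel of daytime demand
```

With only 6 productive hours the factor rises to 4, bracketing the article's "two to three extra panels" and its "doubling or tripling" of system costs.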

Now assume again that batteries are cheap; even then, the world would have difficulty producing enough of them to be impactful at grid levels. California policymakers apparently think otherwise, having implemented the nation's only mandate for that state's utilities to install grid-scale storage. Consider the reality of simple arithmetic:

All the world’s lithium battery factories collectively produce about 30 GWhr (30 billion watt-hours) of storage capacity annually. The United States alone consumes about 4,000,000 GWhr of electricity a year. Thus the entire planet’s annual production of lithium batteries for all purposes can store about five minutes worth of U.S. electric demand.
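That claim is easy to verify in a few lines, using only the two figures in the paragraph above:

```python
world_lithium_gwh = 30            # annual global lithium battery output, GWh (article)
us_annual_demand_gwh = 4_000_000  # annual U.S. electricity consumption, GWh (article)

minutes_per_year = 365 * 24 * 60                          # 525,600
gwh_per_minute = us_annual_demand_gwh / minutes_per_year  # ~7.6 GWh each minute
minutes_of_storage = world_lithium_gwh / gwh_per_minute

print(round(minutes_of_storage, 1))  # 3.9 -- just under four minutes
```

The exact result is just under four minutes; "about five minutes" is the article's generous rounding.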

As for Tesla's putative gigafactory, if it gets built, its entire annual output adds another five minutes of U.S. grid-scale storage. And at that, Tesla batteries cost at least 500 percent more than today's solution for providing electricity when outages or peaks happen. Reliability of supply comes from building extra power plants to have on standby, and storing gas in caverns or coal in piles adjacent to those power plants.

This reality is precisely why supplying reliable, affordable electricity at society scale is such a great engineering challenge, and why the National Academy honored that achievement. For other energy commodities (and, in general, most commodities), it is technically easy and inexpensive to store several months — not minutes — of demand at any given time in order to ensure price stability and physical reliability.

Still, better battery technology will emerge in due course. But it will come from some university research lab using big data to unravel chemical mysteries, not from building bigger buildings using yesterday’s chemistry. And you can bet that the company that invents the new way to store electricity will focus on selling into the huge high-value market for powering mobile devices. That’s where battery fortunes will be made, because consumers pay $20 per kilowatt-hour to keep iPads and iPhones lit, compared to $0.20 a kWh to keep the grid lit.

And, as better, cheaper battery technology does emerge, it will be as valuable, arguably more valuable, for conventional power plants, for reasons of simple economics. One would store the cheapest electrons when they are in surplus — i.e., coal-fired electricity in surplus at night delivered on uncongested lines — to resell later when prices and grid congestion are highest, around midday. Cheap grid-scale batteries will reinforce and arbitrage a complementary role for coal and solar. Somewhat ironic perhaps.

Speaking of coal, it’s impossible to talk about the grid and not make note of the carbon and global warming issues. To tilt the field away from hydrocarbon fuels, policymakers and regulators have taken actions that increase their costs, and also subsidize non-hydrocarbon energy. But the laws of physics of energy, and laws of economic reality, cannot be ignored. Politicians face increasing peril if their policies cause something as important as electricity to become increasingly expensive and less reliable.

Today’s technological breakthroughs: Smart drilling and big data

The two technologies that are reshaping the electric grid, allowing both more resilience and reliability, are not the ones pundits and policymakers expected: smart drilling, and big data.

Smart drilling has unleashed an entirely unexpected bounty of shale gas, not only making it easier to meet rising demand, but also to ensure reliability with plenty of spare capacity, all at low cost.

Just as technology led to a 200 percent improvement in the capital needed to produce a unit of PV electricity over the past decade, so too has technology driven a similar (and even greater) decline in the capital needed to produce a unit of shale gas – but the latter has happened in the past four years, and continues.

EIA sees natural gas and coal, in almost equal shares (coal still dominates in EIA’s forecast), providing about 70 percent of electric supply a decade out. These two fuels are now in a race to the bottom in terms of price, to the benefit of consumers and ensuring a permanent, low-cost electricity future (absent meddling from policymakers) that will confer an enormous economic advantage on U.S. industries.

The other disruptor, big data analytics, is made possible by the combination of proliferating low-cost sensors, ubiquitous wireless connectivity, and the Promethean power of computing. While big data will eventually impact every sector of the economy, one of the immediate benefits that real-time analytic intelligence brings is to wring greater value out of existing supply chains and infrastructures.

For some indication of the power of big data, consider the unheralded success of the PJM ISO — the system operator for the long-distance transmission system that lies between Chicago, New York City, and Washington, D.C., and its new sensor, control, and big data analytics system. After bringing it on line a little more than a year ago, it not only resulted in greater reliability and hundreds of millions of dollars in operational savings, but it also increased the system’s power-carrying capacity two-fold without adding new power lines.

While it is technically more difficult to implement that architecture on the local grids — the urban roads of the grid, versus the interstates of long-distance transmission — that is what will come next. Big data will also create greater markets for solar arrays and batteries as a feature of a central-power-plant grid. While the future may lead to different kinds of companies owning some or all the pieces of the grid, it will still be a grid and for consumers it will still feel as much like a "utility."

Still, some utility CEOs, notably NRG’s outspoken chief David Crane, believe in a post-grid world. Earlier this year, Crane said: “Think how shockingly stupid it is to build a 21st-century electric system based on 120 million wooden poles... the system from the 1930s isn’t going to work in the long term.” This is not much of an improvement over the flawed Ma Bell analogy. Today we still build furniture and houses out of wood, a system predating the Romans; we just use the same materials much more efficiently.

When the National Academy of Engineering gets around to a 21st-century retrospective, odds are that ranked as top achievements will be whatever we end up calling big data, and whatever we end up calling the technologies that unlocked the shale. And the grid, pioneered in the 20th century, will still be around, just a lot better.

Mark P. Mills is a senior fellow at the Manhattan Institute and CEO of the Digital Power Group.

Along with Huber, Mills wrote the Huber-Mills Powercosm newsletter starting back in 2001-- I even attended one of its investor conferences. I regard him as highly qualified and thoughtful in this area.

ENERGY & SUSTAINABILITY
Lockheed Claims Breakthrough on Fusion Energy
Lockheed Martin Corp said on Wednesday it had made a technological breakthrough in developing a power source based on nuclear fusion, and the first reactors, small enough to fit on the back of a truck, could be ready in a decade.