The largest solar power plant of its kind is about to turn on in California's Mojave Desert.

The Ivanpah Solar Electric Generating System will power about 140,000 homes and will be a boon to the state's renewable energy goals, but it was no slam dunk. Now, California is trying to bring conservationists and energy companies together to create a smoother path for future projects.

To get the best view of the Ivanpah solar project, you have to go up to the top of a 400-foot concrete tower. Below, close to 200,000 mirrors shimmer across a dry, dusty valley. "It's very exciting," says Dave Beaudoin, the construction manager for the $2 billion project located about an hour southwest of Las Vegas. Each mirror is about the size of a garage door, and it's mounted on a pole so it can be pointed at the tower. "We can keep the sun's energy — the rays of the sun — targeted back to the solar tower," Beaudoin says.

All of those mirrors concentrate sunlight onto the tower, heating the receiver to about a thousand degrees. It isn't the solar technology most of us think of: dark panels on rooftops. These mirrors heat a giant boiler on top of the tower, where water turns into steam. Beaudoin says that steam powers a turbine that generates electricity.

BrightSource Energy just collected $15 million of a modest $35 million tranche of venture funding. (It's modest relative to the hundreds of millions the company has raised over the last decade.) ...

Woolard's resignation came on the heels of a BrightSource press release that described the company "evolving from being a U.S. project developer to becoming a global technology provider that also offers development support as well as engineering and operational services." That's tough language to decode, but forgoing project development to become a "technology provider" is reminiscent of a startup's transition to a licensing model -- not always a good sign for a startup. ...

Several months ago, we reported that BrightSource's giant Ivanpah solar thermal project in the Mojave Desert was 92 percent complete. The 377-megawatt project consists of three 459-foot-tall towers encircled by arrays of garage-door-sized heliostats. A total of 173,500 computer-controlled heliostats will eventually reflect the sun onto the receiving towers, heating water to create steam that will drive turbines that produce electricity. Future projects from BrightSource will include thermal energy storage, as in SolarReserve's projects. Another CSP vendor, GlassPoint, directs its steam toward enhanced oil recovery rather than electrical power. Two BrightSource Energy projects have recently been shelved due to permitting issues: BSE terminated power purchase agreements for the proposed Hidden Hills and Rio Mesa CSP solar power tower projects....

Former CEO Woolard said, "The problem we're trying to solve is [... how to] decarbonize the power supply and maintain system reliability at the lowest total cost to customers." Woolard noted that each picture-window-sized heliostat mirror, installed at the rate of one per minute at the $2.2 billion Ivanpah project, is capable of providing power to approximately one home, without the "hidden integration costs to the consumer" that come with wind and solar.

Since BrightSource was founded, its competition -- natural gas and solar photovoltaics -- has gone through disruptive price drops. CSP has not yet had the opportunity to scale like those two technologies.

Researchers at MIT in the US have stumbled upon a new method to trap light that could lead to a wide variety of applications, not least of all a vast improvement in the efficiency of concentrated solar power generation.

The breakthrough involves what is being described as a kind of “perfect” mirror, which works in a way that deviates from known scientific laws. It pits light waves against light waves, setting up two waves that have the same wavelength but exactly opposite phases — where one wave has a peak, the other has a trough — so that the waves cancel each other out. Meanwhile, light of other wavelengths (or colors) can pass through freely.
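The cancellation the article describes is ordinary destructive interference, which is easy to demonstrate numerically. The sketch below (my illustration, not MIT's model) sums two waves of equal wavelength and opposite phase, then sums two waves of different wavelengths:

```python
import math

def wave(amplitude, wavelength, phase, x):
    """Value at position x of a sinusoid with the given amplitude, wavelength and phase."""
    return amplitude * math.sin(2 * math.pi * x / wavelength + phase)

xs = [i * 0.01 for i in range(200)]

# Same wavelength, opposite phase (offset by pi): the waves cancel everywhere.
cancelled = [wave(1.0, 0.5, 0.0, x) + wave(1.0, 0.5, math.pi, x) for x in xs]
print(max(abs(v) for v in cancelled))  # effectively zero

# A different wavelength is unaffected by the cancelling partner.
passed = [wave(1.0, 0.7, 0.0, x) + wave(1.0, 0.5, math.pi, x) for x in xs]
print(max(abs(v) for v in passed))  # clearly nonzero
```

The real device selects which wavelengths cancel via its structure; the arithmetic above only shows the phase-cancellation principle.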

Australia’s biggest solar energy plants have been given the funding go-ahead, clearing the way for the installation of 2 million photovoltaic panels at two sites in the NSW outback. Power company AGL on Wednesday committed to proceed with the $450 million investment in the plants, which will supply 50,000 homes with electricity and potentially pave the way for more such ventures.

Nyngan, north-west of Dubbo, will host the larger of the two plants, with a 102-megawatt capacity, while a 53-megawatt plant will be built near Broken Hill. Both should be supplying power to the eastern Australian grid by the end of 2015.

By the end of the year, Ontario will become the first jurisdiction in North America to shut down almost its entire coal fleet. Yesterday, the province announced that its last two large coal units will close before 2014, meaning more than 99 percent of the province's electricity will be generated from non-coal sources. It is a major shift for Ontario, which generated 25 percent of its electricity from coal a decade ago.

A new report from the European Wind Energy Association (EWEA) states that the power produced from turbines in deep waters of the North Sea alone could meet the EU’s electricity consumption four times over.

"Offshore wind in Europe could be providing 145 million households with renewable electricity and employing 318,000 people by 2030, while providing energy security, technology exports, and no greenhouse gases," says the EWEA.

'Deep water' in relation to wind turbines is considered to be depths of 50 metres or more. While the technology is still reasonably new, floating wind turbine designs are competitive in terms of levelised cost of energy (LCOE) with bottom-fixed foundations at depths exceeding 50 metres.

Of all Europe’s grid connected offshore wind turbines currently in service, only two are not reliant on fixed foundations.

Acknowledging that the sector must overcome significant technical, economic and political challenges in order to properly tap deep water wind resources, the EWEA says that if those challenges are overcome, the first deep offshore wind farms could be up and running by 2017.

Even at the shallow end of the pool, offshore wind power in the EU is a very big business. At the end of last year, 1,662 turbines totalling 5 GW of capacity were operating in 55 wind farms in 10 European countries, producing 18 TWh of electricity - enough to power almost five million households.

This post is an update and continued expansion to my previous posts about tight/shale oil in Bakken/Three Forks in North Dakota (ND):

* Is Shale Oil Production from Bakken Headed for a Run with “The Red Queen”?
* Is the Typical NDIC Bakken Tight Oil Well a Sales Pitch?

This post documents:

* At present oil prices, Bakken tight oil has reasonable overall prospects of being profitable.
* Between 70% and 75% of the studied wells (assuming a well cost of $9 million and an oil price of $90/bbl) were found to have a prognosis for being at or above breakeven (being profitable).
* If (or rather when) average well productivity declines further, this will add a new meaning to the term tight oil.
* Developments in average well productivity.
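The breakeven test behind the second bullet can be sketched as a simple discounted cash flow: a well breaks even when the discounted net revenue from its production profile covers the upfront well cost. The decline profile, netback fraction, and discount rate below are illustrative assumptions, not the post's actual model:

```python
def breakeven(well_cost, oil_price, netback_fraction, monthly_bbl, annual_discount=0.10):
    """True if the discounted net revenue over the production profile covers the well cost."""
    monthly_rate = (1 + annual_discount) ** (1 / 12) - 1
    npv = sum(
        bbl * oil_price * netback_fraction / (1 + monthly_rate) ** month
        for month, bbl in enumerate(monthly_bbl, start=1)
    )
    return npv >= well_cost

# Illustrative hyperbolic-style decline over ten years: a strong first month, then a long tail.
profile = [10_000 / (1 + 0.08 * m) ** 1.5 for m in range(120)]

# $9M well cost, $90/bbl, and an assumed ~50% of revenue left after royalties,
# taxes and operating costs (the netback is my assumption, not NDIC data).
print(breakeven(9_000_000, 90, 0.5, profile))
```

With assumptions like these, the verdict is very sensitive to the decline profile, which is exactly why falling average well productivity matters so much for the 70-75% figure.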

Two dead horses, five dead hens, and their owner with a couple hundred bee stings — that’s the end result of a recent bee attack in Texas. The bees in question attacked Kristen Beauregard and her animals shortly after she had finished exercising the two horses on Wednesday afternoon — according to Beauregard, the attack was completely unprovoked; she and her animals were nowhere near the hive at the time.

Hot on the heels of ARENA’s decision against supporting Alinta’s development of solar thermal power generation in Port Augusta, the federal government has launched a set of online tools, funded by the Australian Renewable Energy Agency, aimed at helping researchers, developers and financiers assess the commercial prospects of Concentrating Solar Thermal power projects in Australia.

Originally developed by the US Department of Energy’s National Renewable Energy Laboratory (NREL), the concentrating solar thermal System Advisor Model (SAM) has been adapted for Australian conditions by the Australian Solar Thermal Energy Association (AUSTELA) through a $73,500 ARENA investment.

According to AUSTELA, the adapted SAM model is “general purpose and can predict hourly, monthly and annual output of CSP, Concentrating PV, flat plate PV and a range of other renewable energy systems” – but there has been an extensive body of work around its application to CSP systems in particular.

“Concentrating solar thermal systems have the potential to play a significant role in future electricity networks as they can store energy, which means clean energy can be dispatched to homes and businesses at any time of the day or night,” said federal resources and energy minister Gary Gray.

“These new tools – which will optimise an industry-leading United States model for Australian conditions – will make it easier for developers and financiers to assess the commercial viability of concentrating solar thermal projects.”

Bee brains lacked much processing power. Just enough hardware in there to run a high-level bee-dance language where the bees could clue each other in about tasty matter resources. Adrienne had mocked this system up on a whiteboard with boxes and arrows. Julio had coded it with open-source modules.

Then they’d created these 3Dprinted plastic “bee puppets.” Their fake plastic Maker bees were, like, awesomely effective at bee dancing. Their robot bees, set dancing by Arduino, were basically Trojan Bees. They had gotten root in the hive. They had pwned the hive colony superorganism. Those bees would do whatever the hackers wanted.

“Their bee-swarm pitch is out of this world!” I told Crawferd. “I can’t believe I haven’t seen this idea before!”

The Maker kids ramped up to their triumphant climax. Being new to California, they’d noticed all the window-box marijuana plants. They’d hacked their bees to go out to forage for dope pollen.

They showed the camera their existence proof: a double fistful of honey-drenched Silicon Valley hashish.

Then little Adrienne and Julio modestly asked the public for twenty grand to go 3Dprint some beehives, so they could issue some royal-jelly marijuana prescriptions. A business-model screwup that was total facepalm. Of course their Kickstarter had exploded. Just gone ballistic. It had blown past twelve million USD in capital and was heading north at high speed.

“You have created a monster,” I told Crawferd. “I can see why you’re so upset now. This is not even funny. Where are those crazy kids? They’re gonna need to lawyer up.”

In another blow to recovery efforts in the prefecture hit hardest by the 3/11 tsunami and nuclear meltdown, Japan’s National Institute of Advanced Industrial Science and Technology has estimated the cost of decontamination and cleanup to be more than $50 billion. The central Japanese government has earmarked only $11 billion in aid to the struggling region.

“The group has estimated that the decontamination in the no-entry zones will be at around $20 billion, while the other surrounding areas will cost another $31 billion,” said the Japan Daily Press. “The estimates are based on the unit costs that the government provided as well as information collected from the affected municipalities. It also includes the expenses that will be incurred in the removal, transportation and storage of radioactive waste like the contaminated soil and water.”

It's almost a cliche that the missing piece of a renewable energy future is low-cost energy storage.

"Until we find a technology that is low-cost, highly scalable, and long-lasting, ubiquitous grid storage won't be possible. The all-liquid battery's elegant materials design and simple assembly process makes it the best chemical option we've seen for storing the grid at massive scale."

That's Khosla Ventures partner Andrew Chung's comment on Liquid Metal Battery Corporation (LMBC). He's now on the board of the firm; he funded founder Don Sadoway's research at MIT before the firm won the top ARPA-E award back in 2009. LMBC just announced that it raised an additional $15 million in funding in its Round B from Khosla Ventures, Bill Gates and the energy company Total.

We spoke with Phil Giudice, the CEO of LMBC, as well. In a release, he said, "Our Liquid Metal Battery technology is tremendously exciting because it has the potential to dramatically change the electric power system everywhere." The CEO told GTM that the company had passed the R&D stage and was moving into commercializing the technology for large-scale grid applications.

The International Energy Agency (IEA) released its latest Medium-Term Gas Market Report in St. Petersburg, Russia last month. Although the IEA sees the growth of gas in the power sector slowing, they also cite its emergence as "a significant transportation fuel." What really caught my eye was their projection that gas over the next five years would have "a bigger impact on oil demand than biofuels and electric cars combined," in light of the US shale gas revolution and tougher pollution rules in China.

That's quite an assertion, considering oil's longstanding dominance in transportation energy. As I noted in March, Italy, Pakistan and several other countries already have well-established demand for compressed natural gas (CNG) for passenger cars. Despite these hot spots, only 3% of gas is currently used in transportation globally, based on analysis from Citigroup. The IEA is forecasting that transportation growth will consume 10% of the projected global gas production increase of roughly 20 trillion cubic feet (TCF) per year by 2018. That's 2 TCF per year of additional natural gas demand in the transport sector, equivalent to 1 million barrels per day of diesel fuel.
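That last equivalence can be sanity-checked with rough heating values (the figures below are common approximations, not the IEA's own numbers):

```python
# Energy-equivalence check: 2 TCF/yr of gas vs ~1 million bbl/day of diesel.
# Heating values are common approximations, not IEA figures.
BTU_PER_CUBIC_FOOT_GAS = 1_030      # typical pipeline-quality natural gas
BTU_PER_BARREL_DIESEL = 5_770_000   # approximate energy content of a barrel of diesel

gas_btu_per_day = 2.0 * 1e12 * BTU_PER_CUBIC_FOOT_GAS / 365  # 2 TCF/yr, per day

diesel_bbl_per_day = gas_btu_per_day / BTU_PER_BARREL_DIESEL
print(f"{diesel_bbl_per_day:,.0f} bbl/day diesel-equivalent")  # just under 1 million
```

So on an energy basis the round numbers hold up, before accounting for engine-efficiency differences between gas and diesel vehicles.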

I'd be more skeptical about that figure if I hadn't seen a presentation from Dr. Michael Gallagher of Westport Innovations at the Energy Information Administration's annual energy conference in Washington, DC last Monday. Westport specializes in natural gas engine technology for heavy-duty trucks and played a major role in implementing the LNG vision of the ports of Los Angeles and Long Beach, CA a few years ago.

Dr. Gallagher made a strong case for gas in heavy-duty trucking, starting with the low cost of US natural gas compared to oil and its products. Initial growth rates in several segments look encouraging, including transit buses and new trash trucks, for which natural gas now has around half the market. Growth in China has apparently been even faster, with LNG vehicles increasing at over 100% per year (from a small base) and natural gas refueling stations growing at 33% per year since 2003.

With regard to the disastrous Macondo oil well rupture that ended in 11 deaths and triggered the largest US offshore oil spill of all time (and uncountable ongoing ecological impacts), Halliburton has 'graciously' decided to plead guilty to destroying evidence. 'Guilty?' we hear you ask? When has any large US corporation not just settled in order to not be forced to admit guilt? Well, as Reuters reports, "their willingness to plead to this may also indicate that they'd like to settle up with the federal government on the civil penalties." The maximum statutory fine for this apparent misdemeanour? $200,000! Or 0.0007% of expected revenues for 2013. Well, that'll teach 'em for sure - they won't be destroying evidence again, eh?

A major new technology has been developed by The University of Nottingham, which enables all of the world's crops to take nitrogen from the air rather than from expensive and environmentally damaging fertilisers.

Nitrogen fixation, the process by which nitrogen is converted to ammonia, is vital for plants to survive and grow. However, only a very small number of plants, most notably legumes (such as peas, beans and lentils), have the ability to fix nitrogen from the atmosphere with the help of nitrogen fixing bacteria. The vast majority of plants have to obtain nitrogen from the soil, and for most crops currently being grown across the world, this also means a reliance on synthetic nitrogen fertiliser.

Professor Edward Cocking, Director of The University of Nottingham's Centre for Crop Nitrogen Fixation, has developed a unique method of putting nitrogen-fixing bacteria into the cells of plant roots. His major breakthrough came when he found a specific strain of nitrogen-fixing bacteria in sugar-cane which he discovered could intracellularly colonise all major crop plants. This ground-breaking development potentially provides every cell in the plant with the ability to fix atmospheric nitrogen. The implications for agriculture are enormous as this new technology can provide much of the plant's nitrogen needs.

Solar and wind energy could replace all fossil fuels in Australia by 2040 if their recent rate of deployment is maintained and slightly increased over the next 27 years – delivering the country with a 100% renewable electricity grid “by default” as early as 2040.

The stunning conclusions come from research by Andrew Blakers, the director of the Australian National University’s Centre for Sustainable Energy Systems. It notes that nearly all new electricity generation capacity in recent years has been wind and solar photovoltaics (PV), and that demand has also been falling since 2008.

Blakers says that if this situation continues, Australia will achieve a 100% renewable electricity system by 2040, as existing fossil fuel power stations retire at the end of their service lives and are replaced with renewables.

And the cost will be no greater than sticking with fossil fuels because, as Bloomberg New Energy Finance notes, wind is already cheaper than new coal- or gas-fired generation and solar soon will be. These are the critical points, because renewables are often painted as expensive when compared to fully depreciated, 40-year-old fossil fuel plants, but not when compared with the new capacity required to replace the ageing fossil fuel fleet.

Blakers says his scenario works even using the more conservative technology cost forecasts prepared by the Bureau of Resource and Energy Economics. These forecasts are being updated, but they came to similar conclusions as BNEF on technology cost trends, just not quite as quickly.

The 100% by 2040 scenario is probably not that much different in scope to current trends. Australia was sitting at around 10 per cent renewables in 2010, and will probably end up with at least 25 per cent by 2020, given current trends on rooftop solar and the fixed 41,000GWh target for large scale renewables.

Wind power is now cost competitive with new coal-fired generation in India, according to a report from HSBC [pdf]. Falling costs are just one reason for the increased interest in wind. For the first time, India has identified water as a scarce natural resource in its most recent five-year plan. Nearly 90 percent of India’s industrial water demand comes from thermal power plants, according to the HSBC report.

The appeal of some renewables, such as wind and solar photovoltaic, is that they use far less water than coal, nuclear, or natural gas power plants. Across the globe, water stress is growing within the energy industry and power plants have to partially shut down when there isn’t enough water for cooling. In India, water shortages just before monsoon season in 2012 forced hydro generation and thermal power plants to partially close.

India needs all the power it can get. Last July, a sweeping power outage left about 700 million people without power. Outages are a daily occurrence in India, although usually not at that scale, because of a dearth of generation coupled with an outdated grid. The Central Electricity Authority estimates India has a peak deficit of 12 gigawatts.

The latest five-year plan calls for a doubling of renewable energy from the previous plan, including 15 gigawatts of wind, 10 gigawatts of solar, 2 gigawatts of small hydro and nearly 3 gigawatts of biomass. Individual Indian states have also instituted solar and wind installation targets.

Hydropower accounts for more electricity production than solar PV, wind, and geothermal combined. In 2012, hydropower accounted for 16% of the world’s electricity production. However, hydropower gets far less press because it is a mature technology with a much lower annual growth rate than most renewables. While solar PV increased capacity by an average of 60% per year over the past 5 years, new hydropower capacity increased at a much more modest annual rate of 3.3%. ...
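Compounding those two growth rates over the five-year window shows just how stark the difference is (a plain illustration, not figures from the report):

```python
def compound(rate, years):
    """Capacity multiple after growing at `rate` per year for `years` years."""
    return (1 + rate) ** years

# Solar PV at ~60%/yr versus hydropower at ~3.3%/yr, over five years.
print(f"solar PV: {compound(0.60, 5):.1f}x")   # roughly a tenfold increase
print(f"hydro:    {compound(0.033, 5):.2f}x")  # under a fifth more capacity
```

A tenfold expansion naturally generates more headlines than an 18% one, even though hydro started from a far larger base.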

Despite hydropower’s current dominant position among renewables, growth in consumption of hydroelectricity will likely continue to be modest, because many of the best sites for hydroelectric dams have already been developed. The exception to this is in the Asia Pacific region, where hydroelectric consumption more than doubled over the past decade. The region currently accounts for 35% of global hydroelectric consumption, and that percentage is likely to increase as countries continue to develop hydroelectric power plants. ...

In 2012, at least 78 countries used geothermal directly for energy. Over two-thirds of the geothermal energy for direct use was through geothermal heat pumps. 24 countries operated geothermal plants for electricity production. Total geothermal electricity capacity was 11.7 GW at the end of 2012. Capacity was led by the U.S. with 3.4 GW of capacity, followed by the Philippines at 1.9 GW, Indonesia at 1.3 GW, Mexico at 1.0 GW, and Italy at 0.9 GW. On a per capita basis, Iceland leads the world with 0.7 GW of capacity, which accounted for 30% of the country’s electricity in 2012.

Platts' "The Barrel" blog has an interesting post looking at the history of natural gas pipeline proposals in Australia, including the latest plan to pipe gas from the Northern Territory to the eastern states (which face an impending shortfall now that most of the coal seam gas being extracted is destined to be sent offshore in the form of LNG), with the gas potentially coming from both offshore fields and shale gas projects in the dead heart - Australia revisits transnational gas pipeline.

Australia is no stranger to the idea of transnational or even international pipelines when it comes to solving the vexed issue of getting enough gas to its eastern seaboard, home to its biggest cities.

Australia currently has two separate gas pipeline networks in the west and east of the country which supply markets of around 1 Bcf/day and 1.6 Bcf/d respectively. A much smaller, also separate, network in central Australia services the Northern Territory capital of Darwin. ...

The latest proposal for a transnational interconnection between Australia’s pipeline networks was initially aired in recent months by former Chief Minister of the Northern Territory Terry Mills, as part of his efforts to secure the future of Rio Tinto’s alumina refinery at Gove. In February, just before being ousted in a party room coup, Mills secured a deal under which Gove would be supplied with gas from Eni’s Blacktip offshore field, heralding a project which would include the construction of a A$500 million pipeline to the plant. ...

That call has now been taken up by Australia’s largest pipeline operator APA Group, manager of 14,120 km of pipeline infrastructure. One of APA’s assets is the 1,600 km Amadeus Basin to Darwin gas pipeline, which was the world’s third-longest when it was completed in 1986 at a cost of just A$380 million. ...

A raft of international oil and gas industry heavyweights have taken a foothold in northern and central Australia’s nascent shale sector over the past few years. Companies including Chevron, ConocoPhillips, Statoil, Total and BG Group have secured farm-in agreements and pledged investments of more than $1.55 billion in Australian shale, according to the US Energy Information Administration. The EIA has estimated that Australia has 437 Tcf of technically recoverable shale gas reserves, ranking the country sixth highest in the world.

Kuwait recently started the bidding process for the 70 MW Shagaya Multi Technology Renewable Energy Power Park, which will include a 50 MW CSP plant with 10 hours of thermal storage in addition to 10 MW of PV and 10 MW of wind. ...

There is much more on Kuwait’s renewable energy agenda, however, given that the state-owned Shagaya project is the first of a three-phased master plan proposed by KISR. The second phase will expand the plant’s capacity by 930 MW to bring it up to 1,000 MW, and the third by another 1,000 MW to ultimately reach 2,000 MW by 2030. By then, the complex will generate more than 5,000,000 MWh of power every year, fulfilling the demands of nearly 100,000 households.
A 100-square-kilometre (38.6 square-mile) site in Shagaya – a desert area 100km (62 miles) west of Kuwait City, near the borders with Saudi Arabia and Iraq – has been designated for the complex. And while the first phase will be financed by the government, the second and third phases are expected to be offered to investors on a Build-Operate-Transfer basis for 25 years.
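As a quick sanity check on those quoted figures (a back-of-envelope calculation, not from KISR), 5,000,000 MWh a year from 2,000 MW of capacity implies a capacity factor of about 29%, plausible for a park mixing CSP with storage, PV and wind:

```python
capacity_mw = 2_000            # full build-out by 2030
annual_output_mwh = 5_000_000  # quoted annual generation
hours_per_year = 8_760

capacity_factor = annual_output_mwh / (capacity_mw * hours_per_year)
print(f"implied capacity factor: {capacity_factor:.1%}")

# Implied household consumption at the quoted ~100,000 homes (Kuwaiti
# households are among the world's heaviest electricity users).
print(f"per household: {annual_output_mwh / 100_000:.0f} MWh/year")
```

The high implied per-household figure reflects Kuwait's air-conditioning-driven demand rather than an error in the plan's numbers.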

A week or so ago, there was a mini-flurry of blog posts announcing that peak oil was dead. Thanks to shale oil, tar sands, heavy oil, deepwater oil, and all the other kinds of oil that the peakists didn't know about, the world was now practically drowning in the stuff.

The whole thing was very strange for several reasons. First, the peak oil community not only knows about all those kinds of nonconventional oil, its forecasts have always included them in minute detail. The question isn't whether they exist, it's when production declines in existing mature fields will outpace the modest amounts of new oil we're getting from nonconventional sources and new drilling technologies. Second, the world isn't drowning in oil. There's no dispute that shale oil has ramped up over the past few years, but it's added only a couple of million barrels a day to worldwide production and it's likely to start declining pretty quickly (within five or ten years or so). It's really not that big a deal on a global scale. Third, peak oil has never been only about the exact date that production of oil hits its highest point. It's been about how long production will plateau; how steep the subsequent decline will be; how expensive it will be to extract nonconventional oil; and how much oil prices will spike up and down as demand bumps up permanently against supply limits.

Hell, a few years ago even the International Energy Agency — which historically had refused to acknowledge production limits even theoretically — finally admitted that peak oil was a reality. When you lose the IEA to the dark side, you really ought to just admit defeat.

Oil spills at a major oil sands operation in Alberta have been ongoing for at least six weeks and have cast doubts on the safety of underground extraction methods, according to documents obtained by the Star and a government scientist who has been on site.

Canadian Natural Resources Ltd. has been unable to stop an underground oil blowout that has killed numerous animals and contaminated a lake, forest, and muskeg at its operations in Cold Lake, Alta.

The documents indicate that, since cleanup started in May, some 26,000 barrels of bitumen mixed with surface water have been removed, of which more than 4,500 barrels were bitumen.

Rapid thawing of the Arctic could trigger a catastrophic "economic timebomb" which would cost trillions of dollars and undermine the global financial system, say a group of economists and polar scientists.

Governments and industry have expected the widespread warming of the Arctic region in the past 20 years to be an economic boon, allowing the exploitation of new gas and oilfields and enabling shipping to travel faster between Europe and Asia. But the release of a single giant "pulse" of methane from thawing Arctic permafrost beneath the East Siberian sea "could come with a $60tn [£39tn] global price tag", according to the researchers who have for the first time quantified the effects on the global economy.

Even the slow emission of a much smaller proportion of the vast quantities of methane locked up in the Arctic permafrost and offshore waters could trigger catastrophic climate change and "steep" economic losses, they say.

The Arctic sea ice, which largely melts and reforms each year, is declining at an unprecedented rate. In 2012, it collapsed to under 3.5 million square kilometres by mid-September, just 40% of its usual extent in the 1970s. Because the ice is also losing thickness, some scientists expect the Arctic Ocean to be largely free of summer ice by 2020.

The growing fear is that as the ice retreats, the warming of the sea water will allow offshore permafrost to release ever greater quantities of methane. A giant reservoir of the greenhouse gas, in the form of gas hydrates on the East Siberian Arctic Shelf (ESAS), could be emitted, either slowly over 50 years or catastrophically fast over a shorter time frame, say the researchers.

Concentrating the sun's rays onto solar photovoltaic (PV) modules requires walking the fine line between optimizing power output and not literally melting your very expensive super-high-efficiency solar cells. A team led by IBM Research seems to have found a way to push back the line. They have created a High Concentration PhotoVoltaic Thermal (HCPVT) system that is capable of concentrating the power of 2,000 suns onto hundreds of triple-junction photovoltaic chips measuring a single square centimeter each (they even claim to be able to keep temperatures safe up to 5,000x). The trick is that each solar PV cell is cooled using technology developed for supercomputers: microchannels inspired by blood vessels, only a few tens of micrometers in width, pipe liquid coolant in and extract heat "10 times more effectively than with passive air cooling."

The beauty is that this heat is not just thrown away; the system gets useful work out of it. So while the PV modules are 30%+ efficient at converting the sun's light into electricity, another 50% of the sun's energy is captured as heat and can then be used for things like thermal water desalination and adsorption cooling. This means that the system is capable of converting around 80% of the collected solar energy into usable energy (though the electricity is of course more useful than the thermal energy).
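The headline numbers imply a simple per-chip power budget. The sketch below assumes standard one-sun irradiance of about 1,000 W/m²; the chip area, concentration ratio and 30%/50% split come from the article:

```python
ONE_SUN_W_PER_M2 = 1_000  # standard irradiance assumption
concentration = 2_000     # "2,000 suns", per the article
chip_area_m2 = 1e-4       # one square centimetre per chip

input_w = ONE_SUN_W_PER_M2 * concentration * chip_area_m2  # 200 W per chip
electric_w = 0.30 * input_w  # ~30% converted to electricity
thermal_w = 0.50 * input_w   # ~50% recovered as usable heat

print(f"per chip: {input_w:.0f} W in, {electric_w:.0f} W electric, {thermal_w:.0f} W heat")
print(f"captured: {(electric_w + thermal_w) / input_w:.0%}")
```

Put another way, each one-square-centimetre chip handles roughly the power of a couple of incandescent light bulbs, which is why the supercomputer-style microchannel cooling matters.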

Geodynamics, the developer of Australia’s first deep hot rocks energy project, aims to secure customers for its Cooper Basin site within six to 12 months, allowing it to proceed to a larger commercial plant, said chief executive Geoff Ward.

The company has spent more than $400 million since listing in 2002, including buying equipment and drilling. Its 1-megawatt Habanero pilot plant was commissioned on April 30 and has been operating in excess of expectations since, Ward said. "We're delighted by how stable and reliable" the operation has been, he said.

The geothermal plant at Innamincka in north-eastern SA taps salty water heated to 210 degrees more than 4.2 kilometres below the surface, extracting the heat to generate electricity. The cooled brine is then pumped back down a separate well where it is reheated by the hot rocks, creating an energy loop. Only two other sites now operate so-called enhanced geothermal systems, at Soultz in France and Landau in Germany.

Geodynamics will only proceed with a 5-10 megawatt commercial plant if it can secure customers. Potential clients include Santos, which operates its own gas and oil hub at Moomba, about 70 kilometres away. Beach Energy and Chevron, meanwhile, are exploring for unconventional shale gas within 5-15 kilometres of Geodynamics's wells.

Geodynamics' competitor Petratherm seems to be losing confidence in its proposed alternative project, with the company looking to explore for shale oil in Tasmania.

After years of trying to commercialise geothermal energy in the South Australian outback, ASX-listed Petratherm announced plans to diversify into shale oil and gas in Tasmania. "This decision by Petratherm to extend into unconventional shale oil and gas exploration leverages our core areas of expertise that include basin geology and deep drilling," the company said.

Petratherm managing director Terry Kallis sought to reassure environmentally conscious investors that the company had not abandoned its geothermal project, which is now planned to be about 75 per cent smaller than previously envisaged. But funding is hard to come by, and a $13 million government grant can be used only if Petratherm can raise millions of its own, a task that will be even more difficult under a lower carbon price.

From stadiums in Brazil to a bank headquarters in Britain, architects led by Norman Foster are integrating solar cells into the skin of buildings, helping the market for the technology triple within two years. Sun-powered systems will top the stadia hosting the 2014 FIFA World Cup in Brazil. In Manchester, northern England, the Co-operative Group Ltd. office has cells from Solar Century Holdings Ltd. clad onto its vertical surfaces.

The projects mark an effort by designers to adopt building-integrated photovoltaics, or BIPV, where the power-generating features are planned from the start instead of tacked on as an afterthought. Foster and his customers are seeking to produce eye-catching works while meeting a European Union directive that new buildings should produce next to zero emissions after 2020.

“Building integrated solar in office buildings and factories which generate energy consistently during daylight hours, whilst not requiring additional expensive land space or unsightly installations, is seen as the most obvious energy solution,” said Gavin Rezos, principal of Viaticus Capital Ltd., an Australian corporate advisory company that’s one of the private equity funds putting money into the technology.

The market for solar laid onto buildings and into building materials is expected to grow to $7.5 billion by 2015 from about $2.1 billion, according to Accenture Plc, citing research from NanoMarkets. Sales of solar glass are expected to reach as much as $4.2 billion by 2015, with walls integrating solar cells at $830 million. About $1.5 billion is expected to be generated from solar tiles and shingles.

The technology provides a respite for solar manufacturers, opening the way for them to charge a premium for products. Traditional solar panel prices have fallen 90 percent since 2008 due to oversupply, cutting margins and pushing more than 30 companies including Q-Cells SE and a unit of Suntech Power Holdings Co. into bankruptcy.

A new tidal power project in WA's Kimberley region (something which has been on the cards for many years) has received approval from the WA environment minister (taking the lead in the race to develop the first tidal power plant in Australia's north).

The proposed plant will have a capacity of 40 MW. The main obstacle to it actually being built seems to be a dependency on negotiating a contract for the construction of power lines to major towns in the West Kimberley. TEA says the project (the design and costing for which was completed in 2003) is awaiting "a suitable offtake contract" before it can go ahead. Woodside's proposed Browse LNG project at James Price Point was originally considered the likely customer; however, with that project shelved, the company is hoping one of the local diamond or zinc mines will fill that role.

The cost is speculated to be in the $250-$300/MWh range, not much of a discount on diesel. RenewEconomy suggests the project would require funding support from ARENA or the CEFC. There are no details on which tidal technology would be favoured.

The FT has an article by John Dizard explaining why he thinks the shale oil boom will prove far more disappointing than many are claiming - ‘Saudi America’ remains a Washington fantasy (it seems to be readable without a subscription - though you may have to go via Google News). He quotes Bernstein's Bob Brackett as predicting oil production in the Bakken (North Dakota) will peak in 2019 at 1.2m barrels per day.

The 392MW Ivanpah solar tower power station is the biggest concentrated solar thermal project in the world. It is also the most visually arresting. It features three huge towers, each 150m tall, surrounded by vast fields of mirrors that will focus the sun's energy on a receiver located at the top of each tower. Water is boiled to create steam that then drives the turbines. It's solar generation at a massive scale, made more impressive by its surroundings. Even though it spreads over so many hectares, its size pales against the grandeur of the stunning Mojave landscape.

Ivanpah is not the only large-scale solar power station being built in this part of the world. To the north, across the state border in Nevada, a 110MW solar tower with storage is being built by SolarReserve.

To the west, in the heart of California's "high desert", First Solar is nearing completion of a 250MW AVSR solar PV project near Lancaster, while down the road SunPower has begun construction of a 579MW solar PV plant of its own. A little further north, the tables are turned as SunPower puts the finishing touches to its 250MW CVSR project, while First Solar is about to trump it with the 550MW Topaz solar PV project, which is halfway through construction.

But even as these massive projects are nearing completion, the question is being asked: Does the future of solar really lie in more of these large scale projects? Even the owners of these huge projects are not so sure.

NRG, the largest owner of generation assets in the US, and part owner of the Ivanpah project, says it is uncertain about the future of such large-scale projects, because they are hugely capital-intensive. "These projects are massive, and even though the technology is proven it is very difficult to continually build these in the US, because there are limits to where these projects can be placed," says Todd Michaels, the head of distributed generation for NRG. "Utilities are fully procured out in the south-west, where this technology is most appropriate. You are seeing a move to push new solar projects into the distribution network. That's where our CEO David Crane is saying these projects are heading – into distributed energy in general and solar in particular."

SunPower CEO Tom Werner is building two of these massive projects, but even he says he is not sure where the future lies, which is why he is having his company hedge its bets. "We do large-scale utility, large and small distributed generation, and rooftop," Werner told RenewEconomy in a recent interview. "We purposely straddle all three because we don't know the answer to your question, to be honest. Here's how we look at it. The beauty of solar is that it is easy to site where there is sun. It's quick to install, scalable, and you can make it big or small. Those are huge advantages. So you can turn the utility problem on its head – you ask yourself, where do I have transmission, where do I have load, and then you can put solar in."

Last Monday's BBC News at Ten broadcast a report by science editor David Shukman arguing that concerns "about oil supplies running dry are receding." Shukman interviewed a range of industry experts talking up the idea that a "peak" in oil production has been "moved to the backburner" - but he obfuscated compelling evidence in his own report contradicting this view.

"There's still plenty of oil - we just haven't got all of it out of the ground yet. There's not a real danger of there being no fossil fuel," one oil company executive told the BBC. "There's enough oil in this country for another 100 years with our present technology and there's more around the world to be found yet."

Following a chorus of industry hype on the wonders of shale gas and fracking, Shukman finally referred in passing to a new scientific paper published by Eos, Transactions - the newsletter of the American Geophysical Union - saying that the paper "supports the assertion that a peak in oil production is 'a myth' but argues that the rising cost of extraction could itself provide a limit, and may act as a brake on economic growth." He then closed his report with the following quote from a leading industry figure: "The era of cheap oil is over, but we're a long way from peak oil - costs will go up but the technology will respond."

The thrust of the message was that peak oil is a myth because we're not running out of oil. Even if costs go up, this will automatically spur the technological innovation that will make continued extraction of expensive oil viable.

But Shukman's characterisation of the new Eos paper is a combination of falsehood and half-truth. Far from dismissing peak oil as a myth, the paper's conclusions are much more nuanced, and point to an overwhelming body of evidence contradicting the industry hype that the rest of his report parrots uncritically.

"Peak oil is not about oil reserves or resources, neither of which translates directly into production rate", the Eos paper points out. "Peak oil is not about running out of oil but about its peak in production...

"So is the idea of peak oil a myth? If readers are expecting an abrupt decrease in oil production, then it is. But if they understand that the manifestation of peak oil is a struggle between supply and demand that is resolved through global oil markets, they will understand that the data shows that peak oil can originate from economic as well as geological factors."

Indeed, peak oil does not suggest we are 'running out of oil', but that a peak in conventional oil production will create an increasing reliance on more expensive, unconventional forms of oil and gas which have a far lower energy output. According to the Eos paper, we seem to be arriving at that point:

"Global production of crude oil and condensates... has essentially remained on a plateau of about 75 million barrels per day (mb/d) since 2005 in spite of a large increase in the price of oil. Even more important, the global net oil exports from oil-exporting countries (oil production minus internal consumption) have peaked and are in decline."

The Eos paper goes on to point out that while "total oil production has plateaued, production of oil from older existing fields has been in decline, dropping roughly 5% annually, corresponding to a loss of 3-4 mb/d." Although production from unconventional oil and gas has balanced this decline, they are "difficult and expensive" with "very low energy return on investment (EROI)." In simpler terms, "it takes energy to get energy, and more is required to produce energy from unconventional sources."
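
The EROI point can be made concrete with a couple of lines of arithmetic. This is a generic illustration of the concept; the values below are illustrative, not figures from the Eos paper:

```python
# Energy return on investment (EROI) = energy delivered / energy invested.
# The net energy left over for society shrinks sharply as EROI falls toward 1.

def net_energy_fraction(eroi):
    """Fraction of gross output remaining after the energy cost of production."""
    return 1.0 - 1.0 / eroi

# Illustrative values only: conventional crude historically sat at the high
# end of this range, while unconventional sources sit much lower.
for eroi in (20, 10, 5, 2):
    print(eroi, net_energy_fraction(eroi))
```

Halving EROI from 20 to 10 barely dents net energy (95% versus 90% of gross output), but at an EROI of 2, half of every barrel's energy is consumed just producing it.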

You may recall that late last summer the Arctic sea ice extent dropped to its lowest level on record, 49 percent below the 1979-2000 average. It's not clear if 2013 levels will match 2012's astonishing record low, but – with temperatures over the Arctic Ocean 1-3 degrees above average – the 2013 melt season has picked up in earnest during July.

“During the first two weeks of July, ice extent declined at a rate of 132,000 square kilometers (51,000 square miles) per day. This was 61% faster than the average rate of decline over the period 1981 to 2010 of 82,000 square kilometers (32,000 square miles) per day,” the National Snow and Ice Data Center writes on its website.
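
The "61% faster" figure follows directly from the two rates NSIDC quotes:

```python
# Early-July 2013 decline rate vs the 1981-2010 average (NSIDC figures).
rate_2013 = 132_000   # km^2 of ice extent lost per day
rate_avg = 82_000     # km^2 per day, 1981-2010 average

faster_pct = (rate_2013 - rate_avg) / rate_avg * 100
print(round(faster_pct))   # -> 61, matching the quoted figure
```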

Abu Dhabi, the most oil-rich of the United Arab Emirates, is now home to the world's single-largest concentrated solar power plant. The 100-megawatt Shams 1 plant cost an estimated $750 million and is expected to provide electricity to 20,000 homes, according to the BBC.

Why, you might ask? Bloomberg says the less oil Abu Dhabi uses for local consumption, the more it can export.

Sultan Ahmed al Jaber, head of Abu Dhabi Future Energy Co., speaking at a news conference for the plant's opening over the weekend, said it is part of a "strategic plan to diversify energy sources in Abu Dhabi." ...

Shams 1 uses 768 adjustable parabolic "trough mirrors" to focus sunlight onto a water boiler, producing steam that drives turbines to generate electricity, reports the website Clean Technica. The middle step in the process, it says, is to use natural gas to superheat the water.

The plant, located about 75 miles southwest of Abu Dhabi, is similar in design to Solar Energy Generating Systems (SEGS) located in California's Mojave Desert. Although Shams 1 claims to be the single-largest plant, the nine SEGS plants taken together generate more than three times as much energy and serve more than 10 times as many households at peak output.

"Mentions of 'peak oil' in news publications peaked between July 2007 and July 2008, according to Nexis" and "Web search interest in 'peak oil' peaked in August 2005 and spiked again in May 2008."

I spoke on this a little before, but I continue to be amazed by how many very smart people just don't understand how silly they look when they use THIS as evidence of Peak Oil's demise.

If "internet popularity" was the true measure of importance, we'd have given the keys of the world to Justin Bieber a long time ago. And if you looked at how many people were talking and writing about terrorism in August of 2001, those guys would probably conclude that topic peaked forever in the early '90s.

I'm shocked I have to keep repeating this, but "We judge Peak Oil by oil production rates, if you're looking at anything else, you're doing it wrong."

"Oil prices also peaked around then, hitting $145 per barrel in July 2008."

It's like Zeitlin forgets to include the second half of that sentence. His own chart shows that oil hit $145, retreated, and then resumed its march back up. RIGHT NOW as people are falling over themselves proclaiming the death of Peak Oil, oil is back in the triple digits and at 15-month highs. Over the past decade-plus, the price of oil has more than tripled.

Peak Oil can never really die, because oil is a finite resource and any finite resource peaks in production. But you can kill it in the court of public opinion, and for that to happen you'd need two things:

1) You need daily production rates to continue to skyrocket, leaving far behind any peaks of the past. But you also need something more difficult.

2) You need what AEI's James Pethokoukis called the "wonder-working power of technological innovation" to actually reduce oil prices, much like Moore's Law for computers has made memory cheaper year after year. If you're in a production boom, but you then have to turn around and tell the people of the economy that they'll have to keep paying a larger share of their income for gasoline... is that really a net win?

In oil production, you access the easy and cheap oil first, then move on to the more difficult and more expensive oil later when prices allow. That's why conventional crude oil production has already peaked, and the only thing keeping total oil production from declining are gains from much more expensive unconventional sources.

Anyone seriously telling people that Peak Oil is dead, really needs to have a strong answer when regular people ask why prices are still so high.

The Central Intelligence Agency is funding a scientific study that will investigate whether humans could use geoengineering to alter Earth's environment and stop climate change. The National Academy of Sciences (NAS) will run the 21-month project, which is the first NAS geoengineering study financially supported by an intelligence agency. With the spooks' money, scientists will study how humans might influence weather patterns, assess the potential dangers of messing with the climate, and investigate possible national security implications of geoengineering attempts.

The total cost of the project is $630,000, which NAS is splitting with the CIA, the National Oceanic and Atmospheric Administration, and NASA. The NAS website says that "the US intelligence community" is funding the project, and William Kearney, a spokesman for NAS, told Mother Jones that phrase refers to the CIA. Edward Price, a spokesman for the CIA, refused to confirm the agency's role in the study, but said, "It's natural that on a subject like climate change the Agency would work with scientists to better understand the phenomenon and its implications on national security." The CIA reportedly closed its research center on climate change and national security last year, after GOP members of Congress argued that the CIA shouldn't be looking at climate change.

The goal of the CIA-backed NAS study is to conduct a "technical evaluation of a limited number of proposed geoengineering techniques," according to the NAS website. Scientists will attempt to determine which geoengineering techniques are feasible and try to evaluate the impacts and risks of each (including "national security concerns"). One proposed geoengineering method the study will look at is solar radiation management—a fancy term for pumping particles into the stratosphere to reflect incoming sunlight away from the planet. In theory, solar radiation management could lead to a global cooling trend that might reverse, or at least slow down, global warming. The study will also investigate proposals for removing carbon dioxide from the atmosphere.

The National Academies has held two previous workshops on geoengineering, but neither was funded by the intelligence community, says Edward Dunlea, the study director for the latest project. The CIA would not say why it had decided to fund the project at this time, but the US government's apparent interest in altering the climate isn't new. The first big use of weather modification as a military tactic came during the Vietnam War, when the Air Force engaged in a cloud seeding program to try to create rainfall and turn the Ho Chi Minh Trail into muck, and thereby gain tactical advantage. Between 1962 and 1983, other would-be weather engineers tried to change the behavior of hurricanes using silver iodide. That effort, dubbed Project Stormfury, was spearheaded by the Navy and the Commerce Department. China's "Weather Modification Office" also controversially seeded clouds in advance of the 2008 Beijing Olympics, hoping to ensure rain would fall in the Beijing suburbs instead of over the Olympic stadiums.

Although previous efforts to manipulate weather and climate have often been met with mockery, many geoengineering proposals "are fundamentally doable, relatively cheap, and do appear to be able to reduce climate risk significantly, but with risks," explains David Keith, a Harvard researcher and top geoengineering proponent.

But if geoengineering is cheap and "fundamentally doable," as Keith claims, that suggests foreign countries, or even wealthy individuals, could mess with the climate to advance their own ends. "This whole issue of lone actors: Do we need to be concerned about China acting unilaterally? Is that just idle chatter, or is that something the US government should prepare for?" asks Ken Caldeira, a geoengineering researcher at the Carnegie Institution's Department of Global Ecology and a member of the current National Academy of Sciences panel.

At least one individual has already tried modifying the climate. Russ George, the former head of Planktos, a company that works to develop technology to deal with global warming, seeded the Pacific Ocean off western Canada with iron to generate a plankton bloom that, in turn, was supposed to suck up carbon dioxide from the air. George's effort was widely condemned, but at present there's little to stop other individuals or countries from trying it or something similar. That's part of what has the US intelligence community interested.

Technophilic environmentalists, including myself, tout the 3D printing revolution as a boon that could eliminate waste in manufacturing. But is that really true? Even if it is true, does it matter compared to the extra energy used? And what about toxins — does it release more, or less?

No one has done this comparison before in a comprehensive, quantitative way, so some colleagues and I in the UC Berkeley mechanical engineering department set out to find the answers. The results were tricky and surprising.

First, let's bust a myth: 3D printing does not mean zero waste. There are many kinds of 3D printers, making things in very different ways; we measured two kinds. An "FDM" machine (such as a RepRap or Makerbot, sort of a hot glue gun with XYZ controls) can actually have negligible waste, if your model doesn't need any support material to shore it up while printing. (That's a big "if.") But we found that an inkjet 3D printer (which lays down polymeric ink and UV-cures it layer by layer) wastes 40 to 45 percent of its ink, not even counting support material, and that ink can't be recycled. Other researchers studying other kinds of 3D printers have found significant waste in some of them as well.

To see whether 3D printing will be a sustainability win, we compared it to machining by a computer-controlled mill (starting with a block of stuff and cutting away everything you don't want). We only looked at machining things out of plastic, because that's what these FDM and inkjet 3D printers do. Let's be clear: most plastic consumer products are not machined; they're injection-molded. But 3D printing is not going to replace injection-molding for mass-manufactured products (plastic parts made in the millions). It is replacing machining for smaller runs (1 unit, 10 units, maybe 1,000 units).

We compared them by doing a life-cycle assessment (LCA) of the two 3D printers and the CNC mill, including the materials and manufacturing of the machines themselves, transportation, energy use, material in the final parts, material wasted, and the end-of-life disposal of the machines. ...

The 3D printers' impacts mostly came from electricity use, which is simply a function of time, so anything that reduces the time spent running also reduces eco-impacts. The mill's impacts were mostly from material use and waste, but energy use was significant too. The resources and manufacturing to make the machines themselves were a small portion of impacts when they run at high utilization, as shown above; but if you only make one part per week, those embodied impacts can be significant for the FDM machine and the mill.
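
The utilization effect is just amortization: the embodied impact of building the machine is spread over every part it ever makes. A hypothetical sketch (all numbers are made up for illustration; only the qualitative pattern mirrors the study):

```python
# How machine utilization dilutes embodied impacts. Units and values are
# illustrative, not taken from the Berkeley study.

def impact_per_part(embodied, lifetime_parts, energy_per_part, material_per_part):
    """Eco-impact allocated to a single part (arbitrary units)."""
    return embodied / lifetime_parts + energy_per_part + material_per_part

EMBODIED = 10_000.0   # impact of manufacturing the printer itself

# High utilization: the machine's embodied impact nearly vanishes per part.
busy = impact_per_part(EMBODIED, lifetime_parts=50_000,
                       energy_per_part=5.0, material_per_part=2.0)

# One part per week for ~10 years: the machine's own footprint dominates.
idle = impact_per_part(EMBODIED, lifetime_parts=520,
                       energy_per_part=5.0, material_per_part=2.0)

print(busy, idle)   # ~7.2 vs ~26.2 impact units per part
```

This is why sharing a few heavily used machines beats many lightly used ones.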

The final verdict, then, is that 3D printing can be greener, if it's the right kind (FDM); but again, the biggest environmental win comes from sharing the fewest tools so each has the most utilization. If you want to know more, the full study (with far more detail in methodology and results, including breakdowns of impacts by source for all 22 scenarios studied) has been submitted to the Journal of Rapid Prototyping. Be patient, though; peer-reviewed academic publications take a year or more to get published.

Two of the key metrics that will be watched closely in the global solar industry reporting season that has just commenced are the price of panels sold, and the cost of manufacture. The difference is what the industry calls the margin. ...

REC Solar, one of the leading European solar companies, produced these two graphs in its results last week.

The first is the price of solar modules, which shows a rebound. The second is the crucial one for the future of the industry, and its ability to undercut fossil fuels over the long term, because it shows that the cost of manufacture of a solar module will fall around 20 per cent over the year – despite the 60-80 per cent falls achieved over the previous three to four years. The same story is expected to be repeated among many other manufacturers.

The first floating wind turbine in U.S. history went upright and onto the water in Brewer, Maine, on Friday, on its way to being put in place and connected to the grid off the coast near the town of Castine.

The 65-foot-tall VolturnUS 1:8 prototype is a small-scale model of the giant 6-megawatt turbines the University of Maine’s Advanced Structures and Composites Center and its partners in the DeepCWind Consortium hope to have in the water someday. Deep-water wind supporters in Maine believe that by 2030 they can grab 5 gigawatts of power with arrays of large turbines up to 50 miles off the coast, away from conflicts and where the winds blow strong and consistent. ...

Floating turbines are seen as a next logical step in offshore wind development. Standard offshore turbines are nearly always installed in waters less than 30 meters deep, but deeper water accessible only with floating turbines could offer even better wind as well as fewer stakeholder and aesthetic conflicts. A couple of demonstration floating projects have launched in Europe: one by Energias de Portugal and Principle Power off Portugal, and the world's first, Statoil's Hywind, in 2009.

Beyond this smaller-scale pilot, DeepCWind is planning to build two of the 6-megawatt full-sized turbines in the 2015-17 period. To aid in that effort, this past December the U.S. Department of Energy gave the university a $4 million grant (at the same time it backed the Oregon floating project), and other such projects are brewing around the world.

Ultimately, the Maine group aims to have some 80 turbines floating in a 4-mile by 8-mile zone 20 miles from the coast. The team figures those turbines will be able to produce electricity at 10 cents per kilowatt-hour without subsidies, meeting a 2020 goal of the DOE.
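
Scaling up the stated numbers gives a sense of the ambition. Only the turbine count, the 6-megawatt rating and the 10 cents per kilowatt-hour target come from the article; the capacity factor is an assumed, illustrative value:

```python
# Rough scale of the proposed Maine floating wind array.
turbines = 80
rating_mw = 6
capacity_factor = 0.45     # assumed; offshore wind is often quoted around 40-50%
hours_per_year = 8760

capacity_mw = turbines * rating_mw
annual_mwh = capacity_mw * capacity_factor * hours_per_year

print(capacity_mw)         # 480 MW of capacity
print(round(annual_mwh))   # ~1.9 million MWh/year under these assumptions
```

At the 10 c/kWh target, that output would be worth roughly $190 million a year.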

The Harvard Business Review has an article on Shell's scenario planning practice over the years - Living in the Futures - going back to its genesis in 1965 and chronicling the evolution of the practice over the years.

The authors of the article are a former Shell scenario planner and a former Shell executive who recently completed a history of scenario planning at the company after interviewing almost every surviving veteran of the operation, along with current and former top company executives. They identified the following principles that have been followed by scenario planners over the years:

Make It Plausible, Not Probable

Strike a Balance Between Relevant and Challenging

Tell Stories That Are Memorable Yet Disposable - You are trying to manipulate people into being open-minded.

With the world’s population headed toward 9 billion at mid-century and millions of people climbing out of poverty, global energy demand could increase by as much as 80% by 2050. That’s according to Shell’s latest scenarios, which look at trends in the economy, politics and energy in considering developments over the next half a century.

What might lie ahead 50 years from now… or even in 2100? We consider two possible scenarios of the future, taking a number of pressing global trends and issues and using them as “lenses” through which to view the world.

The scenarios provide a detailed analysis of current trends and their likely trajectory into the future. They dive into the implications for the pace of global economic development, the types of energy we use to power our lives and the growth in greenhouse gas emissions.

The scenarios also highlight areas of public policy likely to have the greatest influence on the development of cleaner fuels, improvements in energy efficiency and on moderating greenhouse gas emissions.

Mountains

The first scenario, labelled “mountains”, sees a strong role for government and the introduction of firm and far-reaching policy measures. These help to develop more compact cities and transform the global transport network. New policies unlock plentiful natural gas resources – making it the largest global energy source by the 2030s – and accelerate carbon capture and storage technology, supporting a cleaner energy system.

Oceans

The second scenario, which we call “oceans”, describes a more prosperous and volatile world. Energy demand surges, due to strong economic growth. Power is more widely distributed and governments take longer to agree major decisions. Market forces rather than policies shape the energy system: oil and coal remain part of the energy mix but renewable energy also grows. By the 2070s solar becomes the world’s largest energy source.

The Energy Collective has an interesting look at the relative merits of CNG and LNG for fueling heavy road transport vehicles (which seems to be part of the general effort on the part of Shell and others to try to boost the use of natural gas for transport) - A New Debate Emerges: LNG or CNG for the Long Haul

Amidst the constant discussion of plentiful domestic natural gas and its use as a transportation fuel, an unusual technological and philosophical debate has emerged. Those familiar with the industry know that until recently, fleet managers considering the conversion from gasoline or diesel to natural gas had basically two options: compressed natural gas (CNG) was the choice for any return-to-base, short mileage vehicles, and liquefied natural gas (LNG) was the option for long-haul on-highway Class 8 trucks, also known as tractor trailers or semis. The reasoning behind this was relatively straightforward, and more or less a product of a few issues inherent to gaseous rather than liquid fuel (energy density, tank storage capacity, re-fueling time). However, due to a variety of innovations, a paradigm shift may be under way.

In comparing alternative fuels to gasoline or diesel, a major consideration is the relative energy density and associated cost, weight and size of on-board fuel storage. For natural gas, when compressed, its energy density is only about a quarter that of diesel, and when liquefied just 60% of the energy density of diesel. Therefore, either option requires greater fuel storage capacity to achieve a comparable range, which means more and/or larger tanks.

Compared to CNG, LNG contains 2.4 times more energy per diesel gallon equivalent (DGE). Moreover, since LNG, like diesel and gasoline, is a liquid, one could achieve comparable refueling speed, whereas the level of compression required to “fast-fill” with CNG is very high (~3,600 psi). As a result, for the long-haul trucking sector, the energy density and associated cost(s), weight and on-board storage capacity of LNG have long been viewed as the more attractive, viable option.
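
The 2.4x figure is just the ratio of the two energy densities quoted above, and it translates directly into tank volume:

```python
# Volumetric energy density relative to diesel, per the figures above.
cng_vs_diesel = 0.25   # CNG: about a quarter of diesel
lng_vs_diesel = 0.60   # LNG: about 60% of diesel

print(lng_vs_diesel / cng_vs_diesel)   # -> 2.4, the quoted LNG:CNG ratio
print(1 / cng_vs_diesel)               # -> 4.0: CNG needs ~4x diesel's tank volume
print(round(1 / lng_vs_diesel, 2))     # -> 1.67: LNG needs ~1.67x
```

The tank-volume penalty is what has historically pushed long-haul trucking toward LNG rather than CNG.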

Relatively recent advances in tank storage capacity and “fast fill” refueling technology have allowed room for debate as to whether LNG really is the only natural gas option for the long-haul trucking industry. To best highlight the philosophical nature of this emerging debate, it may be best to look at two of the leading natural gas refueling infrastructure providers, Clean Energy Fuels (CLNE) and Trillium CNG (TEG subsidiary), each of which has taken an opposing view on this topic.

Clean Energy was the first mover in the industry and is now by far the largest provider of natural gas refueling infrastructure in the US. It is betting big on CNG being the choice for local urban fleets (refuse vehicles, delivery trucks, etc.) but LNG being the option for long-haul tractor-trailers. Alternatively, Trillium CNG, along with its partners at AMP Americas, a Chicago-based investment firm, strongly believes that CNG should be the choice for all heavy-duty fleets, regardless of distance traveled or route. Without commenting on which approach is better, the following will help to explain each company's thought process.

CNG and LNG are both proven forms of natural gas storage, with distinct advantages over diesel and gasoline when used as a transportation fuel. To produce CNG, natural gas is taken directly out of the United States' expansive network of natural gas pipelines, whereas LNG must be cryogenically liquefied at -260 degrees F and often must travel via ground transportation (tanker truck) to stations across the US. With pipeline access, LNG can alternately be produced from the gas grid through MMLS (movable modular liquefaction system) units. On-site, CNG is compressed immediately and enters a truck in a process that is almost identical to traditional fueling practices, from the driver's perspective. On the other hand, LNG requires drivers to wear a mask and gloves to protect themselves against cryogenic burns.

For the Class 8 truck sector, Trillium/AMP have decided to build CNG stations because, in their words, "it is a cheap, simple and safe way to transport and store natural gas." They have also found that the relative simplicity of CNG makes it easier to maintain and less expensive to produce than LNG. For example, according to their general pricing model and marketing materials, on average, "end users of CNG gain a $.48 advantage over LNG, for a product that works equally well, has less associated hazards and a greater built-in infrastructure across the US."
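To give a sense of what a per-gallon-equivalent price advantage means at fleet scale, here is a minimal sketch using the $0.48 figure cited by Trillium/AMP. The annual mileage, fuel economy and fleet size are hypothetical assumptions chosen only for illustration.

```python
# Hypothetical fleet assumptions (not from the article):
ANNUAL_MILES_PER_TRUCK = 100_000
MPG_DGE = 6.0                    # miles per diesel gallon equivalent
FLEET_SIZE = 50
CNG_ADVANTAGE_PER_DGE = 0.48     # the $.48/DGE figure cited by Trillium/AMP

dge_per_truck = ANNUAL_MILES_PER_TRUCK / MPG_DGE
savings_per_truck = dge_per_truck * CNG_ADVANTAGE_PER_DGE
fleet_savings = savings_per_truck * FLEET_SIZE
print(f"~${savings_per_truck:,.0f} per truck per year, "
      f"~${fleet_savings:,.0f} across the fleet")
```

Under these assumptions the advantage works out to roughly $8,000 per truck per year, which is why per-DGE pricing looms so large in fleet purchasing decisions.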

Clean Energy, for its part, has invested heavily in LNG infrastructure, including two liquefaction facilities, to supply a network of 150 existing refueling stations, with more in the works, that it calls America's Natural Gas Highway. While the production and transport of LNG require greater technical expertise and significantly more capital than CNG, LNG delivers cost savings on refueling infrastructure and operation, primarily because "fast-fill" CNG stations demand large amounts of costly electricity to achieve the necessary compression. LNG stations also offer greater flexibility in siting (no need for natural gas pipeline access) and in future expansion.

Most people in Britain want to reduce reliance on fossil fuels, but due more to fears of shortages and rising prices than to fears about climate change, according to a poll developed by researchers at Cardiff University and funded by the UK Energy Research Centre.

Nearly 2,500 people were surveyed across England, Scotland and Wales in August 2012. The results, published on Tuesday in a report on "Transforming the UK energy system: public values, attitudes and acceptability," provide a trove of information about public opinion on climate and energy policy.

By a large majority, respondents were either very concerned (24 percent) or fairly concerned (50 percent) about climate change and thought it was partly (48 percent) or mainly (28 percent) caused by human activity. Only a minority thought fears about climate change have been exaggerated (30 percent), though more expressed uncertainty about what the effects will really be (59 percent).

Nearly everyone agreed with the statement that Britain needs "to radically change how we produce and use energy by 2050". ...

By overwhelming majorities, those polled were fairly or very concerned gas and electricity would become unaffordable (83 percent); Britain will become too dependent on energy from other countries (83 percent); the country will have no alternatives if fossil fuels are no longer available (83 percent); and petrol will become unaffordable (78 percent).

Nearly four out of five respondents agreed the country should reduce its reliance on fossil fuels (79 percent). When asked for their reasons, respondents cited concerns about fossil fuels running out or being unsustainable or non-renewable (48 percent), about cost (7 percent) and about dependence on other countries (5 percent), compared with worries that they are harmful to the environment and polluting (19 percent) or contribute to climate change (17 percent).

As scorching weather envelops the Northeast and the Midwest, electric utilities are scrambling to keep the power on while air-conditioners strain utilities’ capacity. By Tuesday afternoon in New York City and Westchester County, for instance, Consolidated Edison had logged nearly 7,700 interruptions since the heat arrived on Sunday, and it had dispatched crews to restore almost all of the power.

Such disruptions have plagued utilities for years and raise a perennial question: how do they keep extra electricity on hand and ready to go, avoiding the need to cut voltage in stressed neighborhoods and lowering the risk of blackouts?

Now, several utilities, including Con Edison, National Grid and the large European utilities Enel and GDF SUEZ, have signed up to fine-tune and test what they hope could lead to an answer — a battery half the size of a refrigerator from Eos Energy Storage, the company said Tuesday. If the testing goes well, the batteries hold the promise of providing storage that until now has been unaffordable on a large scale. “Energy storage is no longer an idea and a theory — it’s actually a practical reality,” said Steve Hellman, Eos’s president. “You’re seeing a lot of commercial activity in the energy storage sector.”

Part of the appeal is economic: utilities could buy power from centralized plants during off-peak hours, when it is cheaper, and use it to feed the grid at peak hours when it is typically more expensive. That could also relieve congestion on some transmission lines, reducing strain and the need to spend money upgrading or repairing them. In addition, batteries could help integrate more renewable sources like solar and wind into the power grid, smoothing out their intermittent production.
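The economic appeal described above can be made concrete with a simple buy-low, sell-high sketch that also accounts for round-trip losses in the battery. The prices, efficiency and battery size below are illustrative assumptions, not figures from the article.

```python
# Illustrative assumptions (not from the article):
OFF_PEAK_PRICE = 0.03   # $/kWh paid to charge overnight
PEAK_PRICE     = 0.15   # $/kWh value of power delivered at peak
ROUND_TRIP_EFF = 0.75   # fraction of stored energy recovered on discharge
BATTERY_KWH    = 1000   # usable storage capacity

charge_cost  = BATTERY_KWH * OFF_PEAK_PRICE
energy_out   = BATTERY_KWH * ROUND_TRIP_EFF
peak_value   = energy_out * PEAK_PRICE
daily_margin = peak_value - charge_cost
print(f"Charge cost ${charge_cost:.2f}, peak value ${peak_value:.2f}, "
      f"margin ${daily_margin:.2f} per daily cycle")
```

The margin only materializes when the peak/off-peak price spread outweighs the energy lost in the round trip, which is one reason storage economics hinge on both battery cost and local price patterns.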

“Energy storage in general has been kind of a holy grail for utilities — a lot of the generation and demand is instantaneous,” said Joseph Carbonara, project manager in research and development at Con Edison, who is managing the Eos program. “The utilities have always been looking to buffer that.”

Utilities and institutions across the country, many with grants from federal or state energy departments, are testing energy storage technologies. Con Edison and the City University of New York are using a different zinc-based battery from Urban Electric Power to help reduce the school’s peak energy use as part of a New York State Energy Research and Development Authority program. In California, Pacific Gas and Electric is studying sodium-sulfur batteries that can store more than six hours of energy. And Duke Energy is working with lead acid batteries from Xtreme Power that are linked to a wind farm in Texas.

At the same time, there are a host of start-ups racing to develop different technologies for a wide range of applications, and already there are some large-scale batteries tied to the grid. But the technology has generally proved too expensive for widespread adoption.

Eos says it has gotten around that problem. Its battery relies on zinc, a relatively plentiful and cheap element. Eos executives said the company projects a cost of $160 a kilowatt-hour, which would make its electricity cheaper than that from a new gas plant built to help meet periods of high demand. Other battery technologies range from $400 to about $1,000 a kilowatt-hour.
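One way to compare the capital-cost figures above is to spread each battery's up-front cost over every kilowatt-hour it delivers across its life. The cycle-life figure below is an assumption for illustration; only the $160, $400 and $1,000 per-kWh capital costs come from the article.

```python
# Capital costs per kWh of capacity from the article; cycle life is an
# assumed illustrative figure, not an Eos specification.
ASSUMED_CYCLE_LIFE = 5000  # full charge/discharge cycles over battery life

def cost_per_kwh_delivered(capital_per_kwh, cycle_life):
    """Capital cost spread over every kWh the battery delivers in its life."""
    return capital_per_kwh / cycle_life

for name, capex in [("Eos (projected)", 160),
                    ("other tech (low)", 400),
                    ("other tech (high)", 1000)]:
    per_kwh = cost_per_kwh_delivered(capex, ASSUMED_CYCLE_LIFE)
    print(f"{name}: ${per_kwh:.3f} per kWh delivered")
```

This simple spread ignores efficiency losses, financing and maintenance, but it shows why a few hundred dollars per kilowatt-hour of capacity can make or break grid-scale adoption.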

A new $135 million research facility aims to solve a puzzle: how can countries prepare for an energy system that relies heavily on renewable energy? It can also test ways to improve reliability under stress, for example when demand soars in the summer as the air-conditioning load taxes the grid.

Because wind and solar energy supply power intermittently, they complicate the job of grid operators. Other new energy technologies are coming online, too, including electric vehicles, energy storage, efficient buildings that cut power use during peak times, and small-scale natural-gas generators and fuel cells. Integrating all of these on a large scale is a further challenge.

The National Renewable Energy Laboratory (NREL) in Golden, Colorado, created the Energy Systems Integration Facility (ESIF) to understand how to best operate the pieces of a more diverse energy system. Drawing on a supercomputer and power equipment that can create a megawatt-scale mini-grid within the facility, product engineers and utilities can simulate the impact of new technologies without causing problems to functioning grids.

Regions with a high percentage of wind and solar now rely on daily forecasts and stand-by fossil-fuel power plants to maintain reliable service. But once renewable energy is more than 20 percent of capacity, grid planners need more sophisticated tools, says Benjamin Kroposki, director of energy systems integration at NREL. “We saw this big shift. If we are successful in reaching cost targets for individual technologies, then what? You need to start doing systems integration,” he says.

An NREL analysis published last year found that, with a more flexible system, the U.S. could get 80 percent of its electricity from existing renewable energy technologies by 2050 (see “The U.S. Could Run on 80 Percent Renewable Electricity by 2050”). Germany and Denmark already have about 20 percent renewable electricity and Germany plans to achieve around 80 percent renewable energy, in both electric power and transportation, by 2050.