Geo-engineering is the study and implementation of technical ways to change (and arguably improve) things like weather patterns, river paths, soils, climates and sea currents on Earth. Recently, geo-engineering has received special attention for efforts to combat global warming.

Friday, December 13, 2013

Ocean tunnels are proposed by Patrick McNulty as a way to combat global warming. Many of these tunnels, lined up across the Gulf Stream and the Kuroshio Current, could supply large quantities of clean energy to the North American East Coast and to East Asia.

Such tunnels can supply energy continuously, i.e. 24 hours a day, all year, making them suitable to supply base load energy as currently generated by coal-fired power plants and nuclear power plants.

Ocean tunnels thus hold the potential to supply huge amounts of clean energy and facilitate a rapid move to a sustainable economy, as part of the comprehensive and effective action needed to combat climate change. This is pictured in the image below under part 1.

Ocean tunnels can be combined with Ocean Thermal Energy Conversion (OTEC) methods, which use the temperature difference between the cooler deeper parts of the ocean and warmer surface waters to run a heat engine and produce energy. Once such a system is in place, it has access both to deeper parts of the ocean and to surface waters, while generating a lot of energy. Such a system can also be used to pull up sunken nutrients from the depths of the ocean and release them at surface level to fertilize the waters there, while the colder water output by OTEC will sink, taking newly-grown plankton along to the ocean depths before it can revert to CO2, as described in the earlier post Using the Oceans to Remove CO2 from the Atmosphere.
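To get a sense of scale for the OTEC component, the maximum (Carnot) efficiency of a heat engine running between surface and deep water can be sketched in a few lines of Python. The 25°C and 5°C temperatures are assumptions typical of tropical waters, added here for illustration; they are not figures from this post.

```python
# Carnot limit for an OTEC heat engine, using assumed temperatures:
# 25 degC tropical surface water, 5 degC deep water.
T_warm = 25.0 + 273.15  # surface water temperature, kelvin
T_cold = 5.0 + 273.15   # deep water temperature, kelvin

carnot_limit = 1.0 - T_cold / T_warm
print(f"Carnot efficiency limit: {carnot_limit:.1%}")  # about 6.7%
```

Real OTEC plants achieve only a few percent net efficiency, which is why they need very large water flows; that is part of the attraction of combining OTEC with tunnels that already move large volumes of water.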

Tunnels could regulate temperatures in the Arctic in a number of ways. The clean electricity they generate can replace polluting forms of energy that warm up the Arctic. The clean energy tunnels generate can also be used in projects that help reduce temperatures in the Arctic. Furthermore, the turbines in tunnels can slow ocean currents somewhat, thus reducing the flow of warm water into the Arctic.

Additionally, tunnels hold the potential to divert warm water elsewhere and to move colder water into places that could otherwise get too warm, i.e. part 2 (heat management) of the above action plan, more specifically management of water temperature.

Tunnels could be shaped to guide the flow of water in a specific direction. This could divert some of the water that currently flows via the North Atlantic Current towards the Arctic Ocean onto a southward course along the Canary Current, down the coast of West Africa.

Thus, tunnels could produce energy to pump water elsewhere, for instance onto the sea ice and glaciers to thicken the ice, or up into the air as a spray of sea water to create clouds. The energy could also be used in other projects that help reduce temperatures in the Arctic. Additionally, tunnels could be shaped in ways that guide water, which works even when no energy is generated. Tunnels are a concept with many applications; testing and further studies will show which applications are attractive.

A comprehensive action plan will need to consider a wide range of action. A warming Arctic results in changes to the Jet Stream, which in turn means that more extreme weather can be expected, as illustrated by the video below, by Paul Beckwith.

In July 2013, water off the coast of North America reached 'Record Warmest' temperatures and proceeded to travel to the Arctic Ocean, where it is still warming up the seabed, resulting in huge emissions of methane from the Arctic Ocean's seafloor.

NOAA: part of the Atlantic Ocean off the coast of North America reached record warmest temperatures in July 2013

Diversion of ocean currents could reduce warming of the waters in the Arctic. As the image below shows, warm water is carried by the Gulf Stream all the way into the Arctic Ocean.

Warming up of the waters in the Arctic is threatening to cause release of huge quantities of methane that is held in sediments under the seabed, as discussed in the post Quantifying Arctic Methane.

Wednesday, December 4, 2013

Methane can be released from hydrates during an earthquake or by rising ocean temperatures, and this can contribute significantly to global warming. Stimulating microbes to consume the methane in the water could prevent methane from entering the atmosphere and, as a new study has found, trace metals may hold the key. The following is from a Georgia Institute of Technology news release.

A pair of cooperating microbes on the ocean floor “eats” this methane in a unique way, and a new study provides insights into their surprising nutritional requirements. Learning how these methane-munching organisms make a living in these extreme environments could provide clues about how the deep-sea environment might change in a warming world.

Scientists already understood some of the basic biochemistry of how these two organisms consume methane, but the details of the process have remained mysterious. The new study revealed that a rare trace metal, tungsten (also used as filaments in light bulbs), could be important in the breakdown of methane.

Glass works in a chamber where she can control the oxygen
levels to mimic the deep sea environment. Credit: Rob Felt.

“This is the first evidence for a microbial tungsten enzyme in low temperature ecosystems,” said Jennifer Glass, an assistant professor in the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology.

The study was recently published online in the journal Environmental Microbiology. The research was sponsored by the Department of Energy, NASA Astrobiology Institute and the National Science Foundation. Glass conducted the research while working as a NASA Astrobiology post-doctoral fellow at the California Institute of Technology, in the laboratory of professor Victoria Orphan.

The methane-eating organisms, which live in symbiosis, consume methane and excrete carbon dioxide.

“Essentially, they are eating it,” Glass said. “They are using some of the methane as a carbon source and most of it as an energy source.”

Phylogenetically speaking, one microbial partner belongs to the Bacteria and the other to the Archaea, representing two distinct domains of life. The archaeon is known as ANME, for anaerobic methanotrophic archaea, and its partner is a sulfate-utilizing deltaproteobacterium. Together, the organisms form “beautiful bundles,” Glass said.

For a close-up view of the action on the sea floor, the research team used the underwater submersible robot Jason. The robot is an unmanned, remotely operated vehicle (ROV) and can stay underwater for days at a time. The research expedition in which Glass participated was Jason’s longest continuous underwater trip to date, at four consecutive days underwater.

The carbon dioxide excreted by the microbes reacts with minerals in the water to form calcium carbonate. As the researchers saw through Jason’s cameras, calcium carbonate has formed an exotic landscape on the ocean floor over hundreds of years.

“There are giant mountains on the seafloor of calcium carbonate,” Glass said. “They are gorgeous. It looks like a mountain landscape down there.”

While on the seafloor, Jason’s robotic arm collected samples of sediment. Back in the lab, researchers sequenced the genes and proteins in these samples. The collection of genes constitutes the meta-genome of the sediment, or the genes present in a particular environment, and likewise the proteins constitute a metaproteome. The research team discovered evidence that an enzyme used by microbes to “eat” methane may need tungsten to operate.

The enzyme (formylmethanofuran dehydrogenase) is the last in the pathway of converting methane to carbon dioxide, an essential step for methane oxidation.

Microorganisms in low temperature environments typically use molybdenum, which has similar chemical properties to tungsten but is usually much more available (tungsten is directly below molybdenum on the periodic table). Why these archaea appear to use tungsten is unknown. One guess is that tungsten may be in a form that is easier for the organisms to use in methane seeps, but that question will have to be answered in future experiments.

Tuesday, May 7, 2013

The apparatus consists of a vertical cylindrical wind-rotor, the interior of which is used as an ideal drop size cloud making machine.

The inside surface of the cylindrical wind rotor has metal-coated polyester film laminated to it, with the metal-coated surface facing inwards.

The metal-coated polyester film has been coated with a light-sensitive emulsion, photo-exposed in a lattice of dots, and etched to produce an array of nano-cones on the surface of the metal, surrounded by a hexagonal lattice of valleys.

Spaced by insulators, a few millimeters from the nano-cone surface, is a concentric cylinder of metal mesh. This will probably be silver wire mesh of around 1mm grid spacing and 0.1mm wire gauge.

At the centre of the cylinder is a non-rotating, star buttressed spar.

The star-buttressed spar has microbubble-aerated water plumbed through it, to a regularly spaced grid of de Laval nozzles of around 1 mm diameter, aiming tangentially at the inside of the rotor from the outer tips of the star buttress.

The water supply aeration is around 50%, with the bubble size controlled to around 0.1 mm. This should produce an atomised spray of water droplets around 0.1 mm in diameter from the de Laval nozzles.

The 0.1mm water droplets transfer their energy to rotor rotation, and air vortex motion, in the cylinder of air close to the inner surface of the rotor cylinder.

The 0.1mm water droplets pass through the metal mesh, and land on the nanocone surface, producing a thin film of water.

A high frequency, high voltage, alternating electric potential is supplied between the nano-cone metal film, and the metal mesh.

When the voltage peaks, the electric field will cause each nano-cone to jet a charged micro-droplet of water. The apparatus will be tuned so that these droplets are around half the ideal size for our perfect clouds.

The opposite charge on the metal mesh will accelerate each charged droplet. The voltage frequency will be such that the droplet reaches the mesh just as the polarity has fully reversed. This ensures that the droplet passes through the mesh and is carried by its momentum into the non-rotating air mass at the centre of the rotor.

As droplets of alternating polarity are being fired into the rotor-core, each droplet will quickly be attracted to an oppositely charged droplet, combining to form a neutral droplet of the Ideal Size.

At the bottom of the rotor cylinder, the de Laval nozzles are pointed slightly upward to induce a helical input of the air and large-droplet mixture, entraining and sucking in air from the open bottom of the rotor.

The axis-wise upward angling of the lower de Laval nozzles reduces the further up the rotor you go, reaching purely tangential before the top. This will create an inward airflow towards the rotor axis.

At an average droplet velocity from nano-cone to grid of 30 m/s (30,000 mm per second), a droplet will travel 3 mm in 1/10,000 of a second, the time taken for the 5 kHz field to reverse polarity. So with these numbers, a 3 mm gap between the nano-cone surface and the metal mesh seems appropriate.
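The timing argument above can be checked with a few lines of Python, a minimal sketch of the arithmetic using the 30 m/s and 5 kHz figures from the text:

```python
# Droplet transit across the nano-cone-to-mesh gap.
v = 30.0    # average droplet velocity, m/s (from the text)
f = 5000.0  # alternating field frequency, Hz (5 kHz)

half_period = 1.0 / (2.0 * f)  # time for the polarity to reverse, s
gap = v * half_period          # distance travelled in that time, m

print(f"polarity reverses every {half_period * 1e6:.0f} microseconds")
print(f"droplet travels {gap * 1000:.0f} mm in that time")  # 3 mm
```

The polarity reverses each half-period of the 5 kHz field, which is where the 1/10,000 s figure comes from.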

Tuning will have to allow for evaporative losses from the droplets, however as all the droplets will have the same size and velocity, this should be an easy task.

It may not be necessary to use electrostatics at all. Larger, helically angled de Laval nozzles at the bottom of the rotor, creating a vortex separation system in which too-large droplets impact the inner surface of the rotor while small enough ones exit at the top, may work adequately. A tapered rotor, fatter at the bottom, would work well in this case, as it would help expel the waste flow from the too-large droplets centrifugally out of the bottom.

Star buttresses may not be necessary on the central spar, particularly with the non-electric version.

Filtering requirements are low: particles smaller than 0.1 mm should cause no problems for the electro version, and particles smaller than 1 mm pose no problems at all for the pure vortex model.

Direct air capture of carbon dioxide is a method that takes carbon dioxide out of ambient air, as opposed to carbon dioxide that is captured from the point of emissions, say, from the smokestack of a coal-fired power plant.

Lackner and his team are developing a device they call an air extractor, modeled after what is most abundant in nature: the leaf of a tree. There is about 0.5 liter of carbon dioxide in a cubic meter of atmosphere. When the extractor is dry, it loads itself with carbon dioxide from the air; when it's wet it releases carbon dioxide it has captured.

“We can do this at a cost of about $30 a ton of carbon dioxide,” says Lackner. “We have designed a box that can extract about a ton of carbon dioxide a day; it fits into a shipping container.” “If we had 100 million of them,” Lackner adds, “we could extract more carbon dioxide out of the air than is currently put in.”
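Lackner's figures can be put in perspective with a quick calculation. The ~36 Gt figure for annual global CO2 emissions is a ballpark assumption added here for comparison, not a number from the interview.

```python
# Scale of Lackner's proposal: 100 million extractors, 1 ton CO2/day each.
units = 100_000_000
tons_per_unit_per_day = 1.0

annual_capture_gt = units * tons_per_unit_per_day * 365 / 1e9  # Gt CO2/yr
annual_emissions_gt = 36.0  # assumed global fossil CO2 emissions, Gt CO2/yr

print(f"capture: {annual_capture_gt:.1f} Gt CO2/yr")
print(f"versus emissions of roughly {annual_emissions_gt:.0f} Gt CO2/yr")
```

At 36.5 Gt CO2 per year, such a fleet would indeed slightly exceed the assumed current emissions, consistent with Lackner's claim.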

The carbon can be stored in the form of mineral carbonate rock, or it can be injected deep into the ground. Alternatively, the carbon dioxide can be used, e.g. by turning it into a fuel. Airplanes will likely need to be powered by fuel for a long time, so captured carbon dioxide could be used to produce synthetic jet fuel more sustainably.

In his lab at Columbia's Engineering School, Lackner has built a small greenhouse, demonstrating that air extractors loaded with captured carbon dioxide can be placed inside a greenhouse, where the humid atmosphere will cause the carbon dioxide to be released. Adding carbon dioxide to the air inside greenhouses is beneficial for plant growth; the plants take the carbon dioxide out of the air and use it to grow.

Tuesday, March 5, 2013

These cannons are a complement to cloud brightening systems, for use in calm or windy blue-sky conditions over ocean, sea and glacial ice, and land permafrost.

They may also be very important this year, as high-tech cloud brightening/making gear does not look like it will be easy to get out in large unit numbers, while existing fire-pump systems are already available in the numbers we need now.

They are also essentially no different from the snow-making gear used on ski fields, except that for making snow a lower velocity is fine and no CCNs are required: just air below 0°C and freshwater.

- High pressure / high volume fire-fighting/water cannon pump gear can be used as is, or modified for higher pressure and kW capacities to increase output volumes at similar nozzle velocities.

An aerated system looks best at this point because:

By using de Laval nozzles (convergent-divergent, giving supersonic, tight-stream output), the aerated water can be accelerated by expansion to high velocity or supersonic speed as it leaves the divergent exit section of the nozzle.

Nozzle friction is reduced because air sticks to the surface and creates a gaseous boundary layer.

For aeration, copper or soft stainless tubes, CNC laser-perforated and swaged to flare to hexagonal ends, can be stacked into a honeycomb aeration section (just like a WW2 Spitfire radiator, except those had the water on the outside of the tubes and no holes). Fed with compressed air and placed in the water feed before the pumps, this section can entrain microbubbles in the water.

Alternatively, supersonic streams can be achieved with unaerated water and convergent nozzles, but more pressure is required.

The high kinetic energy of the water stream will cause excellent dispersion and evaporation via transonic shockwaves as the stream slows, shedding its outer layer as it goes. If the stream is below a critical diameter, it will eventually disintegrate completely below the altitude where enough kinetic energy has converted to gravitational potential energy for the stream to go transonic; if it is above that diameter, it will disintegrate not far above that altitude.

If it's a high-velocity subsonic jet, it will still shatter the droplets and evaporate many, if not all, of them through air turbulence and conduction/friction evaporation at high differential speed.

We need to look at freshwater versions as well. This is because saltwater rain will be fine over open oceans, but landing on ice and land permafrost it will make them melt faster, and saltwater rain is not at all good for land ecologies either. There is going to be a big use for these cannons in protecting the land permafrosts with cloud cover too. Freshwater versions will benefit from using water with diatoms growing in it, as these act as cloud droplet condensation nuclei, just like salt crystals.

Seeding tundra lakes with diatoms will also take up CO2 and oxygenate the water, enhancing aerobic digestion of dissolved methane and other organic carbon. Removing the diatoms with the water for the cloud cannons will also remove excess nutrients from the waters, provide aeration for skyborne digestion of dissolved organic carbon to CO2, and clean up the lakes to make them better water sources for winter snow-making.

We're going to need to strafe the sky with these things for best cloud-making effect, so we need to get ready to mount them on naval gun turrets with computer-controlled tracking systems, and to look into parking tanks and APCs with suitable turrets on container-ship decks.

Using these tanks and APCs, and maybe fixed installations when the wind is blowing, cloud cannons on the Arctic tundras can help protect the permafrosts.

Calculations and conclusions, for peer review:

These are based on a sonic speed case. Faster will give more range but less volume and slower more volume but less range, for a given pump system.

- Existing pump designs would need to be upgraded for higher power/pressure to produce this much cloud if supersonic velocities are required, but this is a very small engineering challenge. Ships of trawler size and up, and tanks, have more than enough kW for the job. Rapid, small-amplitude vertical oscillation of the jet release angle should lay down the 100 m average thickness of cloud bank aimed for.

- This looks good for mobile strafing with existing fire pumps, provided aerated water and de Laval nozzles are used to produce supersonic velocities. The range required for 4 sqkm per hour coverage at only 80 m is no problem for the small-volume, aerated supersonic water flows possible from existing fire pumps.

Latent heat of evaporation and Ek sonic considerations:

latent heat of evaporation of water = 2260 kJ per L

Ek of water at sonic speed (330 m/s) = 54.45 kJ per L
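These two figures can be reproduced and compared in a few lines of Python. The 330 m/s sonic speed is the value implied by the 54.45 kJ/L figure (1 litre of water being 1 kg):

```python
# Kinetic energy of 1 kg (1 litre) of water at the speed of sound in air,
# versus the latent heat needed to evaporate that litre.
v_sonic = 330.0  # m/s, approximate speed of sound in air

ek_sonic_kj = 0.5 * 1.0 * v_sonic**2 / 1000.0  # kJ per litre
latent_kj = 2260.0                             # kJ per litre

print(f"Ek at sonic speed: {ek_sonic_kj:.2f} kJ/L")       # 54.45 kJ/L
print(f"latent heat is {latent_kj / ek_sonic_kj:.1f}x larger")
```

In other words, the jet's own kinetic energy can evaporate only a few percent of the water; the bulk of the latent heat has to come from the surrounding air, which is exactly the evaporative cooling effect described next.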

If the very small water droplets produced by transonic shockwaves shattering the water breaking from the decaying jet partially or fully evaporate (this will depend on stream velocity), they will do so by absorbing a lot of heat from the air they land in. This will cool and supersaturate the air with water vapour, resulting in rapid droplet condensation in both the saltwater and freshwater versions proposed.

I am advised that we can expect around 60% humidity levels in arctic conditions. As the evaporative cooling effect will cool the air that the stream droplets land in, and vast quantities of very small cloud nucleation salt crystals will be formed, we can expect a lot more cloud to be formed than the above examples suggest.

Aeration should result in more, and smaller, salt crystals and droplets: partly because microbubbles enhance droplet fragmentation, and partly because supersaturation of the water with air, enhanced by evaporation, causes many disturbances per drop as new bubbles precipitate, initiating the precipitation of many salt crystals per droplet. Turbulence will also initiate precipitation of air and salt crystals in a supersaturated droplet.

How much extra cloud will depend on how much atmospheric turbulence and mixing is generated by the strafing pattern, and on local temperature and humidity conditions.

Less mixing will also result in larger cloud droplets.

Too much mixing will risk forming little cloud at all, as the humidity levels may be too low for droplets to form around the salt crystals.

It's quite likely that 500-600 km/h will be sufficient velocity. This would produce about 15 litres per second from standard fire-pump gear. A good estimate seems to be that this would initially produce around 100 sqkm of 100 m thick cloud bank per day. However, from what I am hearing, there is likely to be a repeating cycle of droplet evaporation, re-nucleation of new droplets, and back to droplet evaporation, due to the added water vapour and downwind cooling effects. So the total cloud produced may be more than this.
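A quick water-budget check suggests the 100 sqkm per day estimate is at least self-consistent. The liquid water content (LWC) used for comparison, around 0.1-0.3 g/m³ for thin stratus, is an assumption added here, not a figure from the text.

```python
# Does 15 L/s of water supply a 100 km2 x 100 m thick cloud bank per day?
flow_l_per_s = 15.0
water_kg_per_day = flow_l_per_s * 86400  # 1 L of water = 1 kg

cloud_volume_m3 = 100e6 * 100            # 100 km2 area, 100 m thick
implied_lwc = water_kg_per_day * 1000 / cloud_volume_m3  # grams per m3

print(f"implied liquid water content: {implied_lwc:.2f} g/m3")
```

The implied value of about 0.13 g/m³ sits within the typical stratus range, so the numbers hang together, even before allowing for the evaporation and re-nucleation cycling mentioned above.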

We should start testing on these ASAP. Others doing testing too, would be a good thing.

There's little point getting too distracted with talk on how to reduce human CO2 emissions until we have succeeded in reversing the Arctic sea-ice crash.

However, as geoengineering for this will be an ongoing annual commitment until CO2 is back in the region of 280ppm, we do need a plan to pump carbon out of the atmosphere and the sea (where 60% of the 500 Gton total human contribution is residing.)

It's been learned that primary ocean production has fallen by nearly half in the last 100 years. The most likely biggest cause is the reduction in windblown dust, due to irrigation and cultivation of arid areas and to CO2 increases prolonging the growing season of grasses in arid areas. As a result, the amount of natural wind-borne, iron-carrying dust has fallen dramatically: 30% over the past 30 years alone.

Tropical rainforests globally cover 8 million square km with a biomass productivity of 2,000 g of carbon per square metre, for a total of 16 Gtons of carbon per year. Doubling this area would only approach an extra 16 Gtons of annual carbon pulldown after 1 to 2 decades, and with studies showing drought stress already turning Amazon and southeast Asian rainforests into net CO2 producers rather than removers, no gains might occur at all.

Temperate forests globally cover 19 million square km with a biomass productivity of 1,250 g of carbon per square metre, for a total of 24 Gtons of carbon per year. Doubling this area would only approach an extra 24 Gtons of annual carbon pulldown after 1 to 2 decades, and would then need a further 20 years to remove the 500 Gton existing carbon debt, and that's assuming that 100 percent of the carbon taken in by these trees can be kept away from consumers and decomposers.

The oceans globally cover 350 million square km with an average biomass productivity of 140 gC/m²/yr, for a total of 48.5 Gtons of carbon per year, currently weighted heavily towards coastal areas. The open oceans are 311 million sqkm with an average biomass productivity of 125 gC/m²/yr, for a total of 39 Gtons of carbon per year. However, as can be clearly seen on the map below, some 80% of the ocean is so isolated from land-sourced nutrient inputs that its productivity is about 1/100 of that of the most productive oceanic zones.

The oceanic desolate zone, at 80% of 311 million sqkm, is 249 million sqkm. To produce 50 Gtons of carbon per year there would require 50 GtonsC / 249 million sqkm = 201 tonsC per sqkm per year, i.e. around 200 g of carbon per square metre per year on average. With prime coastal aquatic environments like estuaries and coral reefs producing 10x that, at 2000+ gC/m², it would seem very achievable to increase deep-ocean productivity this much.
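The biome figures quoted in the last few paragraphs can be cross-checked with a short script (areas in million km², productivity in gC/m²/yr):

```python
# Cross-check of the quoted productivity totals; 1 Gt = 1e15 g.
biomes = {
    "tropical rainforest": (8.0, 2000.0),
    "temperate forest": (19.0, 1250.0),
    "all oceans": (350.0, 140.0),
    "open oceans": (311.0, 125.0),
}
for name, (area_million_km2, gc_per_m2) in biomes.items():
    area_m2 = area_million_km2 * 1e12
    gtc_per_year = area_m2 * gc_per_m2 / 1e15
    print(f"{name}: {gtc_per_year:.1f} GtC/yr")

# The desolate zone: 80% of the open ocean, raised to ~200 gC/m2/yr.
desolate_m2 = 0.8 * 311.0 * 1e12
print(f"desolate zone at 200 gC/m2/yr: {desolate_m2 * 200 / 1e15:.1f} GtC/yr")
```

The totals land close to the figures in the text (the quoted 48.5 GtC comes out at 49.0 here, and the desolate zone at about 50 GtC/yr, all within rounding of the quoted productivities).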

Doubling the productivity of the oceans could pump down the global 500 Gton carbon burden in as little as 10 years, and it is possible, affordable and already very well studied.

In the currently near-sterile central oceans, the absence of an existing food chain would ensure that most of this phytoplankton carbon dies and sinks a couple of hundred metres into the tidal mixed layer.

This can be a problem....

The amount of organic carbon needed to completely remove all oxygen from the WHOLE ocean as it is decomposed by bacteria is thought to be 1000 Gton C. Just letting the phytoplankton sink into the tidal mixed zone, which is low in oxygen already, would be a very bad idea. Back to this later.

As can be seen on the front page graphic of: http://www.aslo.org/meetings/Phytoplankton_Production_Symposium_Report.pdf
The benefits of iron fertilization alone are only achievable in the nutrient-rich, iron-depleted zones: the Southern Ocean down to 35° south, the equatorial oceans between 20° south and 10° north, and the North Pacific from 40° north. These areas can easily be stimulated urgently.

At the low figure of 1 million tons of carbon per ton of Fe, we would annually need 50 GtC / 1 MtC = 50,000 tons of iron dust: bugger all.
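The iron arithmetic is simple enough to spell out, a sketch of the division in the text using its low-end ratio of 1 million tons of carbon fixed per ton of iron:

```python
# Iron dust needed to fix 50 GtC/yr at the low-end fertilization ratio.
target_carbon_tons = 50e9  # 50 GtC per year
tons_c_per_ton_fe = 1e6    # 1 million tons of C per ton of Fe (low figure)

iron_tons = target_carbon_tons / tons_c_per_ton_fe
print(f"iron dust needed: {iron_tons:,.0f} tons per year")  # 50,000 tons
```

For scale, 50,000 tons per year is roughly one modest bulk-carrier load, which is why the amount is described as negligible.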

Antarctic krill have a total fresh biomass of up to 500 million tons. This will increase several times over when we iron-fertilize the Southern Ocean.

The rest of the desolate zones need nitrogen and phosphorus. Rather than using mined phosphates and CO2 producing urea for nitrogen there are these alternatives:

Natural volcanic ash. There are concerns about heavy metal contamination from this but as long as we stick to siliceous ash from recycled seafloor volcanism we should be pretty OK.

Wave pumped chimneys. Tested already, these pump nutrient rich deep benthic water via wave power. We would however need millions of these due to scale limitations imposed by ocean wavelengths.

Chimneys driven by submarine volcanism. An idea I was looking at 10 years ago (dibs on the carbon credits, giggles, could make me a trillionaire), this could quickly fill the oceanic gyres of the desolate zones with all the deep benthic and volcano-enriched nutrients needed.

Good old-fashioned blood and bone. Puree krill from the Southern Ocean and fertilize the low-nutrient desolate zones.

Simultaneously with fertilising the desolate zones, we'll need to seed them with the best diatoms and suitable higher-temperature krill species, such as the North Pacific krill common in the Sea of Japan. It would be possible to multiply the world krill population 100x, to the region of 50 Gtons, making krill the biggest living carbon store on the planet.

Krill are looking very good for Ocean Fertilization for a number of reasons:

a) Getting phytoplankton produced carbon to seafloor or depth.

- Approximately every 13 to 20 days, krill shed their chitinous exoskeletons, which are rich in stable CaCO3.
- Krill are very untidy feeders, and often spit out aggregates of phytoplankton (spit balls) containing thousands of cells sticking together.
- They produce fecal strings that still contain significant amounts of carbon and the carbonate/silica glass shells of the diatoms.

These are all heavy and sink very fast into the deep benthic zone and onto the ocean floor. Oxygen levels are higher down there, and the deep benthic zone is much larger in volume than the rest of the world's oceans. Besides which, unlike phytoplankton alone, the spitballs and fecal strings stand a much better chance of not being decomposed and using up oxygen. The exoskeletons won't be decomposed at all.

To quote Wikipedia: "If the phytoplankton is consumed by other [than krill] components of the pelagic ecosystem, most of the carbon remains in the upper strata. There is speculation that this process is one of the largest biofeedback mechanisms of the planet, maybe the most sizable of all, driven by a gigantic biomass"

b) They can be dried and pressed for krill oil, which can be used directly for biodiesel or as a food supplement.

c) Dried krill (pressed or not), and any other biomass, can be pyrolysed for gas and pyrolysis oil for existing power plants, which can have their flue CO2 fed into algae ponds for carbon-negative energy.

- The pyrolysis oil can be used directly for large diesels like ships and heavy machinery.

- The water soluble portion of pyrolysis oil can be used for timber construction adhesive for plywoods, chipboards, and laminated beams etc

- Pyrolysis also produces biochar, which is a terrific fertiliser, producing soils called terra preta that sequester fresh carbon from humus, water and nutrients better than any other soils on the planet and hold their fertility for thousands of years. This is the best and safest way to bury carbon.

d) Krill are delicious, nutritious food for humans, able to replace massively methane-emitting beef, sheep and goats, so that this pastoral land can be reforested with food forests and indigenous ecologies.

e) Krill are the best food for a large number of fish and whale species. Putting carbon into living marine biomass is a safe store, and it replaces the carbon we have lost by depleting those stocks.

f) Krill are very efficient phytoplankton harvesters, sometimes reaching densities of 10,000–30,000 individual animals per cubic metre. They quickly swarm to any plankton bloom in the area.

Using krill to harvest the phytoplankton, and then using simple krill nets on the world's fishing fleets, is much easier than getting phytoplankton out of the ocean ourselves, as that requires energy-intensive centrifuge separation of large quantities of water.

g) Female krill lay 6,000–10,000 eggs at one time, and krill reach maturity after 2-3 years.

- Obviously they can quickly build biomass to any level we can provide food for. Particularly if we are putting them in fresh habitat where small fish that normally consume lots of tiny immature krill are absent.

If we increased the total biomass of krill to 50 Gtons fresh biomass as suggested above, which would be about 10 Gtons C, then we could remove this amount of carbon from the ocean every 2 years. This alone has the potential to remove 100 Gtons C from the ocean/atmosphere in twenty years.
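The krill arithmetic above can be made explicit. The ~20% carbon content of fresh biomass is the ratio implied by the post's 50 Gt fresh ≈ 10 GtC figures, not an independently sourced value.

```python
# Carbon store and removal rate at the target krill biomass.
fresh_biomass_gt = 50.0  # target fresh krill biomass, Gt
carbon_fraction = 0.2    # GtC per Gt fresh biomass (implied by the text)

standing_stock_gtc = fresh_biomass_gt * carbon_fraction
harvest_interval_years = 2.0  # remove the standing stock every 2 years
removed_in_20_years = standing_stock_gtc * 20.0 / harvest_interval_years

print(f"standing stock: {standing_stock_gtc:.0f} GtC")
print(f"removed over 20 years: {removed_in_20_years:.0f} GtC")  # 100 GtC
```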

As krill are such messy feeders and inefficient digesters, and shed carbonate-rich exoskeletons every 2-3 weeks, around 100 times as much carbon would probably sink to the ocean floor to relatively safely aggregate into sediments, stable carbonate and undecomposed organic carbon. So burying 500 Gtons of carbon in one year would be possible.

Obviously we only need to increase Krill populations by 10x to get the result we need in about 10 years total including the breed up time.

We'd be best to harvest as much as possible to refertilise and replace the carbon in our soils. Remember that about 600 Gton C of carbon from our soils has gone into the oceans already in the last 2000 years.

Suppose we had to quickly put the CO2 genie back in the bottle. After a half-century of “thinking small” about climate action, we would be forced to think big—big enough to quickly pull back from the danger zone for tipping points and other abrupt climate shifts.

By addressing the prospects for an emergency drawdown of excess CO2 now, we can also judge how close we have already come to painting ourselves into a corner where all escape routes are closed off [7].

Getting serious about emissions reduction will be the first course of action to come to mind in a climate crisis, as little else has been discussed. But it has become a largely ineffective course of action [11] with poor prospects, as the following argument shows.

In half of the climate models [14], global average overheating is more than 2°C by 2048. But in the US, we get there by 2028. It is a similar story for other large countries.

Because most of the growth in emissions now comes from the developing countries burning their own fossil fuels to modernize with electricity and personal vehicles, emissions growth is likely out of control, though capable of being countered by removals elsewhere.

But suppose the world somehow succeeds. In the slow-growth IPCC scenario, similar to what global emissions reduction might buy us, 2°C arrives by 2079 globally, but in the US it arrives by 2037.

So drastic emissions reduction worldwide would only buy the US nine extra years.

However useful it would have been in the 20th century, emissions reduction has now become a failed strategy, though still useful as a booster for a more effective intervention.

We must now resort to a form of geoengineering that will not cause more trouble than it cures, one that addresses ocean acidification as well as overheating and its knock-on effects.

Putting current and past CO2 emissions back into secure storage5 would reduce the global overheating, relieve deluge and drought, reverse ocean acidification, reverse the thermal expansion portion of sea level rise, and reduce the chance of more4 abrupt climate shifts.

Existing ideas for removing the excess CO2 from the air appear inadequate: too little, too late. They do not meet the test of being sufficiently big, quick, and secure. There is, however, an idealized approach to ocean fertilization5 that appears to pass this triple test.

It mimics natural up- and down-welling processes using push-pull ocean pumps powered by the wind. One pump pulls sunken nutrients back up to fertilize the ocean surface—but then another pump immediately pushes the new plankton production down to the slow-moving depths before it can revert to CO2.

How Big? How Fast?

The atmospheric CO2 is currently above 390 parts per million and the excess CO2 growth has been exponential. Excess CO2 is that above 280 ppm in the air, the pre-industrial (1750) value and also the old maximum concentration for the last several million years of ice age fluctuations between 200 and 280 ppm.
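The excess concentration above can be converted to a carbon mass. A short sketch, using the standard conversion factor of roughly 2.13 GtC of airborne carbon per ppm of CO2 (an assumption not stated in the text):

```python
# Back-of-envelope: convert the excess CO2 concentration to a carbon mass.
PPM_CURRENT = 390.0        # atmospheric CO2, ppm (as stated in the text)
PPM_PREINDUSTRIAL = 280.0  # 1750 baseline, ppm
GTC_PER_PPM = 2.13         # gigatonnes of airborne carbon per ppm of CO2 (assumed factor)

excess_ppm = PPM_CURRENT - PPM_PREINDUSTRIAL
airborne_excess_gtc = excess_ppm * GTC_PER_PPM

print(f"excess: {excess_ppm:.0f} ppm, about {airborne_excess_gtc:.0f} GtC still airborne")
```

The ~234 GtC still airborne is smaller than the ~350 GtC of cumulative emissions discussed later, because the ocean and land sinks have absorbed the remainder.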

Is a 350 ppm reduction target12, allowing a 70 ppm anthropogenic excess, low enough? We hit 350 ppm in 1988, well after the sudden circulation shift18 in 1976, the decade-long failure of Greenland Sea flushing24 that began in 1978, and the sustained doubling (compared to the 1950-1981 average) of world drought acreage6 that suddenly began in 1982.

Clearly, 350 ppm is not low enough to avoid sudden climate jumps4, so for simplicity I have used 280 ppm as my target: essentially, cleaning up all excess CO2.

But how quickly must we do it? That depends not on 2°C overheating estimates but on an evaluation of the danger zone2 we are already in.

The Danger Zone

Global average temperature has not been observed to suddenly jump, even in the European heat waves of 2003 and 2010. However, other global aspects of climate have shifted suddenly and maintained the change for many years.

The traditional concern, failure of the northern-most loop of the Atlantic meridional overturning circulation (AMOC), has been sidelined by model results20-22 that show no sudden shutdowns (though they do show a 30% weakening by 2100).

While the standard cautions about negative results apply, there is a more important reason to discount this negative result: there have already been decade-long partial shutdowns not seen in the models.

Not only did the largest sinking site shut down in 1978 for a decade24, but so did the second-largest site23,28 in 1997. Were both the Greenland Sea and the Labrador Sea flushing to fail together2, we could be in for a major rearrangement of winds and moisture delivery as the surface of the Atlantic Ocean cooled above 55°N. From these sudden failures and the aforementioned leaps in drought, one must conclude that big trouble could arrive in the course of only 1-2 years, with no warning.

So the climate is already unstable. (“Stabilizing” emissions4 is not to be confused with climate stability; it still leaves us overheated and in the danger zone for climate jumps. Nor does “stabilized” imply safe.)

While quicker would be better, I will take twenty years as the target for completing the excess CO2 cleanup in order to estimate the drawdown rate needed.

The Size of the Cleanup

It is not enough to target the excess CO2 currently in the air, even though that is indeed the cause of ocean acidification, overheating, and knock-on effects. We must also deal with the CO2 that will be released from the ocean surface as air concentration falls and the bicarbonate buffers reverse, slowing the drawdown.

Thus, I take as the goal to counter the anthropogenic emissions4,5 since 1750, currently totaling 350 gigatonnes of carbon (GtC = 10¹⁵ g of carbon = PgC).

During a twenty-year project period, another 250 GtC are likely to be emitted, judging from the 3% annual growth in the use of fossil fuels5 despite some efforts at emissions reduction. Thus we need to take back 600 GtC within 20 yr, at an average rate of 30 GtC/yr, in order to clean up (for the lesser goal of countering continuing emissions, it would take 10 to 15 GtC/yr).
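The cleanup arithmetic above can be sketched in a few lines:

```python
# The cleanup target and drawdown rate from the paragraph above.
EMITTED_SINCE_1750 = 350.0      # GtC, cumulative anthropogenic emissions
EMITTED_DURING_PROJECT = 250.0  # GtC, expected over the project, given ~3%/yr growth
PROJECT_YEARS = 20

total_gtc = EMITTED_SINCE_1750 + EMITTED_DURING_PROJECT  # 600 GtC to take back
rate_gtc_per_yr = total_gtc / PROJECT_YEARS              # 30 GtC/yr average

print(f"cleanup: {total_gtc:.0f} GtC at {rate_gtc_per_yr:.0f} GtC/yr")
```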

Chemically scrubbing the CO2 from the air is expensive and requires new electrical power from clean sources, not likely to arrive quickly enough. On this time scale, we cannot merely scale up what suffices on submarines.

Thus we must find ways of capturing 30 GtC/yr with traditional carbon-cycle8 biology, where CO2 is captured by photosynthesis and the carbon incorporated into an organic carbon molecule such as sugar. Then, to take this captured carbon out of circulation, it must be buried to keep decomposition methane and CO2 from reaching the atmosphere.

Sequestering CO2

One proposal26 is to bundle up crop residue (half of the annual harvest is inedible leaves, skins, cornstalks, etc.) and sink the weighted bales to the ocean floor. They will decompose there but it will take a thousand years before this CO2 can be carried back up to the ocean surface and vent into the air.

Such a project, even when done on a global scale, will yield only a few percent of 30 GtC/yr. Burying raw sewage3 is no better.

If crop residue represents half of the yearly agricultural biomass, this also tells you that additional land-based photosynthesis, competing for space and water with human uses, cannot do the job in time.5 It would need to be far more efficient than traditional plant growth. At best, augmented crops on land would be an order of magnitude short of what we need for either countering or cleanup.

Big, Quick, and Secure

Because of the threat from abrupt climate leaps, the cleanup must be big, quick, and secure.

Doubling all forests might satisfy the first two requirements but it would be quite insecure—currently even rain forests4 are burning and rotting, releasing additional CO2.

Strike One. We are already past the point where enhanced land-based photosynthesis can implement an emergency drawdown. It cannot even counter current emissions.

Basically, we must look to the oceans for the new photosynthesis and for the long-term storage of the CO2 thus captured.

Fertilization per se

Algal blooms are increases in biological productivity when the ocean surface is provided with fertilizer containing missing nutrients15 such as nitrogen, iron, and phosphorus.

A sustained bloom of algae can be fertilized by pumping up seawater5,16,19 from the depths, a more continuous version of what winter winds9 bring up.

Currently about 11 GtC/yr settles out of the wind-mixed surface layer into the slowly-moving depths13 as plankton die. To settle out another 30 GtC/yr, we would need about four times the current ocean primary productivity. Clearly, boosting ocean productivity worldwide is not, by itself, the quick way to put the CO2 genie back in the bottle.
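The productivity multiple implied above follows directly from the export numbers:

```python
# How much ocean primary productivity would have to grow to export another 30 GtC/yr.
CURRENT_EXPORT = 11.0  # GtC/yr settling out of the wind-mixed surface layer today
EXTRA_NEEDED = 30.0    # GtC/yr of additional export for the cleanup

multiple = (CURRENT_EXPORT + EXTRA_NEEDED) / CURRENT_EXPORT
print(f"required export is {multiple:.1f}x current")  # roughly four times
```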

Strike Two. Our 41% CO2 excess is already too large to draw down in 20 yr via primary productivity increases in the ocean per se.

However, our escape route is not yet closed off. There is at least one plausible prospect for an emergency drawdown of 600 GtC in 20 yr. It seeks to mimic the natural ocean processes of upwelling and downwelling.

2. Push-pull ocean pipes

Upwelling and Downwelling

Upwelling from the depths is typically caused by winds which push aside surface waters, especially those strong westerly winds in the high southern latitudes that continuously circle Antarctica without bumping into land.

In addition to the heavier biomass (the larger fecal pellets and shells) that can settle into the depths before becoming CO2, there is downwelling, an express route to the depths using bulk flow. Surface waters are flushed via whirlpools into the depths of the Greenland Sea and the Labrador Sea23. This downwelling carries along the surface’s living biomass (from bacteria to fish) as well as the dissolved organic carbon (from feces and smaller cell debris).

Note that, in the surface ocean, there is a hundred times more dissolved organic carbon (DOC) than the organic carbon inside living organisms1. Bacterial respiration turns this DOC into CO2 that reaches the air within 40 days.

To augment normal downwelling, one could pump surface DOC and plankton into the ocean depths before they become CO2. Half of the decomposition CO2 produced in the depths rejoins the atmosphere when the deep water is first upwelled a millennium later. Thanks to ocean mixing in the depths and multiple upwelling sites at different path lengths, it will come back up spread out in time after that initial delay.

There is an even larger spread because the other half (called refractory DOC17) is somehow protected from becoming CO2 for a while, even when cycled through the surface layers multiple times.17 Average radiocarbon dates for DOC in the depths are about 4,000 years, not 40 days.

Thus, if we somehow sink 600 GtC into the ocean depths over 20 years, the return of 600 GtC of decomposition CO2 to the air is spread out over, say, 6,000 years. That is an average of 0.1 GtC each year, about 1% of current emissions. Such a slow return of excess CO2 can be countered by slow reforestation or similar measures.
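The slow-return estimate above is simple division; a minimal sketch, taking current fossil emissions as roughly 10 GtC/yr (an assumed round figure, consistent with the "about 1%" in the text):

```python
# How slowly the sunk carbon would leak back to the atmosphere.
SUNK_GTC = 600.0               # carbon sunk into the depths over 20 years
RETURN_SPREAD_YEARS = 6000.0   # spread of the return, given ~4,000-yr DOC ages and mixing
CURRENT_EMISSIONS = 10.0       # GtC/yr, rough current emissions (assumption)

return_rate = SUNK_GTC / RETURN_SPREAD_YEARS   # 0.1 GtC/yr
fraction = return_rate / CURRENT_EMISSIONS     # about 1% of current emissions

print(f"return: {return_rate:.1f} GtC/yr, about {fraction:.0%} of current emissions")
```

A return this slow is what makes the storage "secure" in the big-quick-secure sense: it can be countered by ordinary reforestation.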

From this analysis, we still have a plausible way out of the climate crisis, even on an emergency basis.

What follows is an idealized example of how we might implement it, using less than one percent of the ocean surface for the next twenty years to do the equivalent of plowing under a cover crop.5

Fig. 1. A plankton plantation design using windmill pumps (ref 5), including a fishing lane free of anchor cables. Shading shows the plume of nutrients from a single pump and the plume of organic matter dispersed in the depths. One advantage of windmills is that compressed air can be generated to be pumped into the depths, addressing anoxia problems. Spacing of windmills, however, is subject to the usual limitations of vortices downwind.

Plowing Under a Cover Crop

In addition to the up-pump of the fertilization-only example, add another wind-driven pump nearby that flushes the surface water back down to greater depths before its new biomass becomes CO2 again.

If we fertilize via pumping up and sink nearby via bulk flow (a push-pull pump), we are essentially burying a carbon-fixing crop, much as farmers plow under a nitrogen-fixing cover crop of legumes to fertilize the soil.

Algaculture yields25 allow a preliminary estimate to be made of the size of our undertaking. Suppose that a midrange 50 g (as dry weight) of algae can be grown each day under a square meter of sunlit surface, and that half is carbon. Thus it takes about 1 × 10⁻⁴ m² to grow 1 gC each year. To produce our 30 × 10¹⁵ gC/yr drawdown rate would require 30 × 10¹¹ m² (0.8% of the ocean surface, about the size of the Caribbean).

But because we pump the surface waters down, not dried algae, we would also be sinking the entire organic carbon soup of the wind-mixed surface layer: the carbon in living cells plus the hundred-fold larger amounts in the surface DOC. Thus the plankton plantations might require only 30 × 10⁹ m² (closer to the size of Lake Michigan).
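The two area estimates can be reproduced from the yield figures. A sketch, assuming a total ocean surface of about 3.6 × 10¹⁴ m² (a standard value not given in the text; with it, the share comes out near 0.9%, close to the 0.8% quoted):

```python
# Plantation-area estimate from the algaculture yield figures above.
YIELD_DRY_G_PER_M2_DAY = 50.0  # midrange dry-weight algae yield per m2 of sunlit surface
CARBON_FRACTION = 0.5          # half of dry weight is carbon
TARGET_GC_PER_YR = 30e15       # 30 GtC/yr drawdown, expressed in grams of carbon
OCEAN_SURFACE_M2 = 3.6e14      # total ocean surface area (assumption)
DOC_BOOST = 100.0              # surface DOC is ~100x the carbon inside living cells

gc_per_m2_yr = YIELD_DRY_G_PER_M2_DAY * CARBON_FRACTION * 365  # ~9,125 gC/m2/yr
area_m2 = TARGET_GC_PER_YR / gc_per_m2_yr                      # growing algae alone

print(f"algae-only area: {area_m2:.1e} m2 "
      f"({area_m2 / OCEAN_SURFACE_M2:.1%} of the ocean surface)")
print(f"carbon-soup area: {area_m2 / DOC_BOOST:.1e} m2")  # pumping the whole surface layer
```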

Apropos location5, pumping down to 150 m near the edge of the continental shelf would deposit the organic carbon where it could be carried over the cliff and into the slower-moving deep ocean.

The ocean pipe spacing, and the volume pumped down, will depend on the outflow needed to optimize the organic carbon production (the chemostat calculation). Only field trials are likely to provide a better estimate for the needed size of sink-on-the-spot plankton plantations, pump numbers, and project costs. The obvious test beds are the North Sea and Gulf of Mexico where thousands of existing drilling platforms could be used to support appended pipes and pumps for field trials5. Without waiting for floating pumps, we could quickly test for impacts as well as efficient plantation layouts.

I have used windmills here for several reasons: they are familiar mechanisms and they enable a push-pull plantation layout to be readily illustrated. But there are a number of ways to achieve wind-wave-powered pumps, both up and down, such as atmocean.com’s buoyed pipes and Salter’s elevated ring23a to capture wave tops and create a hydrostatic pressure head for sinking less dense warm water into the more dense cool waters of the depths. Each implementation will have considerations peculiar to it; what follows are some of the more general advantages and disadvantages in this context.

Fig 2 A,B: A less expensive pump can be constructed that uses wave power and allows closer packing (ref 3). They would be more effective in the Antarctic Circumpolar Current because of the wave heights. Calvin (2012b), after P. Kithil’s design (atmocean.com).

Fig 3. Salter Sink23a uses a meter-high lip on a large floating ring to capture wavetops. This builds up enough hydrostatic pressure to push down warm surface water, kept enclosed by a skirt. It can also (not shown) achieve some upwelling. Warm water exiting in the depths will rise outside the tube, entraining higher-density nutrient-rich cold water. The mix can rise above the thermocline into the surface layer, fertilizing plankton growth. Detail from figure in Intellectual Ventures white paper13a.

Pro and Con

Here we have an idealized candidate for removing 600 Gt of excess carbon from the air: the sink-on-the-spot plankton plantation that moves decomposition into the thousand-year depths. Push-pull pumping for fertilization and sequestration is relatively low-tech and merely augments natural up- and downwelling processes.

This idealized candidate has some unique advantages compared to current climate strategies: It is big, quick, and secure. It is impervious to drought and holdout governments. It does not compete for land, fresh water, fuel, or electricity. By bringing up cold water from the depths and sinking warm surface water into the thousand-year depths, it cools the ocean surface regionally. And there is a “cognitive carrot,” an immediate payoff every year (fish catch5, cooling hurricane paths9a) while growing the climate fix (the 600 GtC emergency draw down).

The idealized example intentionally uses technologies that are too old or simple to be patentable. The industries most likely to benefit would be fishing and the offshore services presently associated with oil and gas platforms.

It is against such advantages that we must judge the potential downsides5. Concerns voiced thus far include:

Could we get international agreement fast enough? Continental shelves in the most productive latitudes belong to relatively wealthy countries. Their independent initiatives could quickly establish many plankton plantations just inside the shelf without new treaties.

Won’t it pollute? Perhaps not as proposed here, using local algae and nutrients in a vertical loop; the usual considerations would apply, however, should we want to introduce exotic or modified algal species to sink potential CO2 at even higher rates. Toxic blooms are possible during productivity transitions. With floating enclosures rather than open plumes, these considerations would change.

Won’t anoxic “dead zones” form? Shallow continental shelf sites should be avoided because hypoxia will occur from the decomposition of the downwelled carbon soup in a restricted volume. Fish kills occur when anoxia develops more quickly than fish can find their way out of the increasingly hypoxic zone. However, a maintained hypoxic zone will mostly repel fish from entering.

We don’t know what will happen. The novelty here is minimal, even less than for iron fertilization. Fertilizing and sinking surface waters merely mimics, albeit in new locations or new seasons, those frequently studied natural processes seen on a large scale in winter mixing and in ocean up- and downwelling. There is also prehistorical precedent. The 80 ppm drawdown of atmospheric CO2 in the last four ice ages is thought to have occurred via enhanced surface productivity, triggered by a major reduction in the Antarctic offshore downwelling27 that re-sinks nutrient-rich waters brought to the surface in high latitudes by the circumpolar winds.

Won’t this just move the ocean acidification problem into the depths? Since the depths are 98% of ocean volume, there is a fifty-fold dilution of the acidity. Were countering out-of-control emissions to continue for a century, depth acidification might be more of a problem.

Pumping up will just bring up water with higher CO2 than in the surface waters. A depth difference10 of 40 μmol/kg means that upwelling a cubic meter of seawater brings up an unwanted 0.48 g of inorganic carbon. The resulting fertilization will take that CO2 (and more) out of the surface ocean. Also, pumping down the same volume sinks 1 g of potential CO2 as DOC, even without fertilization.
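The 0.48 g figure above follows from the stated concentration difference; a sketch, treating a cubic meter of seawater as roughly 1,000 kg (an approximation that reproduces the figure in the text):

```python
# Inorganic carbon brought up per cubic meter of upwelled seawater.
DIC_EXCESS_UMOL_PER_KG = 40.0  # depth-vs-surface difference in dissolved inorganic carbon
SEAWATER_KG_PER_M3 = 1000.0    # seawater mass per m3 (approximation)
CARBON_G_PER_MOL = 12.0        # molar mass of carbon

grams_per_m3 = (DIC_EXCESS_UMOL_PER_KG * 1e-6) * SEAWATER_KG_PER_M3 * CARBON_G_PER_MOL
print(f"upwelled inorganic carbon: {grams_per_m3:.2f} g per m3")
```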

Aren't you going to run out of phosphate, which currently limits global ocean productivity to a fraction of its capacity? Up-pump pipes could be sited to bring up bottom waters from the southern oceans that are currently rich in phosphate.

A Second Manhattan Project

Though these objections do not seem insurmountable, good reasons usually arise for not implementing most such projects as initially proposed.

This idealized push-pull ocean pumps proposal is meant to give a concrete example, easy to remember, that defines the response ballpark by being big, quick, secure, powered by clean sources, and inexpensive enough so that a country can implement it on its own continental shelf without endless international conferences. Other drawdown schemes—say, floating enclosures or wave-driven circulating cells — need to pass those same tests.

To do the planning job right is going to take a Second Manhattan Project of various experts to design cleanup candidates and evaluate their side effects. Lend them Los Alamos and let the Pentagon buy them what they need with wartime priorities. To field test their plantation designs, let them instrument the many abandoned oil platforms in the North Sea and the Gulf of Mexico. Then quickly deploy the best designs, using the abilities of the offshore services industry.

Aim to accomplish all this in the four year time frame of the original Manhattan Project. Ten years after that, the cleanup job should be half done, and without all of the economic pain of a quick (and ineffective) shutdown of fossil fuel use. At the beginning of World War II, Franklin D. Roosevelt used the metaphor of a “four alarm fire up the street” that had to be extinguished immediately, whatever the cost. Our need for fast action on climate deterioration requires devoting the resources necessary to radically shorten the developmental cycle for all carbon burial projects. We dare not wait until we are weakened before undertaking emergency climate repairs. Our ability to avoid a human population crash will be compromised if economies become fragile or if international cooperation is lost via conflicts. A serious jolt—say, a major rearrangement of the winds—could cause catastrophic crop failures and food riots within several years, creating global waves of climate refugees with the attendant famine, pestilence, war, and genocide.

Acquiescing in a slower approach to climate is, in effect, playing Russian roulette with the climate gun. The climate crisis needs wartime priorities now.