Azimuth News (Part 1)

Over the next 100 years, many scientists predict, 20 percent to 30 percent of species could be lost if the temperature rises 3.6 degrees to 5.4 degrees Fahrenheit. If the most extreme warming predictions are realized, the loss could be over 50 percent, according to the United Nations climate change panel.

But when the going gets tough, the tough get going! The idea of the Azimuth Project is to create a place where scientists and engineers can meet and work together to help save the planet from global warming and other environmental threats. The first step was to develop a procedure for collecting reliable information and explaining it clearly. That means: not just a wiki, but a wiki with good procedures and a discussion forum to help us criticize and correct the articles.

Thanks to the technical wizardry of Andrew Stacey, and a lot of sage advice and help from Eric Forgy, the wiki and forum officially opened their doors about four months ago.

That seems like ages ago. For months a small band of us worked hard to get things started. With the beginning of the new year, we seem to be entering a phase transition: we’re getting a lot of new members. So, it’s time to give you an update!

Azimuth Project Pages

By far the easiest thing is to go to any Azimuth Project page, think of some information or reference that it’s missing, and add it! Go to the home page, click on a category, find an interesting article in that category and give it a try. Or, if you want to start a new page, do that. We desperately need more help from people in the life sciences, to build up our collection of pages on biodiversity.

Plans of Action

So far it goes like this. First we write summaries of these plans. Then I blog about them. Then Frederik De Roo distills your criticisms and comments and adds them to the Azimuth Project. The idea is to build up a thorough comparison of many different plans.

You don’t need to be an expert on any particular discipline to help here! You just need to be able to read plans of action and write crisp precise summaries, as above. We also need help finding the most important plans of action.

In addition to plans of action, we’re also summarizing various ‘reports’. The idea is that a report presents facts, while a plan of action advocates a course of action. See:

In practice the borderline between plans of action and reports is a bit fuzzy, but that’s okay.

Plan C

Analyzing plans of action is just the first step in a more ambitious project: we’d like to start formulating our own plans. Our nickname for this project is Plan C.

Why Plan C? Many other plans, like Lester Brown’s Plan B, are too optimistic. They assume that most people will change their behavior in dramatic ways before problems become very serious. We want a plan that works with actual humans.

In other words: while optimism is a crucial part of any successful endeavor, we also need plans that assume plausibly suboptimal behavior on the part of the human race. It would be best if we did everything right in the first place. It would be second best to catch problems before they get very bad — that’s the idea of Plan B. But realistically, we’ll be lucky if we do the third best thing: muddle through when things get bad.

Azimuth Code Project

Some people on the Azimuth Project, most notably Tim van Beek, are writing software that illustrates ideas from climate physics and quantitative ecology. Full-fledged climate models are big and tough to develop; it’s a lot easier to start with simple models, which are good for educational purposes. I’m starting to use these in This Week’s Finds.

If you have a background in programming, we need your help! We have people writing programs in R and Sage… but Tim is writing code in Java for a systematic effort he calls the Azimuth Code Project. The idea is that over time, the results will become a repository of open-source modelling software. As a side effect, he’ll try to show that clean, simple, open-source, well-managed and up-to-date code handling is possible at a low cost — and he’ll explain how it can be done.

• Software for investigating the Hopf bifurcation and its stochastic version: see week308 of This Week’s Finds.

• Software for studying predator-prey models, including stochastic versions: see the page on quantitative ecology. Ultimately it would be nice to have some software to simulate quite general stochastic Petri nets.

• Software for studying stochastic resonance: see the page on stochastic resonance. We need a lot more on this, leading up to software that takes publicly available data on Milankovitch cycles — cyclic changes in the Earth’s orbit — and uses it to make predictions of the glacial cycles. It’s not clear how good these predictions will be — the graphs I’ve seen so far don’t look terribly convincing — but the Milankovitch cycle theory of the ice ages is pretty popular, so it’ll be fun to see.

• An open source version of FESA, the Future Energy Scenario Assessment. FESA, put out by Orion Innovations, is proprietary software that models energy systems scenarios, including meteorological data, economic analysis and technology performance.

• An automated species-identification system. See the article Time to automate identification in the journal Nature. The authors say that taxonomists should work with specialists in pattern recognition, machine learning and artificial intelligence to increase accuracy and reduce drudgery.
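To give a taste of what the simplest of these projects can look like, here is a minimal sketch in Python of a predator-prey model simulated with Gillespie’s algorithm — one standard way to simulate a small stochastic Petri net of the kind mentioned above. The rate constants and initial populations are invented for illustration; this is not code from the Azimuth Code Project.

```python
import random

# A tiny stochastic Petri net for a predator-prey system, simulated with
# Gillespie's direct method. Transitions:
#   prey birth:     X -> 2X      (rate a*X)
#   predation:      X + Y -> 2Y  (rate b*X*Y)
#   predator death: Y -> nothing (rate c*Y)
# All rate constants below are illustrative, not fitted to anything.

def gillespie(x, y, a=1.0, b=0.01, c=0.5, t_max=10.0, max_steps=50000, seed=1):
    rng = random.Random(seed)
    t, history = 0.0, [(0.0, x, y)]
    for _ in range(max_steps):           # hard cap so the loop always ends
        rates = (a * x, b * x * y, c * y)
        total = sum(rates)
        if total == 0.0 or t >= t_max:   # everything extinct, or time is up
            break
        t += rng.expovariate(total)      # exponential waiting time to next event
        r = rng.uniform(0.0, total)      # pick which transition fires
        if r < rates[0]:
            x += 1                       # prey birth
        elif r < rates[0] + rates[1]:
            x -= 1
            y += 1                       # predation
        else:
            y -= 1                       # predator death
        history.append((t, x, y))
    return history

if __name__ == "__main__":
    hist = gillespie(100, 50)
    print(len(hist), hist[-1])
```

Unlike the deterministic Lotka–Volterra equations, a run like this can end in extinction of one or both species, which is part of why the stochastic versions are interesting.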

David Tweed, who is writing a lot of our pages on the economics of energy, has suggested some others:

All these more challenging projects will only take off if we find some energetic people and get access to good data.

This Week’s Finds

I’m interviewing people for This Week’s Finds: especially scientists who have switched from physics to environmental issues, and people with big ideas about how to save the planet. The goal here is to attract people, especially students, into working on these subjects.

If you’re a scientist or engineer doing interesting things on the topics we’re interested in at the Azimuth Project, and you’d like me to interview you, let me know! Of course, your ego should be tough enough to handle it if I say no.

Alternatively: if you know somebody like this, and you’re good at interviewing people, this is another place you might help. You could either send them to me, or interview them yourself! I’m already trying to subcontract out one interview to a mathematician friend.

Blog articles

While I’ve been writing most of the articles on this blog so far, I don’t want it to stay that way. If you want to write articles, let me know! I might or might not agree… but if you read this blog, you know what I like, so you can guess ahead of time whether I’ll like your article or not.


This entry was posted on Monday, January 24th, 2011 at 6:21 am and is filed under azimuth.


21 Responses to Azimuth News (Part 1)

We would like a program that simulates the delayed action oscillator, which is an interesting simple model for the El Niño / Southern Oscillation.
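For concreteness, here is a rough sketch in Python of the sort of program being asked for: a forward-Euler integration of the Suarez–Schopf delayed action oscillator dT/dt = T - T^3 - α T(t - δ). The parameter values are illustrative nondimensional guesses, not tuned to ENSO data.

```python
# Delayed action oscillator (Suarez-Schopf form):
#   dT/dt = T - T**3 - alpha * T(t - delta)
# integrated by forward Euler, with a ring buffer holding the delayed history.
# alpha and delta below are illustrative, in nondimensional units.

def delayed_oscillator(alpha=0.75, delta=6.0, dt=0.01, t_max=200.0, t0=0.6):
    n_delay = int(round(delta / dt))      # history buffer length
    history = [t0] * n_delay              # constant initial history on [-delta, 0]
    T = t0
    out = []
    for i in range(int(t_max / dt)):
        T_delayed = history[i % n_delay]  # value from time t - delta
        T_new = T + dt * (T - T**3 - alpha * T_delayed)
        history[i % n_delay] = T          # overwrite the slot just consumed
        T = T_new
        out.append(T)
    return out
```

For α between 0 and 1 and a delay well past the linear stability threshold, the fixed points ±√(1−α) are unstable, and the solution settles into sustained oscillations — the toy analogue of the El Niño cycle.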

I’ve been contacted by email by someone (I’ll leave it to the author to publish his true name) with code and pictures that supposedly solve this coding challenge, but I haven’t had a closer look at it yet. My apologies: the main reason is that a severe flu has been pinning me to my bed for the last 8 days and still won’t say goodbye.

To anyone interested in this kind of project: feel free to apply for membership in the Azimuth Forum. If you’d rather not do that for one reason or another, feel free to contact me via email. I promise a swift and witty response, to the extent that I’m able, once I’m on my feet again.

I got email from Marco Stenico too — I don’t see any need to hide his identity! He sent me nice-looking figures of the delayed action oscillator, made using a Java program he wrote, but he says that the program may need a bit of debugging, which he’ll try to do by the end of this month. He’s become a member of the Azimuth Forum, so I expect we’ll be hearing from him there.

In the promotion department: I took the liberty of suggesting to the ScienceSeeker.org people that they add the Azimuth blog to their list of science blogs, which may send some traffic in this direction.

(I also started coding SDE stuff in Python, starting with the Hopf bifurcation example, but what I’ve done so far is more in the genre of “small program to generate a couple figures” than “part of a whole open-source project”.)

While Tim is trying to spearhead a “whole open-source project”, we also have people like Graham Jones and David Pollard and… probably others I’m forgetting… who are writing individual programs in various languages. It’s pretty easy to put these programs onto the Azimuth Project, and I can’t think of any great reason not to.

Well, I’m pretty creative, so after writing that, I can now think of all sorts of bad things that might happen. But don’t you have to get a certain number of programmers hanging around talking to each other and doing slightly random things before they spontaneously self-organize and start working open-source miracles?

I’m no expert on this by any means, but it seems that getting a bunch of programmers interacting is the first step.

…what I’ve done so far is more in the genre of “small program to generate a couple figures” than “part of a whole open-source project”.

The Azimuth code project is first and foremost an SVN repository for everyone who has written software for Azimuth. So if you think it would be beneficial to have version control, you are of course free to check your code into the Azimuth code project; we’ll adapt the directory structure to our needs.

John said:

But don’t you have to get a certain number of programmers hanging around talking to each other and doing slightly random things before they spontaneously self-organize and start working open-source miracles?

It’s more about having a bunch of programmers who have the same problem that can be solved with the software developed in an open source project. In my case the problem is that I would like to understand climate models that are comprised of millions of lines of code. Working on this kind of complex software is very different from writing 50 script lines for MATLAB. This is what my own personal project in the Azimuth code project is about, but that should not be confused with the Azimuth code project as a version control and publication mechanism for everybody working for Azimuth.

Okay, so my description of the Azimuth Code Project above may have made it sound more scary than it really is.

I’ve been worrying about this for a while. I’m very ignorant about software development, about programmers, and about what programmers like. So I don’t have many opinions about how things should be done. But I think some confusion may be dispelled by having different names for different things.

There’s the Azimuth Code Project, which is a version control and publication mechanism for everybody working for Azimuth.

There’s your own project, which has much more specific goals, which you have outlined.

And then there may be a third thing: a very inviting broad structure that is open to programmers with many different visions, including those who don’t even care about version control?

It could easily be premature to worry about all these nuances… but sometimes things grow faster if you can easily see what they are, and they seem welcoming.

You’ll see what I mean if you look at how Blake Stacey was a bit hesitant to contribute some software, at first, because he wasn’t doing ‘industrial-strength’ programming. And then I said ‘don’t worry, go ahead’ — and he did.

In my case the problem is that I would like to understand climate models that are comprised of millions of lines of code. Working on this kind of complex software is very different from writing 50 script lines for MATLAB.

Very much so! I’ve had to do plenty of the latter, but I have no practical experience with the former. For pedagogical purposes, e.g., understanding simple models of the ENSO, those shorter programs are what you want to have around.

Despite the disappointment of last year, when a venture capitalist wrote up a business plan with many hours of my help, then gave up on finding the paltry few million it required, I expect private money — in this field, the only money that can be public-spirited — to step up.

I occasionally remarked on the uncertainties of the plan, but he presented it to his contacts — who I assume existed — as cut-and-dried development. That’s much like cut-and-dried water, and I guess they knew that.

We couldn’t get millions for development that everyone knew would really include a lot of research. But maybe some of *your* contacts are cut from better stuff: able and willing to provide up to tens of thousands of dollars, or equivalent lab time, for research that is unflinchingly labelled as such.

These experiments will yield basic knowledge that is interesting, not yet in humanity’s possession, not very hard to acquire, and perhaps useful:

1. Preparation of nitrate alloy

Dehydrate 100 kg or so of a mixture of sodium nitrate, magnesium nitrate hexahydrate, and calcium nitrate tetrahydrate in these initial mass proportions:

NaNO3 31.3
Mg(NO3)2·6H2O 66.6
Ca(NO3)2·4H2O 2.1

This mass breakdown is given by 1932 British patent 382368, which says the dehydrated mixture freezes near 411 K (138°C). To dewater the liquid thoroughly, one should stir it while heating it a few tens of kelvins higher than that, 500 K (227°C) maybe; doing this under vacuum will speed the process.

The liquid remains stable up to 750 K (477°C), so overheating is easily enough avoided. Its calculated density at 430 K (157°C) is 2.01 g/mL.

If, rather than being used immediately, it will be allowed to freeze in the mixing vessel, that vessel should be sealed to preserve the dryness. Alternatively, its contents may be frozen by being poured onto a cold surface such as a thick metal plate, then broken up and promptly put in airtight containers.

Experience with similar mixtures of boron and unalloyed potassium nitrate, and boron and unalloyed sodium nitrate, is that such mixtures — boron and highly stable nitrates of light, highly electropositive metals — burn smoothly and do not detonate, despite their high energy.

However, no such experience is in hand with the present nitrate alloy, and while this experience is being acquired, the mixing should be done in a way that takes into account the conceivability of an explosion: metal vessels, batches of — at first — no more than a few milligrams, explosion shields. If no explosions occur with the first few hundred batches, one can ramp the batch size up towards 1 gram and perhaps beyond.

If what was mixed was room-temperature powders, heat the mixture until the nitrate alloy melts. (Another way to mix the components is to add boron powder to the liquid nitrate alloy, then pour the resulting suspension back and forth between two small long-handled metal vessels, spoons perhaps.)

Based on experience with boron/potassium nitrate mixtures, mixing the suspension at a temperature on the order of 430 K (157°C), or heating premixed powder to that temperature to turn it into the suspension, will not ignite it as long as roughness that could cause small regions to be much hotter is avoided. Boron/potassium nitrate, sometimes known as BKNO3, is seen in differential thermal analysis experiments to begin reacting only once the temperature rises into the neighbourhood of 813 K (540°C).

3. Ignition

Ignite small amounts of the combustible mixture. It may be allowed to freeze first. Freezing it in a horizontal half-tube mold, so as to make a long piece with a semicircular cross section, will allow the burn front propagation rate, aka flame speed, to be measured.

In the established use of mixtures of boron and potassium nitrate (BKNO3), this speed has been observed to be 12.5 mm/s when the boron mass fraction is 20 percent, 14.5 mm/s when it is 25 percent.

The latter composition includes slightly more boron than the nitrate can consume. Exact stoichiometry is at 22.19 mass percent boron, and at 20 percent, there is not enough boron to react with all the nitrogen. Some nitrogen will therefore come free as gas.

The present combustible mixture is at 26.80 mass percent boron because for the present nitrate alloy this is the exact stoichiometry, the one that should leave over neither boron nor nitrogen. Because its preparation includes melting of the nitrate alloy, it is expected to have good contact between the nitrate alloy and the boron particles. So the flame speed should be significantly higher than with BKNO3.
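Both stoichiometric figures — 22.19 mass percent boron for BKNO3 and 26.80 for the present mixture — can be reproduced with a short calculation. The product slate assumed below (all nitrate nitrogen to BN, each metal to its metaborate, leftover oxygen to B2O3) is my reconstruction, not stated explicitly above, but it matches the quoted BKNO3 number exactly, which suggests it is the accounting being used.

```python
# Check the stoichiometric boron mass fractions quoted for boron/KNO3 (BKNO3)
# and for the boron / nitrate-alloy mixture. Assumed product slate (a
# reconstruction that reproduces the quoted 22.19% for BKNO3):
#   all nitrogen -> BN, each metal M of valence v -> metaborate M(BO2)_v,
#   leftover oxygen -> B2O3.

B, N, O, H = 10.811, 14.007, 15.999, 1.008
Na, K, Mg, Ca = 22.990, 39.098, 24.305, 40.078

def boron_per_nitrate(valence):
    """Moles of B consumed per mole of M(NO3)_v under the assumed products."""
    b_for_BN = valence                    # one BN per nitrate nitrogen
    b_metaborate = valence                # M(BO2)_v uses v borons, 2v oxygens
    o_left = 3 * valence - 2 * valence    # oxygens left over for B2O3
    return b_for_BN + b_metaborate + (2.0 / 3.0) * o_left  # B2O3: 2 B per 3 O

def boron_fraction(nitrates):
    """nitrates: list of (moles, anhydrous molar mass, metal valence)."""
    b_mass = sum(n * boron_per_nitrate(v) * B for n, _, v in nitrates)
    salt_mass = sum(n * m for n, m, _ in nitrates)
    return b_mass / (b_mass + salt_mass)

# BKNO3 check: quoted stoichiometry is 22.19 mass percent boron.
M_KNO3 = K + N + 3 * O
f_bkno3 = boron_fraction([(1.0, M_KNO3, 1)])

# The nitrate alloy: hydrate masses 31.3 / 66.6 / 2.1, water lost on drying.
M_NaNO3 = Na + N + 3 * O
M_MgN2 = Mg + 2 * (N + 3 * O)
M_CaN2 = Ca + 2 * (N + 3 * O)
M_H2O = 2 * H + O
mol_Na = 31.3 / M_NaNO3
mol_Mg = 66.6 / (M_MgN2 + 6 * M_H2O)
mol_Ca = 2.1 / (M_CaN2 + 4 * M_H2O)
f_alloy = boron_fraction([(mol_Na, M_NaNO3, 1),
                          (mol_Mg, M_MgN2, 2),
                          (mol_Ca, M_CaN2, 2)])
print(f"{100*f_bkno3:.2f}%  {100*f_alloy:.2f}%")   # -> 22.19%  26.81%
```

With standard atomic weights this gives 22.19% and 26.81%, the latter within rounding of the quoted 26.80.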

Retain the combustion products. All are expected to be solid, and therefore easy to retain as powders:

For best results, the suspension’s self-combustion should be staged in a nitrogen atmosphere, so that no oxidation of hot expelled boron nitride particles by the atmosphere can occur. And if hot boron particles are expelled, then although they escape the mixture’s own nitrogen, this other nitrogen gets them, and they still end up as boron nitride.

4. Air oxygen treatment of nitrate-immersed ash

Set up a potful of the liquid nitrate alloy with an air diffuser (not sure that’s the right word) submerged at the bottom. Preferably this will be done in a pressure chamber at a pressure on the order of 10 bar (145 psi). When air, preferably compressed, is fed through the diffuser, it will sparge the liquid nitrate alloy: many streams of small air bubbles will rise through the liquid.

With the sparging not yet turned on, mix the ash powder that was saved from the ignition experiment into the liquid nitrate alloy, or ignite frozen pieces of the combustible mixture in the space above the liquid surface, so as to create clouds of fresh ash that falls directly in. (This may give smaller suspended particles in the liquid by not giving them much time to agglomerate before they go in, but it requires the space above the liquid surface initially to contain pure nitrogen.)

For a range of temperatures in the liquid nitrate alloy from near its freezing point up to near its thermal stability limit, sparge the ash suspension, and observe how fast exothermic processes of the following type occur:

BN + borates + air oxygen —> B2O3 + nitrates

Two examples:

BN + (1/3) Na2B4O7 + (1/3) NaBO2 + 2 O2 —>
(4/3) B2O3 + NaNO3

BN + (1/4) Mg2B2O5 + 2 O2 —> (3/4) B2O3 + (1/2) Mg(NO3)2

These have ΔH values of -470.7 kJ/mol and -457.3 kJ/mol*, respectively.

In all cases the solids on the left side are denser than the liquid nitrate alloy, and so, once mixed down into it, will tend to stay down, making it appear cloudy or milky unless they dissolve. Boron nitride won’t dissolve, but some of the borates may.

On the right, the products include nitrates — the same ones that the liquid nitrate alloy is made of — and boron oxide, B2O3. Boron oxide is most readily produced in a non-crystalline form that is less dense than the liquid, and probably has little solubility in it.

If this low solubility is fact, it will be possible to gauge how much reaction has occurred by observing how much boron oxide floats to the surface. First the sparging must be turned off, so that buoyant particles cease being stirred down and ones that sink cease being stirred up. Once the now calm liquid has stratified, the boron oxide will be in a turbid floating layer of small particles with liquid nitrate between them.

The processes’ exotherms will keep the liquid nitrate alloy hot. The rising air bubbles that provide the oxygen will carry off some of the heat when they emerge from the surface.

If the oxidation occurs very quickly, so that a bubble loses most of its oxygen while still low down in the liquid, increasing the air injection rate will raise the temperature. It is more likely, however, that the boron nitride reacts rather slowly with the bubbles’ oxygen, so that the bubbles have almost the same oxygen content when they break the surface as when they emerged from the diffuser.

In that low-reactivity case, increasing the air feed rate will not speed up the reaction, but will take away more heat, and so the temperature will diminish. Adjusting the feed rate will be a good way to keep the temperature constant.

An alternative process that also consumes boron nitride and oxygen, but yields less energy, may also occur, especially when the liquid nitrate alloy is very hot, near or above its 750-K thermal stability ceiling:

4 BN + 3 O2 —> 2 B2O3 + 2 N2

Because this reaction frees nitrogen in the presence of oxygen, it tends to have nitrogen oxides as byproducts. This will be evident if the gas above the liquid, gas that was formerly in the bubbles traversing it, is tinted with the red-brown of nitrogen dioxide, a toxic gas that is also produced by lightning. So the gas emerging from the liquid surface should not be breathed, but if the lab is out of town, a fume hood may discharge the gas outdoors. In a built-up area whose air is already polluted with oxides of nitrogen from cars, it may be necessary first to clean the gas, perhaps by bubbling it through aqueous alkali.

For a given temperature in the liquid nitrate alloy, how quickly is suspended boron nitride consumed, and how efficiently is its nitrogen retained in the liquid, as new nitrate, rather than departing with the bubbles when they reach the surface? Answering these two questions is the objective of this experiment.

* based on an estimated enthalpy of formation for Mg2B2O5 of -2508.4 kJ/mol.

5. Catalysis of boron oxide crystallization

If the air oxygen treatment of nitrate-immersed ash greatly reduced the concentration of suspended boron nitride, and also greatly reduced the amount of suspended or dissolved borate, it yielded a pot containing fairly pure nitrate liquid plus a layer of floating B2O3. This can be used now, with the pot removed from the pressure vessel if it was in one.

If the air oxygen treatment did not yield such an arrangement, make it: the pure liquid nitrate alloy in the pot, a B2O3 layer on the surface.

The boron oxide that is in excess of what the liquid can dissolve floats because it is in its glassy form, density ~1.8 g/mL. But there is also a crystalline form, and it is stabler. No piece of glassy boron oxide has been seen to crumble to crystalline boron oxide powder, nor even lose transparency through the formation of small internal crystals, but in theory both changes must eventually, on timescales many times the age of the universe, happen.

At ~2.4 g/mL, the crystals are denser than the liquid nitrate alloy (~2 g/mL).

An interesting phenomenon therefore may conceivably occur, and the purpose of this section is to observe how fast it does, if it does: the liquid nitrate alloy may dissolve the less stable vitreous form of boron oxide that is lying atop it, allow it to diffuse downward as a solute, and give it up to stabler, crystalline boron oxide pieces lying at the bottom.

At its beginning, this process can be helped along by a submerged seed crystal. Heating the system from above, so that the liquid nitrate alloy is cooler at the bottom, where it touches the crystals, than at the top, where it touches the floating glass particles, will help the crystallization throughout the time in which the boron oxide is both floating and sunken.

Thanks to whoever took out my hard line ends. I was posting from a library terminal where I am kicked off after one hour, and my hour was almost up.

I described the experiments I want done as “pedestrian”. Another way to express this is “on the ground”, seeking ground truth. The talk involving Curtis Faith would seem to be about seeking ground truth, seeking reality, I think it said. Do they count?

Experience with similar mixtures of boron and unalloyed potassium nitrate, and boron and unalloyed sodium nitrate, is that such mixtures — boron and highly stable nitrates of light, highly electropositive metals — burn smoothly and do not detonate, despite their high energy.

“Over the next 100 years, many scientists predict, 20 percent to 30 percent of species could be lost if the temperature rises 3.6 degrees to 5.4 degrees Fahrenheit. If the most extreme warming predictions are realized, the loss could be over 50 percent, according to the United Nations climate change panel.”

What a completely unscientific BS.

First you can find many scientists who support almost any belief you want, it’s simply a consequence of the number of scientists and the fact they are humans.

Second predicting that something *could* happen in the future is hardly a worthwhile prediction unless you can provide a well argued probability of it happening.

Finally the notion that 50% of species will die due to 3 degrees C average temperature shift when they easily handle daily shifts of 5+ times that magnitude and can simply move following the changing climate patterns to preserve their optimum is just plain absurd.

Let’s hear which species are in such peril and why a 3 deg C increase in average global temperature will kill them…

Your objections are precisely the reason why the Azimuth Project exists: over there we try to find information and investigate what would back up or disprove claims like the one you cite.
As you can see for yourself, mainstream media cannot and will not provide the kind of scientific background information that is necessary to make informed decisions.

Speaking as someone who tries to put probabilities on predictions, I disagree that we need to be able to put exact probabilities on predictions for them to be worthwhile! Even an order of magnitude probability bound could be useful. (Is it less than 10%? Less than 1%?) We do need to have some understanding of what the range of plausible outcomes may be, if not exact probabilities. However, I disagree that an estimate without probabilities is “unscientific”.

Your first point itself seems “unscientific” to me: you’re going to dismiss a finding because “you can find many scientists who support almost any belief you want”? Come on. Give a real reason to disagree with their findings.

That said, I am skeptical myself of how such mass extinction numbers are obtained. It seems very difficult to extrapolate ecological dynamics compared to, say, global average temperature. But I’m not going to call such predictions “unscientific”, certainly not without first understanding how they are arrived at.

And arguments of the form “global warming is small compared to the diurnal temperature cycle therefore it has negligible ecological impacts”, or “species can simply move”, seem extraordinarily naive to me. I know virtually nothing about ecology, but it’s not hard to think of ways those arguments could fail. For example:

1. Direct climate change. Some species really do live near the upper end of their temperature range. An average warming of a few C could wipe them out.

2. Extremes. It’s not just higher average temperatures, but extremes that can cause wide dieoffs. More heat waves, droughts, flash floods, etc. This is one of the great stressors on corals, for example.

3. Competition. Warming doesn’t have to push species past their maximum physiologically compatible temperature in order to drive them to extinction. It just has to make them less efficient competitors against other species, either in their existing location or in a niche they’ll try to migrate into.

4. Barriers. There are many geographical barriers to migration. The NYT article mentions an obvious barrier for mountainous species: eventually, they can’t move farther up the mountain. Many non-mountainous species can’t cross mountain ranges. Or there might be a desert in the way. Or maybe the soil is too poor/different for that species to grow. There can be geographic bottlenecks: imagine tons of species leaving South America and all trying to cram through the Isthmus of Panama. Humans can create barriers to migration, by building cities, roads, farms, houses, etc.

5. Climatic inhomogeneity. Climatic variables don’t all change in the same way. You seem to envision every species moving in lockstep X kilometers poleward, and resettling in perfect latitudinal translation. But suppose that the temperature envelope where a species needs to live moves poleward by X kilometers, but the precipitation envelope moves Y kilometers, and they don’t overlap.

6. Migration inhomogeneity. From this it follows that ecosystem migrations may not be uniform. Suppose a species needs to move X kilometers, but another species it depends on needs to move Y kilometers, and they no longer overlap geographically. Or they both need to move the same amount, but they migrate at different rates or in different directions, and don’t overlap.

7. Key species and interdependency. Knock out a few species that a lot of other species depend on, and other species may follow, even if they’re not harmed directly by climate change.

8. Invasion. The bark beetle infestation in the American West is an example. Warming allows them to survive the winter and breed, allowing them to expand into vast new stretches of territory and attack trees.

9. Existing stressors. The extinction rate is already elevated by other human activity (land use change, overexploitation, pollution, the aforementioned geographic migration barriers, etc.) That could make species more sensitive to extinction (by climate change or other factors) than they otherwise would be.

10. Biodiversity hotspots. Supposedly 20% of plant species live on 0.5% of the Earth’s land surface; 1.4% of the land surface contains 44% of plant species and 35% of terrestrial vertebrates. (See the next comment for citations; I presume this applies to “known” species.) Climate change in such a region could have a disproportionate effect on extinctions.

Maybe there are also other reasons to think that climate change might have little impact on ecosystems, but the reasons you gave don’t appear particularly well grounded in any form of ecological science. Indeed, they seem like totally unsupported assumptions, rather than scientific conclusions.

In general, I might expect perturbations to lead to extinctions, on the grounds that most species are probably living near local fitness optima, and so perturbations would tend to decrease local fitness until the system reorganizes itself. But that’s a fuzzy, non-quantitative argument, and the quantitative details will depend on how much, how fast, where, and what’s living there.

Nathan, it’s hard to be exhaustive with a list like this, but there are a couple more I can think of.

As the oceans rise and coastlines and coastal human occupation are inundated, many species will be able to move and won’t be bothered at all. Some whole ecologies would probably be disrupted, though, like estuary ecologies that no longer have ancient marshland to occupy. One example: Florida, itself the remains of ancient coral reefs, would be all but completely lost, along with its dozens of unique ecologies.

The other one I was thinking of was the emergence of “crustacean free zones” (my name for them). Acidification is being projected to get to levels in some open ocean areas such that shell formation would be greatly or completely inhibited in, I believe about 40 years. By that time the well known oxygen free “dead zones” might greatly expand too, as a byproduct of rising population, food production and temperature…

Both cases are examples of species extinction from habitat loss, and human activity is consuming ecological habitat at a furious pace, it seems. So that might be a proxy measure for species loss: the simple quantitative loss of habitat as a percentage of the total.

The previous comment was long, so let me put this in a new comment: where do these species extinction predictions come from? Some of them are summarized in the IPCC AR4 WG2 report, Section 4.4.11. I don’t have time to read through this, so I don’t really know how they get these estimates.

The Thomas et al. paper is simpler (and highly cited), so I’ll start there. Apparently their conclusion is based on an empirically determined scaling relation between the area of a “climate envelope” and the number of species in it. Specifically, they assume that the number of species scales as the fourth root of area, but explore sensitivity to other power laws. (They also explore sensitivity to different dispersal behaviors, climate changes, etc.) The species-area law is assumed to apply globally with the same exponent, and they ignore geographic barriers.

This is a pretty simplistic relationship, and I don’t know how far to trust it, especially in large extrapolations. Also, I’m puzzled: if you want to get a 30% decrease in species from this law, I guess you’d need a 4x decrease in climate envelope area on average? (Ok, it’s nonlinear, but whatever.) Will the average size of all climate envelopes shrink by that much? Surely some will shrink and some will grow; what’s the net effect? I don’t quite understand what a “climate envelope” is in the first place. In the paper I can’t find raw numbers for how the areas are predicted to change; I can only find the implied extinctions. Not sure where to find the database they used.
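The fourth-root scaling can be made concrete. Under S ∝ A^z with z = 0.25, the surviving fraction of species is (A’/A)^z, so a 30% species loss does correspond to the climate envelope shrinking to 0.7^4 ≈ 0.24 of its original area, about the 4x decrease mentioned a moment ago. A throwaway sketch (the exponent and loss figures are just the ones discussed in this thread):

```python
# Species-area relationship S = c * A**z: the fraction of species surviving
# when a climate envelope shrinks from area A to area A' is (A'/A)**z.
# z = 0.25 is the fourth-root exponent assumed in Thomas et al.

def area_ratio_for_loss(species_loss, z=0.25):
    """Area ratio A'/A that implies a given fractional species loss."""
    return (1.0 - species_loss) ** (1.0 / z)

def species_loss_for_area(area_ratio, z=0.25):
    """Fractional species loss implied by a given area ratio A'/A."""
    return 1.0 - area_ratio ** z

for loss in (0.20, 0.30, 0.50):
    print(f"{loss:.0%} species loss needs area ratio "
          f"{area_ratio_for_loss(loss):.3f}")
```

The steep nonlinearity is the point: because z is small, large area losses translate into comparatively modest species losses, and conversely the quoted extinction percentages imply drastic shrinkage of the climate envelopes.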

Apparently this paper has been criticized a lot (see the comments on it, for example), so I’m not sure if I picked the best or most representative example to start with! It is cited a lot, though.

The Malcolm et al. paper also uses species-area relationships, but appears to do more detailed spatial modeling. At this point I’ve spent enough time, but maybe someone else would like to pick up and analyze these papers, or find better ones.

It’s worth noting that there have been various examples of species being introduced to previously “isolated” areas. Under non-climate-change conditions, it appears that the effect on native ecosystem species can be much larger than one would expect; e.g., the introduction of rabbits into Australia is claimed to be the biggest cause of species loss there, although no source is cited.

So one question is: does species moving in response to climate change (but without a hard barrier previously having been in place) have a similar effect to these observable past cases?

I certainly wouldn’t say a 30 percent species loss is something that’s “plain absurd”, although it certainly needs more detailed support and calculation to justify it.

Finally the notion that 50% of species will die due to 3 degrees C average temperature shift when they easily handle daily shifts of 5+ times that magnitude and can simply move following the changing climate patterns to preserve their optimum is just plain absurd.

I’ve once heard this nonsense talking point from a Bavarian zoologist bureaucrat on TV. He didn’t know about bark beetle explosion apparently, and how birds have changed their migration, or about insects invading from Italy, etc., due to many warm winters before. These warmer winters (due to global warming) seem now superseded by very cold winters (due to progressing global warming). How unscientific nature is…

And it’s not just average temperature that can have a profound effect over the years. There’s also the water cycle. If you’ve watched the news these days you have seen what weirdness a measly 0.7°C warming can effect. This is not without repercussions on soil, plants, animals, and finally humans.
