They’re also inviting mathematicians to organize workshops on this theme at the Banff International Research Station for Mathematical Innovation and Discovery, or BIRS. This is a famous and beautiful research center in the Canadian Rockies.

The deadline is coming up on September 30th, and I want to apply. If you’d like to join me, please drop me a note, either here on this blog or by email!

I’m open to all sorts of ideas, and I’d love help from biologists or climate scientists. If you don’t give me a better idea, I’ll probably do an application on network theory. It might look a bit like this:

Diagrammatic languages for describing complex networks made of interacting parts are used throughout ecology, biology, climate science, engineering, and many other fields. Examples include Systems Biology Graphical Notation, Petri nets in computer science, stochastic Petri nets and chemical reaction networks in chemistry and biochemistry, bond graphs in electrical, chemical and mechanical engineering, Bayesian networks in probabilistic reasoning, box models in climate science, and Harold Odum’s Energy Systems Language for systems ecology. Often these diagrammatic languages are invented by practitioners in a given field without reference to previous work in other fields. Recently mathematicians have set up the theoretical infrastructure needed to formalize, rigorously relate, and in some cases unify these various languages. Doing this will help interdisciplinary work of the sort that is becoming important in theoretical ecology, climate science and ‘the mathematics of planet Earth’. The goal of this workshop is to bring together experts on various diagrammatic languages and mathematicians who study the general theory of diagrammatic reasoning.

If you’d be interested in coming to a workshop on this subject, let me know. Banff provides accommodation, full board, and research facilities—but not, I believe, travel funds! So, “interested in coming” means “interested enough to pay for your own flight”.

Banff does “full workshops” with 42 people for 5 days, and “half workshops” with 20 people for 5 days. Part of why I’m asking you to express your interest is to gauge which seems more appropriate.

With a growing global population competing for the same global resources, an increased frequency and intensity of dramatic climatic events, and evidence pointing to more long-term patterns of general climate change, the pressure to comprehend nature and its trends is greater than ever. Leaders in politics, sociology and economics have begun to seriously take note of issues which before were confined to the natural sciences alone, and mathematical modeling is at the heart of much of the research undertaken. The year 2013 has thus been earmarked by mathematical sciences institutes around the world as a time for a special emphasis on the study of the “Mathematics of Planet Earth” (MPE 13). This theme is to be interpreted as broadly as possible, with the aim of creating new partnerships with related disciplines and casting new light on the many ways in which the mathematical sciences can help to comprehend and tackle some of the world’s most pressing problems.

The Banff International Research Station (BIRS) is a full partner in this important initiative, as the goals of MPE 13 are completely in line with the station’s commitment to pursuing excellence in a broad range of mathematical sciences and applications. BIRS has already planned to host five workshops in 2012 which deal with the themes of MPE 13:

BIRS also invites interested applicants to use the opportunities of its 2013 program and submit proposals in line with the MPE 2013 theme, in conjunction with BIRS’ regular format for programming. Proposals should be made using the BIRS online submission process.

34 Responses to Mathematics of Planet Earth at Banff

Besides you and Blake, two people who’ve expressed interest are Dan Ghica and Jason Morton. Jason Morton is doing some cool work on dagger-compact categories and Bayesian networks—exactly the sort of thing I’m interested in these days. Dan Ghica works on things like using category theory and diagrammatic methods for hardware synthesis with programmable gate arrays, as I explained on the n-Category Café. You can see a recent talk by him here.

There is no proof for that whatsoever. Or it may depend on what they mean by “dramatic climatic event”, of course.

If it is an ordinary climatic event presented as “drama” (a situation or succession of events having the dramatic progression or emotional effect characteristic of a play), it may well be true, but it has nothing to do with the distribution of events on any objective “severity scale” and everything to do with how these events are dramatized (presented or viewed in a dramatic or melodramatic way).

Here’s what I’ve been able to find so far. If anyone knows papers in refereed journals about the changing frequency of floods, droughts, heat waves, fires etcetera over time, I would like to add them to the Azimuth Library!

Floods in central Europe, wildfires in Russia, widespread flooding in Pakistan. The number and scale of weather-related natural catastrophe losses in the first nine months of 2010 was exceptionally high…. Munich Re emphasises the probability of a link between the increasing number of weather extremes and climate change.

Globally, 2010 has been the warmest year since records began over 130 years ago, the ten warmest during that period all falling within the last 12 years. The warmer atmosphere and higher sea temperatures are having significant effects. Prof. Peter Höppe, Head of Munich Re’s Geo Risks Research/Corporate Climate Centre: “It’s as if the weather machine had changed up a gear. Unless binding carbon reduction targets stay on the agenda, future generations will bear the consequences.”

Munich Re recorded a total of 725 weather-related natural hazard events with significant losses from January to September 2010, the second-highest figure recorded for the first nine months of the year since 1980. Some 21,000 people lost their lives, 1,760 in Pakistan alone, up to one-fifth of which was flooded for several weeks. Overall losses due to weather-related natural catastrophes from January to September came to more than US$ 65bn and insured losses to US$ 18bn. Despite producing 13 named storms, the hurricane season has been relatively benign to date, the hurricanes having pursued favourable courses.

Munich Re’s natural catastrophe database, the most comprehensive of its kind in the world, shows a marked increase in the number of weather-related events. For instance, globally there has been a more than threefold increase in loss-related floods since 1980 and more than double the number of windstorm natural catastrophes, with particularly heavy losses as a result of Atlantic hurricanes.

The rise in natural catastrophe losses is primarily due to socio-economic factors. In many countries, populations are rising, and more and more people moving into exposed areas. At the same time, greater prosperity is leading to higher property values. Nevertheless, it would seem that the only plausible explanation for the rise in weather-related catastrophes is climate change. The view that weather extremes are more frequent and intense due to global warming coincides with the current state of scientific knowledge as set out in the Fourth IPCC Assessment Report.

There are at present insufficient data on many weather risks and regions to permit statistically backed assertions regarding the link with climate change. However, there is evidence that, as a result of warming, events associated with severe windstorms, such as thunderstorms, hail and cloudbursts, have become more frequent in parts of the USA, southwest Germany and other regions. The number of very severe tropical cyclones is also increasing. One direct result of warming is an increase in heatwaves such as that experienced in Russia this summer. There are also indications of a higher incidence of atmospheric conditions causing air mass formation on the north side of the Alps and low-lying mountain ranges, a phenomenon which can result in floods. Heavy rain and flash floods are affecting not only people living close to rivers but also those who live well away from traditionally flood-prone areas. Although climate change can no longer be halted, even with the help of very ambitious schemes, it can still be curbed.

The number of weather-related disasters reported each year in the world’s poorest countries has more than trebled since the 1980s and the increase cannot be explained by better reporting or an increase in population, a study by Oxfam has found.

An analysis of the natural disasters reported to international relief agencies since 1980 has revealed that while the number of disasters relating to geophysical events – such as earthquakes and volcano eruptions – remained fairly constant, disasters caused by flooding and storms significantly increased. Oxfam looked at disasters in more than 140 countries and found a clear increase over time, rising from 133 disasters a year in 1980 to more than 350 a year in recent years. Steve Jennings, the report’s author, believes the increase could be the result of climate change.

“It is abundantly clear that weather-related disasters have been increasing in some of the world’s poorest countries and this increase cannot be explained fully by better ways of counting them,” Dr Jennings said. “Whichever way you look at the figures, there is a significant rise in the number of weather-related disasters. They have been increasing and are set to get worse as climate change further intensifies natural hazards.” In the past 30 years, the human population has risen in many of the disaster-prone countries of the developing world, which means that more people are exposed to natural disasters. Since 1975, disasters of all kind have killed more than 2.2 million people, with two-thirds of these deaths caused by weather-related events, according to the UN’s International Strategy for Disaster Reduction.

The Oxfam study attempted to eliminate any statistical bias caused by the better reporting of disasters or by increases in population in such areas as denuded hillsides that are prone to mudslides.

“There is an upwards trend in the number of reported disasters,” the report concludes. “This is chiefly driven by a steep rise in reported floods in all regions and, to a lesser extent, storms in Africa and the Americas. An increase in the number of people exposed to disasters (approximated by population growth) partly explains the trend, but not fully. It is unlikely that reporting bias fully explains the trend either.”

Globally, there is no discernible increase in the frequency or intensity of hurricanes and tropical cyclones, but there is evidence of more extreme storms, Dr Jennings said. Droughts, meanwhile, have barely increased.

“What we seem to be getting is a big change in the number of people exposed to weather-related disasters, some change in the reporting of them, and the two combined are the major explanation of what we are seeing,” Dr Jennings said. “I suspect that the climate change component is smaller.”

He suspects it’s smaller, but he feels it’s unlikely that it’s zero!

On a smaller scale, in a recent post I mentioned that 2011 holds the record for weather disasters costing over $1 billion in the United States, with 2008 coming in second place. Of course this doesn’t clearly separate the changing weather from the changing economic factors.

I can easily provide data proving that heat records are being broken at a rate that exceeds the rate at which cold records are being broken.

Hurricane Irene this year pushed the U.S. yearly record for billion-dollar natural disasters to 10, smashing the 2008 record of nine. In the “Current Extreme Weather and Climate Change” report, released today by the Climate Communication scientific group, leading climate scientists outlined how increasing global atmospheric temperatures and other climate change effects — triggered by industrial emissions of greenhouse gases including carbon dioxide and methane — are loading the dice for the sort of extreme weather seen this year.

“Greenhouse gases are the steroids of weather,” says climate projection expert Jerry Meehl of the National Center for Atmospheric Research, at a briefing held by the report’s expert reviewers. “Small increases in temperature set the stage for record breaking extreme temperature events.” Overall, says the report, higher temperatures tied to global warming, about a one-degree global average temperature rise in the last century, have widely contributed to recent runs of horrible weather:

• In 1950, U.S. record breaking hot weather days were as likely as cold ones. By 2000, they were twice as likely, and in 2011 they are three times more likely, so far. By the end of the century they will be 50 times more likely, Meehl says.

• With global warming’s higher temperatures packing about 4% more water into the atmosphere, total average U.S. snow and rainfall has increased by about 7% in the past century, says the study. The amount of rain falling in the heaviest 1% of cloudbursts has increased 20%, leading to more flooding.

• Early snow melt, and more rain rather than snow, has led to water cycle changes in the western U.S. in river flow, winter air temperature, and snow pack from 1950 to 1999. The effects are up to 60% attributable to human influence.
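The hot-versus-cold record statistic in the first bullet has a clean mathematical interpretation: in a stationary climate, new record highs and record lows are equally likely, so even a modest warming trend visibly skews the ratio. A minimal Monte Carlo sketch (the series here is synthetic Gaussian data, not real temperature records):

```python
import random

def count_records(series):
    """Count record highs and record lows in a time series."""
    highs = lows = 0
    max_v = min_v = series[0]
    for v in series[1:]:
        if v > max_v:
            highs += 1
            max_v = v
        if v < min_v:
            lows += 1
            min_v = v
    return highs, lows

def record_ratio(trend, years=100, trials=2000, seed=0):
    """Average ratio of record highs to record lows for Gaussian
    annual 'temperatures' with a linear trend (units per year)."""
    rng = random.Random(seed)
    tot_hi = tot_lo = 0
    for _ in range(trials):
        series = [rng.gauss(trend * t, 1.0) for t in range(years)]
        hi, lo = count_records(series)
        tot_hi += hi
        tot_lo += lo
    return tot_hi / tot_lo

# With trend 0.0 the ratio sits near 1 (the 1950 situation in the
# bullet above); even a small positive trend pushes it well above 1.
```

This is only an illustration of the mechanism, not a reproduction of Meehl’s analysis, which uses actual U.S. station data.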

Rather than totally triggering any extreme event, global warming just makes it worse, says meteorologist Jeff Masters of Weather Underground, a report reviewer. “A warmer atmosphere has more energy,” he says, contributing to heat waves, tornadoes and other extremes. Even heavy blizzards come from an atmosphere packed with extra moisture by global warming, he adds. “Years like 2011 may be the new normal.”

The report notes scientific disagreement exists over the role of global warming in some severe weather events, such as hurricanes, or the frequency of El Nino weather patterns.

“There’s really no such thing as natural weather anymore,” says climate scientist Donald Wuebbles of the University of Illinois, who was not involved with the report, but said he largely agreed with its conclusions. “Anything that takes place today in the weather system has been affected by the changes we’ve made to the climate system. That’s just the background situation and it’s good for people to know that,” Wuebbles says. Scientists cannot yet say what percentage of an extreme weather event’s severity is due to global warming, he says. “It’s always a factor in today’s world.”

It is indispensable for acquiring some historic perspective on what counts as “dramatic”.

You may also want to study recent history of global tropical cyclone activity and accumulated cyclone energy (ACE). Dr. Ryan N. Maue’s site is a good starting point. You could also link to some of his graphics in this thread, they are telling.

I second that — I am most keen on hearing what biologists, ecologists, etcetera have to say.

I should have thought of this before, but I am going to send a link to my friend Phillip Staniczenko, a physicist working on mathematical ecology and related ideas. He might be able to pass this around and drum up interest, particularly among people likely to say something of general interest.

Great, John! Even if travel funds don’t become available, I have a yearly travel allowance I’d be happy to use for this, so long as my institution doesn’t put up any obstacles. If you don’t mind, I could give the information to my adviser, who I am sure would be happy to circulate it more widely than I could myself among network-theory-oriented biologists.

There are plenty of things mathematicians could do to promote climate science; the topmost priority is providing proper tools to understand non-equilibrium, quasi-steady-state thermodynamic processes in non-linear systems with many degrees of freedom.

As I see it, the climate system is obviously a heat engine, which is only radiatively coupled to its (cosmic) environment. The temperature of a tiny fraction of the celestial sphere is 5778 K, while the rest is at 2.7 K – these are the relevant heat reservoirs we have, each with infinite heat capacity for all practical purposes. Now, the first thing to understand about a heat engine is its entropy processes. Paltridge made a good start in this direction, but this line of investigation is all but abandoned. That is unfortunate, because the climate engine is mind-bogglingly complicated, so an analytic approach (as pursued by computational climate models) is a priori hopeless.
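To put rough numbers on this heat-engine picture, here is a back-of-envelope sketch of the planetary entropy budget. All figures are standard textbook values, not from the comment itself, and the simple Q/T accounting ignores the 4/3 photon-gas factor for radiation entropy:

```python
# Back-of-envelope entropy budget for the climate "heat engine".
# All numbers are rough textbook values (assumptions, not measurements).
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
T_SUN = 5778.0        # effective solar photosphere temperature, K
Q_ABS = 240.0         # globally averaged absorbed solar flux, W m^-2

# Effective emission temperature: in steady state Earth radiates
# Q_ABS back to space as thermal radiation.
T_EARTH = (Q_ABS / SIGMA) ** 0.25   # about 255 K

# Net entropy production per square metre: low-entropy sunlight in,
# high-entropy thermal radiation out (ignoring the 4/3 photon factor).
ds_dt = Q_ABS * (1.0 / T_EARTH - 1.0 / T_SUN)

print(f"T_earth = {T_EARTH:.0f} K, entropy production = {ds_dt:.2f} W/m^2/K")
```

The point of such an estimate is only to show that almost all the entropy production comes from degrading ~5778 K sunlight to ~255 K thermal radiation, which is why the absorption step dominates the budget, as the comment below notes.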

But we have other means to approach such a system, like SOC (Self Organized Criticality), SAD (Sandpile Avalanche Dynamics), fractal geometry, MEPP (Maximum Entropy Production Principle), etc. The common trait is that they do not focus on minute details of the system, but treat it as a whole.

It is far from being clear how to proceed in this direction, nevertheless this is the only track that has a chance to bear fruit. With a non-linear system having an astronomical number of degrees of freedom any other approach is doomed to early failure.

For example it is crystal clear that most of the entropy production happens in the climate system when short wave radiation gets absorbed (either in the atmosphere, at the surface or at some depth in the ocean) and is converted to heat. It is also clear that the amount of flux absorbed depends on the planetary albedo, which is not constant. But it is not clear how MEPP or some related extremum principle could control the overall value of albedo (and also the effective emissivity of the system as seen from outside).

Anyway, the first thing to do is to actually *understand* what’s going on, by actively looking for a logical level on which the system is *understandable*.

BTW, I am pretty sure no one actually understands a computational climate model with its million lines of code, especially if it is not structured properly, not published, not documented and the code base itself is bogus in the first place.

ps. Fractal geometry is mentioned, because distribution of water vapor in the atmosphere is clearly fractal-like (even computer graphics people know that), its fractal dimension generally decreasing poleward. Calculating bulk quantities like reflectivity, absorptivity, emissivity, transmittance, etc. over fractals is not trivial and involves some mathematical insight. Simple gridding, if it is done on a single scale, can be quite misleading.
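Since box-counting over fractals comes up here, a minimal sketch may help. This estimates a box-counting dimension by pure least-squares on log counts; it is tested on a Sierpinski carpet, where the exact answer log 8 / log 3 ≈ 1.893 is known. Real cloud or water-vapor fields would need the multi-scale care the comment describes; this toy construction is mine, not from the comment:

```python
import math

def sierpinski_carpet(level):
    """Binary grid of size 3^level: a cell is filled unless some base-3
    digit pair of its coordinates is (1, 1) -- the removed middle."""
    n = 3 ** level
    grid = [[True] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            x, y = i, j
            for _ in range(level):
                if x % 3 == 1 and y % 3 == 1:
                    grid[i][j] = False
                    break
                x //= 3
                y //= 3
    return grid

def box_count(grid, box):
    """Number of box-by-box boxes containing at least one filled cell."""
    n = len(grid)
    count = 0
    for bi in range(0, n, box):
        for bj in range(0, n, box):
            if any(grid[i][j]
                   for i in range(bi, bi + box)
                   for j in range(bj, bj + box)):
                count += 1
    return count

def box_dimension(grid):
    """Least-squares slope of log N(box) versus log(1/box size)."""
    n = len(grid)
    xs, ys = [], []
    box = 1
    while box < n:
        xs.append(math.log(n / box))
        ys.append(math.log(box_count(grid, box)))
        box *= 3
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Gridding at a single scale would give just one of these (box, count) pairs; the dimension only emerges from the slope across scales, which is the pitfall the comment warns about.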

Unfortunately I’m not enough of an expert on these ideas (yet?) to feel confident that BIRS would approve an application by me for a workshop on this topic. I have a long history of studying diagrammatic techniques in physics, and I’ve been writing a series of expository articles about more practical applications of these ideas for a while now, so that seems like a better bet.

But if someone else organizes a workshop on nonequilibrium thermodynamics, or something like that, I’d love to go!

This is not entirely true. Tidal effects have a major role in the climate system, they provide a good portion of the (mechanical) energy needed for deep turbulent mixing of oceans (the other player is internal waves generated by wind stress, mostly over the southern ocean).

Without this (ill-quantified) deep turbulent mixing, which happens at some continental margins and mid-ocean ridges over rugged bottom features, the MOC (Meridional Overturning Circulation) would eventually stop, or at least slow down to a crawl supported only by geothermal heating at the bottom.

For the MOC is not a heat engine, it does not convert thermal differences to work (mechanical energy), but it’s the other way around. If different parts of a fluid are heated and cooled at the same gravitational potential (depth/height), that never induces any macroscopic flow whatsoever.

Or, as cooling always happens close to the surface (even if one takes surface waves into account) while some heating goes down to several hundred meters (water is pretty transparent to light, especially in the UV and blue segments of the spectrum), it would provide for a very shallow overturning perhaps, unlike what is observed.

Therefore tidal coupling (and braking) can’t be dismissed. True, it is only ~150 GW on average, but it is a pure mechanical energy input, which rearranges heat, salts, nutrients, dissolved oxygen & carbon dioxide in the oceans before dissipating (into a minuscule amount of heat).

I fired off the proposal to BIRS today. Another person emailed me to express interest in the workshop: Michael Dietze of the Department of Plant Biology at the University of Illinois at Urbana-Champaign. His homepage says:

The environmental sciences have progressively found themselves thrust from their humble roots in natural history into the role of detecting, quantifying, and predicting the interactions between humankind and our natural environment. We face a future where there is clear and growing demand for quantitative ecological forecasts with accurate assessments of uncertainty at the local, national, and global level. One of the primary goals of my work is to produce ecological forecasts by combining innovative ecological models with cutting-edge statistical and computational techniques and integrating diverse sources of data across many spatial and temporal scales. Forecasting is not merely an exercise in modern information technology, but requires tackling a number of basic research questions. At the forefront of these is the need to go beyond studying individual sites in isolation in order to understand the generalities across ecological systems. Basic science questions are what ultimately drive my research: how do species coexist; what are the relative contributions of biotic interactions, abiotic factors, and disturbance in structuring ecosystems; and to what extent are ecosystem dynamics predictable versus determined by individual history and chance events? I am interested in understanding the universal constraints on vegetation dynamics through the integration of cross-site studies and focused field campaigns with cutting-edge models and modern statistical techniques. Overall my research is focused on the interacting roles of environmental heterogeneity, disturbance, and climate change in structuring forest dynamics.

Recent projects have focused on forest dynamics in the eastern and central U.S. at the stand, landscape, and regional scales. Past projects have also involved work in Costa Rica, Australia, and the Pacific Northwest. In addition, I am starting work on biofuels that will look at the suitability and sustainability of different biofuel crops, their vulnerability to climate variability, their impacts on carbon storage and the water cycle, and the potential land use/land cover changes of biofuel expansion.

Today I got an email from Gheorghe Craciun expressing interest in the BIRS workshop.

Craciun helped prove the exciting theorem that Brendan and I discussed in Part 9 of the Network Theory posts. He works in the Department of Mathematics and the Department of Biomolecular Chemistry at the University of Wisconsin. He studies biochemical networks and biological interaction networks. He’s shown that some graphs associated to biological interaction networks give information about the qualitative properties of the associated dynamical systems: multistability, oscillations, persistence, and global stability. This is exactly the sort of thing I’d like to hear about at the workshop, if it’s approved: using diagrams to help understand complex systems.

My proposal for a workshop at Banff in 2013 was turned down, but they invited me to reapply for 2014, and I just did.

I’ll definitely be holding a special session at the Fall 2013 Meeting of the AMS Western Section here at U.C. Riverside on Saturday and Sunday, November 2 and 3, 2013. But more about that later!

Here’s the new proposal. Wish me luck:

Title of proposal: Diagrammatic Techniques for Networked Systems.

Type of meeting: 5 Day Workshop (20 participants).

Overview of the subject area of the workshop: This workshop is about diagrammatic languages for describing complex networks made of interacting parts, particularly stochastic Petri nets, chemical reaction networks, and Bayesian networks. While these languages have arisen independently, mathematicians are now able to study them systematically in a unified way, thanks in large part to recent developments in mathematical physics. Stochastic Petri nets and chemical reaction networks are in fact equivalent formalisms, the first from computer science and the second from chemistry, for describing how collections of entities of various kinds randomly interact and turn into other entities. Bayesian networks are important in probabilistic reasoning, especially machine learning. All these formalisms involve graph theory interacting with probability theory in related ways.
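As a concrete illustration of the equivalence mentioned above: both stochastic Petri nets and chemical reaction networks describe the same kind of continuous-time Markov process, which can be simulated with Gillespie’s standard stochastic simulation algorithm. A minimal sketch (the species names, rates, and data representation here are made up for illustration):

```python
import random

def gillespie(state, reactions, t_max, seed=0):
    """Gillespie simulation of a chemical reaction network, i.e. a
    stochastic Petri net.  state maps species to token counts;
    each reaction is (inputs, outputs, rate) with inputs/outputs
    mapping species to stoichiometric coefficients."""
    rng = random.Random(seed)
    t = 0.0
    history = [(0.0, dict(state))]
    while t < t_max:
        # Mass-action propensity: rate times the number of ways to
        # choose the input tokens from the current state.
        props = []
        for inputs, _, rate in reactions:
            a = rate
            for sp, k in inputs.items():
                for i in range(k):
                    a *= max(state[sp] - i, 0)
            props.append(a)
        total = sum(props)
        if total == 0:
            break                      # no reaction can fire
        t += rng.expovariate(total)    # waiting time to next event
        r = rng.uniform(0, total)      # pick a reaction by propensity
        for (inputs, outputs, _), a in zip(reactions, props):
            if r < a:
                for sp, k in inputs.items():
                    state[sp] -= k
                for sp, k in outputs.items():
                    state[sp] = state.get(sp, 0) + k
                break
            r -= a
        history.append((t, dict(state)))
    return history

# A toy SIR epidemic written as a reaction network:
#   S + I -> 2I  (infection),   I -> R  (recovery)
run = gillespie(
    {"S": 99, "I": 1, "R": 0},
    [({"S": 1, "I": 1}, {"I": 2}, 0.005),
     ({"I": 1}, {"R": 1}, 0.1)],
    t_max=200.0)
```

The same data structure read as a Petri net has species as places, reactions as transitions, and the coefficients as arc multiplicities, which is exactly the translation between the two formalisms.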

Statement of the objectives of the workshop: Diagrammatic languages for describing complex networks made of interacting parts are important practical tools in many branches of applied mathematics, biology and chemistry. However, practitioners in different fields often wind up ‘reinventing the wheel’. Recently mathematicians have made great progress in formally understanding these languages – but so far, their work has been applied mainly to rather abstruse branches of physics, such as topological quantum field theory. The workshop will connect mathematicians to chemists and biologists so the mathematicians can better understand the practical problems and the scientists can learn about the new tools that are available for working with stochastic Petri nets, chemical reaction networks, and Bayesian networks. For example, techniques from quantum field theory have been exploited to give new proofs of the Anderson-Craciun-Kurtz theorem and deficiency zero theorem in chemical reaction network theory, but the limits of these techniques are not yet clear, and only a few researchers in the theory are familiar with these techniques.
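For readers unfamiliar with the deficiency zero theorem mentioned above: the deficiency of a reaction network is the integer n − ℓ − s, where n is the number of complexes, ℓ the number of linkage classes, and s the rank of the stoichiometric subspace. A minimal sketch of the computation (the data representation is my own, not from any standard library):

```python
from fractions import Fraction

def deficiency(reactions, species):
    """Deficiency n - l - s of a chemical reaction network.
    Each reaction is (reactant_complex, product_complex), a complex
    being a dict mapping species to its coefficient."""
    def key(c):
        return tuple(sorted(c.items()))

    # n: number of distinct complexes appearing in the network.
    complexes = sorted({key(c) for r in reactions for c in r})
    index = {c: i for i, c in enumerate(complexes)}
    n = len(complexes)

    # l: linkage classes = connected components, via union-find.
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for a, b in reactions:
        parent[find(index[key(a)])] = find(index[key(b)])
    l = len({find(i) for i in range(n)})

    # s: rank of the reaction vectors (product minus reactant),
    # by exact Gaussian elimination over the rationals.
    rows = [[Fraction(b.get(sp, 0) - a.get(sp, 0)) for sp in species]
            for a, b in reactions]
    s = 0
    for col in range(len(species)):
        piv = next((r for r in range(s, len(rows)) if rows[r][col] != 0),
                   None)
        if piv is None:
            continue
        rows[s], rows[piv] = rows[piv], rows[s]
        for r in range(s + 1, len(rows)):
            if rows[r][col] != 0:
                f = rows[r][col] / rows[s][col]
                rows[r] = [x - f * y for x, y in zip(rows[r], rows[s])]
        s += 1
    return n - l - s
```

For the reversible reaction A + B ⇌ C there are two complexes, one linkage class, and rank one, so the deficiency is zero and the deficiency zero theorem applies.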

Workshop press release: Scientists have developed many ways to use diagrams to study complex systems made of interacting parts. The most familiar might be an electrical circuit diagram, but many other kinds of diagrams are used in biology, chemistry, and computer science. Unfortunately, workers in one field don’t always know what workers in another field have done. So, it’s up to mathematicians to talk to all these scientists and start thinking about diagrams for complex systems in a unified way. We’re starting to do this, and the results are interesting: for example, ideas from particle physics can be used to study chemical reactions and population biology!
