I mentioned before that I’m teaching a new course this term, a first-year undergraduate seminar course on climate computing. Course starts next week, so I’m busy putting together some material. It’s intended to be pretty open ended, as it’s a small group seminar, and we can jump into topics that the students are interested in. So I put together a core set of topics I want to cover, and then a long list of other possible topics. Here’s what I have so far:

Core topics:

Part 1: Background & History

Week 1: Climate Science BC (Before Computers), in which we cover Fourier, Tyndall, Arrhenius, Callendar, and the discovery of global warming. We’ll introduce the key concepts for understanding how the physical climate system works.

Week 2: Taming Chaos in which we talk about Bjerknes and Lorenz, look at the basic equations for modeling the atmosphere, and give a gentle introduction to chaos theory.
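(As an aside for readers who want a taste of the chaos part: Lorenz's famous 1963 three-variable convection system is small enough to sketch in a few lines of Python. This is purely illustrative, not course material, and uses a crude forward-Euler integrator; the parameter values are Lorenz's canonical ones.)

```python
# Illustrative sketch: Lorenz's 1963 system, the classic demonstration
# of sensitive dependence on initial conditions ("the butterfly effect").
def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (crude but simple)."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def trajectory(x0, steps=6000):
    """Integrate forward from (x0, 1, 1) and return the final state."""
    x, y, z = x0, 1.0, 1.0
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

a = trajectory(1.0)
b = trajectory(1.000001)  # perturb the start by one part in a million
print(a)
print(b)  # after enough steps, the two runs bear no resemblance to each other
```

Two runs differing by one part in a million in their start state end up in completely different places, which is exactly why weather forecasts have a horizon of days, not months.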

Week 3: The Giant Brain in which we talk about von Neumann, Charney, ENIAC and the first general circulation models.

Part 2: Basics of Climate Modeling

Week 4: Inside the simulation (also known as “Computational Fluid Dynamics on a Rotating Sphere for beginners”), where we look at the major elements of a climate model, including grids, dynamics, radiation, parameterizations, etc.

Week 5: What happens at the Boundaries? in which we look at the physical boundaries (land and ocean, the boundary layer), spatial boundaries (subgrid processes), temporal boundaries (start states, long runs, etc), and climate state boundaries (forcings, emissions scenarios, etc), and talk about the difference between Initial Value and Boundary Value problems.

Week 6: Towards Earth System Models in which we look at the growing complexity of modern GCMs, and the trade-offs between more resolution, more earth system processes, and more complexity.

Part 3: Choosing and Using Models

Week 8: On the Catwalk in which we look at the vast range of different types of models that are available, and what they are used for.

Week 9: Experimentation in which we look at what’s involved in running a model, and the kinds of experiments you might do with one. If we’re lucky, we’ll get to try running some experiments for real.

Week 10: How good are the models? in which we look at how to test and validate models, what it means to “do science” with a model, what model inter-comparison projects tell us, and some of the weaknesses of current earth system models.

Part 4: What we can know, and what we can do

Week 11: Knowledge and Uncertainty, in which we talk about what we know and what we don’t know about climate change, sources of uncertainty, and whether we can predict the future. We’ll also explore how climate models interact with other types of knowledge about global climate change.

Week 12: Decisions, Decisions, Decisions in which we look at what policymakers need, and what they get. We’ll talk about the IPCC process, and maybe a bit about some of the policy options. We’ll talk about the need for better predictions of climate extremes, and regional impacts. And we’ll look at the difference between GCMs and IAMs.

Week 13: Enough talk, time for action! in which we face up to the question: given that we now have to learn how to manage the earth’s climate system, what should we be doing about climate change, and what other tools do we need in order to be successful?

Additional Topics:

We can include any of these based on interest and enthusiasm (but probably not all of them!). Some of these stray away from the “computing” theme of the course, so we might need to agree on some criteria for which ones to include. In no particular order:

@Jono: I’m thinking 3-4 assignments spread across the term:
(1) research one of the extra topics and write a blog post on it (I’ll post the best as guest posts here)
(2) in teams of 2-3, put together a 10-minute powerpoint presentation on one of the topics from the course
(3) design an experiment, justifying the design itself, and the choice of model to run it on
(4) Final assignment will be an essay of some kind.
But I welcome comments & suggestions.

(3) sounds very appropriate for the course and can trigger some interesting discussions outside of class! I like your idea for (1); it might be a bit much, but since the course will have a lot of discussion and less in the way of prepared material, perhaps two students each week could be assigned the task of writing a blog post about the class discussions: 2 students/week x 12 weeks = 24 students. That would be useful for students and curious onlookers such as myself!

Off the top of my head, I have two essay suggestions, but neither of these seems particularly suitable for a final assignment.

In keeping with the workload of the assignments you’re considering, it might be fun to have students pick apart a newspaper article on climate change and assess it on its scientific merits and accuracy. The more mainstream the newspaper the better, just to show students that you can’t believe everything you read (if they don’t already know that). It may help to restrict the choice of article for the purposes of marking.

Though maybe shorter than you’d like, students could come up with a metaphor for something discussed in class. For example, we have the bathtub metaphor for GHG production/sinks or the computer-as-a-brain metaphor. These metaphors have their own problems, but given the diverse background of the students of the seminar course (or at least the lesser degree of indoctrination), I’d be willing to bet that at least a few of the students will come up with a very original and apt metaphor! This could result in a contribution to the broader community: a better way to explain esoteric ideas to the general public!

Will you use existing climate models only, or will the students program some toy models themselves?
What background in math and physics do the students have? Do they know e.g. what the Navier-Stokes equations are and how to program a finite element approximation to a partial differential equation?

@Tim: they’re first year undergrads from anywhere across the whole of the Faculty of Arts and Sciences. I suspect many of them won’t even have mastered calculus. So, I’m aiming to give them an appreciation of the numerics, without expecting a detailed understanding. I’ve noticed books on climate change do one of two things: they launch into the equations, expecting only computational physicists to follow them, or they avoid equations altogether. I’m going to try something different: I want to give the students the right intuitions for how the maths works, even when they don’t have any mathematical sophistication.
And, no I won’t be getting them to program anything.

Steve, would you please conduct a quick survey asking members of your class what their intended major(s) and minor(s) are, and post the results here? I’m also curious as to why students chose this seminar course (wanting to understand the science better, wanting to be able to hold informed discussions with people in the area, etc.), but collecting/compiling that data might take too much time. Hopefully this doesn’t need to go through an IRB…

A very short discussion of that seems to belong just before you really get into models, for context. I have all too often encountered people who are absolutely sure that models are everything, but no good, for reasons that vary by discipline.

2) Will you somewhere early introduce a clear idea of model hierarchies as successive approximations, sometimes requiring more compute power? (That sort of meshes with the supercomputing topic.) Some history might be useful for folks (and the brand-new 25,000-sq-ft exhibition opens next week at the Computer History Museum). Also, I assume you know Supercomputing and the Transformation of Science, which may be old (1993), but is well-done and generally still relevant. The chart on p.33 might be useful.

The reason I suggest this is that students (naturally) quite often have little sense of computing progress. (I took a gang of first-term Stanford freshmen in the accelerated CS program around the Museum. They asked about the keypunches. I said we used them to punch cards. They asked why. I said: that’s how we wrote code. The result: disbelief, much muttering of “NO WAY, nobody would do that, that would be crazy,” and late whispers of “they actually used this junk?”)

An iPhone 4 has (I think; I haven’t run it myself) about 2X the Linpack performance of a Cray-1, circa 1976, and much more memory.

John: Thanks for the links. I agree on all points. The “other sources of evidence” thing needs to be front and centre. I like the slide Jim Hansen uses in most of his talks these days, that explicitly puts models in third place as a source of evidence, after paleoclimate and current observations. Part of my mission will be to explain the role of models: i.e. not to prove climate change exists and is serious, but to improve our understanding of the processes involved.
The model hierarchies issue is central to my plan for week 8 (“on the catwalk”)….

Oh, and my favourite demonstration of the power of a cellphone: PHONIAC

Sounds like a fascinating course to teach, I like that you are getting them to express themselves on the topic with their own blogs. Is this course part of a larger program at UofT? I’ll be paying attention, especially to the parts near the end of your syllabus on uncertainty and planning.

Are you going to be giving them any simplified modeling tools to play with, to see the difficulties scientists face in making predictions about such a complex system? There probably aren’t any such tools; hopefully you’ll share any ideas you get while doing this course about what kinds of models would be useful to climate science in this way.

Oops, one more. I’m reminded of a conversation with NCAR folks long ago, who pithily said they were limited by:

1) Having the data

2) Knowing the science

3) And having the {memory, CPU power, I/O} to do the computing.

This is kind of a Liebig’s Law for supercomputing: you’re limited by whichever item is in least supply. Of course, you never have data from the future, and often you can’t get what you’d like from the past either.

Item 3) implies that it is a good idea to use the {simplest, smallest, fastest} model that’s good enough. I remember a great demo by astrophysicist Paul Woodward where item 3 really mattered. He was doing big computational fluid dynamics problems, and showed:

Grid elements of size X^3 were simply too large to show turbulent flows.
But grid elements of (.5X)^3 did. That’s 8X more memory.

However, at some point, shrinking the grid elements further doesn’t really help.
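(The arithmetic behind that 8X is worth making explicit. Halving the edge length of each 3D grid cell multiplies the cell count, and hence the memory, by 2³ = 8; and since explicit schemes typically need the timestep to shrink with the cell size (the CFL condition), total compute grows even faster. A hypothetical back-of-the-envelope sketch, with the function name my own invention:)

```python
# Back-of-the-envelope scaling for grid refinement (illustrative only).
# Shrinking each cell edge by `factor` multiplies the cell count by
# factor**3 in three dimensions; the CFL condition typically forces the
# timestep to shrink by `factor` too, so compute grows as factor**4.
def refinement_cost(factor):
    """Relative memory and compute when each cell edge shrinks by `factor`."""
    cells = factor ** 3   # 3 spatial dimensions
    steps = factor        # timestep shrinks roughly linearly with cell size
    return {"memory": cells, "compute": cells * steps}

print(refinement_cost(2))  # {'memory': 8, 'compute': 16}
```

So Woodward’s halved grid cells cost 8X the memory and roughly 16X the compute, which is why “the simplest, smallest, fastest model that’s good enough” is such sound advice.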