2015-10-26

David Roberts had an interesting article in Vox recently which discusses the very severe limitations of the sorts of models economists typically use to study climate change (Integrated Assessment Models or IAMs).

(The title of the article is awful, but I’ve learned not to blame headlines on writers. This stupid hangover from the days of paper is inexplicable for a medium like Vox, but let’s give David the benefit of the doubt on this and presume somebody else slapped the title on.)

I briefly discuss his main point, with which I agree, here and here.

My concern in this article is somewhat tangential. David begins with a simplification of the climate quandary which is widely accepted. I appreciate his clear statement of this model, as it enables me to call it into question, and gives me an opportunity to communicate some ideas that I’d like to share.

My point is peripheral to David’s argument (again, I mostly agree with him) but I think there’s a relevant piece of science communication that may be of some value.

ROBERTS’ UNCERTAINTY LOOP

Here’s the opening, which hopefully Vox won’t climb all over my rear for quoting in full:

Now that climate hawks are emerging a bit from their defensive crouch, however, more attention is turning to the many uncertainties that haunt climate. Consider these layers:

1. To begin with: How will human economic activity this century translate into greenhouse gas emissions? How much will we emit? To answer that, we need to know how much population will grow, how much the global economy will grow, what per capita emissions will look like in 2050, 2080, etc.

2. Which leads to: How will a rise in greenhouse gases translate into a rise in global average temperature? How sensitive is climate to greenhouse gases? (In the biz, “climate sensitivity” refers to the rise in temperature that would result from a doubling in global greenhouse gases from pre-industrial levels.)

3. Which leads to: How will a rise in global average temperature translate into climate impacts (rising sea levels, etc.)? How do systems like ocean and air currents respond to temperature? What kinds of responses will be seen in different subclimates and latitudes?

4. Which leads to: How will the impacts of climate change translate into impacts on human lives and economies? In other words, how much will climate impacts hurt us? How much GDP growth will they thwart (or reverse)? Will future people be richer and better able to adapt, or poorer because of climate change itself?

The really funny thing? The answer to 4 depends on the answer to 3, which depends on the answer to 2, which depends on the answer to 1, which depends on … the answer to 4.

It’s a loop. An uncertainty loop!

And he provides a little cartoon of the loop, which looks like this:

[Image: David Roberts’ cartoon of the uncertainty loop, points 1 through 4 feeding back into one another]

THE DECIDE/PREDICT PROBLEM

My first problem with this meta-model is that it is disempowering.

That disempowerment is key to what I and many others perceive as a small but important decline in David’s perspicacity as he becomes a more political being in the heady atmosphere of vox.com. Somewhere in closing the loop from point 4 back to point 1, there’s a missing policy sector. How will we decide to respond to this situation? How should we decide?

Now, to be sure, the whole article is about the interface between academia and politics. It’s not as if David is being a total libertarian fatalist. But there’s a fatalism built into the construction of the loop.

Policy is a control problem, not a prediction problem.

The lack of an explicit policy input into the model is a side effect of the effort of economists and “political scientists” to look like scientists. Just physics envy.

It’s my opinion that political analysis and economics should not be construed as pure descriptive sciences. They are applied sciences, more akin to engineering and medicine than to physics.

There’s nothing so toxic to human discourse as this talk of “iron laws”. One does not predict what book to read next. One decides.

We do not predict what policy to implement. We decide. The point of economics ought to be to give us an array of policies among which to decide, not just to describe the system on which those policies act. The point of political science is to improve our collective skill at identifying goals and picking policies, not just to guess how we will most likely fail.
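To make the decide/predict distinction concrete, here is a minimal sketch in Python of policy framed as a feedback controller rather than a forecast. Every number in it (the target, the baseline growth, the gain) is an illustrative assumption of mine, not a calibrated value:

```python
def simulate_policy(years=100, target=450.0, gain=0.05):
    """Toy proportional controller: each year we *decide* how hard to cut
    emissions based on how far concentration has overshot a chosen target,
    rather than merely predicting where the uncontrolled system drifts."""
    conc = 400.0       # ppm CO2, a round illustrative starting point
    baseline = 2.5     # ppm/yr growth with no policy (made-up number)
    for _ in range(years):
        overshoot = max(0.0, conc - target)
        cut_fraction = min(1.0, gain * overshoot)   # the policy decision
        conc += baseline * (1.0 - cut_fraction)
    return conc

print(simulate_policy())               # climbs toward a ceiling (~470) set by the decision
print(simulate_policy(target=500.0))   # a different decision, a different endpoint (~520)
```

The endpoint is dominated by the target and the aggressiveness of the response – both decisions – not by any forecast of the uncontrolled system. That is the sense in which policy is a control problem.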

The history of the past several decades has been substantially impacted by the failures of these disciplines to rise to the occasion. It’s painful to watch David’s horizons shrinking as he becomes a political “realist”.

(Tell Justin Trudeau to be more realistic! Change is possible.)

ALL PHYSICS IS LOCAL

My second problem is that David’s formulation overvalues the temperature sensitivity.

This isn’t unique to David by any means. It’s endemic to the conversation. But it’s also pernicious. I’d like to loosen the grip of this number on our discourse, not because it is uncertain but because it is rather well-constrained and the real difficulties lie elsewhere.

I wrote about this a few months back. Here is the crux of it again, slightly edited.

Our best understanding of the amount of fossil fuel there is available and how CO2 works leads us to an expectation that burning all that fuel will cause a severe disruption of all natural systems (even disregarding the many other insults we are visiting on Nature) and many human systems as well. This understanding is not primarily based on observations, nor on complicated computer models, but on our theoretical understanding of physics and on paleontological evidence.

This understanding traditionally is boiled down to a number, “the sensitivity”. Now, in general, “a” sensitivity is the ratio of an output to an input in a system. An electrical engineer would refer to the “gain” or “amplification factor”. In our case “sensitivity” is a measure of this question: if we put in so much change in CO2, how much global temperature change will we get out?
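In symbols, using the standard logarithmic approximation for the CO2 response (a sketch; the sensitivity here is the familiar degrees-per-doubling number):

```python
import math

def equilibrium_warming(conc_ppm, sensitivity=3.0, preindustrial=280.0):
    """Equilibrium warming under the standard logarithmic response:
    each doubling of CO2 adds `sensitivity` degrees C."""
    return sensitivity * math.log2(conc_ppm / preindustrial)

print(equilibrium_warming(560.0))   # one doubling -> 3.0 C
```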

Before climate change became obvious, when it was merely a prediction, the sensitivity was a good thing to focus on. (There still are good reasons for scientists to think about it.) The presumption was that, all else equal, the hotter scenarios were responding more sensitively than the less hot ones.

The scale separation between what climate models resolve and actual impacts is too big for most applications; it was hard to know what so much warming implied for climate change in real scenarios. The sensitivity was a good preliminary parameter for how much trouble we would be in for a given emission scenario.

But this got us into backwards thinking and a backwards way of speaking. We started to speak as if global warming causes climate change, as if the number of degrees of warming were diagnostic in some sense of what would happen to us. Being humans we got wrapped up in the symbol and forgot the reality.

Greenhouse gases cause radiative transfer processes to change. These changes cause energy to accumulate in the system as it seeks a new equilibrium. The climate changes, in turn, to respond to the redistributed energy. And one of the many many consequences is, probably, an increase in global mean surface temperature (GMST). But there may be other consequences we care about!

The real sensitivity we care about is damage per unit of carbon emitted. That damage is caused directly by climate change, not by GMST.

Changing the radiative properties of the atmosphere changes the dynamics at each point in space. That changes the weather. The accumulated weather changes amount to a global climate change. The mean temperature measures the amount of change, but both the causes and effects of greenhouse gas response happen locally. (That’s why weather and climate models, which specify only local properties, allow a global pattern to successfully emerge.)

All physics is local.
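To illustrate the point, a toy sketch: each cell of a grid updates only from its immediate neighbors (simple diffusion), yet a smooth global pattern emerges. Weather and climate models are, at heart, vastly more elaborate versions of this:

```python
def diffuse(temps, rate=0.25, steps=500):
    """Purely local update rule: each cell responds only to its neighbors."""
    for _ in range(steps):
        left = [temps[0]] + temps[:-1]     # insulated boundaries
        right = temps[1:] + [temps[-1]]
        temps = [t + rate * (l - 2 * t + r) for l, t, r in zip(left, temps, right)]
    return temps

# A single hot cell relaxes into a broad, smooth global profile.
print(diffuse([0.0] * 20 + [10.0] + [0.0] * 20))
```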

Consider most of the bizarre events of the past few years – the Australian megadrought, the Russian fires, the Pakistan floods, the Texas heatwave, Sandy, etc. Each of these and many others was associated with a phenomenon called “blocking” wherein the jet stream develops huge, sluggish meanders, delivering “the wrong air at the wrong time” to some large area. There is considerable evidence that this phenomenon has become more prevalent in recent years. It is especially associated with an increase in local heat events.

Notice that one way the system can avoid increasing its average temperature is by making the temperature change more unevenly distributed – an extremely hot place far from the pole can radiate so effectively as to more than balance out a comparably cool place near the pole. This is in fact one sort of climate change that fills the bill of equilibrating the energetics without changing the mean temperature much.
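A back-of-envelope check on that claim, using the Stefan-Boltzmann law (the ±10 K split is just an illustration I chose):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def mean_emission(t1_kelvin, t2_kelvin):
    """Average radiated power of two equal-area patches."""
    return SIGMA * (t1_kelvin**4 + t2_kelvin**4) / 2

uniform = mean_emission(288.0, 288.0)   # uniform surface at 288 K: ~390 W/m^2
uneven = mean_emission(298.0, 278.0)    # same 288 K mean, unevenly distributed
print(uneven - uniform)                 # ~2.8 W/m^2 more radiated, mean unchanged
```

Because T to the fourth power is convex, the uneven world sheds measurably more energy at the same mean temperature – for scale, that ~2.8 W/m² is of the same order as the forcing from a CO2 doubling.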

It’s pretty clear that even if global warming has in some sense “gone away” or is “on hiatus”, the world is no longer producing the reliable weather that it produced over the period of human history. Most likely, this is only the beginning.

It’s a bit crazy to still be quibbling about sensitivity in policy circles. We should pay more attention to the stuff that is starting to hit the fan.

OVERSENSITIVITY

The sense in which “sensitivity” is meant above is the usual one in the climate debates – the equilibrium temperature change in response to a forcing, generally expressed as degrees C per CO2 doubling.

(It’s really per CO2-equivalent doubling, which in turn is a slight but useful fuzzing of the more solid concept of degrees C per watt per square meter top of atmosphere forcing, the logarithmic response of watts forcing to CO2 being itself an approximation. Apologies for the pedantic parenthetic.)
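Unpacked, the chain of approximations in that parenthetical looks like this (the 5.35 coefficient is the standard logarithmic fit for CO2 forcing):

$$\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}, \qquad F_{2\times} = 5.35\,\ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}, \qquad S \approx \lambda\, F_{2\times},$$

where $\lambda$ is the response in °C per watt per square meter of top-of-atmosphere forcing and $S$ is the familiar degrees-per-doubling number.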

This sensitivity is a very useful organizing concept in paleoclimatology and in idealized models. But I think it is overvalued as a point of contention in the policy debates.

A key point in the uncertainty question is that the sensitivity is quite well constrained (between 2.5 and 3 C per CO2e doubling is very likely) and very unlikely to be in error by more than a factor of two. Even if one is convinced that there is a 2-fold overstatement that has somehow persisted as the central estimate for over 35 years, that’s not enough to delay policy substantially, because the picture is risk-dominated. Because we are so overdue for a rational response to greenhouse gas accumulation, realistic uncertainty about climate sensitivity is not policy-relevant.
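A rough calculation shows why (a sketch using the logarithmic response and round numbers of my choosing):

```python
import math

def ceiling_for_target(target_c, sensitivity, preindustrial=280.0):
    """CO2e concentration at which equilibrium warming hits the target,
    under the standard logarithmic response."""
    return preindustrial * 2 ** (target_c / sensitivity)

print(ceiling_for_target(2.0, 3.0))   # ~444 ppm CO2e for a 2 C target
print(ceiling_for_target(2.0, 1.5))   # ~706 ppm even if sensitivity were halved
```

With CO2-equivalent concentrations already approaching or past the first ceiling (depending on how you count the non-CO2 gases), even an implausibly halved sensitivity buys only a few decades at current emission growth. The decision – decarbonize – is the same either way.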

Temperature sensitivity to greenhouse gases is really not a key part of the uncertainty loop. All the arguing about it is just Benghazi; the foot-draggers have a whole array of red herrings worked up and they’ll be damned if they’re going to stop flinging them around now. But it just doesn’t matter.

All physics is local. David’s model that the causality is “greenhouse gases -> global temperature -> impacts” is mostly wrong.

One key impact is indeed closely related to global temperature: the thermal component of sea level rise. But even there the coupling is not tight – we see strong indications that the deep ocean is warming faster than we expected given the observed surface warming.

At the other end of the impact spectrum, ocean acidification is utterly disconnected from temperature. Indeed, you could argue that it’s inversely connected to temperature. A sensible way of looking at it is, across the believable set of earth models, the better a job the ocean does of absorbing CO2, the less the surface temperature rise and the greater the ocean acidification.

Ice melt depends not on global temperature alone, but on polar amplification. Conceivably there’s a world where greenhouse gases do little to change the global mean temperature, but all the increase is concentrated in polar regions, causing ice sheets to decay. Similarly, severe weather climatology depends on regional responses – the jet stream’s habits or the vertical structure may respond to regional or even local greenhouse-gas-driven perturbations.

And Roger Pielke Sr., whatever his other flaws, is correct to point out that other anthropogenic factors could be first order important in regional climate. Have you seen pictures of Indonesia lately?

THE BIG PICTURE

So this gives me an opportunity to reintroduce my map of the big picture of the greenhouse gas problem. It is more complicated than David’s, but it is similar in intent. I think the complication is justified by its improved accuracy, and I think it’s easy enough to understand.

It’s somewhat color coded: pink = geophysics, green = biology, purple = engineering, blue = social, yellow = control.

[Image: the author’s big-picture map of the greenhouse gas problem, with the color-coded bubbles described above]

Roughly speaking, there is only one simple number (one octagon) in this model – concentration. Everything else is complicated – whole ranges of professions are represented in each bubble. People spend lifetimes working hard in each of the bubbles; the couplings between them are fraught, and the dynamics of the whole system are ill-understood and a bit overwhelming.

WHY YOU SHOULD LOOK AT THAT PICTURE SOME MORE

I believe this model has two huge advantages over David’s simple representation, which I’ll revisit in the reverse of the order in which I mentioned them above.

There Are Multiple Sensitivities

In the more complicated diagram, the overvalued GMST “sensitivity” number is removed, effectively replaced by a plethora of direct response functions over the important impact phenomena. It removes David’s step 2 altogether.

In the past, The Sensitivity was a good shorthand for the problem (at least, neglecting ocean acidification, which everybody did for a long time). Now that we are getting a better picture of the individual impact sectors, and now that impacts are beginning to emerge, we need to abandon that simplification – the sensitivities run directly from the greenhouse gas concentration trajectory (and similarly from the other anthropogenic forcings) to the impact sectors. Running everything through global temperature was a simplification for an era when this was all theoretical. Now that we see impacts on the ground, global temperature is an overvalued intermediary.
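In code-shaped form, the shift is from one response function to many. Every function and coefficient below is a made-up placeholder standing in for an entire research field (as the diagram’s bubbles do), not a real model:

```python
def impact_profile(conc_path_ppm):
    """Sketch: each impact sector responds directly to the concentration
    trajectory; no global mean temperature intermediary appears anywhere."""
    peak = max(conc_path_ppm)
    cumulative = sum(conc_path_ppm)
    return {
        "sea_level_rise": 1e-4 * cumulative,   # tracks integrated heating
        "acidification": 1e-2 * peak,          # tracks CO2 itself, not temperature
        "ice_sheet_loss": 2e-4 * cumulative,   # tracks polar, not global, warming
        "severe_weather": 5e-3 * peak,         # tracks regional circulation shifts
    }

print(impact_profile([400.0, 420.0, 450.0]))
```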

One consequence of seeing the problem as having multiple sensitivities is to take our eye off the (probably illusory) fat tail of the GMST sensitivity, and put it back on the remaining fat tails of the various and diverse impact phenomena. Will things turn out well on all these fronts? In Piet Hein’s words, “Let us only hope, but not only only hope.”

The fever is not the disease. Global mean temperature is a crude representation of the extent to which we are in trouble. The basis for the 2 C target is not that there is a clear threshold of disaster. The basis for the 2 C target is that we missed the boat for a rational response to this problem decades ago, and 2 C is a way of saying “transition away from net emissions as quickly as we can manage”.

Human Responsibility is Explicit

A more crucial improvement comes back to David’s motivating question of uncertainties. Not one of David’s four uncertainty sectors captures the key uncertainty!

The most important uncertainty in the climate trajectory is what we decide, and when.

This is a point that people who look at climate through the lens of scenarios are constantly at pains to make, yet somehow it seems to get lost. I think it’s a consequence of the strange fatalism of right wing economics that increasingly permeates political discourse.

The point of calling our time the anthropocene is that Nature no longer controls the fate of the planet. We do. We are suddenly in the driver’s seat and we had better start driving.

The yellow bubble in the diagram is emphasized, because it remains the most important feature of the problem.

The dominant uncertainty in the final position of an automobile at the end of the day is what destination the driver aimed for.
