This is a super-wonkish supplement to my latest Bloomberg article—which you should go read and which discusses the importance of tail risk in the economics of climate change. Climate change is, in short, a tail risk problem. The real danger, or more specifically, the center of mass of the risk-weighted costs, lies in the low-probability, high-impact outcomes.

I explained this qualitatively in my Bloomberg article, but I wanted to loop back and think about this quantitatively and more rigorously for those who, you know, like this sort of thing.

So here's how you should think about climate change. (At least, this is how I think about climate change.) First, there's a probability distribution of temperature anomalies. Second, there's a cost function which expresses the welfare loss of climate change in terms of temperature anomaly. The product of these two functions is the risk-weighted cost function, which is what any cogent analysis addresses.

First, the probability distribution. We don't know the shape of the distribution, or at least we are highly uncertain. The probability distribution will determine the relative likelihoods of a 1.8°C increase in mean global surface temperature (the baseline forecast of the IPCC's AR4 report) as compared to something catastrophic, say, a 7.2°C increase.

For our math exercise here, I've selected five probability distributions: the normal (Gaussian) distribution (i.e. the "bell curve"), the Student's t distribution, the lognormal distribution, the Pareto distribution, and the Weibull distribution. I should note here that I have configured these distributions, where possible, to reflect the IPCC's statistics on the mean and standard deviation of the forecast.
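To make this concrete, here is a sketch of the five candidate distributions in Python, using `scipy.stats`. The central forecast of 1.8°C is from AR4; the spread and the shape parameters (degrees of freedom, lognormal sigma, Pareto and Weibull shapes) are illustrative assumptions of mine, not IPCC numbers:

```python
from scipy import stats

MEAN, SD = 1.8, 1.0  # assumed central forecast and spread, in deg C

distributions = {
    "normal":    stats.norm(loc=MEAN, scale=SD),
    "student_t": stats.t(df=3, loc=MEAN, scale=SD),  # fat tails via low df
    "lognormal": stats.lognorm(s=0.5, scale=MEAN),   # median at MEAN
    "pareto":    stats.pareto(b=2.5, scale=1.0),     # heavy right tail
    "weibull":   stats.weibull_min(c=1.5, scale=2.0),
}

for name, dist in distributions.items():
    # P(T > 6): how much weight each puts on a catastrophic tail
    print(f"{name:10s} P(T > 6C) = {dist.sf(6.0):.5f}")
```

Even with identical centers, the tail probabilities differ by orders of magnitude, which is the whole point of the exercise.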

A quick discussion of rationale. I'm picking the normal distribution because, frankly, I have a high degree of confidence that this is *not* what the probability distribution of temperature anomalies looks like. That, by extension, will allow us to see what the distribution of risk-weighted costs does *not* look like. If that seems counterintuitive, think of it as clarification by contradiction.

The Student's t-distribution is a little closer. (If you've taken a high-school lab science course, you've probably used this one before.) It looks a lot like the normal distribution, except it has much fatter tails—that is to say, it sees extreme outcomes as substantially more likely than the normal distribution. Since I think the normal distribution significantly underweights climate-change tail risk, I suggest to you that the t-distribution is the better of the two.

Now, the lognormal distribution. This has a substantially fatter tail than the normal distribution, and the reason temperature anomaly might be characterized by it is if the underlying mechanism comes as the product of several inputs. (The normal distribution supposes it is the sum of several inputs.)
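The product-of-inputs intuition is easy to demonstrate numerically: the log of a product is a sum, so the product of many positive random factors is approximately lognormal. The choice of uniform factors here is an arbitrary assumption for illustration:

```python
import numpy as np

# Multiply 30 random positive factors together, many times over.
rng = np.random.default_rng(0)
factors = rng.uniform(0.9, 1.2, size=(100_000, 30))
product = factors.prod(axis=1)

# If the product is ~lognormal, its log should be ~normal (skew near 0).
logs = np.log(product)
skew = ((logs - logs.mean()) ** 3).mean() / logs.std() ** 3
print("skew of log-product:", round(skew, 3))
```

The near-zero skew of the log is the signature of a lognormal: symmetric on a log scale, fat-tailed on the raw scale.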

Two more: the Pareto and Weibull distributions. I'm using the former because William Nordhaus invokes it as the prototypical fat-tailed distribution in the article I cited for Bloomberg. The Weibull distribution is included because it is apparently used in practice for just this sort of question: pricing insurance for extreme weather. (I didn't know of this distribution beforehand.)

So Part 1 is: Decide the possible probability distributions of temperature anomaly. Part 2: Decide the possible cost functions of temperature anomaly. That is, regardless of probability, what is the shape of the function C(T)?

I'll throw out a few likely modeling possibilities: linear, quadratic, quadratic-polynomial, and exponential.

The first is, again, what I *don't* think it is: a linear cost function, i.e., one where costs rise at a constant slope over temperature anomaly. But we'll look at it anyway to demonstrate the argument.

The second is a perfectly reasonable modeling choice. It suggests that the *rate* of cost increase is linear, and thus that costs increase quadratically over temperature anomaly.

The third comes from Nordhaus, who uses αT^2/(1+αT^2) as the shape of his cost function in his DICE climate model, with a low value of α, which creates a concave-up function within the relevant domain of temperature anomaly.

The fourth is exponential. There is a case for this one, but I should note that it is the most aggressive of these cost functions; that is, it sees high-temperature-anomaly outcomes as more damaging than any of our other options.
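The four candidate cost functions can be written down in a few lines. The α value below is the kind of small coefficient Nordhaus-style damage functions use, but treat it (and the exponential's exact form) as illustrative assumptions rather than anyone's published calibration:

```python
import numpy as np

ALPHA = 0.0028  # assumed small damage coefficient for the Nordhaus-style form

def linear(T):      return T                                  # constant slope
def quadratic(T):   return T ** 2                             # linear rate of increase
def nordhaus(T):    return ALPHA * T**2 / (1 + ALPHA * T**2)  # DICE-style shape
def exponential(T): return np.exp(T) - 1                      # zero cost at zero anomaly

cost_functions = {"linear": linear, "quadratic": quadratic,
                  "nordhaus": nordhaus, "exponential": exponential}

for name, C in cost_functions.items():
    print(f"{name:12s} C(2) = {C(2.0):.4f}   C(8) = {C(8.0):.4f}")
```

Comparing C(2) to C(8) across the four shows the spread in aggressiveness: linear quadruples, quadratic goes up 16-fold, and the exponential explodes.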

Let's review. We have five possible probability distributions of temperature anomaly and four possible cost functions of temperature anomaly. To find the set of risk-weighted cost functions, we multiply each probability distribution by each cost function. That means we have 20 possible risk-weighted cost functions.

The best way to compare them is to index the total risk-weighted cost of climate change within a range of temperature anomalies (here I assume a domain of 0°C through 10°C) to 1, and then look at the graphs of the risk-weighted cost functions.
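The normalization step looks like this for one of the twenty combinations. The particular distribution (a fat-tailed Student's t) and cost function (quadratic) are just one illustrative pairing, with parameters assumed as before:

```python
import numpy as np
from scipy import stats

# Grid over the assumed domain of 0C through 10C.
T = np.linspace(0.0, 10.0, 1001)
dT = T[1] - T[0]

pdf = stats.t(df=3, loc=1.8, scale=1.0).pdf(T)  # assumed fat-tailed forecast
cost = T ** 2                                   # quadratic cost function

# Risk-weighted cost, indexed so its total over [0, 10] is 1.
rw = pdf * cost
rw /= (rw * dT).sum()

total = (rw * dT).sum()
print("total risk-weighted cost:", total)  # 1.0 by construction
```

With every curve indexed to the same total, the graphs differ only in *where* the mass sits along the temperature axis, which is exactly the comparison we want.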

A graph of 20 functions will be hard to visualize no matter what I do, but here:

*(Note: "Nordhaus" is my shorthand for the second-to-last, quadratic-polynomial, cost function.)*

Calculating the center of mass of the risk-weighted cost function is a helpfully reductive way of seeing the importance of tail risk. The further out the center of mass is for the risk-weighted cost function, the more important the extreme outcomes are to our assessment of the cost of climate change.

By combining a probability distribution on the row and a cost function on the column, we can determine the center of mass of the risk-weighted cost.
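The center of mass is just the risk-weighted mean temperature: the integral of T·f(T)·C(T) divided by the integral of f(T)·C(T) over the domain. A sketch, again with assumed parameters, comparing a thin-tailed and a fat-tailed distribution:

```python
import numpy as np
from scipy import stats

T = np.linspace(0.0, 10.0, 2001)  # assumed domain of 0C through 10C

def center_of_mass(pdf_vals, cost_vals):
    # Integral of T * f(T) * C(T) over integral of f(T) * C(T),
    # approximated on the grid (the common grid spacing cancels).
    rw = pdf_vals * cost_vals
    return (T * rw).sum() / rw.sum()

normal = stats.norm(loc=1.8, scale=1.0).pdf(T)
fat_t  = stats.t(df=3, loc=1.8, scale=1.0).pdf(T)

for name, pdf in [("normal", normal), ("student_t", fat_t)]:
    for cname, C in [("linear", T), ("quadratic", T ** 2)]:
        print(f"{name:9s} x {cname:9s}: {center_of_mass(pdf, C):.2f} C")
```

Fattening the tail or steepening the cost function both drag the center of mass rightward, away from the central forecast and toward the extreme outcomes.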

What we see is that the plausible fat-tailed probability distributions (Student's t, lognormal, and Pareto), combined with a plausible cost function (in my opinion, quadratic), give us some very unpleasant centers of mass for the risk-weighted cost.

Assume that, by rough average, the center of mass of the risk-weighted cost occurs at 7°C. (Remember that we have nothing more than one significant figure right now, given this back-of-the-envelope analysis.) That tells us that 1.8°C, what the IPCC sees as the most likely outcome, is really *not* what we should be worrying about. What we need to be worrying about, and better yet doing something about, is reducing the probability or cost of these tail outcomes.

It's important to think of the probability and cost questions as distinct. We could reduce the probability of tail outcomes with conventional measures like a carbon tax, a cap-and-trade system, CAFE standards, or emissions regulation. Trying to reduce the cost of a tail outcome leads us to some rather extreme options, like relocating the citizens of the Maldives or building sea gates for New York City. Of course, it is likely that we will end up doing some of both: probability reduction and cost mitigation.

That's the background story on the Bloomberg post. I fundamentally see climate change as a global problem of tail risk. Anyone who doesn't, as this post implies, is assuming an implausible temperature-anomaly probability distribution or cost function.

Scott Sumner argues that the Fed ought to create an NGDP futures market, so that it has the information to do forward-looking monetary policy. In theory, I suppose, the Fed could also create climate-change prediction markets. I wonder if that would have any useful effect. The problem is that while it would be easy to make an atmospheric CO2 prediction market or a global temperature prediction market, it's harder to define in advance the kind of actual end-outcomes we would like to know about.
