Fat Tails Part Deux: cost-benefit analysis and climate change; Weitzman replies to Nordhaus

February 13th, 2009

[Note:  Although the giant snakes I mentioned in my preceding post may have fat tails, I didn’t want my description of the discussion between Harvard’s Martin Weitzman and Yale’s William Nordhaus over the limits of cost-benefit analysis to be overlooked, so I have largely copied it below.  I’ve added an introduction, as well as a few links.]

“Fat tails” seem to be the rage these days, as Bill Safire noted last week in the NYT.  But what are “fat tails”?  Notes Safire,

To comprehend what fat tail is in today’s media wringer, think of a bell curve, the line on a statistician’s chart that reflects “normal distribution.” It is tall and wide in the middle — where most people and things being studied almost always tend to be — and drops and flattens out at the bottom, where fewer are, making a shape on a graph resembling a bell. The extremities at the bottom left and right are called the tails; when they balloon instead of nearly vanishing as expected, the tails have been designated “heavy” and, more recently, the more pejorative “fat.” To a credit-agency statistician now living in a world of chagrin, the alliterative definition of a fat tail is “an abnormal agglomeration of angst.”

In an eye-popping Times Magazine article last month titled “Risk Mismanagement,” Joe Nocera, a business columnist for The Times, focused on the passionate, prescient warnings of the former options trader Nassim Nicholas Taleb, author of “The Black Swan” and “Fooled by Randomness,” who popularized the phrase now in vogue in its financial-statistics sense. Nocera wrote: “What will cause you to lose billions instead of millions? Something rare, something you’ve never considered a possibility. Taleb calls these events ‘fat tails’ or ‘black swans,’ and he is convinced that they take place far more frequently than most human beings are willing to contemplate.”

If I may quibble with Safire’s description: “fat” refers not to the probability distribution ballooning at either tail, but to the case where the tail probability does not decline quickly to zero (viz., the probability approaches zero more slowly than exponentially).
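
To make the distinction concrete, here is a quick illustrative sketch (mine, not Safire’s or Taleb’s) comparing how fast the tail probability P[X > x] shrinks for a thin-tailed normal distribution versus a fat-tailed Pareto (power-law) distribution; the Pareto exponent of 2 is an arbitrary choice, purely for illustration.

```python
# Illustrative only: thin vs. fat tails.
# A standard normal's tail probability P[X > x] falls off like exp(-x^2/2),
# while a Pareto (power-law) tail falls off only polynomially, like x^(-alpha).
from scipy.stats import norm, pareto

alpha = 2.0  # assumed Pareto tail exponent, chosen just for illustration

for x in [2, 5, 10, 20]:
    thin = norm.sf(x)            # survival function (tail probability) of a standard normal
    fat = pareto.sf(x, b=alpha)  # survival function of a Pareto(alpha) distribution
    print(f"x = {x:>2}   normal tail = {thin:.2e}   Pareto tail = {fat:.2e}")
```

By x = 10 the normal tail is essentially zero (about 10^-24), while the power-law tail is still around 1%; that slower-than-exponential decay is all that “fat” means here.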

*   *   *

The size of the giant snakes and the much higher temperatures (and GHG levels) of their time (60 million years ago) and shortly thereafter during the PETM (a period 56 million years ago when temperatures shot up by 5° Celsius / 9° F in less than 10,000 years) tell us not simply that climate is sensitive (on geological scales, sometimes rather short-term) to atmospheric levels of carbon and methane, but remind us that there is a “fat tail” of uncertain climate change risks posed by mankind’s ramped-up efforts to release as much as possible of the CO2 that has been stored up in the form of fossil fuels, methane and limestone over millions of years.

I have mentioned the issue of “fat tails” previously, in connection with attempts at applying cost-benefit analysis (CBA) to determine whether to tax CO2 emissions.  While economists like Yale’s William Nordhaus, who have applied CBA to climate policy, have been saying for decades that taxing carbon makes sense on a net basis, our own Bob Murphy has criticized Nordhaus’s approach on rather narrow (and decidedly non-Austrian) grounds.

But Nordhaus has also been strongly criticized by economists such as Harvard’s Martin Weitzman, who basically argue that Nordhaus has UNDERSOLD the case for carbon pricing, or that the results of such CBA imply a greater certainty of knowledge (and complacency) than is deserved.  Weitzman points out basic difficulties inherent in applying CBA to policies addressing climate change, particularly where there seems to be a grave possibility that we do not understand how drastically the climate might respond to our influences.  Weitzman’s comments (scheduled to appear in the February issue of The Review of Economics and Statistics) were the focus of the lead essay by Jim Manzi in Cato Unbound’s August 2008 issue, which I reviewed.

Nordhaus has since responded to Weitzman in a comment that became available in January; this time Bob Murphy stepped in as a defender of CBA!  I note that Ron Bailey, science correspondent at Reason online, has just published a piece examining Weitzman’s paper from last year and Nordhaus’s recent comments.

Weitzman has now replied to Nordhaus, and has kindly permitted me to quote from a draft of his reply (which he has out for review).  It seems to me that Weitzman provides a compelling statement of some of the limits of CBA as applied to climate change.  (NB:  Weitzman’s draft response is a .pdf file that I cannot upload, though I have uploaded a version converted to .txt format.  I am happy to forward the .pdf to any interested readers.)

Weitzman’s criticisms of the limits of CBA ought to resonate with Austrian concerns about complexity, the limits of knowledge and the difficulty of prediction – even as Weitzman (and Nordhaus and, indeed, Bob Murphy) completely fail to consider the fundamental problems of conflicting preferences in the absence of property rights, and the likelihood that rent-seeking will corrupt governmental policy responses.

 

The rest of this post sets out those of Weitzman’s key points that I consider most salient for a discussion among laymen:

“there is enormous structural uncertainty about the economics of extreme climate change,
which, if not unique, is pretty rare. I will argue on intuitive grounds
that the way in which this deep structural uncertainty is
conceptualized and formalized should influence substantially the
outcomes of any reasonable CBA (or IAM) of climate change. Further, I
will argue that the seeming fact that this deep structural
uncertainty does not influence substantially outcomes from the
“standard” CBA hints at an implausible treatment of uncertainty.”

“The pre-industrial-revolution level of atmospheric CO2 (about two centuries ago) was about 280 parts per million (ppm). The ice-core data show that carbon dioxide was within a range roughly between 180 and 280 ppm during the last 800,000 years. Currently, CO2 is at 385 ppm, and climbing steeply. Methane was never higher than 750 parts per billion (ppb) in 800,000 years, but now this extremely potent GHG, which is thirty times more powerful than CO2, is at 1,780 ppb. The sum total of all carbon-dioxide-equivalent (CO2-e) GHGs is currently at 435 ppm. Even more alarming in the 800,000-year record is the rate of change of GHGs, with increases in CO2 being below (and typically well below) 40 ppm within any past sub-period of ten thousand years, while now CO2 has risen by 40 ppm in just the last quarter century.

Thus, anthropogenic
activity has elevated atmospheric CO2 and CH4 to levels extraordinarily
far outside their natural range – and at a stupendously rapid rate. The
scale and speed of recent GHG increases makes predictions of future
climate change highly uncertain.  There is no analogue for anything
like this happening in the past geological record. Therefore, we do not
really know with much confidence what will happen next.”

“To keep atmospheric CO2 levels at twice pre-industrial-revolution levels would require not just stable but sharply declining emissions within a few decades from now. Forecasting ahead a century or two, the levels of atmospheric GHGs that may ultimately be attained (unless drastic measures are undertaken) have likely not existed for tens of millions of years and the rate of change will likely be unique on a time scale of hundreds of millions of years.

Remarkably, the “standard” CBA of climate change takes essentially no account of the extraordinary magnitude of the scale and speed of these unprecedented changes in GHGs – and the extraordinary uncertainties they create for any believable economic analysis of climate change. Perhaps even more astonishing is the fact that the “policy ramp” of gradually tightening emissions, which emerges from the “standard” CBA, attains stabilization at levels of CO2-e GHGs that approach 700 ppm. The “standard” CBA [of Nordhaus] thus recommends imposing an impulse or shock to the Earth’s system by geologically-instantaneously jolting atmospheric stocks of GHGs up to 2½ times their highest past level over the last 800,000 years – without even mentioning what an unprecedented planetary experiment such an “optimal” policy would entail.”

“So-called “climate sensitivity” (hereafter denoted S1) is a key macro-indicator of the eventual temperature response to GHG changes. Climate sensitivity is defined as the global average surface warming following a doubling of carbon dioxide concentrations. … the median upper 5% probability level over all 22 climate-sensitivity studies cited in IPCC-AR4 (2007) is 6.4° C – and this stylized fact alone is telling. Glancing at Table 9.3 and Box 10.2 of IPCC-AR4, it is apparent that the upper tails of these 22 PDFs tend to be sufficiently long and heavy with probability that one is allowed from a simplistically-aggregated PDF of these 22 studies the rough approximation P[S1 > 10° C] ≈ 1%. The actual empirical reason why these upper tails are long and heavy with probability dovetails nicely with the theory of my paper: inductive knowledge is always useful, of course, but simultaneously it is limited in what it can tell us about extreme events outside the range of experience – in which case one is forced back onto depending more than one might wish upon the prior PDF, which of necessity is largely subjective and relatively diffuse. As a recent Science commentary put it: “Once the world has warmed by 4° C, conditions will be so different from anything we can observe today (and still more different from the last ice age) that it is inherently hard to say where the warming will stop.”
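
[A brief aside from me, not Weitzman: to get a feel for where a figure like P[S1 > 10° C] ≈ 1% can come from, the toy calculation below fits a lognormal distribution to an assumed 3° C median (roughly the IPCC-AR4 best estimate) and the 6.4° C upper 5% level cited above, then reads off the implied tail probability. The distributional form and the median are my own assumptions, purely for illustration.]

```python
# Toy calculation (my assumptions, not Weitzman's): fit a lognormal to a
# 3 C median and a 6.4 C 95th percentile for climate sensitivity S, then
# compute the implied tail probability P[S > 10 C].
import numpy as np
from scipy.stats import norm, lognorm

median, p95 = 3.0, 6.4                        # assumed median; 95th percentile from the quote
mu = np.log(median)                           # log-scale location parameter
sigma = (np.log(p95) - mu) / norm.ppf(0.95)   # log-scale spread matching the 95th percentile

p_exceed_10 = lognorm.sf(10.0, s=sigma, scale=np.exp(mu))
print(f"implied P[S > 10 C] ~ {p_exceed_10:.2%}")   # on the order of half a percent
```

Even a lognormal puts the answer within shouting distance of 1%; a fit with a fatter tail matched to the same two quantiles would give a larger number still, which is exactly Weitzman’s point about how much the assumed tail shape matters.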

“Exhibit C” concerns possibly disastrous releases over the long run of bad-feedback components
of the carbon cycle that are currently omitted from most general
circulation models. The chief worry here is a significant supplementary
component that conceptually should be added on to climate sensitivity
S1. This omitted component concerns the potentially powerful
self-amplification potential of greenhouse warming due to heat-induced
releases of sequestered carbon. … Over the long run, a CH4
outgassing-amplifier process could potentially precipitate a
cataclysmic strong-positive-feedback warming
. This real physical
basis for a highly unsure but truly catastrophic scenario is my Exhibit
C in the case that conventional CBAs and IAMs do not adequately cover
the deep structural uncertainties associated with possible
climate-change disasters.  Other examples of an actual real physical
basis for a catastrophic outcome could be cited, but this one will do
here.  The real physical possibility of endogenous heat-triggered
releases at high temperatures of the enormous amounts of
naturally-sequestered GHGs is a good example of indirect carbon-cycle
feedback effects that I think should be included in the abstract
interpretation of a concept of “climate sensitivity” that is relevant
here. What matters for the economics of climate change is the
reduced-form relationship between atmospheric stocks of
anthropogenically-injected CO2-e GHGs and temperature change. … When
fed into an economic analysis, the great open-ended uncertainty about
eventual mean planetary temperature change cascades into
yet-much-greater yet-much-more-open-ended uncertainty about eventual
changes in welfare.”

“Exhibit
D” concerns what I view as an unusually cavalier treatment of damages or
disutilities from extreme temperature changes. The “standard” CBA
treats high-temperature damages by a rather passive extrapolation of
whatever specification is assumed (typically arbitrarily) to be the
low-temperature “damages function.”  … Seemingly minor changes in
the specification of high-temperature damages can dramatically alter
the gradualist policy ramp outcomes recommended by the “standard” CBA.

Such fragility of policy to postulated forms of disutility functions
are my Exhibit D in making the case that the “standard” CBA does not
adequately cope with deep structural uncertainty – here structural
uncertainty about the specification of damages.”

“An
experiment without precedent is being performed on planet Earth by
subjecting the world to the shock of a geologically-instantaneous
injection of massive amounts of GHGs. Yet the “standard” CBA seems
almost oblivious to the extraordinarily uncertain consequences of
catastrophic climate change.”

“Almost
nothing in our world has a probability of exactly zero or exactly one.
What is worrisome is not the fact that extreme tails are long per se
(reflecting
the fact that a meaningful upper bound on disutility does not exist),
but that they are fat (with probability density). The critical
question is how fast does the probability of a catastrophe decline
relative to the welfare impact of the catastrophe. Other things being
equal, a thin-tailed PDF is of less concern because the probability of
the bad event declines exponentially (or faster). A fat-tailed
distribution, where the probability declines polynomially, can be much
more worrisome.
… To put a sharp point on this seemingly abstract issue, the
thin-tailed PDFs that Nordhaus requires implicitly to support his
gradualist “policy ramp” conclusions have some theoretical tendency to
morph into being fat tailed when he admits that he is fuzzy about the
functional forms or structural parameters of his assumed thin-tailed
PDFs
– at least for high temperatures. … When one combines fat
tails in the PDF of the logarithm of welfare-equivalent consumption
with a utility function that is sensitive to high damages from extreme
temperatures, it will tend to make the willingness to pay (WTP) to
avoid extreme climate changes very large.”

“Presumably
the PDF in the bad fat tail is thinned, or even truncated, perhaps from
considerations akin to what lies behind the value of a statistical life
(VSL). (After all, we would not pay an infinite amount to eliminate
altogether the fat tail of climate-change catastrophes.) Alas, in
whatever way the bad fat tail is thinned or truncated, a CBA based upon
it remains highly sensitive to the details of the thinning or
truncation mechanism, because the disutility of extreme climate change
has “essentially” unlimited liability.
In this sense climate change
is unique (or at least very rare) because the conclusions from a CBA
for such an unlimited-liability situation have some built-in tendency
to be non-robust to assumed tail fatness.”
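
[Another aside from me: Weitzman’s non-robustness point is easy to see with a toy calculation. The sketch below, using an arbitrary power-law loss density of my own choosing (not anything from his paper), computes the expected loss when the bad tail is truncated at different cutoffs. With a thin exponential tail the answer barely moves; with the fat tail it keeps growing as the cutoff is pushed out, so the “answer” is largely an artifact of where one chooses to cut.]

```python
# Toy illustration of sensitivity to tail truncation (my own numbers).
# Fat tail: Pareto-type loss density alpha * x^(-alpha - 1) for x >= 1, with
# alpha = 0.9 so the untruncated mean diverges. Thin tail: exponential, exp(-x).
import numpy as np

def truncated_pareto_mean(alpha, cutoff):
    # closed form of  integral_1^cutoff  x * alpha * x^(-alpha - 1) dx
    return alpha / (alpha - 1.0) * (1.0 - cutoff ** (1.0 - alpha))

def truncated_expon_mean(cutoff):
    # closed form of  integral_0^cutoff  x * exp(-x) dx
    return 1.0 - (1.0 + cutoff) * np.exp(-cutoff)

alpha = 0.9  # assumed tail exponent, chosen only to make the effect visible
for cutoff in [1e2, 1e4, 1e6]:
    print(f"truncate tail at {cutoff:9.0e}:  "
          f"fat-tail expected loss = {truncated_pareto_mean(alpha, cutoff):6.1f}   "
          f"thin-tail expected loss = {truncated_expon_mean(cutoff):.3f}")
```

The thin-tailed number is effectively 1.0 no matter where the cutoff sits; the fat-tailed number climbs from roughly 5 to 14 to 27 as the cutoff moves out, with no sign of settling down.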

“Reasonable
attempts to constrict the fatness of the “bad” tail can still leave us
with uncomfortably big numbers, whose exact value depends non-robustly
upon artificial constraints, functional forms, or parameters that we
really do not understand. The only legitimate way to avoid this
potential problem is when there exists strong a priori knowledge that
restrains the extent of total damages.
If a particular type of
idiosyncratic uncertainty affects only one small part of an
individual’s or a society’s overall portfolio of assets, exposure is
naturally limited to that specific component and bad-tail fatness is
not such a paramount concern. However, some very few but very
important real-world situations have potentially unlimited exposure due
to structural uncertainty about their potentially open-ended
catastrophic reach. Climate change potentially affects the whole
worldwide portfolio of utility by threatening to drive all of planetary
welfare to disastrously low levels in the most extreme scenarios.”

“Conclusions
from CBA [are] more fuzzy than we might prefer, because they are
dependent on essentially arbitrary decisions about how the fat tails
are expressed and about how the damages from high temperatures are
specified.
I would make a strong distinction between thin-tailed
CBA, where there is no reason in principle that outcomes should not be
robust, and fat-tailed CBA, where even in principle outcomes are
highly sensitive to functional forms and parameter values. For ordinary
run-of-the-mill limited exposure or thin-tailed situations, there is at
least the underlying theoretical reassurance that finite-cutoff-based
CBA might (at least in principle) be an arbitrarily-close approximation
to something that is accurate and objective. In fat-tailed unlimited
exposure situations, by contrast, there is no such theoretical
assurance underpinning the arbitrary cutoffs or attenuations – and
therefore CBA outcomes have a theoretical tendency to be sensitive to
fragile assumptions about the likelihood of extreme impacts and how
much disutility they cause.”

“My
target is not CBA in general, but the particular false precision
conveyed by the misplaced concreteness of the “standard” CBA of climate
change. By all means plug in tail probabilities, plug in disutilities
of high impacts, plug in rates of pure time preference, and so forth,
and then see what emerges empirically. Only please do not be surprised
when outcomes from fat-tailed CBA are fragile to specifications
concerning catastrophic extremes.  The extraordinary magnitude of the
deep structural uncertainties involved in climate-change CBA, and the
implied limitations that prevent CBA from reaching robust conclusions,
are highly frustrating for most economists, and in my view may even
push some into a state of denial. After all, economists make a living
from plugging rough numbers into simple models and reaching specific
conclusions (more or less) on the basis of these numbers. What are we
supposed to tell policy makers and politicians if our conclusions are
ambiguous and fragile?”

“It is
threatening for economists to have to admit that the structural
uncertainties and unlimited liabilities of climate change run so deep
that gung-ho “can do” economics may be up against limits on the ability of quantitative analysis to give robust advice in such a grey area. But if this is the way things are with the economics of climate change, then this is the way things are – and non-robustness to subjective assumptions is an inconvenient truth to be lived with rather than a fact to be denied or evaded
just because it looks less scientifically objective in CBA. In my
opinion, we economists need to admit to the policy makers, the
politicians, and the public that CBA of climate change is unusual
in being especially fuzzy because it depends especially sensitively on
what is subjectively assumed about the high-temperature damages
function, along with subjective judgements about the fatness of the
extreme tails and/or where they have effectively been cut off
.
Policy makers and the public will just have to deal with the idea that
CBA of climate change is less crisp (maybe I should say even less
crisp) than CBAs of more conventional situations.”

“The
moral of the dismal theorem is that under extreme uncertainty,
seemingly casual decisions about functional forms, parameter values,
and tail thickness may be dominant. We economists should not pursue
a narrow, superficially precise, analysis by blowing away the
low-probability high-impact catastrophic scenarios as if this is a
necessary price we must pay for the worthy goal of giving crisp advice.
An artificial infatuation with precision is likely to make our analysis
go seriously askew and to undermine the credibility of what we say by
effectively marginalizing the very possibilities that make climate
change grave in the first place.

“The
issue of how to deal with the deep structural uncertainties in climate
change would be completely different and immensely simpler if systemic
inertias (like the time required for the system to naturally remove
extra atmospheric CO2) were short (as is the case for SO2, particulates, and many other airborne pollutants). Then an important
part of an optimal strategy would presumably be along the lines of
“wait and see.” With strong reversibility, an optimal
climate-change policy should logically involve (among other elements)
waiting to see how far out on the bad fat tail the planet will end up,
followed by midcourse corrections if we seem to be headed for a
disaster. This is the ultimate backstop rebuttal of DT given by some
critics of fat-tailed reasoning, including Nordhaus. Alas, the problem
of climate change is characterized everywhere by immensely long
inertias – in atmospheric CO2 removal times, in the capacity of the
oceans to absorb heat (as well as CO2), and in many other relevant
physical and biological processes. Therefore, it is an open question
whether or not we could learn enough in sufficient time to make
politically feasible midcourse corrections. When the critics are
gambling on this midcourse-correction learning mechanism to undercut
the message of DT, they are relying more on an article of faith than on
any kind of evidence-based scientific argument.

“I
think the actual scientific facts behind the alleged feasibility of
“wait and see” policies are, if anything, additional evidence for the
importance of fat-tailed irreversible uncertainty about ultimate
climate change.

“The
relevance of “wait and see” policies is an important unresolved issue,
which in principle could decide the debate between me and Nordhaus, but
my own take right now would be that the built-in pipeline inertias
are so great that if and when we detect that we are heading for
unacceptable climate change, it will likely prove too late to do
anything much about it for centuries to come thereafter
(except,
possibly, for lowering temperatures by geoengineering the atmosphere to
reflect back incoming solar radiation). In any event, I see this whole
“wait and see” issue as yet another component of fat-tailed uncertainty
– rather than being a reliable backstop strategy for dealing with
excessive CO2 in the atmosphere.

Nordhaus
states that there are so many low-probability catastrophic-impact
scenarios around that ‘if we accept the Dismal Theorem, we would
probably dissolve in a sea of anxiety at the prospect of the infinity
of infinitely bad outcomes.’ This is rhetorical excess and, more to the
point here, it is fallacious. Most of the examples Nordhaus gives have
such miniscule thin-tailed probabilities that they can be written off.”

“Nordhaus
summarizes his critique with the idea there are indeed deep
uncertainties about virtually every aspect of the natural and social
sciences of climate change – but these uncertainties can only be
resolved by continued careful analysis of data and theories. I heartily
endorse his constructive attitude about the necessity of further
research targeted toward a goal of resolving as much of the uncertainty
as it is humanly possible to resolve.
I would just add that we
should also recognize the reality that, for now and perhaps for some
time to come, the sheer magnitude of the deep structural uncertainties,
and the way we express them in our models, will likely dominate
plausible applications of CBA to the economics of climate change
.”

(emphasis added)