Archive

Archive for February, 2009

The Curse of Limited Liability; WSJ.com: Executives/traders of big financial corporations generate risky business, while smaller partnerships are much more risk averse

February 26th, 2009 No comments

The February 25 Wall Street Journal carries an insightful piece of commentary by James K. Glassman (president of the World Growth Institute and a former undersecretary of state) and William T. Nolan (president of Devonshire Holdings and a former associate at Brown Brothers Harriman & Co. in the early 1970s).

The Glassman and Nolan piece, entitled “Bankers Need More Skin in the Game; Partnerships may be a more trustworthy business model than corporations,” echoes, in the context of Wall Street financial institutions, a theme I have previously blogged on a number of times: the inappropriate managerial risk-taking that flows from the “limited liability” corporate form.  Glassman and Nolan point to the sterling performance of Brown Brothers Harriman & Co., the oldest and largest partnership bank in the U.S., founded in 1818.

The Glassman and Nolan editorial is worth reading in full; for purposes of discussion I excerpt portions here (bolding is mine):

“Of all the causes of the financial meltdown of the past few years, the easiest to understand is that an irresponsible attitude toward risk led to terrible mistakes in judgment. But where did this casual approach to risk originate?

A major culprit, we believe, is a change in the way Wall Street financial institutions are organized. During the late 1970s and ’80s, much of the responsibility for risk was transferred away from the people who made the financial decisions. As a result, leverage rose from 20-1 to 40-1 or higher, creating shaky towers of debt, which, as we know, eventually collapsed. …

“The trick is to find a way to encourage sensible risk-taking, while dampening the impulse to take chances that can throw an economy into recession and force taxpayers to bail out a banking system.

Can government accomplish this feat through rule-making and regulatory oversight? It is unlikely. As the Nobel Prize-winning economist Friedrich von Hayek correctly emphasized, no one — not even a politician or a bureaucrat — can gain the broad and deep knowledge necessary to make wise enough rules. Moreover, in a $14 trillion economy, you can’t hire enough overseers to pore over everyone’s books.

There is, however, a better solution: expose players in the financial game to greater personal loss if their risk-taking fails. When you worry that a mistake will cause you to lose your second home, your stocks and bonds and your club memberships, then you’re less likely to take the kinds of risks that expose the rest of society to your failures.

“A simple mechanism exists to achieve this purpose: the private partnership. Partners face liability that extends to their personal assets. They aren’t protected by the corporate shield that limits losses to what the corporation itself owns (as well as the value of the stocks and bonds the corporation has issued). Unfortunately, the partnership is a legal form of business organization that was largely abandoned by banks over the past quarter-century. Our advice is to bring it back. …

“Even John Gutfreund — the man who kicked off the dramatic change in investment-banking culture and structure when he took Salomon Brothers, a longtime partnership, public in 1981 — confirms our thesis. Michael Lewis wrote in the December issue of Condé Nast Portfolio that Mr. Gutfreund now believes “that the main effect of turning a partnership into a corporation was to transfer financial risk to the shareholders. ‘When things go wrong, it’s their problem,'” said Mr. Gutfreund.

“But when the personal wealth of executives is put at risk, as it is in a partnership, their behavior changes. Risk aversion increases. Few partnerships would leverage themselves to the hilt to load up on risky subprime loans.

“How do we know this? Luckily, for this financial experiment, there is a control case: Brown Brothers Harriman & Co. ….

“Some would say that BBH is sui generis. Would its structure work more broadly for financial institutions? It already is. As large brokers merged into huge corporations with greater concentration in real-estate finance, corporate finance migrated to private equity firms and hedge funds, which are generally structured as partnerships. While many of these new engines of finance have suffered in the recent meltdown, they generally didn’t engage in such extreme risk-taking and thus haven’t become wards of the state.

“We know from Alfred Chandler, the great business historian, that “strategy determines structure.” Similarly, structure determines behavior — in this case, a healthier attitude toward risk. It is unlikely that a partnership will grow to the size of a Bank of America or Citigroup, but, while size can boost efficiency, it also poses systemic risk. As partnerships — and corporations with partnership attributes — replace behemoths, the current crisis will spawn structures for future success.  …

We do not believe that government should require banks to be partnerships. Rather, investors — and governments — should recognize the extra safety inherent in doing business with partnerships.

I have previously argued that one of the key state interventions that has fueled the rent-seeking and risk socialization that we see today is the grant of blanket limited liability to shareholders, along with the grant of legal personhood (with unlimited purposes and life and Constitutional rights) to corporations:

Limited liability has enabled corporate managers to act without close shareholder oversight and management; this, I believe, has played a key role in the vast misalignment of incentives that Michael Lewis and David Einhorn describe in the New York Times, and in the risk mismanagement that Joe Nocera describes at length in the NYT Magazine.  Those taking large bonuses (whether in the financial industry or at large corporations) were essentially playing with OPM – Other People’s Money – and capturing the upside of short-term gains while leaving shareholders and taxpayers holding the bag for losses.

I hope that you and others here will look more deeply at the role of the state in the problem of misaligned incentives that continue to corrupt American capitalism.

It is not clear what Glassman and Nolan intend with their reference to “corporations with partnership attributes”, but I would note that corporations that make use of an unlimited liability structure (as American Express once did) share the main “partnership attribute”: that the owners of the firm may be, if the assets of the firm are insufficient, personally liable to creditors for all debts of the firm (other than debts whose creditors agree in advance to limit recourse), particularly for torts against involuntary third parties.  The availability of the unlimited liability corporate form in various jurisdictions deserves further investigation.

I agree with Glassman and Nolan that governments should recognize the better risk management that partnerships are likely to conduct, not merely in the financial sector but in other industrial, commercial and professional fields as well.  Such recognition could take the form of eased regulations, for example.  I favor aggressive pursuit of this “carrot” approach to encouraging better risk management and less shifting of risks to shareholders, government and citizens generally.  However, this approach does not address what should be done about existing public companies and other limited liability corporations; for those, I would urge more aggressive veil-piercing, both judicially and by statute.

In any case, it is gratifying to see this topic getting some of the attention that it deserves.

Categories: limited liability, partnerships Tags:

NY’s oil spill fund: limited liability means owners of polluting firms can walk away, leaving citizens and states holding the bag for risks & clean-up costs

February 24th, 2009 No comments

There’s an interesting article in the Feb. 22 Times Union on the ineffectiveness of the New York oil spill fund:

Oil polluters pass on spill costs to public

The New York Environmental Protection and Spill Compensation Fund pays to clean up oil spills if polluters won’t handle it themselves. While the state is supposed to get that money back, it is owed millions by companies that won’t settle up. In more than 1,100 cases — some dating back to the early 1980s — the state has recouped just 17 cents on every dollar it spent.

As I’ve noted previously on several occasions, the limited liability that states grant to owners of corporations means that owners of polluting firms can walk away, leaving citizens and states holding the bag for risks and clean-up costs; this is true not only for the New York emergency oil spill clean-up fund, but for ordinary pollution damages where individuals are seeking compensation.  This problem is manifest in, and has been compounded in, New York, where the gas tax-funded clean-up fund system is clearly not working; not only has the fund been bailed out by general taxes, but the gas tax being used to fund it has been increased eight-fold since 1978, and the fund argues that it lacks sufficient enforcement tools.  At least part of the problem may be that the fund administrators find it easier simply to clean up and increase taxes than to try to pursue polluters.

As New York ponders reforms, the New York legislature ought to consider explicitly “piercing the corporate veil” by providing that the owners and executives of polluting firms – including shareholders of public companies – have direct personal liability for clean-up costs.

That may do wonders in incentivizing them to make sure that the firm that they own and/or manage (or an insurer on its/their behalf) promptly reimburses the fund for clean-up costs.  One suspects it might even cut down on the number of oil spills!

[Update] Cato’s Jerry Taylor: Nuclear power is "solar power for conservatives" and needs "a policy of tough love"

February 24th, 2009 No comments

After decades of loathing nuclear power as the ugly, monstrous child of a big-government Dr. Frankenstein, climate-change-fearing enviros like George Monbiot are finally coming around to the relative benefits of nuclear power.  This is a welcome change – as it is clear that coal has generated and continues to generate much greater environmental impacts (not only in extraction, but in acid rain, particulates, heavy metals, released radiation and fly ash waste) – but that doesn’t mean that libertarians or conservatives ought to support throwing any more taxpayer dollars at nuclear power.

Rather, as Jerry Taylor (a senior fellow at the Cato Institute and well-regarded energy/environment expert) argues in the excerpts below, we should push nuclear power off the federal dole, deregulate power markets, and – IF we decide that climate change risks merit a constraint on greenhouse gas emissions – do so through pricing mechanisms rather than by involving the federal government further in the business of trying to guess which technological approaches will be successful via massive subsidies for nuclear or other “clean” technologies.

I pretty much agree with Taylor in principle (including on tackling power market regulatory issues), but believe that he both (1) dodges the massive pollution costs imposed by, and the rent-seeking conducted by, coal firms and coal-fired utilities, and (2) understates the economic case for carbon pricing, as I have noted elsewhere.

For those interested, I’ve collected below a few pieces of analysis and debate from Jerry Taylor on nuclear power:

– at Cato.org on June 21, 2003:

The federal government has always maintained a unique public-private partnership with the nuclear industry, wherein the costs of nuclear power are shared by the public but the profits are enjoyed privately. In an attempt to resuscitate this dying industry, the current Senate energy bill proposes unprecedented federal support for nuclear power. …

But nuclear power was ultimately rejected by investors because it simply does not make economic sense. In truth, nuclear power has never made economic sense and exists purely as a creature of government.

In fact, a recent report by Scully Capital Services, an investment banking and financial services firm, commissioned by the Department of Energy (DOE), highlighted three federal subsidies and regulations — termed “show stoppers” — without which the industry would grind to a halt. These “show stoppers” include the Price Anderson Act, which limits the liability of the nuclear industry in case of a serious nuclear accident — leaving taxpayers on the hook for potentially hundreds of billions in compensation costs; federal disposal of nuclear waste in a permanent repository, which will save the industry billions at taxpayer expense; and licensing regulations, wherein the report recommends that the Nuclear Regulatory Commission further grease the skids of its quasi-judicial licensing process to preclude successful interventions from opponents.

But even these long-standing subsidies are not enough to convince investors, who for decades have treated nuclear power as the pariah of the energy industry. Nuclear generated electricity remains about twice as expensive as coal- or gas-fired electricity. Although the marginal costs of nuclear are lower, the capital costs are much higher. In light of this resounding cold shoulder from Wall Street, the federal government is opening the treasury wider than ever before.

– at NRO on Jan. 26, 2006:

Nuclear power is solar power for conservatives — an energy source with every merit in the world save for the most important — economic merit. Investors — not environmentalists — are the parties that have turned against nuclear and there’s no reason for government to second guess the businessmen …

– at a Roundtable on Nuclear Power and Energy Independence at the Reason Foundation on October 21, 2008 [worth reviewing in full]:

Nuclear energy is to the Right what solar energy is to the Left: Religious devotion in practice, a wonderful technology in theory, but an economic white elephant in fact

But nuclear power plant construction costs are so high that it would take a very, very long time for nuclear facilities to pay for themselves if they only operated during high demand periods. Hence, nuclear power plants are only profitable in base-load markets. Gas-fired power plants, on the other hand, can be profitable in either market because not only are their upfront costs low but it is much easier to turn them off or on unlike nuclear.

Nuclear’s high up-front costs don’t just mean delayed profits, it also makes nuclear a more risky investment, especially since 20 states have scrapped policies that used to allow investors to charge rates that would guarantee their money back. This means that investors in new nuclear power plants are making a multi-billion dollar bet on disciplined construction schedules, accurate cost estimates, and the future economic health of the region. Bet wrong on any of the above and the company may well go bankrupt. Bet wrong on a gas-fired power plant, on the other hand, and corporate life will go on because there is less to lose given that the construction costs associated with gas-fired power plants are a small fraction of those associated with nuclear plants. …

Investors are also wary of nuclear plants because of the construction delays and cost over-runs that have historically plagued the industry. …  Nor have these construction delays had anything to do with regulatory obstruction or organized public opposition.

If nuclear power plants are so uneconomical, how then to explain the blizzard of permit applications for the construction and operation of new nuclear power plants that the Nuclear Regulatory Commission has received? Easy: These applications cost little and oblige utilities to do nothing. Industry analysts maintain that federal approvals will not translate into actual plants without a federal promise to private equity markets that, in case of default by power plants, the taxpayer will make good on the full sum of all bad nuclear loans.

Nuclear supporters often counter that construction costs would be a lot lower if regulators didn’t impose insanely demanding safety standards, byzantine and time-consuming permitting processes, or endless public hearings, any one of which could result in the plant being stopped in its tracks. Investors would also be more likely to invest, we’re told, if there were a high-level waste repository in place or more political support for nuclear power.

I would love to tell that story. I do, after all, work at the Cato Institute, and blaming government for economic problems is what keeps me in business. But what stops me is the fact that those complaints are not echoed by the nuclear power industry itself.

On the contrary, the industry in the early 1990s asked for – and got – exactly the sort of safety regulations, permit review process, and public comment regime now in place. Both public and political support for nuclear power is running so high that even a majority of Democrats in Congress are happy to not just tolerate nuclear power, but lavish even more subsidies upon it. And while Yucca Mountain may not be open now or ever, everyone seems reasonably content with the current on-site waste storage regime.

Indeed, if government were the reason why investors were saying “no” to their loan applications, I would expect that industry officials would be the first to say so. But they do not.

There’s another good reason why the industry is not protesting government intervention these days — the industry would not exist without it. Take away the 1.8¢ per kWh production tax credit available to the first 6,000 megawatts of new nuclear generation built prior to 2021, for instance, and Metcalf calculates that the levelized cost of new nuclear power plants jumps by 30 percent. Replace accelerated depreciation tax rules with regular depreciation rules and costs jump another 9 percent. Even zero taxation on nuclear power would increase costs by 6 percent because right now nuclear power enjoys a negative effective tax rate. Indeed, this jump by itself would make nuclear much more expensive than conventional coal, “clean” coal, and natural gas. Finally, repealing the $18 billion in federal loan guarantees recently promised the industry and eliminating regulations that relieve nuclear plant owners of the responsibility to pay third-parties to accept the risks associated with waste disposal would dampen market interest in nuclear power even further.

But the final nail in the coffin for the industry would be if the federal cap on the liability that nuclear power plant owners face in case of accidents (the Price-Anderson Act) were to be lifted.

Given all of this, how do France, India, China and Russia build cost-effective nuclear power plants? They don’t. Government officials in those countries, not private investors, decide what is built. …

Conservatives project nuclear power as the solution to greenhouse gas emissions. But they should resist that argument. If we slapped a carbon tax on the economy to “internalize” the costs associated with greenhouse gas emissions – the ideal way to address emissions if we find such policies necessary – then the “right” carbon tax would likely be about $2 per ton of emissions according to a survey of the academic literature by climate economist Richard Tol [As noted in the update further below, Taylor has subsequently moved from this low figure after reviewing Tol’s more recent work]. That’s not enough to make nuclear energy competitive against coal or natural gas according to calculations performed by the Electric Power Research Institute. In any case, if nuclear offers a cost-effective way to reduce greenhouse gas emissions, it should have to prove it by competing against alternatives in some future carbon-constrained market.  …

Those who favor nuclear power should adopt a policy of tough love. Getting this industry off the government dole would finally force it to innovate or die – at least in the United States. Welfare, after all, breeds sloth in both individual and corporate recipients. The Left’s distrust of nuclear power is not a sufficient rationale for the Right’s embrace of the same.

– follow-up discussion on the Reason Foundation’s “Out of Control” blog, involving Jerry Taylor, co-essayists William Tucker and Shikha Dalmia (also moderator) and various blog commenters.

One interesting point made in the follow-up discussion was that, while our regulatory scheme is much tougher on nuclear power over risks that so far have been speculative, Taylor ignores the much heavier health damages (on the order of 25,000 deaths per year) generated by coal.  Taylor’s response: perhaps so, but coal’s extra environmental cost should be directly addressed by being tougher on coal, not by subsidizing nuclear power.
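As an aside, it is easy to get a feel for the production-tax-credit arithmetic Taylor cites (from Metcalf) with a toy levelized-cost calculation. The Python sketch below is mine, not Metcalf's model: the formula is the standard simplified levelized cost of electricity, the plant inputs are invented placeholders, and taxes, depreciation rules and construction-period financing (which the real studies do model) are ignored. With capital-heavy, low-fuel-cost inputs of this sort, stripping out a 1.8¢/kWh credit raises the levelized cost by an amount on the order of the roughly 30 percent jump Taylor mentions, but the specific numbers below demonstrate nothing beyond the mechanics.

    def levelized_cost(overnight_cost_per_kw, fixed_om_per_kw_yr, variable_cost_per_kwh,
                       capacity_factor, lifetime_yrs, discount_rate, credit_per_kwh=0.0):
        """Simplified levelized cost of electricity in $/kWh (no taxes, no
        depreciation rules, no construction-period interest)."""
        # Capital recovery factor: converts the up-front cost into an equivalent annual payment.
        crf = (discount_rate * (1 + discount_rate) ** lifetime_yrs
               / ((1 + discount_rate) ** lifetime_yrs - 1))
        annual_kwh_per_kw = 8760 * capacity_factor
        capital = overnight_cost_per_kw * crf / annual_kwh_per_kw
        fixed_om = fixed_om_per_kw_yr / annual_kwh_per_kw
        return capital + fixed_om + variable_cost_per_kwh - credit_per_kwh

    # Hypothetical "nuclear-like" inputs: $4,000/kW overnight cost, 90% capacity factor,
    # 40-year life, 10% cost of capital, low fuel/O&M costs. Placeholders only.
    with_credit = levelized_cost(4000, 90, 0.012, 0.90, 40, 0.10, credit_per_kwh=0.018)
    no_credit = levelized_cost(4000, 90, 0.012, 0.90, 40, 0.10)
    print(f"LCOE with 1.8 cent/kWh credit: {with_credit:.3f} $/kWh")
    print(f"LCOE without the credit:       {no_credit:.3f} $/kWh")
    print(f"jump from losing the credit:   {100 * (no_credit / with_credit - 1):.0f}%")

The point of the exercise is only that when capital dominates the cost structure, a per-kWh subsidy moves the bottom line a great deal.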

 

[Update:  On the carbon pricing issue, subsequent to the October 2008 Roundtable referred to above, I pointed out to Jerry that his reference to Tol was dated (based on a 2005 study rather than on Tol’s more recent 2008 study).  Jerry reviewed and summarized Tol’s most recent study at Cato in December 2008; in this, (1) Taylor notes Tol’s conclusions (a) that the social cost of carbon emissions is positive, (b) that there is so much uncertainty regarding costs that “a considerable risk premium is warranted,” and (c) that, consequently, “greenhouse gas emission reduction today is justified,” and (2) Taylor concludes that “Given our skepticism about the underlying logic of discount rates of 1% or less, any number between $3 per ton and $24 per ton seems defensible.”  However, Taylor remains concerned that “the political and economic transaction costs associated with imposing a carbon tax … likely exceed the benefits,” and argues that “there may be less expensive ways of reducing harm.”]

George Monbiot: taking heat from other enviros for supporting nuclear power

February 23rd, 2009 No comments

UK enviro-journalist-commentator George Monbiot has an interesting post in the Feb. 20 Guardian that explains why he now believes that “A kneejerk rejection of nuclear power is not an option“, and why he’s willing to take heat from others for raising the topic.

Reasonable (even if partly so) enviros?  What’s the world coming to?

Henry Payne/NRO and the Deal Not Taken: He’s shocked, shocked that Dems won’t end CAFE mileage standards

February 19th, 2009 2 comments

Henry Payne (cartoonist at the Detroit News and commentator at NRO) has an interesting post up on Feb. 18 at NRO’s enviro-bashing “Planet Gore” website: “Obama’s Washington Is the Enemy of Auto-Industry Reform“.  In it, Payne does a remarkable job of side-stepping the long history of the auto mess (poor governance, intransigent labor, counterproductive Washington meddling and competition from foreign automakers) and focusing instead on the blame that the Obama administration and “Washington Democrats” are likely to earn from further counterproductive policy.  In particular, he seems exercised that Dems are unlikely to eliminate the inefficient and costly CAFE standards.

Well, this seemed a little more myopic than I could stand, so I sent Mr. Payne the following note:

Henry, what did you expect to happen?  You can blame Dems for the mistakes that they will make, but Republicans are no better at governing, and it’s the car cos and the unions that are responsible for their current predicaments and unwillingness to budge.

“there will be no elimination of costly CAFE laws. It is shocking, in fact, that Washington Democrats are unwilling to even consider this fundamental, multi-billion-dollar reform. “

As for this, you are probably right – not least because the Bush administration failed to act on climate change, so enviros won the Supreme Court case that Jon Adler says essentially forces the EPA to do more of the same – but is anyone actually making a proposal that would include eliminating CAFE?

But in this vein, back in the Bush heyday when Republicans had both houses of Congress, I’m sure Dems/enviros would have loved to trade away CAFE for rebated carbon taxes, or for improving power competition/smart grid a la Paul Joskow/Lynne Kiesling.  They might have even given up corporate income taxes entirely for such alternate revenues.  It is shocking, in fact, that Washington Republicans were unwilling even to consider this fundamental, multi-billion-dollar reform, which would have eliminated CAFE and avoided the C&T pork and subsidies of the type that Obama and guys like Pickens want.

But instead of even-handedness and looking for win-win deals, you can keep bashing Dems.  Good luck with that now, after Bush strong-armed Greenspan into creating this bubble, did a bunch of other nonsense and thus empowered Dems to finish off the job wrecking the economy – in order to “save” it.

While thinking creatively might not be easy, it’s a start at actually succeeding.

Best,

Tom

[Update re: Truthiness] Property rights? Why George Will WON’T be consistent on climate change when bashing climate "Malthusians"

February 18th, 2009 No comments

[Update below]

George Will has gifted us with a thoroughly confused op-ed in the Sunday WaPo.

Will predictably trots out the 1980 bet that Paul Ehrlich lost to Julian Simon over the prices of minerals and commodities – but fails to note that the reason Simon won that bet was that people own land and that markets functioned both to change demand and to elicit further supply.  None of this logic holds true for unowned, open-access resources – like the global atmosphere and the climate it modulates – as there simply are no property rights or markets in the air.  Until there are, people with legitimate preferences as to climate and man’s effect on it have few effective ways of expressing those preferences.

I made a few further comments at the “Denialism” Science Blog, which has a post up that points out that Will has greatly overstated the case for scientific concern over cooling in the 70s.

I think Will simply has a difficult time changing his mind, particularly given his conservative leanings – he doesn’t want more government programs – and the fact that climate change doesn’t happen at the same speed as weather.

I sympathize with his complaint about alarmism – after all, that’s how Bush got us into Iraq, and it’s how Obama justified the stimulus package – but there is, after all, cause for concern about climate change, and it’s very difficult to see it as a problem that markets themselves can be expected to address – as the atmosphere is shared globally and no one has any property rights in it.

However, Will has had better moments, such as last June, when he argued FOR a carbon tax – at least as a better option than cap and trade and tech subsidies:

This should not be ignored by either the skeptics or the AGWers.

However, this latest editorial is indeed disappointing, because it turns its back on suggesting or considering any of the “no regrets” or pro-free-market policies that ought to be explored as possible win-win common ground between “alarmists” and “skeptics”, as I keep noting:

http://mises.org/Community/blogs/tokyotom/archive/2009/01/10/neocons-conservatives-libertarians-and-exxon-join-jim-hansen-in-calling-for-rebated-carbon-taxes-in-lieu-of-massive-cap-trade-rent-seeking-and-industrial-planning.aspx

http://mises.org/Community/blogs/tokyotom/archive/2009/02/08/paul-jostrow-what-electric-power-regulatory-reforms-are-need-a-federal-power-act-of-2009.aspx

 

[Update:  On top of the points above, respected science writer Carl Zimmer at Discover Mag has noted in a post titled “George Will: Liberated From the Burden of Fact-Checking” that George Will also has his data wrong on the extent of sea ice, so much so that the data center that Will referred to posted their own correction of Will.

Zimmer further notes that Will is using a metric (the sum of northern and southern hemisphere sea ice extent) that actually masks the degree of climate change and melting.  Climate scientists have specifically noted:

In the context of climate change, GLOBAL sea ice area may not be the most relevant indicator. Almost all global climate models project a decrease in the Northern Hemisphere sea ice area over the next several decades under increasing greenhouse gas scenarios. But, the same model responses of the Southern Hemisphere sea ice are less certain. In fact, there have been some recent studies suggesting the amount of sea ice in the Southern Hemisphere may initially increase as a response to atmospheric warming through increased evaporation and subsequent snowfall onto the sea ice. …

we urge interested parties to consider the many variables and resources available when considering observed and model-projected climate change. For example, the ice that is presently in the Arctic Ocean is younger and thinner than the ice of the 1980s and 1990s. So Arctic ice volume is now below its long-term average by an even greater amount than is ice extent or area.  (emphasis added)

Is George Will becoming a master of truthiness?]

Empowering power consumers: Google beta tests software to give consumers real-time info

February 17th, 2009 No comments

“If you cannot measure it,

You cannot improve it.”

— Lord Kelvin

Consistent with its mission to “organize the world’s information and make it universally accessible and useful,” Google, whose climate change-related efforts I’ve blogged about previously, is trying to help consumers to measure and track their real-time electric usage, thereby allowing them to make better choices as to when and how they use electricity.

Google is now beta testing new “PowerMeter” software – a secure iGoogle Gadget that it plans to give away free (though no doubt there will be a buck or two for Google in advertising and data services later) – that will provide near real-time power usage information to consumers who have advanced “Smart Meters”.  This information will make it easy for consumers to figure out when and how they are using electricity, to manage such use by device and to better match such use to the pricing programs of their utilities.  So far, Google testers have found that the software allows them to relatively easily cut use (by an average of 15%), and to save on their electricity bills by an even greater percentage.
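To illustrate the kind of arithmetic this sort of feedback enables, here is a minimal, hypothetical Python sketch: given a day of hourly meter readings and a simple two-tier time-of-use tariff, it compares the bill before and after shifting some discretionary load off-peak. The tariff, the peak window and the usage numbers are all invented for illustration; PowerMeter itself simply exposes the usage data, and the calculation below stands in for what a consumer (or a third-party tool) might do with it.

    PEAK_HOURS = range(14, 20)            # hypothetical 2 pm - 8 pm peak window
    PEAK_RATE, OFFPEAK_RATE = 0.22, 0.09  # hypothetical $/kWh rates

    def daily_bill(hourly_kwh):
        """Cost of one day of usage under the two-tier tariff."""
        return sum(kwh * (PEAK_RATE if hour in PEAK_HOURS else OFFPEAK_RATE)
                   for hour, kwh in enumerate(hourly_kwh))

    # 24 hourly readings (kWh) with a heavy late-afternoon peak (made-up numbers).
    baseline = [0.4] * 6 + [0.8] * 8 + [2.0] * 6 + [1.0] * 4

    # Same total usage, but 0.8 kWh/hour of discretionary load (laundry, dishwasher,
    # pre-cooling) moved out of the peak window into the late evening.
    shifted = baseline[:]
    for hour in PEAK_HOURS:
        shifted[hour] -= 0.8
    for hour in range(20, 24):
        shifted[hour] += 0.8 * len(PEAK_HOURS) / 4

    print(f"baseline daily bill: ${daily_bill(baseline):.2f}")
    print(f"load-shifted bill:   ${daily_bill(shifted):.2f}")

Without hour-by-hour visibility, a consumer never sees which hours are driving the bill; with it, this kind of comparison becomes trivial.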

The availability of such software will motivate consumers everywhere to push their utilities to establish Smart Meter programs, for access to the information generated by such meters, and for an array of services and pricing programs.  There should be a boom in smart meters, as the Obama Administration’s proposed stimulus package targets supporting their installation in over 40 million U.S. homes over the next three years.

While Smart Meter / Smart Grid programs have been growing, there is still considerable market fragmentation, and the rights of consumers have not been clearly spelled out. According to Google, while some state regulators have ordered utilities to deploy smart meters, their focus has been on their use by utilities and grid managers, and not on consumer rights to the information they generate.  As a result, Google is engaged in policy advocacy as well; says Google:

“deploying smart meters alone isn’t enough. This needs to be coupled with a strategy to provide customers with easy access to energy information. That’s why we believe that open protocols and standards should serve as the cornerstone of smart grid projects, to spur innovation, drive competition, and bring more information to consumers as the smart grid evolves. We believe that detailed data on your personal energy use belongs to you, and should be available in an open standard, non-proprietary format. You should control who gets to see your data, and you should be free to choose from a wide range of services to help you understand it and benefit from it. For more details on our policy suggestions, check out the comments we filed yesterday with the California Public Utility Commission.”

While it’s not clear yet how significant a role Google will end up playing in this market, Google is to be commended, as both its PowerMeter software and its advocacy efforts will help pave the way to greater consumer choice and freer markets.

What we need in addition is for the Obama Administration and Congress to give a kick in the pants to electric power market reform and deregulation along the lines of proposals that I have noted elsewhere.  Consumers need not only better information, but greater competition in who is providing them electricity and in the sources that are used to generate it.

Christian Science Monitor summary here; see also:

New York Times

Wired

The Google Blog

Google’s PowerMeter website

Video: http://www.youtube.com/watch?v=6Dx38hzRWDQ

Categories: Google, obama, power, smart grid Tags:

Fat Tails Part Deux: cost-benefit analysis and climate change; Weitzman replies to Nordhaus

February 13th, 2009 No comments

[Note:  Although the giant snakes I mentioned in my preceding post may have fat tails, I didn’t want my description of the discussion between Harvard’s Martin Weitzman and Yale’s William Nordhaus over the limits of cost-benefit analysis to be overlooked, so I have largely copied it below.  I’ve added an introduction, as well as a few links.]

“Fat tails” seem to be the rage these days, as Bill Safire noted last week in the NYT.  But what are “fat tails”?  Notes Safire,

To comprehend what fat tail is in today’s media wringer, think of a bell curve, the line on a statistician’s chart that reflects “normal distribution.” It is tall and wide in the middle — where most people and things being studied almost always tend to be — and drops and flattens out at the bottom, where fewer are, making a shape on a graph resembling a bell. The extremities at the bottom left and right are called the tails; when they balloon instead of nearly vanishing as expected, the tails have been designated “heavy” and, more recently, the more pejorative “fat.” To a credit-agency statistician now living in a world of chagrin, the alliterative definition of a fat tail is “an abnormal agglomeration of angst.”

In an eye-popping Times Magazine article last month titled “Risk Mismanagement,” Joe Nocera, a business columnist for The Times, focused on the passionate, prescient warnings of the former options trader Nassim Nicholas Taleb, author of “The Black Swan” and “Fooled by Randomness,” who popularized the phrase now in vogue in its financial-statistics sense. Nocera wrote: “What will cause you to lose billions instead of millions? Something rare, something you’ve never considered a possibility. Taleb calls these events ‘fat tails’ or ‘black swans,’ and he is convinced that they take place far more frequently than most human beings are willing to contemplate.”

If I may quibble with Safire’s description: “fat” refers not to the probability distribution ballooning at either tail, but to the case where the tail probability does not decline quickly to zero (viz., the probability approaches zero more slowly than exponentially).
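To put the quibble a bit more formally (this is the standard textbook distinction, not anything specific to Safire or to the papers discussed below): a thin-tailed distribution is one whose upper-tail probability dies off at least exponentially, while a fat (heavy) tail dies off only polynomially, for example:

    \[
    \text{thin tail:}\quad \Pr(X > x) \;\le\; C\,e^{-\lambda x}\ \text{for some } C, \lambda > 0,
    \qquad
    \text{fat tail:}\quad \Pr(X > x) \;\sim\; C\,x^{-\alpha}\ \ (\alpha > 0).
    \]

The practical consequence, which drives the Weitzman-Nordhaus exchange below, is that when losses grow rapidly with x, an expectation taken over a fat tail can be dominated by (or even diverge because of) the extreme outcomes, whereas an exponential tail crushes them.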

*   *   *

The size of the giant snakes and the much higher temperatures (and GHG levels) at their time (60 million years ago) and shortly after, during the PETM (a period 56 million years ago when temperatures shot up by 5° Celsius / 9° F in less than 10,000 years), tell us not simply that climate is sensitive (on geological scales, sometimes rather short-term) to atmospheric levels of carbon and methane, but remind us that there is a “fat tail” of uncertain climate change risks posed by mankind’s ramped-up efforts to release as much as possible of the CO2 that has been stored up in the form of fossil fuels, methane and limestone over millions of years.

I have mentioned the issue of “fat tails” previously, in connection with attempts at applying cost-benefit analysis (CBA) to determine whether to tax CO2 emissions.  While economists like Yale’s William Nordhaus who have applied CBA to climate policy have been saying for decades that taxing carbon makes sense on a net basis, our own Bob Murphy has criticized Nordhaus’s approach on rather narrow (and decidedly non-Austrian) grounds.

But Nordhaus has also been strongly criticized by economists such as Harvard’s Martin Weitzman, who basically argue that Nordhaus has UNDERSOLD the case for carbon pricing, or that the results of such CBA imply a greater certainty of knowledge (and complacency) than is deserved.  Weitzman points out basic difficulties inherent in applying CBA to policies addressing climate change, particularly where there seems to be a grave possibility that we do not understand how drastically the climate might respond to our influences.  Weitzman’s comments (scheduled to appear in the February issue of The Review of Economics and Statistics) were the focus of the lead essay by Jim Manzi in Cato Unbound’s August 2008 issue, which I reviewed.

Nordhaus has since responded to Weitzman in a comment that became available in January; this time Bob Murphy stepped in as a defender of CBA!  I note that Ron Bailey, science correspondent at Reason online, has just published a piece examining Weitzman’s paper from last year and Nordhaus’s recent comments.

Weitzman has now replied to Nordhaus, and has kindly permitted me to quote from a draft of his reply (which he has out for review).  It seems that Weitzman provides a compelling statement of some of the limits of CBA as applied to climate change.  (NB:  Weitzman’s draft response is a .pdf file that I cannot upload, though I have uploaded a version converted to .txt format.  I am happy to forward the .pdf to any interested readers.)

Weitzman’s criticisms of the limits of CBA ought to resonate with Austrian concerns about complexity, the limits of knowledge and the difficulty of prediction — even as Weitzman (and Nordhaus and, indeed, Bob Murphy) completely fail to consider the fundamental problems of conflicting preferences in the absence of property rights, and the likelihood that rent-seeking will corrupt governmental policy responses.

 

The rest of this post sets out those of Weitzman’s key points that I consider most salient to a discussion among laymen:

“there is enormous structural uncertainty about the economics of extreme climate change, which, if not unique, is pretty rare. I will argue on intuitive grounds that the way in which this deep structural uncertainty is conceptualized and formalized should influence substantially the outcomes of any reasonable CBA (or IAM) of climate change. Further, I will argue that the seeming fact that this deep structural uncertainty does not influence substantially outcomes from the “standard” CBA hints at an implausible treatment of uncertainty.”

“The pre-industrial-revolution level of atmospheric CO2 (about two centuries ago) was about 280 parts per million (ppm). The ice-core data show that carbon dioxide was within a range roughly between 180 and 280 ppm during the last 800,000 years. Currently, CO2 is at 385 ppm, and climbing steeply. Methane was never higher than 750 parts per billion (ppb) in 800,000 years, but now this extremely potent GHG, which is thirty times more powerful than CO2, is at 1,780 ppb. The sum total of all carbon-dioxide-equivalent (CO2-e) GHGs is currently at 435 ppm. Even more alarming in the 800,000-year record is the rate of change of GHGs, with increases in CO2 being below (and typically well below) 40 ppm within any past sub-period of ten thousand years, while now CO2 has risen by 40 ppm in just the last quarter century.

Thus, anthropogenic activity has elevated atmospheric CO2 and CH4 to levels extraordinarily far outside their natural range – and at a stupendously rapid rate. The scale and speed of recent GHG increases makes predictions of future climate change highly uncertain.  There is no analogue for anything like this happening in the past geological record. Therefore, we do not really know with much confidence what will happen next.”

“To keep atmospheric CO2 levels at twice pre-industrial-revolution levels would require not just stable but sharply declining emissions within a few decades from now. Forecasting ahead a century or two, the levels of atmospheric GHGs that may ultimately be attained (unless drastic measures are undertaken) have likely not existed for tens of millions of years and the rate of change will likely be unique on a time scale of hundreds of millions of years.

Remarkably, the “standard” CBA of climate change takes essentially no account of the extraordinary magnitude of the scale and speed of these unprecedented changes in GHGs – and the extraordinary uncertainties they create for any believable economic analysis of climate change. Perhaps even more astonishing is the fact that the “policy ramp” of gradually tightening emissions, which emerges from the “standard” CBA, attains stabilization at levels of CO2-e GHGs that approach 700 ppm. The “standard” CBA [of Nordhaus] thus recommends imposing an impulse or shock to the Earth’s system by geologically-instantaneously jolting atmospheric stocks of GHGs up to 2½ times their highest past level over the last 800,000 years – without even mentioning what an unprecedented planetary experiment such an “optimal” policy would entail.”

“So-called “climate sensitivity” (hereafter denoted S1) is a key macro-indicator of the eventual temperature response to GHG changes. Climate sensitivity is defined as the global average surface warming following a doubling of carbon dioxide concentrations. … the median upper 5% probability level over all 22 climate-sensitivity studies cited in IPCC-AR4 (2007) is 6.4° C – and this stylized fact alone is telling. Glancing at Table 9.3 and Box 10.2 of IPCC-AR4, it is apparent that the upper tails of these 22 PDFs tend to be sufficiently long and heavy with probability that one is allowed from a simplistically-aggregated PDF of these 22 studies the rough approximation P[S1>10° C] ≈ 1%. The actual empirical reason why these upper tails are long and heavy with probability dovetails nicely with the theory of my paper: inductive knowledge is always useful, of course, but simultaneously it is limited in what it can tell us about extreme events outside the range of experience – in which case one is forced back onto depending more than one might wish upon the prior PDF, which of necessity is largely subjective and relatively diffuse. As a recent Science commentary put it: “Once the world has warmed by 4° C, conditions will be so different from anything we can observe today (and still more different from the last ice age) that it is inherently hard to say where the warming will stop.”

“Exhibit C” concerns possibly disastrous releases over the long run of bad-feedback components of the carbon cycle that are currently omitted from most general circulation models. The chief worry here is a significant supplementary component that conceptually should be added on to climate sensitivity S1. This omitted component concerns the potentially powerful self-amplification potential of greenhouse warming due to heat-induced releases of sequestered carbon. … Over the long run, a CH4 outgassing-amplifier process could potentially precipitate a cataclysmic strong-positive-feedback warming. This real physical basis for a highly unsure but truly catastrophic scenario is my Exhibit C in the case that conventional CBAs and IAMs do not adequately cover the deep structural uncertainties associated with possible climate-change disasters.  Other examples of an actual real physical basis for a catastrophic outcome could be cited, but this one will do here.  The real physical possibility of endogenous heat-triggered releases at high temperatures of the enormous amounts of naturally-sequestered GHGs is a good example of indirect carbon-cycle feedback effects that I think should be included in the abstract interpretation of a concept of “climate sensitivity” that is relevant here. What matters for the economics of climate change is the reduced-form relationship between atmospheric stocks of anthropogenically-injected CO2-e GHGs and temperature change. … When fed into an economic analysis, the great open-ended uncertainty about eventual mean planetary temperature change cascades into yet-much-greater yet-much-more-open-ended uncertainty about eventual changes in welfare.”

“Exhibit D” concerns what I view as an unusually cavalier treatment of damages or disutilities from extreme temperature changes. The “standard” CBA treats high-temperature damages by a rather passive extrapolation of whatever specification is assumed (typically arbitrarily) to be the low-temperature “damages function.”  … Seemingly minor changes in the specification of high-temperature damages can dramatically alter the gradualist policy ramp outcomes recommended by the “standard” CBA. Such fragility of policy to postulated forms of disutility functions are my Exhibit D in making the case that the “standard” CBA does not adequately cope with deep structural uncertainty – here structural uncertainty about the specification of damages.”

“An experiment without precedent is being performed on planet Earth by subjecting the world to the shock of a geologically-instantaneous injection of massive amounts of GHGs. Yet the “standard” CBA seems almost oblivious to the extraordinarily uncertain consequences of catastrophic climate change.”

“Almost nothing in our world has a probability of exactly zero or exactly one. What is worrisome is not the fact that extreme tails are long per se (reflecting the fact that a meaningful upper bound on disutility does not exist), but that they are fat (with probability density). The critical question is how fast does the probability of a catastrophe decline relative to the welfare impact of the catastrophe. Other things being equal, a thin-tailed PDF is of less concern because the probability of the bad event declines exponentially (or faster). A fat-tailed distribution, where the probability declines polynomially, can be much more worrisome. … To put a sharp point on this seemingly abstract issue, the thin-tailed PDFs that Nordhaus requires implicitly to support his gradualist “policy ramp” conclusions have some theoretical tendency to morph into being fat tailed when he admits that he is fuzzy about the functional forms or structural parameters of his assumed thin-tailed PDFs – at least for high temperatures. … When one combines fat tails in the PDF of the logarithm of welfare-equivalent consumption with a utility function that is sensitive to high damages from extreme temperatures, it will tend to make the willingness to pay (WTP) to avoid extreme climate changes very large.”

“Presumably the PDF in the bad fat tail is thinned, or even truncated, perhaps from considerations akin to what lies behind the value of a statistical life (VSL). (After all, we would not pay an infinite amount to eliminate altogether the fat tail of climate-change catastrophes.) Alas, in whatever way the bad fat tail is thinned or truncated, a CBA based upon it remains highly sensitive to the details of the thinning or truncation mechanism, because the disutility of extreme climate change has “essentially” unlimited liability. In this sense climate change is unique (or at least very rare) because the conclusions from a CBA for such an unlimited-liability situation have some built-in tendency to be non-robust to assumed tail fatness.”

“Reasonable attempts to constrict the fatness of the “bad” tail can still leave us with uncomfortably big numbers, whose exact value depends non-robustly upon artificial constraints, functional forms, or parameters that we really do not understand. The only legitimate way to avoid this potential problem is when there exists strong a priori knowledge that restrains the extent of total damages. If a particular type of idiosyncratic uncertainty affects only one small part of an individual’s or a society’s overall portfolio of assets, exposure is naturally limited to that specific component and bad-tail fatness is not such a paramount concern. However, some very few but very important real-world situations have potentially unlimited exposure due to structural uncertainty about their potentially open-ended catastrophic reach. Climate change potentially affects the whole worldwide portfolio of utility by threatening to drive all of planetary welfare to disastrously low levels in the most extreme scenarios.”

“Conclusions from CBA [are] more fuzzy than we might prefer, because they are dependent on essentially arbitrary decisions about how the fat tails are expressed and about how the damages from high temperatures are specified. I would make a strong distinction between thin-tailed CBA, where there is no reason in principle that outcomes should not be robust, and fat-tailed CBA, where even in principle outcomes are highly sensitive to functional forms and parameter values. For ordinary run-of-the-mill limited exposure or thin-tailed situations, there is at least the underlying theoretical reassurance that finite-cutoff-based CBA might (at least in principle) be an arbitrarily-close approximation to something that is accurate and objective. In fat-tailed unlimited exposure situations, by contrast, there is no such theoretical assurance underpinning the arbitrary cutoffs or attenuations – and therefore CBA outcomes have a theoretical tendency to be sensitive to fragile assumptions about the likelihood of extreme impacts and how much disutility they cause.”

“My target is not CBA in general, but the particular false precision conveyed by the misplaced concreteness of the “standard” CBA of climate change. By all means plug in tail probabilities, plug in disutilities of high impacts, plug in rates of pure time preference, and so forth, and then see what emerges empirically. Only please do not be surprised when outcomes from fat-tailed CBA are fragile to specifications concerning catastrophic extremes.  The extraordinary magnitude of the deep structural uncertainties involved in climate-change CBA, and the implied limitations that prevent CBA from reaching robust conclusions, are highly frustrating for most economists, and in my view may even push some into a state of denial. After all, economists make a living from plugging rough numbers into simple models and reaching specific conclusions (more or less) on the basis of these numbers. What are we supposed to tell policy makers and politicians if our conclusions are ambiguous and fragile?”

“It is threatening for economists to have to admit that the structural uncertainties and unlimited liabilities of climate change run so deep that gung-ho “can do” economics may be up against limits on the ability of quantitative analysis to give robust advice in such a grey area. But if this is the way things are with the economics of climate change, then this is the way things are – and non-robustness to subjective assumptions is an inconvenient truth to be lived with rather than a fact to be denied or evaded just because it looks less scientifically objective in CBA. In my opinion, we economists need to admit to the policy makers, the politicians, and the public that CBA of climate change is unusual in being especially fuzzy because it depends especially sensitively on what is subjectively assumed about the high-temperature damages function, along with subjective judgements about the fatness of the extreme tails and/or where they have effectively been cut off. Policy makers and the public will just have to deal with the idea that CBA of climate change is less crisp (maybe I should say even less crisp) than CBAs of more conventional situations.”

“The moral of the dismal theorem is that under extreme uncertainty, seemingly casual decisions about functional forms, parameter values, and tail thickness may be dominant. We economists should not pursue a narrow, superficially precise, analysis by blowing away the low-probability high-impact catastrophic scenarios as if this is a necessary price we must pay for the worthy goal of giving crisp advice. An artificial infatuation with precision is likely to make our analysis go seriously askew and to undermine the credibility of what we say by effectively marginalizing the very possibilities that make climate change grave in the first place.”

“The issue of how to deal with the deep structural uncertainties in climate change would be completely different and immensely simpler if systemic inertias (like the time required for the system to naturally remove extra atmospheric CO2) were short (as is the case for SO2, particulates, and many other airborne pollutants). Then an important part of an optimal strategy would presumably be along the lines of “wait and see.” With strong reversibility, an optimal climate-change policy should logically involve (among other elements) waiting to see how far out on the bad fat tail the planet will end up, followed by midcourse corrections if we seem to be headed for a disaster. This is the ultimate backstop rebuttal of DT given by some critics of fat-tailed reasoning, including Nordhaus. Alas, the problem of climate change is characterized everywhere by immensely long inertias – in atmospheric CO2 removal times, in the capacity of the oceans to absorb heat (as well as CO2), and in many other relevant physical and biological processes. Therefore, it is an open question whether or not we could learn enough in sufficient time to make politically feasible midcourse corrections. When the critics are gambling on this midcourse-correction learning mechanism to undercut the message of DT, they are relying more on an article of faith than on any kind of evidence-based scientific argument.”

“I think the actual scientific facts behind the alleged feasibility of “wait and see” policies are, if anything, additional evidence for the importance of fat-tailed irreversible uncertainty about ultimate climate change.”

“The relevance of “wait and see” policies is an important unresolved issue, which in principle could decide the debate between me and Nordhaus, but my own take right now would be that the built-in pipeline inertias are so great that if and when we detect that we are heading for unacceptable climate change, it will likely prove too late to do anything much about it for centuries to come thereafter (except, possibly, for lowering temperatures by geoengineering the atmosphere to reflect back incoming solar radiation). In any event, I see this whole “wait and see” issue as yet another component of fat-tailed uncertainty – rather than being a reliable backstop strategy for dealing with excessive CO2 in the atmosphere.”

“Nordhaus states that there are so many low-probability catastrophic-impact scenarios around that ‘if we accept the Dismal Theorem, we would probably dissolve in a sea of anxiety at the prospect of the infinity of infinitely bad outcomes.’ This is rhetorical excess and, more to the point here, it is fallacious. Most of the examples Nordhaus gives have such miniscule thin-tailed probabilities that they can be written off.”

“Nordhaus summarizes his critique with the idea there are indeed deep uncertainties about virtually every aspect of the natural and social sciences of climate change – but these uncertainties can only be resolved by continued careful analysis of data and theories. I heartily endorse his constructive attitude about the necessity of further research targeted toward a goal of resolving as much of the uncertainty as it is humanly possible to resolve. I would just add that we should also recognize the reality that, for now and perhaps for some time to come, the sheer magnitude of the deep structural uncertainties, and the way we express them in our models, will likely dominate plausible applications of CBA to the economics of climate change.”

(emphasis added)

Let`s recreate the Paleocene! Giant snakes, "fat tails", cost-benefit analysis and climate change; Weitzman replies to Nordhaus

February 11th, 2009 1 comment

Giant snakes?  What could a few colossal bones found in Colombia have to do with us now?

1.  A recent paper in Nature about the discovery of several specimens of a giant snake (“Titanoboa”) that lived in Latin America 60 million years ago captured attention last week, including among climate change bloggers (yes, “skeptics” too).  Why?  Not only because the snakes were enormous (more than 40 feet long and over a ton) – making anacondas look like garter snakes – but because their size appears to tell us something about the climate during the Paleocene.  Based on existing knowledge of the size, metabolism and temperature tolerances of snakes, scientists infer that not only was the world overall quite warm during the Paleocene (with palms growing at the poles), but that average temperatures in the tropics would have had to be some 3° to 5° Celsius (5° to 9° F) warmer than they are today for such large snakes to survive.

The period in which these snakes lived was followed a few million years later by the Paleocene-Eocene Thermal Maximum (PETM), roughly 56 million years ago, when a pulse of CO2 and methane drove already warm temperatures sharply higher (by 5° Celsius / 9° F) in less than 10,000 years. During the PETM, CO2 levels rose to about 2000 ppm, or roughly five times where they are now. The PETM resulted in a massive extinction of species.

The size of the snakes, and the temperatures during their time and shortly afterwards during the PETM, also tell us that climate is sensitive (on geological time scales, sometimes rather short ones) to atmospheric levels of carbon dioxide and methane – and remind us that there is a “fat tail” of uncertain climate change risks posed by mankind`s ramped-up efforts to release as much as possible of the CO2 that has been stored up in the form of fossil fuels, methane and limestone over millions of years.

2.  I have mentioned the issue of “fat tails” previously, in connection with attempts at applying cost-benefit analysis (CBA) to determine whether to tax CO2 emissions.  While economists like Yale`s William Nordhaus who have applied CBA to climate policy have been saying for decades that taxing carbon makes sense on a net basis, our own Bob Murphy has criticized Nordhaus`s approach on rather narrow (and decidedly non-Austrian) grounds.

But Nordhaus has also been strongly criticized by economists such as Harvard`s Martin Weitzman, who basically argue that Nordhaus has UNDERSOLD the case for carbon pricing, and that the results of such CBA imply a greater certainty of knowledge (and invite more complacency) than is warranted.  Weitzman points out basic difficulties inherent in applying CBA to policies addressing climate change, particularly where there seems to be a grave possibility that we do not understand how drastically the climate might respond to our influences.  Weitzman`s comments (scheduled to appear in the February issue of The Review of Economics and Statistics) were the focus of the lead essay by Jim Manzi in Cato Unbound`s August 2008 issue, which I reviewed.

Nordhaus has since responded to Weitzman, and this time Bob Murphy stepped in as a defender of CBA.  Weitzman has now replied to Nordhaus, and has kindly permitted me to quote from the current draft of that reply.  Weitzman provides a compelling statement of some of the limits of CBA as applied to climate change, and it seems to me that any Austrian ought to be sympathetic to his criticisms.

(NB:  Weitzman`s draft response is a .pdf file that I cannot upload, though I have uploaded a version converted to .txt format.  I am happy to forward the .pdf to any interested readers.)

The rest of the post sets out the most salient (for a layman) of Weitzman`s key points:

“there is enormous structural uncertainty about the economics of extreme climate change,
which, if not unique, is pretty rare. I will argue on intuitive grounds
that the way in which this deep structural uncertainty is
conceptualized and formalized should influence substantially the
outcomes of any reasonable CBA (or IAM) of climate change. Further, I
will argue that the seeming fact that this deep structural
uncertainty does not influence substantially outcomes from the
“standard” CBA hints at an implausible treatment of uncertainty.”

“The
pre-industrial-revolution level of atmospheric CO2 (about two centuries
ago) was about 280 parts per million (ppm). The ice-core data show that
carbon dioxide was within a range roughly between 180 and 280 ppm
during the last 800,000 years. Currently, CO2 is at 385 ppm, and
climbing steeply. Methane was never higher than 750 parts per billion
(ppb) in 800,000 years, but now this extremely potent GHG, which is
thirty times more powerful than CO2, is at 1,780 ppb. The sum total of
all carbon-dioxide-equivalent (CO2-e) GHGs is currently at 435 ppm.
Even more alarming in the 800,000-year record is the rate of change of
GHGs, with increases in CO2 being below (and typically well below) 40
ppm within any past sub-period of ten thousand years, while now CO2 has
risen by 40 ppm in just the last quarter century.

Thus, anthropogenic
activity has elevated atmospheric CO2 and CH4 to levels extraordinarily
far outside their natural range – and at a stupendously rapid rate. The
scale and speed of recent GHG increases makes predictions of future
climate change highly uncertain.  There is no analogue for anything
like this happening in the past geological record. Therefore, we do not
really know with much confidence what will happen next.”
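
Weitzman`s point about rates bears a little arithmetic. Here is a minimal sketch (my own illustration in Python, not Weitzman`s, using only the figures quoted above) comparing the current pace of CO2 accumulation with the fastest sustained pace implied by the 800,000-year ice-core record:

# Rough comparison of CO2 growth rates, using the figures quoted above.
# (My illustration, not Weitzman's.)
current_rise_ppm = 40.0      # rise over roughly the last 25 years
current_years = 25.0
past_max_rise_ppm = 40.0     # maximum rise within any past 10,000-year sub-period
past_years = 10_000.0

current_rate = current_rise_ppm / current_years     # ~1.6 ppm per year
past_max_rate = past_max_rise_ppm / past_years      # ~0.004 ppm per year

print(f"Current rate:  {current_rate:.2f} ppm/yr")
print(f"Past maximum:  {past_max_rate:.4f} ppm/yr")
print(f"Ratio: roughly {current_rate / past_max_rate:.0f}x faster")

On these numbers the current rate of increase is on the order of 400 times the fastest sustained rate in the ice-core record – which is the sense in which the recent changes are without geological analogue.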

“To keep atmospheric CO2 levels at twice pre-industrial-revolution levels would require not just stable but sharply declining emissions within a few decades from now. Forecasting
ahead a century or two, the levels of atmospheric GHGs that may
ultimately be attained (unless drastic measures are undertaken) have
likely not existed for tens of millions of years and the rate of change
will likely be unique on a time scale of hundreds of millions of years.

Remarkably,
the “standard” CBA of climate change takes essentially no account of the
extraordinary magnitude of the scale and speed of these unprecedented
changes in GHGs – and the extraordinary uncertainties they create for
any believable economic analysis of climate change.
Perhaps even
more astonishing is the fact that the “policy ramp” of gradually
tightening emissions, which emerges from the “standard” CBA, attains
stabilization at levels of CO2-e GHGs that approach 700 ppm. The
“standard” CBA [of Nordhaus] thus recommends imposing an impulse or
shock to the Earth’s system by geologically-instantaneously jolting
atmospheric stocks of GHGs up to 2½ times their highest past level
over the last 800,000 years – without even mentioning what an
unprecedented planetary experiment such an “optimal” policy would
entail.”

“So-called
“climate sensitivity” (hereafter denoted S1) is a key macro-indicator
of the eventual temperature response to GHG changes. Climate
sensitivity is defined as the global average surface warming following
a doubling of carbon dioxide concentrations. … the median upper 5%
probability level over all 22 climate-sensitivity studies cited in
IPCC-AR4 (2007) is 6.4° C – and this stylized fact alone is telling.
Glancing at Table 9.3 and Box 10.2 of IPCC-AR4, it is apparent that the
upper tails of these 22 PDFs tend to be sufficiently long and heavy
with probability that one is allowed from a simplistically-aggregated
PDF of these 22 studies the rough approximation P[S1 > 10° C] ≈ 1%. The
actual empirical reason why these upper tails are long and heavy with
probability dovetails nicely with the theory of my paper: inductive
knowledge is always useful, of course, but simultaneously it is limited
in what it can tell us about extreme events outside the range of
experience – in which case one is forced back onto depending more than
one might wish upon the prior PDF, which of necessity is largely
subjective and relatively diffuse. As a recent Science commentary put
it: “Once the world has warmed by 4° C, conditions will be so
different from anything we can observe today (and still more different
from the last ice age) that it is inherently hard to say where the
warming will stop.”
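
To put a rough number on what a long, heavy upper tail means, here is a purely illustrative calculation of my own (Python; the lognormal form, the 3° C median, and the use of Weitzman`s 6.4° C figure as a 95th percentile are all my assumptions, not anything in his paper):

from scipy.stats import lognorm
import numpy as np

median_S = 3.0   # assumed median climate sensitivity (deg C) - my illustrative choice
p95_S = 6.4      # the median upper-5% level cited by Weitzman, treated here as a 95th percentile

# Fit a lognormal: S = exp(mu + sigma*Z), with Z standard normal.
mu = np.log(median_S)
sigma = (np.log(p95_S) - mu) / 1.645    # 1.645 is the 95th percentile of a standard normal

S = lognorm(s=sigma, scale=np.exp(mu))
print(f"P(S > 10 deg C) = {S.sf(10.0):.2%}")   # roughly 0.4-0.5%

Even under these deliberately mild, thin-ish assumptions the probability of exceeding 10° C comes out at roughly half a percent; the genuinely fat-tailed PDFs Weitzman aggregates put something closer to 1% of the mass out there, and it is exactly that region that drives his results.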

“Exhibit C” concerns possibly disastrous releases over the long run of bad-feedback components
of the carbon cycle that are currently omitted from most general
circulation models. The chief worry here is a significant supplementary
component that conceptually should be added on to climate sensitivity
S1. This omitted component concerns the potentially powerful
self-amplification potential of greenhouse warming due to heat-induced
releases of sequestered carbon. … Over the long run, a CH4
outgassing-amplifier process could potentially precipitate a
cataclysmic strong-positive-feedback warming
. This real physical
basis for a highly unsure but truly catastrophic scenario is my Exhibit
C in the case that conventional CBAs and IAMs do not adequately cover
the deep structural uncertainties associated with possible
climate-change disasters.  Other examples of an actual real physical
basis for a catastrophic outcome could be cited, but this one will do
here.  The real physical possibility of endogenous heat-triggered
releases at high temperatures of the enormous amounts of
naturally-sequestered GHGs is a good example of indirect carbon-cycle
feedback effects that I think should be included in the abstract
interpretation of a concept of “climate sensitivity” that is relevant
here. What matters for the economics of climate change is the
reduced-form relationship between atmospheric stocks of
anthropogenically-injected CO2-e GHGs and temperature change. … When
fed into an economic analysis, the great open-ended uncertainty about
eventual mean planetary temperature change cascades into
yet-much-greater yet-much-more-open-ended uncertainty about eventual
changes in welfare.”

“Exhibit
D” concerns what I view as an unusually cavalier treatment of damages or
disutilities from extreme temperature changes. The “standard” CBA
treats high-temperature damages by a rather passive extrapolation of
whatever specification is assumed (typically arbitrarily) to be the
low-temperature “damages function.”  … Seemingly minor changes in
the specification of high-temperature damages can dramatically alter
the gradualist policy ramp outcomes recommended by the “standard” CBA.

Such fragility of policy to postulated forms of disutility functions
is my Exhibit D in making the case that the “standard” CBA does not
adequately cope with deep structural uncertainty – here structural
uncertainty about the specification of damages.”

“An
experiment without precedent is being performed on planet Earth by
subjecting the world to the shock of a geologically-instantaneous
injection of massive amounts of GHGs. Yet the “standard” CBA seems
almost oblivious to the extraordinarily uncertain consequences of
catastrophic climate change.”

“Almost
nothing in our world has a probability of exactly zero or exactly one.
What is worrisome is not the fact that extreme tails are long per se
(reflecting
the fact that a meaningful upper bound on disutility does not exist),
but that they are fat (with probability density). The critical
question is how fast does the probability of a catastrophe decline
relative to the welfare impact of the catastrophe. Other things being
equal, a thin-tailed PDF is of less concern because the probability of
the bad event declines exponentially (or faster). A fat-tailed
distribution, where the probability declines polynomially, can be much
more worrisome.
… To put a sharp point on this seemingly abstract issue, the
thin-tailed PDFs that Nordhaus requires implicitly to support his
gradualist “policy ramp” conclusions have some theoretical tendency to
morph into being fat tailed when he admits that he is fuzzy about the
functional forms or structural parameters of his assumed thin-tailed
PDFs
– at least for high temperatures. … When one combines fat
tails in the PDF of the logarithm of welfare-equivalent consumption
with a utility function that is sensitive to high damages from extreme
temperatures, it will tend to make the willingness to pay (WTP) to
avoid extreme climate changes very large.”
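
The difference between exponential and polynomial tail decay is stark once you look at actual numbers. A minimal sketch (mine, in Python; the unit-rate exponential and the Pareto with exponent 2 are arbitrary illustrative choices) makes the comparison:

import math

def exp_tail(x, rate=1.0):
    # P(X > x) for an exponential distribution -- a thin tail
    return math.exp(-rate * x)

def pareto_tail(x, alpha=2.0, xm=1.0):
    # P(X > x) for a Pareto distribution with x >= xm -- a fat tail
    return (xm / x) ** alpha

for x in (2, 5, 10, 20):
    thin = exp_tail(x)
    fat = pareto_tail(x)
    print(f"x={x:>2}:  thin tail {thin:.2e}   fat tail {fat:.2e}   ratio {fat/thin:.1e}")

The two tails can be calibrated to look alike near the middle of the distribution; twenty “units” out, the fat tail is about a million times more probable, and that is precisely the region where the disutilities are largest.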

“Presumably
the PDF in the bad fat tail is thinned, or even truncated, perhaps from
considerations akin to what lies behind the value of a statistical life
(VSL). (After all, we would not pay an infinite amount to eliminate
altogether the fat tail of climate-change catastrophes.) Alas, in
whatever way the bad fat tail is thinned or truncated, a CBA based upon
it remains highly sensitive to the details of the thinning or
truncation mechanism, because the disutility of extreme climate change
has “essentially” unlimited liability.
In this sense climate change
is unique (or at least very rare) because the conclusions from a CBA
for such an unlimited-liability situation have some built-in tendency
to be non-robust to assumed tail fatness.”
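
Weitzman`s non-robustness claim can also be seen in a toy calculation (my own schematic sketch in Python, not anything in his paper; the Pareto tail with exponent 2 and the quadratic disutility function are arbitrary choices). The expected disutility computed under a truncation point T never settles down – here it grows like log T – so the CBA “answer” is largely a statement about where one chose to cut off the tail:

from scipy.integrate import quad

ALPHA, XM = 2.0, 1.0            # Pareto tail: density 2/x**3 for x >= 1

def pareto_pdf(x):
    return ALPHA * XM**ALPHA / x**(ALPHA + 1)

def damage(x):
    return x**2                 # convex (quadratic) disutility -- purely illustrative

for T in (10, 100, 1_000, 10_000):
    expected, _ = quad(lambda x: damage(x) * pareto_pdf(x), XM, T)
    print(f"truncate at {T:>6}:  expected disutility = {expected:.1f}")

Every additional factor of ten in the cutoff adds the same fixed increment to the expected disutility, which is a concrete version of Weitzman`s point that conclusions are non-robust to the thinning or truncation mechanism.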

“Reasonable
attempts to constrict the fatness of the “bad” tail can still leave us
with uncomfortably big numbers, whose exact value depends non-robustly
upon artificial constraints, functional forms, or parameters that we
really do not understand. The only legitimate way to avoid this
potential problem is when there exists strong a priori knowledge that
restrains the extent of total damages.
If a particular type of
idiosyncratic uncertainty affects only one small part of an
individual’s or a society’s overall portfolio of assets, exposure is
naturally limited to that specific component and bad-tail fatness is
not such a paramount concern. However, some very few but very
important real-world situations have potentially unlimited exposure due
to structural uncertainty about their potentially open-ended
catastrophic reach. Climate change potentially affects the whole
worldwide portfolio of utility by threatening to drive all of planetary
welfare to disastrously low levels in the most extreme scenarios.”

“Conclusions
from CBA [are] more fuzzy than we might prefer, because they are
dependent on essentially arbitrary decisions about how the fat tails
are expressed and about how the damages from high temperatures are
specified.
I would make a strong distinction between thin-tailed
CBA, where there is no reason in principle that outcomes should not be
robust, and fat-tailed CBA, where even in principle outcomes are
highly sensitive to functional forms and parameter values. For ordinary
run-of-the-mill limited exposure or thin-tailed situations, there is at
least the underlying theoretical reassurance that finite-cutoff-based
CBA might (at least in principle) be an arbitrarily-close approximation
to something that is accurate and objective. In fat-tailed unlimited
exposure situations, by contrast, there is no such theoretical
assurance underpinning the arbitrary cutoffs or attenuations – and
therefore CBA outcomes have a theoretical tendency to be sensitive to
fragile assumptions about the likelihood of extreme impacts and how
much disutility they cause.”

“My
target is not CBA in general, but the particular false precision
conveyed by the misplaced concreteness of the “standard” CBA of climate
change. By all means plug in tail probabilities, plug in disutilities
of high impacts, plug in rates of pure time preference, and so forth,
and then see what emerges empirically. Only please do not be surprised
when outcomes from fat-tailed CBA are fragile to specifications
concerning catastrophic extremes.  The extraordinary magnitude of the
deep structural uncertainties involved in climate-change CBA, and the
implied limitations that prevent CBA from reaching robust conclusions,
are highly frustrating for most economists, and in my view may even
push some into a state of denial. After all, economists make a living
from plugging rough numbers into simple models and reaching specific
conclusions (more or less) on the basis of these numbers. What are we
supposed to tell policy makers and politicians if our conclusions are
ambiguous and fragile?”

“It is
threatening for economists to have to admit that the structural
uncertainties and unlimited liabilities of climate change run so deep
that gung-ho “can do” economics may be up against limits on the ability of quantitative analysis to give robust advice in such a grey area. But if this is the way things are with the economics of climate change, then this is the way things are – and non-robustness to subjective assumptions is an inconvenient truth to be lived with rather than a fact to be denied or evaded
just because it looks less scientifically objective in CBA. In my
opinion, we economists need to admit to the policy makers, the
politicians, and the public that CBA of climate change is unusual
in being especially fuzzy because it depends especially sensitively on
what is subjectively assumed about the high-temperature damages
function, along with subjective judgements about the fatness of the
extreme tails and/or where they have effectively been cut off
.
Policy makers and the public will just have to deal with the idea that
CBA of climate change is less crisp (maybe I should say even less
crisp) than CBAs of more conventional situations.”

“The
moral of the dismal theorem is that under extreme uncertainty,
seemingly casual decisions about functional forms, parameter values,
and tail thickness may be dominant. We economists should not pursue
a narrow, superficially precise, analysis by blowing away the
low-probability high-impact catastrophic scenarios as if this is a
necessary price we must pay for the worthy goal of giving crisp advice.
An artificial infatuation with precision is likely to make our analysis
go seriously askew and to undermine the credibility of what we say by
effectively marginalizing the very possibilities that make climate
change grave in the first place.

“The
issue of how to deal with the deep structural uncertainties in climate
change would be completely different and immensely simpler if systemic
inertias (like the time required for the system to naturally remove
extra atmospheric CO2) were short (as is the case for SO2,
particulates, and many other airborne pollutants). Then an important
part of an optimal strategy would presumably be along the lines of
“wait and see.” With strong reversibility, an optimal
climate-change policy should logically involve (among other elements)
waiting to see how far out on the bad fat tail the planet will end up,
followed by midcourse corrections if we seem to be headed for a
disaster. This is the ultimate backstop rebuttal of DT given by some
critics of fat-tailed reasoning, including Nordhaus. Alas, the problem
of climate change is characterized everywhere by immensely long
inertias – in atmospheric CO2 removal times, in the capacity of the
oceans to absorb heat (as well as CO2), and in many other relevant
physical and biological processes. Therefore, it is an open question
whether or not we could learn enough in sufficient time to make
politically feasible midcourse corrections. When the critics are
gambling on this midcourse-correction learning mechanism to undercut
the message of DT, they are relying more on an article of faith than on
any kind of evidence-based scientific argument.

“I
think the actual scientific facts behind the alleged feasibility of
“wait and see” policies are, if anything, additional evidence for the
importance of fat-tailed irreversible uncertainty about ultimate
climate change.

“The
relevance of “wait and see” policies is an important unresolved issue,
which in principle could decide the debate between me and Nordhaus, but
my own take right now would be that the built-in pipeline inertias
are so great that if and when we detect that we are heading for
unacceptable climate change, it will likely prove too late to do
anything much about it for centuries to come thereafter
(except,
possibly, for lowering temperatures by geoengineering the atmosphere to
reflect back incoming solar radiation). In any event, I see this whole
“wait and see” issue as yet another component of fat-tailed uncertainty
– rather than being a reliable backstop strategy for dealing with
excessive CO2 in the atmosphere.

Nordhaus
states that there are so many low-probability catastrophic-impact
scenarios around that ‘if we accept the Dismal Theorem, we would
probably dissolve in a sea of anxiety at the prospect of the infinity
of infinitely bad outcomes.’ This is rhetorical excess and, more to the
point here, it is fallacious. Most of the examples Nordhaus gives have
such minuscule thin-tailed probabilities that they can be written off.”

Nordhaus
summarizes his critique with the idea that there are indeed deep
uncertainties about virtually every aspect of the natural and social
sciences of climate change – but that these uncertainties can only be
resolved by continued careful analysis of data and theories. I heartily
endorse his constructive attitude about the necessity of further
research targeted toward a goal of resolving as much of the uncertainty
as it is humanly possible to resolve.
I would just add that we
should also recognize the reality that, for now and perhaps for some
time to come, the sheer magnitude of the deep structural uncertainties,
and the way we express them in our models, will likely dominate
plausible applications of CBA to the economics of climate change.”

(emphasis added)

Public spending gave Japan its "Lost Decade" and largest public debt in the developed world; Geithner wants to do it bigger

February 9th, 2009 No comments

What did Japan get from sustained and massive public works spending by the LDP after a real estate bubble burst in the late 1980s?  According to a recent article in the IHT, one thing is clear:  taxpayers ended up being saddled with the largest public debt in the developed world, totaling 180 percent of its $5.5 trillion economy.

While there are disputes over how to view the results, the Japanese appear to have learned a lesson; US officials like Treasury Secretary Timothy Geithner, who spent time as a financial attaché in Japan after the collapse, appear determined to repeat the experiment on a larger scale.

Excerpts from the article:

Economists tend to divide into two camps on the question of Japan’s infrastructure spending: those, many of them Americans like Geithner, who think it did not go far enough; and those, many of them Japanese, who think it was a colossal waste.

Among ordinary Japanese, the spending is widely disparaged for having turned the nation into a public-works-based welfare state and making regional economies dependent on Tokyo for jobs. Much of the blame has fallen on the Liberal Democratic Party, which has long used government spending to grease rural vote-buying machines that help keep the party in power. …

Beyond that, proponents of Keynesian-style stimulus spending in the United States say that the Japanese approach failed to accomplish more not because of waste but because it was never undertaken wholeheartedly. They argue that instead of making one big push to pump up the economy with economic shock therapy, Japan spread its spending out over several years, diluting the effects.

After years of heavy spending in the first half of the 1990s, economists say, Japanese leaders grew concerned about growing budget deficits and cut back too soon, snuffing out the recovery in its infancy, much as Roosevelt did to the U.S. economy in 1936. Growth that, by 1996, had reached 3 percent was suffocated by premature spending cuts and tax increases, they say. While spending remained high in the late 1990s, Japan never gave the economy another full-fledged push, these economists say.

They also say that the size of Japan’s apparently successful stimulus in the early 1990s suggests that the United States will need to spend far more than the current $820 billion to get results. Between 1991 and 1995, Japan spent some $2.1 trillion on public works, in an economy roughly half as large as that of the United States, according to the Cabinet Office. “Stimulus worked in Japan when it was tried,” said David Weinstein, a professor of Japanese economics at Columbia University. “Japan’s lesson is that, if anything, the current U.S. stimulus will not be enough.”
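
Weinstein`s “not enough” can be made concrete with some simple arithmetic (my own back-of-the-envelope sketch in Python, using only the figures quoted in the article; the factor of two for relative economy size is the article`s “roughly half as large”):

japan_spending = 2.1e12    # public works spending, 1991-1995, per the Cabinet Office
us_to_japan_scale = 2.0    # the U.S. economy is described as roughly twice as large
us_package = 0.82e12       # the stimulus package then under debate

equivalent_us_effort = japan_spending * us_to_japan_scale
print(f"Japan's effort, scaled to the U.S. economy: ${equivalent_us_effort/1e12:.1f} trillion")
print(f"Proposed U.S. package: ${us_package/1e12:.2f} trillion")
print(f"Ratio: about {equivalent_us_effort/us_package:.0f}x")

On the article`s own numbers, a Japan-scale effort in the U.S. would come to roughly $4 trillion over five years – about five times the package then being debated.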

Categories: Uncategorized Tags: