Search Results

Keyword: ‘utility’

Enviro-Trek IV: In which your intrepid reporter boldly discusses "tragedy of the commons" and "property" with corrupted climate scientists and AGW co-religionists!

May 18th, 2009

 

Further to my prior posts, here are my more recent comments over at the remarkable RealClimate thread started by climate scientist Gavin Schmidt, specifically to discuss the “tragedy of the commons” paradigm in the context of domestic and international wrangling over climate policy:

 

544:  TokyoTom Says: 

530: “our temporary endowment of hydrocarbons … [is] currently almost a monoculture and it has developed a set of entrenched players who feel very threatened when confronted with the possibility that consumers may have a choice about where to plug in their toasters.”

Doug, you`ve correctly identified that SOMEONE feels threatened about where people plug in their toasters, but it ain`t the fossil fuel industry; it`s the so-called “public utilities”, which are NOT owned by fossil fuel producers, and which have persuaded states to give them local monopolies and to wall them off from competition, in exchange for regulation of how rates are set.

Consumers get screwed all around, since they can`t purchase power from whom they want, by type of generating source, or by time of day (peak v. off-peak); they largely can`t easily monitor their own use; they have limited ability to sell power back to the utility; and the utilities have no incentive to invest in long-range transmission (which would allow greater competition among generators) unless the local regulator is willing to allow cost recovery.

As the whole pent-up demand for green energy is caused by the state/local grants of monopoly, perhaps environmentalists, rather than pushing for more government involvement, might consider asking for an end to public utility monopolies:

http://mises.org/daily/2264

 

545:  TokyoTom Says: 

#438: “But Rene isn’t talking about incorporating private ownership as part of a management strategy, but rather selling off the resources and getting rid of any collective from-above management strategy altogether, from forbidding government managers from setting goals (for instance, sustainability) at all.

When these schemes work it is typically due to some sort of collective mechanism above and beyond the whim of the individual owner of a fishery or other stock.”

dhogaza, you persist in finding an enemy in every friend. Nowhere has Rene (or I) advocated ANY form of privatization scheme, much less insisted on one that eliminates all government oversight (which of course, for as long as governments exist, is impossible anyway). In any case, in all of the cases where open-access-type resources are centrally managed, we can only expect gradual steps away from that, as politicians like to maintain their positions as gatekeepers for favors and we rarely see bureaucrats volunteer to lighten their own oversight purview.

“We have exceptions where individual owners put long-term sustainabiliity and non-economic values as a priority (I mentioned Gilchrist lumber here in Oregon as an example). But these are notable precisely because they’re *exceptions*.”

I understand your concern about the timeframes in which humans act, but there is an irreducible difficulty in fashioning institutions with longer-term views, as they are all populated by people. Even resources in the hands of governments are subject to human whim, such as Cheney`s allocation of scarce water in Oregon in ways that favored Republican farmers over salmon, Native Americans and fishermen, and Bush`s widescale gas leasing in the Front Range, against the opposition of ranchers and hunters.

Further, you and others keep forgetting that many private owners lead the way in environmental protection; many state parks have their roots in privately preserved land that, in order to avoid the tax man, was subsequently handed over to the state. The Nature Conservancy (which represents its individual members) protects valuable parcels not by seeking government regulation, but by buying them (or conservation easements) outright.

Another problem you point to is that of conflicts between community interests and the interests of individual owners and interloping buyers (individuals or firms). It seems to me that the greatest problem relates not to the ownership of property, but to the willingness of giant corporations to listen to the communities in which they operate. Some do a better job than others, but I do think that the problems with corporations also have their roots in gifts by governments to relatively wealthy investors: http://mises.org/Community/blogs/tokyotom/search.aspx?q=limited. Many large firms are run in order to put money first in the pockets of executives, with employees and investors next, under circumstances that encourage risk-taking rather than truly conservative behavior (as can be seen from the financial crisis).

 

547:  TokyoTom Says: 

#408: “The “climate commons” are the biggest ones of all. They cannot be contained, users cannot be easily left out. Even market-based solutions demand an international enforceable regulation to forbid, tax or at least know who´s emmitting how much, and who has to pay to whom for what.”

Alexandre, thanks for your comments; I largely agree.

The fact that the atmosphere is a global commons means no government can act effectively alone; that`s why Gavin`s metaphor of the multi-party international negotiations as a tragedy of the commons is apt. It`s also why fear of government “fiat” is rather misdirected, as in essence all major emitting governments (and their chief constituencies) have to reach a COMMON agreement. The situation is much like ranchers reaching terms of use on a range, or fishermen agreeing how to manage a fishery:

http://mises.org/Community/blogs/tokyotom/archive/2008/07/14/are-pigovian-taxes-coasean-if-they-are-not-fixed-by-one-government-but-rather-the-product-of-negotiations-among-many.aspx

 

550:  TokyoTom Says: 

#484: “Tosh, to put it bluntly. The ratio of greenwash to real change is vast. Moreover, only retail businesses are subject to any significant consumer pressure even to undertake greenwashing. It has been legislation and in some cases international agreements that have mitigated damage from food adulteration, lead in fuel and paint, acid rain, and ozone-destroying chemicals.”

Nick, “tosh”? Now I`m really offended! ;)

I never argued that consumer pressure was by itself adequate in all cases. Presumably you agree that consumer pressure has proven to be useful, even as you downplay it. The fact of greenwashing is itself an indication that consumer opinion matters, even as people remain susceptible to deception – which is why there remain entrepreneurial opportunities for certification organizations, consumer reporting, etc.

I would love to see some consumer boycotts of unsustainably caught bluefin, in order to lead the way for regulatory/treaty changes that I certainly agree are needed; and the role of moral suasion and the struggle for the moral high ground is not to be denied on the climate change issue (which is why Gore in some ways is a self-hamstrung figure – the man wouldn`t know a hairshirt if it hit him in the face).

 

608:  TokyoTom Says: 

#419: Missed this:

“Slavery was brought up because of the idiotic contention posted that owning something means you take good care of it. And, BTW, some Libertarian philosophers have touted “voluntary slavery” as a solution to unemployment. You see, you have a property right in yourself, so you also have the right to sell it.”

Barton, I don`t speak for Rene, but I think the chief point is the largely uncontroversial contention that people are more likely to take better care of things that they own, relative to the possessions of others or things that nobody owns. Feel free to quibble about the failures of property rights, but are we completely disagreeing on the big picture and what drives the “tragedy of the commons”?

As for slavery, surely you can recognize that what those libertarians are discussing are still voluntary transactions between consenting persons, not the theft and enslavement of others by violence and force. The two are just not the same.

As to the former, do you have any idea about the ways that many of our forefathers funded their expensive passage to the young colonies/US? Ever hear of “indentured servitude”?

 

Bureaucrash and Al Gore: Is CEI on a problem-solving, "libertarian" mission, or just another cynical supporter of statist beneficiaries of the status quo?

May 15th, 2009

CEI funds, staffs and supports the relatively young, growing and interesting "Bureaucrash" grassroots libertarian social action site, by which CEI tries to tap into some of the discontent with government that has bloomed over the Bush presidency.  I`ve opened an account there and cross-posted some of my blog posts.

I received the following by email from Bureaucrash:

A message to all members of Bureaucrash Social

Crashers,

Be sure to visit http://cei.org/1984 to see how politicians and bureaucrats are trying to turn 2009 into 1984 by taking control of the entire economy via energy policies.  CEI has produced a video showing that Al Gore and those like him are really just Big Brother in green clothes.

You can write your Congressman today and make your voice heard by clicking on the “Write Your Congressman” link on the page.

In liberty,

Cord Blomquist

Visit Bureaucrash Social at: http://social.bureaucrash.com

I responded directly at the Bureaucrash post (which may be accessible only to those who have registered); I copy my response below:

Cord, I think CEI has been playing a counter-productive role for quite some time, and is still doing so.

Rather than taking the lead in (1) finding property-rights-based or related approaches to blindingly obvious commons problems – caused either by open access without property rights or by government regulation and favoritism/kleptocracy – in important local, regional and global resources (tropical deforestation (theft of native title), the atmosphere (AGW and pollution in the US, China & India) and oceans (crashing fisheries, dead zones & rising pH)), or (2) calling for deregulation to enhance competition, consumer choice and efficiency in the “public utility” sector, CEI has played a denialist, delayist and ad hom game, all in ways quite contrary to libertarian principles – principles that seek ways to resolve problems by enhancing the ability of people to express their preferences through market transactions.

This is puzzling, since there are many topics on which libertarians can productively engage with enviros, who are starting to see the merit of property rights approaches to fisheries, and whose frustration on power issues is readily understandable, as Lew Rockwell has noted: The Real Cause of Blackouts

Given CEI`s unproductive and in-your-face approach to environmentalists, despite all of the obvious problems and unexplored ground, one is tempted to wonder whether CEI`s purpose is not to achieve any positive action, but – by talking about how environmentalists are always wrong and by talking a great game about market principles while steadily ignoring the way that government has long been used to benefit particular special interests – simply to defend the status quo, which obviously is NOT based on pure free markets, but on massive governmental intervention that benefits particular firms and investors.

More on Boone Pickens and power regulation in Texas: in which I test whether Rob Bradley/Master Resource is still blocking my posts

April 25th, 2009

Here`s Bradley`s post, A Texas-Sized Energy Problem: Republicans, Democrats, and ‘Baptists & Bootleggers’ Running Wild in the Lone Star State (Obama sends his thanks).  I left a short note wondering how Bradley could have made it through a generally observant post without referring to all of the sweet deals that Boone Pickens managed to buy from the thoroughly Republican Texas legislature; it`s also a puzzle why he didn`t call for public utility deregulation.

Rob banned me from his blog 6 weeks back; I`m checking to see if he`s reconsidered.

Here`s the comment I left:

TokyoTom { 04.25.09 at 1:43 pm }

Rob, why not go the extra step and identify Boone Pickens as the chief bootlegger, and the shameful way that the Republican legislature let him buy rights of eminent domain?

http://mises.org/Community/blogs/tokyotom/search.aspx?q=pickens

Steve Milloy has also written astutely on this.

Your comment is awaiting moderation.

Categories: Pickens, Rob Bradley

In which I applaud another balanced, productive post by Dr. Reisman, and draw attention to a post by Lew Rockwell on the need for more power competition

April 23rd, 2009

[Snark Factor:  Ridiculously High]

In honor of Earth Day, yesterday Dr. George Reisman, Professor Emeritus of Economics at Pepperdine University and author of Capitalism: A Treatise on Economics, put up a fun little post that mocks the full-employment arguments made by President Obama on behalf of environmentalists and investors in the wind and solar power industries.

On the comment thread, I couldn`t resist expressing my appreciation, while introducing newer readers to the deeper challenge to which Dr. Reisman invites his readers:

I too have enjoyed another delightful article from Dr. Reisman; bravo!

But Dr. Reisman`s style does seem to present problems of interpretation for some readers, who do not seem to understand that while Dr. Reisman appears simply to be bashing environmentalists or environmentalism generally (by focusing on the most absurd arguments that some of them offer), he is in fact challenging his readers to do precisely what he has studiously avoided.

That is, far from simply pulling the wings off of flies as he might seem to some, Dr. Reisman is actually suggesting that serious students of economics and libertarian approaches to society should diligently:

– seek to engage others productively and with sympathy, in a manner carefully designed to improve the functioning of markets and ancillary institutions that enhance plan formation across society;

– note that there are many important, valuable open-access/unowned resources and government-owned resources in which property rights and pricing mechanisms are working poorly at best;

– acknowledge that while proposed “solutions” offered by environmentalists may be misguided, enviros have legitimate preferences as to how such resources should be protected, managed and distributed; and

– recognize that the concerns of enviros frequently arise in response to government interventions that have clearly benefitted powerful insiders, including wealthy investors and large enterprises, while shifting costs and risks more broadly.

As a result, Dr. Reisman`s tongue-in-cheek posts are in fact searing indictments of the status quo and the fat cats who are using government to stifle open competition, consumer choice and innovation, while frequently generating large external costs. Unlike some who spoil the fun by engaging in the pedestrian task of spelling out the problems with the status quo that enviros are right to be dissatisfied with, Dr. Reisman treats his readers as adults by bracingly challenging them to use their thinking caps and to clear their own heads.

For those for whom this task is too difficult, perhaps this piece by Lew Rockwell might be a good start:

“Just who is in charge of getting electricity to residents? A public utility, which, in the absurd American lexicon, means “state-run” and “state-managed,” perhaps with a veneer of private trappings. If you look at the electrical grid on a map, it is organized by region. If you look at the jurisdiction of management, it is organized by political boundaries.

“In other ways, the provision of power is organized precisely as a central planner of the old school might plan something: not according to economics but according to some textbook idea of how to be “organized.” It is “organized” the same way the Soviets organized grain production or the New Deal organized bridge building.

“All of this centralization and cartelization began nearly a century ago, as Robert Bradley points out in Energy: The Master Resource, when industry leaders obtained what was known as a regulatory covenant. They received franchise protection from market competition in exchange for which they agreed to price controls based on a cost-plus formula — a formula that survives to this day.

“Then the economists got involved ex post and declared that electrical power is a “public good,” under the belief that private enterprise is not up to the job of providing the essentials of life.

“What industry leaders received from this pact with the devil was a certain level of cartel-like protection, the same type that the English crown granted tea or the US government grants first-class postal mail. It is a government privilege that subjects them to regulation and immunizes companies from business failure. It’s great for a handful of producers, but not so great for everyone else.

“There are many costs. Customers are not in charge. They are courted only for political reasons but they are not the first concern of the production process. Entrepreneurial development is hindered. Our current system of electrical provision is stuck in time. Meanwhile, sectors that provide DSL and other forms of internet and telecommunication services are expanding and advancing day by day — not with perfect results but at least with the desire to serve consumers.

“How New York and California consumers would adore a setting in which power companies were begging for their business and encouraging them to turn down their thermostats to the coldest point. Competition would lead to price reductions, innovation, and an ever greater variety of services — the same as we find in the computer industry.

“What we are learning in our times is that no essential sector of life can be entrusted to the state. Energy is far too important to the very core of life to be administered by a bureaucracy that lacks the economic means to provide for the public. How it should be organized we can’t say in advance: it should be left to the markets. Whatever the result, you can bet the grid would not look like it does today, nor would its management be dependent on the whims of political jurisdiction.

“What we need today is full, radical, complete, uncompromised deregulation and privatization. We need competition. That doesn’t mean that we need two or more companies serving every market (though that was common up through the 1960s). What we need is the absence of legal barriers to enter the market.”

Thanks, again, Dr. Reisman, for challenging us, and not pandering to the dullest and laziest among us, the way Lew Rockwell does!

Your admiring pupil (and fellow enviro-hater),

TT

Published: April 23, 2009 5:32 AM

 

For those who think that Dr. Reisman is being serious in his one-sided attack on enviros while ignoring the problems of ongoing rent-seeking by entrenched statist corporations, I would be pleased to refer to other posts in which he is clearly posting tongue-in-cheek and intends no rancor or imbalance.  A good example would be his light-hearted post in March 2007, Global Warming: Environmentalism’s Threat of Hell on Earth, in which Dr. Reisman appeared to seriously argue that

there is a case for considering the possible detonation, on uninhabited land north of 70° latitude, say, of a limited number of hydrogen bombs. … This is certainly something that should be seriously considered by everyone who is concerned with global warming and who also desires to preserve modern industrial civilization and retain and increase its amenities. If there really is any possibility of global warming so great as to cause major disturbances, this kind of solution should be studied and perfected. Atomic testing should be resumed for the purpose of empirically testing its feasibility.

While apparently serious, how could this possibly be a libertarian, nonstatist proposal?  The answer clearly MUST be – since Dr. Reisman is a lover of freedom and markets, and not of big government, government-run mega-projects or statist corporate rent-seeking – that Dr. Reisman was NOT being serious.  Instead, in his usual playful manner, he was simply inviting his readers to see through his words, and to productively engage those who are concerned with climate or other commons issues, on the basis of a cool consideration of libertarian and market principles.

Inquiring minds might like to note that I have remarked on Dr. Reisman`s productive and insightful playfulness on a number of other occasions, on top of comments on his environment-related posts, which have been fertilizing the LVMI pages since the 2005 Earth Day.

Q.E.D.



Jim Hansen on Freeman Dyson on climate change

March 29th, 2009

I received the following in an email from NASA climate scientist James Hansen (whom I`ve mentioned a number of times), in connection with today`s New York Times Magazine article (“The Civil Heretic”) on Freeman Dyson, which is now making its way through the “skeptosphere”.  My short and unfair take on Dyson?  Short Freeman Dyson: Yep, it`s warming; I LIKE vast uncontrolled experiments with climate, and who needs fish in the ocean anyway!

Dyson is rather critical of Hansen, but it`s not at all clear that he understands Hansen`s position.  But why attack Hansen, when Exxon and its CEO Rex Tillerson are now explicitly pushing carbon taxes?  If any firm ought to understand fossil fuels – and the problems with government actions – it`s Exxon.  Hansen is a vocal scientist, but he represents no particular special interests.

In any case, Hansen`s mail is fairly brief, so I thought I`d just post it as is for all you open minds out there:

            Tomorrow’s NY Times Magazine article (The Civil Heretic) on Freeman Dyson includes an unfortunate quote from me that may appear to be disparaging and ad hominem (something about bigger fish to fry).  It was a quick response to a reporter* who had been doggedly pursuing me for an interview that I did not want to give.  I accept responsibility for the sloppy wording and I will apologize to Freeman, who deserves much respect.

            You might guess (correctly) that I was referring to the fact that contrarians are not the real problem – it is the vested interests who take advantage of the existence of contrarians.

            There is nothing wrong with having contrarian views, even from those who have little relevant expertise – indeed, good science continually questions assumptions and conclusions.  But the government needs to get its advice from the most authoritative sources, not from magazine articles.  In the United States the most authoritative source of information would be the National Academy of Sciences.

            The fact that the current administration in the United States has not asked for such advice, when combined with continued emanations about “cap and trade”, should be a source of great concern.  What I learned in visiting other countries is that most governments do not want to hear from their equivalent scientific bodies, probably because they fear the advice will be “stop building coal plants now!”  These governments are all guilty of greenwash, pretending that they are dealing with the climate problem via “goals” and “caps”, while they continue to build coal plants and even investigate unconventional fossil fuels and coal-to-liquids.

            I will send out something (“Worshiping the Temple of Doom”) on cap-and-trade soon.  It is incredible how governments resist the obvious (maybe not so incredible when lobbying budgets are examined, along with Washington’s revolving doors).  This is not rocket science.  If we want to move toward energy independence and solve the climate problem, we need to stop subsidizing fossil fuels with the public’s money and instead place a price on carbon emissions.

            My suggestion is Carbon Fee and 100% Dividend, with a meaningful starting price (on oil, gas and coal at the mine or port of entry) equivalent to $1/gallon gasoline ($115/ton CO2).  Based on 2007 fuel use, this would generate $670B/year – returned 100% to the public (monthly electronic deposit in bank accounts or debit cards), the dividend would be $3000 per adult legal resident, $9000/year per family with two or more children.  This is large enough to affect consumer product and life style choices, investments and innovations.  Of course all the other things (rules re vehicle, appliance and building efficiencies, smart electric grid, utility profit motives, etc.) are needed, but a rising carbon price is needed to make them work and move us most efficiently to the cleaner world beyond fossil fuels. 

Jim Hansen
 

*             The reporter left the impression that my conclusions are based mainly on climate models.  I always try to make clear that our conclusions are based on #1 Earth’s history, how it responded to forcings in the past, #2 observations of what is happening now, #3 models.  Here is the actual note that I sent to the reporter after hanging up on him:

 I looked up Freeman Dyson on Wikipedia, which describes his views on “global warming” as below.  If that is an accurate description of what he is saying now, it is actually quite reasonable (I had heard that he is just another contrarian).  However, this also indicates that he is under the mistaken impression that concern about global warming is based on climate models, which in reality play little role in our understanding — our understanding is based mainly on how the Earth responded to changes of boundary conditions in the past and on how it is responding to on-going changes.  

If this Wikipedia information is an accurate description of his position, then the only thing that I would like to say about him is that he should be careful not to offer public opinions about global warming unless he is willing to first take a serious look at the science.  His philosophy of science is spot-on, the open-mindedness, consistent with that of Feynman and the other greats, but if he is going to wander into something with major consequences for humanity and other life on the planet, then he should first do his homework — which he obviously has not done on global warming.  My concern is that the public may assume that he has — and, because of his other accomplishments, give his opinion more weight than it deserves.

      (emphasis added)      
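[A note from me, not Hansen: the fee-and-dividend arithmetic above is easy to check. Below is a minimal Python sketch; the fee, revenue and adult dividend come from the letter, while the gasoline CO2 factor is an assumed round number and the half-share-per-child rule (capped at two children) is my inference from the $9,000-per-family figure.]

```python
# Sanity-checking the Carbon Fee and 100% Dividend figures quoted above.
# FEE_PER_TON_CO2, REVENUE and ADULT_DIVIDEND come from Hansen's letter;
# the gasoline factor is an assumed round number, and the child
# half-share rule is my inference from the $9,000-per-family figure.

FEE_PER_TON_CO2 = 115.0      # $/ton CO2 (from the letter)
REVENUE = 670e9              # $/year at 2007 fuel use (from the letter)
ADULT_DIVIDEND = 3000.0      # $/year per adult legal resident (from the letter)

implied_tons = REVENUE / FEE_PER_TON_CO2
print(f"Implied CO2 priced: {implied_tons / 1e9:.1f} billion tons/year")

KG_CO2_PER_GALLON = 8.9      # assumed kg CO2 from burning a gallon of gasoline
print(f"Fee per gallon: ${FEE_PER_TON_CO2 * KG_CO2_PER_GALLON / 1000:.2f}")

print(f"Implied full shares: {REVENUE / ADULT_DIVIDEND / 1e6:.0f} million")

# Two adults plus (assumed) half shares for up to two children:
family = 2 * ADULT_DIVIDEND + 2 * ADULT_DIVIDEND / 2
print(f"Family with two or more children: ${family:,.0f}/year")
```

Run as-is, this reproduces the letter`s roughly $1/gallon fee and the $9,000/year figure for a family with two or more children.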

Categories: carbon pricing, Exxon, Jim Hansen

Rot at the core: federally-owned TVA’s massive coal flyash spill – the TVA "protects" affected residents by hassling/arresting the volunteers who help them

March 10th, 2009

A few items of interest have come to my attention regarding the TVA’s massive spill last December 22 of wet coal fly ash into a lovely river area near Kingston, TN (about 35 miles west of Knoxville, at the junction of the Emory and Clinch Rivers).  The collapse of a retaining wall released over five million cubic yards (more than a billion gallons) of wet coal ash, which flooded nearly 400 acres of land adjacent to the power plant and poured into the nearby Clinch and Emory rivers, filling large areas of the rivers, damaging homes and property, rupturing a major gas line and damaging a railway line.

– according to a report in the Tennessean, the TVA was long aware of the possibility of a release from the Kingston site, but elected not to proceed with any costly fix – the most expensive fix apparently in the ballpark of $25 million – because it didn’t want to set a precedent for spending similar sums at its other wet ash storage sites.  Penny wise, pound foolish – how often that happens when decision-makers don’t face personal responsibility for the downsides (yes, my “limited liability breeds moral hazards” meme)!

– in response to the accident, the EPA announced on Monday that it will: request electric utilities nationwide to provide coal ash impoundment information (the EPA estimates there may be as many as 300 coal ash impoundments across the US); conduct on-site assessments to determine structural integrity and vulnerabilities; order cleanup and repairs where needed; and develop new regulations for future safety.  Said administrator Lisa Jackson: “Environmental disasters like the one last December in Kingston should never happen anywhere in this country.”  Not only are such regulations too little too late and probably unnecessarily costly, but one wonders why she fails to note that, as the TVA is wholly owned by the US government, in this case the government did this to us itself.  The industry must be really grateful to TVA for leading the way to more regulations!

– The TVA is spending $1 million a day on the cleanup, and estimates final recovery may cost $525 million to $825 million.  This is just the cost of recovering the spilled ash, which could take two years or more, and does not cover long-term remediation costs, litigation expenses, fines or settlements from the accident, the extra cost of upgrading coal ash ponds at other TVA plants, or costs being borne by local, state or other federal agencies.  So we could easily be talking physical damage of a billion dollars or more, and decades before local homeowners can start enjoying the rivers again.

– The TVA announced in February that it lost $305 million in the fiscal quarter ending Dec. 31, 2008, due to the $525 million charge the utility took for the estimated cost of the ash spill.

– In response, TVA president and CEO Tom Kilgore, who earned $2.2 million in FY2008, saw his base and incentive compensation for FY2009 cut by about half.  Said Kilgore, who had outraged ratepayers in October (on the heels of rate increases) by taking a large compensation increase for FY2009 (in a package worth up to $3.275 million), “I’m at the point in my career where it’s not all about money.”
 

– The fly ash poses health risks, both because the small-particle dust can affect the lungs and because the ash contains elevated levels of heavy metals left behind from the combusted coal.  A Tennessee Department of Health survey indicates that a third of the people living near the toxic coal ash spill are experiencing respiratory problems, and about half have increased stress and anxiety.

According to TVA President Tom Kilgore, TVA and the state Department of Environment and Conservation have tested the water and believe there’s “no reason to believe that the water is not safe,” but “water quality tests conducted by environmental activists showed arsenic levels as high as 48 times the primary drinking water standard in river water nearest the spill. Coal industry watchdog United Mountain Defense and Washington, D.C.-based Environmental Integrity Project said January levels of arsenic, lead, selenium, cadmium, beryllium, antimony and copper violated water quality standards and exceeded primary drinking water standards.”

State senator Tim Burchett (a Republican) characterized TVA officials as “arrogant clowns” on March 10 as he presented legislation on coal ash storage to a Senate committee.  “I want to assure my colleagues that any offense (to TVA) is intentional,” he said. “I have little faith in what TVA is telling us.”

More on water testing results and on health, safety and environment impacts is here.

– the TVA is naturally trying to buy out residents, both to cut future losses and to limit coverage of the affected area. Apparently these buyouts require the sellers to waive all future health claims against the TVA.

– On top of such purchases, though, TVA – through its own police department – is trying to make it difficult for residents to remain, and to prevent full disclosure of health risks, by restricting access to public roads and to the homes of residents, requiring any who receive medical checkups from TVA doctors to waive health claims, and hassling volunteers who, at the invitation of residents, do ash, water and air testing, deliver bottled water, and assist some residents with their transportation needs.  In two recent incidents, the TVA police have gone onto private property to detain volunteers and force the removal of private air quality monitoring devices, and on March 6 arrested, shackled and jailed a driver who had used a public road – now restricted by the TVA – to drop off two grandmothers (one elderly and vision-impaired) at their homes after a town meeting, and who had written permission from residents to visit at any time.

According to one group, volunteers “have relatives in the Swan Pond Community and have an open invitation to visit residents or their property near the disaster site at any time day or night.”  The volunteer who was arrested reports the following – entirely believable – conversation with a TVA officer when he was being booked:

So as I was escorted to the Roane County Jail for processing I was informed by the TVA officer that he was “protecting the residents” of the Swan Pond Community from “people like me.”  When I questioned him further about this he stated that he meant onlookers and sight seers and people taking video while disrupting vehicle traffic and impeding the cleanup of the disaster site.
 
Well if TVA has any video proof of me personally disrupting vehicle traffic or impeding the cleanup of the disaster site I would like to see it; please post it to YouTube and show the world exactly what I am doing, PLEASE.  When I stated, “why would the residents need to be protected from someone who is delivering water, taking people to the grocery store, hospital, doctor, not trespassing, monitoring air/water/coal ash, helping facilitate trainings and organize with the local community, and sitting at the Harriman American Legion building for more than 20 hours helping with heavy metal exposure testing,” he could not answer.

So far, one lawsuit against the TVA has been filed in federal court in Knoxville on behalf of 109 citizens.  The TVA harassment policy may be aimed in part at preventing residents from gathering independent evidence to support their claims.

The TVA is governed by a nine-member board of directors, all current members of which were nominated by former President Bush (on the approval of senators from the region) and confirmed by the Senate.  Over the objections of the current chairman and two others (Republicans), a former national GOP committee chairman and former TVA board member was reappointed in February as chairman.  Since the TVA board has two vacancies, with two more members` terms expiring in May and another in 2010, President Obama will have the opportunity to take control of the board.

– Photographic and video images of the impact of the ash spill are here:

– by renowned photographer Carlan Tapp

– by local residents (first three minutes are home footage before the accident)

– More information by the enviro group doing testing and resident support work

– the TVA’s home page, etc.

 

Where is anyone calling for the privatization of the TVA?

Categories: Coal, damage, limited liability, moral hazard, TVA

Empowering power consumers: Google beta tests software to give consumers real-time info

February 17th, 2009

“If you cannot measure it,

you cannot improve it.”

— Lord Kelvin

Consistent with its mission to “organize the world’s information and make it universally accessible and useful,” Google, whose climate change-related efforts I’ve blogged about previously, is trying to help consumers to measure and track their real-time electric usage, thereby allowing them to make better choices as to when and how they use electricity.

Google is now beta testing new “PowerMeter” software – a secure iGoogle Gadget that it plans to give away free (though no doubt there will be a buck or two for Google in advertising and data services later) – that will provide near real-time power usage information to consumers who have advanced “Smart Meters”.  This information will make it easy for consumers to figure out when and how they are using electricity, to manage such use by device and to better match such use to the pricing programs of their utilities.  So far, Google testers have found that the software allows them to relatively easily cut use (by an average of 15%), and to save on their electricity bills by an even greater percentage.
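As a toy illustration (mine, not Google`s PowerMeter API) of why such feedback matters, consider pricing a day of quarter-hour meter readings under a hypothetical time-of-use tariff; every number here is an assumption for the example:

```python
# Toy smart-meter feedback: price a day of 96 quarter-hour readings under
# a hypothetical time-of-use tariff, then shift half of peak-hour energy
# off-peak (total kWh unchanged). All numbers are assumptions.

PEAK_HOURS = set(range(14, 20))        # assumed peak window: 2 pm - 8 pm
PEAK_RATE, OFFPEAK_RATE = 0.30, 0.10   # assumed $/kWh

def daily_cost(readings_kwh):
    """readings_kwh: 96 quarter-hour kWh readings for one day."""
    return sum(kwh * (PEAK_RATE if i // 4 in PEAK_HOURS else OFFPEAK_RATE)
               for i, kwh in enumerate(readings_kwh))

flat = [0.5] * 96                      # constant 2 kW draw all day

peak = [i for i in range(96) if i // 4 in PEAK_HOURS]
off = [i for i in range(96) if i // 4 not in PEAK_HOURS]
shifted = list(flat)
moved = sum(shifted[i] / 2 for i in peak)
for i in peak:
    shifted[i] /= 2                    # halve peak-hour use...
for i in off:
    shifted[i] += moved / len(off)     # ...and move that energy off-peak

print(f"Flat schedule: ${daily_cost(flat):.2f}/day")
print(f"Load-shifted:  ${daily_cost(shifted):.2f}/day")
```

In this made-up tariff the bill falls by roughly a sixth with no reduction in total kWh – consistent with the point above that bills can fall by a greater percentage than usage does.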

The availability of such software will motivate consumers everywhere to push their utilities to establish Smart Meter programs, for access to the information generated by such meters, and for an array of services and pricing programs.  There should be a boom in smart meters, as the Obama Administration’s proposed stimulus package targets supporting their installation in over 40 million U.S. homes over the next three years.

While Smart Meter / Smart Grid programs have been growing, there is still considerable market fragmentation, and the rights of consumers have not been clearly spelled out. According to Google, while some state regulators have ordered utilities to deploy smart meters, the focus has been on their use by utilities and grid managers, and not on consumer rights to the information they generate.  As a result, Google is engaged in policy advocacy as well; says Google:

“deploying smart meters alone isn’t enough. This needs to be coupled with a strategy to provide customers with easy access to energy information. That’s why we believe that open protocols and standards should serve as the cornerstone of smart grid projects, to spur innovation, drive competition, and bring more information to consumers as the smart grid evolves. We believe that detailed data on your personal energy use belongs to you, and should be available in an open standard, non-proprietary format. You should control who gets to see your data, and you should be free to choose from a wide range of services to help you understand it and benefit from it. For more details on our policy suggestions, check out the comments we filed yesterday with the California Public Utility Commission.”

While it’s not clear yet how significant a role Google will end up playing in this market, Google is to be commended, as both its PowerMeter software and its advocacy efforts will help pave the way to greater consumer choice and freer markets.

What we need in addition is for the Obama Administration and Congress to give a kick in the pants to electric power market reform and deregulation along the lines of proposals that I have noted elsewhere.  Consumers need not only better information, but greater competition in who is providing them electricity and in the sources that are used to generate it.

Christian Science Monitor summary here:

New York Times

Wired

The Google Blog

Google’s PowerMeter website

http://www.youtube.com/watch?v=6Dx38hzRWDQ

Categories: Google, obama, power, smart grid

Fat Tails Part Deux: cost-benefit analysis and climate change; Weitzman replies to Nordhaus

February 13th, 2009

[Note:  Although the giant snakes I mentioned in my preceding post may have fat tails, I didn’t want my description of the discussion between Harvard`s Martin Weitzman and Yale`s William Nordhaus of the limits of cost-benefit analysis to be overlooked, so I have largely copied it below.  I’ve added an introduction, as well as a few links.]

“Fat tails” seem to be the rage these days, as Bill Safire noted last week in the NYT.  But what are “fat tails”?  Notes Safire,

To comprehend what fat tail is in today’s media wringer, think of a bell curve, the line on a statistician’s chart that reflects “normal distribution.” It is tall and wide in the middle — where most people and things being studied almost always tend to be — and drops and flattens out at the bottom, where fewer are, making a shape on a graph resembling a bell. The extremities at the bottom left and right are called the tails; when they balloon instead of nearly vanishing as expected, the tails have been designated “heavy” and, more recently, the more pejorative “fat.” To a credit-agency statistician now living in a world of chagrin, the alliterative definition of a fat tail is “an abnormal agglomeration of angst.”

In an eye-popping Times Magazine article last month titled “Risk Mismanagement,” Joe Nocera, a business columnist for The Times, focused on the passionate, prescient warnings of the former options trader Nassim Nicholas Taleb, author of “The Black Swan” and “Fooled by Randomness,” who popularized the phrase now in vogue in its financial-statistics sense. Nocera wrote: “What will cause you to lose billions instead of millions? Something rare, something you’ve never considered a possibility. Taleb calls these events ‘fat tails’ or ‘black swans,’ and he is convinced that they take place far more frequently than most human beings are willing to contemplate.”

If I may quibble with Safire’s description: “fat” refers not to the probability distribution ballooning at either tail, but to the case that the tail probability does not decline quickly to zero (viz., probability approaches zero more slowly than exponentially).
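In symbols (a standard formalization of this usage, not Safire’s): a thin-tailed distribution has a survival function that decays exponentially or faster, while a fat-tailed one decays only polynomially, e.g.:

```latex
% Thin tail: the survival function decays exponentially (or faster)
P(X > x) \le C\, e^{-\lambda x} \qquad \text{for some } C, \lambda > 0;
% Fat tail: the survival function decays only polynomially (a power law)
P(X > x) \sim C\, x^{-\alpha} \quad (x \to \infty), \qquad \alpha > 0.
```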

*   *   *

The size of the giant snakes and the much higher temperatures (and GHG levels) of their time (60 million years ago) and shortly afterwards during the PETM (a period 56 million years ago when temperatures shot up by 5° Celsius / 9° F in less than 10,000 years) tell us not simply that climate is sensitive (on geological scales, sometimes rather short-term) to atmospheric levels of carbon and methane, but remind us that there is a “fat tail” of uncertain climate change risks posed by mankind`s ramped-up efforts to release as much as possible of the CO2 that has been stored up in the form of fossil fuels, methane and limestone over millions of years.

I have mentioned the issue of “fat tails” previously, in connection with attempts at applying cost-benefit analysis (CBA) to determine whether to tax CO2 emissions.  While economists like Yale`s William Nordhaus who have applied CBA to climate policy have been saying for decades that taxing carbon makes sense on a net basis, our own Bob Murphy has criticized Nordhaus`s approach on rather narrow (and decidedly non-Austrian) grounds.

But Nordhaus has also been strongly criticized by economists such as Harvard`s Martin Weitzman, who basically argue that Nordhaus has UNDERSOLD the case for carbon pricing, or that the results of such CBA imply a greater certainty of knowledge (and complacency) than is deserved.  Weitzman points out basic difficulties inherent in applying CBA to policies addressing climate change, particularly where there seems to be a grave possibility that we do not understand how drastically the climate might respond to our influences.  Weitzman`s comments (scheduled to appear in the February issue of The Review of Economics and Statistics) were the focus of the lead essay by Jim Manzi in Cato Unbound`s August 2008 issue, which I reviewed.

Nordhaus has since responded to Weitzman in a comment that became available in January; this time Bob Murphy stepped in as a defender of CBA!  I note that Ron Bailey, science correspondent at Reason online, has just published a piece examining Weitzman’s paper from last year and Nordhaus’s recent comments.

Weitzman has now replied to Nordhaus, and has kindly permitted me to quote from a draft of his reply (which he has out for review).  It seems to me that Weitzman provides a compelling statement of some of the limits of CBA as applied to climate change.  (NB:  Weitzman`s draft response is a .pdf file that I cannot upload, though I have uploaded a version converted to .txt format.  I am happy to forward the .pdf to any interested readers.)

Weitzman`s criticisms of the limits of CBA ought to resonate with Austrian concerns about complexity, the limits of knowledge and the difficulty of prediction — even as Weitzman (and Nordhaus and, indeed, Bob Murphy) completely fail to consider the fundamental problems of conflicting preferences in the absence of property rights, and the likelihood that rent-seeking will corrupt governmental policy responses.

 

The rest of the post sets out those of Weitzman`s key points that I consider most salient to a discussion among laymen:

“there is enormous structural uncertainty about the economics of extreme climate change, which, if not unique, is pretty rare. I will argue on intuitive grounds that the way in which this deep structural uncertainty is conceptualized and formalized should influence substantially the outcomes of any reasonable CBA (or IAM) of climate change. Further, I will argue that the seeming fact that this deep structural uncertainty does not influence substantially outcomes from the “standard” CBA hints at an implausible treatment of uncertainty.”

“The pre-industrial-revolution level of atmospheric CO2 (about two centuries ago) was about 280 parts per million (ppm). The ice-core data show that carbon dioxide was within a range roughly between 180 and 280 ppm during the last 800,000 years. Currently, CO2 is at 385 ppm, and climbing steeply. Methane was never higher than 750 parts per billion (ppb) in 800,000 years, but now this extremely potent GHG, which is thirty times more powerful than CO2, is at 1,780 ppb. The sum total of all carbon-dioxide-equivalent (CO2-e) GHGs is currently at 435 ppm. Even more alarming in the 800,000-year record is the rate of change of GHGs, with increases in CO2 being below (and typically well below) 40 ppm within any past sub-period of ten thousand years, while now CO2 has risen by 40 ppm in just the last quarter century.

Thus, anthropogenic activity has elevated atmospheric CO2 and CH4 to levels extraordinarily far outside their natural range – and at a stupendously rapid rate. The scale and speed of recent GHG increases makes predictions of future climate change highly uncertain.  There is no analogue for anything like this happening in the past geological record. Therefore, we do not really know with much confidence what will happen next.”

“To keep atmospheric CO2 levels at twice pre-industrial-revolution levels would require not just stable but sharply declining emissions within a few decades from now. Forecasting ahead a century or two, the levels of atmospheric GHGs that may ultimately be attained (unless drastic measures are undertaken) have likely not existed for tens of millions of years, and the rate of change will likely be unique on a time scale of hundreds of millions of years.

Remarkably, the “standard” CBA of climate change takes essentially no account of the extraordinary magnitude of the scale and speed of these unprecedented changes in GHGs – and the extraordinary uncertainties they create for any believable economic analysis of climate change. Perhaps even more astonishing is the fact that the “policy ramp” of gradually tightening emissions, which emerges from the “standard” CBA, attains stabilization at levels of CO2-e GHGs that approach 700 ppm. The “standard” CBA [of Nordhaus] thus recommends imposing an impulse or shock to the Earth’s system by geologically-instantaneously jolting atmospheric stocks of GHGs up to 2½ times their highest past level over the last 800,000 years – without even mentioning what an unprecedented planetary experiment such an “optimal” policy would entail.”

“So-called “climate sensitivity” (hereafter denoted S1) is a key macro-indicator of the eventual temperature response to GHG changes. Climate sensitivity is defined as the global average surface warming following a doubling of carbon dioxide concentrations. … the median upper 5% probability level over all 22 climate-sensitivity studies cited in IPCC-AR4 (2007) is 6.4° C – and this stylized fact alone is telling. Glancing at Table 9.3 and Box 10.2 of IPCC-AR4, it is apparent that the upper tails of these 22 PDFs tend to be sufficiently long and heavy with probability that one is allowed from a simplistically-aggregated PDF of these 22 studies the rough approximation P[S1 > 10° C] ≈ 1%. The actual empirical reason why these upper tails are long and heavy with probability dovetails nicely with the theory of my paper: inductive knowledge is always useful, of course, but simultaneously it is limited in what it can tell us about extreme events outside the range of experience – in which case one is forced back onto depending more than one might wish upon the prior PDF, which of necessity is largely subjective and relatively diffuse. As a recent Science commentary put it: “Once the world has warmed by 4° C, conditions will be so different from anything we can observe today (and still more different from the last ice age) that it is inherently hard to say where the warming will stop.”

“Exhibit C” concerns possibly disastrous releases over the long run of bad-feedback components of the carbon cycle that are currently omitted from most general circulation models. The chief worry here is a significant supplementary component that conceptually should be added on to climate sensitivity S1. This omitted component concerns the potentially powerful self-amplification potential of greenhouse warming due to heat-induced releases of sequestered carbon. … Over the long run, a CH4 outgassing-amplifier process could potentially precipitate a cataclysmic strong-positive-feedback warming. This real physical basis for a highly unsure but truly catastrophic scenario is my Exhibit C in the case that conventional CBAs and IAMs do not adequately cover the deep structural uncertainties associated with possible climate-change disasters.  Other examples of an actual real physical basis for a catastrophic outcome could be cited, but this one will do here.  The real physical possibility of endogenous heat-triggered releases at high temperatures of the enormous amounts of naturally-sequestered GHGs is a good example of indirect carbon-cycle feedback effects that I think should be included in the abstract interpretation of a concept of “climate sensitivity” that is relevant here. What matters for the economics of climate change is the reduced-form relationship between atmospheric stocks of anthropogenically-injected CO2-e GHGs and temperature change. … When fed into an economic analysis, the great open-ended uncertainty about eventual mean planetary temperature change cascades into yet-much-greater yet-much-more-open-ended uncertainty about eventual changes in welfare.”

“Exhibit D” concerns what I view as an unusually cavalier treatment of damages or disutilities from extreme temperature changes. The “standard” CBA treats high-temperature damages by a rather passive extrapolation of whatever specification is assumed (typically arbitrarily) to be the low-temperature “damages function.”  … Seemingly minor changes in the specification of high-temperature damages can dramatically alter the gradualist policy ramp outcomes recommended by the “standard” CBA. Such fragility of policy to postulated forms of disutility functions is my Exhibit D in making the case that the “standard” CBA does not adequately cope with deep structural uncertainty – here structural uncertainty about the specification of damages.”

“An experiment without precedent is being performed on planet Earth by subjecting the world to the shock of a geologically-instantaneous injection of massive amounts of GHGs. Yet the “standard” CBA seems almost oblivious to the extraordinarily uncertain consequences of catastrophic climate change.”

“Almost nothing in our world has a probability of exactly zero or exactly one. What is worrisome is not the fact that extreme tails are long per se (reflecting the fact that a meaningful upper bound on disutility does not exist), but that they are fat (with probability density). The critical question is how fast does the probability of a catastrophe decline relative to the welfare impact of the catastrophe. Other things being equal, a thin-tailed PDF is of less concern because the probability of the bad event declines exponentially (or faster). A fat-tailed distribution, where the probability declines polynomially, can be much more worrisome. … To put a sharp point on this seemingly abstract issue, the thin-tailed PDFs that Nordhaus requires implicitly to support his gradualist “policy ramp” conclusions have some theoretical tendency to morph into being fat tailed when he admits that he is fuzzy about the functional forms or structural parameters of his assumed thin-tailed PDFs – at least for high temperatures. … When one combines fat tails in the PDF of the logarithm of welfare-equivalent consumption with a utility function that is sensitive to high damages from extreme temperatures, it will tend to make the willingness to pay (WTP) to avoid extreme climate changes very large.”

“Presumably the PDF in the bad fat tail is thinned, or even truncated, perhaps from considerations akin to what lies behind the value of a statistical life (VSL). (After all, we would not pay an infinite amount to eliminate altogether the fat tail of climate-change catastrophes.) Alas, in whatever way the bad fat tail is thinned or truncated, a CBA based upon it remains highly sensitive to the details of the thinning or truncation mechanism, because the disutility of extreme climate change has “essentially” unlimited liability. In this sense climate change is unique (or at least very rare) because the conclusions from a CBA for such an unlimited-liability situation have some built-in tendency to be non-robust to assumed tail fatness.”
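[A numerical aside from me, not Weitzman: his truncation-sensitivity point can be seen in a toy calculation. The survival functions, the convex damages function and all parameters below are illustrative assumptions; the thing to notice is that the fat-tailed expected damage is dominated by wherever the tail is arbitrarily cut off.]

```python
# Toy version of Weitzman's truncation-sensitivity point: expected damage
# under a thin vs. a fat tail, as a function of the arbitrary cutoff.
# Survival functions, damages and parameters are illustrative assumptions.
import math

def expected_damage(survival, damage, t_max, dt=0.01):
    """Approximate E[damage(T)] from the density implied by the survival
    function P(T > t), truncating the integral at t_max."""
    total, t = 0.0, 0.0
    while t < t_max:
        density = (survival(t) - survival(t + dt)) / dt
        total += damage(t) * density * dt
        t += dt
    return total

thin = lambda t: math.exp(-t)                     # exponential decay
fat = lambda t: 1.0 if t < 1.0 else t ** -2.0     # polynomial decay
damage = lambda t: t ** 4                         # assumed convex damages

for t_max in (10, 30, 100):
    print(f"cutoff {t_max:3d}: thin {expected_damage(thin, damage, t_max):8.1f}"
          f"  fat {expected_damage(fat, damage, t_max):9.1f}")
# The thin-tail value settles near 24 (E[T^4] for Exp(1)); the fat-tail
# value grows roughly like t_max**2, i.e. the answer is set by the cutoff.
```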

“Reasonable attempts to constrict the fatness of the “bad” tail can still leave us with uncomfortably big numbers, whose exact value depends non-robustly upon artificial constraints, functional forms, or parameters that we really do not understand. The only legitimate way to avoid this potential problem is when there exists strong a priori knowledge that restrains the extent of total damages. If a particular type of idiosyncratic uncertainty affects only one small part of an individual’s or a society’s overall portfolio of assets, exposure is naturally limited to that specific component and bad-tail fatness is not such a paramount concern. However, some very few but very important real-world situations have potentially unlimited exposure due to structural uncertainty about their potentially open-ended catastrophic reach. Climate change potentially affects the whole worldwide portfolio of utility by threatening to drive all of planetary welfare to disastrously low levels in the most extreme scenarios.”

“Conclusions from CBA [are] more fuzzy than we might prefer, because they are dependent on essentially arbitrary decisions about how the fat tails are expressed and about how the damages from high temperatures are specified. I would make a strong distinction between thin-tailed CBA, where there is no reason in principle that outcomes should not be robust, and fat-tailed CBA, where even in principle outcomes are highly sensitive to functional forms and parameter values. For ordinary run-of-the-mill limited-exposure or thin-tailed situations, there is at least the underlying theoretical reassurance that finite-cutoff-based CBA might (at least in principle) be an arbitrarily-close approximation to something that is accurate and objective. In fat-tailed unlimited-exposure situations, by contrast, there is no such theoretical assurance underpinning the arbitrary cutoffs or attenuations – and therefore CBA outcomes have a theoretical tendency to be sensitive to fragile assumptions about the likelihood of extreme impacts and how much disutility they cause.”

“My
target is not CBA in general, but the particular false precision
conveyed by the misplaced concreteness of the “standard” CBA of climate
change. By all means plug in tail probabilities, plug in disutilities
of high impacts, plug in rates of pure time preference, and so forth,
and then see what emerges empirically. Only please do not be surprised
when outcomes from fat-tailed CBA are fragile to specifications
concerning catastrophic extremes.  The extraordinary magnitude of the
deep structural uncertainties involved in climate-change CBA, and the
implied limitations that prevent CBA from reaching robust conclusions,
are highly frustrating for most economists, and in my view may even
push some into a state of denial. After all, economists make a living
from plugging rough numbers into simple models and reaching specific
conclusions (more or less) on the basis of these numbers. What are we
supposed to tell policy makers and politicians if our conclusions are
ambiguous and fragile?”

“It is
threatening for economists to have to admit that the structural
uncertainties and unlimited liabilities of climate change run so deep
that gung-ho “can do” economics may be up against limits on the ability of quantitative analysis to give robust advice in such a grey area. But if this is the way things are with the economics of climate change, then this is the way things are – and non-robustness to subjective assumptions is an inconvenient truth to be lived with rather than a fact to be denied or evaded
just because it looks less scientifically objective in CBA. In my
opinion, we economists need to admit to the policy makers, the
politicians, and the public that CBA of climate change is unusual
in being especially fuzzy because it depends especially sensitively on
what is subjectively assumed about the high-temperature damages
function, along with subjective judgements about the fatness of the
extreme tails and/or where they have effectively been cut off
.
Policy makers and the public will just have to deal with the idea that
CBA of climate change is less crisp (maybe I should say even less
crisp) than CBAs of more conventional situations.”

“The
moral of the dismal theorem is that under extreme uncertainty,
seemingly casual decisions about functional forms, parameter values,
and tail thickness may be dominant. We economists should not pursue
a narrow, superficially precise, analysis by blowing away the
low-probability high-impact catastrophic scenarios as if this is a
necessary price we must pay for the worthy goal of giving crisp advice.
An artificial infatuation with precision is likely to make our analysis
go seriously askew and to undermine the credibility of what we say by
effectively marginalizing the very possibilities that make climate
change grave in the first place.

“The
issue of how to deal with the deep structural uncertainties in climate
change would be completely different and immensely simpler if systemic
inertias (like the time required for the system to naturally remove
extra atmospheric CO2) were short (as is the case for SO2,
particulates, and many other airborne pollutants). Then an important
part of an optimal strategy would presumably be along the lines of
“wait and see.” With strong reversibility, an optimal
climate-change policy should logically involve (among other elements)
waiting to see how far out on the bad fat tail the planet will end up,
followed by midcourse corrections if we seem to be headed for a
disaster. This is the ultimate backstop rebuttal of DT given by some
critics of fat-tailed reasoning, including Nordhaus. Alas, the problem
of climate change is characterized everywhere by immensely long
inertias – in atmospheric CO2 removal times, in the capacity of the
oceans to absorb heat (as well as CO2), and in many other relevant
physical and biological processes. Therefore, it is an open question
whether or not we could learn enough in sufficient time to make
politically feasible midcourse corrections. When the critics are
gambling on this midcourse-correction learning mechanism to undercut
the message of DT, they are relying more on an article of faith than on
any kind of evidence-based scientific argument.

“I
think the actual scientific facts behind the alleged feasibility of
“wait and see” policies are, if anything, additional evidence for the
importance of fat-tailed irreversible uncertainty about ultimate
climate change.

“The
relevance of “wait and see” policies is an important unresolved issue,
which in principle could decide the debate between me and Nordhaus, but
my own take right now would be that the built-in pipeline inertias
are so great that if and when we detect that we are heading for
unacceptable climate change, it will likely prove too late to do
anything much about it for centuries to come thereafter
(except,
possibly, for lowering temperatures by geoengineering the atmosphere to
reflect back incoming solar radiation). In any event, I see this whole
“wait and see” issue as yet another component of fat-tailed uncertainty
– rather than being a reliable backstop strategy for dealing with
excessive CO2 in the atmosphere.

“Nordhaus
states that there are so many low-probability catastrophic-impact
scenarios around that ‘if we accept the Dismal Theorem, we would
probably dissolve in a sea of anxiety at the prospect of the infinity
of infinitely bad outcomes.’ This is rhetorical excess and, more to the
point here, it is fallacious. Most of the examples Nordhaus gives have
such minuscule thin-tailed probabilities that they can be written off.”

“Nordhaus
summarizes his critique with the idea that there are indeed deep
uncertainties about virtually every aspect of the natural and social
sciences of climate change – but these uncertainties can only be
resolved by continued careful analysis of data and theories. I heartily
endorse his constructive attitude about the necessity of further
research targeted toward a goal of resolving as much of the uncertainty
as it is humanly possible to resolve.
I would just add that we
should also recognize the reality that, for now and perhaps for some
time to come, the sheer magnitude of the deep structural uncertainties,
and the way we express them in our models, will likely dominate
plausible applications of CBA to the economics of climate change
.”

(emphasis added)

Let`s recreate the Paleocene! Giant snakes, "fat tails", cost-benefit analysis and climate change; Weitzman replies to Nordhaus

February 11th, 2009 1 comment

Giant snakes?  What could a few colossal bones found in Colombia have to do with us now?

1.  A recent paper in Nature about the discovery of several specimens of a giant snake (“Titanoboa”) that lived in Latin America 60 million years ago captured attention last week, including among climate change bloggers (yes, “skeptics” too).  Why?  Not only because the snakes were enormous (more than 40 feet long and over a ton) – making anacondas look like garter snakes – but because their size appears to tell us something about the climate during the Paleocene.  Based on existing knowledge of the size, metabolism and temperature tolerances of snakes, scientists infer not only that the world overall was quite warm during the Paleocene (with palms growing at the poles), but that average temperatures in the tropics must have been 3° to 5° Celsius (5° to 9° F) warmer than today for such large snakes to survive.

The period in which these snakes lived was followed a few million years later by the Paleocene–Eocene Thermal Maximum (PETM), about 56 million years ago, when a pulse of CO2 and methane drove already-warm temperatures sharply higher (by 5° Celsius / 9° F) in less than 10,000 years. During the PETM, CO2 levels rose to about 2,000 ppm, roughly six times today`s level. The PETM resulted in a massive extinction of species.

The size of the snakes, and the temperatures both in their time and shortly afterward during the PETM, also tell us that climate is sensitive (on geological scales, sometimes rather short ones) to atmospheric levels of carbon and methane – and remind us that there is a “fat tail” of uncertain climate change risks posed by mankind`s ramped-up efforts to release as much as possible of the CO2 that has been stored up in the form of fossil fuels, methane and limestone over millions of years.

2.  I have mentioned the issue of “fat tails” previously, in connection with attempts at applying cost-benefit analysis (CBA) to determine whether to tax CO2 emissions.  While economists like Yale`s William Nordhaus who have applied CBA to climate policy have been saying for decades that taxing carbon makes sense on a net basis, our own Bob Murphy has criticized Nordhaus`s approach on rather narrow (and decidedly non-Austrian) grounds.

But Nordhaus has also been strongly criticized by economists such as Harvard`s Martin Weitzman, who basically argue that Nordhaus has UNDERSOLD the case for carbon pricing, or at least that the results of such CBA imply a greater certainty of knowledge (and warrant more complacency) than is deserved.  Weitzman points out basic difficulties inherent in applying CBA to policies addressing climate change, particularly where there seems to be a grave possibility that we do not understand how drastically the climate might respond to our influences.  Weitzman`s comments (scheduled to appear in the February issue of The Review of Economics and Statistics) were the focus of the lead essay by Jim Manzi in Cato Unbound`s August 2008 issue, which I reviewed.

Nordhaus has since responded to Weitzman, and this time Bob Murphy stepped in as a defender of CBA.  Weitzman has now replied to Nordhaus, and has kindly permitted me to quote from the current draft of that reply.  Weitzman provides a compelling statement of some of the limits of CBA as applied to climate change; it seems to me that any Austrian ought to be sympathetic to these criticisms.

(NB:  Weitzman`s draft response is a .pdf file that I cannot upload, though I have uploaded a version converted to .txt format.  I am happy to forward the .pdf to any interested readers.)

The rest of the post sets out the most salient (for a layman) of Weitzman`s key points:

“there is enormous structural uncertainty about the economics of extreme climate change,
which, if not unique, is pretty rare. I will argue on intuitive grounds
that the way in which this deep structural uncertainty is
conceptualized and formalized should influence substantially the
outcomes of any reasonable CBA (or IAM) of climate change. Further, I
will argue that the seeming fact that this deep structural
uncertainty does not influence substantially outcomes from the
“standard” CBA hints at an implausible treatment of uncertainty.”

“The pre-industrial-revolution level of atmospheric CO2 (about two centuries ago) was about 280 parts per million (ppm). The ice-core data show that carbon dioxide was within a range roughly between 180 and 280 ppm during the last 800,000 years. Currently, CO2 is at 385 ppm, and climbing steeply. Methane was never higher than 750 parts per billion (ppb) in 800,000 years, but now this extremely potent GHG, which is thirty times more powerful than CO2, is at 1,780 ppb. The sum total of all carbon-dioxide-equivalent (CO2-e) GHGs is currently at 435 ppm. Even more alarming in the 800,000-year record is the rate of change of GHGs, with increases in CO2 being below (and typically well below) 40 ppm within any past sub-period of ten thousand years, while now CO2 has risen by 40 ppm in just the last quarter century.

Thus, anthropogenic
activity has elevated atmospheric CO2 and CH4 to levels extraordinarily
far outside their natural range – and at a stupendously rapid rate. The
scale and speed of recent GHG increases makes predictions of future
climate change highly uncertain.  There is no analogue for anything
like this happening in the past geological record. Therefore, we do not
really know with much confidence what will happen next.”
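
The rate comparison in that passage is worth making explicit. A minimal back-of-the-envelope check, using only Weitzman`s rounded figures (an illustration, not new data):

```python
# Sanity-check of the CO2 growth rates quoted above, using Weitzman`s
# rounded bounds (40 ppm per 10,000-year sub-period vs. 40 ppm in 25 years).

past_rate = 40 / 10_000    # ppm per year: upper bound on any past 10,000-yr sub-period
recent_rate = 40 / 25      # ppm per year: the last quarter century

print(f"past upper bound: {past_rate:.4f} ppm/yr")    # 0.0040 ppm/yr
print(f"recent rate:      {recent_rate:.2f} ppm/yr")  # 1.60 ppm/yr
print(f"ratio: at least {recent_rate / past_rate:.0f}x faster")  # ~400x
```

The modern rate of increase is thus at least two orders of magnitude above anything in the ice-core record, which is the nub of Weitzman`s point about being outside the range of inductive experience.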

“To keep atmospheric CO2 levels at twice pre-industrial-revolution levels would require not just stable but sharply declining emissions within a few decades from now. Forecasting
ahead a century or two, the levels of atmospheric GHGs that may
ultimately be attained (unless drastic measures are undertaken) have
likely not existed for tens of millions of years and the rate of change
will likely be unique on a time scale of hundreds of millions of years.

Remarkably,
the “standard” CBA of climate change takes essentially no account of the
extraordinary magnitude of the scale and speed of these unprecedented
changes in GHGs – and the extraordinary uncertainties they create for
any believable economic analysis of climate change.
Perhaps even
more astonishing is the fact that the “policy ramp” of gradually
tightening emissions, which emerges from the “standard” CBA, attains
stabilization at levels of CO2-e GHGs that approach 700 ppm. The
“standard” CBA [of Nordhaus] thus recommends imposing an impulse or
shock to the Earth’s system by geologically-instantaneously jolting
atmospheric stocks of GHGs up to 2½ times their highest past level
over the last 800,000 years – without even mentioning what an
unprecedented planetary experiment such an “optimal” policy would
entail.”

“So-called
“climate sensitivity” (hereafter denoted S1) is a key macro-indicator
of the eventual temperature response to GHG changes. Climate
sensitivity is defined as the global average surface warming following
a doubling of carbon dioxide concentrations. … the median upper 5%
probability level over all 22 climate-sensitivity studies cited in
IPCC-AR4 (2007) is 6.4° C – and this stylized fact alone is telling.
Glancing at Table 9.3 and Box 10.2 of IPCC-AR4, it is apparent that the
upper tails of these 22 PDFs tend to be sufficiently long and heavy
with probability that one is allowed from a simplistically-aggregated
PDF of these 22 studies the rough approximation P[S1 > 10° C] ≈ 1%. The
actual empirical reason why these upper tails are long and heavy with
probability dovetails nicely with the theory of my paper: inductive
knowledge is always useful, of course, but simultaneously it is limited
in what it can tell us about extreme events outside the range of
experience – in which case one is forced back onto depending more than
one might wish upon the prior PDF, which of necessity is largely
subjective and relatively diffuse. As a recent Science commentary put
it: “Once the world has warmed by 4° C, conditions will be so
different from anything we can observe today (and still more different
from the last ice age) that it is inherently hard to say where the
warming will stop.”
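
To see how such heavy upper tails emerge, here is a small sketch of my own (not Weitzman`s calculation): fit a lognormal PDF to just two quantiles (an assumed median sensitivity of 3° C, a hypothetical central estimate not taken from the quote, and the 6.4° C upper-5% level that Weitzman cites) and read off the probability above 10° C:

```python
from math import log
from statistics import NormalDist

# A sketch: lognormal climate-sensitivity PDF pinned to two quantiles.
# The 3.0 C median is an assumed illustrative value; the 6.4 C 95th
# percentile is the median upper-5% level cited in the quote above.
mu = log(3.0)                                   # ln(S) ~ Normal(mu, sigma)
sigma = (log(6.4) - mu) / NormalDist().inv_cdf(0.95)

p_above_10 = 1 - NormalDist(mu, sigma).cdf(log(10.0))
print(f"P[S > 10 C] = {p_above_10:.2%}")        # roughly 0.4-0.5%
```

Even this moderately thin-tailed fit leaves nearly half a percent of probability above 10° C; fatter-tailed functional forms fit to the same two quantiles push the figure toward (or past) the roughly 1% that Weitzman reads out of the aggregated studies.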

“Exhibit C” concerns possibly disastrous releases over the long run of bad-feedback components
of the carbon cycle that are currently omitted from most general
circulation models. The chief worry here is a significant supplementary
component that conceptually should be added on to climate sensitivity
S1. This omitted component concerns the potentially powerful
self-amplification potential of greenhouse warming due to heat-induced
releases of sequestered carbon. … Over the long run, a CH4
outgassing-amplifier process could potentially precipitate a
cataclysmic strong-positive-feedback warming
. This real physical
basis for a highly unsure but truly catastrophic scenario is my Exhibit
C in the case that conventional CBAs and IAMs do not adequately cover
the deep structural uncertainties associated with possible
climate-change disasters.  Other examples of an actual real physical
basis for a catastrophic outcome could be cited, but this one will do
here.  The real physical possibility of endogenous heat-triggered
releases at high temperatures of the enormous amounts of
naturally-sequestered GHGs is a good example of indirect carbon-cycle
feedback effects that I think should be included in the abstract
interpretation of a concept of “climate sensitivity” that is relevant
here. What matters for the economics of climate change is the
reduced-form relationship between atmospheric stocks of
anthropogenically-injected CO2-e GHGs and temperature change. … When
fed into an economic analysis, the great open-ended uncertainty about
eventual mean planetary temperature change cascades into
yet-much-greater yet-much-more-open-ended uncertainty about eventual
changes in welfare.”

“Exhibit
D” concerns what I view as an unusually cavalier treatment of damages or
disutilities from extreme temperature changes. The “standard” CBA
treats high-temperature damages by a rather passive extrapolation of
whatever specification is assumed (typically arbitrarily) to be the
low-temperature “damages function.”  … Seemingly minor changes in
the specification of high-temperature damages can dramatically alter
the gradualist policy ramp outcomes recommended by the “standard” CBA.

Such fragility of policy to postulated forms of disutility functions
is my Exhibit D in making the case that the “standard” CBA does not
adequately cope with deep structural uncertainty – here structural
uncertainty about the specification of damages.”
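
Weitzman`s “Exhibit D” is easy to illustrate with a toy calibration of my own (the figure of roughly 2% of output lost at 2.5° C is a commonly cited Nordhaus-style benchmark, used here purely for illustration). Two damage functions that agree exactly at the calibration point diverge enormously at the high temperatures that matter:

```python
# Two "damages functions" pinned to the same hypothetical low-temperature
# data point: about 2% of output lost at 2.5 C of warming.

CAL_T, CAL_D = 2.5, 0.02                 # assumed calibration point

def quadratic(t):                        # gradualist extrapolation: D ~ T^2
    return CAL_D * (t / CAL_T) ** 2

def quartic(t):                          # equally defensible at 2.5 C: D ~ T^4
    return CAL_D * (t / CAL_T) ** 4

for t in (2.5, 4.0, 6.0):
    print(f"T = {t} C: quadratic {quadratic(t):5.1%} vs quartic {quartic(t):5.1%}")
# Identical at 2.5 C; at 6 C, roughly 12% of output lost vs roughly 66%.
```

The low-temperature data cannot discriminate between the two specifications, yet they imply radically different optimal policy ramps.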

“An
experiment without precedent is being performed on planet Earth by
subjecting the world to the shock of a geologically-instantaneous
injection of massive amounts of GHGs. Yet the “standard” CBA seems
almost oblivious to the extraordinarily uncertain consequences of
catastrophic climate change.”
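
Putting the two preceding sketches together gives a feel for the cascade Weitzman describes, in which open-ended uncertainty about temperature becomes much-more-open-ended uncertainty about welfare. A toy Monte Carlo (again my own illustration, reusing the assumed lognormal sensitivity and hypothetical quartic damages from the sketches above):

```python
import math
import random

# Toy "uncertainty cascade": a modest spread in climate sensitivity, pushed
# through a convex damage function, becomes a vastly wider spread in losses.
random.seed(0)
mu, sigma = math.log(3.0), 0.46          # assumed lognormal fit from the earlier sketch

draws = sorted(math.exp(random.gauss(mu, sigma)) for _ in range(100_000))
damages = sorted(0.02 * (s / 2.5) ** 4 for s in draws)  # hypothetical quartic damages

def pct(xs, p):                          # crude percentile from a sorted list
    return xs[int(p * (len(xs) - 1))]

print(f"sensitivity, 5th-95th pctile: {pct(draws, 0.05):.1f} to {pct(draws, 0.95):.1f} C")
print(f"damages,     5th-95th pctile: {pct(damages, 0.05):.1%} to {pct(damages, 0.95):.1%} of output")
# A ~4.5x spread in temperature becomes a spread of several hundredfold in damages.
```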

“Almost
nothing in our world has a probability of exactly zero or exactly one.
What is worrisome is not the fact that extreme tails are long per se
(reflecting
the fact that a meaningful upper bound on disutility does not exist),
but that they are fat (with probability density). The critical
question is how fast does the probability of a catastrophe decline
relative to the welfare impact of the catastrophe. Other things being
equal, a thin-tailed PDF is of less concern because the probability of
the bad event declines exponentially (or faster). A fat-tailed
distribution, where the probability declines polynomially, can be much
more worrisome.
… To put a sharp point on this seemingly abstract issue, the
thin-tailed PDFs that Nordhaus requires implicitly to support his
gradualist “policy ramp” conclusions have some theoretical tendency to
morph into being fat tailed when he admits that he is fuzzy about the
functional forms or structural parameters of his assumed thin-tailed
PDFs
– at least for high temperatures. … When one combines fat
tails in the PDF of the logarithm of welfare-equivalent consumption
with a utility function that is sensitive to high damages from extreme
temperatures, it will tend to make the willingness to pay (WTP) to
avoid extreme climate changes very large.”
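
The exponential-versus-polynomial distinction is easy to make concrete. A toy comparison of my own (the numbers are chosen only so that the two tails agree at a low threshold):

```python
import math

# Survival probabilities P[X > x] for a thin (exponential) tail and a fat
# (Pareto, alpha = 2) tail, scaled so that both equal 0.25 at x = 2.

def thin_tail(x):
    return 0.25 * math.exp(-(x - 2))     # declines exponentially

def fat_tail(x):
    return 0.25 * (2 / x) ** 2           # declines polynomially

for x in (2, 5, 10, 20):
    print(f"x = {x:2d}: thin {thin_tail(x):.2e}   fat {fat_tail(x):.2e}")
# By x = 20 the fat tail carries several hundred thousand times more
# probability; with disutility growing in x, that residual mass (not the
# central estimate) can dominate the expected-disutility calculation.
```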

“Presumably
the PDF in the bad fat tail is thinned, or even truncated, perhaps from
considerations akin to what lies behind the value of a statistical life
(VSL). (After all, we would not pay an infinite amount to eliminate
altogether the fat tail of climate-change catastrophes.) Alas, in
whatever way the bad fat tail is thinned or truncated, a CBA based upon
it remains highly sensitive to the details of the thinning or
truncation mechanism, because the disutility of extreme climate change
has “essentially” unlimited liability.
In this sense climate change
is unique (or at least very rare) because the conclusions from a CBA
for such an unlimited-liability situation have some built-in tendency
to be non-robust to assumed tail fatness.”
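
The truncation problem can likewise be shown with a stylized calculation of my own (a Pareto tail plus a convex disutility, neither taken from Weitzman`s paper): however far out the cutoff is pushed, the expected disutility keeps growing, so the CBA verdict simply tracks the modeler`s choice of cutoff:

```python
# Expected disutility of a Pareto tail (density 2 * x**-3 for x >= 1) under
# a convex disutility u(x) = x**2, truncated at an arbitrary cutoff c.
# Analytically the integral equals 2 * ln(c): it never converges, so the
# "answer" is whatever the truncation point says it is.

def expected_disutility(cutoff, n=200_000):
    dx = (cutoff - 1) / n                # midpoint-rule integration on [1, c]
    total = 0.0
    for i in range(n):
        x = 1 + (i + 0.5) * dx           # midpoint of the i-th slice
        total += x**2 * 2 * x**-3 * dx   # u(x) * pareto_density(x) * dx
    return total

for c in (10, 100, 1_000, 10_000):
    print(f"cutoff {c:>6}: E[disutility] = {expected_disutility(c):5.2f}")
# 4.61, 9.21, 13.82, 18.42 ... each tenfold relaxation of the cutoff adds
# the same increment forever; the verdict tracks the truncation choice.
```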

“Reasonable
attempts to constrict the fatness of the “bad” tail can still leave us
with uncomfortably big numbers, whose exact value depends non-robustly
upon artificial constraints, functional forms, or parameters that we
really do not understand. The only legitimate way to avoid this
potential problem is when there exists strong a priori knowledge that
restrains the extent of total damages.
If a particular type of
idiosyncratic uncertainty affects only one small part of an
individual’s or a society’s overall portfolio of assets, exposure is
naturally limited to that specific component and bad-tail fatness is
not such a paramount concern. However, some very few but very
important real-world situations have potentially unlimited exposure due
to structural uncertainty about their potentially open-ended
catastrophic reach. Climate change potentially affects the whole
worldwide portfolio of utility by threatening to drive all of planetary
welfare to disastrously low levels in the most extreme scenarios.”

“Conclusions
from CBA [are] more fuzzy than we might prefer, because they are
dependent on essentially arbitrary decisions about how the fat tails
are expressed and about how the damages from high temperatures are
specified.
I would make a strong distinction between thin-tailed
CBA, where there is no reason in principle that outcomes should not be
robust, and fat-tailed CBA, where even in principle outcomes are
highly sensitive to functional forms and parameter values. For ordinary
run-of-the-mill limited exposure or thin-tailed situations, there is at
least the underlying theoretical reassurance that finite-cutoff-based
CBA might (at least in principle) be an arbitrarily-close approximation
to something that is accurate and objective. In fat-tailed unlimited
exposure situations, by contrast, there is no such theoretical
assurance underpinning the arbitrary cutoffs or attenuations – and
therefore CBA outcomes have a theoretical tendency to be sensitive to
fragile assumptions about the likelihood of extreme impacts and how
much disutility they cause.”

“My
target is not CBA in general, but the particular false precision
conveyed by the misplaced concreteness of the “standard” CBA of climate
change. By all means plug in tail probabilities, plug in disutilities
of high impacts, plug in rates of pure time preference, and so forth,
and then see what emerges empirically. Only please do not be surprised
when outcomes from fat-tailed CBA are fragile to specifications
concerning catastrophic extremes.  The extraordinary magnitude of the
deep structural uncertainties involved in climate-change CBA, and the
implied limitations that prevent CBA from reaching robust conclusions,
are highly frustrating for most economists, and in my view may even
push some into a state of denial. After all, economists make a living
from plugging rough numbers into simple models and reaching specific
conclusions (more or less) on the basis of these numbers. What are we
supposed to tell policy makers and politicians if our conclusions are
ambiguous and fragile?”

“It is
threatening for economists to have to admit that the structural
uncertainties and unlimited liabilities of climate change run so deep
that gung-ho “can do” economics may be up against limits on the ability of quantitative analysis to give robust advice in such a grey area. But if this is the way things are with the economics of climate change, then this is the way things are – and non-robustness to subjective assumptions is an inconvenient truth to be lived with rather than a fact to be denied or evaded
just because it looks less scientifically objective in CBA. In my
opinion, we economists need to admit to the policy makers, the
politicians, and the public that CBA of climate change is unusual
in being especially fuzzy because it depends especially sensitively on
what is subjectively assumed about the high-temperature damages
function, along with subjective judgements about the fatness of the
extreme tails and/or where they have effectively been cut off
.
Policy makers and the public will just have to deal with the idea that
CBA of climate change is less crisp (maybe I should say even less
crisp) than CBAs of more conventional situations.”

“The
moral of the dismal theorem is that under extreme uncertainty,
seemingly casual decisions about functional forms, parameter values,
and tail thickness may be dominant. We economists should not pursue
a narrow, superficially precise, analysis by blowing away the
low-probability high-impact catastrophic scenarios as if this is a
necessary price we must pay for the worthy goal of giving crisp advice.
An artificial infatuation with precision is likely to make our analysis
go seriously askew and to undermine the credibility of what we say by
effectively marginalizing the very possibilities that make climate
change grave in the first place.

“The
issue of how to deal with the deep structural uncertainties in climate
change would be completely different and immensely simpler if systemic
inertias (like the time required for the system to naturally remove
extra atmospheric CO2) were short (as is the case for SO2,
particulates, and many other airborne pollutants). Then an important
part of an optimal strategy would presumably be along the lines of
“wait and see.” With strong reversibility, an optimal
climate-change policy should logically involve (among other elements)
waiting to see how far out on the bad fat tail the planet will end up,
followed by midcourse corrections if we seem to be headed for a
disaster. This is the ultimate backstop rebuttal of DT given by some
critics of fat-tailed reasoning, including Nordhaus. Alas, the problem
of climate change is characterized everywhere by immensely long
inertias – in atmospheric CO2 removal times, in the capacity of the
oceans to absorb heat (as well as CO2), and in many other relevant
physical and biological processes. Therefore, it is an open question
whether or not we could learn enough in sufficient time to make
politically feasible midcourse corrections. When the critics are
gambling on this midcourse-correction learning mechanism to undercut
the message of DT, they are relying more on an article of faith than on
any kind of evidence-based scientific argument.

“I
think the actual scientific facts behind the alleged feasibility of
“wait and see” policies are, if anything, additional evidence for the
importance of fat-tailed irreversible uncertainty about ultimate
climate change.

“The
relevance of “wait and see” policies is an important unresolved issue,
which in principle could decide the debate between me and Nordhaus, but
my own take right now would be that the built-in pipeline inertias
are so great that if and when we detect that we are heading for
unacceptable climate change, it will likely prove too late to do
anything much about it for centuries to come thereafter
(except,
possibly, for lowering temperatures by geoengineering the atmosphere to
reflect back incoming solar radiation). In any event, I see this whole
“wait and see” issue as yet another component of fat-tailed uncertainty
– rather than being a reliable backstop strategy for dealing with
excessive CO2 in the atmosphere.

“Nordhaus
states that there are so many low-probability catastrophic-impact
scenarios around that ‘if we accept the Dismal Theorem, we would
probably dissolve in a sea of anxiety at the prospect of the infinity
of infinitely bad outcomes.’ This is rhetorical excess and, more to the
point here, it is fallacious. Most of the examples Nordhaus gives have
such minuscule thin-tailed probabilities that they can be written off.”

“Nordhaus
summarizes his critique with the idea that there are indeed deep
uncertainties about virtually every aspect of the natural and social
sciences of climate change – but these uncertainties can only be
resolved by continued careful analysis of data and theories. I heartily
endorse his constructive attitude about the necessity of further
research targeted toward a goal of resolving as much of the uncertainty
as it is humanly possible to resolve.
I would just add that we
should also recognize the reality that, for now and perhaps for some
time to come, the sheer magnitude of the deep structural uncertainties,
and the way we express them in our models, will likely dominate
plausible applications of CBA to the economics of climate change
.”

(emphasis added)

MIT’s "Technology Review" on the regulatory obstacles to a "smart grid" needed for open, competitive electricity markets

February 6th, 2009 No comments

David Talbot, chief correspondent for the MIT Technology Review, has an excellent, long piece in the January/February online issue that explores some of the intra- and inter-state regulatory hurdles that frustrate both the expansion of renewable power and a truly free power market.

I’d like to excerpt some portions of the article here:

When its construction began in the late 19th century, the U.S. electrical grid was meant to bring the cheapest power to the most people. Over the past century, regional monopolies and government agencies have built power plants–mostly fossil-fueled–as close to population centers as possible. They’ve also built transmission and distribution networks designed to serve each region’s electricity consumers. A patchwork system has developed, and what connections exist between local networks are meant mainly as backstops against power outages. Today, the United States’ grid encompasses 164,000 miles of high-voltage transmission lines–those familiar rows of steel towers that carry electricity from power plants to substations–and more than 5,000 local distribution networks. But while its size and complexity have grown immensely, the grid’s basic structure has changed little since Thomas Edison switched on a distribution system serving 59 customers in lower Manhattan in 1882. …

While this structure has served remarkably well to deliver cheap power to a broad population, it’s not particularly well suited to fluctuating power sources like solar and wind. First of all, the transmission lines aren’t in the right places. The gusty plains of the Midwest and the sun-baked deserts of the Southwest–areas that could theoretically provide the entire nation with wind and solar power–are at tail ends of the grid, isolated from the fat arteries that supply power to, say, Chicago or Los Angeles. Second, the grid lacks the storage capacity to handle variability–to turn a source like solar power, which generates no energy at night and little during cloudy days, into a consistent source of electricity. And finally, the grid is, for the most part, a “dumb” one-way system. Consider that when power goes out on your street, the utility probably won’t know about it unless you or one of your neighbors picks up the phone. …

The U.S. grid’s regulatory structure is just as antiquated. While the Federal Energy Regulatory Commission (FERC) can approve utilities’ requests for electricity rates and license transmission across state lines, individual states retain control over whether and where major transmission lines actually get built. In the 1990s, many states revised their regulations in an attempt to introduce competition into the energy marketplace. Utilities had to open up their transmission lines to other power producers. One effect of these regulatory moves was that companies had less incentive to invest in the grid than in new power plants, and no one had a clear responsibility for expanding the transmission infrastructure. At the same time, the more open market meant that producers began trying to sell power to regions farther away, placing new burdens on existing connections between networks. The result has been a national transmission shortage.

These problems may now be the biggest obstacle to wider use of renewable energy, which otherwise looks increasingly viable. Researchers at the National Renewable Energy Laboratory in Golden, CO, have concluded that there’s no technical or economic reason why the United States couldn’t get 20 percent of its electricity from wind turbines by 2030. The researchers calculate, however, that reaching this goal would require a $60 billion investment in 12,650 miles of new transmission lines to plug wind farms into the grid and help balance their output with that of other electricity sources and with consumer demand. The inadequate grid infrastructure “is by far the number one issue with regard to expanding wind,” says Steve Specker, president of the Electric Power Research Institute (EPRI) in Palo Alto, CA, the industry’s research facility. “It’s already starting to restrict some of the potential growth of wind in some parts of the West.”

The Midwest Independent Transmission System Operator, which manages the grid in a region covering portions of 15 states from Pennsylvania to Montana, has received hundreds of applications for grid connections from would-be energy developers whose proposed wind projects would collectively generate 67,000 megawatts of power. That’s more than 14 times as much wind power as the region produces now, and much more than it could consume on its own; it would represent about 6 percent of total U.S. electricity consumption. But the existing transmission system doesn’t have the capacity to get that much electricity to the parts of the country that need it. In many of the states in the region, there’s no particular urgency to move things along, since each has all the power it needs. So most of the applications for grid connections are simply waiting in line, some stymied by the lack of infrastructure and others by bureaucratic and regulatory delays. …

Utilities, however, are reluctant to build new transmission capacity until they know that the power output of remote wind and solar farms will justify it. At the same time, renewable-energy investors are reluctant to build new wind or solar farms until they know they can get their power to market. Most often, they choose to wait for new transmission capacity before bothering to make proposals, says Suedeen Kelly, a FERC commissioner. “It is a chicken-and-egg type of thing,” she says. …

Smart-grid technologies could reduce overall electricity consumption by 6 percent and peak demand by as much as 27 percent. The peak-demand reductions alone would save between $175 billion and $332 billion over 20 years, according to the Brattle Group, a consultancy in Cambridge, MA. Not only would lower demand free up transmission capacity, but the capital investment that would otherwise be needed for new conventional power plants could be redirected to renewables. That’s because smart-grid technologies would make small installations of wind turbines and photovoltaic panels much more practical.  …

The good news is that many utilities have begun installing the requisite meters–ones that intelligently monitor power flow out of a house as well as into it. The question now is how to move beyond the blizzard of pilot projects, install smarter technologies across the grid, and begin integrating more renewable power into the new infrastructure. “The smart-grid vision is nice; we all have our color PowerPoint slides,” says Don Von Dollen, who manages intelligent-grid research at EPRI. “I think people kind of get the vision by now. Now it’s time to get stuff done.”  …

Last summer, former vice president Al Gore began arguing that the country needed to implement an entirely carbon-free electricity system within a decade to avert the danger of global warming. As part of his vision, Gore called for a “unified national smart grid” that would move power generated from renewable sources to cities, increase the efficiency of electricity use, and allow for greater control over renewable resources. He estimated that the grid overhaul would cost $400 billion over 10 years.  …

While pilot projects like the one in Boulder are worthwhile as a way to demonstrate new technologies, they’ve been implemented in hodgepodge fashion, with different utilities deploying different technologies in different states. Transmission projects are advancing incrementally, but they’re often complicated by conflicts between the states. “What we have today is this patchwork of rules and regulations that vary by state,” says Peter Corsell, CEO of GridPoint, a startup in Arlington, VA, that makes smart-grid software and is participating in the Boulder project. “We are all entrenched in this broken system, and there is no agreement on how to fix it. It’s a vicious circle.”

Some think that the answer is to give FERC more authority. Today, the agency can overrule states’ decisions on where to site transmission lines, but only in regions that the U.S. Department of Energy has designated as critical for the security of the electricity supply. So far, only two such corridors have been designated: one in the mid-Atlantic states and another in the Southwest. Even in those regions, delays continue. Southern California Edison has proposed a major transmission line in the southwest corridor; stretching from outside Los Angeles to near Phoenix, AZ, it would be able to handle power generated by future photovoltaic and solar-thermal power plants. But Arizona rejected the idea, so the utility is preparing to take its plans to FERC.

Others think the solution is a new federal policy that would make the market for renewable power more lucrative, perhaps by regulating carbon dioxide emissions, as the cap-and-trade policy proposed by Obama would do. Under such a policy, wind energy and other carbon-free electricity sources would become much more valuable, providing an incentive for utilities to expand their capacity to handle them (see “Q&A,” p. 28). “It could all change very fast,” says Will Kaul, vice president for transmission at Great River Energy in Minnesota, who heads a joint transmission planning effort that includes 11 utilities in the Midwest.  …

[A]n explosion in the use of renewables will depend heavily on upgrading the grid. That won’t come cheap, but the payoff may be worth it. “We should think about this in the same way we think about the role of the federal highway system,” says Ernest Moniz, a physics professor at MIT who heads the school’s energy research initiative. “It is the key enabler to allow us to modernize our whole electricity production system.”

(emphasis added)

One would think that deregulation of state utilities would also be a step toward freeing up markets, introducing competition, and giving utilities incentives both to make new grid investments and to profit from efficiency improvements.

In any case, I hope to visit this subject in other posts.

Categories: power, regulation Tags: