Feedbacks mean that the warming resulting from a doubling of CO2 may be much higher than 3°C, say more paleo-climate studies

May 6th, 2011

I have discussed climate “sensitivity” a number of times earlier; here is a good place to start for those interested: http://mises.org/Community/blogs/tokyotom/archive/2008/10/14/what-do-we-know-about-climate-models-and-climate-quot-sensitivity-quot-a-recent-bibliography.aspx

I just ran across more bad news about the size of the risks we face. Given the risks, it seems rather perverse to me that the “do nothing (except externalizing risks and buckling our own seat belts for the longer haul)” approach is the one that is considered “conservative”.

Joe Romm, Climate Progress (emphasis added):

The disinformers claim that projections of dangerous future warming from greenhouse gas emissions are based on computer models.  In fact, ClimateProgress readers know that the paleoclimate data is considerably more worrisome than the models (see Hansen: ‘Long-term’ climate sensitivity of 6°C for doubled CO2).  That’s mainly because the vast majority of the models largely ignore key amplifying carbon-cycle feedbacks, such as the methane emissions from melting tundra (see Are Scientists Underestimating Climate Change).

Science has just published an important review and analysis of “real world” paleoclimate data in “Lessons from Earth’s Past” (subs. req’d) by National Center for Atmospheric Research (NCAR) scientist Jeffrey Kiehl.  The NCAR release is here: “Earth’s hot past could be prologue to future climate.”  The study begins by noting:

Climate models are invaluable tools for understanding Earth’s climate system. But examination of the real world also provides insights into the role of greenhouse gases (carbon dioxide) in determining Earth’s climate. Not only can much be learned by looking at the observational evidence from Earth’s past, but such knowledge can provide context for future climate change.

The atmospheric CO2 concentration currently is 390 parts per million by volume (ppmv), and continuing on a business-as-usual path of energy use based on fossil fuels will raise it to ∼900 to 1100 ppmv by the end of this century (see the first figure) (1). When was the last time the atmosphere contained ∼1000 ppmv of CO2? Recent reconstructions (2–4) of atmospheric CO2 concentrations through history indicate that it has been ∼30 to 100 million years since this concentration existed in the atmosphere (the range in time is due to uncertainty in proxy values of CO2). The data also reveal that the reduction of CO2 from this high level to the lower levels of the recent past took tens of millions of years. Through the burning of fossil fuels, the atmosphere will return to this concentration in a matter of a century. Thus, the rate of increase in atmospheric CO2 is unprecedented in Earth’s history.

I will repost the references at the end, since this is a review article (see also U.S. media largely ignores latest warning from climate scientists: “Recent observations confirm … the worst-case IPCC scenario trajectories (or even worse) are being realised” — 1000 ppm)

 

Kiehl concludes: (emphasis in original)

Earth’s CO2 concentration is rapidly rising to a level not seen in ∼30 to 100 million years, and Earth’s climate was extremely warm at these levels of CO2. If the world reaches such concentrations of atmospheric CO2, positive feedback processes can amplify global warming beyond current modeling estimates. The human species and global ecosystems will be placed in a climate state never before experienced in their evolutionary history and at an unprecedented rate. Note that these conclusions arise from observations from Earth’s past and not specifically from climate models.

Romm has referred to other recent studies that indicate these extended feedbacks may occur relatively quickly:

http://climateprogress.org/2009/10/18/science-co2-levels-havent-been-this-high-for-15-million-years-when-it-was-5%C2%B0-to-10%C2%B0f-warmer-and-seas-were-75-to-120-feet-higher-we-have-shown-that-this-dramatic-rise-in-sea-level-i/

The conclusion from this analysis — resting on data for CO2 levels, paleotemperatures, and radiative transfer knowledge — is that Earth’s sensitivity to CO2 radiative forcing may be much greater than that obtained from climate models (12–14).

Indeed, in the release, Kiehl notes his study “found that carbon dioxide may have at least twice the effect on global temperatures than currently projected by computer models of global climate.”

Why is the ‘real world’ warming so much greater than the models?  The vast majority of the models focus on the equilibrium climate sensitivity — typically estimated at about 3°C for doubled CO2 (equivalent to about ¾°C per W/m2) — which only includes fast feedbacks, such as water vapor and sea ice.  As Hansen has explained in deriving his 6°C ‘long-term’ sensitivity: (emphasis in original)

Elsewhere (Hansen et al. 2007a) we have described evidence that slower feedbacks, such as poleward expansion of forests, darkening and shrinking of ice sheets, and release of methane from melting tundra, are likely to be significant on decade-century time scales. This realization increases the urgency of estimating the level of climate change that would have dangerous consequences for humanity and other creatures on the planet, and the urgency of defining a realistic path that could avoid these dangerous consequences.
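The relationship between the ~3°C fast-feedback sensitivity and the ¾°C per W/m2 figure is simple to sanity-check with the widely used logarithmic approximation for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C0) W/m2 (a back-of-the-envelope sketch only; the 5.35 coefficient is the standard value, not something from the quoted studies):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) for a CO2 rise from c0 to c, using the
    widely used logarithmic approximation dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

doubling = co2_forcing(2 * 280.0)   # forcing for doubled CO2, ~3.7 W/m^2
fast_warming = 0.75 * doubling      # fast-feedback sensitivity of ~3/4 C per W/m^2
print(f"doubled-CO2 forcing: {doubling:.2f} W/m^2")
print(f"fast-feedback warming: {fast_warming:.1f} C")
```

That recovers the ~3°C equilibrium figure; Hansen’s 6°C ‘long-term’ sensitivity is roughly this number doubled once the slow feedbacks he lists are included.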

For background on the tundra (and methane), see Science: Vast East Siberian Arctic Shelf methane stores destabilizing and venting:  NSF issues world a wake-up call: “Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.” …


Have we already pulled the trigger on the methane gun? As Arctic warms, surprisingly large and uncontrollable releases of methane (much more potent than CO2) are growing from massive shallow seabed deposits and permafrost

May 6th, 2011

1.  Wikipedia, Clathrate gun hypothesis (emphasis added):

The clathrate gun hypothesis is the popular name given to the hypothesis that rises in sea temperatures (and/or falls in sea level) can trigger the sudden release of methane from methane clathrate compounds buried in seabeds and permafrost which, because the methane itself is a powerful greenhouse gas, leads to further temperature rise and further methane clathrate destabilization – in effect initiating a runaway process as irreversible, once started, as the firing of a gun. …

 

2.  Skeptical Science, Wakening the Kraken, April 23, 2011:

a major study in Science that found the vast East Siberian Arctic Shelf methane stores appeared to be destabilizing and venting.  The normally staid National Science Foundation issued a press release warning “Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.”

Now there is a new Geophysical Research Letters study on a paleoclimate analog that may be relevant to humanity today, “Methane and environmental change during the Paleocene‐Eocene thermal maximum (PETM): Modeling the PETM onset as a two‐stage event.” …

We know that in the past there have been sudden changes in global warming associated with releases of greenhouse gases.  These rapid, massive releases were characterised by an unusual deficiency in carbon isotope 13 (δ13C) and massive extinction of animals, most recently at the time of the Paleocene-Eocene Thermal Maximum (PETM), about 55.8 million years ago. …

The description of Stage 2 — a very rapid and massive release of carbon deficient in δ13C — does put one in mind of the Methane Gun hypothesis. It postulates that methane clathrate at shallow depth begins melting and, through the feedback process, accelerates atmospheric and oceanic warming, melting even larger and deeper clathrate deposits.  The result: a relatively sudden massive venting of methane — the firing of the Methane Gun.  The recent discovery by Davy et al (2010) of eruption craters on the Chatham Rise seafloor off New Zealand (ten of 8-11 kilometers in width and about 1,000 of 1 kilometer) adds further ammunition to the Methane Gun hypothesis.

It has been known for many years that methane is being emitted from Siberian swamplands hitherto covered by permafrost, which traps an estimated 1,000 billion tons of methane.  Permafrost on land is now melting seasonally, and at greater depth with each season, ensuring that methane venting from this source increases each year.

Methane clathrate has accumulated over the East Siberian continental shelf where it is covered by sediment and seawater up to 50 meters deep.  An estimated 1,400 billion tons of methane is stored in these deposits.  By comparison, total human greenhouse gas emissions (including CO2) since 1750 amount to some 350 billion tons.

Significant methane release can occur when on-shore permafrost is thawed by a warmer atmosphere (unlikely to occur in significance on less than a century timescale) and undersea clathrate at relatively shallow depths is melted by warming water.  This is now occurring. In both cases, methane gas bubbles to the surface with little or no oxidation, entering the atmosphere as CH4 – a powerful greenhouse gas which increases local, then Arctic atmospheric and ocean temperature, resulting in progressively deeper and larger deposits of clathrate melting.

Methane released from deeper deposits such as those found off Svalbard has to pass through a much higher water column (>300 meters) before reaching the surface.  As it does so, it oxidises to CO2, dissolving in seawater or reaching the atmosphere as CO2 which causes far slower warming, but can nevertheless contribute to ocean acidification.

A significant release of methane due to melting of the vast deposits trapped by permafrost and clathrate in the Arctic would result in massive loss of oxygen, particularly in the Arctic ocean but also in the atmosphere.  Resulting hypoxic conditions would cause large extinctions, especially of water breathing animals, which is what we find at the PETM.

Shakhova et al (2010) reports that the continental shelf of East Central Siberia (ECS), with an area of over 2 million km2, is emitting more methane than all other ocean sources combined.  She calculates that methane venting from the ECS is now in the order of 8 million tons per annum and increasing.  This equates to ~200 million tons/annum of CO2, more than the combined CO2 emissions of Scandinavia and the Benelux countries in 2007.  This methane is likely sourced from non-hydrate methane previously kept in place by thin and now melting permafrost at the sea bed, melting clathrates, or some combination of both.
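The CO2-equivalence in that paragraph is straightforward arithmetic using the 100-year global warming potential for methane cited in these excerpts (25), with the 20-year figure (72) shown for contrast:

```python
GWP_100, GWP_20 = 25, 72    # CH4 potency vs CO2 on 100-yr / 20-yr horizons, per the text

esas_ch4_mt = 8.0                        # Shakhova's ESAS venting estimate, Mt CH4/yr
co2eq_100yr = esas_ch4_mt * GWP_100      # matches the "~200 million tons/annum of CO2"
co2eq_20yr = esas_ch4_mt * GWP_20        # far larger on the shorter horizon
print(co2eq_100yr, co2eq_20yr)
```

So the same 8 Mt of methane counts as 200 Mt CO2-equivalent over a century, but 576 Mt over the first twenty years, which is why the near-term horizon dominates discussions of abrupt release.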

Release of ECS methane is already contributing to Arctic amplification resulting in temperature increase exceeding twice the global average.  The rate of release from the tundra alone is predicted to reach 1.5 billion tons of carbon per annum before 2030, contributing to accelerated climate change, perhaps resulting in sustained decadal doubling of ice loss causing collapse of the Greenland Ice Sheet (Hansen et al, 2011).  This would result in a possible sea level rise of ~5 meters before 2100, according to Hansen et al.

Evidence supports the theory that sudden and massive releases of greenhouse gases, including methane, caused decade-scale climate changes – with consequent species extinctions – culminating in the Holocene Thermal Optimum.

In summary, immense quantities of methane clathrate have been identified in the Arctic.  Were a fraction of these to melt, the result would be massive release of carbon, initially as CH4 causing deeper clathrate to melt and oxidise, adding CO2 to the atmosphere.  Were this to occur, it would greatly worsen global warming.

While natural global warming during the ice ages was initiated by increased solar radiation caused by cyclic changes to Earth’s orbital parameters, there is no evident mechanism for correcting Anthropogenic Global Warming over the next several centuries.  The latter has already begun producing methane and CO2 in the Arctic, starting a feedback process which may lead to uncontrollable, very dangerous global warming, akin to that which occurred at the PETM.

This extremis we ignore – to our peril.

– Agnostic & Daniel Bailey

3.  Joe Romm, Climate Progress, April 25, 2011

Methane release from the not-so-perma-frost is the most dangerous amplifying feedback in the entire carbon cycle (see “NSIDC bombshell: Thawing permafrost feedback will turn Arctic from carbon sink to source in the 2020s, releasing 100 billion tons of carbon by 2100“).

It is worth noting that no climate model currently incorporates the amplifying feedback from methane released by a defrosting tundra. Indeed the NSIDC/NOAA study I wrote about in February on methane release by the land-based permafrost itself doesn’t even incorporate the carbon released by the permafrost carbon feedback into its warming model!

4.  Doc alert: Siberian methane (Jan 14, 2011)

 5. Joe Romm, Climate Progress, Paleoclimate data suggests CO2 “may have at least twice the effect on global temperatures than currently projected by computer models”, January 13, 2011

Methane release from the not-so-perma-frost is the most dangerous amplifying feedback in the entire carbon cycle.  The permafrost permamelt contains a staggering “1.5 trillion tons of frozen carbon, about twice as much carbon as contained in the atmosphere,” much of which would be released as methane.  Methane is 25 times as potent a heat-trapping gas as CO2 over a 100 year time horizon, but 72 times as potent over 20 years!  The carbon is locked in a freezer in the part of the planet warming up the fastest (see “Tundra 4: Permafrost loss linked to Arctic sea ice loss“).  Half the land-based permafrost would vanish by mid-century on our current emissions path (see “Tundra, Part 2: The point of no return” and below).  No climate model currently incorporates the amplifying feedback from methane released by a defrosting tundra.

6.  Joe Romm, Climate Progress, Science stunner: Vast East Siberian Arctic Shelf methane stores destabilizing and venting: NSF issues world a wake-up call: “Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.” March 4, 2010

Methane release from the not-so-perma-frost is the most dangerous amplifying feedback in the entire carbon cycle. Research published in Friday’s journal Science finds a key “lid” on “the large sub-sea permafrost carbon reservoir” near Eastern Siberia “is clearly perforated, and sedimentary CH4 [methane] is escaping to the atmosphere.”

 … the situation in the ESAS is far, far more dicey, as NSF explains:

 

The East Siberian Arctic Shelf, in addition to holding large stores of frozen methane, is more of a concern because it is so shallow. In deep water, methane gas oxidizes into carbon dioxide before it reaches the surface. In the shallows of the East Siberian Arctic Shelf, methane simply doesn’t have enough time to oxidize, which means more of it escapes into the atmosphere. That, combined with the sheer amount of methane in the region, could add a previously uncalculated variable to climate models.

“The release to the atmosphere of only one percent of the methane assumed to be stored in shallow hydrate deposits might alter the current atmospheric burden of methane up to 3 to 4 times,” Shakhova said. “The climatic consequences of this are hard to predict.”

And we also know that a key trigger for accelerated warming in the Arctic region is the loss of sea ice.

A 2008 study by leading tundra experts found “Accelerated Arctic land warming and permafrost degradation during rapid sea ice loss.” The lead author is David Lawrence of the National Center for Atmospheric Research (NCAR), whom I interviewed for my book and interviewed again via e-mail in 2008. The study’s ominous conclusion:

We find that simulated western Arctic land warming trends during rapid sea ice loss are 3.5 times greater than secular 21st century climate-change trends. The accelerated warming signal penetrates up to 1500 km inland….

In other words, a continuation of the recent trend in sea ice loss may triple Arctic warming, causing large emissions in carbon dioxide and methane from the tundra this century.

Oh, and the Arctic warming could lead to another feedback according to a 2008 Science article:  “Continuation of current trends in shrub and tree expansion could further amplify this atmospheric heating 2-7 times.”  The point is that if you convert a white landscape to a boreal forest, the surface suddenly starts collecting a lot more solar energy (see “Tundra 3: Forests and fires foster feedbacks“).


“Our concern is that the subsea permafrost has been showing signs of destabilization already,” she said. “If it further destabilizes, the methane emissions may not be teragrams, it would be significantly larger.”

NSF explains:

“The amount of methane currently coming out of the East Siberian Arctic Shelf is comparable to the amount coming out of the entire world’s oceans,” said Shakhova, a researcher at UAF’s International Arctic Research Center. “Subsea permafrost is losing its ability to be an impermeable cap.”


Shakhova notes that the Earth’s geological record indicates that atmospheric methane concentrations have varied between about .3 to .4 parts per million during cold periods to .6 to .7 parts per million during warm periods. Current average methane concentrations in the Arctic average about 1.85 parts per million, the highest in 400,000 years, she said. Concentrations above the East Siberian Arctic Shelf are even higher.

The East Siberian Arctic Shelf is a relative frontier in methane studies. The shelf is shallow, 50 meters (164 feet) or less in depth, which means it has been alternately submerged or terrestrial, depending on sea levels throughout Earth’s history. During the Earth’s coldest periods, it is a frozen arctic coastal plain, and does not release methane. As the Earth warms and sea level rises, it is inundated with seawater, which is 12-15 degrees warmer than the average air temperature.

“It was thought that seawater kept the East Siberian Arctic Shelf permafrost frozen,” Shakhova said. “Nobody considered this huge area.”

The hardest of the hard core climate geeks (and we all know who we are) probably recognize the name Natalia Shakhova. She’s a Research Assistant Professor working with the International Arctic Research Center at the University of Alaska Fairbanks and is probably best known to people of our ilk for her work involving Siberian methane deposits. She gave a presentation at a US Dept. of Defense symposium and workshop last November, and it (and others from the event) are online.

Dr. Shakhova’s presentation is titled “Methane Release from the East Siberian Arctic Shelf (ESAS) and the Potential for Abrupt Climate Changes”, and you can download it in PDF format from the event’s site.

Based on that title and the things I write about here (and by “write about” you can substitute “obsess over”, if you’re feeling a need for unflinching accuracy), you’ve probably figured out that this is yet another unsettling collection of data about methane. A couple of tidbits to show that such a conclusion would be accurate, even without the benefit of context (emphasis in the original):

[Slide 34]

Interpretation of acoustical data recorded with deployed multibeam sonar allowed moderate quantification of bottom fluxes as high as 44 g/m²/d (Leifer et al., in preparation). Prorating these numbers to the areas of hot spots (210×10³ km²) adds 3.5 Gt to annual methane release from the ESAS. This is enough to trigger abrupt climate change (Archer, 2005).

[Slide 38, one bullet taken from the conclusion]

Considering the significance of the ESAS methane reservoir and enhancing mechanism of its destabilization, this region should be considered the most potential in terms of possible climate change caused by abrupt release of methane.
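The prorating in Slide 34 is easy to reproduce from the two figures quoted there (a sketch only; the small shortfall versus the stated 3.5 Gt presumably reflects rounding in the original):

```python
flux_g_m2_day = 44.0              # peak bottom flux from the sonar data, g CH4/m^2/day
hotspot_area_m2 = 210e3 * 1e6     # 210×10^3 km^2 of hot spots, converted to m^2
annual_gt = flux_g_m2_day * hotspot_area_m2 * 365 / 1e15   # g/yr -> Gt/yr
print(f"{annual_gt:.2f} Gt/yr")   # ~3.4 Gt/yr, close to the 3.5 Gt in the slide
```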

Methane (CH4) deserves attention because it is such a highly potent greenhouse gas — 25-33 times more powerful than carbon dioxide (CO2) over a 100-year time horizon, but as much as 100 times more potent over 20 years, according to the latest research!

Last year I reported on a major study in Science that found the vast East Siberian Arctic Shelf methane stores appeared to be destabilizing and venting.  The normally staid National Science Foundation issued a press release warning “Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.” …

Most deposits of methane clathrate are in sediments too deep to respond rapidly, and modelling by Archer (2007) suggests the methane forcing should remain a minor component of the overall greenhouse effect.[10] Clathrate deposits destabilize from the deepest part of their stability zone, which is typically hundreds of metres below the seabed. A sustained increase in sea temperature will warm its way through the sediment eventually, and cause the deepest, most marginal clathrate to start to break down; but it will typically take of the order of a thousand years or more for the temperature signal to get through.[10]

One exception, however, may be in clathrates associated with the Arctic ocean, where clathrates can exist in shallower water stabilized by lower temperatures rather than higher pressures; these may potentially be marginally stable much closer to the surface of the sea-bed, stabilized by a frozen ‘lid’ of permafrost preventing methane escape. Recent research carried out in 2008 in the Siberian Arctic has shown millions of tons of methane being released, apparently through perforations in the seabed permafrost,[11] with concentrations in some regions reaching up to 100 times normal.[12][13] The excess methane has been detected in localized hotspots in the outfall of the Lena River and the border between the Laptev Sea and the East Siberian Sea. Some melting may be the result of geological heating, but more thawing is believed to be due to the greatly increased volumes of meltwater being discharged from the Siberian rivers flowing north.[14] Current methane release has previously been estimated at 0.5 Mt per year.[15] Shakhova et al. (2008) estimate that not less than 1,400 Gt of carbon is presently locked up as methane and methane hydrates under the Arctic submarine permafrost, and 5–10% of that area is subject to puncturing by open taliks. They conclude that “release of up to 50 Gt of predicted amount of hydrate storage [is] highly possible for abrupt release at any time”. That would increase the methane content of the planet’s atmosphere by a factor of twelve,[16][17] equivalent in greenhouse effect to a doubling in the current level of CO2.
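The “factor of twelve” claim can be roughly checked against the present-day atmospheric methane burden (a sketch; the atmospheric mass and molar masses are standard values, the ~1.8 ppm mixing ratio is the figure cited in these excerpts, and the 50 Gt is taken here as methane — the original authors’ assumptions may differ slightly):

```python
ATM_MASS_KG = 5.15e18                    # total mass of Earth's atmosphere (standard value)
MOLAR_AIR_G, MOLAR_CH4_G = 28.97, 16.04  # mean molar mass of air, molar mass of CH4
ch4_ppmv = 1.8                           # current CH4 mixing ratio cited in these excerpts

air_mol = ATM_MASS_KG * 1e3 / MOLAR_AIR_G                    # moles of air
burden_gt = air_mol * ch4_ppmv * 1e-6 * MOLAR_CH4_G / 1e15   # grams -> Gt of CH4
ratio = (burden_gt + 50.0) / burden_gt                       # after an abrupt 50 Gt release
print(f"current CH4 burden ~{burden_gt:.1f} Gt; post-release ratio ~{ratio:.0f}x")
```

This gives a current burden of roughly 5 Gt and a post-release burden around eleven times today’s, the same order as the quoted factor of twelve.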

In 2008 the United States Department of Energy National Laboratory system[18] and the United States Geological Survey’s Climate Change Science Program both identified potential clathrate destabilization in the Arctic as one of four most serious scenarios for abrupt climate change, which have been singled out for priority research. The USCCSP released a report in late December 2008 estimating the gravity of this risk.

The East Siberian Arctic Shelf is a methane-rich area that encompasses more than 2 million square kilometers of seafloor in the Arctic Ocean. It is more than three times as large as the nearby Siberian wetlands, which have been considered the primary Northern Hemisphere source of atmospheric methane. Shakhova’s research results show that the East Siberian Arctic Shelf is already a significant methane source, releasing 7 teragrams of methane yearly, which is as much as is emitted from the rest of the ocean. A teragram is equal to about 1.1 million tons.

Scientists learned last year that the permafrost permamelt contains a staggering “1.5 trillion tons of frozen carbon, about twice as much carbon as contained in the atmosphere,” much of which would be released as methane.  Methane is 25 times as potent a heat-trapping gas as CO2 over a 100 year time horizon, but 72 times as potent over 20 years!

The carbon is locked in a freezer in the part of the planet warming up the fastest (see “Tundra 4: Permafrost loss linked to Arctic sea ice loss“).  Half the land-based permafrost would vanish by mid-century on our current emissions path (see “Tundra, Part 2: The point of no return” and below).  No climate model currently incorporates the amplifying feedback from methane released by a defrosting tundra.

The new Science study, led by University of Alaska’s International Arctic Research Centre and the Russian Academy of Sciences, is “Extensive Methane Venting to the Atmosphere from Sediments of the East Siberian Arctic Shelf” (subs. req’d).  The must-read National Science Foundation press release (click here), warns “Release of even a fraction of the methane stored in the shelf could trigger abrupt climate warming.”  The NSF is normally a very staid organization.  If they are worried, everybody should be.

It is increasingly clear that if the world strays significantly above 450 ppm atmospheric concentrations of carbon dioxide for any length of time, we will find it unimaginably difficult to stop short of 800 to 1000 ppm. …


Tornadoes, fires and floods, oh my! Time to stop hiding our heads in the sand. Who benefits from our loading of the climate dice?

May 6th, 2011

[My apologies for weird formatting, I find it very difficult to deal with html embedded in text that I cut and paste!]

No doubt a locally cold winter helped many readers put behind them thoughts about last year’s worldwide record droughts, floods and heatwaves.

But the storms and firestorms are back with a vengeance, and neither the overall global warming nor our ongoing radiative forcing has stopped. I urge readers to take a look and reflect. There is, after all, a libertarian climate agenda of freeing markets and dismantling corporate risk-shifting and resulting over-regulation (as well as apparently serious suggestions from George Reisman and Stephan Kinsella that we start experimenting with atom bomb-based climate modification or other deliberate geo-engineering measures).

Given the great heat sink that are the world’s oceans, we are only now feeling the forcing attributable to GHGs emitted 40 years ago (with a similar lag before the full effect of what we are emitting now will be felt). And the emissions of China and India are expected to double further before peaking in a few decades.

A few links and excerpts, in reverse chronological order:

Dr. Jeff Masters’ WunderBlog; April’s tornado outbreaks the two largest in history; http://www.wunderground.com/blog/JeffMasters/comment.html?entrynum=1796; Posted by: JeffMasters, 2:54 PM GMT on May 05, 2011


Stu Ostro, Weather Channel Senior Meteorologist, “The Katrina of tornado outbreaks“:

The atmosphere was explosively unstable with summerlike heat and humidity, interacting with a classic wind shear setup as a strong jet stream and upper-level trough crashed overhead….

The atmosphere is extraordinarily complex, and ultimately what’s happened the past month is probably a combination of influences, including La Nina, other natural variability, and anthropogenic global warming.

Extreme weather disasters, especially deluges and floods, are on the rise — and the best analysis says human-caused warming is contributing (see Two seminal Nature papers join growing body of evidence that human emissions fuel extreme weather, flooding).  Last year, we had Tennessee’s 1000-year deluge aka Nashville’s ‘Katrina’.  And coastal North Carolina suffered its second 500-year rainfall in 11 years.

Craig Fugate, who heads the U.S. Federal Emergency Management Agency, said in December, “The term ‘100-year event’ really lost its meaning this year” (see Munich Re: “The only plausible explanation for the rise in weather-related catastrophes is climate change”).

Former hurricane-hunter Masters has a good analysis of how the “Midwest deluge [is] enhanced by near-record Gulf of Mexico sea surface temperatures”

UPDATE:  “Persistent, heavy rains have helped swell the Mississippi and Ohio rivers to the highest levels ever recorded,” CNN reports.  And the rivers are still rising.

The Effect of Climate Change on Tornado Frequency and Magnitude:  “There is an obvious increase in tornado frequency between 1950-1999. This could be due to increased detection. Also this could be due to changing climatic conditions.”

For decades, scientists have predicted that if we kept pouring increasing amounts of heat-trapping greenhouse gases into the atmosphere, we would change the climate.  They specifically predicted that many key aspects of the weather would become more extreme — more extreme heat waves, more intense droughts, and stronger deluges.

As far back as 1995, analysis by NOAA’s National Climatic Data Center (led by Tom Karl) showed that over the course of the 20th century, the United States had suffered a statistically significant increase in a variety of extreme weather events, the very ones you would expect from global warming, such as more — and more intense — precipitation. That analysis concluded the chances were only “5 to 10 percent” this increase was due to factors other than global warming, such as “natural climate variability.” And since 1995, the climate has gotten measurably more extreme.

Multiple scientific studies find that indeed the weather has become more extreme, as expected, and that it is extremely likely that humans are a contributing cause (see “Two seminal Nature papers join growing body of evidence that human emissions fuel extreme weather, flooding that harm humans and the environment” and links therein).

Beyond that, as Dr. Kevin Trenberth, head of the Climate Analysis Section of the National Center for Atmospheric Research, explained here last year: “There is a systematic influence on all of these weather events now-a-days because of the fact that there is this extra water vapor lurking around in the atmosphere than there used to be say 30 years ago. It’s about a 4% extra amount, it invigorates the storms, it provides plenty of moisture for these storms.”  He told the NY Times, “It’s not the right question to ask if this storm or that storm is due to global warming, or is it natural variability. Nowadays, there’s always an element of both.”

Jeremy Hance; mongabay.com; Are US floods, fires linked to climate change? http://news.mongabay.com/2011/0428-hance_extremeweather_us.html; April 28, 2011

“There have always been extreme events,” Peter Stott, a climatologist from the UK’s Met Office, told Yale360 in a piece on extreme weather and climate change. “Natural variability does play a role, but now so does climate change. It is about changing the odds of the event happening.”  

“By now, most people get that you can’t attribute any single weather event on global warming,” John Nielsen-Gammon, Texas’ state climatologist and a professor at Texas A&M University, told the McClatchy-Tribune news service. “But some things are clear: temperatures have been going up, and models all agree that the temperature rise will continue unless we get some massive volcanic eruptions or the sun suddenly becomes much dimmer.”

 Multiple torrential downpours are setting the stage for more 100-year floods in the coming days, as meteorologist Dr. Jeff Masters reports today.

Several papers published in the journal Nature demonstrate that such extreme precipitation events in specific localities are the result of climate change and not an overactive imagination. The scientists studied the actual, observable precipitation patterns in the 20th century and then compared them to climate model simulations, with a splash of probability, to discover a close, predictive match-up.

They claim that their results provide the “first formal identification of a human contribution to the observed intensification of extreme precipitation.” The scientists, led by Seung-Ki Min of the Climate Research Division of Environment Canada in Toronto, say that the global climate models may, in fact, be underestimating the amount of extreme weather events, “which implies that extreme precipitation events may strengthen more quickly in the future than projected and that they may have more severe impacts than estimated.”

Another study, led by Pardeep Pall at the University of Oxford, looked at a specific weather event: the 2000 floods in England and Wales, which occurred during the wettest autumn since 1766. …

Climate change could signal prolonged droughts in American Southwest
Think the 1930s “Dust Bowl” was bad in the American West? Scientists have found evidence of “mega-drought” events that lasted centuries to millennia in the same region during warm, interglacial periods in the Pleistocene era (370,000-550,000 years ago). The evidence heightens concern over how the region will react to the modern day global temperature spikes.

The American Southwest is already predicted to get pretty dry during climate change, due to a drop in winter precipitation that would increase evaporation rates and lead to smaller snow packs that normally provide water during the warmer months.

 

New York Times, In Weather Chaos, a Case for Global Warming, http://www.nytimes.com/2010/08/15/science/earth/15climate.html (August 14, 2010)

Categories: Uncategorized Tags:

Richard Nixon, not enviros, was responsible for meddling in energy usage

May 2nd, 2011 No comments

Rob Bradley, an erstwhile libertarian turned energy industry spokesman – and thus no friend of enviro-fascists (or libertarian critics like me) – provides a useful reminder of the history of government interference in energy (though he omits the Vietnam War and the role of gross pollution problems).

Below are excerpts from Bradley’s May 2 post at his Big Oil and Big Coal cheerleading blog, ‘Master Resource’ (emphasis added):

Remembering the Birth of Conservationism (Part I: President Nixon’s price controls, not Arab OPEC, produced energy crisis, demand-side politicization)

The oil crisis, contrary to popular remembrance, did not begin with the Arab Embargo of October 1973. It began with petroleum product shortages that arose in late 1972 when price controls became constraining. In February 1973, Senate hearings on fuel shortages … Expert testimony was heard about how 18 months of price controls were at the root of the supply shortfall, as were the lingering constraints of an earlier federal program designed to help the domestic industry in a time of oil surplus, the Mandatory Oil Import Program.

The U.S. Senate convened a meeting on energy conservation, identified as “the first congressional hearings to be devoted to this subject.” Demand was now decoupled from supply, creating an industry of thought, opinion, and passion as to what demand should be and what role government should play to correct oil-market problems. The game was rigged thanks to Richard Nixon, whose original 90-day freeze would be but the first of five price-control phases and the starting point for more than seven years of price-and-allocation regulation under the Emergency Petroleum Allocation Act of 1973 (EPAA).

The March 1973 hearings attracted the first wave of energy conservationists and environmentalists from organizations such as the Environmental Defense Fund, Friends of the Earth, and the Sierra Club.

Conservationism (as versus self-interested conservation) would now have a life of its own. Energy usage was a per se bad. Less was better. Energy appliances and motorized transportation would never be the same after President Nixon’s ill-fated action of wage and price controls.

Categories: Uncategorized Tags:

Are ‘enviros’ evil, or trying to protect property + reassert control over behemoths? New Zealand Navy + Petrobras vs. Maori fishermen

April 27th, 2011 No comments

A quick show of hands:

  • How many of you think that the recent protests by New Zealand “greens” and Maori fishermen against Government-licensed oil exploration activities by Petrobras are evidence of a blind envirofascist hatred of mankind?
  • How many of you cheer on the “capitalist” exploration and development of government-owned resources by big and state-entangled corporations, over the quaint claims to “fishing rights” by locals?
  • How many of you think that Petrobras and its shareholders are the real victims of these protests?
  • How many think it’s a sign of how government “ownership” of public resources leads to zero-sum politicization of decisions, and of decisions that are tilted toward activities that provide revenues to government, while shifting risks to local communities and individuals?
  • Does anybody see any parallels with BP and the Gulf?  With the crony capitalism supporting Tokyo Electric, the operator of the Fukushima nuclear plants?

I posted a few tweets on this topic, which I copy below in chronological order:

 TokyoTom

NZ Navy intervention in oil protest “disgusting” – Maori MPs |     
 TokyoTom 

Maori Sovereignty? “Maori feel the pollution risk to the water+fish stocks is too great”   
 TokyoTom 

Maori skipper detained by Navy warship for defending ancestral fishing waters from Oil Drilling  
 TokyoTom 

Maori fisherman: “We are defending tribal waters+our rights from reckless Govt policies”   
 TokyoTom 

“opening up natl parks+our coastline to transnationl corps shows contempt+will face fierce+sustained local resistance”
 TokyoTom 

Petrobras protest+Maritime Rules  |clear frm+ tht Gov ‘ownership’ leads to poor risk mgt+theft frm communities
 TokyoTom 

“Gov has awakened some sort of taniwha.We’re all virgins at doing this.We never fight”   
 TokyoTom 

“April 11: NZ Navy ships+Air Force planes begin monitoring the protest along with police”   
 TokyoTom 

“after the licence was given-in what way is that consultation? It isn’t, not even close”    
 TokyoTom 

Te Karere Ipurangi » Blog Archive » Oil surveys damage sea creatures organs – ECO    
 TokyoTom 

MP says it is a disgrace…wrong for NZ citizen to be threatened by Defense for opposing a deal btwn gov+foreign oil co
 TokyoTom 

NZ Gov happy w discretion to act unilaterally 2increase Gov revenues+to ignore locals  
 TokyoTom 

In NZ as in ,locals trying to exercise community crtl treated as ‘terrorists’    
 TokyoTom 

Rikirangi Gage to  “We are defending tribal waters+our rights frm reckless Gov policies”  
 TokyoTom 

AUDIO:Rikirangi Gage of te Whānau-ā-Apanui vessel radios captain of  oil survey ship  


Categories: Uncategorized Tags:

On Feel Sorry for BP Day, Coast Guard report blasts Transocean's Deepwater Horizon, a Marshall Islands flagship. Somehow role of Govt as irresponsible resource owner is overlooked

April 23rd, 2011 No comments

Yesterday, on Feel Sorry for BP Day (we all know that BP is just another victim of fishermen, other purported victims and Government, right?), the WSJ provided coverage of a Coast Guard report blasting Transocean, the Marshall Islands-flagged owner of the Deepwater Horizon rig that blew up and sank last year, leaving an oil spill that has greatly affected the Gulf of Mexico and the health and livelihoods of many thousands of people.

Somehow, both the Coast Guard and the WSJ reporters managed to overlook the 800 lb. gorilla in the room: namely, the role of Government as irresponsible resource owner. How convenient!

These days, even supposedly ‘capitalist’ news organizations don’t seem to have any grasp of how profoundly “non-capitalist” our energy sector is, heavily tied as it is to government as resource owner and pollution-permit authorizer.

Here’s the link.

Categories: Uncategorized Tags:

WSJ article makes clear that dealing with nuclear power plants in crisis mode is very much experimental

April 23rd, 2011 No comments

I ran across an interesting WSJ piece today that I Tweeted as follows:

=EXPERIMENT: Tepco Let  Pressure Soar to Twice Design Limit Before Venting that then Exploded WSJ

Here are some excerpts of the WSJ article, “Reactor Team Let Pressure Soar”:

The operator of Japan’s stricken nuclear plant let pressure in one reactor climb far beyond the level the facility was designed to withstand, a decision that may have worsened the world’s most serious nuclear accident in a quarter century.

Japanese nuclear-power companies are so leery of releasing radiation into the atmosphere that their rules call for waiting much longer and obtaining many more sign-offs than U.S. counterparts before venting the potentially dangerous steam that builds up as reactors overheat, a Wall Street Journal inquiry found.

Japan’s venting policy got its first real-world test in the chaotic hours after March 11’s earthquake and tsunami knocked out cooling systems at the Fukushima Daiichi nuclear-power complex. By the first hours of March 12, an emergency was brewing inside the plant’s No. 1 reactor.

By around 2:30 a.m., the pressure inside the vessel that forms a protective bulb around the reactor’s core reached twice the level it was designed to withstand. Amid delays and technical difficulties, it was another 12 hours before workers finished releasing radioactive steam from this containment vessel, via reinforced pipes, to the air beyond the reactor building.

About an hour later, the reactor building itself exploded—a blast that Japanese and U.S. regulators have since said spread highly radioactive debris beyond the plant. The explosion, along with others amid overheating at reactors 2, 3 and 4, contributed to radiation levels that led to mandatory evacuations around the plant and the government’s admission that the Fukushima Daiichi disaster ranks alongside Chernobyl at the top of the nuclear-disaster scale.

Experts in the U.S. and Japan believe the venting delay may have helped create conditions that led to the blast. In one possible scenario, pressure built so high that it damaged gaskets and other parts of the venting system, through which highly explosive hydrogen gas leaked from the core into the reactor building. It was Japan’s cautious approach to venting, an outgrowth of its profound concern over nuclear contamination, that may well have made the accident worse, they say.

Containment vessels can withstand higher pressures, some studies have indicated. Among these are studies conducted in the 1990s by Japanese operators and equipment manufacturers, in preparation for Japan’s first set of severe-accident protocols, that say such vessels can withstand twice the design pressure. Many Japanese operators have adopted this as their benchmark for releasing contaminated air.

Tepco spokesman Yoshikazu Nagai confirmed that if there is a risk of releasing radiation, the company doesn’t vent until pressure hits roughly twice the design limit. “Venting is a last resort,” Mr. Nagai said.

General Electric Co., the designer of the vessel at Fukushima Daiichi, said it is unaware of any such Japanese studies or venting protocols.

The International Atomic Energy Agency said it doesn’t have specific guidelines on venting and doesn’t comment on the appropriateness of actions taken in member countries.

U.S. protocols on handling accidents at similar reactors call for venting before pressure exceeds the design level. The same protocol is followed by plant operators using similar types of reactors in Korea and Taiwan, industry experts in those countries say.

I am reminded of the article by Bill Keisling on Three Mile Island that I cross-posted earlier; here’s a relevant excerpt for those of you who missed it:

Rule 1: Commercial atomic energy technology is a pseudo-science and is not based on proper scientific experimentation.

As we recently witnessed during the multiple nuclear accidents at the Fukushima nuclear power plant, a damaged reactor (or reactors) often has broken controls, computers systems, and gauges that make monitoring a runaway nuclear reaction difficult, if not impossible.

Confusion and fright in the control room(s) at the time of emergency create what can almost be called A Fog of War. Indeed, war it is. They’re at war with a runaway nuclear reactor.

At Fukushima, as on Three Mile Island, operators wished they could simply peer into the containment building with their own eyes and dispense with the broken alarms, computers and gauges that tell them nothing, and often mislead them.


‘The nuclear power industry naturally doesn’t think very much of troublesome nitwits like Galileo, Francis Bacon, René Descartes, Isaac Newton, and their ridiculous, old-fashioned ideas about experimentation, reproducible results, and scientific method.’


But that’s only a small part of the problem. Truth is, no one really understands the behavior of tons of melted nuclear fuel in a reactor.

For a variety of reasons, the commercial nuclear power industry and its government regulators never conducted a single experimental meltdown of a full-size nuclear reactor.

So, until one melts, no one knows how a runaway reactor will behave.

As most of us remember from high school, scientific knowledge has advanced over the centuries because of what’s called the Scientific Method.

The Oxford English Dictionary defines the Scientific Method as “a method of procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses.”

In simple words, real-world experiments must be designed to test a hypothesis, and results must be reproducible.

As we know, cars and planes are rigorously tested and crashed all the time, in all manner of ways, in all sorts of conditions. That’s how designers and regulators learn how these complicated machines behave in real-world accidents, and whether they’re safe.

Not so nuclear reactors. For a variety of reasons, including half a century of financial and political considerations, regulators in the United States side-stepped or outright ignored the issue of full-scale reactor safety testing, and continue to ignore it to this day.

This inescapable and troubling fact is entwined with the history of atomic power regulation in the United States. 

Perhaps it’s time for us to try the experiment of ending all government support for nuclear power, “public” utility monopolies, and corporations in general (which could not exist in their present form without state grants of limited liability to shareholders)? Maybe then – when investors and power companies are faced with a greater share of the actual risks generated by their businesses – we will see a more responsible and prudent form of capitalism, one that generates true wealth, and not simply moral hazard and risk-shifting, plus profits for a few?

Categories: Uncategorized Tags:

Ayn Rand Center promises to have fun with man-hating enviros for Earth Day, so for fun I sent them a note

April 13th, 2011 No comments

A little birdy Tweet by the Ayn Rand Center for Corporate Individual Rights told me that the ARC’s Voices for Rent-Seekers Reason blog is “Gearing up for Earth Day”, so of course I had to go take a look.

Here’s some of what I found:

As Earth Day (April 22) approaches, the Ayn Rand Center wants to help you understand the destructive campaign environmentalists have pursued for decades against energy production.

… [It is] clear that environmentalists are not so concerned about carbon emissions—they fight against every form of practical, cheap energy regardless of whether it emits CO2 (like fossil fuels) or not (like nuclear and hydro).

As ARC fellow Dr. Keith Lockitch explains …

 

the basic moral premise at the root of environmentalism is the premise that nature is something to be left alone—to be preserved untouched by human activity.

… This moral animus against human “intrusion” upon nature creates a basic conflict between the goals of the environmentalist movement and the needs of human life.

 

The blog post ends with a promise that (emphasis added)

on Earth Day, which is on Friday, April 22, we will be hosting a live Q&A session from our headquarters in Irvine, CA, where resident fellows Keith Lockitch and Alex Epstein will answer any questions you have about Earth Day, environmentalism, the recent nuclear scare in Japan, and related issues.

Besides some inaugural encounters at Mises Daily and the Mises Blog with Objectivist Dr. George Reisman,  readers might like to note that I previously commented on some of Keith Lockitch’s work relating to aid and development.

It certainly should not surprise my regular readers that I find the gist of the ARC post to be largely irrelevant, and productive only in the sense of offering a self-deceptive defense of a profoundly corrupted and statist ‘free market’. Citizens of all stripes have every right to want to fight with others over:

  • how valuable government-owned resources (public lands, lakes and waters and marine resources) will be used or protected (as no private transactions to otherwise express preferences are possible);
  • levels of air, land and water pollution that governments will license companies to emit (rather than enforcing rights to prevent trespassing by others to person and property);
  • what energy sources the government-licensed and -regulated companies we call ‘public utilities’ will invest in (at guaranteed rates) and what conservation measures they will embark on (as preferences cannot be expressed via free transactions in competitive markets) – including nuclear power; and
  • what risks to the public at large (and what political activities) we will allow from those massive and politically powerful risk-shifting and commons-privatizing machines we call corporations.

While I strive for optimism that self-described principled libertarians will aim for constructive engagement with people who are understandably dissatisfied with a system profoundly skewed by government, sometimes my cynicism gets the better of me.

I already regret the following note that I sent via the comment form to the ARC bloggers, but hey, who knows?

Sadly, it looks like, far from a ‘principled’ examination of the very negative roles that government plays in the destruction of resources and the other environmental and human rights issues that concern environmentalists, ARC wants to attack motives.

Examine the role of government ownership of resources in the destruction of the Gulf of Mexico, in overfishing, and in fostering political battles for control of the use of ‘public lands’? Doesn’t seem ARC is interested.

Examine the role of government in licensing polluting coal plants, stripping govt-owned lands for coal, owning TVA and others which have large fly-ash pollution, and licensing mountain-top removal practices that destroy the rivers shared by others? Will ARC sympathize and point to government refusal to protect private property (thus leading to political battles over regulations), or simply bash enviros?

Will ARC examine the negative role of government in granting monopolies to utility companies, thereby leading to wasteful pricing and lack of choice -as well as costly crony capitalism in the case of nuclear power, which gets rate guarantees and liability waivers?

Will ARC examine the role played by government in creating corporations in the first place, in a form that embeds moral hazard and exacerbates ‘agency problems’, by legislatively telling shareholders they have no liability for the damages the legal fiction they own might cause to others? Will  ARC even notice how abuses arising from the corporate form have given rise to the modern, corrupt regulatory state?

I’m on pins and needles, just waiting to see how principled and productive ARC is going to be!

By the way, I had some comments for Keith a couple of years ago: Not #Climate Change Welfare, But Capitalism and Free Markets? A few thoughts: TT’s Lost in Tokyo http://bit.ly/h60o8O #tcot #p2

I AM hoping for productive comments that address the role of government (and of the commons-destroying corporations they’ve created) rather than the motives of those nasty ‘capitalism’-hating enviros.

Sincerely,

TT

http://mises.org/Community/blogs/tokyotom/default.aspx
http://twitter.com/Tokyo_Tom
https://www.google.com/profiles/TokyoTomSr

“The first principle is that you must not fool yourself – and you are the easiest person to fool.”
Richard Feynman

Categories: Gearing up for Earth Day Tags:

A guest post by investigative reporter & Three-Mile Island gadfly Bill Keisling on "The Fukushima Experiment"

April 7th, 2011 No comments

I’ve run across a very interesting post on problems with nuclear power and the “crony capitalist” nuclear power industry and government interface, by a veteran freelance Pennsylvania journalist/gadfly who started writing about Met-Ed’s Three Mile Island facility well before it experienced its famous meltdown.

Bill Keisling is a dogged hunter of local corruption, a prolific author, blogger (at his website Yardbird.com) and videomaker (see, for example, his exposé on how Pennsylvania college students were housed on a former Department of Defense nuclear waste site).

Bill kindly gave me permission to cross-post his piece below, which I copy in its entirety from his website; I encourage readers to visit it. I think his views provide very useful context.

  

 big wave at fukushima by mr. ok cola

The Fukushima Experiment

A nuclear meltdown survival guide

Japan’s Tepco utility executives and government officials are alternately accused of covering-up, withholding information, or downplaying the severity of their nuclear accident.

Truth is, as many of us nuclear meltdown veterans know, those utility executives and officials are as much in the dark as the rest of us.

If you live within two hundred miles of a nuclear power plant, consider this: If the plant suffers a meltdown, no one on earth will be able to tell you what to expect.

Welcome, then, to the Fukushima Experiment …

 

by Bill Keisling

 

Posted March 28, 2011 — The nuclear meltdowns at Japan’s Fukushima Daiichi atomic power plant reignited deeply personal memories for many of us in central Pennsylvania who lived through 1979’s Three Mile Island incident.

Some argue that the technological or natural causes of these two nuclear accidents differ greatly. Yet aspects of both are stunningly similar: both events caused world-wide hysteria and panic, followed by general condemnation of utility executives and government officials for their supposed mishandling or misunderstanding of the crisis.

 


The nuclear accident on Three Mile Island was a life-changing experience for me, and many others. In 1979 I was a young editor of a community magazine. I was actually the first writer/journalist to arrive at the gates of Three Mile Island the morning of the accident, on March 28, 1979. That morning I had both personal and professional reasons for being there.

The community newspaper I edited, Harrisburg Magazine, had, in the months leading up to the Three Mile Island accident, uncovered myriad problems at the nuclear power plant. We’d documented the willingness of state and federal regulators to look the other way so that the substandard and unsafe power plant could operate.

In August 1978 we even published a cover story detailing a possible disaster scenario involving these unresolved problems at the power plant titled, “Meltdown: Tomorrow’s Disaster at Three Mile Island.”

The owner of the power plant, Metropolitan Edison, was not amused. The electric utility responded by seeking a congressional investigation of our small magazine. Met-Ed almost ran us out of business.

Several months later, early on the morning of the accident, I got a call from a friend telling me that there was some sort of leak at the power plant and that a nuclear site emergency had been declared. I threw my camera and tape recorder into the car and drove the dozen or so miles to the gates of Three Mile Island.

There wasn’t much to see. To the naked eye, the two reactors and the four cooling towers sat placidly as ever on the island. From the gate nothing seemed particularly wrong, or out of place. A small amount of steam rose from two of the massive cooling towers.

The guards at the gate did their best to ignore me. I asked a guard what was going on but he brusquely refused to answer any questions. I pointed to a radiation monitor he wore on his jacket — a dosimeter — and asked what the instrument read.

“It doesn’t matter now,” he told me with a nervous break in his voice.

Shortly thereafter I was standing at the gate when scared nuclear workers began evacuating the plant. The guards hurriedly passed hand-held Geiger counters over each employee’s car, checking for radiation.

This, it turns out, wouldn’t be that much different from events at the gates of the Fukushima Daiichi nuclear power plant in March 2011. An American software engineer working at Fukushima witnessed terrified Japanese nuclear workers trying to escape by climbing over the nuclear plant’s fence following the earthquake.

As for myself, back in 1979, at the gates of Three Mile Island, my first impulse was to run. I later wrote about the moment in my novel, The Meltdown:

It made you think this wasn’t such a good place to hang out.

The main gate opened, the cars streamed out. They came one after another to the highway and turned right, wasting no time, tires spinning in the gravel. I heard one of the drivers say to another, ‘We’re all supposed to go to the substation down the road to be tested for contamination.’

Forty or fifty cars streamed from the plant, stopped momentarily to be swept by Geiger counters at the gate, then barreled up the road out of sight. All the while the cooling towers hung in the background.

Some sort of wild frightening premonition swept over me.

The idea came to me to put five hundred miles between me and this place. I turned and started back to my car. I only took two or three steps, then I stopped. Maybe I should call some friends, I thought. Let them know the reactor’s about to melt. It would be a kind, a thoughtful thing to do, a kindness I’d appreciate from a friend. But I wouldn’t be able to reach most of the people I knew.

At that moment I made a fateful decision that, for me, was life changing. I’d realized there was no place in the world to run from a nuclear accident. I couldn’t possibly warn all my friends and family. My life would be destroyed with the people and the town that I knew.

So sorry: American and Japanese utility executives employ different approaches to breaking bad nuclear news. Met-Ed’s Jack Herbein wagged his finger and told us to Talk to the Hand in 1979; Tepco execs offered deep bows (bottom). Herbein photo by Bill Keisling.


I turned to face the power plant, and planted my foot firmly in its path. I decided at that moment to understand what was happening, and to try to understand why it happened.

In the ensuing minutes, hours, and days, I saw it all, much of it first hand.

I followed the procession of cars evacuating the power plant gates to a nearby observation center. There I listened, watched, and interviewed scared workers. Things I saw that morning are forever burned into my memory.

One middle-aged nuclear worker sat nervously inside the touristy observation center waiting to be screened for radiation contamination. His hands shook violently and uncontrollably. He held his hands out in front of himself and watched them shake. He stared at his own shaking hands as if they were someone else’s hands, and not his own.

It was bedlam all around us at the observation center. Rad-suited crews swept the grounds for radiation leaks. One hyper worker knocked through the pandemonium gasping, “There’s been a mix up somewhere here!”

Helicopters carrying out-of-town newsmen and cameramen spun down from the sky. By the minute, before my eyes, it grew into an international incident.

Before long a helicopter carrying a utility executive landed on the lawn of the observation center. Jack Herbein, Met-Ed’s vice president for generation, convened an impromptu news conference on the back lawn.

Jack Herbein was normally a polished and controlled utility executive. That day he memorably told the television cameras that everything was “under control.”

“There’s nothing to worry about,” Herbein told us. “Just a little water spilt on the floor.”

We followed Herbein inside the observation center. I yelled over the din at him, inquiring whether this was a nuclear core meltdown.

Herbein looked straight at me, but didn’t answer. His eyes betrayed shock, and fright. He turned and hurried back to his helicopter and choppered away.

Within days, Met-Ed’s Jack Herbein would find himself at ground zero of an international uproar.

The accident just wouldn’t go away. Utility executives and government officials tried their best to play things down. Then, a few hours later, more wrenching bad news would leak from the power plant.

The reactor’s 150-plus tons of nuclear fuel might be melting. The governor ordered an evacuation of children and pregnant women. A potentially explosive hydrogen bubble was detected in the reactor. Things clearly weren’t “under control.”


Met-Ed’s Jack Herbein stands on milk box to scold world press: ‘I don’t know why we need to tell you every little thing that we do!’ Tepco execs in 2011 offer still more apologetic deep bows to evacuees. Jack Herbein photo by Bill Keisling.


Four days after the initial accident on Three Mile Island, on Saturday, March 31, 1979, at a press conference in nearby Middletown, wearing the same rumpled suit he’d been in for days, an exhausted Jack Herbein of Met-Ed stood on a milk carton to boost himself above a mountain of microphones to bray at the immense polyglot mob of the world’s news media, “I don’t know why we need to tell you each and every little thing that we do!”

That one moment of frustrated pique cost Met-Ed, and Jack Herbein, all public sympathy.

But was Jack Herbein covering up, or was he simply as much in the dark as the rest of us?

 

More than three decades later it’s deja vu all over again, but this time fighting the dark are executives with the Tokyo Electric Power Co., operators of Japan’s runaway nuclear reactors at the Fukushima Daiichi nuclear power plant.

Tepco utility executives are alternately accused of covering-up, withholding information, or downplaying the severity of their nuclear accident.

Truth is, as many of us nuclear accident veterans know, those utility executives are as much in the dark as the rest of us.

Lessons from Three Mile Island in 1979 go a long way to explain what’s happening in 2011 in Japan.

In the years following the Three Mile Island accident much was learned about what the utility did, and did not know at the time of the 1979 reactor meltdown in Pennsylvania.

It became painfully obvious that the control room operators, the utility executives, and the government overseers of Three Mile Island simply did not know at the time what was happening inside their damaged nuclear reactor core.

Why they did not know is really the heart of the matter, and the thing we should consider.

In the event of a runaway nuclear reactor (politely called a “power excursion” by the industry), Tepco executives in Japan, like their counterparts in Pennsylvania, don’t have the foggiest idea what may happen when their reactors melt.

If you live within two hundred miles of a nuclear power plant, consider this: If the plant suffers a meltdown, no one on earth will be able to tell you what to expect.

Having spent decades looking into all this, I thought I might save those interested in researching the Fukushima nuclear disaster valuable time and trouble by providing a short list of the most important points I’ve learned about nuclear power accidents.

Decades of research can be boiled down to a few key observations or rules that until now I’ve kept in the back of my head.

I here offer my list as a time-saving primer to others:

Rule 1:

Commercial atomic energy technology is a pseudo-science and is not based on proper scientific experimentation.

As we recently witnessed during the multiple nuclear accidents at the Fukushima nuclear power plant, a damaged reactor (or reactors) often has broken controls, computer systems, and gauges that make monitoring a runaway nuclear reaction difficult, if not impossible.

Confusion and fright in the control room(s) at the time of emergency create what can almost be called a fog of war. Indeed, war it is: they’re at war with a runaway nuclear reactor.

At Fukushima, as on Three Mile Island, operators wished they could simply peer into the containment building with their own eyes and dispense with the broken alarms, computers and gauges that tell them nothing, and often mislead them.


‘The nuclear power industry naturally doesn’t think very much of troublesome nitwits like Galileo, Francis Bacon, René Descartes, Isaac Newton, and their ridiculous, old-fashioned ideas about experimentation, reproducible results, and scientific method.’


But that’s only a small part of the problem. Truth is, no one really understands the behavior of tons of melted nuclear fuel in a reactor.

For a variety of reasons, the commercial nuclear power industry and its government regulators never conducted a single experimental meltdown of a full-size nuclear reactor.

So, until one melts, no one knows how a runaway reactor will behave.

As most of us remember from high school, scientific knowledge has advanced over the centuries because of what’s called the Scientific Method.

The Oxford English Dictionary defines the Scientific Method as “a method of procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses.”

In simple words, real-world experiments must be designed to test a hypothesis, and results must be reproducible.

As we know, cars and planes are rigorously tested and crashed all the time, in all manner of ways, in all sorts of conditions. That’s how designers and regulators learn how these complicated machines behave in real-world accidents, and whether they’re safe.

Not so nuclear reactors. For a variety of reasons, including half a century of financial and political considerations, regulators in the United States side-stepped or outright ignored the issue of full-scale reactor safety testing, and continue to ignore it to this day.

This inescapable and troubling fact is entwined with the history of atomic power regulation in the United States. In brief, here’s the story, with footnotes and references for those who want to follow along at home:

After the war with Japan ended in 1945 with the dropping of atomic bombs on Hiroshima and Nagasaki, the US found itself the world’s sole possessor of the secrets of atomic energy.

To take these secrets from the hands of the military and deliver them to the civilian population, the United States Congress passed the Atomic Energy Act of 1946. This legislation forbade any entity but the US Government from creating atomic energy, and disallowed international cooperation involving any atomic secrets. To oversee the peacetime atom, the Atomic Energy Commission (AEC) was created, and Harry Truman appointed five commissioners. A statute of Congress created the Joint Committee on Atomic Energy on August 2, 1946. This joint committee would police the AEC, and authorize all appropriations to the commission. 1

The EBR-I experimental reactor in Idaho was the scene of both the first atomic generation of electric power and an early reactor meltdown.


History was made almost five years later. Four 200-watt light bulbs began to glow when 12 control rods were lifted away at the Experimental Breeder Reactor Number One (EBR-I) in Idaho Falls, Idaho. Sixteen technicians signed their names on a wall there, beneath this notation: “Electricity Was First Generated Here From Atomic Energy on December 20, 1951.” EBR-I seemed all the more remarkable because it was a breeder reactor and, it was said, could safely produce more fuel than it burned. 2

Mamie Eisenhower christened the Nautilus, the world’s first nuclear powered submarine, on January 21, 1954. The public loved it. Still, many Americans were anxious to give private industry an opportunity to split atoms. The Atomic Energy Commission was seen as an island of socialism in the sea of free enterprise.

Dwight Eisenhower signed the Atomic Energy Act of 1954 on August 30 of that year. The “Atoms for Peace” program was launched. Private enterprise could now exploit nuclear power, the AEC would begin to award contracts to businesses, and the poor nations of the world were promised atomic power. 3

The bubble burst in November of 1955. The tiny EBR-I reactor had been experiencing power fluctuations and, while trying to discover the cause of the problem, technicians attempted to bring the core to within a few degrees of melting temperature. At half power, fuel rods holding the Uranium-235 fuel began to bow inward, increasing the core’s reactivity. A “power excursion” occurred, and the reactor began to run away, its gauges climbing off scale. With a split second to spare, a technician commanded a “blanket” of U-238 bricks surrounding the fuel rods to drop away, stopping the power excursion.

An explosion was barely avoided, but the core, capable of producing 1.4 megawatts of heat output, had melted. 4


‘Lloyd’s of London would not write a policy protecting a nuclear power plant’


Insurance companies, which had been trying to assess the feasibility of insuring commercial reactors, were more squeamish than ever. Utilities considering building nuclear power stations discovered their investments could not be insured. Lloyd’s of London, known for taking risks on just about anything, would not write a policy protecting a nuclear power plant. Insurance companies throughout America began writing nuclear exclusion clauses into homeowners’ policies, preventing insurance payments for any nuclear-related loss. The entire insurance industry pooled together would provide no more than $65 million worth of coverage for a nuclear power plant. 5

Hoping to win the insurance industry’s confidence, the Joint Committee on Atomic Energy authorized the AEC and the Brookhaven National Laboratory to prepare a study on the effects of a major accident at a 100- to 200-megawatt electrical output reactor.

In March 1957, the study, entitled “Theoretical Possibilities and Consequences of Major Accidents in Large Nuclear Power Plants,” or WASH-740, was released. WASH-740 did not make the insurers rest easier. The Brookhaven laboratory estimated that in the event of a worst possible accident, 3,400 people would die, 43,000 would be injured, and seven billion dollars’ worth of damage would be done. Commercial nuclear power production was now at a standstill. 6

Because private insurance wasn’t enough, the utilities now settled for a bit of socialism. Senator Clinton Anderson and Congressman Melvin Price introduced legislation that provided for $495 million worth of government coverage — an arbitrarily arrived-at amount — in addition to the $65 million private insurance pool. The Price-Anderson Amendment to the 1954 Atomic Energy Act became law in September 1957. The last hurdle apparently out of the way, private industry was, again, off and running to create fission energy. 7

In Pennsylvania, Metropolitan Edison and its fellow utilities of the General Public Utilities Corporation, along with the Pennsylvania State University and Rutgers University, created the Saxton Nuclear Experimental Corporation. The AEC approved a construction permit for a 20-megawatt thermally rated reactor in Saxton, Pennsylvania, in 1959. 8

The SL-1 experimental reactor being lifted from its containment building following its deadly 1961 accident.


But tragedy visited another experimental reactor on January 3, 1961. At about nine in the evening, three technicians were performing a maintenance operation on the SL-1 reactor in Idaho Falls, Idaho. The SL-1 was one of 17 test reactors scattered across 892 square miles of Idaho desert at the AEC’s National Reactor Testing Station. The tiny SL-1 was meant to produce electricity for about a dozen homes in arctic military bases. For some time the reactor’s nine control rods had been acting up, as had other reactor functions.

The SL-1 had been shut down for about a week in expectation of major repair work, its control rods pushed firmly down and disconnected from the mechanical control rod drive. The number nine control rod was the most important. It was the only rod that could start the chain reaction when lifted away. To ensure that the cadmium control rods would not stick or jam, technicians had been “exercising” them, lifting them a few inches, then returning them. That night three technicians were standing on top of the reactor, reconnecting the control rods to the mechanical drive. The number nine control rod had to be lifted four inches by hand to be connected to the machinery.

During this operation the rod was lifted too far. In a fraction of a second the reactor became critical, a power excursion followed, and an estimated 1,500,000,000,000,000,000 atoms split.
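For a rough sense of scale, here is my own back-of-the-envelope arithmetic, not a figure from the accident reports: assuming the commonly cited value of roughly 200 MeV liberated per U-235 fission, that estimated fission count implies an energy release on the order of tens of megajoules, delivered in a fraction of a second.

```python
# Illustrative estimate only (not from the SL-1 reports): energy implied
# by ~1.5e18 fissions, assuming the textbook figure of ~200 MeV per
# U-235 fission.
FISSIONS = 1.5e18
MEV_PER_FISSION = 200            # typical U-235 value; an assumption here
JOULES_PER_MEV = 1.602e-13       # unit conversion

energy_j = FISSIONS * MEV_PER_FISSION * JOULES_PER_MEV
print(f"energy released: ~{energy_j / 1e6:.0f} MJ")
print(f"equivalent to: ~{energy_j / 3.6e6:.0f} kWh")
```

Tens of megajoules in milliseconds: enough, as the account that follows shows, to turn reactor hardware into lethal projectiles.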

By the time help arrived, one man was found dead. A second technician was rushed outside, but was so irradiated that he had to be examined by a doctor wearing protective clothing. The second man quickly died. The third technician was found dead on the ceiling of the reactor building. A piece of control rod was jammed through his groin, pinning his corpse to the ceiling at the shoulder.

For twenty days, the bodies were packed in water, alcohol and ice, while scientists tried to cleanse the dead tissues of uranium. Finally the men were buried, but their heads and hands had to be removed and buried with other nuclear wastes. 9


‘A third technician was found on the ceiling of the reactor building. A piece of control rod was jammed through his groin, pinning his corpse to the ceiling at the shoulder.’


The Atomic Energy Commission reached another crossroads in 1964, when construction permits for the first big, pressurized and boiling water reactors were granted. A utility could make an appreciable profit on its investment when smaller reactor designs were made larger, taking advantage of economies of scale. Pressurized water reactors rated at thousands of megawatts of heat output would soon be operating.

To estimate the damage of a serious accident at a large commercial reactor, the Joint Committee on Atomic Energy authorized the AEC and the Brookhaven laboratory to update the 1957 WASH-740. The results were shocking.

Instead of 3,400 deaths, there would be 27,000; instead of 43,000 injuries, there would be 73,000; instead of $7 billion worth of damage, a “worst possible accident” at a new pressurized or boiling water reactor would cause $17 billion in damages. To make matters much worse, Brookhaven statisticians determined that an evacuation would make no appreciable difference in the number of people killed.

The study indicated that a landmass the size of the Commonwealth of Pennsylvania could be rendered uninhabitable; that is, if the reactor were to be built, say, in central Pennsylvania.

Fearing this updated WASH-740 report would create an outcry at a very sensitive time, the AEC withheld this report from the public. 10

A draft of the updated WASH-740 report would not be released until June 1973, after both the Three Mile Island and Fukushima nuclear power plants were designed, considered for licensing, or built.

About the same time the WASH-740 update was being prepared, an “internal report” of the National Reactor Testing Station was also being drafted. This report called for a six-year minimum, intensive testing program to be conducted with the large reactors.

The NRTS report recommended that full-scale destructive testing be included in these reactor tests. The 1964-65 report was not released to the public until 1974; its findings too were ignored by the AEC.

The BORAX-1 experimental reactor seen undergoing a power excursion. When it finally blew up, scientists pointed out ‘uncertainties of extrapolation.’


Power excursion testing previously had been conducted inside tiny reactors. The BORAX-1 test reactor was only 1/500th the size of the larger, commercial reactors approved after 1964. By pulling the control rods of the BORAX-1, power excursions were created, and water was vigorously expelled from the coolant system, causing the reactor to shut down. But when an excursion test designed to melt the core was conducted in 1954, a “somewhat unexpected” steam explosion occurred, destroying the reactor and tossing a one-ton piece of equipment 30 feet into the air. 11

The Argonne National Laboratory reports that BORAX-1 “was deliberately destroyed in July 1954. Fuel plate fragments were scattered for a distance of 200-300 feet… The final test revealed that the predictions of total energy and fuel plate temperatures had been considerably too low. Instead of the melting of a few fuel plates, the test melted a major fraction of the entire core. The discrepancy was attributed to the uncertainties of extrapolation. The results of this energy liberation in the way of peak pressures and explosive violence lie in a region where there had been no previous experimental data.”

In other words, you can’t predict how a big reactor may behave from experiments conducted with much less fuel in a smaller reactor.


‘The National Reactor Testing Station report recommended that full-scale destructive testing be included in reactor tests.

The 1964-65 report was not released to the public until 1974; its findings were ignored by the AEC.’


Additional power excursion tests were conducted in the early 1960s on the Special Power Excursion Reactor Test, or SPERT-1 test reactor. In his book Nuclear Power: Both Sides, physicist Michio Kaku writes, “In some of the experiments we ran on the SPERT reactor we deliberately withdrew the control rods rapidly from the core. Without the control rods to absorb and regulate the neutrons from the fission process, the chain reaction would spin quickly out of control, and power levels would rise from zero to 30,000 megawatts (30 billion watts) in less than one-hundredth of a second. The cooling water would boil furiously, causing a steam explosion. On one occasion in 1962 I had the dubious distinction of deliberately blowing up the SPERT-1 reactor.” 12

The AEC officially reported that the SPERT-1’s core failed to explode during the “severest test that could be performed,” but the AEC did not mention that the SPERT-1 had faulty fuel rods, which terminated the power excursion by expelling fuel powder and coolant. No SPERT-1 power excursion test was then conducted with corrected fuel rods.

Both the BORAX-1 and the SPERT-1 test reactors, moreover, had several design differences from the larger, commercial reactors. When test reactors were built with a similar design to the larger reactors, power excursion experiments that could damage the fuel were deliberately avoided.

Instead, the AEC relied on calculations. “Design basis accidents” and “worst possible accidents” were computed, but never verified by proper scientific experimentation.

The AEC assumed sophisticated, though unverified, reactor theory to be fact. One reason for relying on these unproven calculations was that it was much less expensive over the short run to do so than to destroy a commercial-size reactor, which could cost hundreds of millions of dollars, if not more.

Another reason for this unorthodox “un-scientific method” was that power excursion testing with reactors containing 100 tons or more of uranium could have serious environmental consequences.

So the nuclear industry continued to bank on the unproven hypothesis that a large, commercial reactor could be operated with little or no danger of explosion. 13

This deliberate blunder was one of the great scientific errors of twentieth century technology.

In contrast, Albert Einstein’s theories of relativity, the foundation of modern atomic science, continue to be subjected to painstaking experimentation. 14

Still, a good bit of the scientific laziness, lack of curiosity, and outright intellectual dishonesty of the nuclear regulators simply was a ruse to protect the finances of the nuclear industry.

As we see, the illusion of reactor safety and nuclear finances go hand-in-hand. Real-world experimental data which undermines the perceived safety of nuclear power plants is a threat to the insurability, and thus the financial viability, of the power plants. So over the decades real-world experiments that would impeach the safety of nuclear plants simply were never performed, were suppressed, or were played down by nuclear regulators.

Some blamed the problem on the mission and culture of the Atomic Energy Commission to both regulate and promote atomic energy.

Supposedly addressing this problem, Congress passed the Energy Reorganization Act of 1974, which abolished the Atomic Energy Commission. The AEC was replaced by the Nuclear Regulatory Commission (NRC), and the Energy Research and Development Administration (ERDA). The NRC now would supposedly only regulate, while ERDA would promote nuclear energy, especially reactor development.

Yet, in the decades ahead, the NRC would continue to avoid full-scale experimental reactor meltdown tests in favor of costly computer models, fantasy reports, and ivory-tower academic studies. There would be Bull Shit, More Shit, and shit Piled Higher and Deeper (in scientific and academic parlance, BS, MS, and PhD).

Nuclear reactor safety study became a colossal thought experiment. Reactor safety would exist only in the minds of its creators, and not in the real world, supported by reliable, controlled, and reproducible scientific data.

Heaven and earth: Inside Fukushima’s Unit 2 control room in late March 2011, where events dismissed by the nuclear industry as ‘highly unlikely’ are an everyday real-world nightmare for struggling operators and citizens.


Over the decades (and to this day) the NRC and the nuclear industry continued to cook up their own imaginary projections, involving narrowly defined “likely scenarios” and “analyses” devised by industry cheerleaders wearing tin-foil hats. These fairy tales are then supposedly bolstered with equally imaginary computer models.

In the 1970s, the NRC commissioned, for example, the infamous Rasmussen Report, or WASH-1400, as a follow-up to the discredited and suppressed WASH-740 reports.

The Rasmussen Report, also called “The Reactor Safety Study,” was soon also widely discredited within the scientific community. A subsequent review commissioned by the NRC and conducted by Professor Harold Lewis of the University of California concluded that “the uncertainties in WASH-1400’s estimates of the probabilities of severe accidents were, in general, greatly understated.”

This led to other imaginary and sugar-coated Candyland reactor safety “studies,” including 1982’s CRAC-II, and 1991’s NUREG-1150.

“CRAC-II is both a computer code (titled Calculation of Reactor Accident Consequences) and the 1982 report of the simulation results performed by Sandia National Laboratories for the Nuclear Regulatory Commission. The report is sometimes referred to as the CRAC-II report because it is the computer program used in the calculations,” Wikipedia relates.

“The CRAC-II simulations calculated the possible consequences of a worst-case accident under worst-case conditions (a so-called “class-9 accident”) for a number of different U.S. nuclear power plants. In the Sandia Siting Study, the Indian Point (NY) Energy Center was calculated to have the largest possible consequences for an SST1 (spectrum of source terms) release, with estimated maximum possible casualty numbers of around 50,000 deaths, 150,000 injuries, and property damage of $274 Billion to $314 Billion (based on figures at the time of the report in 1982)…. CRAC-II has been declared to be obsolete and will be replaced by the State-of-the-Art Reactor Consequence Analyses study.”

The NRC itself would later discredit and issue a disclaimer of both the CRAC and NUREG “studies.” The NRC disclaimer of CRAC-II and NUREG-1150 reads as follows:

“The U.S. Nuclear Regulatory Commission has devoted considerable research resources, both in the past and currently, to evaluating accidents and the possible public consequences of severe reactor accidents. The NRC’s most recent studies have confirmed that early research into the topic led to extremely conservative consequence analyses that generate invalid results for attempting to quantify the possible effects of very unlikely severe accidents. In particular, these previous studies did not reflect current plant design, operation, accident management strategies or security enhancements. They often used unnecessarily conservative estimates or assumptions concerning possible damage to the reactor core, the possible radioactive contamination that could be released, and possible failures of the reactor vessel and containment buildings. These previous studies also failed to realistically model the effect of emergency preparedness. The NRC staff is currently pursuing a new, state-of-the-art assessment of possible severe accidents and their consequences.”

In other words, after spending tens of millions of dollars in wasted resources to produce sham results, the NRC bureaucracy naturally resolved to spend tens of millions of more dollars to produce even more imaginary and far-fetched sham results. How reliable are these computer models?

In a timely article in the March 28, 2011, New York Times, John Broder, Matthew Wald and Tom Zeller point out, “American nuclear safety regulators, using a complex mathematical technique, determined that the simultaneous failure of both emergency shutdown systems to prevent a core meltdown was so unlikely that it would happen once every 17,000 years. It happened twice in four days at a pair of nuclear reactors in southern New Jersey.”

One imagines such a computer model in 2005 also setting the odds as “slim to none” of a black politician with a middle name of “Hussein” being elected president of the United States. The point is, the history of the world is filled with long shots with slim chances of overturning established norms. That in fact is what history is all about.
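The arithmetic behind a “once every 17,000 years” claim is worth spelling out. A minimal sketch, assuming independent reactors and a simple Poisson model; the fleet size and operating horizon below are round illustrative numbers of my own choosing, not figures from the article:

```python
import math

# Per-reactor failure estimate quoted by regulators in the NYT article above.
rate_per_reactor_year = 1 / 17_000

# Round illustrative assumptions (not from the article):
reactors = 100    # roughly the size of the U.S. commercial fleet
years = 40        # a typical operating license term

# With many reactors running for decades, the expected number of
# "once every 17,000 years" events is no longer negligible.
expected_events = rate_per_reactor_year * reactors * years
p_at_least_one = 1 - math.exp(-expected_events)   # Poisson: P(N >= 1)

print(f"expected events over the period: {expected_events:.2f}")
print(f"chance of at least one: {p_at_least_one:.0%}")
```

Even taking the per-reactor estimate at face value, a fleet operating for decades gives roughly a one-in-five chance of seeing the “impossible” event at least once, and that is before counting the correlated failures (shared designs, shared weather, shared complacency) that make the independence assumption generous.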

The NRC’s ‘State-Of-The-Art Reactor Consequence Analyses,’ or SOARCA, doesn’t even consider the consequences of accidents involving spent nuclear fuel pools, like those presumed to be now burning in Fukushima.


As mentioned, the NRC’s current search for a “state-of-the-art” study is called, appropriately enough, “State-Of-The-Art Reactor Consequence Analyses,” or SOARCA. (Not to be confused, gentle reader, with SCROTUM, which, in nuclear parlance, refers to the biological equipment by which operators are held by runaway reactors.)

NRC’s SOARCA website proclaims, “The project uses computer models and simulation tools to conduct in-depth analysis of two operating nuclear power plants, a boiling-water reactor and a pressurized-water reactor,” the types found in Fukushima and on Three Mile Island, respectively.

The SOARCA study further claims to consider “the highly unlikely event of a severe reactor accident.”

But, as Hamlet tells Horatio, “There are more things in heaven and earth than are dreamt of in your philosophy.”

SOARCA, it should go without saying, does not contemplate actual severe, real-world environmental catastrophes like the 9.0 earthquake and tsunami which unexpectedly destroyed multiple reactors and spent fuel pools at Fukushima, or myriad other events which the NRC considers “highly unlikely.”

The NRC’s SOARCA website further explains that the study does not take into account such events as “terrorist acts.” Nor, it goes without saying, does SOARCA consider what happens in the event of war, when one or more of the world’s 400-plus atomic reactors is damaged by combatants, leaving undisciplined Third World operators struggling to control a runaway reactor(s) and spent fuel pools.

Moreover, the SOARCA “study” doesn’t even consider the consequences of accidents involving spent nuclear fuel rod pools, like those now burning in Fukushima.

The NRC’s SOARCA FAQ page states:

Are accidents at spent fuel pools considered in this study?

No. The project focuses on evaluating the very unlikely severe accident scenarios that may occur at operating power reactors and, as such, it does not consider spent fuel pools.

Of course, on the real planet earth, and not the fantasy Game Boy simulations of the nuclear industry, if you are unlucky enough to work as a nuclear control room operator when a fire breaks out in one or more spent fuel pools, as it did in Fukushima, spewing highly radioactive smoke and throwing explosive debris several hundred feet into the air, thus preventing you from controlling your already damaged nuclear reactor(s), you’ve got a problem on your hands not considered by SOARCA. Then again, in the “highly unlikely” event that your reactor(s) blow up, spewing highly radioactive steam and throwing explosive debris several hundred feet into the air, thus preventing you from putting out a fire in your spent fuel pool(s), you’ve got an altogether different “highly unlikely” event(s) on your hands, Pilgrim.

What, me worry? NRC inspectors reported that ‘At times during various shifts, in particular the 11:00 pm to 7:00 am shift, one or more of the Peach Bottom (Pennsylvania) operations control room staff (including licensed operators, senior licensed operators and shift supervision) have for at least the past few months periodically slept or have been otherwise inattentive to licensed duties.’


That’s when, as we see in Fukushima, your SCROTUM is in serious danger, and, like those eminently professional and enlightened nuclear workers seen scaling the fence to escape Fukushima, you better, in nuclear terminology, SCRAM the reactor(s).

If, however, you cannot SCRAM fast enough, you should then consider the time-honored emergency inventory and communications procedure known in nuclear circles as KYSAG, or Kiss Your Sweet Ass Goodbye.

(I realize these terms are complicated and technical to the lay reader, but obtuse technical jargon is important to the nuclear industry.)

Or, if you’d prefer, like the supremely calm, collected, and laid-back control room operators at various American nuclear power plants, you can avoid much of the unnecessary stress of these “highly unlikely” events by simply going to sleep in the control room every night.

One man’s nuclear nightmare, after all, is just another man’s sweet dream, baby.

Which brings us to the next rule.

Rule 2:

Commercial atomic energy is based on voodoo economics.

With the vexing realities of nuclear industry finances, insurance, and what to do with thousands of tons of highly radioactive spent fuel rods, atomic reactor pseudo-science merrily intersects with the voodoo economics of the nuclear industry.

Because spent nuclear fuel must be safely stored for tens of thousands of years, no one can agree on where to put it or how to pay for the storage, and so spent reactor fuel piles up at nuclear power plants in the U.S. and around the world.

The NRC and the nuclear industry wisely choose to simply ignore this nettlesome problem. Hey, if you can’t solve it, why talk about it?

 

Call it “highly unlikely,” and move on.

Also in the category of nuclear voodoo economics are the shrewd nuclear industry investors who wisely refuse to finance or insure new nuke plants themselves, and instead insist that taxpayers pick up the tab. President Barack Obama, in fact, has promised the nuclear industry $36 billion for this very purpose in 2011.

These nuclear industry subsidies have been harshly criticized for decades. The bottom line is this: if it came down to risking their own money, nuclear investors would have nothing to do with nuclear reactor technology.

At Three Mile Island Unit One’s licensing hearing way back on November 7, 1973, for instance, Pennsylvania Insurance Commissioner Herbert Denenberg testified about the $560 million ceiling on insurance payments mandated by the Price-Anderson Act.

“The plant owners will undoubtedly deny that this capping of benefits and liability represents any real material value to them, or conversely, any real cost to the public,” he said.

“They will point proudly to the fact that no member of the public — as opposed to workers in or associated with the activity of the industry — has been killed, and no catastrophic accidents have occurred, in 17 years of experience with nuclear reactors.

“And they will assert that on the basis of this safety record and their continuing zeal to make reactors uncommonly safe, the public would be foolish to worry about the financial consequences of an accident costing more than $560 million or, for that matter, any major accident at all.


‘If pressed, they will admit that a catastrophic accident is both conceivable and possible.

It will be the general public who must bear the cost.’


“All these arguments by the utilities are irrelevant, of course. The utilities do not take their own assurances about safety seriously enough to place their corporate necks on the line by renouncing their exemption from liability for a catastrophic accident, and in fact, they insist on the continuance of this exemption as a condition of their operating nuclear plants.

“If pressed, they will admit that a catastrophic accident is both conceivable and possible. And if such an accident occurs, the fact is that it will be the general public — and not the utilities and the reactor manufacturers — who must bear the cost.”

So let’s all learn a valuable lesson from the shrewd nuclear investor, and let’s be realistic here: endangering millions of lives; permanently polluting hundreds of square miles with uranium fission by-products; squandering billions of dollars of good money after bad: honestly, what else is government for?

These shrewd investors know that the true life-cycle costs of nuclear plants make them economically unviable.

Which brings us to Rule 3.

Rule 3:         

Be thankful the nuclear power industry is doing its level best to destroy the nuclear power industry. These guys are pros at it.

If nuclear industry executives are not scientists, and if they are not economists, what exactly are they?

Would I lie to you, sugar?


They are public relations and lobbying professionals, bullshit artists and bologna merchants, and, thankfully, highly incompetent ones at that.

Rest assured, the nuclear power industry is doing its level best to destroy the commercial nuclear power industry, and nobody does this better than they do.

Over the decades, the nuclear power industry has built a proven track record for ceaselessly working to destroy itself, without the help of a single anti-nuclear activist.

Rule 4:        

You are the experiment: In the event of a nuclear meltdown, use the opportunity to point out that this catastrophe once again proves the inherent safety of atomic energy.

As I’ve previously noted, the nuclear power industry naturally doesn’t think very much of troublesome nitwits like Galileo, Francis Bacon, René Descartes, Isaac Newton, and their ridiculous, old-fashioned ideas about experimentation, reproducible results, and scientific method.

Which is not to say that scientific data from real-world, full-scale nuclear meltdowns are not being collected.

Mountains of data — some useful, much of it not — have been, and will continue to be, amassed from the nuclear accidents at Three Mile Island, Chernobyl and, now, Fukushima.

The Three Mile Island Experiment: graphic of Unit 2 reactor core damage.


Some five years after the meltdown on Three Mile Island, the damaged Unit 2 reactor was finally cool and clean enough to be popped open, like a festive foie gras in a dead Christmas goose.

Giddy industry representatives got to peer inside, like kids who can’t wait for Christmas, and who wonder what Santa brought.

Much to the surprise and delight of the nuclear industry, half the 150-ton core at Three Mile Island was found to have melted before solidifying into radioactive rubble at the bottom of the reactor vessel.

And you probably thought that 150 tons of 5,000-degree F. molten uranium might melt through the stainless steel reactor vessel, burn through the concrete floor of the containment building, and give someone a hot foot on the other side of the planet, didn't you? (This does, however, raise the question of whether, in China, the uninformed talk about The Pittsburgh Syndrome.)

The well-paid nuclear industry spin doctors wasted no time, of course, pointing out that this embarrassing melted pile of rubble inside TMI’s Unit 2 reactor was “proof” that nuclear plants are safe.

The scientific problem with using data from these real-world accidents — aside from the ethical problem of using uninformed humans in their homes as guinea pigs — is that these “results” are irreproducible, and therefore unscientific.

We’ll never know, for example, precisely how much coolant water was dumped on the damaged Fukushima reactors and spent fuel pools, and in what controlled circumstances, before and after the terrified reactor operators ran for their lives, and tried to jump over the fence, and so on.

In other words, more bad science.

Perhaps we can one day prove conclusively that large, commercial nuclear reactors will not melt down, but merely fizzle and pop for an extended period of time, as did Unit 2 at Three Mile Island. Nevertheless, this is not the sort of knowledge we should acquire from experiments conducted with innocent victims in their backyards.

Speaking of ignorant fools, we now come to Rule 5.

Rule 5:

They’re building a better model fool every year.

The ancient Greeks had a single word for all this. It’s a word for what they believed was the greatest of all human follies: hubris.

Hubris, as we use the word today, implies mere arrogance or pride. But to ancient Greeks, hubris was a legal term and, some say, the greatest single crime one could commit in the ancient Greek world, not unlike our own treason or, in religious societies, blasphemy.

In Greek tragedy, a protagonist who acted with hubris foolishly ignored human limitations and challenged the gods and their rules, inviting ruin and retribution at the hands of vengeful gods like Nemesis.

Agamemnon, for one example, was tempted to ruin by the suggestion that he walk on a divine tapestry.

In other words, as the ancients and Charlie Murphy warn us, keep your dirty feet off God’s white leather sofa, unless you want to get your ass kicked.

That it’s sinkable is unthinkable: Like the White Star Line’s Titanic, the Zeppelin company’s promotions prominently boasted that no passenger had ever been injured on one of their airships.


The oceans and junk yards of the world are littered with Titanics, Hindenburgs, Unit 2 reactors, and the scrap of other infallible machines that their creators boasted could not sink, melt, fall from the sky, or otherwise fail.

To get around this historical fact, nuclear engineers are fond of saying that their machines, in fact, are perfect: it’s the human element, the foolish human operator, they’ll tell you, that’s at fault.

The nuclear industry today boasts that it can, in fact, without any proper scientific experimentation at all, produce a fool-proof machine!

Trouble is, those fools are so damned crafty.

And, as one nuclear regulator worrisomely intimated to me recently, “They’re building a better model fool every year.”

Whether the nuclear industry can successfully build a better fool-proof machine to keep up with this year’s better model fool is any fool’s guess.

Fools have been around a long time, and I’m betting on the fool. Hell, in the United States of America fools control not just one, but two political parties, both houses of Congress, and the judiciary.

So let’s be brutally realistic here. You can’t underestimate the fool.

Even the smart money’s betting on the fool. Why do you think nuclear investors don’t want to risk their own damn money? They’re not fools.

I defer to that celebrated nuclear combat veteran, philosopher, action figure, and low-fat hamburger grill marketer, Mr. T:

I pity the fool.

But it would be foolish of us to blame everything on the fool in the nuclear control room.

Contrary to nuclear industry spin, foolish control room operators were not at fault for the Three Mile Island meltdown. Foolish regulators cooperated with foolish utility executives to operate a foolishly complex, leaking nuclear reactor with faulty components and miscalibrated controls that badly confused the already foolish control room operators.

Which brings us to Rule 6.

Rule 6:

People don’t like or understand atomic energy:
E=MC2 is not a recipe for comfort food

More than 30 years later, my thoughts keep returning to the nuclear worker from Three Mile Island whose hands I watched shake uncontrollably on the morning of the meltdown.

Make no mistake, those control operators were scared. But the nuclear worker I watched that day wasn’t scared for the future of the atomic power industry, his job, or even for his life, as far as I could see.

He trembled with the instinctive fear of having encountered an unknown monster, in an unknown country. His was the fear of the Lilliputian running for his life when Gulliver finally wakes up.

D’oh! Fear of over-sized unknown monsters is the oldest story of the western world. It’s Homer, not just Homer Simpson.


It’s Ray Harryhausen’s 7th Voyage of Sinbad meeting the Cyclops. It’s the oldest story of the western world: The Iliad and The Odyssey. It’s Homer, not just Homer Simpson.

These operators were scared, as people always are, by the unknown, and the unpredictability of the unknown they don’t control.

E=mc2, contrary to popular belief, is not a free lunch. It’s a conversion formula, describing the equivalence of energy to mass, and the resulting enormous energies released from the interaction of very small, invisible particles. Enormous also, in commensurate scale, are the consequences, and our responsibilities.

It’s hard for human beings to grasp Einstein’s dreadful formula on any human scale.

Some nuclear industry proponents foolishly compare atomic energy to garden variety chemical reactions, like fire.

But we humans evolved with fire. The taming and handling of fire, it’s believed, helped to make us human. The use of fire, we’re told, began long ago with our evolutionary ancestors, before we humans even emerged as a species.

Writing in Science magazine in 2009, Professor David Bowman and his collaborators tell us, “The spread of highly flammable savannas, where hominids originated, likely contributed to their eventual mastery of fire. The hominid fossil record suggests that cooked food may have appeared as early as 1.9 (million years ago), although reliable evidence for controlled fire use does not appear in the archaeological record until after 400,000 years ago.”

Think about it. Our use and understanding of fire sets us apart from every other animal on the planet. Every other species on earth naturally fears fire. In a forest fire, animals instinctively run or burrow for their lives. We, on the other hand, jump into forest fires from airplanes.

Imagine the horrible cries of our hairy ape ancestors swinging in the trees when the first one of us picked up a burning stick, and felt its warmth, and watched it burn, and brought it home.

Of course, some of our ancestors burned themselves to a crisp playing with fire, as we still do. As the authors of the above paper caution, “the evolution of adaptations to fire remains a difficult topic to explore because traits that increase the rate of occurrence of fire, or of recovery following burning, are not unambiguously the result of natural selection.”

In other words, I suppose, burning yourself and your home to a crisp may decrease your chance of finding a soul mate.

Still, even to this day, what more could one want for one’s man cave than fire, flame-broiled meat, fire-brewed beer, and a fiery, large-screen tv?

Can the same ever be comfortably said for nuclear fission? Will splitting atoms ever match the gentlemanly art of grilling meat or shooting defenseless animals with a fire stick? I sincerely doubt it.

That’s not to say that some of us haven’t tried to jump the evolutionary gulf by constructing our very own backyard nuclear reactor.

Columbus of the Atom: Dave Hahn, The Radioactive Boy Scout, in police mugshot.


Let us not forget that modern-day Columbus of the Atom, Dave Hahn, of suburban Detroit, Michigan, better known as the Radioactive Boy Scout. In the early 1990s Mr. Hahn famously sought an Eagle Scout badge by building an atomic breeder reactor from tin foil and salvaged radium paint in his mom's backyard garden shed.

Mr. Hahn’s misadventure reads like the American nuclear industry’s answer to Chairman Mao’s Great Leap Forward.

Mr. Hahn, posing as a high school science teacher, phoned up the nuclear industry and the NRC, who were only too glad to offer him invaluable advice on achieving an atomic chain reaction in his own backyard. (Refer again to Rule 5: A better model fool, and Rule 3: The nuclear industry needs no help taking care of its own fools.)

Mr. Hahn’s homemade backyard nuclear reactor indeed started to heat up, and soon badly radiated his neighborhood. His face was left permanently pocked with radiation burns.

In the end, Dave Hahn was forced to tear down his backyard nuclear reactor before it went critical, lest he create His Own Private Fukushima. Unfortunately for the evolutionary progress of mankind, the U.S. Environmental Protection Agency was neither very amused nor supportive, and designated Mr. Hahn’s mom’s backyard a Superfund Cleanup Site.

The point is, Mr. Hahn's experiments notwithstanding, we've had millennia and more to understand and adjust to fire. Our natural affinity for quotidian chemical reactions like fire has been hard-wired into us by hundreds of thousands of years of evolution.

Not so nuclear energy. Nuclear reactions are invisible to ordinary human observation, and inhabit a counterintuitive realm beyond our sense of time and our other natural senses.

Splitting atoms will always be the work of a stranger in a strange land. Our best nuclear physicists understand this, and even use the language of explorers and mystics to announce their mysterious doings.

Enrico Fermi sustained the first atomic chain reaction in 1942. To announce his successful criticality experiment (conducted with Fermi’s trademark meticulous scientific procedure, by the way) one of Fermi’s lieutenants sent a coded message to the chairman of the U.S. National Defense Research Committee:

“The Italian navigator has landed in the New World.”

“How were the natives?” Fermi’s man was asked.

“Very friendly,” came the reply.

We now know that “the natives” simply were pretending to be friendly. In reality, the unstable uranium atoms and their by-products were killing Enrico Fermi.

Fermi died at age 53 of stomach cancer. He developed cancer from radiation poisoning while constructing his large “pile” reactor built from heavy graphite bricks and uranium beneath Stagg Field, the football stadium at the University of Chicago. Several of his assistants would also die of cancer.

Which brings us to Rules 7 and 8:

 

Rule 7:

There are no ‘safe’ levels of radiation.

The best current thinking about the risks of radiation exposure is expressed by what's called the linear no-threshold model, championed decades ago by the late Dr. John Gofman, and later endorsed by groups as varied as the National Academy of Sciences and the United Nations Scientific Committee on the Effects of Atomic Radiation, the latter of which reports:

“the Committee believes that an increase in the risk of tumour induction proportionate to the radiation dose is consistent with developing knowledge and that it remains, accordingly, the most scientifically defensible approximation of low-dose response.”

In simple words, no amount of radiation is good for you. This includes natural background radiation.

This makes lots of intuitive sense. We now realize, for example, that tumors and melanomas can be produced from too much exposure to sunshine, and that a breakdown in the earth’s ozone layer can increase this risk.

So the idea that additional man-made radiation is safe is scientifically unsupportable.

So forget about that favorite ploy of the nuclear industry, comparing doses from nuclear meltdowns to dental or chest x-rays, or MRIs. None of it’s good for you.
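The linear no-threshold model is simple enough to sketch in a few lines of code. This is an illustration of the model's shape, not a dosimetry tool; the risk coefficient below is an assumption picked for round numbers, not an authoritative value.

```python
# Linear no-threshold (LNT) sketch: excess cancer risk is taken to be
# strictly proportional to cumulative dose, with no "safe" threshold.

RISK_PER_SIEVERT = 0.05  # illustrative coefficient (an assumption, not a standard)

def excess_risk(dose_sv: float) -> float:
    """Excess lifetime cancer risk under the LNT model: risk = coefficient * dose."""
    return RISK_PER_SIEVERT * dose_sv

# Under LNT there is no dose whose risk is zero. A chest x-ray
# (~0.0001 Sv) and a year of background radiation (~0.003 Sv) both
# carry small but nonzero risk, which is the point of Rule 7.
print(excess_risk(0.0001))  # small, but not zero
print(excess_risk(0.003))   # larger, still strictly linear in dose
```

The industry's favorite comparison ("less than a chest x-ray") is thus consistent with the model only in the sense that a small dose carries a small risk; under a no-threshold model the risk never reaches zero.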

Take, for another example, the lessons learned from Rule 8:

 

Rule 8:

Theoretical physicists live to a ripe old age, experimental physicists die of radiation poisoning. Ergo, stay away from nuclear accidents.

Albert Einstein checks for coated tongue: Hysteria = e = mc2


Students of history and nuclear physics know that theoretical physicists like Albert Einstein and Stephen Hawking, who work with mathematical calculations and who seldom venture near radioactive isotopes, live to ripe old ages.

Experimental physicists, like Marie Curie and Enrico Fermi, on the other hand, who work with the isotopes, have a tendency to die of radiation poisoning and cancer.

The same applies to journalists and landscape oil painters.

Therefore, Sanjay and Anderson, resist the urge to visit the vicinity of a nuclear power plant meltdown. Take it from me: you may get a by-line and a nice story exposing the apparent lies and confusion of the nuclear industry, but you’ll spend years worrying that you may have caused your body real harm.

Is a by-line, a story, or a book worth the risk? No.

 

The bottom line:
What can we predict from the Fukushima Experiment?

Less than three years after Enrico Fermi succeeded in building a nuclear reactor, physicists working on the first atomic bomb detonation in Alamogordo, New Mexico, on July 16, 1945, placed wagers among themselves about whether the first nuclear explosion, aptly code-named Trinity, might ignite the earth’s atmosphere or otherwise destroy our planet.

Gambling for their clothes and risking a lethal dose: Alamogordo A-bomb test.


J. Robert Oppenheimer, witnessing the awesome horror we mortals brought in the desert that night, famously quoted the ancient Bhagavad-Gita: “Now, I am become Death, the destroyer of worlds.”

And so we humans dare play with the fire of stars, and attempt to calculate inscrutable quantum probabilities, while the great mass of us can't comprehend the simple 2 + 2 addition of balancing a household budget, or a national one.

The nuclear accident in Fukushima, Japan, permitted me to revisit and re-examine the wild rollercoaster ride of emotions and perceptions I experienced during my own hometown's nuclear disaster in 1979. I was able to see that my responses and impressions of a nuclear meltdown are natural and universal, not held by me alone, or only by other immediate victims.

Some of the similarities between the two nuclear accidents are obvious: the utility executives who seem clueless about what's going on inside the reactor and who seem unable to provide reliable information to the public or to speak truthfully about it; the government officials who seem equally clueless about what's going on in the reactor and who send equally mixed signals; and the spectrum of equally posturing talking heads in the media who alternately predict Armageddon, and then offer the incident as proof that nuclear energy is safe and friendly.

As we see with the ongoing Fukushima incident, a nuclear accident causes the whole planet to go wild with hysteria, not unlike the way our ancestors must've screeched from the trees when one of us first stepped up to a burning stick to stare curiously and wonder at the warmth of its blaze.

It seems to me that all humanity is in the same uneasy predicament I found myself contemplating on the morning of the Three Mile Island accident, when I had to decide in a split second whether to run, or to turn back to face an unknown monster. In so turning, I suppose, we not only confront our feeble humanity, we’re charting our destiny by the stars.

We have no choice but to turn and plant our foot firmly in the path of the horrible thing, and resolve to carefully try to understand it, and truthfully try to explain it to others.

That’s what made us, and makes us, human beings.

A simple uneasy truth remains: when a nuclear reactor melts, we find ourselves in the same unknown country of Fermi, Oppenheimer, and their associates, and the horrified control room operators at Three Mile Island, Chernobyl, and the Fukushima Daiichi nuclear power plants.

There is one haunting fact that is as accurate today as it was on July 16, 1945, when scientists lay in the sand of Alamogordo, New Mexico, protecting their eyes, awaiting the results of the first nuclear bomb test.

No one knows what will happen.

 

 

 

 

 

 

 

Bill Keisling is the author of two books on the Three Mile Island accident, and one book on solar energy. He covered the Three Mile Island accident for Rolling Stone, The Progressive, and Harrisburg magazines.

 


 

Additional notes and references:

1. The Atomic Energy Commission, by Corbin Allardice and Edward Trapnell, Praeger Publishers, 1974, page 32 and pages 163-168.

2. We Almost Lost Detroit, by John Fuller, Reader’s Digest Press, 1975, page 9.

3. The Atomic Energy Commission, by Corbin Allardice and Edward Trapnell, Praeger Publishers, 1974, page 32 and pages 44-77.

4. The Accident Hazards of Nuclear Power Plants, by Richard Webb, The University of Massachusetts Press, 1976, pages 187-189.

5. Nuclear Power: The Bargain We Can’t Afford, by Richard Morgan, Environmental Action Foundation, 1977, Chapter 5, Hidden Costs.

6. We Almost Lost Detroit, by John Fuller, Reader’s Digest Press, 1975, pages 57-61.

7. Nuclear Power: The Bargain We Can’t Afford, by Richard Morgan, Environmental Action Foundation, 1977, page 38.

8. The Blair Press, Blair, Pennsylvania, April 25, 1979, page 13.

9. We Almost Lost Detroit, by John Fuller, Reader’s Digest Press, 1975, pages 104-115.

10. We Almost Lost Detroit, by John Fuller, Reader’s Digest Press, 1975, pages 159-164.

11. The Accident Hazards of Nuclear Power Plants, by Richard Webb, The University of Massachusetts Press, 1976, pages 66-73.

12. Nuclear Power: Both Sides, the best arguments for and against the most controversial technology, by Michio Kaku and Jennifer Trainer, W.W. Norton & Co., 1982, page 21.

13. The Accident Hazards of Nuclear Power Plants, by Richard Webb, The University of Massachusetts Press, 1976, pages 66-73.

14. Einstein: Profile of the Man, by Peter Michelmore, Dodd, Mead and Company, 1962, pages 8-11; see also, Einstein, by Hilaire Cuny, Paul S. Eriksson, Inc., 1962, pages 81-84.


Post-tsunami radio clip of Jerry Taylor/Cato discussing the past and future of US nuclear power

April 7th, 2011

In the wake of the troubles at TEPCO’s Fukushima nuclear power plants, on March 18, 2011, Jerry Taylor of the Cato Institute discussed the past and future of U.S. nuclear power on WOR’s The John Gambling Show.

I note that Taylor has really only scratched the surface of the problems relating to nuclear power. For example, far from governments simply shifting the risks of nuclear power cost over-runs to ratepayers and taxpayers, this incentive structure actually compounds financial risks, as the contractors do not have to bear the cost over-runs themselves, and the utilities can put their hands into the pockets of others.

Further, Taylor has not addressed the further subsidies provided in the form of Federal liability caps and by “limited liability” state corporation laws that leave shareholders without ANY liability for damages that nuclear accidents may cause others – as has now materialized in Japan. Just as we have seen in our financial sector, the result is a loss of personal “skin in the game”, a concomitant reduction in critical oversight, unleashed moral hazard, poor decision-making and then hand-wringing and blame-shifting when the “black swans” come home to roost.

Here is the link to the 10-minute clip (which Cato has so thoughtfully made easy to share, but which unfortunately seems too big to upload here).

