On the Probability of Predicting Future Fukushimas


A criticism of a paper on nuclear risk assessment by Wheatley, Sovacool and Sornette[1]

Back to Collected Thoughts
Return to Environmentalists vs. Nuclear Energy

The views expressed in this article are solely those of the author
Dr. Patrick L. Walden
Nuclear physicist, retired, and TRIUMF experimenter emeritus.
TRIUMF only provides the server for this wiki.

Note:
Wheatley, Sovacool and Sornette reply, September 30, 2015:
Spencer Wheatley, Benjamin Sovacool and Didier Sornette have contacted me and have given a rebuttal to my remarks on their website. Their pdf rebuttal can be accessed directly here. By gentlemen's agreement my article has a link on their website and their reply has a link on mine. I have replied to their comments.
This note is repeated at the end of the article.

May 13, 2015

A cutaway view of an AP1000 nuclear power plant. This is a GEN III+ power plant, which is about 100 times safer to operate than the current fleet of GEN II reactors. This is the future of nuclear power, but it plays no role in the paper's consideration of the probability of future nuclear accidents. The paper is thus deficient in this regard.

On April 7, 2015 a paper was submitted to the Cornell University Preprint Library, which purports to give probabilities of future Fukushima and Chernobyl disasters. It is a preprint and has not yet been refereed. The journal in which it is intended to be published has not been designated. One of the authors, Sovacool, has produced results in the past regarding nuclear power that are outliers with respect to the results of his colleagues and other researchers. His outliers always tend to place nuclear power in an unfavourable light. In the present paper, discussed below, a new measure is presented to quantify the scale of a nuclear disaster: the monetary cost of nuclear accidents expressed in 2013 USD. This measure neither quantifies the scale of the disaster nor is sufficiently robust to qualify as a measure. The figures Sovacool et al. use for the damage in 2013 USD are shown to be outliers with respect to other references. With difficulties like this and other problems, it is not apparent that the authors' prediction of a 50% chance of another future accident like Chernobyl by 2042 bears any real meaning. What can be shown is that, taking Sovacool et al.'s figures as the worst possible case, modern nuclear technology would make nuclear accidents like Chernobyl and Fukushima a thing of the past.

A Critique of....
Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents
a preprint by
Spencer Wheatley, Benjamin Sovacool and Didier Sornette

This critique also uses material from a review of the paper written for the MIT Technology Review,[2] as the popular impression or impact of this paper will most likely be taken from the article in the Review.

The Review article begins ominously with

The catastrophic disasters at Chernobyl and Fukushima are among the worst humankind has had to deal with. Both were the result of the inability of scientists and engineers to foresee how seemingly small problems can snowball into disasters of almost unimaginable scale.[3]

This is a statement of hyperbolic proportions written by the author of the Review. Although the authors of the research paper did not use such hype, it sets the tone in which these accidents are still viewed by the public, and in which any future prediction of a repeat accident is viewed with dire foreboding.[4] Consequently the predictions from this paper will be viewed with the same dire foreboding, a foreboding which casts a pall over the future use of nuclear power. Hence it is important that when any learned study purports to make such predictions it be properly vetted, because poorly done studies benefiting from the notoriety of the subject are of no use to anyone. For instance, is the term “disasters of almost unimaginable scale” appropriate? In terms of deaths, Chernobyl was no greater than a run-of-the-mill coal-mine disaster, and there were no deaths at Fukushima. Disasters of an almost unimaginable scale conjure up images of thousands upon thousands of deaths. Remember these are not simply disasters but disasters of almost unimaginable scale. If you truly want disasters of almost unimaginable scale, you are talking about the 2004 Indian Ocean earthquake and tsunami, and the 2011 Tōhoku earthquake and tsunami. The latter disaster caused 15,891 deaths, left 2,584 missing, and totally destroyed 127,290 buildings. That was truly a disaster of almost unimaginable scale, and indeed it triggered the meltdown of the Fukushima reactors. But it has all but been forgotten amid the hype associated with the events at Fukushima.[5] The world is fixated on overblown, absurd fears of nuclear radiation that verge on mass paranoia and delusion.[6]

The nuclear accident rate in terms of events per plant per year. An event is an accident costing more than $20 million USD. From fig. 2 of the paper.

The probabilities of future nuclear incidents given in the paper assume no improvement in nuclear technology.[7] The probabilities are projections from a comprehensive accident database that the authors have compiled for all nuclear accidents from 1950 to the present, 2014. While this database may be noteworthy,[8] the authors' assumption is not, because it is contradicted by the data set itself. To the right is figure 2, which has been extracted from their paper. The main graph shows a continuing improvement in the parameter \widehat{\lambda_t}, the accident rate per power plant per year. The data are the black dots with the linear vertical error bars. The superimposed curves are various fits to the data. These fits all show the accident rate decreasing over the time period for which the data have been gathered. However the authors claim this parameter has settled down in the last few years and shows little sign of changing. Therefore they can make projections on the probability of future accidents based on the assumption that there will be no change. This is the same as assuming there will be no advancement in the technology. This assumption of no change contradicts all the known history of technology in the modern era, which shows technology can and does improve over the years, sometimes rapidly so. Currently the industry standard for nuclear reactors is the GEN III+ design. These reactors have a severe core accident probability a factor of 100 lower than the GEN II reactors, which are the current operating norm.[9] Three Mile Island and Fukushima were GEN II reactors. GEN IV reactors, which are being developed, have attributes that make them even safer. Hence predicting the probability of future nuclear accidents, when the nuclear fleet will be dominated by GEN III+ and GEN IV reactors, from a data set based on GEN II and earlier, cruder reactors would seem to be of questionable value. GEN III+ reactors are 100 times safer than GEN II. It seems to me that using the Sovacool et al. data set to predict future accidents is like predicting the frequency of present automobile breakdowns from a data set compiled during the Model T era. Sovacool et al. make no mention of GEN III+ and GEN IV reactors in their paper.[10]
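To make the mechanics concrete, here is a minimal sketch, in Python with made-up illustrative numbers (not the authors' database), of how an accident rate per plant-year is estimated and how strongly any projection depends on the no-improvement assumption:

```python
# A sketch only: the event count, plant-years, and fleet size below are
# illustrative assumptions, not figures from the Wheatley et al. database.

def accident_rate(n_events: int, plant_years: float) -> float:
    """Estimated accidents per plant per year (the paper's lambda-hat)."""
    return n_events / plant_years

# Suppose 10 qualifying events were observed over 4000 plant-years.
lam = accident_rate(10, 4000.0)             # 0.0025 events/plant/year

fleet = 400                                  # roughly today's fleet size
expected_no_change = lam * fleet * 10        # next decade, lambda frozen
expected_gen3 = (lam / 100) * fleet * 10     # GEN III+: CDF ~100x lower

print(f"lambda-hat             = {lam:.4f} per plant-year")
print(f"events in next decade  = {expected_no_change:.1f} (no improvement)")
print(f"events in next decade  = {expected_gen3:.2f} (GEN III+ fleet)")
```

The point of the sketch is that the projection is directly proportional to the assumed rate; freezing \widehat{\lambda_t} at its GEN II value is a modelling choice, not a measurement.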

Comparison of the INES, NAMS, and damage scales for nuclear accidents. See footnote for explanation. Damage is in units of millions of dollars. The grey dots are the NAMS scores. The black dots are the Log10(damage) points. From fig. 5 of the paper.

Sovacool et al. would not only have us believe nuclear technology will stand still but also have us believe in the existence of Dragon King events, runaway nuclear disasters whose damage is so great they belong to their own regime. They make it quite clear that their most costly event, Fukushima at 166 billion dollars, would be nothing in comparison to future accidents, and in the end they quote average damage costs that could be over 20.5 billion dollars a year to pay for such accidents.[11] This is somewhat scary stuff and would make you think twice before replacing a fossil-fuel-driven economy with one driven by nuclear power. Indeed the authors attempted to predict when we could next expect to experience one of these Dragon King events. They say there is a 50% chance that a Fukushima event (or larger) will occur in the next 50 years, a Chernobyl event (or larger) will occur in the next 27 years, and a TMI event (or larger) will occur in the next 10 years. What are these Dragon King events and how were they designated? For that we look at the left frame of their figure 5 from the paper (look left). What we have here is a comparison of the INES (International Nuclear Event Scale) accident score published by the IAEA (International Atomic Energy Agency), the NAMS (Nuclear Accident Magnitude Scale) score, and the actual damage cost in 2013 USD.[12] Look at the 4 grey-dot NAMS scores above 7 at the top. These, Sovacool et al. maintain, are the Dragon King events. They are, left to right, Three Mile Island (TMI), Kyshtym, Fukushima, and Chernobyl. Chernobyl is in the extreme upper right.

Wait! What's this? TMI has a NAMS score almost equal to Chernobyl? That's right. Sovacool et al. have a 7.9 NAMS score for TMI and a score of 8 for Chernobyl. That means TMI had a radiation release of 3972 PBq (10^15 becquerels) to Chernobyl's release of 5000 PBq. Fukushima, with a NAMS of 7.5, had a radiation release of 1581 PBq.[13] That means that TMI emitted 2.5 times more radiation than Fukushima and up to 79% of the radiation released at Chernobyl. Chernobyl had a 30 km mandatory evacuation zone around it and Fukushima had a 20 km zone. Hundreds of thousands of people were forcibly evacuated from Chernobyl and Fukushima and never allowed to return. Did any such thing happen at TMI? Nope! TMI is close to populated areas in Pennsylvania, and such a disastrous radiation release would have provoked an evacuation of perhaps a million persons. Did we see this? No! Instead we had this...
The studies found that the radiation releases during the accident were minimal, well below any levels that have been associated with health effects from radiation exposure. The average radiation dose to people living within 10 miles of the plant was 0.08 millisieverts, with no more than 1 millisievert to any single individual. The level of 0.08 mSv is about equal to a chest X-ray, and 1 mSv is about a third of the average background level of radiation received by U.S. residents in a year.[14]

Hence the 7.9 NAMS score for TMI is a mistake or a typo![15] However it got into the Sovacool et al. database, astoundingly got into table 1 of the article and into figure 5, and was then discussed in the text as evidence for the existence of Dragon King events. At no point were red error flags raised. This error stood out like a sore thumb when I first saw it. For it to have been maintained through so many stages means the authors do not really have a physical feel for what the figures in their data mean, and it does not speak well for the accuracy of their database.
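The arithmetic is easy to check. Using the NAMS definition from footnote 12, NAMS = Log10(20R) with R the I-131-equivalent release in TBq, a short script recovers the releases implied by the paper's scores, and the ~3.1 score implied by the IEEE release figure for TMI (footnote 15):

```python
import math

# NAMS = log10(20 R), R in TBq (footnote 12), so R = 10**NAMS / 20.

def release_from_nams(nams: float) -> float:
    """Radiation release in TBq implied by a NAMS score."""
    return 10.0 ** nams / 20.0

def nams_from_release(r_tbq: float) -> float:
    """NAMS score implied by a release of r_tbq TBq."""
    return math.log10(20.0 * r_tbq)

for name, nams in [("TMI (paper)", 7.9), ("Chernobyl", 8.0), ("Fukushima", 7.5)]:
    print(f"{name}: NAMS {nams} -> {release_from_nams(nams) / 1000:.0f} PBq")

# The IEEE figure of 62 TBq for TMI gives the score it should have had:
print(f"TMI at 62 TBq -> NAMS {nams_from_release(62.0):.1f}")   # ~3.1
```

Running this reproduces the 3972, 5000, and 1581 PBq figures quoted above, and shows how far the paper's 7.9 sits from the 3.1 the actual TMI release implies.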

A plot of the probability to have a higher value than the point in question. The lowest value would have a probability of 1, the highest zero. These are the points from the database for the INES above 2, shifted to the left by 1; the NAMS scores above 2; and the natural logarithm of all damage after 1980 exceeding 27 million dollars, in units of millions of dollars, shifted right by 2. The dots with the black X's are the Dragon King events. The red lines and dots with the red X's are my corrected values. From fig. 5 of the paper.

Thus, with errors like this, what is the evidence for the authors' Dragon King events? For that the authors present the right frame of their figure 5 (look right). The evidence comes about by comparing the complementary cumulative distribution function, CCDF,[16] extrapolated exponentially from lower-value behaviour (the dashed lines in the figure) to the actual empirical values for the CCDF (the solid lines in the figure).[17] Sovacool et al. claim that the Dragon King events are larger than what would be expected from an extrapolation of the behaviour of the lower-value events. And yes! There they are! They are the crossed black dots to the right of the dashed line for the middle, or NAMS, extrapolation. In order, from lowest NAMS value to highest, they are Kyshtym, Fukushima, TMI, and the largest, Chernobyl. This seems good, except for the fact that TMI[18] is not really there. It does not have a NAMS of 7.9. It is really 3.1 (see a previous footnote) and belongs with the lower-valued events. Thus only Kyshtym, Fukushima, and Chernobyl are left, and they are replotted in red (Chernobyl is left untouched). Looking at the red dots and Chernobyl, there is only one event, Kyshtym, to the right of the dashed line. The Dragon King regime seems to have collapsed and does not exist.
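For readers who want to reproduce the comparison, here is a minimal sketch of the empirical CCDF exactly as defined in footnote 17; the scores below are made up for illustration and are not the paper's sample:

```python
# Empirical CCDF per footnote 17: for each sample value x_i,
# CCDF(x_i) = (number of points exceeding x_i) / N.
# The NAMS scores here are illustrative assumptions, not the paper's data.

def empirical_ccdf(sample):
    """Return (value, CCDF) pairs sorted by value."""
    n = len(sample)
    xs = sorted(sample)
    return [(x, sum(1 for y in xs if y > x) / n) for x in xs]

nams_scores = [3.1, 3.5, 4.0, 4.2, 5.0, 5.5, 6.1, 7.5, 8.0]  # illustrative
for x, p in empirical_ccdf(nams_scores):
    print(f"NAMS {x:>4}: P(exceed) = {p:.2f}")

# The largest value always has CCDF = 0 and cannot appear on a log plot.
# A "Dragon King" claim amounts to the top points lying well to the right
# of an exponential (straight-line) extrapolation of the smaller ones.
```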

However the authors pin their real hopes on the damage caused by the Dragon King events. Plotted to the right of NAMS is the natural logarithm of the damages of accidents since 1980. And again, the Dragon King events pop up as the crossed black dots to the right of the dashed extrapolation line. Except now they are different! We have lost Kyshtym and TMI. They happened before 1980, and besides, they were not expensive enough anyway. To shore up the Dragon King numbers comes Tsuruga, an event that nobody has ever heard of, but one expensive enough to qualify. So now we have, in order of cost, Tsuruga, Chernobyl, and the largest, Fukushima.[19] Yes, in this analysis Fukushima is the largest, most catastrophic nuclear accident because it will cost more, even though it emitted one-fifth the radiation of Chernobyl. Does this make sense? This is a measure that an economist would choose, not a scientist. To say Fukushima was the worst nuclear accident of all time based on how much it is projected to cost is ludicrous. Chernobyl by all accounts emitted 5 times the radiation of Fukushima and killed 50 people, whereas Fukushima killed no one. By radiation alone Chernobyl was more deadly to the environment, and those 50 deaths, which Sovacool et al. valued at only 6 million dollars apiece, totalled a mere 300 million dollars, a drop in the bucket compared to the authors' statement that Fukushima will cost 166 billion dollars.[20] That is a measure not very respectful of human life or the environment.

Let's examine this 166 billion dollar cost figure. How robust is it? The official cost is pegged at 55 billion dollars, but a recent report has doubled that cost to 105 billion dollars.[21] This is a report from RT (Russia Today) news, which is claimed to be a propaganda wing of the Russian government.[22] Thus this report is not exactly considered a source of impartial information, and the $105 billion figure could be viewed as inflated. Even if the cost of plant decontamination is added, which the report said had not been included, we get to only 140 to 153 billion dollars. Sovacool et al.'s estimate of 166 billion dollars is still beyond that. Thus the $166 billion cost Sovacool et al. place on Fukushima appears to be an exaggerated outlier. They do not say how they got this figure or any other figure in their database. Indeed they do not even make their database available for examination. Thus the cost of the Fukushima accident is not robust but wildly varying,[23] and as such it is a poor measure of the severity of the accident. All these cost estimates make the cost of the Fukushima accident a rather murky affair, including the authors' exaggerated figure of $166 billion, and the most reasonable course, under these circumstances, is to take the official cost, $55 billion.

Let us examine the other Dragon King events. For Chernobyl, the authors quote $32 billion while other sources quote less,[24] $15 billion. The unknown Dragon King event, Tsuruga, turns out to be an extended construction SNAFU rather than an accident. It concerns startup problems with the Monju nuclear power plant[25] within the Tsuruga complex. Monju is a sodium-cooled reactor which started up in 1994 and had major fire damage from a sodium leak in 1995. It was closed until it started up again in 2010. It shut down a month or two later when a transfer machine was dropped onto the reactor vessel. Its future status is unknown, but it is presently shut down. The reactor has cost to date (2014) 9.8 billion dollars, including the accidents. Sovacool et al. list 15.5 billion. The authors report "an unknown radiation release value" for this event. But we can do better than that: there has been little or no radiation release. Can this be called a Dragon King event? Given that there has been effectively zero radiation release, it hardly qualifies as a nuclear accident. That is why you never heard about it.

Thus we have the Fukushima cost reduced from $166 billion to $55 billion, Chernobyl reduced from $32 billion to $15 billion, and Tsuruga reduced from $15.5 billion to $9.8 billion. Do you notice a pattern here? The authors' costs all seem to have been estimated on the high side. If these costs are plotted in the figure (they are, as the red dots and crosses on the natural-logarithm-of-damage plot), the Dragon King regime disappears just as it did for the NAMS plot. The statistical evidence for a Dragon King regime occurring at "a threshold[26] above which runaway disasters occur" seems spurious at best.

Thus the "Dragon King" regime appears to have an artificially manufactured element about them. Juggling different events between the NAMS and the natural logarithm of damage scales, NAMS typos, and over estimates of damage costs make the existence of this regime somewhat dubious. Nevertheless scaling back the costs and doing away with the Dragon King regime will do nothing to change the authors' calculation for the probability of future events. All it will do is to lower those scary estimates of average annual future damage costs. Their estimates for the probabilities of future TMI's, Chernobyl's, and Fukushima's will remain the same, and even with lower estimated annual damage costs, nobody wants to see a future TMI, Chernobyl, or Fukushima. However there is a problem with their probabilities. They appear to have been somewhat manufactured as well.

A plot of accident rate per power plant per year, \widehat{\lambda_t}. It is claimed that there is a sudden change in this parameter in and around 1980, but that is difficult to see if you concentrate on the data points (the black dots with the vertical error bars) and ignore the coloured bands (fits) guiding the eye. Indeed in the previous figure of \widehat{\lambda_t} (above), it is difficult to discern such a break. From fig. 3 of the paper.

The problem is with their empirical CCDF's. Due to the way they are defined (see a preceding footnote), their resolution is limited to 1/(number in event sample). For the natural logarithm of the damage scale, the event sample was 59, making the resolution around 1.7%. Thus, because Fukushima was in that event sample, we ended up with a probability of 1.5% of having a Fukushima-type accident for any given nuclear accident, which for the present world nuclear plant inventory averages about 0.9 accidents per year. Hence we end up with a 50:50 chance of another Fukushima in the next 50 years. No error was attached to this probability. Why was there only an event sample of 59? The authors' database had 174 events in it. Well, the authors made cuts. Events that had damages less than $20 million were said to be underrepresented, and were thus eliminated from consideration. Events which occurred before 1980 were said to obey a different probability distribution and thus were also eliminated. Hence we ended up with a sample of 59. While this change of probability distribution after 1980 may have some justification,[27] it is nevertheless somewhat statistically anemic. Looking at the figure to the left, if the coloured bands were removed, the break would be hard to see, just as it is hard to see in the previous \widehat{\lambda_t} plot above.

A log-log plot of the empirical CCDF vs. damage. 1-F = CCDF. The accident labels and red arrows are my addition. "?" indicates a point which does not appear to be in table 1 of the paper (the list of the top 15 most costly nuclear accidents). The figure is from the paper's fig. 7. Both the dashed and solid fits go through the Fukushima point according to the other part of the figure in the paper.

When an empirical CCDF is constructed for the combined pre-1980 and post-1980 sample of 100 events,[28] as is done in appendix 1, the distribution appears to be continuous without any breaks[29] (see figure to the right), and the higher-cost nuclear accident events appear to be on the same trajectory as the lower-cost events over $800 million. No Dragon King regime is evident, which makes the Dragon King regime of the 59-event plot above look like an artifact. Note that here the higher-cost arm bends down to a faster fall-off than it does in the 59-event plot, in which it bends up to a slower fall-off and the so-called Dragon King regime. With the higher probability resolution of the 100-event sample, which is 1/100 = 1%, the distribution now suggests that for every nuclear accident, the probability that it will be a Fukushima-type accident or larger is only 0.23%. That would mean that the 50:50 chance of another Fukushima accident or greater would be extended to the next 342 years, which is almost an order of magnitude larger than the previous result of 50 years. This result cannot be ruled out with the statistics in the given data set. By the reasoning given by the authors, if I could rationally cherry-pick the appropriate event sample down to just 2 events, and one of them was Fukushima, then for every nuclear accident the probability of a Fukushima-type accident or larger would be ≈1/2 = 50%. This of course is ridiculous.
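The waiting-time arithmetic behind these figures can be reconstructed under the assumption (mine, not spelled out in the paper) that accidents follow a Poisson process; the exact inputs the authors used are not published, so the outputs below only approximately reproduce the quoted 50 and 342 years:

```python
import math

# With an overall accident rate of ~0.9 per year and a per-accident
# probability p of being "Fukushima or larger", the time T at which the
# chance of at least one such event reaches 50% satisfies
#   1 - exp(-p * rate * T) = 0.5,  i.e.  T = ln(2) / (p * rate).
# Poisson model and the 0.9/yr rate are assumptions based on the text.

def fifty_fifty_horizon(p: float, rate_per_year: float = 0.9) -> float:
    """Years until P(at least one event of size >= threshold) = 50%."""
    return math.log(2.0) / (p * rate_per_year)

print(f"59-event sample  (p = 1.5%):  {fifty_fifty_horizon(0.015):.0f} years")
print(f"100-event sample (p = 0.23%): {fifty_fifty_horizon(0.0023):.0f} years")
```

This yields roughly 51 and 335 years, close to the paper's 50 and my 342; the residual difference presumably reflects the precise rate and probabilities used.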

Hence any such predictions of high-cost nuclear accidents using this data set will be associated with large errors. The authors even stated as much:
"However, there is tremendous estimation uncertainty associated with these estimations."
Thus the 50:50 prediction that another Fukushima or worse will happen within 50 years, before 2064, is just as likely to be a 50:50 prediction all the way up to the year 2356. In other words this analysis gives a prediction, but what that prediction means to any degree of accuracy is anybody's guess. As such it is not very useful, and as I stated previously, "it is important that when any learned study purports to make such predictions it be properly vetted, because poorly done studies benefiting from the notoriety of the subject are of no use to anyone". The predictions of the probability of future nuclear accidents stated in this paper are of no use to anyone.


There was a remark in the Review article to the effect that "their database is carefully researched and their statistical pedigree hard to match". The anti-nuclear crowd lionizes credentialed academics who produce results that favour their view of shutting down nuclear power, justified or not, and it is no different with Sovacool et al. From my analysis their database has problems and their statistical analysis is wanting. "Carefully researched" and "hard to match" are not exactly the words I would choose.

One of the authors, Sovacool, has in the past been associated with similar studies of nuclear power,[30] which have also been found wanting. His studies on the life-cycle CO2 emissions per kWh for nuclear power used a faulty data selection process. He overweighted the non-peer-reviewed study by Jan Willem Storm van Leeuwen,[31] which is a widely debunked fiction, and rejected other studies for somewhat arbitrary reasons. In his analysis he took a mean of the studies, which is sensitive to outliers (van Leeuwen's), instead of a median, which effectively dumps outliers as being incorrect. Sovacool ended up with a value of 66 g(CO2e)/kWh. In contrast a more respected study by the IPCC, the experts on GHG emissions, came up with a value of 12 g(CO2e)/kWh.[32] Sovacool was, or should have been, well aware of the problems with van Leeuwen's analysis, but he used it anyway. Indeed he used 3 versions of the van Leeuwen data in his analysis as if they were statistically independent, while rejecting other studies that were. Hence his statistical pedigree would seem somewhat tarnished.

Sovacool claims that he is not an anti-nuclear advocate or an apologist for the anti-nuclear movement. However, choosing to overemphasize van Leeuwen's work in his CO2 paper despite the cautionary warning tags attached to it indicates that his professed unbiased research attitude towards nuclear energy is questionable. Indeed, a review of Sovacool's book, Contesting the Future of Nuclear Power (2011),[33] states that the book "reviews the little-known research which shows that the life-cycle CO2 emissions of nuclear power may become comparable with those of fossil power as high-grade uranium ore is used up over the next several decades and low-grade uranium is mined and milled using fossil fuels". This statement, or one like it, comes directly out of the flawed work of van Leeuwen, and it is false. When you start to quote "little-known" research which is in fact well known and at odds with the analysis of everyone else, your biases are revealed, and in this case it places Sovacool's sympathies directly in the anti-nuclear camp. Sovacool also states in this book that the levelized cost for "a 1,000-MWe facility built in 2009 would be 41.2 to 80.3 cents/kWh, presuming one actually takes into account construction, operation and fuel, reprocessing, waste storage, and decommissioning." These are the highest levelized costs for nuclear I have ever seen. The current gold standard for levelized costs, the EIA, states 9.5 cents/kWh.[34] Even Lazard, which tends to reduce levelized costs for renewables and increase costs for fossil fuels and nuclear, has 12.4 cents/kWh for nuclear.[35] Sovacool's costs are way out of line, and would indicate that his levelized costs for nuclear power have about as much validity as his analysis of the CO2 emissions of nuclear energy. Sovacool's sentiments are well placed within the anti-nuclear envelope, and his analyses of nuclear energy give outlying results that should be highly suspect.

So what have we learned from this current paper of Sovacool et al.? There is a database which is not accessible, but which we know is not trustworthy. We know that because the radiation release the database claims came from TMI is overstated by a factor of 64,000. This error remained undetected by the authors and was even discussed in the text as being true. Also, the cost of nuclear accidents, which the authors claimed to be a good measure of the severity of an accident, was shown not to be robust: the cost of the Fukushima accident was shown to be all over the map, and the value chosen by Sovacool et al., $166 billion, is an exaggerated outlier. In fact all of the high-cost "Dragon King" events were costed on the high side. Furthermore, the cost of an accident is not correlated to its severity: Fukushima was tagged as the most severe accident in terms of cost, but in terms of radiation release, which is the real physical parameter by which to measure the impact of a nuclear accident on the environment, Fukushima emitted only a fifth of the radiation that Chernobyl emitted. Thus the database appears to be flawed. In terms of predicting the time span in which we could expect the next major nuclear accident, the paper cannot do so with any reasonable accuracy. The next Fukushima could have a 50:50 chance to occur in the next 50 years, in the next 342 years, or anywhere in between, take your pick. Thus the paper would appear not to be very useful. However, maybe something could be made of the probability of a nuclear accident per year per nuclear plant. That number depends only on the number of incidents, the year, and the number of reactors, which are statistics that are difficult to get wrong.

Let us assume then that the probability stated by Sovacool et al. for another Fukushima,[36] 50:50 in the next 50 years, is correct. We will take this as the worst possible case. Now let us consider the impact of improved technology, something which Sovacool et al. failed to consider. As stated above, the current technology is producing GEN III+ reactors, which are a factor of 100 less accident-prone than the GEN II reactors that make up the bulk of the current nuclear fleet. If all the current reactor fleet were replaced with GEN III+ reactors, the probability of a reactor accident per year per nuclear plant would shrink by a factor of 100. Therefore, all else remaining the same, such as the probability of a Fukushima-like accident or greater for any given reactor event, the probability of another Fukushima or greater would be a 50:50 chance in the next 50×100 = 5000 years. Five thousand years encompasses the entire time span of recorded history, from the ancient Sumerians of the Fertile Crescent to Greece, Rome, the Middle Ages, and on to the present. In practical terms that means forever. However, just replacing the present nuclear fleet with GEN III+ reactors will not be all that progressive. It will make us all feel safer about nuclear power, like hitting a fly with a sledgehammer, but it does nothing to alleviate the crises facing mankind and our planet. We are facing a possible 6th global mass extinction event due to global warming, and here we have an energy technology which does not emit greenhouse gases (GHG), which turns out to be extremely safe, and it is not being used to end our dependence on fossil fuel energy. Why?

World energy consumption according to BP (British Petroleum). It shows nuclear power contributes 5%. This is not canonical. I have seen higher percentages in some statistics and lower in others. The lowest has been 2.7%. The figure of 5% seems to be an average, and it makes the expansion of nuclear power to 100% for the purposes of this essay a simple factor of 20.

Contrary to the authors' conclusion that better bonding and liability instruments are in order for the nuclear industry, to offset the almost infinite damage possible from future nuclear accidents,[37] the opposite conclusion is called for. Nuclear power should be expanded with all possible speed in order to end our dependence on fossil fuels. Nuclear energy currently supplies about 5% of our global consumption of energy (see figure to the left). Expand that supply by a factor of 20 and 100% of our energy needs would be supplied by nuclear power. What would be the accident rate of a Fukushima event or greater? If the expansion were done using GEN III+ technology, there would be a 50:50 chance that one would occur in the next 5000÷20 = 250 years. How long is 250 years? That is the entire length of time since the start of the industrial revolution around 1750, the era of Watt's steam engine, up until the present. And remember, that probability assumes the worst possible scenario. The accident rate is almost assuredly lower than this. To all intents and purposes these accidents would be effectively non-existent.
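The whole scaling argument of the last two paragraphs reduces to a few lines of arithmetic, taking the paper's worst-case figure as the starting point:

```python
# The 50:50 horizon scales inversely with the accident rate: GEN III+
# reactors (~100x lower core damage frequency) stretch it 100-fold; a
# 20-fold larger fleet (nuclear at 100% of energy instead of ~5%)
# shrinks it 20-fold. The 50-year base figure is the paper's worst case.

base_horizon = 50.0                  # years, per Sovacool et al.

gen3_horizon = base_horizon * 100    # all-GEN III+ fleet, same size
print(f"GEN III+ fleet:             {gen3_horizon:.0f} years")   # 5000

expanded_horizon = gen3_horizon / 20 # 20x larger fleet, 100% of energy
print(f"GEN III+ at 100% of energy: {expanded_horizon:.0f} years")  # 250
```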

Two hundred and fifty years was all the time needed for all the technical advances of our industrial and technological civilization. In that time every innovation of our modern civilization was developed. Hence in 250 years, if civilization survives its oncoming crises, there will be tremendous developments in nuclear and other sources of energy. But we do not have to wait 250 years for these developments. They are happening now. For example, there are the GEN IV nuclear reactors, a technology not 250 years away but almost right around the corner, 40 years or less, mostly less. Pilot GEN IV plants have been built[38] and have demonstrated the practicality of the technology. With regard to nuclear accidents, there are fundamental differences between the GEN II and GEN III+ reactors and the newer GEN IV reactors.

In GEN II and GEN III+ designs, the primary cooling system contains superheated water at 315˚C under a pressure of 153 atm, which transports the heat from the reactor core to the steam boilers that generate the steam to turn the turbines that generate electricity.[39] It was the shutdown of this primary cooling system that was involved in all previous extreme nuclear accidents. With the cooling shut down, heat from the latent radioactivity in the core drives the temperature and pressure in the primary cooling system way up. The pressure has to be reduced to prevent a rupture, and in doing so volatile radioactive isotopes such as 137Cs and 131I are released into the atmosphere, coating the surrounding countryside in radioactivity. If the temperature gets too high the core will melt. This is basically what happened at Fukushima, and much less so at TMI. At Chernobyl, the safety system was shut down for reasons that are not entirely clear, and the operators were not fast enough to shut the reactor down with the inadequate control rod system. Thus nuclear fission in the core was not shut down as it was at Fukushima and TMI. The temperature and pressure in the pressurized primary cooling system shot up sky high, rupturing the reactor vessel and discharging not only the volatile radioactive isotopes but also the contents of the core itself. There was no western-style containment structure, so bits and pieces of the core were scattered about the countryside. The moderator, unlike in western reactors, was made of graphite, which caught fire, thus distributing even more radioactivity abroad. Despite these differences, the problem was basically the same: the pressurized primary cooling system can be like a bomb ready to go off and spread the internal radioactive contents far and wide.

In GEN IV designs heat is more likely than not transported by a primary cooling system containing liquid metal such as sodium. The liquid metal is at atmospheric pressure. Thus, if there is an accident, there is no pressure to push the radioactive contents of the plant far and wide. The radioactivity stays put, and the damage of the accident is limited to the capital cost of the plant. Hence the odds would be even better than a 50:50 chance of one accident in 250 years. Long before that, within the time frame of 40 years or less, as soon as GEN IV reactors become the norm, there would be no more Fukushimas. Fukushimas would be an artifact of the past, the result of an antiquated technology. Fukushima-type accidents are not an inherent consequence of nuclear power; they are the product of the technology chosen to extract that power.

A visual representation of the state of the world if nuclear power were to be willy-nilly removed from the equation to solve our future energy demand.

Seeing that the continued use of fossil fuels is guaranteed to bring about an infinite amount of damage and loss of life, along with the possible destruction of world civilization, the conclusion of Sovacool et al. that there should be increased bonding and liability instruments for the nuclear industry to allay future damage from extreme nuclear accidents is both shortsighted and unimaginative. To crush nuclear power under a heavy financial debt obligation to pay for hypothetical, exaggerated costs of future nuclear accidents is certainly a solution that would be suggested by the anti-nuclear mentality. However, the obvious solution is not to be found in the offices of actuaries, but in implementing the much cheaper solution of replacing the current nuclear fleet with modern nuclear technology and expanding it with all reasonable speed. The danger is not in building up the capacity of nuclear power, but in not building it up.[40] Nuclear power is one of the safest,[41] most economical,[42] plentiful,[43] and greenest[44] sources of energy available. The only impediment standing in the way of implementing nuclear power is the irrational fear the public has of nuclear energy, fomented by the misinformation and distortions of the world anti-nuclear movement. Typical is the fear of nuclear waste and of the long-term solutions to store it,[45] which have been effectively placed into stasis by anti-nuclear concerns despite the fact that the solutions[46] are feasible and the amount of waste is small.[47] If we are to overcome our dependency on fossil fuels, all emissions-free technologies must be developed. Failure to develop nuclear due to preconceived and misplaced irrational fears is an insane response and not a path to the future. Such an attitude could be enough to sink the boat. Irrational fears must be overcome, and "we should be moving vigorously to increase the nuclear energy supply".[48]


Wheatley, Sovacool and Sornette reply, September 30, 2015:
Spencer Wheatley, Benjamin Sovacool and Didier Sornette have contacted me and have given a rebuttal to my remarks on their website. Their pdf rebuttal can be accessed directly here. By gentlemen's agreement my article has a link on their website and their reply has a link on mine. I have replied to their comments.


References and footnotes

  1. The paper by Wheatley, Sovacool and Sornette is riddled with errors and is one of the worst examples I have ever seen of this in a paper that has supposedly been released for publication. The article Errors in the Sovacool et al. paper contains a partial list of these errors.
  2. The Review headline states The Chances of Another Chernobyl Before 2050? 50%, Say Safety Specialists. That is not what the authors stated. They said "there is a 50% chance that...a Chernobyl event (or larger) occurs in the next 27 years". Twenty-seven years from now is 2042, not 2050.
  3. The event which triggered the Fukushima meltdowns, the Tōhoku earthquake and tsunami, could hardly be thought of as a seemingly small problem. But then again the placement of the backup generators to power the cooling pumps at Fukushima could be viewed as a seemingly small matter, but which would have catastrophic consequences in the aftermath of the earthquake and tsunami.
  4. Actually the authors, at the end of their paper in an almost apologetic manner, referred to such dire forebodings with terms such as "quasi-hysteric" and "media hype". However the reader could be forgiven for not paying much heed to these attempts to bring the threat of a future nuclear accident back to a rational level after seeing the authors suggest that the mean of probable future nuclear accidents could cost as much as $20 billion per year.
  5. For an example of this exaggerated hype see the VICE video report on Fukushima
  6. Here are two articles regarding the exaggerated fears of nuclear radiation
    Fear vs. Radiation: The Mismatch from the New York Times
    Fear of Nuclear Power is out of all Proportion to the Actual Risks from the Guardian
  7. The authors write,
    Indeed, one could also argue that people and firms learn from their past accidents and thus improve their safeguards, so that statistical loss analysis would be doomed due again to an intrinsic non-stationarity of continuously evolving entities. But it has been demonstrated repeatedly in actuarial sciences that, notwithstanding learning and adaptation, robust statistical regularities can be unearthed. This is the stance we take in this article.
    Thus it would seem that no allowance for improved technology has been made. This would be an appropriate model for actuarial purposes, as the name of the game is to make the best possible projections so that premiums can be set. If the projections state that incidents will happen at a higher frequency than will actually be the case, that is satisfactory: the premiums will be set higher than the real incident frequency will reveal as time unfolds, and consequently there will be sufficient funds to cover the claims. However in this paper it is the real incident rate that the authors are trying to determine, not a number good enough for actuarial purposes.
  8. It is noted that the authors' table 1, "the 15 largest damage events", lists events at Windscale (Sellafield) and Rocky Flats. Windscale was an air-cooled, flammable, graphite-moderated reactor that had a second purpose devoted to producing plutonium for bombs. Rocky Flats was a processing plant for bomb-grade plutonium and was not even a reactor. Bomb-grade plutonium is a tricky substance to handle. Commercial reactors do not produce bomb-grade plutonium unless specifically modified to do so, which then reduces their effectiveness in producing electricity. This is not a commercially viable option. Hence it does not seem these Cold War facility accidents have any relevance in predicting accidents at future commercial reactors. Noting this, the reader may well wonder about the relevance of this database for predicting future accidents at commercial nuclear reactors like Fukushima.
  9. The probability factor is called the "Core Damage Frequency" (CDF), which is another way of saying core meltdown probability. This information can be gleaned from the following references
    Generation III Reactor, AP1000, and Core Damage Frequency
  10. In November of 2013 the noted climate scientists Caldeira, Emanuel, Hansen, and Wigley issued an open letter to those influencing environmental policy but opposed to nuclear power, asking them to reconsider their position on nuclear energy. As part of this letter, they stated
    No energy system is without downsides. We ask only that energy system decisions be based on facts, and not on emotions and biases that do not apply to 21st century nuclear technology.
    In contrast to the plea of this letter, Wheatley, Sovacool and Sornette are clearly encamped in the biases and nuclear technology of the past. This is what the anti-nuclear movement does. They firmly resist the notion that nuclear technology can improve.
  11. See table 5 on page 17. Look at the Pareto model with 100 times the Fukushima damage as the maximum liability. It says it would cost the nuclear industry 20.488 billion dollars a year for liability coverage up to 100 times the cost of a Fukushima. I would have suspected the Pareto DK figures to be higher, and on the surface this looks like another typo courtesy of the authors. However time constraints prevented me from looking into this before writing up this article. Looking at the distribution of damage costs, the authors say that even a liability cap of 100 times the Fukushima damage would not get the public off the hook from paying for future damage costs. The distribution of damage costs has no limiting factor and can indeed be infinite by their model. Table 4 for the Pareto DK distribution shows that there is a 1% chance, given an accident, that it would exceed 332 billion dollars in damage. Thus according to the authors these massive costs are probable and should be expected.
  12. INES is the nuclear accident scale most commonly used in the popular press. It is an integer scale that ranges from 1, the lowest rating, designating the smallest accidents, to 7, the highest, designating the most destructive accidents. NAMS is an attempt to make the INES score into a more quantifiable number. It is defined to be
    NAMS = Log10(20R)
    where R is the radiation released by the accident in units of TBq (tera-becquerels, i.e., 10^12 becquerels), where a becquerel (Bq) is a unit of radioactive decay equal to 1 disintegration per second. The references make it clear that the units of R in TBq are to be in radioactive iodine, I-131, equivalents. Damage cost in terms of 2013 USD should be self-explanatory.
  13. Actually the radiation release for Fukushima was more like 940 PBq, or 7.3 on the NAMS scale. I find the authors consistently erred in their data sample by quoting figures higher than the normally quoted values.
  14. Extracted from the World Nuclear Library article on the accident at Three Mile Island.
  15. An IEEE reference gives the radiation release from TMI as 62 TBq. This would give a NAMS score of 3.1. This is more in line with the radiation doses received by the surrounding populace. The radiation release from Chernobyl was 84,000 times larger.
  16. The complementary cumulative distribution function, CCDF(xi), simply put, is the probability for the continuously distributed variable x to exceed xi (i.e., x>xi).
  17. The empirical CCDF is derived in this manner. Let N be the total number of data points in the sample. For any data point of value xi, count the number of data points exceeding the value of xi. Call this number n. Then n<N and CCDF=n/N. This is the empirically derived CCDF. Note that the largest value has CCDF=0 and cannot be plotted in the figure. Such points are indicated at the bottom of the graph at their appropriate xi value.
  18. TMI was not really a Dragon King event. TMI was a financial disaster. However, in terms of danger from radiation, it was a hyped-up media event fanned by anti-nuclear paranoia and exacerbated by Jane Fonda in the movie The China Syndrome, named for a phenomenon that does not exist.
  19. The anti-nuclear movement has been trying almost from day one of the Fukushima incident to portray Fukushima as a larger nuclear disaster than Chernobyl. Here they finally have one. Cost! Yet cost does not really measure the impact the radiation release had on the environment.
  20. The measure of Sovacool et al. does not measure the severity of a nuclear accident. What it measures is how rich the society in which the accident occurs is, and how willing its government is to overindulge in radiation clean-up and to accommodate the radiation fears of its citizens, real or imagined. I have seen Japanese citizens panic over readings of 0.39 µSv/h, which while high for Japan is normal for many parts of the world. The Japanese government has committed itself to a clean-up in which no citizen would receive more than 1 mSv extra dose per year due to Fukushima. The average world background is about 3 mSv/y, and 10 mSv/y is harmless. There are some parts of the Brazilian coastal tourist beaches where the average dose is 30 mSv/y or more from the natural background, and there is no epidemiological indication that these doses are harmful to the public health. See Are Chernobyl and Fukushima legislated disasters? Cleaning up to the level of 1 mSv/y adds billions of dollars of unnecessary costs. These costs and others are discussed in a Nature article, Japan's Nuclear Crises: Fukushima's Legacy of Fear. It is said that the compensation for the inconvenience caused by the Fukushima incident pays better than compensation for loss of lives and property in the earthquake and tsunami. Does the fact that the Japanese government is more willing to pay out such compensation for Fukushima than the Soviet government was for Chernobyl make Fukushima a greater disaster than Chernobyl? No it does not.
  21. August 27, 2014 Fukushima disaster bill more than $105bn, double earlier estimate – study. from RT (Russia Today) news
  22. RationalWiki article about RT (Russia Today) news
  23. There are even more wildly varying costs. David Lochbaum, speaking for the Union of Concerned Scientists (UCS) at the Helen Caldicott symposium sponsored by Physicians for Social Responsibility (PSR), quoted costs of $71 billion to $250 billion, although he preferred the lower figure (see 8:45 of the video). Like RT news, the UCS is not exactly the best source for unbiased reporting on nuclear power. In fact they are decidedly anti-nuclear. The fact that they would send a representative to a PSR Helen Caldicott symposium should attest to that. The PSR, co-founded by Helen Caldicott, is an ultra anti-nuclear group, committed to bringing about a nuclear-free planet under any and all circumstances, which also includes nuclear power. It is an ideologically driven organization, not a factually driven one. It matters not if nuclear power can alleviate CO2 emissions into the atmosphere to mitigate global climate change; Helen Caldicott and the PSR are here to stop it. They have their own estimates of how much Fukushima will cost that would blow the minds of even Sovacool et al. Their estimate is $250 to $500 billion! This is nonsense of course. To put matters into perspective, the whole Tōhoku earthquake and tsunami disaster which triggered Fukushima is estimated to cost around $300 billion. Perhaps Lochbaum's upper value of $250 billion was a concession to his hosts, as his statement of costs would then not directly contradict those of the PSR. As an indication of how credible the information coming from the PSR is, there is this exchange between George Monbiot and Helen Caldicott, the leader of the PSR, which shows Helen Caldicott is not averse to half-truths, misinformation, and distortions when it comes to delivering information on nuclear energy. George had even more to say on the veracity of Helen Caldicott and the anti-nuclear movement in this report, Nuclear opponents have a moral duty to get their facts straight.
  24. A list of disasters by cost quotes $15 billion in direct costs for Chernobyl, although indirect costs could be as high as €235 billion for Ukraine and €201 billion for Belarus
  25. There is an article on The Monju Nuclear Power Plant in the Wikipedia
  26. Exactly what threshold did Tsuruga cross to qualify? It certainly was not the authors' suggested NAMS level of 5.
  27. If there is any justification, it appears to be weak. The authors themselves stated
    "The decrease in rate post-Chernobyl is not statistically significant."
  28. From section 4 one would be led to believe that this sample size would be 103 events. However, inexplicably, the authors seem to have lost three events, one being Chernobyl.
  29. There is a bend but not a discontinuous break
  30. Previous studies are referenced in this RationalWiki entry.
  31. This study was co-authored by Philip Smith; it is described, with associated links, on RationalWiki.
  32. This result is from the IPCC's 2014 publication of life cycle greenhouse gas emissions from various energy sources. The summary of this study and related links are available from an article in the Wikipedia. This article also has the results from an earlier IPCC study that was published in 2011, which had the result of 16 g(CO2e)/kWh for nuclear power. Hence the IPCC results have been consistent. The article also discusses the Sovacool result of 2008 and the problems with that result.
  33. Contesting the Future of Nuclear Power (2011) Wikipedia article regarding the book
  34. Annual Energy Outlook 2015 EIA
  35. Lazard's Levelized Cost of Energy Analysis—version 8.0 September 2014
  36. We will only consider Fukushima-type accidents for the future. Chernobyl was a graphite-moderated reactor that could catch fire and had a positive void coefficient. It had no western-style containment structure and used an unsafe control rod system. In short, it was an accident waiting to happen. Fukushima was a water-moderated reactor with a negative void coefficient. There was no graphite to catch fire. It had a western-style containment structure and an excellent control rod system. These are remarkably different reactors, and a Chernobyl event has little predictive power on a future Fukushima. In fact reactors like Chernobyl will not exist in the future because no one will build them. The last one will cease operations in 2026. Thus the chances of a Chernobyl event in the future are zero if such reactors do not exist. In fact, if you are in the business of predicting future catastrophic nuclear accidents, you could argue that Chernobyl should be dropped from the data set because it is no longer relevant.
  37. This infinite damage is a result of highballing damage estimates, as was shown above. In fact the greatest contribution to this infinite damage probability is the authors' estimate of Fukushima costing $166 billion, a figure which has been shown to be an outlier.
  38. EBR-II Experimental Breeder Reactor - II
  39. This basically describes a pressurized water reactor (PWR), the most common type of reactor. For a boiling water reactor (BWR) the coolant is allowed to boil and the steam drives the turbines. The steam is recondensed and cycled back to cool the reactor. The problems of a pressurized vessel containing radioactivity are common for both types of reactors. Fukushima was a BWR.
  40. Nuclear power must be built up hand-in-hand with the other sources of emissions free power. Relying on renewables to supply 100% of the energy has a fundamental flaw, the intermittency problem. It can be shown that for intermittent sources of renewable energy, such as wind and solar, it is impossible to supply 100% of the energy 100% of the time. Other renewable sources such as hydropower are limited and cannot fill in the intermittent shortfall. Solutions such as long term energy storage will never be able to meet the demand unless there is a technological miracle. Trusting the Earth's future to 100% renewable energy and a technological miracle is a bet that in all likelihood will be lost.
  41. This essay demonstrates this fact for the future, and even for the current fleet of GEN II reactors it can be shown that nuclear has the least impact on public health of any known way to generate electricity, nuclear accidents included. Kharecha and Hansen wrote about the number of lives saved by displacing fossil fuels with nuclear in Prevented Mortality and Greenhouse Gas Emissions from Historical and Projected Nuclear Power. Table 1 compares mortality rates between fossil fuels and nuclear power.
  42. Levelized costs are compared in the EIA's Annual Energy Outlook. These are raw levelized costs for various means of producing electricity. For renewables I estimate the costs must be doubled to make renewables a viable energy system. The cost of storing energy must be included for example. For nuclear power which is dispatchable, the cost is as presented.
  43. The current nuclear fuel is the isotope 235U. This is the fuel for GEN II and GEN III+ reactors. 235U makes up only 0.72% of the isotopic composition of natural Uranium. GEN IV reactors will be able to convert natural Uranium into fissionable fuel. Thus the fuel supply will increase by a factor of (100-0.72)/0.72 ≈ 138. Furthermore GEN IV reactors will also be able to convert Thorium into fissionable fuel. Thorium is approximately 4 times more plentiful on the Earth's surface than Uranium. With Thorium factored in, I estimate that the Earth's supply of fissionable fuel will increase by a factor of 552. At the present rate at which we produce energy, GEN IV reactors would be able to supply 100% of the World's energy needs for the next 287,000 years. The fuel supply is virtually unlimited, like that of renewable energy.
  44. This aspect has been discussed in a previous footnote for a 2014 IPCC study. Only wind power emits less CO2e per kWh (11 vs. 12), and solar PV emits 4 times as much as nuclear.
  45. The storage of long term nuclear waste in deep underground stable geological formations is a perfectly safe and technically feasible solution. The closure of the Yucca Mountain Nuclear Waste Repository in the U.S. was for political reasons, not for technical or safety reasons. When GEN IV reactors come on line, the waste from GEN II and GEN III+ reactors becomes fuel for the GEN IV reactors. The amount of waste will be shrunk dramatically leaving only the relatively short lived fission products.
  46. deep underground storage in geological stable formations has been deemed to be an effective solution
  47. The amount of nuclear waste has been estimated to take up the area of a football field; to be safe, perhaps two or more such fields.
  48. Burton Richter: Beyond Smoke and Mirrors: Climate Change and Energy in the 21st Century, Cambridge University Press, 2010.