
Open Thread 15

The Open Thread is a general discussion forum, where you can talk about whatever you like — there is nothing ‘off topic’ here — within reason (see Note below). So get up on your soap box! The standard commenting rules of courtesy apply, and at the very least your chat should relate to the general content of this blog.

The sort of things that belong on this thread include general enquiries, soapbox philosophy, meandering trains of argument that move dynamically from one point of contention to another, and so on — as long as the comments adhere to the broad BNC themes of sustainable energy, climate change mitigation and policy, energy security, climate impacts, etc.

You can also find this thread by clicking on the Open Thread category on the cascading menu under the “Home” tab.

Note: This is a new general Open Thread. However, two more specific Open Threads are also still available: (i) if you wish to make a philosophical comment on the Fukushima Nuclear crisis, go here; (ii) for technical comments on the Fukushima situation, go here. Every other non-post-specific comment can go below. For reference, the last general open thread (from 20 February 2011) was here.

By Barry Brook

Barry Brook is an ARC Laureate Fellow and Chair of Environmental Sustainability at the University of Tasmania. He researches global change, ecology and energy.

195 replies on “Open Thread 15”

As I predicted several weeks back, certain sectional interests are lining up to demand exemption from, or compensation for, the proposed carbon tax.

The steel industry certainly has a good case, as they operate in a highly competitive international environment and have strategic importance for Australia. As free trade is the current conventional wisdom they couldn’t possibly be offered protection now, could they?

It will become increasingly obvious that playing around with financial incentives/disincentives to reduce carbon emissions opens up a whole industrial bin full of worms. And the end result will probably be next door to zero reduction in emissions, with a swag of collateral damage.

What is urgently needed is positive action by government, with the assistance of private industry, to eliminate the primary offender, coal, from electricity generation. The other polluters pale by comparison and can be tackled at leisure.

Of course, none of this is going to happen given the bipartisan attitude that deficits bad, surplus good, nuclear power bad, airy-fairy renewables good, government enterprise bad, privatization good, national interest bad, globalization good, stable population bad, Big Australia good, etc etc.

By the way, if anybody is interested in learning about the real nature of money in a sovereign nation with a fiat currency then I suggest reading Bill Mitchell’s “Billy Blog”. Modern Monetary Theory consigns deficit hawks to the looney bin where they belong.


@Hank

What current numbers are you comparing the table to? Is there some data on yearly fallout for locations as far away from Fukushima as Denmark is from Chernobyl? (Or are you not referring to the Fukushima numbers?) Some clarification would be welcome.


As we roll over the peak oil curve, which we are now doing, hardships are being created all over the world in the form of higher fuel and food prices. This is triggering riots in some countries and bringing out into the open other social problems that have been pent up for a long time. In the US there is a constant struggle to maintain economic “growth” through borrowing and spending. The US has worked itself into a debt corner, so that a continuation of increased debt results in economic catastrophe. However, putting the brakes on borrowing also results in an economic downfall. I see little hope for US economic recovery as long as it continues to put nuclear power R&D on the back burner. We should have continued with the 1992 plan. http://ifr.blip.tv/file/4198688?filename=IFRTeam-IntegralFastReactorPRISMIntroduction699.flv


I’ve been puzzled why wind farm builders have been welcoming the carbon tax if renewable energy certificates are to be phased out as Garnaut insists. Isn’t it better from their point of view to have a 20% mandate instead?

A $25 carbon tax will penalise a MWh from pulverised black coal by that amount, since it generates a tonne of CO2. That’s 2.5c per kWh. However, RECs have been selling recently for $33 a MWh for large-scale wind. Thus the immediate incentives will be similar. However, as coal stations retire the question must arise about the cost and reliability of a growing gas/wind combo. My guess is that wind integration costs get steeper beyond a certain point, maybe 10% or 20% of total MWh. Meanwhile gas will get expensive with both the increasing carbon tax and the raw fuel cost.

Unless somebody can predict the exact parameters, it seems unknowable at this stage whether wind and solar can get to 20% on their own just with a carbon tax. My hunch is that they won’t without some major upheavals like new subsidies or mandates. Well before then, major electricity users like aluminium will be complaining about costs as coal stations retire. Next question: what about the other 80%?
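
For anyone who wants to check that arithmetic, here is a rough back-of-envelope sketch in Python. The inputs (roughly 1 t CO2 per MWh for pulverised black coal, a $25/t carbon price, a $33/MWh REC) are just the figures quoted above, not official numbers:

```python
# Back-of-envelope comparison of the carbon-tax penalty on black coal
# with the REC incentive for large-scale wind, using the figures quoted above.

CO2_PER_MWH_COAL = 1.0   # tonnes CO2 per MWh, pulverised black coal (approximate)
CARBON_PRICE = 25.0      # $ per tonne CO2 (proposed tax)
REC_PRICE = 33.0         # $ per MWh, recent large-scale wind REC price

coal_penalty_per_mwh = CO2_PER_MWH_COAL * CARBON_PRICE     # $/MWh added to coal
coal_penalty_per_kwh = coal_penalty_per_mwh / 1000 * 100   # cents per kWh
rec_incentive_per_kwh = REC_PRICE / 1000 * 100             # cents per kWh to wind

print(f"Carbon tax penalty on coal: ${coal_penalty_per_mwh:.0f}/MWh = {coal_penalty_per_kwh:.1f} c/kWh")
print(f"REC incentive to wind:      ${REC_PRICE:.0f}/MWh = {rec_incentive_per_kwh:.1f} c/kWh")
```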


@John Newlands, 16 April, 9.53pm.
A carbon tax gives more certainty than RECs; for example, if 25% renewable energy is built, the price for RECs may decline to zero. I think wind power is competing with low-cost base-load coal, not higher-cost OCGT and hydro peak power. A price on carbon will make it more expensive to keep coal-fired plant running during off-peak periods. OCGT can just shut down when the cost of fuel exceeds the price per kWh. Wind has to keep operating, so it is a price taker.
Why would wind power become more expensive at >20% of total MWh produced? Loss of income by wind operators would occur when off-peak load shedding starts to become significant, i.e. when total wind capacity is 100-150% of off-peak demand, unless additional pumped hydro can use some of the surplus. OCGT (and hydro) will only operate when wind (and coal) cannot supply all demand. CST with storage would be competing with OCGT to supply peak demand. A CO2 price of $25/tonne may not be high enough to make it competitive with OCGT for peak demand.


I’m wondering if anybody has thoughts on the work of Dr. Steven Wing on re-evaluating the health impacts of radiation exposures in the 10-mile zone surrounding Three Mile Island. Steve Wing is an associate professor of Epidemiology at the University of North Carolina (Chapel Hill) Gillings School of Global Public Health. Here is his CV and list of publications. As described in a 2003 paper (and on Wiki), he says he was first asked to look into research findings by lawyers working on a court case brought by 2,000 plaintiffs (regarding radiation releases from TMI). As described in the paper, he “was wary of becoming involved in the lawsuit,” mainly because he felt the trend in the media was to see such claims as alarmist (a “product of radiation phobia”) and related to an effort to extort money (and he was cautious not to associate himself with such “interested” perspectives). The 2003 paper laying out this background is titled: “Objectivity and Ethics in Environmental Health Science.”

So what did he find regarding health impacts of TMI (and his review of the Columbia study cited here and here)? His findings were published in the Environmental Health Perspectives Journal in 1997: “A reevaluation of cancer incidence near the Three Mile Island nuclear plant: the collision of evidence and assumptions.” Dr. Wing highlights short follow up, estimates of low dosage (which he and others have questioned), and selectivity of health impacts (such as categorizing leukemia and childhood cancers separately) as evidence of weak (and unreliable) assumptions in earlier research. When he attempts to correct for these weaknesses (and draws on different models) he gets a different result. His summary of research findings can be read in the following UNC press release from 1997 (emphasis added): “He and colleagues conclude that following the March 28, 1979 accident, lung cancer and leukemia rates were two to 10 times higher downwind of the Three Mile Island reactor than upwind … [Wing states:] ‘The cancer findings, along with studies of animals, plants and chromosomal damage in Three Mile Island area residents, all point to much higher radiation levels than were previously reported. If you say that there was no high radiation, then you are left with higher cancer rates downwind of the plume that are otherwise unexplainable.'”

Thought I would include this here, since many people assume (depending on whether you take the Columbia Study to be accurate or not) that there were no health consequences from TMI. Wing’s reluctance to be involved in this research, and his subsequent publications on ethics and objectivity, and on-going professional work on the issue, would seem to indicate the contrary (or at least considerable uncertainty on the issue). If anybody knows any differently, please provide a follow-up, commentary, or rebuttals to this research. When I do a citation search for the 1997 article, I find the following in more recent journals:

– “Epidemiological studies of leukaemia in children and young adults around nuclear facilities: a critical review” (Radiation Protection Dosimetry, 2008). Agrees there is a higher incidence of cancer, but states the reason for this is not yet known: “Many studies were launched to investigate possible origins of the observed clusters around specific sites, but up to now, none of the proposed hypotheses have explained them.”

– “Three Mile Island epidemiologic radiation dose assessment revisited: 25 years after the accident” (Radiation Protection Dosimetry, 2005). It also finds problems with earlier assessments, “This commentary suggests that the major source of radiation exposure to the population has been ignored as a potential confounding factor or effect modifying factor in previous and ongoing TMI epidemiologic studies that explore whether or not TMI accidental plant radiation releases caused an increase in lung cancer in the community around TMI.”

– “The cancer epidemiology of radiation” (Oncogene, 2004). Puts TMI assessments and re-evaluation by Wing (et. al.) in context of other studies, most specifically cohort studies of atomic bombings at Hiroshima and Nagasaki, radon exposure among hard rock miners, and workers in nuclear weapons programs in former USSR. Study takes issue with findings on lung cancer at TMI (by Wing), but suggests “The degree of carcinogenic risk arising from low levels of exposure is more contentious, but the available evidence points to an increased risk that is approximately proportional to the dose received.”

– “Long-term follow-up of the residents of the Three Mile Island accident area: 1979-1998” (Environmental Health Perspectives, 2003). A comprehensive account that provides the following conclusion: “Although the surveillance within the TMI cohort provides no consistent evidence that radioactivity released during the nuclear accident has had a significant impact on the overall mortality experience of these residents, several elevations persist, and certain potential dose-response relationships cannot be definitively excluded.”


Moderator: Sorry for the “throwaway” comment. I’ll be more careful. I figured that, since it was for lulz, everyone would understand the satire.

Allow me to append that comment by saying that standing under a failing turbine would be much scarier than an invisible radiation release. Also, I hope all agree that levity is good; it relieves tension and the fear of death.


John Newlands, your next and final question is a good one. That 80% is the deal breaker. That is where the funny-money schemes and the bottom-of-the-garden dreaming come unstuck.


Gene Preston, that word debt seems to be the bogey man in the US and other Western nations at the moment. Private debt is certainly of concern, but federal government debt is not.

The government of a sovereign nation controls its currency, free and unimpeded, unlike a state or local government or private citizens. Probably for ideological reasons, the US government has chosen to borrow to fund its deficit. There is no real reason to do this, as it can create money at the stroke of a computer key, and it does this all the time with due regard to inflation and other unwanted effects. As long as the amount of money available reflects the actual resources available, then there is no problem.

With unemployment approaching 10% at the lowest measure, and upwards of 40 million souls on food assistance, I hardly think that the US needs to worry about inflation for some time. These people can be put to work at useful and badly needed public infrastructure tasks, but it will take government initiative to do that.

Sadly, that initiative is lacking, for ideological reasons as well as the control of government by the wealthy 1%.

I am not picking on the US here, as this situation is common in the West, including Australia.


I think we’re a bit smug in Australia since we seem to be avoiding the worst of the global downturn. Much of that must be due to the fact that every day we send numerous boatloads of rocks to China, some of which come back as climate change. When the global Peak Oil/Peak Debt contagion affects China, they won’t buy so much from us, and then we’ll struggle just like Europe and the US.

Neil, I think wind would have to be more than half of the wind/OCGT combo to compete with pulverised black coal, even after the carbon tax. On checking this I see LCOE estimates are all over the place. $45 per MWh for black coal plus a $25 carbon tax is $70. OCGT at $105 to $130 (some say more) plus, say, an $18 carbon tax (from 0.75t CO2) comes to about double the cost of black coal. Unless gas balancing of wind can be minimised, we’ll still be using coal, just paying more for it.

That’s why, when the carbon tax fails to drastically cut emissions, I think they will bring back the RET, maybe making it 40% this time.
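
To make the coal-versus-gas comparison explicit, a similar rough sketch; the LCOE figures and the emissions intensities of about 1 t CO2/MWh for black coal and 0.75 t CO2/MWh for OCGT are the approximate values quoted above, not authoritative data:

```python
# Rough comparison of carbon-taxed black coal vs OCGT, using the figures quoted above.

CARBON_PRICE = 25.0   # $/t CO2

def taxed_cost(lcoe, intensity):
    """Generation cost in $/MWh after adding the carbon price."""
    return lcoe + intensity * CARBON_PRICE

coal = taxed_cost(45.0, 1.0)          # $45/MWh LCOE, ~1 t CO2/MWh
ocgt_low = taxed_cost(105.0, 0.75)    # OCGT LCOE range $105-130/MWh, ~0.75 t CO2/MWh
ocgt_high = taxed_cost(130.0, 0.75)

print(f"Black coal + tax: ${coal:.0f}/MWh")
print(f"OCGT + tax:       ${ocgt_low:.0f}-{ocgt_high:.0f}/MWh "
      f"(about {ocgt_low / coal:.1f}-{ocgt_high / coal:.1f}x coal)")
```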


Alternatives to fossil fuels for generating electricity.

I have tried to find actual contract prices, and failing that, estimated build prices and other costs related to operations, for four alternatives in the contiguous USA. These may already include some of the various forms of subsidies or tax incentives. The result is the busbar (generation) cost (LCOE) in US cents/kWh. CF is the capacity factor.

Solar PV: 23.4 @ CF=25%
Solar thermal: 21.7 (Mojave Desert with 4? hour thermal storage; without storage CF=25%)
Nuclear: 11.8 for Vogtle 3&4 Gen III+ Westinghouse AP-1000 @ CF=90%
Wind: 9.15 @ CF=30%+

So on cost alone to the retail utility company, wind [in windy locations] seems best just now. But alas, providing on-demand power requires backup for the wind turbines; around here combined cycle gas turbines (CCGTs) are increasingly being used for that purpose — not fossil fuel free.

Of course, there is nothing which requires society to prefer the least costly solution. Using newly constructed pumped hydro as backup for the wind resource results in an estimated levelized cost of 14.572 cents/kWh to the retail utility.
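
To put those capacity factors in concrete terms, here is a small sketch converting each technology’s CF into delivered energy per installed kW per year; the LCOE and CF values are simply the figures listed above:

```python
# Annual delivered energy per installed kW, from the capacity factors listed above.
# LCOE is the busbar cost in US cents/kWh as quoted; CF is the capacity factor.

HOURS_PER_YEAR = 8760

options = {
    "Solar PV":      {"lcoe": 23.4, "cf": 0.25},
    "Solar thermal": {"lcoe": 21.7, "cf": 0.25},   # CF without storage, as noted above
    "Nuclear":       {"lcoe": 11.8, "cf": 0.90},
    "Wind":          {"lcoe": 9.15, "cf": 0.30},
}

for name, d in sorted(options.items(), key=lambda kv: kv[1]["lcoe"]):
    annual_kwh = d["cf"] * HOURS_PER_YEAR   # kWh per year per installed kW
    print(f"{name:14s} {d['lcoe']:5.2f} c/kWh  CF={d['cf']:.0%}  ~{annual_kwh:,.0f} kWh/kW per year")
```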


John Newlands,
I think you are missing the dynamic nature of electricity pricing, in SE Australia at least. OCGT only has to operate when the price is high (peak demand), or, where there is a lot of wind capacity, when demand is high and wind output low. Coal would be competing with wind during off-peak periods and with OCGT during peak periods, unless the power stations can start and stop once or twice a day. Coal will be able to compete where it has a bulk consumer such as an aluminium smelter.
Why would wind be able to compete with coal? Because its operating costs are lower than OCGT’s during peak demand, and it can afford to sell off-peak at close to zero, whereas coal has to pay for fuel and a CO2 tax during off-peak periods.
This may well result in more expensive electricity than having 80% generated from coal, but the market is pricing electricity in 30-minute blocks, not 10-year blocks.
If we had nuclear and OCGT instead of wind and OCGT, coal would be at a similar disadvantage during off-peak periods, when nuclear would keep operating whatever the price.


@David Benson, 17 April, 9.41am.
OCGT is not FF-free, but it is used at a very low capacity factor (10-20%) so contributes a lot, lot less CO2 than coal-fired plant running at a 75% capacity factor. Of course, nuclear is also backed up by OCGT and pumped hydro.
Solar thermal with storage would be competing for peak power against OCGT not against wind or nuclear.


“Solar thermal with storage would be competing for peak power against OCGT not against wind or nuclear.”

To give the Devil his due, that is probably the sanest statement about an application for solar I’ve seen to date. However, having said that, thermal storage cheap enough and with the capacity to be economically useful could more easily be recharged by spare heat from a reactor, making the point of using solar moot.


Neil Howes, on 17 April 2011 at 10:37 AM — At least here in the Pacific Northwest with ample hydro there is very little OCGT (used just for peak shaving). In fact reserve wind does an even better job (when the wind is blowing).

Backing for wind in Germany is cold thermal with 4–7 hour startup times; that implies to me it is coal fired.


DV82XL, on 17 April 2011 at 10:46 AM — Please spec it out. What is the waste heat source that would recharge the molten salt thermal storage?

I had thought that approximately two-thirds of the thermal energy was simply lost, except for the little bit used in district heating schemes, as exemplified by the (mostly coal-fired) thermal generation in Germany.


@David, I didn’t use the term ‘waste heat’. I was thinking more of routing working heat to storage when it was not required to make steam.

However, the point is that any storage medium that would serve to level power from variable sources like wind and solar could just as easily store surplus power from an NPP to use for peaking. This is the Catch-22 of renewables – they are not practical without economic, scalable storage, but the availability of such storage would make it cheaper to recharge from nuclear, making a renewable-based system uneconomic.


Hmm there’s potential for nuclear/renewables cohabitation there. Off peak power from a reactor could charge up a storage medium, and then be utilised to back up renewables when they don’t match demand.


RE: EL’s last post above. Normally I will not respond to this commenter, as I consider him a troll; however, he has posted his last comment twice, and it needs to be answered, if only to demonstrate the kind of dissemination he engages in.

First, of course, we have the article titled “Objectivity and Ethics in Environmental Health Science”, which suggests that the TMI cancer incidence studies were conducted in the context of conflict between residents who believed they had been injured and officials who denied that such injuries were possible, and were thus subject to bias and commercial interference.

The next paper linked to, “A reevaluation of cancer incidence near the Three Mile Island nuclear plant: the collision of evidence and assumptions”, has several obvious flaws.

As usual in TMI research, the methods of guessing what the doses might have been for a given population are questionable. In this case, because 15 years had elapsed between the accident and the sampling, comparisons of chromosomal to stable aberrations were used to calibrate the dose estimated for TMI area residents. This calibration was based primarily on a group of Chernobyl emergency workers known as liquidators. In other words they looked for chromosomal aberrations to estimate dose levels, based on data from a high-level exposed group. No justification was mentioned for this method, and the details of this critical part of the study are sketchy indeed. Given that the hypothesis hinges on demonstrating a dose response, this is a considerable oversight and smells of using the conclusion as a premise.

As well, in studies of changing disease rates following a well-publicized event, heightened awareness of symptoms and surveillance by medical personnel can lead to increases in disease due to detection bias. While some effort was made to control for age, sex, and socioeconomic variables, the actual numbers of cases were small, making trends on the short time scale of the study period difficult to justify.

Next, “Epidemiological studies of leukaemia in children and young adults around nuclear facilities: a critical review” comes to the conclusion (when you read the actual paper) that: “Localized excesses of cases of childhood leukemia exist in the United Kingdom close to the reprocessing plants at Sellafield and Dounreay, and in Germany close to the Kruemmel nuclear power plant. Nevertheless, none of the multi-site studies currently available shows an increase in the frequency of leukemia overall in children and young people aged 0-14 or 0-24 close to nuclear sites.” (emphasis mine) Somewhat different from what EL implies.

Then we have “Three Mile Island epidemiologic radiation dose assessment revisited: 25 years after the accident”, which states that the challenge of adequately reconstructing past radiation exposure makes it very questionable whether the various TMI-related epidemiologic studies had sufficient power and rigor to make any claims regarding whether or not the radioactivity released during the TMI accident had a statistically significant impact on the lung cancer mortality experience of this population. This was due to the lack of control for confounding exposure from radon decay products, given that the counties around TMI have the highest regional natural radon potential in the United States.

Hardly supporting evidence for increased health effects due to the TMI incident.

Moving on, there is “The cancer epidemiology of radiation”, which is an exhaustive review of the standard work done on the subject of radiation-induced cancers and has nothing to add to the TMI question other than to restate that LNT is considered a reasonable model for low-dose exposure.

Finally, “Long-term follow-up of the residents of the Three Mile Island accident area: 1979-1998” actually concludes (again, if you read the whole paper) that: “the mortality surveillance of this cohort, with a total of almost 20 years of follow-up, provides no consistent evidence that radioactivity released during the TMI accident (estimated maximum and likely gamma exposure) has had a significant impact on the mortality experience of this cohort”.

In short, nothing in the studies posted above supports any measurable increase in cancers in the TMI ‘downwinders’ population that cannot be dismissed as experimental error, detection bias, or statistical noise.


“but the availability of such storage would make it cheaper to recharge from nuclear, making a renewable-based system uneconomic.”

By the same token, it’s cheaper to use coal for recharging.

I understand that everyone here is trying to quantify coal, whose externalities are unacceptable even if it were free on paper.

That is pretty much the same argument the anti-nuclear crowd makes: nuclear waste, the costs of decommissioning, the costs of meltdowns, whether level 7 or level 5, the death of innocent civilians (children most vulnerable).

PV and modern wind have no externalities, but their problem is pretty much all on paper ($$$). Even the installation casualties can be eliminated with CHEAP safety standards (rope and harness).


Enviromentalist, on 17 April 2011 at 1:07 PM — The LCOE figures, some actual and some estimates, that I posted earlier include the mandated fee into the decommissioning fund for nuclear. AFAIK solar and wind operators are free to abandon their installations when no longer operable.

As has been mentioned many times on this site, spent fuel rods are a valuable resource for reprocessing.

Both PV and wind have externalities which you may discover on the ExternE site:
http://www.externe.info/


DV82XL wrote:

This calibration was based primarily on a group of Chernobyl emergency workers known as liquidators. In other words they looked for chromosomal aberrations to estimate dose levels, based on data from a high-level exposed group. No justification was mentioned for this method, and the details of this critical part of the study are sketchy indeed.

There is extensive discussion of this in the paper, and of why the authors draw on this approach. It’s not to make hard and definitive associations between dose levels and cancer incidence (this is difficult and near impossible to do without adequate monitoring of exposure levels around the accident site, and comprehensive follow-up with residents). Instead, they use this approach to disprove a faulty a priori assumption of early studies that they suggest leads to an erroneous result … namely, that there was no environmental release of radiation at TMI above normal background levels for the area. They provide two lines of evidence to discount this assumption: 1) anecdotal reports of hair loss, dead pets, and vomiting from residents at the time of the accident (which they associate with higher than background levels of radiation exposure), and 2) cytogenetic analysis of 29 persons who lived near TMI and “reported erythema, vomiting, diarrhea, and other symptoms at the time of the accident.”

Why does this matter … Wing provides more detail in a follow-up comment (pp. A 546 – A 547): “Both Talbott et al. (1) and Hatch et al. (3,4), who reported on the Columbia University studies of cancer incidence, began with the assumption that the maximum possible radiation doses from the accident were well below average annual background radiation levels. Even if standard radiation risk estimates are underestimated by an order of magnitude or more, such doses would be associated with very small increases in cancer in a general population with heterogeneous susceptibility (2). Given the measurement constraints of epidemiologic studies, it would not be possible to detect an accident-related increase in cancer at the dose levels assumed by these authors. Thus, when they find increased cancer rates among residents assumed to have received relatively higher radiation doses from the accident, such as the significant linear trend in female breast cancer (1), the authors must conclude that the association is not due to the exposure they are studying.”

There thus appears to be a problem of definition and assumptions here, and not one of measurable results and conclusions. The problem remains: if we are to take Talbott and Hatch (reporting on the Columbia studies) as definitive, how are we to explain higher cancer incidence levels among downwinders when compared to other populations (sharing similar age, gender, and socio-economic factors … but differing only in geographic exposure to the radioactive plume)? Claiming that there is a detection bias, or that residents are somehow imagining their “cancers”, doesn’t really do it for me. Most researchers accept these epidemiological facts; they simply explain them with different models and assumptions (or don’t explain them, as is the case with Talbott and Hatch).

DV82XL wrote:

In short, nothing in the studies posted above supports any measurable increase in cancers in the TMI ‘downwinders’ population that cannot be dismissed as experimental error, detection bias, or statistical noise.

I appreciate that you looked at the other sources in the original post. I did not include them to suggest they confirmed the analysis by Wing. Quite the contrary. I’m interested in what readers here have to say about Wing. They were not cherry picked, but were included as the most recent papers citing Wing (1997), and as some measure of the response of peers to the paper (a crucial aspect of “peer review”). One of them disputes his analysis with respect to lung cancers (I indicated this in my summary). But they also appear to agree on the broader questions raised by Wing … the work is unfinished on health impacts and there are unanswered questions in the epidemiology (and I indicated where this was the case in my original comment). If it was only statistical noise or detection bias, I am not sure why other researchers are repeating it, and also expressing concern with data that appears to have no explanation.


Solar/wind decommissioning is hardly comparable: the land is reusable and the materials recyclable, whereas nuclear’s low footprint is not reusable, and the steel structure is a radiation hazard.

“As has been mentioned many times on this site, spent fuel rods are a valuable resource for reprocessing.”

Pie in the sky: it has the problems of renewables (cost and low capacity) and the same problems as traditional uranium, meaning the nuclear industry will prefer to build profitable uranium reactors. Also, it only gets rid of fissionable actinides; the rest of the isotope cocktail we are seeing released at Fukushima remains as nuclear waste. Again, it is positive that the waste per watt is much lower, but there is waste.

Click to access rr08.pdf

Another externality is the safety of old reactors. Because of electrical blackmail, power companies do not decommission old plants unless forced to by government. New nuclear does not replace old; that is silly (a common attack on the anti-nuclear crowd). At best it just replaces dirty coal, but most likely it is just coping with increasing demand. It does not matter what generation a reactor is: the older it is, the weaker the entire structure, considering the constant neutron radiation. A PV panel past its warranty will still produce power without any safety risk.


Renewables are the answer. Germany keeps doubling installed PV capacity almost every year or two, nearly keeping pace with Moore’s law! This follows the popular prediction on PV prices. This from a country with low average solar irradiation. Sure, it’s peak power for now, but eventually, when storage is mastered with hydro or biogas, it has the potential to be the only base load the planet would ever need in the foreseeable future. In the meantime wind plays an important role too, maybe even a natgas stopgap if needed.

http://www.renewableenergyworld.com/rea/news/article/2011/03/new-record-for-german-renewable-energy-in-2010??cmpid=WNL-Wednesday-March30-2011

The only REAL problem is cost; scale is perfect since it is basically silicon and energy, and PV is itself a net energy producer.

There is no need to saddle future generations with nuclear waste.


@EL, There was extensive discussion of why they chose this approach to estimate dose levels; however, an explanation justifying why it is valid is missing. Furthermore, anecdotal reports are worthless, especially from a physiologically stressed population, and a sample space of 29 persons with self-reported symptoms is suspect in both size and validity, and at any rate was not matched with proper controls.

The bottom line is that without proper dosimetry, with reliance on uncontrolled sources for critical data, AND given the small amplitude of the actual deltas, it is difficult to see what, if anything, is present in terms of some effect that can reliably be assigned to the event at the TMI NPP.

It boils down to this: in almost all cases where there are claims of health impacts from exposure to radiation from nuclear power stations, finding a positive correlation, in all but the most obvious cases where high exposure was involved, requires jumping through hoops with the statistics. However this is not necessary when looking at the impacts of coal burning, where the evidence of wide scale harm is strong and unassailable.

As to why this continues to be a subject of interest, you know as well as anybody how funding research works, and why some topics have funds available, and others do not. The fact remains that there are those that wish to keep this topic current and find it expedient to fund research that yields marginal results that can be spun into something that appears significant in the public media.


DV82XL wrote:

The bottom line is that without proper dosimetry, with reliance on uncontrolled sources for critical data, AND given the small amplitude of the actual deltas, it is difficult to see what, if anything, is present in terms of some effect that can reliably be assigned to the event at the TMI NPP.

This is the crucial point … is it not? And it would seem to apply equally well to studies showing no correlation as to those suggesting a strong relationship between dose levels and cancer incidence. Should we throw out all research on TMI because proper dosimetry was never done, or try to establish better methodologies for working with the material that was collected (and acknowledge where there are weaknesses and where these are likely to show up and trend in conclusions)? In fact, it’s the paucity and inconsistency of the data that leads Wing and others to search for other methodologies for arriving at more accurate results (or approaches that better account for unexplained findings or anomalies).

You claim the deltas are small, but that assumes that conservative estimates for short range projections are correct, and will continue to hold for long range projections (as well). That’s a lot of assuming (and a lot of wishful thinking on conservative estimates), when it’s measurable and repeatable (or independently verifiable) results that we are after. And this debate (in the peer literature) is all about that … coming to reliable conclusions about the health impacts of the accident at TMI today, and in the future (and closing the gaps on outstanding questions).

For those who haven’t downloaded the article, here’s their statement on delta curves (from the abstract), and where they think this points with respect to future research, on-going questions, and revised conclusions: “Considering a 2-year latency, the estimated percent increase per dose unit +/- standard error was 0.020 +/- 0.012 for all cancer, 0.082 +/- 0.032 for lung cancer, and 0.116 +/- 0.067 for leukemia. Adjustment for socioeconomic variables increased the estimates to 0.034 +/- 0.013, 0.103 +/- 0.035, and 0.139 +/- 0.073 for all cancer, lung cancer, and leukemia, respectively [see Table 2 and 3 on pg. 55]. Associations were generally larger considering a 5-year latency, but were based on smaller numbers of cases. Results support the hypothesis that radiation doses are related to increased cancer incidence around TMI. The analysis avoids medical detection bias, but suffers from inaccurate dose classification; therefore, results may underestimate the magnitude of the association between radiation and cancer incidence. These associations would not be expected, based on previous estimates of near-background levels of radiation exposure following the accident.”
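
As an aside for readers trying to interpret those numbers: assuming the quoted ± values are standard errors and the usual normal approximation applies, a quick sketch of the approximate 95% confidence intervals (for the 2-year latency, socioeconomically adjusted estimates) looks like this. This is my own illustration of how to read the figures, not a calculation taken from the paper:

```python
# Approximate 95% confidence intervals (estimate +/- 1.96 * standard error)
# for the adjusted dose-response estimates quoted above. Illustration only.

estimates = {
    "All cancer":  (0.034, 0.013),
    "Lung cancer": (0.103, 0.035),
    "Leukemia":    (0.139, 0.073),
}

for outcome, (est, se) in estimates.items():
    lo, hi = est - 1.96 * se, est + 1.96 * se
    print(f"{outcome:12s} {est:.3f} +/- {se:.3f}  "
          f"approx. 95% CI ({lo:.3f}, {hi:.3f})  excludes zero: {lo > 0}")
```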


I hope you do not mind me introducing a different question regarding Plutonium; I came across the statement attributed to GT Seaborg, who discovered Plutonium, that it is the most dangerous substance on earth. If you want an emotional argument, then finding someone or something to demonise is an effective strategy. So my question is whether Plutonium deserves the claim stated by Seaborg and promoted by those who oppose nuclear power generation.

I looked at http://en.wikipedia.org/wiki/Plutonium for some basic information, but did not find the claim. I found that statement that “The U.S. Department of Energy estimates that the lifetime cancer risk for inhaling 5,000 plutonium particles, each about 3 microns wide, to be 1% over the background U.S. average.” The paragraph went on to say “no human is known to have died because of inhaling or ingesting plutonium, and many people have measurable amounts of plutonium in their bodies.”

So, does Plutonium deserve the fear that is being promoted about it? Helen Caldicott seemed to link this danger with any internalisation. I would have thought the statement would have been in relation to nuclear weapons. Is the statement that it is the most toxic substance on earth completely taking Seaborg out of context?


@Environmentalist :

Your statement on the replacement of older reactors somewhat oversimplifies the actual situation. The only unreplaceable part of a nuclear reactor is the pressure vessel. It is the only part that receives neutron doses and continuously ages because of them (neutron embrittlement), while the other materials are regularly replaced. The pressure vessel integrity is surveilled by a dedicated program, in which pieces of the steel and welds from which the pressure vessel was built are irradiated in the reactor core under the same neutron flux as the pressure vessel itself. Since the capsules in which these samples are irradiated are slightly closer to the core than the vessel itself and they are put in the reactor from startup, they actually run ahead of the vessel ageing (this is called the lead factor). Every 3-4 years, a capsule is drawn from the reactor and the mechanical properties of the samples are tested in a laboratory, from which they derive what the expected lifetime of the vessel can be. It is mostly because these datasets did not exist when the first reactors were built that reactor lifetimes were set at 30 years for many reactors. Now we know that, at least as far as the vessel is concerned, the safe exploitation of many plants is assured for over 60 years.

Of course, there is one more thing that cannot be changed, at least not in practice, which is the general layout of the reactor. Certain choices that were made (such as the location of the spent fuel pools or the availability of water and passive safety features) definitely have been much improved in today’s designs. The initial investment cost for a nuclear plant is indeed the principal contributor to the cost of nuclear power, while the fuel cost is more important for fossil power. However, taking into account the statistics of nuclear accidents and their consequences, expecting utilities to replace an older power plant with a newer model only for those reasons would be like expecting you to replace your car every time a new model comes out with better airbags. If one considered his/her chance of dying in a car accident sufficiently large, he/she would probably do it… Accidents with nuclear power plants are hard to compare with car accidents, I agree, but your chance of dying in your car is still quite large.

As for PV and wind, they surely should be a bigger part of the energy mix, but it is my conviction that a balanced energy mix, including nuclear and even fossil, is our best option.


The real problem with solar is that it’s not there 80 to 90 percent of the time, depending on how sunny your area is. In my country, the Netherlands, not very sunny, the best-performing PV installations (the ones that are decent at converting diffuse insolation, have ideal installation angles and get daily cleaning by an enthusiast) get about 1000 kWh/kWpeak per year. This is about 12 percent capacity utilisation (0.12 capacity factor).

Our single, old, primitive nuclear plant, Borssele, gets 7000 kWh/kWpeak per year, 7x as much, and it lasts longer (60-year operating licence). This means this power source IS there 80% of the time, whereas solar isn’t there 88% of the time. This is unacceptable, so the practical alternative is to burn a lot of natural gas. I don’t see how that’s sustainable: too many greenhouse gases, importing from unstable countries/dangerous regimes, not good. I worry a lot about this fossil fuel lock-in. It’s one of the reasons I like nuclear so much.
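
The capacity factors quoted here follow directly from the annual yield per installed kW; a quick sanity check, using the yields given above:

```python
# Sanity check: capacity factor = annual yield (kWh per kWpeak) / hours in a year.

HOURS_PER_YEAR = 8760

for name, annual_kwh_per_kwp in [("Dutch PV (best installations)", 1000),
                                 ("Borssele nuclear plant", 7000)]:
    cf = annual_kwh_per_kwp / HOURS_PER_YEAR
    print(f"{name}: {annual_kwh_per_kwp} kWh/kWpeak per year -> CF ~ {cf:.1%}")
```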


@EL – Well, to start off with, there are no strong correlations shown, only rather weak ones. Strong correlations are the ones shown between tobacco use and pulmonary system cancers, or between obesity and late-onset diabetes. In those cases, one doesn’t need to massage the data or rely on sketchy assumptions. Not only that, but the actual number of cases in the sample space is very small for the conclusions being drawn in these TMI studies, and that in itself is fertile ground for breeding statistical artifacts. So perhaps the answer is that rather than construct a house of cards out of the poor data available, it might be better to admit that little can be determined with a reasonable level of confidence with the available tools. The fact is the bodies are not piling up such that a high level of concern is warranted, and there is little scientifically to be gained pursuing this question.

As for the latency argument, frankly it is getting stale. I find it disingenuous to continue to invoke latency every time actual results fail to meet the dire predictions made previously. We were told shortly after the event, when the immediate death toll was found to be minimal, that the full impact would not be felt for twenty years. Thirty-plus years later, the Cassandras are now saying it could be as much as sixty years before the damage appears or maybe several generations in the future. At what point do we accept the fact that the impact of this accident has not been anywhere as serious as it was assumed it would be?

Also, the statement “The analysis avoids medical detection bias” is another case in this study where we are expected to take the authors’ assertions at face value, with little in the way of substantive evidence to support them. Like the dose estimations, there is an element of hand-waving here that I find suspect.

This has nothing to do with real research, and everything to do with keeping anxiety levels high over nuclear energy by smearing FUD around. I can see straight through these ‘studies’, as anyone with a scientific background can, thus the only possible utility for these things is as propaganda.


On plutonium: I was at the hardware store looking at a display of smoke detectors (I was looking for a CO detector, as I use firewood). The cheaper ‘ionisation’ type alarms have less than a microgram of Americium-241, a decay product of Pu-241. It is removed before making MOX fuel, I gather to reduce gamma ray emissions.

The price of optical or non-ionising smoke alarms generally seems to be a little higher. The radiation symbol is usually minuscule, so millions of people may be unaware they are, in effect, using plutonium. I understand old alarms can be legally thrown in the garbage and, like mercury in CFL bulbs, the dilution is adequate.
Refs http://www.world-nuclear.org/info/inf57.html and http://en.wikipedia.org/wiki/Plutonium


@NukeMaterialsSpecialist:

Thanks, but a 57 page technical report from 1995 on setting safety standards for workers doesn’t really address what Seaborg meant about the dangers of Plutonium, except that if it was the most toxic substance on earth you would not be wanting to inject it into people or to get them to ingest it at all. Is that how you read it?


I’ve got about 370,000 Becquerels of Am-241 hanging in my house, in smoke detectors (10x 1 microcurie).

Here is a link to the Manhattan bomb builders that handled lots of plutonium and subsequently received quite large doses due to plutonium. Plutonium turns out to be so dangerous that fewer of the plutonium workers died than would be expected from average US population. In fact it is so dangerous that the level of mortality in the plutonium workers is similar to that of non-plutonium workers. Gee.

http://www.ncbi.nlm.nih.gov/pubmed/9314220

“Twenty-six white male workers who did the original plutonium research and development work at Los Alamos have been examined periodically over the past 50 y to identify possible health effects from internal plutonium depositions. Their effective doses range from 0.1 to 7.2 Sv with a median value of 1.25 Sv. As of the end of 1994, 7 individuals have died compared with an expected 16 deaths based on mortality rates of U.S. white males in the general population. The standardized mortality ratio (SMR) is 0.43. When compared with 876 unexposed Los Alamos workers of the same period, the plutonium worker’s mortality rate was also not elevated (SMR = 0.77). The 19 living persons have diseases and physical changes characteristic of a male population with a median age of 72 y (range = 69 to 86 y). Eight of the twenty-six workers have been diagnosed as having one or more cancers, which is within the expected range. The underlying cause of death in three of the seven deceased persons was from cancer, namely cancer of prostate, lung, and bone. Mortality from all cancers was not statistically elevated. The effective doses from plutonium to these individuals are compared with current radiation protection guidelines”


Thanks Cyril R. It looks like neither radioactivity nor toxicity makes it the most dangerous substance on earth. It seems most likely that Seaborg was referring only to its danger in nuclear bombs. I would have thought that an isotope with a half-life of 80 million years would not be exceptionally dangerous due to its radioactivity alone, even if it concentrates in bone material. If there is something missing in this logic, I hope somebody will enlighten me.


@Robert Lawrence – The toxicity of Pu is greatly exaggerated; as toxic metals go, arsenic, cadmium, mercury and beryllium beat it hands down as acute systemic toxins, and polonium, among others, is more radiotoxic, although this is somewhat dependent on the specific isotope. From a purely chemical standpoint, Pu is about as poisonous as lead.

Botulinum toxin is the most acutely toxic substance known, with a median lethal dose of about 1 ng/kg when introduced intravenously and 3 ng/kg when inhaled.


I came across the statement attributed to GT Seaborg, who discovered Plutonium, that it is the most dangerous substance on earth.

To take an educated guess here… did you hear that from Helen Caldicott?

Like most things Caldicott says… you probably cannot find any source that actually shows that it is true.

Now, let’s compare this with the following genuine, legitimate, citation-provided transcript of an interview with Seaborg:

“Q: Now, plutonium, this substance that you did your work on, has come to be demonized in our society, both for its proliferation potential, but also many environmentalists talk about it as “the most toxic substance in the world.”

A: The number of (I guess you’d call them) environmentalists characterize plutonium as the most toxic substance in the world. That, of course, is nonsense. There are many toxins and viruses that are more toxic than plutonium, that lead to immediate death if taken in amounts equal to what they’re talking about as the toxic amounts of plutonium. There have been scientists, as a result of accidents, dating clear back to the war, who have ingested plutonium up to the level of what is considered tolerable amounts. And some of those are still alive, 50 years later.

Whereas, if they had ingested an equal amount of some viruses or toxins, they would have died immediately. So it’s just nonsense to speak of plutonium as the most toxic substance in the world. It’s not anywhere near it, not in the ballpark of being near that toxic a substance, when people who ingested it 50 years ago are still alive.”

Lots of other interesting stuff in that interview, too.

http://www.pbs.org/wgbh/pages/frontline/shows/reaction/interviews/seaborg.html


DV82XL wrote:

So perhaps the answer is that rather than construct a house of cards out of the poor data available, it might be better to admit that little can be determined with a reasonable level of confidence with the available tools.

I agree with you. This is a fair assessment. But the risk of building a “house of cards” around the null hypothesis seems to me equally likely, and it has a far greater potential to cause harm (and mislead) than statistical assumptions that look at evidential anomalies and follow a cautious and precautionary principle (which you are comfortable writing off as FUD). [deleted personal appraisal of probabilities, not supported by evidence]. Instead of turtles all the way down, it appears we will have to be content with assumptions (and hope that no great damage gets buried in the statistical noise).
MODERATOR: edited


http://www2.timesdispatch.com/news/2011/apr/17/apparent-tornado-causes-surry-nuclear-reactors-shu-ar-978675/

The plant in Surry was shut down temporarily due to a nearby tornado (well, not so near, across the James and up the Chesapeake coast a ways, in Gloucester). Just something I never considered about US nuclear plants… we don’t often have severe earthquakes in the East, and no tsunamis, but we do get severe storms with tornados (as well as hurricanes with storm surges and flooding).

And the US plants have the same risks… if the diesel generators get taken out somehow, there’s just hours of battery power left before things can get bad.

Hurricanes are sometimes so powerful they can overwash a barrier island and create a new inlet within hours. This happened at least a couple of times in the last decade. We have some nuclear plants on barrier islands. Our plants are built to withstand hurricanes and tornados, but I’m not sure how a plant could be built to withstand a storm that is powerful enough to wash away the land underneath it.

Does anyone still really believe that a Fukushima-style incident can’t possibly happen in the US?

I am search engine-challenged (help!) so I can’t find the story (a summary was on slashdot a year or two ago) about the discovery (or invention?) that a series of spaced pillars placed in the ocean surrounding an island or structure could cause giant waves to go around the island (using interferometry?) Perhaps these should be installed off the coast of any coastal nuclear facilities. Sounds like an ounce of prevention to me.


Others in South Australia will be interested in a new website by Ben Heard called Decarbonise SA. This will promote nuclear power as a replacement of fossil fuels in our state. It will be well worth supporting.

http://decarbonisesa.com/


shamus, on 18 April 2011 at 5:22 AM said:

And the US plants have the same risks… if the diesel generators get taken out somehow, there’s just hours of battery power left before things can get bad.

Exactly what nuclear plants in the US are at risk of a flooding event? Exactly how long do the emergency cooling steam-driven pumps operate? What measures have or have not been taken at those specific plants to minimize these risks? Are there additional cooling systems staged nearby, i.e. fire engine pump trucks? Are ‘plug-compatible’ generators available nearby?

There are lots of questions that need to be asked and answered before a judgment can be made on even one specific plant.


Enviromentalist, on 17 April 2011 at 3:16 PM — Recycling spent fuel rods is done in France, Russia and Japan. I think China either does now or will soon.


>There are lots of questions that need to be asked and answered before a judgment can be made on even one specific plant.

I think I can only answer the first question:
>Exactly what nuclear plants in the US are at risk of a flooding event?

The ones along the coast, especially the ones on barrier islands between the ocean and the intracoastal waterways. But I think you missed the point of what I (and the NYT) meant by “same risks.” As far as I know, most of the nuclear power plants in the US are of such similar design to Fukushima that if these plants lose power (by whatever means: fire, flooding, tornado, hurricane, earthquake, meteorite, or terrorist) the same backup safeties are in play… diesel generators and battery backups.

But of course you are right… generalizing as I am does not allow us to make any sweeping judgements about all the plants… there are only 104 or so; they’d have to be individually assessed.


The “Decarbonise SA” site looks good, although of course there is not much content yet.

I’m quite fond of the idea of building up a detailed plan, analogous to the Zero Carbon Australia plan, except with better science, better assumptions, better skepticism and without the anti-nuclear dogma, meaning much lower prices and more realistic availability of the technology when nuclear energy is included in the system.


@Huw Jones

Sounds good, but what is the storage medium? As I understand it, PWRs would not be hot enough for molten salt storage, which leaves pumped hydro as the only viable option for now.


Huw Jones, on 18 April 2011 at 11:06 AM — Probably using load following NPPs together with just a bit of pumped hydro for peak shaving is more economic.


And the US plants have the same risks… if the diesel generators get taken out somehow, there’s just hours of battery power left before things can get bad.

To further harrywr2’s comment, the Japan experience shows that you need an event so powerful that it completely knocks out external AC power supply to the plant (i.e. cripples multiple other electricity plants as well as transmission infrastructure), as well as the backup diesel generators. While not impossible (as shown by what’s happened in Fukushima), it takes an incredibly mighty event to do so, and some fairly simple steps can be taken to minimise the risk of it happening again (you learn from the past).

I will also contend that with the passive safety features of modern reactors, this argument becomes more and more irrelevant. And anyone that argues we should be prioritising the replacement of any existing, operating nuclear power plants over fossil fuel plants, simply isn’t worth engaging with.


That several topics become interwoven on an open thread is inevitable and I have no difficulty with following up to three (rarely four) different subthreads at the same time. When there are more than that many I simply skip the ones of lesser personal interest.


As a former ‘safstrine’ I think the Decarbonise SA movement is overdue. Several things trouble me about the State:
– dependence on distant tropical storms for water supply
– dependence on old industries like defence and auto manufacturing
– the talent exodus, dare I suggest starting with Rupert Murdoch.

SA’s energy situation is parlous. It has Australia’s biggest gas user, the Torrens Island baseload plant, and possibly the creakiest coal station, Playford B at Pt Augusta. The much-vaunted 867 MW of installed windpower depends on RECs, yet struggles to produce 70 MW in recurring heatwaves, when the State needs to import a GW of power over the border. Could be why ETSA wants to put radio controllers on air cons.

It reeks of desperation; they pinned their hopes on granite geothermal, which didn’t pan out. Now they think fracking will extend the life of the Cooper Basin so they can build a new gas baseload plant, ‘Cherokee’. Another wild idea is UCG or coal mining in the Arckaringa Basin. Maybe CO2 isn’t a problem. Meanwhile, the world’s largest uranium deposit, Olympic Dam, can’t expand, and the Chinese will get some uranium out of the copper extract. I’d call that ‘value subtracting’.

I think SA should go with uranium mining and enrichment. Tie in with a trans-Nullarbor HVDC link. Get the hi-tech people away from slow diesel submarines and into pressure vessels and nuclear contracting.


>While not impossible (as shown by what’s happened in Fukushima), it takes an incredibly mighty event to do so

Yes… but these mighty events are not so rare. Hurricanes Hugo and Isabel both ripped new inlets in barrier islands off SC and NC, and if I’m not mistaken, Hurricane Frances did the same in FL. There can’t be too many plants operating on barrier islands, but there are at least two in FL that I know of, and I believe there may be one in NC. If a hurricane rips a new inlet underneath one of those plants, I don’t think any AC lines, generators or batteries would survive. And you may think the chances of a hurricane hitting a bullseye like that are pretty slim, but I’m pretty sure two of the 2004 hurricanes made landfall within a mile of each other not far (10 miles?) from the St. Lucie plants, which surprised even the meteorologists.

So my thin point is, unlikely as a hurricane ripping an inlet underneath an operating plant seems, we should probably not be so quick to dismiss what we in our arrogance believe is only a shadow of a remote possibility (which you agree is what the Fukushima builders did… “9.0 earthquake? 30m wave? impossible!”). And that’s just hurricanes. I’d like to see anything survive an F4 or F5 tornado, or one of those gigantic fires that show up in drought conditions we’re used to seeing in CA, now we’re seeing in TX.

These are events that cannot be predicted, but that doesn’t make them rare. There is nothing we can do about a natural catastrophe, but that doesn’t make it unworthy of serious consideration. The likelihood of a natural catastrophe occurring in any particular place is just as great as of it occurring in any other particular place (damn Nature, you scary!!).

Like

How low does the LCOE for wind power have to be so that with pumped hydro backup it can become more cost effective than nuclear power generation?

Using the estimates I’ve previously posted, the LCOE of wind power, c, has to be less than the value determined by the equation

0.32c + 0.68(5.6+c/0.80) = 11.8

which is c = 6.83 UScents/kWh. That price, without incentives, appears possible within the decade.

The dubious part is finding enough suitable land to sacrifice for the pumped hydro reservoirs; I don’t expect there to be much of it for the pumped hydro incremental cost of 5.6 UScents/kWh.
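
For anyone who wants to check that arithmetic, here is a minimal sketch in Python; the 0.32/0.68 split, the 5.6 cent pumped-hydro increment, the 80% round-trip efficiency and the 11.8 cent reference are simply the figures quoted above.

```python
# Solve 0.32*c + 0.68*(5.6 + c/0.80) = 11.8 for the break-even wind LCOE c,
# in US cents/kWh: 32% of wind output used directly, 68% routed through
# pumped hydro at 80% round-trip efficiency with a 5.6 c/kWh increment,
# against an 11.8 c/kWh reference cost.
direct_share = 0.32
stored_share = 0.68
storage_increment = 5.6   # c/kWh
round_trip_eff = 0.80
reference_cost = 11.8     # c/kWh

c = (reference_cost - stored_share * storage_increment) / (
    direct_share + stored_share / round_trip_eff)
print(round(c, 2))  # -> 6.83
```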

Like

And for the record, DV82XL and EL, both you guys are my heroes. In order for the best conclusions to be made, there must be people that are accurate and intellectually honest that disagree with each other, and your analysis and arguments and disagreements are exquisite. I hope we get to see more.

Like

@ shamus

I’m not quite sure what you’re trying to say in your last post. That large natural events can happen and cause mass destruction?

Even when taking into account the absolute worst case scenarios (i.e. Chernobyl and Fukushima, both based on 40+ year old designs, vastly inferior to modern ones), over 14,000 reactor-years of experience globally has proven nuclear power to be the safest, most reliable and scalable way of generating electricity. Obviously things can be done better, but it’s silly not to put one form of power generation into perspective with other types. Not to mention into context with the reality of life without access to plentiful energy.

And just to comment on your idle speculation re tornadoes and fire: if a nuclear power plant is designed to survive a fully fueled Boeing 767 impact and the strongest of hurricanes, it’s difficult to imagine a tornado doing anything other than cutting the external power supply (in which case you have backup generators to keep cooling systems operating – unless hit by a 14 m tsunami, apparently, and even then, better design [as in modern reactor designs] could have avoided this). Davis-Besse NPP, for example, rode out a direct tornado strike in 1998 with a loss of offsite power that its backup systems handled. Nuclear power plants are extremely robust industrial infrastructure, not houses. And cement and steel aren’t flammable…

Like

>I’m not quite sure what you’re trying to say in your last post. That large natural events can happen and cause mass destruction?

Precisely. It’s what we can’t plan for that will get us. Granted, this particular argument against nuclear energy doesn’t hold a candle to the economic argument, but I still think it has merit. Dams breaking, a refinery fire, a turbine flying apart… these kinds of things happen, and then they’re taken care of reasonably quickly… in a month, a few years (ah, well, there is the Gulf/BP thing, isn’t there… what a mess). A natural catastrophe is bad enough… but if it can affect the stability of a nuclear reactor and there are problems, then there is this lingering danger. In the case of Fukushima, with multiple reactors, if all goes according to plan, at least another 6-9 months, but something tells me it is equally possible the trouble may linger for far, far longer, and the area may remain dangerous.

>over 14,000 reactor years experience globally has proven nuclear power to be the safest, most reliable and scalable way of generating electricity.

If there were 14K reactor-years since Fukushima, you might have had a decent point. But I certainly can’t subscribe to the idea that because nearly all other reactors haven’t had an issue, it somehow mitigates what happened at Chernobyl or is still happening at Fukushima. The O-rings worked out fine for all the shuttle flights before Challenger’s last. There was only a single failure. Should that excellent track record have permitted NASA to continue using them? (rhetorical, and perhaps a poor analogy… best I can come up with at the moment).

>it’s silly not putting one form of power generation into perspective with other types. Not to mention into context with the reality of life without access to plentiful energy.

There are other energy sources. Most that I can think of don’t do the things that nuclear power plants do when things go really wrong because an unexpected event causes cascading failure. 

>it’s difficult to imagine a tornado doing anything other than cutting external power supply

A tornado can empty a large pond in seconds. It can tear steel pipes out of the ground. Ever seen a piece of straw impale hardwood? Very strange. Tornadoes turn things like bulldozers into projectiles. You never know what to expect from the most powerful tornadoes.

http://en.wikipedia.org/wiki/Tornado_intensity_and_damage
“Above-ground structures are almost completely vulnerable to F4 tornadoes, which level well-built structures (including stone and reinforced steel buildings)”

>Nuclear power plants are extremely robust industrial infrastructure. 

I realize this, and I am thankful for it. But all it takes is enough energy (like in a natural catastrophe), and anything mankind can build can be destroyed. So look at other energy sources and ask if any of them can cause the scale and scope of disaster that a nuclear plant can create once it is destroyed, as well as the time and cost it takes for it to be made safe.

>And cement and steel aren’t flammable…
no way to tread lightly here… then why isn’t the WTC still standing? Because fire weakens cement and steel, and can weaken them to the point that they will fail. Perhaps the plant designs can withstand the impact of a 767, but whether they can stand up to the ensuing fuel fire is another question (and I’m not a materials engineer or any other kind of engineer, so I shouldn’t speak to it). But perhaps your point is that surrounding the plants there is very little local fuel for a large fire in a mostly concrete facility. Fair enough.

This argument need not go much further… I think I’ve beaten it to death by now. It’s a small point to make, especially compared to the other arguments, and I think you and others must have it by now.

Like

DV82XL and supporters
I am surprised the moderator didn’t cut you short earlier for breaking the commenting rules, including DV8 saying the following to EL in different comments – to me they break the rules on ad hominem attacks and attributing motives to a person.

Normally I will not respond to this commenter, as I consider him a troll, however he has posted his last comment twice, and it needs to be answered, if only to demonstrate the kind of dissemination he engages in.

AND

This has nothing to do with real research, and everything to do with keeping anxiety levels high over nuclear energy by smearing FUD around. I can see straight through these ‘studies’, as anyone with a scientific background can, thus the only possible utility for these things is as propaganda.

AND

As to why this continues to be a subject of interest, you know as well as anybody how funding research works, and why some topics have funds available, and others do not. The fact remains that there are those that wish to keep this topic current and find it expedient to fund research that yields marginal results that can be spun into something that appears significant in the public media.

EL has not, as far as I can see, similarly broken those rules, but a couple of commenters have claimed his links did not prove what he asserted they did. Doesn’t that break the citation policy?

Get over yourself DV8 and give the moderator credit for a hard job done well. I have certainly noticed an improvement in the tenor of BNC since moderation was instituted.

Like

David Benson, 18 April 2pm,
Wind doesn’t have to be cost-competitive with nuclear in Australia or the US, because no additional nuclear is being built and coal-fired plants are nearing the end of their lifetimes.
What is LCOE?
Australia has some very large storage reservoirs suitable for pumped hydro; no new dams or land is required, just reversible turbines and tunnels. Present storage capacity of hydro is >24,000 GWh.

Like

If we talk risk, we have to talk acceptable risk. Risk will never be zero.

It seems to me a reasonable definition of acceptable risk for a nuclear plant is when it doesn’t add much damage to an event. It has to be ‘marginal’ in actual damage (it’s much harder to be marginal in the eyes of the media, of course).

Japan was hit by a natural disaster that killed over 25,000 people and caused something like $200-300 billion or more in financial losses.

The radiation from the nuclear plant is unlikely to kill anyone among the public; some workers may be at risk of increased cancer incidence. The cost of decommissioning the reactors is probably in the range of $10-20 billion. There is some added financial damage in not being able to sell certain foods.

Seems to me that the risk to public health is minimal and financial damage cost is quite small compared to total financial damage.

The radionuclides in seawater are total media hype and of no long term concern to humans or ecosystems. The only real concern is the area northwest of Fukushima that received some fallout, in particular radiocaesium. From the maps it might be more than 100 square km that has to be at least temporarily closed and perhaps largely decontaminated by removal of vegetation and topsoil. This could be very expensive, and cannot be considered a marginal risk.

Like

John Newlands, 18 April 12.40pm
ETSA wants to put controllers on A/C because SA’s peak demand (>3200 MW) is more than three times off-peak (<1000 MW), and a lot of this is due to A/C load in summer heatwaves.
And SA has no hydro or solar, both of which would be good for summer peak demand.
Most of SA’s water comes from winter rains (in local catchments), supplemented with water derived from winter snowfall in the Snowy Mountains.
Where do tropical storms come into the picture? Or do you mean the once-in-20-year floods in the Cooper and Darling catchments?

Like

Neil, it looks like the SA Coorong and Lower Lakes will dry up without major floods in the Qld part of the MD Basin, maybe 2,000 km away. The NSW water contribution (Murrumbidgee, Upper Murray etc) never seems enough. Ironically, river water is now pumped to Woomera, only 70 km from water-strapped Olympic Dam, which relies on groundwater. Mind you, Adelaide plans new housing developments with town water. Barry suggests 2013 will be hot and if it is also dry Adelaide will suffer water shortages with the new Pt Stanvac desal working hard.

As to SA electricity imports I wonder if they should try seawater pumped hydro. I’d rather see carbon tax revenue spent on storage than direct renewables subsidies.

Like

@shamus

We just had a severe storm with many tornadoes (a record number on preliminary counts). One even hit a nuclear reactor in Surry, Va. By all reports the reactor held up fine; they have not classified all the tornadoes yet.

Maybe Barry with his resources can get more info.

Like

David Benson, you forgot to include the cost of building the pumped hydro dam and generating equipment. I don’t think your analysis is accurate.

Like

Neil Howes, the LCOE is the levelized cost of energy. If you had a cash flow stream into the future you could find its present value by discounting each future year’s cost back to a current-year dollar amount and summing across all the future years. Then, if you let that cash flow stream be the energy cost in future years, you could swap back and forth between future-year cents per kWh and future-year costs. The LCOE is a level cents-per-kWh value, applied each year, that produces the same total present value cost as the actual future cash flow stream.
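
A minimal numerical sketch of that definition, with a made-up discount rate and cash flows purely for illustration (not figures from any project discussed here):

```python
# LCOE sketch: the constant price per kWh whose discounted revenue stream has
# the same present value as the actual stream of costs. Numbers are invented.

def lcoe(costs, energy_kwh, discount_rate):
    """costs[t] in dollars and energy_kwh[t] in kWh for years t = 0, 1, 2, ..."""
    pv_costs = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + discount_rate) ** t for t, e in enumerate(energy_kwh))
    return pv_costs / pv_energy  # $/kWh

# Hypothetical three-year plant: $1000 up front plus $50/yr O&M, 2000 kWh/yr output.
print(round(lcoe([1050, 50, 50], [2000, 2000, 2000], 0.05), 3))  # -> 0.2
```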

I do not like to use LCOE because it has several shortcomings. It assumes that:
1) we will continue to earn money on our investments as we have done in the past, which I think will not be true once we reach a world economic decline,
2) the long-range effects of running out of oil and climate change are too far into the future for LCOE to capture the costs adequately, i.e. the LCOE over-emphasizes the current value of making money at the expense of future investments. This leads to cutting down all the trees and then worrying about the problem when we get there,
3) and finally, LCOE is causing utilities to create power purchase agreements with escalating energy prices so that the LCOE looks OK but the future energy prices are far too high. This is selling out the future for the benefit of the current time. It’s a bad practice that I think will lead to the failure of many small utilities here in the US as big companies gobble them up in the future. This is most prevalent with PPAs for wood-burning biomass plants here in the US, such as the 2.3 billion dollar boondoggle that Austin Energy recently signed on to. It’s going to result in the financial failure of the utility in about ten years.

Like

Mark Snodgrass, on 18 April 2011 at 9:19 PM said:
One even hit a nuclear reactor in Surry, Va… they have not classified all the tornadoes yet.

See Richmond Times-Dispatch link in my post above. The tornado touched down between Gloucester Point and Gloucester, and then reappeared in Deltaville; it did not hit the Surry plant, which they shut down for safety. By the damage, the tornado was a strong F2, or a weak F3.

Like

I have a couple of questions about the Fukushima situation that I hope someone can answer.

I understand that the tsunami struck and wiped out the diesel generators and that’s why emergency power wasn’t available after the batteries ran out.

However, there’s some ignorance on my part about what happened after that, and as a result of the tsunami.

Were the control rooms of any/all of units 1-4 also submerged by the tsunami?

I think this is a key question. If they were, then it doesn’t really matter that there wasn’t diesel power since it couldn’t be distributed by a control room that had been submerged in seawater.

If the control rooms were not submerged, what is the delay in bringing the reactors back under control now that reliable power has been restored?

Lastly, do we have any idea what the radiation level is within the control rooms at this point? Is this the source of the 26 Sv reading I asked about a week or so ago that went unanswered?

Thanks

Like

DV82XL wrote:

I agree with this. [DELETED: attacks on the moderator] DV8 does get a little personal (such as with his comment on me being a troll), but I found the rest of our discussion informative and substantive (it raised issues I hadn’t considered with my initial post, or had to give greater attention to). We come at these issues from completely opposite sides of the spectrum, and I can say he helped me see these questions with greater clarity and to be better informed (while we still disagree). The way I see it, if the issues are important enough to be debated in the scientific literature (where there are also substantive disagreements over findings and assumptions), they’re important enough to be considered here.

While I don’t want to put all my cards on the table, a few comments are probably worthwhile. I’m not anti-nuclear, but I am a reluctant defender of the industry. I believe it has to be held to the highest standard for safety, engineering, and regulatory oversight. I feel the same way about deepwater drilling and shale gas development, and believe when short cuts are taken (for whatever reasons) the general public is exposed to risk, and public confidence in our energy systems is damaged (and this leads to additional difficulties in meeting our current and future energy challenges).

I’d like to see reliable and safe nuclear used to displace coal for baseload generation, but I think our energy system is broken (and needs a major retrofit). We don’t need “more sources” of energy, but better management of existing sources (and slowly phasing out older, inefficient, unsustainable, and polluting technologies). I believe political and monopoly interests in the marketplace are far more significant challenges than any technical issues we face. Renewables do fine up to 20% (even higher), conservation and efficiency yield huge rewards in sustainability of existing supplies and lowering costs, demand growth in energy consumption needs to be far better managed (it currently results in huge profits for utilities, a hidden tax on consumers, and a great deal of wasted energy from congestion), we could be adding storage to national grids (lots of affordable technologies may be ready in coming years), and we should be throwing up a great deal more wire and HVDC to improve transmission (regardless of energy sources utilized). In general, in the US (where I write) we’re all going to have to become more European in our approach to energy (with respect to conservation), and get used to higher costs.

I’m also an academic by profession, teach courses in cultural anthropology (as I finish my PhD), and am involved with policy development in my city on Climate Change (working on energy efficiency programs in low income communities). So taking part in academic, policy, and “objective” analytical debates is not unfamiliar to me.

[Deleted attacks on moderator] and a concerted effort to minimize the severity of the Fukushima crisis, and health risks from radiation releases we have seen to date and are likely to see from Fukushima in the coming 6 to 9 months (as they endeavor to bring these reactors to a “cold shut down”). It is correct to point out that emissions from coal fired power plants and environmental damage (air emissions and well design failures) from shale gas drilling operations also have a significant impact to human health and environmental quality … and that the tsunami was an unmitigated disaster (and has no peer with respect to injury and deaths). But I believe the best response to the nuclear crisis in Japan is not going to be to minimize the health risks from radiation releases or engage in false analogies (this is going to do nothing to restore public confidence in these power plants). The best response is going to be transparent and comprehensive oversight and focusing on viable, substantive, and cost effective solutions to restore people’s trust and confidence in nuclear power plants, and prevent radioactive fission products from being released into the environment in the first place. This is largely the “spirit” and “constructive” purpose to which I make my comments on the site. [Deleted attacks on moderator]

Like

@EL –

EL wrote:

…the risk of building a “house of cards” around the null hypothesis seems to me to be an equally likely risk and has a far greater potential to cause harm (and mislead) than statistical assumptions looking at evidential anomalies and following a cautious and precautionary principle (which you are comfortable writing off as FUD).

That is not so. This is the point where Ockham’s razor comes into force. Statistical ghosts can be teased out of a given data-set to show a minor correlation with anything if you allow for the unlimited use of questionable techniques to supply missing parameters, or selectively ignore confounding variables. This is exactly what lex parsimoniae is invoked to avoid.

The null hypothesis here is not built on shaky ground, if for no other reason than the fact that there is no spike in morbidity and mortality that sticks out of the background high enough to warrant an explanation. So it becomes a question of what sort of harm is being avoided with the assumption that there was an impact.
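
As an illustration of how easily such ghosts appear, here is a toy simulation of the multiple-comparisons problem; it is a sketch only, not a comment on any particular study.

```python
# Multiple-comparisons sketch: test 100 subgroups of pure random noise against
# a null of zero mean; around 5 will look "significant" at p < 0.05 even though
# there is no real effect anywhere in the data.
import math
import random
import statistics

random.seed(1)
hits = 0
for _ in range(100):
    sample = [random.gauss(0, 1) for _ in range(50)]
    t = statistics.mean(sample) / (statistics.stdev(sample) / math.sqrt(len(sample)))
    if abs(t) > 2.01:  # approximate two-sided 5% critical value for 49 d.o.f.
        hits += 1
print(hits)  # typically around 5 "significant" subgroups out of 100
```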

Like

Aren’t we overlooking that something major could still go wrong at Fukushima Daiichi that could make the situation a whole lot worse? With 7 reactors or spent fuel ponds requiring unusual steps to keep under control, it has always concerned me that one could get out of control and release so much radiation that it would become impossible to manage the other 6. This would seem less and less likely as more and more time passes but other things could go wrong. For instance, what happens if a major typhoon hits the accident site this summer?

Like

The problem with 5.4 cents per kWh is that it may not result in base loading here in ERCOT (Texas) because of the way the market is run. The risk that the plant is not base-loaded and the revenue stream may fail is the reason nuclear power is going nowhere here in ERCOT. The cost of the Vogtle plant is put into the rate base in their case, but this is no longer possible here. So the financing of nuclear has everything to do with a project being possible or not. I think that with time ERCOT will realize their mistake, when the lights keep going out ha ha.

Like

Lifting a mechanical block up and down is not going to be as practical as pumping water uphill and letting it back down. Imagine a block as large as a lake being lifted up and down and you get the idea.

Like

No I don’t.
Because the granite block is heavier than water you can store much more energy per km³.
That’s why it is projected to be so much cheaper than pumped hydro.
Think big!
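
A back-of-envelope comparison of gravitational energy per cubic metre does back up the density point, though the 500 m lift used here is an illustrative assumption rather than a figure from the proposal:

```python
# Gravitational energy stored per cubic metre of material raised by a given height.
G = 9.81              # m/s^2
RHO_WATER = 1000.0    # kg/m^3
RHO_GRANITE = 2700.0  # kg/m^3

def kwh_per_m3(density, lift_m):
    return density * G * lift_m / 3.6e6  # joules -> kWh

# The same 500 m lift for both media:
print(kwh_per_m3(RHO_WATER, 500))    # ~1.4 kWh/m^3 (pumped hydro)
print(kwh_per_m3(RHO_GRANITE, 500))  # ~3.7 kWh/m^3 (lifted granite piston)
# Buoyancy in a water-filled shaft would trim the granite figure somewhat.
```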

Another idea is the “Ringwallspeicher”.
It would also be cheaper than conventional pumped storage.
It could be done in old coal pits. Earth dams that big have also been built.
http://www.ringwallspeicher.de/

Like

I suspect ringwall pumped hydro is like vertical farming in that the energy capture area is not big enough. You can draw a picture of a pedal-powered helicopter (off-topic reference http://en.wikipedia.org/wiki/Sikorsky_Prize ) but getting it to fly is the hard part. OTOH the seawater pumped hydro I referred to upthread proposes octagonal tanks 7 km in diameter and 20 m deep atop 100 m cliffs, and a test plant exists in Japan. However, replacing the plastic liner means that the technology remains oil-dependent.
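
Plugging those quoted dimensions into the usual rho-g-h estimate gives a feel for the scale; treating the octagon as roughly circular and taking the average head as cliff height plus half the water depth are my own simplifications.

```python
import math

RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

radius_m = 3500.0                 # a ~7 km wide tank, treated as circular
depth_m = 20.0
avg_head_m = 100.0 + depth_m / 2  # 100 m cliff plus half the water depth

volume_m3 = math.pi * radius_m ** 2 * depth_m
energy_gwh = RHO_SEAWATER * G * avg_head_m * volume_m3 / 3.6e12
print(round(energy_gwh))  # about 236 GWh per tank, before round-trip losses
```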

Like

The problem is not with using oil but with burning it. Hydrocarbons should be seen as a materials resource rather than an energy resource.

Like

Stephanie, granite cannot be pumped. This is a very expensive proposition. Heavy lifting equipment is very expensive – not something to use to store a few MJ of commercial-grade wholesale energy.

None of the alternative hydro schemes are cheap. RiverBank hydro has a project costing $2/Watt peak for just 6 hours of underground pumped hydro. You need more like 600 hours of storage for a total wind/solar grid.

Like

Cyril. Yes, it does not make sense for a few MJ…
The idea is to build storage with a reach of at least a month.

Lueder von Bremen, EWEC 2009.
For a 60% wind/40% solar based solution you need 2-7 days of storage.

That’s with today’s wind technology. But there will also be Makani Power and Kitegen type windpower…
There is other renewable power available besides wind/PV.

A granite storage 4.4 km in diameter and 2.2 km high has a storage capacity of 624 TWh… that would power Germany for a year.

There is no hydraulic lifting equipment involved. It’s lifted by water, which also generates the power…
Pressurized pumped storage, if you like.

Please try to understand an idea before you start bashing it.

Like

John,

Many points have been brought up against the “Ringwallspeicher” idea.
Matthias Popp has defended his work in his dissertation and is even kind enough to answer questions raised in online articles about his work, or to help people understand certain aspects.

I am not the person to translate everything for you and take his place here.
If you are really interested and have any constructive critique to contribute, please contact Matthias Popp in person.

Maybe you can try to auto-translate this page online to understand more of it. It’s the EIKE critique, with answers and corrections to the questions raised and errors made.

http://www.ringwallspeicher.de/Fragen_und_Antworten/EIKE-Kritik_am_Ringwallspeicher.htm

Like

John,
Raise the prize money for the Sikorsky Prize to $10,000,000 and it will be done.

There is much more at stake here than $20k…

I also prefer the granite storage idea over the ringwall.
It’s still amazing what people come up with.

The geo-hydraulic storage seems just like the next logical step after pumped hydro, raising the system size of storage technology.

If you have any questions that I could answer by translating the presentation I will try to do so.

Like

While views on energy storage other than mountain lake hydro are generally dismissive, I think there is a longer-term question: will wind and solar assets be stranded when gas becomes prohibitive for load-balancing plant? The notion that battery cars can store renewable energy seems to be making little progress. An alternative automotive technology fuelled by natural gas (recently endorsed by Obama) will shorten the lifespan of gas as a power plant fuel.

At some point in cost terms (renewables + gas backup) > (renewables + storage) for equivalent output. Should nuclear baseload go worldwide then energy storage may be needed for peaking. Therefore solving the storage problem is imperative. It will raise NIMBY issues since water tanks can burst, flywheels can disintegrate and sodium batteries can incinerate.
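
To make that crossover idea concrete, here is a toy comparison; every number in it is a placeholder, not an estimate from this thread.

```python
# Gas backup cost scales with fuel price; storage cost here is treated as flat.
def gas_backup_cost(gas_price_per_gj, heat_rate_gj_per_mwh=10.0, fixed_per_mwh=20.0):
    """Rough open-cycle gas cost in $/MWh: fuel plus a fixed capital/O&M slice."""
    return gas_price_per_gj * heat_rate_gj_per_mwh + fixed_per_mwh

storage_cost_per_mwh = 120.0  # placeholder levelised cost of stored energy

for gas_price in (5, 10, 20, 40):
    backup = gas_backup_cost(gas_price)
    winner = "storage" if backup > storage_cost_per_mwh else "gas backup"
    print(f"gas at ${gas_price}/GJ: backup ${backup:.0f}/MWh, {winner} is cheaper")
```

On these placeholder numbers the crossover sits around $10/GJ; the only point is that the comparison is a straightforward function of the gas price.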

Like

Hi John mate,
but you’ve got to admit that Nullarbor seawater hydro dam idea is interesting! Ten hours of storage for the WHOLE of Australia, coming in at… what did they say… $2 billion?

(But, as always, they discount the sheer COST of adding a continent-wide super-grid and then overbuilding wind and solar capacity.)

http://eclipsenow.wordpress.com/storing-energy/

But with several of these massive hydro dams running, a renewable grid seems doable — it’s now a race about economics.

PS: About economics — there’s some new buzz about Wave power at the moment. There are claims it is ‘approaching baseload at CHEAPER than fossil fuel rates’ within 3 years.

Late Night Live had a piece on it recently.

I don’t have the technical expertise (or time) to chase it up, anyone?

http://www.abc.net.au/rn/latenightlive/stories/2011/3176948.htm

If this is for real and we build a Nullarbor hydro dam or 2 (with a carbon tax in place?) then… well… don’t the renewables guys have a case again?

Like

EN, the last TV item I saw on wave power was the machine at Wollongong smashed up on the beach. I think 7 km diameter tanks might be a bit difficult to build near beach resorts, hence the Nullarbor. That’s on the WA side but perhaps the Decarbonise SA people should link it with the proposed SA-WA HVDC cable.

When the transport industry ‘discovers’ gas they will pay $40+ per gigajoule while the proposed Morwell Vic gas plant is complaining about $7. We also use 50 Mt a year of mostly imported oil compared to 20 Mt of gas. Fracking won’t save us if the wells only get two good years. We must save gas for the long haul and that means burning less in power stations.

Like

Gene Preston, on 20 April 2011 at 11:51 AM — The small company working on the design has VC backing, so that suggests there is a 5% chance of commercial success. I agree that proper seals are crucial.

Like

Yes, wave power has the problem of having to survive storms, so I neither follow wave machine designs nor count upon successful development.

Like

Eclipse Now, on 20 April 2011 at 12:52 PM — Ducking under storms seems a good idea, but it does increase the variability of the power supplied. However, if the LCOE is low enough, wave power together with pumped hydro backup should prove to be an attractive solution.

But I’ll wait until I see some reliable figures on costs and various technical matters.

Like

I guess all ‘start up’ technologies sound convincing from the sheer enthusiasm of the staff raving about their technology. Yes, I’m in ‘wait and see’ mode with them as well. However, they’re talking 3 years, whereas how far off are Gen4 reactors?

I really love the idea of Gen4 reactors burning up today’s nuclear waste so that we solve that 100k year storage problem.

But hey, maybe today’s depleted uranium will come in handy if we decide to *really* get into space or terraforming? Last time I looked Mars didn’t seem to have good wave power potential. ;-)

Martian dust storms can blot out the sun for months, even up to a year!

Like

On the Gravity Power storage concept, the seals might only be required on the downstroke, as it were, if the weight is directly driven up when collecting energy and directly locked to the wall for pure storage. In that case, using one-way seals, there might be a number of more durable possibilities.

If the concept is successful, every nuclear power station should have one. Probably, knowing nukes, three.

Like
