Tuesday, February 21, 2017

MEDEAS: The Next Step after the Paris Climate Agreement


Jordi Solé, coordinator of the MEDEAS project, speaks in Brno (Czechia) on Feb 15th, 2017. The European project MEDEAS has the ambitious goal of providing the tools necessary to put into practice the 2015 Paris agreement on climate.


Let me start with something to dispel the confusion about what models are for. When you deal with complex, adaptive systems, models are NOT meant to predict the future. As John Gall said in his book on complex systems, "systems always kick back" - to which I may add, "and sometimes they kick back with a vengeance." (Another way to express this concept is "forecasting always fails.")

But if dynamic models cannot predict the future, what are they good for? Simple: they are about being prepared for the future. Think of the Paris climate treaty of 2015. It was the result of millions of runs of various climate models, none of which claimed to predict "the" future. But these models are tools to prepare for the future; they tell you what may happen, depending on what you do. They are tools to shape political decisions. Out of all those runs, a goal was extracted, a setpoint, a number: "we don't want temperatures to rise by more than 2 °C and, for that purpose, there is a limit to the amount of fossil fuels we can burn." It was a political decision that took into account not just what the models say, but what could be concretely achieved in the real world. No model would give you that number as an output. The Paris agreement was a masterpiece of diplomacy and of communication strategy because it concentrated so much noise into a simple, stark number: a goal to reach.

And there we stand: with Paris, we set the goal, but how do we get there? This part of policy planning was weak in Paris, where the best that could be done was to line up the INDCs, the Intended Nationally Determined Contributions; that is, how individual countries think they could reduce their emissions. That's not planning, it is a first stab at the problem; it shows the good will to do something, but no more. As they stand, the INDCs won't get us far enough.

So, we are again at the task of getting prepared for the future. We know that we need to reduce carbon emissions, but how fast? Besides, it is not just a question of reduction, it is a question of substitution. We need to maintain the essential energy services to the world's population: surely, as a society, we can shed a lot of fat and keep going, but without a minimum of energy input, the system collapses. At the same time, we need to maintain a sufficient energy input without exceeding the emission limits. A difficult challenge, although not an impossible one.

Here, we need models again. No model can tell you exactly how to get there, but models will tell you what is likely to happen given some choices and some decisions. And out of the models, you have to extract a concrete, politically feasible goal: how to invest the remaining resources into attaining the Paris objectives? In other words, what fraction of the world's GDP needs to be invested in the transition to a renewable economy?

Giving an answer to this question is the ambitious task of the MEDEAS project, which has now completed a full year of work and set up the basis for an extensive modeling effort. MEDEAS takes an approach mainly based on system dynamics, similar to that of the well-known "The Limits to Growth" study. It is not the only ongoing project in this area; other projects take different approaches. But in all cases the idea is to build up knowledge on what is needed for the transition. Some data are already available that tell us we need a major effort to replace fossil fuels fast enough; the transition won't come by itself, pushed by purely economic forces. But we need to explore the issue in more depth before these considerations can be turned into a number that can be agreed upon by the interested parties. We need to take into account both what's needed and what is politically feasible. Then, we will have a goal to reach.
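To give a concrete idea of what a system dynamics model looks like (and only that: this is not the MEDEAS model, and every number below is made up for illustration), here is a minimal stock-and-flow sketch in Python, in the spirit of the "Limits to Growth" family of models: a non-renewable resource stock and an industrial capital stock, coupled by extraction, reinvestment, and depreciation flows.

```python
# A minimal stock-and-flow sketch in the spirit of "Limits to Growth"-type
# system dynamics. This is NOT the MEDEAS model; all parameters are illustrative.

def run(years=200, dt=1.0):
    resource = 1000.0   # non-renewable resource stock (arbitrary units)
    capital = 1.0       # industrial capital stock (arbitrary units)
    history = []
    for step in range(int(years / dt)):
        # extraction is driven by capital and falls as the resource is depleted
        extraction = 0.5 * capital * (resource / 1000.0)
        reinvestment = 0.25 * extraction   # part of the output becomes new capital
        depreciation = 0.05 * capital      # capital wears out
        resource = max(resource - extraction * dt, 0.0)
        capital = max(capital + (reinvestment - depreciation) * dt, 0.0)
        history.append((step * dt, resource, capital, extraction))
    return history

for year, res, cap, ext in run()[::25]:
    print(f"t={year:5.0f}  resource={res:7.1f}  capital={cap:6.2f}  extraction={ext:6.3f}")
```

Even this toy version shows the typical behavior of such models: production grows as long as the resource is abundant, then peaks and declines as depletion makes extraction less and less rewarding.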

If you want to know more about MEDEAS, you can see the MEDEAS website. There is also a MEDEAS newsletter, still in a preliminary phase. And, if you would like to be involved, contact me (ugo.bardi(strangething)unifi.it)


Below: an intense discussion about the project held in Brno with the coordinator, Jordi Solé from Barcelona, and two Italian researchers from Florence, Sara Falsini and Ilaria Perissi.









Thursday, February 16, 2017

Seneca and Medea



Sara Falsini (left) and Ilaria Perissi (right), researchers from Italy, illustrate their results at the MEDEAS project meeting in Brno, Czechia, on Feb 15-16, 2017


I try to put up a post every week on this blog, but this week I was really swamped by a zillion things. Of these, two really overwhelmed me. The first is the MEDEAS meeting, right now ongoing in Brno, Czechia. The second is completing my new book, "The Seneca Effect" (the cover on the right is fanciful, and the title will be a little different).

Both things have a certain "ancient history" flavor, even though Seneca is a historical character whereas Medea is a mythological one (as far as we know). But they have many things in common: the book and the project both aim at understanding the future on the basis of the idea that the key to the future is in the past. (And, after all, Medea and Cassandra are similar mythological figures.)

So, I can tell you that the MEDEAS project is going well, although it is an awful lot of work, with several models being developed at different levels of detail and scope. The book, too, is almost finished; it needs some retouching, and some figures are being drawn. It should appear soon in English. A German version is also being prepared.

Once all this has been accomplished, I can go back to blogging. Soon, I hope.



Monday, February 6, 2017

Checkmated on the "Climate Pause". The Mistakes Scientists Make


David Rose popularized the concept of the "pause" in global warming in a 2012 article in the Daily Mail. There never was such a thing, but it became a highly successful meme (*), still widely cited today as proof that global warming doesn't exist or that it is nothing to be worried about. By now, the rapidly rising temperatures of the past few years should have consigned the "pause" to the oblivion it fully deserves. But a group of scientists offered Rose the occasion to double down and to accuse them of manipulating the data.


Years ago, I used to play chess, even though I always remained, at best, at a low-medium skill level. Once, I found myself playing against a local high-level player and I was thoroughly thrashed, quickly checkmated. I offered my congratulations to him and he answered me with something like, "Ugo, it is not that I am especially good. It is you who made mistakes with your moves. Make no mistakes, and nobody will ever checkmate you."

I think that was good advice that I still try to remember after many years. If you are defeated, it may be that your opponent is especially good, but it is also likely that he or she simply exploited your mistakes. Avoid making mistakes, and your life will be easier. But you need to recognize the mistakes you made and admit them.

This seems to be the problem with the present debate on climate science. Facing aggressive criticism, scientists keep making the most elementary communication mistakes. The latest disaster for science is the recent article by David Rose in which scientists are accused of manipulating the data. Rose, you may remember, is the journalist who first spread in the media the idea that there had been a "pause" in global warming. His 2012 article in the Daily Mail was a milestone in the meme war, and the "pause" (or "hiatus") is still widely known and repeated as "proof" that global warming doesn't exist or that, at least, climate models don't work (*).

Obviously, the "pause" never was anything more than a perfectly normal oscillation - amplified by carefully choosing a specific interval of temperatures. The recent temperature increases broke all the warming records and that should have buried forever the "pause", together with other legends such as the claimed arrival of the planet Nibiru in 2012. But, no. Now David Rose doubles down with a new article in which he, this time, accuses scientists of having manipulated the data in order to make the pause disappear.

I don't think I need to tell you that Rose's latest article is a textbook example of logical inconsistency. First, he claimed the existence of the "pause" on the basis of temperature data that, evidently, he trusted. Now, he says that the data shouldn't be trusted because they don't show a pause. If there ever was an example of motivated reasoning, this is it.

Yet, communication is not just a question of formal logic. Take a tour of the Web and you'll see how many people are gleefully commenting on Rose's latest broadside against science. It is a landslide; the dam has given way: it is a true disaster for science. Maybe Rose is an evil genius in communication, but I think he is not. He is just exploiting the mistakes made by climate scientists.

This story is all about an article published in 2015 by a group of NOAA scientists who claimed that there is no evidence of a slowdown in the world's temperature increase. The article was perfectly good in scientific terms, but it was a terrible mistake in terms of communication. Why? Because it ignored a simple fact of life: in the mass media debate, mentioning a concept, even if for debunking it, has the effect of reinforcing the public perception that the concept is real.

This is a well-known concept. On this issue, you may read a good article by Chris Mooney describing the "backfire effect" or, sometimes, the "boomerang effect". Among the many cases, it was found that having Barack Obama explicitly state that he is not a Muslim tends to reinforce some people's belief that he is. And you surely remember the story of the "weapons of mass destruction" in Iraq. There never was any proof of their existence (and, indeed, they never existed). But the more the subject was debated, the more people became convinced that they existed.

In the end, it is simple: debunking doesn't work; on the contrary, it often reinforces the perception that the belief being debunked is true. So, it should have been obvious that a paper that attempted to demonstrate that there never was a "pause" would generate a backlash, one day or another. And it did.

Let me repeat: as far as I can tell, there is nothing wrong in scientific terms with the work by Karl et al. But place yourself in the shoes of a person who is not a scientist: wouldn't you get the impression that the scientists are fiddling with the data? That's the point that the critics of science are making over and over, and this message seems to be getting through.

Maybe it was unavoidable that a review of the temperature data would lead to this result, but was it appropriate to publish a minor correction of a data set in a high-visibility journal? If the purpose was to affect climate policies, that was a perfectly legitimate target, but only if based on rock-solid data. Didn't the people involved in this work realize that their corrections were debatable, to say the least? And how is it that no one at NOAA thought that, in some quarters, the corrections would be understood and described as politically motivated data manipulation? Do scientists always have to be so naive?

Now, many scientists are trying to debunk Rose's article (**), but the problem remains the same: the more you mention the "pause", the more it becomes real for the public. And that's a victory for the enemies of science. It seems that, as scientists, we are falling over and over into the same traps. As long as we do that, we'll keep being checkmated by people who exploit our mistakes.





(*) About the power of the "pause" as a meme, note that even a Nobel laureate in physics, Carlo Rubbia, became convinced that it was something real. You can hear him (in Italian) saying that here, at minute 2:40.

(**) Note that climate scientists are debunking Rose, who was debunking NOAA, which was debunking Rose, who was debunking climate scientists. Quite a trophic chain of debunking and counter-debunking. A true "metadebunking" that only confuses people and plays into the hands of the enemies of science.





Sunday, January 29, 2017

Another Defeat for Science: "Metallic Hydrogen"




This blob is supposed to be "metallic hydrogen" according to the claim of a group of Harvard scientists. Maybe. For sure, it is another disaster for the reputation of science and of scientists.


Another day, another disaster for Science. A group of researchers from Harvard claimed to have obtained for the first time "metallic hydrogen" in their laboratory. That gave rise to a series of improbable claims about the cornucopia of abundance that humankind could obtain from the discovery. Especially lyrical was The Independent, in an article that was soon retracted and replaced with a more sober one on their page, where they now say it was all a mistake. But the first article contained such gems as:

Now, in a stunning act of modern-day alchemy, scientists at Harvard University have finally succeeded in creating a tiny amount of what is the rarest, and possibly most valuable, material on the planet, 

metallic hydrogen could theoretically revolutionise technology, enabling the creation of super-fast computers, high-speed levitating trains and ultra-efficient vehicles and dramatically improving almost anything involving electricity. And it could also allow humanity to explore outer space as never before.

And more like this; thank God they didn't mention flying cars, but they got close.

Now, let's examine this story. First of all, "metallic hydrogen" is a legitimate target of investigation. It was theoretically predicted about a century ago and is believed to exist in the core of giant planets. From here onward, however, the whole story is just a mix of fantasy and bad science.

The claim comes from a test in which the researchers placed a sample of hydrogen inside a diamond anvil and compressed it to very high pressures. At some point, they saw something shiny appearing and they concluded that it was "metallic hydrogen." Immediately afterward, they proceeded to publish their story, with all the associated outlandish claims of spaceships, alchemy, ultra-efficient vehicles, etc.

Now, when you start a career as a scientist, you are told that

1) Your experiments should be repeatable.

2) There should always be proof - say, a blank test - that what you claim is not an artifact of your experimental setup.

3) You should never claim anything for which you have no evidence.

Consider this as a checklist and you'll see that the Harvard researchers should mark all three items as "failed." (1) Unbelievable but true: they didn't repeat their experiment, they didn't make a blank experiment, and they engaged in wild fantasies on what their result could mean or, at least, they didn't object to such fantasies being reported in the press.

Note that it is perfectly possible that the blob in the anvil will turn out to be metallic hydrogen but, at present, there is no justification for this early claim. Besides, there is strictly zero proof that metallic hydrogen would turn out to be stable at or near room temperature and hence useful for the multiple claimed miracles. To say nothing of the fact that a diamond anvil processes micrograms, and it would be interesting to calculate how many of these anvils would be needed to produce the tons of fuel needed to power a spaceship (hint: trillions).
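Just to put a rough number on that hint, here is a back-of-the-envelope sketch; the microgram-per-run yield is my own illustrative assumption, not a figure from the Harvard paper.

```python
# Back-of-the-envelope estimate: how many diamond-anvil runs per ton of product?
# The yield per run is an illustrative assumption (order of a microgram).
yield_per_run_g = 1e-6              # ~1 microgram per anvil run, in grams
one_ton_g = 1e6                     # 1 metric ton = 10^6 grams
runs_per_ton = one_ton_g / yield_per_run_g
print(f"Anvil runs needed per ton: {runs_per_ton:.0e}")  # ~1e12, i.e. a trillion per ton
```

Under these assumptions, a single ton of product already requires on the order of a trillion anvil runs, which is where the "trillions" comes from.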

More evidence, if any was needed, of the general decline of science: underfunded, poorly organized, pushed and pulled in all directions at the same time by politicians, businesses, journalists, the public, and more - a disaster. And the results are what you would expect: the general defeat of science that we are witnessing nowadays.

The big problem, here, is that a lot of people are clearly starting to perceive that some scientists are trying to fool them. They don't like that and they may well arrive at the conclusion that all scientists are trying to fool them. And that's very bad because there are still plenty of good scientists who are producing good science and who are trying to alert humankind of the dangers ahead. But, in the general sinking of the scientific ship, bad and good science are lumped together and sent heading to the bottom.

Can this trend be reversed? Hard to say but, at least, we should do something to prevent the overinflated egos of some scientists from continuing to lead science into this kind of disaster.




(1) Incidentally, this is exactly the same series of failures that we can attribute to Stanley Pons and Martin Fleischmann when they claimed to have discovered "cold fusion" in 1989. An even worse defeat for science, whose consequences are still felt.



Sunday, January 22, 2017

Trump: the Defeat of Science





Minutes after Donald Trump took office as President, the page on climate change on the website of the White House disappeared. This may be just the result of some internal protocol, but it may also be the first stage of a coming "purge" of climate science and climate scientists. In any case, the election of Trump is a major defeat for science and we need to understand what mistakes we made to arrive at this point. I am writing here something that probably won't make me popular with my scientist colleagues, but I thought I had to write it.



Defeats are supposed to teach people how to do better; in theory. In practice, it often happens that defeats teach people how to become masters in blame-shifting. With some exceptions, this seems to have been the main result of the recent defeat of the Democrats in the 2016 presidential election, where we saw a truly spasmodic search for culprits: Putin, the Russian hackers, the Fake News, the Rednecks, the FBI, Exxon, the aliens from Betelgeuse, and more. Everything except admitting one's mistakes.

Even less soul-searching has been performed by those who turned out to be among the major losers in this story: science and scientists. In particular, climate scientists saw their field wiped off the White House website minutes after President Trump took office. That may have been simply a question of protocol, but surely it is not a good omen for the future.

So far, scientists have reacted with appropriate outrage to possibilities such as Trump repudiating the Paris climate treaty. However, on average, scientists seem to be completely unable to even imagine that there may be something wrong with what they have been doing. We may have here a good illustration of the principle expressed by James Schlesinger that "people have only two modes of operation: complacency and panic". Even though some scientists are starting to show symptoms of panic, most of them seem to be still in complacency mode.

Yet, for everything that happens there is a reason and if you invaded Russia in winter it is no good to blame the snow for the defeat. So, what did scientists do that led them to a situation that may turn out to be even worse than the retreat from Moscow for Napoleon's Grande Armée?

One problem here is that if scientists had wanted to present themselves to the public as a priesthood of acolytes interested only in maintaining their petty privileges, they succeeded beyond the rosiest expectations. Yet, I don't think that this is the problem. Overall, science is still a sane profession and very few scientists have been directly involved in financial scandals. The public perceives this and normally rates scientists as much more trustworthy than, say, journalists or politicians. And modern climate science, as part of the field of Earth sciences, is nothing less than a triumph of human knowledge: truly a major advance in what we know about the way our planet and our ecosystem work.

The problem, in my opinion, is a different one. It goes deeper and it is not related to individual scientists or to specific scientific fields. It has to do with science as a whole and, in particular, with the inconsistent messages that scientists are beaming to the public. According to the results reported by Ara Norenzayan in "Big Gods" (Princeton, 2013), people have a built-in "lie detector" in their minds that works by a heuristic algorithm: people will evaluate the truth of what they are told on the basis of consistency. Not only must the message be consistent in itself, but the messenger must also be consistent with the message carried. This is a fundamental point: people don't normally care about data and factual evidence; they care about the consistency of the message in their social environment. It is something that Dan Kahan has shown in a series of studies on the public perception of climate science.

So, if your local prophet tells you that you must be chaste, he'd better be chaste himself. If he tells you that you must make sacrifices and accept poverty, he'd better be poor himself. And chastity/poverty must be acceptable in your social environment. These are things that Francis of Assisi understood already long ago. Then, think of Donald Trump: why was he elected? It was, mainly, because Trump's political message was consistent with Trump himself. Trump was telling people that he would make America rich and powerful and that was perfectly consistent with the fact that he is rich and powerful himself. Because of this, Trump's message didn't trigger people's lie detector and Trump the unthinkable became Trump the unavoidable.

Getting back to science, the message of climate change is intimately linked to the need to make sacrifices. We are asking people to reduce their consumption, reduce waste, travel less, and the like. It is a perfectly legitimate message, and many religious groups have carried similar messages successfully. Of course, it would never work if Donald Trump were to propose it; but why can't scientists propose it successfully? Scientists are not Franciscan monks, but normally they are not rich. I often tell my PhD students that they are exchanging three years of starvation for a lifetime of unemployment. I don't really need to tell them that: they know it by themselves.

The problem is that there exists another side of science where scientists are beaming out exactly the opposite of the message about the need to make sacrifices. It is the side of "gee-whiz science" or, maybe, "Santa Claus science": scientific research still operating along the optimistic ideas developed in the 1950s, at the time of the "space age" and the "atomic age". The message that comes from this area is, "give us some money and we'll invent some magic device that will solve all the problems." It is a message that's ringing more and more hollow, and the public is starting to perceive that the scientists are making promises they can't keep. Not only are the various scientific miracles that were promised failing to materialize (say, nuclear fusion), but many supposed scientific revolutions are making things worse (say, shale oil). Still, many scientists keep making these promises, and a certain section of society accepts - even requires - them.

So, the name of the problem is inconsistency. Scientists are taking two different and incompatible roles: that of doom-sayers and that of gift-givers. And "inconsistency" is just a polite way to say "lie." White scientist speak with forked tongue. Ye can't serve God and mammon.

The result is that it is not just Donald Trump who despises science; it is a sizable fraction of the public that just doesn't believe the scientific message, especially about climate. The fraction of Americans who think that climate change is a serious threat has remained floating around 50%-60%, going up and down, but not significantly changing. It is trench warfare in the climate communication war. Things may get worse for science under the Trump presidency. It already happened at the time of McCarthy; why shouldn't it happen again?

At this point, good manners dictate that when you write about a problem, you should also propose ways to solve it. Of course, there are ways that could be suggested: first of all, as scientists we should stop asking for money for things that we know won't work (the "hydrogen-based economy" is a good example). Then, science badly needs a cleanup: we should crack down on predatory publishers, fight data fabrication, establish transparent standards for scientific publications, provide the results of science for free to those who pay for it (the public), get rid of the huge number of irrelevant studies performed today, and more. Personally, I would also like a science that's more of a service for the community and less of a showcase for primadonnas in white coats.

But, as we all know, large organizations (and science is one) are almost impossible to reform from the inside. So, where is science going? Difficult to say, but it may need a good shake-up from the outside (maybe from Trump, although he may well go too far) to be turned into something that may be what we truly need to help humankind in this difficult moment. The transformation will surely be resisted as much as possible, but change is needed and it will come.



"No man can serve two masters: for either he will hate the one, and love the other; or else. he will hold to the one, and despise the other. Ye cannot serve God and mammon." (Matthew 6:24)











Tuesday, January 17, 2017

Amelie the Amoeba: How Things Grow



This academic year, I gave a lesson on the growth mechanisms of complex systems. It is a fascinating subject that can be applied to several fields, from biology to economics. Since the students I was talking to were not specializing in complex systems (they were students of geology), I used a light tone and used "Amelie the Amoeba" as an image for the growth mechanism of bacteria in a Petri dish, and of many other things. The image above summarizes what I told them.

If you know about these matters, you can probably understand what the drawings show. If you don't, some notes are appropriate. So, here is a very brief summary of how things grow in the universe.

1. The "Solow" mode, or exponential growth. The name refers to the economist Robert Solow who proposed this model, but most economists today seem to argue that exponential growth is the natural, actually the only possible, mode of growth of the economy. They may not be completely wrong; after all, it is the way bacteria grow (for a while) in a Petri dish. So, Amelie the Amoeba is very happy to be growing exponentially, too bad that if she were to continues for a long time, she would eventually devour the whole universe.

2. The "Malthus" mode, also "Verhulst" or simply "sigmoid" mode. It takes into account the fact that the Petri dish contains a limited amount of nutrients and Amelie can't keep growing forever. Malthus was the first to apply this model to the human population, assuming that it would reach a certain limit and then stay there: contrarily to what commonly said, Malthus never predicted collapses. The concept of "collapse" was alien to him, but at least he was right in noting that all physical systems have limits.

3. The "Hubbert" mode or the "bell-shaped" curve. That's more like what could happen to Amelie in a Petri dish. Grow for a while, reach a "peak amoeba" size, and then shrink and die for lack of food. Hubbert applied the model to the oil production of the United States, predicting reasonably well the future of the extraction of "conventional" oil. And, if you try to do the test for bacteria (or amoebas) in a Petri dish, it works as well.

4. The "Seneca" mode. This is the name I gave to the kind of growth kinetics where the decline is much faster than the growth. It comes from something that the Roman philosopher Lucius Annaeus Seneca said in one of his letters ("increases are of sluggish growth, but the way to ruin is rapid") and it happens all the time, even to amoebas in a Petri dish.

5. The "Hokusai" mode. The Japanese painter Katsushita Hokusai never made mathematical models and he probably never knew what an amoeba is. But with his famous painting, "the wave", he provided a good visual impression of what happens when things get real bad. Not only decline is faster than growth, but the curve actually starts chasing you! Even amoebas can get nasty and eat your brain.

Friday, January 13, 2017

Peak Uranium: the uncertain future of nuclear energy

 
Alice Friedemann recently posted on her blog "Energy Skeptic" a summary of the discussion on nuclear energy from my book "Extracted" (Chelsea Green, 2014). It is a well-done summary that I am reproducing here. Note that the text below mixes some of the considerations of the main text (written by me) and of one of the "glimpses" that were written by other authors. The glimpse that reports the results of a model of future uranium production was written by Michael Dittmar. He told me in a recent mail exchange that his model seems to be doing pretty well more than two years after its results were published in "Extracted". (U.B.)

Peak Uranium by Ugo Bardi from "Extracted: How the Quest for Mineral Wealth Is Plundering the Planet"


Figure 1. Cumulative uranium consumption by IPCC model, 2015-2100, versus measured and inferred uranium resources

[ Figure 1 shows that the next IPCC report counts very much on nuclear power to keep warming below 2.5 °C. The black line represents how many million tonnes of reasonably assured and inferred resources under $260 per kg remain (2016 IAEA Red Book). Clearly most of the IPCC models are unrealistic. The IPCC greatly exaggerates the amount of oil and coal reserves as well. Source: David Hughes (private communication)


This is an extract of Ugo Bardi’s must-read “Extracted” about the limits of production of uranium. Many well-meaning citizens favor nuclear power because it doesn’t emit greenhouse gases.  The problem is that the Achilles heel of civilization is our dependency on trucks of all kinds, which run on diesel fuel because diesel engines transformed our civilization with their ability to do heavy work better than steam, gasoline, or any other kind of engine.  Trucks are required to keep the supply chains going that every person and business on earth require, from food to the materials and construction of the roads they run on, as well as mining, agriculture, construction, logging, etc.

Nuclear power plants are not a solution, since trucks can’t run on electricity, so anything that generates electricity is not a solution, nor is it likely that the electric grid can ever be 100% renewable (read “When Trucks Stop Running”; this can’t be explained in a sound-bite).  And we certainly aren’t going to be able to replace a billion trucks and pieces of diesel-engined equipment with something else by the time the energy crunch hits; there is nothing else.


Alice Friedemann   www.energyskeptic.com  author of “When Trucks Stop Running: Energy and the Future of Transportation”, 2015, Springer and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: Practical Prepping, KunstlerCast 253, KunstlerCast278, Peak Prosperity , XX2 report ]

Bardi, Ugo. 2014. Extracted: How the Quest for Mineral Wealth Is Plundering the Planet. Chelsea Green Publishing.

Although there is a rebirth of interest in nuclear energy, there is still a basic problem: uranium is a mineral resource that exists in finite amounts.

Even as early as the 1950s it was clear that the known uranium resources were not sufficient to fuel the “atomic age” for a period longer than a few decades.

That gave rise to the idea of “breeding” fissile plutonium fuel from the more abundant, non-fissile isotope 238 of uranium. It was a very ambitious idea: fuel the industrial system with an element that doesn’t exist in measurable amounts on Earth but would be created by humans expressly for their own purposes. The concept gave rise to dreams of a plutonium-based economy. This ambitious plan was never really put into practice, though, at least not in the form that was envisioned in the 1950s and ’60s. Several attempts were made to build breeder reactors in the 1970s, but the technology was found to be expensive, difficult to manage, and prone to failure. Besides, it posed unsolvable strategic problems in terms of the proliferation of fissile materials that could be used to build atomic weapons. The idea was thoroughly abandoned in the 1970s, when the US Senate enacted a law that forbade the reprocessing of spent nuclear fuel.

A similar fate was encountered by another idea that involved “breeding” a nuclear fuel from a naturally existing element—thorium. The concept involved transforming the 232 isotope of thorium into the fissile 233 isotope of uranium, which then could be used as fuel for a nuclear reactor (or for nuclear warheads). The idea was discussed at length during the heydays of the nuclear industry, and it is still discussed today; but so far, nothing has come out of it and the nuclear industry is still based on mineral uranium as fuel.

Today, the production of uranium from mines is insufficient to fuel the existing nuclear reactors. The gap between supply and demand for mineral uranium was as large as almost 50% from 1995 to 2005, though it has gradually been reduced over the past few years.

The U.S. mined 370,000 metric tons over the past 50 years, peaking in 1981 at 17,000 tons/year.  Europe peaked in the 1990s after extracting 460,000 tons.  Today, nearly all of the 21,000 tons/year needed to keep European nuclear plants operating is imported.

The European mining cycle allows us to determine how much of the originally estimated uranium reserves could be extracted versus what actually happened before it cost too much to continue. Remarkably in all countries where mining has stopped it did so at well below initial estimates (50 to 70%). Therefore it’s likely ultimate production in South Africa and the United States can be predicted as well.
Table 1. Initial estimates of uranium reserves versus actual extraction in European mining cycles; extraction stopped at 50 to 70% of the initial estimates.

The Soviet Union and Canada each mined 450,000 tons. By 2010 global cumulative production was 2.5 million tons.  Of this, 2 million tons has been used, and the military had most of the remaining half a million tons.

The most recent data available show that mineral uranium now accounts for about 80% of the demand.  The gap is filled by uranium recovered from the stockpiles of the military industry and from the dismantling of old nuclear warheads.

This turning of swords into plows is surely a good idea, but old nuclear weapons and military stocks are a finite resource and cannot be seen as a definitive solution to the problem of insufficient supply. With the present stasis in uranium demand, it is possible that the production gap will be closed in a decade or so by increased mineral production. However, prospects are uncertain, as explained in “The End of Cheap Uranium.” In particular, if nuclear energy were to see a worldwide expansion, it is hard to see how mineral production could satisfy the increasing uranium demand, given the gigantic investments that would be needed, which are unlikely to be possible in the present economically challenging times.

At the same time, the effects of the 2011 incident at the Fukushima nuclear power plant are likely to negatively affect the prospects of growth for nuclear energy production, and with the concomitant reduced demand for uranium, the surviving reactors may have sufficient fuel to remain in operation for several decades.

It’s true that there are large quantities of uranium in the Earth’s crust, but there are limited numbers of deposits that are concentrated enough to be profitably mined. If we tried to extract those less concentrated deposits, the mining process would require far more energy than the mined uranium could ultimately produce [negative EROI].

Modeling Future Uranium Supplies
Table 2. Uranium supply and demand to 2030

Michael Dittmar used historical data for countries and single mines to create a model that projects how much uranium will likely be extracted from existing reserves in the years to come. The model is purely empirical and is based on the assumption that mining companies, when planning the extraction profile of a deposit, project their operations to coincide with the average lifetime of the expensive equipment and infrastructure it takes to mine uranium—about a decade.

Gradually the extraction becomes more expensive as some equipment has to be replaced and the least costly resources are mined. As a consequence, both extraction and profits decline. Eventually, the company stops exploiting the deposit and the mine closes. The model depends on both geological and economic constraints, but the fact that it has turned out to be valid for so many past cases shows that it is a good approximation of reality.
This said, the model assumes the following points:
  • Mine operators plan to operate the mine at a nearly constant production level on the basis of detailed geological studies and to manage extraction so that the plateau can be sustained for approximately 10 years.
  • The total amount of extractable uranium is approximately the achieved (or planned) annual plateau value multiplied by 10.
Applying this model to well-documented mines in Canada and Australia, we arrive at remarkably accurate results. For instance, in one case, the model predicted a total production of 319 ± 24 kilotons, which was very close to the 310 kilotons actually produced. So we can be reasonably confident that it can be applied to today’s larger currently operating and planned uranium mines.
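As a minimal sketch of the rule of thumb described above (total extractable uranium roughly equal to the plateau production multiplied by a ten-year plateau lifetime), here is a short calculation; the mine names and plateau values are invented for illustration and are not data from the book or from Dittmar's model.

```python
# Rule of thumb from the model above: ultimately extractable uranium is roughly
# the plateau production multiplied by a ~10-year plateau lifetime.
# Mine names and plateau values are made up for illustration only.

PLATEAU_YEARS = 10  # assumed plateau lifetime from the text

def estimated_total_kt(plateau_kt_per_year, plateau_years=PLATEAU_YEARS):
    """Estimate ultimately extractable uranium (kilotons) from the plateau value."""
    return plateau_kt_per_year * plateau_years

hypothetical_mines = {"Mine A": 7.0, "Mine B": 3.5}  # kilotons/year at plateau

for name, plateau in hypothetical_mines.items():
    print(f"{name}: plateau {plateau} kt/yr -> ~{estimated_total_kt(plateau):.0f} kt ultimately extractable")
```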

Considering that the achieved plateau production from past operations was usually smaller than the one planned, this model probably overestimates the future production.

Table 2 summarizes the model’s predictions for future uranium production, comparing those findings against forecasts from other groups and against two different potential future nuclear scenarios.

As you can see, the forecasts obtained by this model indicate substantial supply constraints in the coming decades—a considerably different picture from that presented by the other models, which predict larger supplies.

The WNA’s 2009 forecast differs from our model mainly by assuming that existing and future mines will have a lifetime of at least 20 years. As a result, the WNA predicts a production peak of 85 kilotons/year around the year 2025, about 10 years later than in the present model, followed by a steep decline to about 70 kilotons/year in 2030. Despite being relatively optimistic, the forecast by the WNA shows that the uranium production in 2030 would not be higher than it is now. In any case, the long deposit lifetime in the WNA model is inconsistent with the data from past uranium mines. The 2006 estimate from the EWG was based on the Red Book 2005 RAR (reasonably assured resources) and IR (inferred resources) numbers. The EWG calculated an upper production limit based on the assumption that extraction can be increased according to demand until half of the RAR or at most half of the sum of the RAR and IR resources are used. That led the group to estimate a production peak around the year 2025.

Assuming all planned uranium mines are opened, annual mining will increase from 54,000 tons/year to a maximum of 58 ± 4 thousand tons/year in 2015. [ Bardi wrote this before the 2013 and 2014 figures were known: 2013 was 59,673 tons (the highest total) and 2014 was 56,252 tons. ]

Declining uranium production will make it impossible to obtain a significant increase in electrical power from nuclear plants in the coming decades.


Ugo Bardi is a member of the Club of Rome and the author of "Extracted: How the Quest for Mineral Wealth Is Plundering the Planet" (Chelsea Green, 2014)