Onto the last of the guest presenters, David Spiegelhalter, who helps people like the NHS predict the future. I trust he is better at that than he is at assessing the past! One trillion tonnes of carbon, he informs us, is our budget beyond which we can expect ‘dangerous’ climate change (a 2C rise above pre-industrial levels). We’ve already burnt around half a trillion tonnes since the beginning of the Industrial Revolution and “that’s given us almost a degree of warming”, says Spiegelhalter. So what he is saying, in effect, is that Hannah’s 0.85 degree temperature rise since 1880 is all down to the burning of fossil fuels. Even the IPCC does not go this far. They say:
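The carbon-budget arithmetic Spiegelhalter implies is worth spelling out. It rests on the assumption that warming scales linearly with cumulative carbon burnt (the so-called TCRE assumption); the figures below are simply the programme's round numbers, not an endorsement of them:

```python
# Back-of-envelope sketch of the linear carbon-budget arithmetic
# implied by the programme. Assumes warming is proportional to
# cumulative carbon emitted (the TCRE assumption); all inputs are
# the programme's round numbers, not measured values.
BUDGET_TTC = 1.0        # total budget, trillion tonnes of carbon
BURNT_TTC = 0.5         # burnt since the Industrial Revolution
WARMING_SO_FAR_C = 1.0  # "almost a degree", per Spiegelhalter

warming_per_ttc = WARMING_SO_FAR_C / BURNT_TTC       # deg C per trillion tonnes
projected_at_budget = warming_per_ttc * BUDGET_TTC   # warming at full budget
print(f"Implied warming at full budget: {projected_at_budget:.1f} C")  # 2.0 C
```

Which is precisely why attributing all of the observed warming to fossil carbon matters: if only part of the 0.85C is anthropogenic, the linear extrapolation to 2C at one trillion tonnes no longer follows.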
“Greenhouse gases contributed a global mean surface warming likely to be between 0.5°C and 1.3°C over the period 1951–2010, with the contributions from other anthropogenic forcings likely to be between –0.6°C and 0.1°C, from natural forcing likely to be between –0.1°C and 0.1°C, and from internal variability likely to be between –0.1°C and 0.1°C.
Together these assessed contributions are consistent with the observed warming of approximately 0.6°C over this period. [10.31, Figure 10.5]”
This ties in with the IPCC AR5 attribution statement:
“It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together. The best estimate of the human-induced contribution to warming is similar to the observed warming over this period.”
Greenhouse gas concentrations in the atmosphere have only been measured with real accuracy at Mauna Loa since 1958, when around 315 ppm was registered. Before that, scientists rely upon low-resolution ice core data, which appears to show a very gradual increase from a baseline of about 280–290 ppm in 1880. It’s only during the 1950s that atmospheric CO2 concentrations really take off.
Note the very rapid 1910 to 1945 warming followed by the sharp cooling to 1950 – all when CO2 was not much above pre-industrial levels and increasing only very gradually. Therefore, we must conclude that natural variability was in charge prior to 1950, certainly with respect to the major ups and downs. AGW purists might wish to claim that the general upward trend in temperatures since 1880 is also down to anthropogenic CO2, but then they would also have to explain the longer upward trend in global temperatures since the end of the LIA without resorting to more plausible natural (solar) influences. In summary, it’s quite likely that the increase in global temperature from 1880 to 1950 can be mainly attributed to natural (internal and external) forcings. But the BBC’s Climate Change by Numbers would have its viewers think otherwise.
The so-called Monte Carlo method of statistically predicting the most likely future outcome of anything from Formula 1 races to NHS medical policy to CO2-induced climate change is a powerful mathematical tool involving the repeated simulation of many different scenarios, Spiegelhalter tells us. I am sure it is – given the right data. The climate model runs cluster around a ‘most likely’ multi-model mean, which suggests that we can expect to hit the 2C ‘dangerous’ threshold within the next 30 years or so if we burn all of that remaining half a trillion tonnes of carbon. What he neglects to mention, of course, is that the multi-model mean is way ahead of actual observed temperatures; in fact, the vast majority of the climate models run significantly warmer than reality. Clearly, something is amiss with the models. The Monte Carlo method is faithfully predicting the most likely future outcome, but an outcome probably based upon incorrect assumptions about the real climate: natural oceanic oscillations, cloud feedbacks, water vapour feedbacks, solar variability, and so on. However complex the climate models are, they are mere simplifications of what is actually going on in the coupled ocean-atmosphere system – and it appears that they are simply wrong. The longer the ‘faux pause’ continues, the more wrong they get.
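The Monte Carlo idea itself is simple enough to sketch in a few lines. The toy below draws many plausible values of warming-per-trillion-tonnes from an assumed distribution (my own illustrative choice, not anything from the programme or the models), projects the warming after the full trillion tonnes is burnt, and counts how often the 2C threshold is crossed – which is also why the method is only as good as the distributions fed into it:

```python
import random

# Minimal Monte Carlo sketch (illustrative only). The Gaussian spread
# of climate sensitivity below is an assumption for demonstration,
# not a value taken from the programme or from any climate model.
random.seed(42)

N = 100_000
exceed = 0
for _ in range(N):
    # assumed sensitivity: mean 2.0 C per trillion tonnes, sd 0.4
    sensitivity = random.gauss(2.0, 0.4)
    warming_at_full_budget = sensitivity * 1.0  # full 1 Tt C burnt
    if warming_at_full_budget >= 2.0:
        exceed += 1

print(f"Estimated P(warming >= 2 C): {exceed / N:.2f}")
```

Garbage in, garbage out applies with full force: the repeated sampling sharpens the estimate of whatever distribution you assumed, but it cannot correct an assumed distribution that is wrong about the real climate.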
Even if the models eventually prove to be right, “why should we worry about a rise of two degrees Celsius?” asks Spiegelhalter. Because of weather – extreme weather, to be more precise. Climate scientists tell us we can expect more frequent droughts, floods and storms, though the evidence for this thus far is much less than convincing (see Roger Pielke Jr.’s work). But that’s only part of the problem. Environmental engineers use a technique called Extreme Value Theory to allow for the occurrence of really extreme (e.g. once in a thousand years) events which will test their structures to the absolute limit. This relies upon collecting a lot of data about past extreme events; but in a warming world, Spiegelhalter tells us, this data becomes rapidly obsolete, and hence the predictions of extreme value theory, which rely upon overall stable conditions, become increasingly unreliable. Our ability to plan for such events is therefore reduced, placing society at risk. This all sounds quite reasonable, but again it is reliant upon the unproven hypothesis that patterns of extreme weather events are changing, or will change, due to global warming – which in fact is also not happening: for the past 15 years or so, global surface temperatures have not increased at anything like the pace predicted by the climate scientists’ models; indeed, they have not increased by a statistically significant amount in any dataset.
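To see why Extreme Value Theory depends on stable conditions, consider the simplest version of the technique: fit an extreme-value distribution to a record of annual maxima, then read off the level expected to be exceeded once every T years (the "return level"). The sketch below uses a Gumbel distribution fitted by the method of moments to synthetic data – my own toy setup, not the programme's method or any engineer's actual workflow:

```python
import math
import random
import statistics

# Minimal extreme-value sketch (illustrative only): fit a Gumbel
# distribution to synthetic annual-maximum data by the method of
# moments, then compute return levels. If the climate producing the
# data shifts, the fitted parameters go stale -- Spiegelhalter's point.
EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

random.seed(0)
# synthetic "annual maximum flood levels", drawn from Gumbel(mu=10, beta=2)
# via the inverse CDF: x = mu - beta * ln(-ln(U))
annual_maxima = [10.0 - 2.0 * math.log(-math.log(random.random()))
                 for _ in range(100)]

mean = statistics.mean(annual_maxima)
sd = statistics.stdev(annual_maxima)
beta = sd * math.sqrt(6) / math.pi  # Gumbel scale, method of moments
mu = mean - EULER_GAMMA * beta      # Gumbel location, method of moments

def return_level(T):
    """Level exceeded on average once every T years under the fit."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

for T in (10, 100, 1000):
    print(f"{T:5d}-year return level: {return_level(T):.1f}")
```

The fitted `mu` and `beta` summarise the past record; the thousand-year return level is an extrapolation far beyond the hundred years of data, so any drift in the underlying distribution – from whatever cause – quietly invalidates it.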
Climate Change by Numbers? A for effort, C- for attainment.
Sorry, you’re gonna need more numbers. #CCBN2