BBC climate change

Motivation

The climate prediction experiments conducted over the years have given only a partial picture of the possible outcomes, largely because of the limited number of models that could be run. These models contain many parameters representing physical processes and natural phenomena, and to reduce the uncertainty associated with the estimate made for each parameter, a large number of models needs to be run. With the assistance of distributed computing, this experiment managed to use thousands of models to predict the climate of the 21st century.

The Science

According to the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report, the Coupled Model Intercomparison Project phase 3 (CMIP-3) ensemble provides a ‘likely’ (>66% probability) range for global mean temperature in 2100. However, several important uncertainties, such as aerosol forcing, ocean heat uptake and compensation between long-wave and short-wave feedbacks, were not systematically sampled, which significantly affects the results. Thus, whilst the IPCC estimates are valuable, they under-represent the uncertainty associated with such predictions.
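
To make the probability language concrete, the short sketch below (Python, with made-up numbers) shows how a ‘likely’ (>66% probability) range can be read off an ensemble of projected warming values as the central 66% of the distribution, i.e. the 17th to 83rd percentiles. The sample data are invented for illustration only.

    # Illustrative only: reading a 'likely' (>66%) range off an ensemble of
    # projected warming values as the central 66% interval (17th-83rd percentiles).
    # The ensemble values below are made up for the example.
    import numpy as np

    rng = np.random.default_rng(0)
    warming_2100 = rng.normal(loc=3.0, scale=0.8, size=5000)  # hypothetical ensemble (deg C)

    lower, upper = np.percentile(warming_2100, [17, 83])
    print(f"'Likely' (>66%) warming range in 2100: {lower:.1f} to {upper:.1f} deg C")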

Our experiment used a perturbed-physics ensemble, in which parameters are varied within their plausible ranges so that uncertainty is sampled in a systematic manner. We used a multi-thousand-member ensemble of Atmosphere-Ocean General Circulation Models (AOGCMs) to simulate the years 1920 to 2080. The perturbed parameters control the physics of the atmosphere, ocean and sulphur-cycle components of the model, and the natural forcing scenarios were also varied.
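
As a rough illustration of what ‘perturbing the physics’ means in practice, the sketch below gives each ensemble member its own combination of parameter values drawn from a plausible range. The parameter names and ranges are invented placeholders, not the actual model settings used in the experiment.

    # A minimal sketch of setting up a perturbed-physics ensemble: each member
    # receives one combination of parameter values drawn from its plausible range.
    # Parameter names and ranges are hypothetical, not the experiment's real settings.
    import numpy as np

    rng = np.random.default_rng(42)

    param_ranges = {
        "entrainment_coefficient": (0.6, 9.0),
        "ice_fall_speed": (0.5, 2.0),
        "ocean_vertical_diffusivity": (0.5, 2.0),
        "sulphate_scavenging_rate": (0.5, 1.5),
    }

    n_members = 10000  # a multi-thousand-member ensemble

    ensemble = [
        {name: rng.uniform(lo, hi) for name, (lo, hi) in param_ranges.items()}
        for _ in range(n_members)
    ]

    print(ensemble[0])  # parameter settings handed to the first model run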

For our forcings in this part of the experiment, we used climate record data from 1920-2000. We started numerous experiments in 1920 and forced them with historical data up to the year 2000. This process is called a hindcast: it resembles a forecast, except that the outcome is already known. We know what happened between 1920 and 2000, but it is still a challenge for the model to simulate it accurately. By comparing the spread of hindcasts with observations from the past, we got an idea of how good our range of models is: do most of them do a good job of replicating what actually happened? This also let us rank the models according to how well they performed.
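
The ranking step can be pictured as scoring each member’s hindcast against the observed record and sorting by the error. The sketch below uses synthetic data and a simple root-mean-square error as the skill measure; it is an illustrative stand-in for the experiment’s actual evaluation, not a description of it.

    # Illustrative ranking of ensemble members by hindcast skill: compare each
    # member's 1920-2000 hindcast of a quantity (e.g. global mean temperature
    # anomaly) with observations and sort by root-mean-square error.
    # The observation record and hindcasts below are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1920, 2001)

    observations = 0.007 * (years - 1920) + 0.1 * np.sin((years - 1920) / 8.0)
    hindcasts = observations + rng.normal(0.0, 0.15, size=(500, years.size))

    rmse = np.sqrt(np.mean((hindcasts - observations) ** 2, axis=1))
    ranking = np.argsort(rmse)  # best-performing member first

    print("Best member:", ranking[0], "RMSE =", round(float(rmse[ranking[0]]), 3))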

All the models were also used to produce a forecast of the future, out to 2080. We used a range of possible scenarios for greenhouse gas emissions, volcanic eruptions, solar activity and so on for the period 2000-2080. This future-forcings ensemble was necessary because we did not know what the sun or volcanoes would do over that period, nor how levels of greenhouse gases would change. We therefore ran a large number of different possible future scenarios in which we varied the solar, sulphate and greenhouse forcings, to span what we hoped would be the likely range.
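
Conceptually, the future-forcings ensemble pairs each model run with one combination of solar, sulphate and greenhouse pathways so that, taken together, the runs span the plausible range. The sketch below builds such a set of combinations; the pathway labels and scaling factors are purely illustrative and are not the experiment’s actual inputs.

    # A simplified sketch of building a future-forcings ensemble: every combination
    # of a solar scenario, a sulphate scaling and a greenhouse pathway becomes one
    # forcing scenario. Labels and values are hypothetical.
    import itertools

    solar_scenarios = ["repeat_last_cycle", "declining", "increasing"]
    sulphate_scalings = [0.5, 1.0, 1.5]            # multipliers on a baseline aerosol pathway
    greenhouse_pathways = ["low_emissions", "mid_emissions", "high_emissions"]

    forcing_ensemble = [
        {"solar": s, "sulphate_scale": a, "greenhouse": g}
        for s, a, g in itertools.product(solar_scenarios, sulphate_scalings, greenhouse_pathways)
    ]

    print(len(forcing_ensemble), "distinct forcing scenarios, e.g.", forcing_ensemble[0])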

Published papers:

H.J. Fowler, D. Cooley, S.R. Sain & M. Thurston. Detecting change in UK extreme precipitation using results from the climateprediction.net BBC climate change experiment. Extremes, 13 (2), 241–267, June 2010.

D.J. Frame, T. Aina, C.M. Christensen, N.E. Faull, S.H.E. Knight, C. Piani, S.M. Rosier, K. Yamazaki, Y. Yamazaki & M.R. Allen. The climateprediction.net BBC climate change experiment: design of the coupled model ensemble. Philosophical Transactions of the Royal Society A, 367 (1890), 855–870, 13 March 2009.