Timeline

1999

Myles Allen writes ‘Do-it-yourself climate prediction’ article

2002

New funding & a final title: climateprediction.net

2003

Public launch of climateprediction.net in September!

2004

climateprediction.net moves to new BOINC software

2005

First experiment results published in Nature

2006

BBC climate change project is launched

2008

Geoengineering project goes live

2010

Millennium & weather@home projects launched

2012

weather@home results for European Region and the Western US released

1999

The climateprediction.net project is born as Myles Allen writes a commentary article in Nature called Do-it-yourself climate prediction.

2002

Thanks to funding from the Natural Environment Research Council (NERC) and the Department of Trade and Industry (now the Department for Business, Innovation & Skills), the project grows considerably, allowing the team to bring in expertise from the Open University's Knowledge Media Institute (KMi) and the Oxford University Computing Laboratory (ComLab).

The project, initially called Casino-21 (a reference both to Monte Carlo simulations and 21st century climate), is renamed climateprediction.com, then refined to climateprediction.net, to make it clear that this is not a commercial enterprise.

2003

The project team grows even more, bringing in further computing and climate science expertise. Alpha and beta testing is done throughout the year. The full public launch occurs on the 12th September 2003, with overwhelming public interest – 25,000 users worldwide register on the first weekend!

2004

In June, an extension of the original experiment is launched, investigating the effects of a thermohaline circulation (THC) slowdown on global climate. This coincides with the worldwide release of the film ‘The Day After Tomorrow.’ On 30 July, climateprediction.net holds its first Open Day for anyone involved with the project. On 26 August, climateprediction.net moves to BOINC (Berkeley Open Infrastructure for Network Computing), a software platform developed by the SETI@home project in the USA. BOINC gives participants much more flexibility: Mac and Linux users can join in; participants no longer have to choose between helping to predict the climate of the 21st century, looking for signs of extra-terrestrial life, or folding proteins, but can run a combination of distributed computing projects; and our visualisation can be used as a screensaver. While individual participants may still choose to run the experiment in the original way, participation in BOINC greatly strengthens the project as a whole.

2005

In January, the first results from the experiment appear in the scientific journal Nature. A few months later, another extension to the main experiment is launched, investigating the ‘global dimming’ effect of changing amounts of sulphur dioxide emissions.

2006

In February, climateprediction.net, in conjunction with the BBC’s climate change season, launches its most realistic investigation thus far of projected climate changes in the 21st century. The BBC Climate Change Experiment simulates the period 1920 to 2080 using realistic scenarios of future greenhouse gas and sulphur emissions. In addition, the model used has a fully dynamic ocean, which provides a more realistic simulation than the ‘slab’ ocean used prior to this. The experiment becomes a huge success, attracting around 300,000 new participants. The BBC Climate Change Experiment also poses an interesting problem for the project: how to maintain an infrastructure that appeals both to distributed computing enthusiasts and to the average BBC viewer. As a result, new features are added to the graphical displays of model output, and the BBC develops some excellent accompanying web pages, enabling participants to view their model output in more depth.

2008

Working in collaboration with Kate Ricke and Granger Morgan of Carnegie Mellon University in the USA, climateprediction.net develops a geo-engineering experiment, with results released in October 2008. Kate develops a model set-up, based on the Met Office model used thus far, in which changes in volcanic aerosol mimic geo-engineering activities, allowing the effect of geo-engineering the climate to be investigated.

2010

March sees the launch of an experiment to model the last millennium and, in so doing, to increase our understanding of various sources of data used as ‘proxies’ to inform us about past climates. As well as telling us more about such ‘paleoclimates’, such an experiment enables us to further refine our selection of models when making our best projections of future climate changes. This Millennium experiment, led at climateprediction.net by Dr. Hiro Yamazaki, uses a fast variant of the Met Office model used up until now – a version called FAMOUS, which achieves a tenfold increase in speed because of a reduced resolution atmosphere and a lengthened ocean timestep.

In addition, on 17 November, a new experiment called weather@home is launched with support from the Guardian. Over the previous few years, work had been underway to develop the use of a regional climate model. Thus far, all climateprediction.net experiments had involved global climate models, which are informative but not detailed enough to predict potential changes to weather events and patterns in specific regions of the world. In collaboration with Met Office colleagues Dr. Richard Jones and Dr. Simon Wilson, the Met Office’s PRECIS regional model has been developed for release under the climateprediction.net distributed computing infrastructure. This highly detailed regional model is ‘embedded’ within a high-resolution global atmosphere model (HadAM3P), and initially three target regions are modelled – the Western US, Southern Africa and Europe.

2012

The first results of the weather@home experiments for the European Region and the Western US are published in a special edition of the Bulletin of the American Meteorological Society, ‘Explaining extreme events from a climate perspective’.