Climate models run supercomputer catwalk
Climate models are essential to prepare society for the potential impacts of climate change, but the science is hugely complex and still suffers from significant uncertainty. Current predictions suggest that temperature could rise anywhere between 1.1 and 6.4 °C in the period 1990 to 2095. This is a huge variation.

'The most important individual uncertainty factor in climate modelling is changing cloudiness,' says Petri Räisänen from the Finnish Meteorological Institute and a researcher with the MillCli project. 'Should low-level clouds increase in the future, more solar radiation will be reflected into space, which would then work against global warming. Then again, should low-level clouds recede, global warming will escalate.'

So the MillCli project used supercomputing resources provided by the 'Distributed European infrastructure for supercomputing applications' (DEISA) to study two key uncertainty factors in climate models: clouds and their interaction with radiation.

DEISA is a hugely important resource for European scientists like Dr Räisänen. Over the course of some five years and two project phases, DEISA has assembled Europe's most powerful supercomputers via a network and developed tools that let researchers use this massive computing power no matter where they are based. The team also provided support and advice to ensure researchers get the greatest benefit from the available equipment. So now, for example, Irish scientists can use German supercomputers optimised by Dutch programmers and supported by Spanish, French or Italian technical experts. It is truly a pan-European supercomputing platform.

Part of DEISA's work has led to the 'DEISA extreme computing initiative' (DECI), which makes world-class resources available to European scientists who are tackling really tough scientific problems. Indeed, DECI has supported numerous climate change research initiatives throughout Europe - mainly in relation to modelling and simulation - and has been invaluable to MillCli's work. 'Although our computations consumed only 12 % of the CPU quota allocated to MillCli, they would have been too extensive to be run on the computational resources available at the Finnish Meteorological Institute,' notes Dr Räisänen.

MillCli needed extremely powerful computers to understand clouds and radiation and how they are represented in current models. 'The most important problem here is insufficient resolution,' notes Dr Räisänen. 'The model atmosphere consists of grid cells with a typical area of 200 x 200 km and a height of 0.5-1 km. Many processes affecting the generation and properties of cloud formation occur on a much smaller scale.'

Highly regarded

MillCli used the ECHAM 'General circulation model' (GCM) of the atmosphere to study the problem. ECHAM5 is the fifth generation of the model, developed by the Max Planck Institute for Meteorology. It is a highly regarded model within climate studies, and it was one of the GCMs used by the Intergovernmental Panel on Climate Change to predict the rate of global warming for this century.

'The goal was to study how the use of a more advanced treatment of subgrid-scale cloud structure - features smaller than the model's 200 km grid spacing - influences the climate simulated by ECHAM5, and in particular its sensitivity to increased atmospheric CO2.'

The team used three variations of ECHAM5. The first used a simple 'Relative humidity' (RH) scheme to establish the cloud fraction in each grid cell. The second used a more sophisticated treatment of cloudiness based on an innovative cloud scheme developed by Adrian Tompkins. This Tompkins scheme estimates the cloud fraction, and also the subgrid-scale variability of cloud water, from the probability distribution of water content within a grid cell. The third retained the Tompkins scheme but replaced the standard radiation scheme in ECHAM5 with a more advanced scheme that treats subgrid-scale cloud structure directly.
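The contrast between the two approaches is easier to see in code. Below is a minimal Python sketch, not the actual ECHAM5 implementation: the RH diagnostic follows a Sundqvist-type formula, a common example of this family of schemes, while the PDF version stands in for the Tompkins approach using a fixed Gaussian distribution of total water (the real scheme uses a beta distribution whose shape evolves with the model physics). All parameter values are illustrative assumptions.

```python
import numpy as np

def cloud_fraction_rh(rh, rh_crit=0.7):
    """Diagnostic relative-humidity cloud fraction (Sundqvist-type).

    Cloud appears once grid-mean RH exceeds a critical value rh_crit
    and grows to 1 at saturation. rh_crit = 0.7 is an assumed value.
    """
    rh = np.clip(rh, 0.0, 1.0)
    c = 1.0 - np.sqrt(np.clip((1.0 - rh) / (1.0 - rh_crit), 0.0, 1.0))
    return np.where(rh > rh_crit, c, 0.0)

def cloud_fraction_pdf(qt_mean, qt_std, q_sat, n=100_000):
    """Statistical cloud fraction in the spirit of the Tompkins scheme.

    Assume a probability distribution of total water qt inside the grid
    cell and count the part exceeding saturation q_sat. A Gaussian
    sample here stands in for Tompkins' evolving beta distribution.
    """
    qt = np.random.default_rng(0).normal(qt_mean, qt_std, n)
    cloudy = qt > q_sat
    frac = cloudy.mean()                                # cloud fraction
    q_liq = np.where(cloudy, qt - q_sat, 0.0).mean()    # mean condensate
    return frac, q_liq

# Same kind of grid-cell input, two different diagnostics:
print(cloud_fraction_rh(np.array([0.6, 0.8, 0.95])))   # -> [0. 0.18 0.59] (approx.)
print(cloud_fraction_pdf(qt_mean=0.0095, qt_std=0.001, q_sat=0.01))
```

The key difference is that the PDF approach yields not only a cloud fraction but also the subgrid-scale variability of cloud water, which is exactly the information the more advanced radiation scheme in the third model version can exploit.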
The researchers tested each model version with 100-year runs, one with a pre-industrial CO2 concentration of 286.2 'Parts per million by volume' (ppmv) and another with an elevated value of 450 ppmv. They also performed a large number of shorter runs to aid the interpretation of the results.

The computations were performed on an SGI Altix 4700 computer at the Leibniz-Rechenzentrum (LRZ) in Munich, Germany. During MillCli's calculations, 28 processors were employed for each run, and one simulated year took about 6-7 hours of wall-clock time. Several simulations were carried out in parallel.

'From the point of view of simulating the current climate, the differences between the three model versions were relatively small. When the structures of the simulated cloud fields were compared with observational data derived from satellite imaging, similar systematic errors were found in the different model versions,' says Dr Räisänen.

That picture changed dramatically once the researchers carried their calculations forward: the model versions showed clear differences in their climate change simulations over time, despite their similar performance in simulating the current climate.

'The first version, used as the starting point, indicated climate warming to be less than that produced by the two other versions. The third version showed the highest response to increased CO2 and a warming almost 50 % stronger than that in the first version,' reveals Dr Räisänen. In fact, the global-mean warming caused by increasing CO2 from 286.2 ppmv to 450 ppmv was 2.02 Kelvin (K) for the first model, 2.73 K for the second and 3.01 K for the third.

'The result is explained by the fact that, in the two versions based on the Tompkins cloud scheme, global warming reduces low-level cloudiness. This means that solar radiation reflected back to space is reduced, but the amount of thermal radiation escaping from the Earth into space is not much changed,' Dr Räisänen remarks. 'Hence, this generates a positive feedback phenomenon that strengthens global warming. But it is currently unknown what is ultimately causing the reduction in low-level cloudiness.'

According to Dr Räisänen, the results support the conclusion that, although models may produce quite similar simulations of the present climate, they can still differ notably in the strength of the climate change they project over time. It is therefore difficult to assess the reliability of climate change predictions based merely on how well a model simulates the present climate; he suggests investigating more than just simple time averages. A general question that still lacks good answers, according to Dr Räisänen, is which features in simulations of today's climate are critical for simulating future climate change.

So, uncertainty about the rate of global warming remains, but MillCli's research has shown that reduced low-level cloudiness would have a big impact on temperature and the rate of climate change.
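As a back-of-envelope check of the figures quoted above (assuming 6.5 hours as the midpoint of the reported 6-7 hour range):

```python
# Global-mean warming (K) for the three model versions, 286.2 -> 450 ppmv
dT = {"RH": 2.02, "Tompkins": 2.73, "Tompkins + advanced radiation": 3.01}
ratio = dT["Tompkins + advanced radiation"] / dT["RH"] - 1
print(f"version 3 vs version 1: {ratio:.0%} stronger warming")   # -> 49%

# Cost of one 100-year run on 28 processors at ~6.5 h per simulated year
years, h_per_year, procs = 100, 6.5, 28
wall_h = years * h_per_year
print(f"~{wall_h:.0f} wall-clock hours (~{wall_h / 24:.0f} days) per run,")
print(f"~{wall_h * procs:,.0f} CPU-hours each for the six main runs")
# -> ~650 h (~27 days) and ~18,200 CPU-hours; three model versions times
#    two CO2 levels give six such runs, before the shorter auxiliary runs
```

This squares with the 'almost 50 %' difference Dr Räisänen cites, and it makes clear why several simulations had to run in parallel on DEISA hardware rather than on local resources.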
DEISA2 was funded to the tune of EUR 10.24 million (of an EUR 18.65 million total budget) under the EU's Seventh Framework Programme for research, 'e-Science grid infrastructures' sub-programme.

Useful Links:
- 'Distributed European infrastructure for supercomputing applications'
- DEISA2 project data record on CORDIS
- e-Infrastructures programme / projects
- MillCli project