CORDIS - EU research results

Uncovering information in fluctuating CLimate systems: An oppoRtunity for solving climate modeling nodes and assIst local communiTY adaptation measures


Citizen science and improved modelling for a better understanding of climate dynamics

Given the need for a consistent picture of climate variability, the EU-funded CLARITY project borrowed methods from statistical physics, and drew on the power of citizen science, to point the way forward.

Climate Change and Environment

CLARITY (Uncovering information in fluctuating CLimate systems: An oppoRtunity for solving climate modeling nodes and assIst local communiTY adaptation measures) set out to uncover information about data fluctuations in climate records, using the two most prominent providers of the global datasets used to track temperature anomalies. These data were then used as input for a Bayesian modelling strategy to improve modelling reliability. The team found that, even though the data provided by these two products come from the same sources – recordings from global ground meteorological stations – their different methodologies paint different pictures of the long-term dynamics of global temperature anomalies. The team therefore proposes that future investigations cross-check such statistical analyses against corresponding results obtained from actual observations, and that meaningful analysis take into account the methodology underpinning the data preparation. Towards this end, the project also collected contemporary climate records, crowdsourced in the city of Venice.

Statistics and modelling

Most basic statistical characterisations are linear; that is, they trace transformations over time in a deterministic way, with one data point leading ineluctably to another. However, dynamic systems such as climate patterns cannot be adequately reduced to this kind of analysis. One of the biggest challenges in making climate change projections is accommodating fluctuations and anomalies within longer-term trends. Based on previous successes with similar complex systems, the two statistical methods CLARITY used to attempt this were detrended fluctuation analysis (DFA) and wavelet transform (WT) analysis. These methods are generally used to characterise so-called long-term persistence (LTP), also called long-term correlations, in records from complex systems.
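To make the first of these methods concrete, here is a minimal sketch of DFA in Python with NumPy. The function name, the choice of linear (first-order) detrending and the window sizes are illustrative assumptions, not CLARITY's actual analysis pipeline.

```python
import numpy as np

def dfa_exponent(x, scales):
    """Minimal detrended fluctuation analysis (DFA) of a 1-D record.

    Returns the fluctuation F(n) for each window size n, and the
    scaling exponent alpha from a log-log fit, F(n) ~ n**alpha.
    Alpha around 0.5 means uncorrelated noise; alpha > 0.5 indicates
    long-term persistence (LTP).
    """
    x = np.asarray(x, dtype=float)
    # Integrated "profile" of the record, with the mean removed.
    y = np.cumsum(x - x.mean())
    fluctuations = []
    for n in scales:
        n_windows = len(y) // n
        segments = y[: n_windows * n].reshape(n_windows, n)
        t = np.arange(n)
        sq_residuals = []
        for seg in segments:
            # Fit and remove the best linear trend inside each window...
            coeffs = np.polyfit(t, seg, 1)
            detrended = seg - np.polyval(coeffs, t)
            # ...and measure the remaining fluctuation.
            sq_residuals.append(np.mean(detrended ** 2))
        fluctuations.append(np.sqrt(np.mean(sq_residuals)))
    # Scaling exponent alpha from the slope of the log-log relation.
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return np.array(fluctuations), alpha
```

Applied to an uncorrelated record (white noise), such a routine should recover alpha close to 0.5, whereas a record with LTP, such as long temperature-anomaly series, yields a larger exponent.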
Both DFA and WT characterise LTP by measuring fluctuations of the record around a certain trend line, in time windows of different lengths. These methods were applied to global temperature anomalies and other climate data, as project supervisor Prof. Angelo Rubino elaborates: “Because DFA and WT systematically eliminate linear trends in the data, combined they enable an assessment of how systems – in this case climate patterns – behave over longer periods of time, allowing us a fuller picture.” To reduce uncertainty, or error, the data were then subjected to Bayesian modelling, which applies a formula to a given dataset to find an optimal model for representing the data. As Prof. Rubino elucidates: “What is unique about this modelling is that it incorporates not only data, but also additional sources such as expert opinions, as further input in its quest to find the best-fitting model.”

Crowdsourcing for all the data under the sun

Another part of the project’s work was to collect contemporary climate data crowdsourced from the city of Venice and surrounding towns. These community-centred efforts included measuring the ambient UV index with static sensors deployed on the terraces, yards and roofs of schools, universities and a hospital, as well as in other assorted outdoor spaces accessible to supporters of the project. Additionally, solar UVA and UVB radiation and personal exposure to solar UV radiation (pUVR) were continuously measured with sensors worn by volunteers, including tourists. The DFA and WT analysis of the UV data is still to be undertaken, but the team has applied these methods to the pUVR data and is gleaning greater insight into patterns of individual behaviour under the sun, such as the duration of overall exposure and of the periods spent outside.
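Returning to the Bayesian step described earlier: the idea of weighing candidate models against both the data and prior expert opinion can be illustrated with a small model-comparison sketch. Here the BIC approximation stands in for the log model evidence, and the prior model probabilities play the role of the “expert opinion” input; the synthetic record, the candidate trend models and the function name are all hypothetical, not CLARITY’s actual formulation.

```python
import numpy as np

def posterior_model_probs(y, designs, prior_probs):
    """Rank candidate trend models for a record, Bayesian-style.

    For each model, a least-squares fit gives the maximised likelihood;
    the BIC approximation to the log model evidence is combined with a
    prior model probability and normalised into a posterior probability.
    designs: dict mapping model name -> design matrix (n x k).
    """
    n = len(y)
    log_post = {}
    for name, X in designs.items():
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sigma2 = np.mean(resid ** 2)                 # residual variance
        k = X.shape[1] + 1                           # params incl. noise variance
        log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
        log_evidence = log_lik - 0.5 * k * np.log(n)  # BIC approximation
        log_post[name] = log_evidence + np.log(prior_probs[name])
    # Normalise into posterior model probabilities.
    top = max(log_post.values())
    weights = {m: np.exp(v - top) for m, v in log_post.items()}
    total = sum(weights.values())
    return {m: w / total for m, w in weights.items()}

# Illustrative use: does a synthetic anomaly record favour a warming trend?
t = np.arange(200, dtype=float)
rng = np.random.default_rng(1)
y = 0.02 * t + 0.5 * rng.standard_normal(200)        # trend plus noise
designs = {"constant": np.ones((200, 1)),
           "linear": np.column_stack([np.ones(200), t])}
probs = posterior_model_probs(y, designs, {"constant": 0.5, "linear": 0.5})
```

For a record with a genuine trend, the posterior mass concentrates on the linear model; changing the priors shifts the balance, which is how expert opinion enters the comparison.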
From understanding to action

CLARITY’s scientific results help deepen our understanding of the complex interactions driving the processes of climate change, which could lead to more efficient adaptation and mitigation strategies in light of the European commitment to the Paris Agreement. The project’s use of DFA and WT statistical analysis to understand climate data is an approach that can be replicated for other climate systems or datasets, both to produce specific measures of data dynamics and to use those measures to test the accuracy of climate models. Prof. Rubino is also keen to point out the possibilities for citizen science: “CLARITY’s community-oriented approach helps bring about socially relevant climate science, collecting public data for a more inclusive discussion about adaptation. It could be extended to future public planning and policy design to engage end-users in all stages of data assessment.”

Keywords

CLARITY, modelling, statistical analysis, climate change, fluctuations, adaptation, mitigation, citizen science, data, Venice, UV
