Making sense of uncertainty in complex systems

Jonathan Rougier

5 August 2013

While most of us begin to feel restless and insecure in the face of uncertainty, Dr Jonathan Rougier seems to thrive on it. A statistician in the School of Mathematics, he specialises in assessing the uncertainty inherent in complex systems - systems that are typical in environmental science.

Scientists who study complex systems, such as climate, use models to bridge the gap between the measurements and observations that they are able to make and the system they are trying to explain. To understand how climate has changed in the last 10,000 years, for example, a scientist might use pollen fragments collected from lake sediment cores to identify which tree species were present 10,000 years ago. The information would then be processed through a model that represents the relationship between tree species and climate properties such as temperature and precipitation. This then allows the scientist to make inferences about past climate based on observations from some ancient pollen fragments.  
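The inference step described above can be pictured with a deliberately simple sketch. The linear forward model, the species behaviour, and all numbers below are hypothetical stand-ins for illustration, not the actual palaeoclimate models scientists use:

```python
# Toy illustration: infer a past temperature from an observed pollen fraction,
# assuming a hypothetical linear forward model in which warmer climates leave
# a smaller fraction of cold-tolerant tree pollen in the sediment record.

def pollen_fraction(temp_c):
    """Hypothetical forward model: temperature (C) -> expected pollen fraction."""
    return max(0.0, min(1.0, 0.9 - 0.05 * temp_c))

def infer_temperature(observed_fraction, candidates):
    """Invert the forward model by grid search: pick the candidate
    temperature whose predicted pollen fraction best matches the data."""
    return min(candidates, key=lambda t: abs(pollen_fraction(t) - observed_fraction))

candidates = [t / 2 for t in range(0, 41)]  # 0.0 to 20.0 C in 0.5 C steps
best = infer_temperature(0.4, candidates)   # observed: 40% cold-tolerant pollen
```

Real palaeoclimate reconstructions replace the grid search with a full statistical inversion, which is precisely where the uncertainty accounting discussed below comes in.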

There can be considerable uncertainty associated with these models. “This is where a statistician really earns their money,” says Rougier, “accounting for the systematic way in which the model is an imperfect representation of the system being studied.”

Models are powerful tools for describing how complex systems work and can offer valuable insights, so long as their limitations are understood and respected. However, as science has been harnessed to the needs of policy, scientists have had to venture beyond using models as explanatory tools and start using them as predictive tools. This is where understanding and accounting for uncertainty becomes particularly important, as it places scientific predictions in the context of how little we truly know about the future.

“We’re trying to get the uncertainty in there,” says Rougier. “Right now, it’s hard to get people to 'fess up' about how uncertain their model-based predictions really are. This is partly because a glaciologist, for example, can provide detailed judgements about the physics behind how an ice sheet might respond to ocean warming, but it is a statistician who has the tools to assess and represent the uncertainties associated with that.”

Accounting for some sources of uncertainty requires highly sophisticated statistics; however, Rougier says there are other ways in which small improvements in practice can have a profound impact on the outcome. For example, some statisticians feel that experimental design is both the most useful and the most neglected aspect of applied science. There is now a large statistical literature on Computer Experiments (experiments run on a computer simulator in lieu of the system itself). A small amount of additional time spent in design can pay huge dividends in reducing the number of experimental runs required, and in simplifying the resulting analysis.
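One widely used design for such computer experiments is the Latin hypercube, which spreads a small number of simulator runs evenly across parameter space. A minimal sketch, assuming each parameter has been rescaled to the unit interval:

```python
import random

def latin_hypercube(n_runs, n_params, seed=0):
    """A random Latin hypercube design on [0, 1)^n_params: each parameter's
    range is split into n_runs equal strata, and each stratum is sampled
    exactly once, so no region of any parameter's range is left unexplored."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        strata = list(range(n_runs))
        rng.shuffle(strata)  # randomise which run falls in which stratum
        columns.append([(s + rng.random()) / n_runs for s in strata])
    # Transpose so each row is one parameter setting for one simulator run.
    return list(zip(*columns))

design = latin_hypercube(n_runs=10, n_params=3)
```

With ten runs and three parameters, every parameter is probed in all ten of its strata, which is far better coverage than ten independent random draws would typically give.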

On the impact pipeline between policy-makers and scientists, Rougier admits that he sits at the end closest to the scientists. He is much more interested in working directly with the scientists themselves, behind the scenes, to improve models of complex systems. From climate to natural hazard prediction, Rougier works with systems that defy accurate modelling yet have significant policy relevance for society: “This is what ultimately interests me, the role of science in bringing evidence into policy-making”.

Rougier’s work has already had direct impact. He has worked with the Met Office Hadley Centre, one of the world’s leading climate change research centres, for about ten years, and with other climate groups in Europe and the US. Rougier has helped recognise and address the sources of uncertainty that are inherent in climate models: “Within a very complex code like a climate simulator there are many parameters that we simply don’t know. They might be abstract values that are standing in for physics that we don’t understand, or for physics that would be so expensive in terms of programming and processing that we simply don’t bother putting it in. However, the simulator has to have values in order to run, so we literally plug a value in and see whether the model gives a realistic answer or not. We’re not really sure if it’s correct, and there may be hundreds of these parameters in a big climate simulator. We refer to this as parameter uncertainty. This is one of the key things that we want to incorporate into an uncertainty assessment around our predictive models – the lack of knowledge that we have about the parameters in this complex code.”
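The “plug a value in and see whether the model gives a realistic answer” step can be sketched as a simple screening loop over candidate parameter settings. The two-parameter simulator, the observed value, and the tolerance below are hypothetical illustrations, not a real climate code:

```python
import random

def toy_simulator(sensitivity, feedback):
    """Hypothetical stand-in for a simulator's scalar output
    (e.g. a global mean temperature in degrees C)."""
    return 10.0 + 2.0 * sensitivity - 1.5 * feedback

def plausible_settings(n_trials, observed=14.0, tolerance=0.5, seed=1):
    """Sample candidate parameter settings and keep only those whose
    simulator output is consistent with the observation, within tolerance.
    The survivors represent our remaining parameter uncertainty."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_trials):
        s, f = rng.uniform(0, 5), rng.uniform(0, 5)
        if abs(toy_simulator(s, f) - observed) <= tolerance:
            kept.append((s, f))
    return kept

survivors = plausible_settings(1000)
```

In practice this idea appears in more sophisticated forms (history matching, emulation), but the core logic is the same: many settings are consistent with the data, and that whole surviving set, not a single tuned value, is what feeds into an honest uncertainty assessment.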

The other main source of uncertainty in climate models is structural uncertainty, which exists because the model is an imperfect representation of the system being studied. Computational constraints, among other things, limit the ability of climate models to capture all of the complex interactions that occur between biological, chemical, and physical processes at the global scale. This means that it would be unrealistic to suppose that any setting of the parameters could be ‘correct’.

Rougier has proposed a framework for assimilating these different sources of uncertainty in climate science. The Met Office applied this framework in their contribution of probabilistic predictions of future climate as part of the most recent UK climate impacts assessment.

Key facts:

  1. Dr. Jonathan Rougier and Professor Stephen Sparks FRS co-led the 2009 NERC scoping study on risk and uncertainty in natural hazards.  NERC have recently funded a double consortium of eight UK universities in this area.
  2. Current projects: RATES (2012-15, PI Professor Jonathan Bamber), a project to reconstruct Antarctic ice mass trends through a statistical synthesis of measurements from satellites, radar, GPS, and field studies. CREDIBLE (2012-16, PI Professor Thorsten Wagener), a Bristol-led consortium of four universities on uncertainty and risk assessment for natural hazards.
  3. Dr. Jonathan Rougier is Director of the Bristol Environmental Risk Research Centre, which was established by the University in 2009 to coordinate, promote and advance interdisciplinary research across the natural, engineering and social sciences in environmental hazard risk assessment and uncertainty science, and is now part of the Cabot Institute. Find out more at www.bristol.ac.uk/brisk/
  4. The Cabot Institute carries out fundamental and responsive research on living with environmental uncertainty and risk. Its interests include natural hazards, resilience and governance, food and energy security, and the changing urban environment. Research fuses rigorous statistical and numerical modelling with a deep understanding of interconnected social, environmental and engineered systems – past, present and future.