Coping with decision-making and uncertainty in post-normal science



[Image: word cloud of uncertainty-related vocabulary]

This text is based on a talk given as part of a CEEDS seminar on 24th February 2021. It offers a snapshot of uncertainty and a brief introduction to post-normal science, with the aim of getting people to think about how both influence environmental data science.

Uncertainty

The word cloud above gathers some of the vocabulary for uncertainty found in the literature of different disciplines (including philosophy, sociology, psychology, and environmental science) and gives a sense of how complex this topic is, as this quote from Elkan in Parsons (2001, p.14) makes clear:

“A complete and consistent analysis of all the many varieties of uncertainty involved in human thinking and revealed in human language is a philosophical goal that we should not expect to achieve soon.”

We are used to thinking of uncertainty as a statistical matter, something to be represented quantitatively. The scientific literature distinguishes two categories of quantitative uncertainty, aleatory and epistemic (a minimal sketch contrasting the two follows this list):

· Aleatory uncertainty is due to the inherent randomness of the world, including the unknowability of the future; we can’t do anything to reduce it, and it is sometimes called ‘noise’.

· Epistemic uncertainty, by contrast, covers uncertainties that can be reduced with further research or more data.
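To make this distinction concrete, here is a minimal sketch in Python; the measurement process, its true mean, and its noise level are made up purely for illustration. Collecting more samples shrinks the epistemic uncertainty about the mean, but the aleatory spread of individual measurements stays put:

    import numpy as np

    rng = np.random.default_rng(42)
    TRUE_MEAN, NOISE_SD = 10.0, 2.0  # hypothetical process with aleatory noise of sd 2

    for n in (10, 100, 10_000):
        samples = rng.normal(TRUE_MEAN, NOISE_SD, size=n)
        # Epistemic: uncertainty about the estimated mean; shrinks roughly as 1/sqrt(n)
        std_error = samples.std(ddof=1) / np.sqrt(n)
        # Aleatory: spread of individual measurements; does not shrink with more data
        print(f"n={n:>6}: std error of mean = {std_error:.3f}, "
              f"sample sd = {samples.std(ddof=1):.3f}")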

I then highlighted some of these words to get the audience to consider qualitative uncertainties as well. These included risk, ambiguity, inconsistency, and indeterminacy, terms which Wynne (1992) uses when discussing decision-making in a policy context, and also Knightian uncertainty: in 1921 the economist Frank Knight distinguished between risk, which can be quantified, and uncertainty, which cannot, a distinction that is useful when epistemic uncertainties cannot be quantified.

Finally, I included linguistic uncertainty, which is created by the use of jargon, by words having different meanings in different disciplines, or by the use of vague terminology. This type regularly arises in interdisciplinary research, yet it is one that could easily be dealt with through awareness of the problem.

A diagram from Van Asselt and Rotmans (2002) shows the different sources of uncertainty. Looking at ‘uncertainty due to variability’, we start from the randomness of nature, which I would equate to the aleatory uncertainty mentioned earlier, and then move to the uncertainties created by human behaviour, values, bias, and societal influence.

Then there are the ‘uncertainties due to limited knowledge’, many of which apply to issues arising in data collection and processing:

· Inexactness – from measurement errors;

· Lack of observations – data that could have been collected but wasn’t;

· Practically immeasurable – data that currently can’t be collected, owing to a lack of instruments or the cost of developing them or of collecting the data;

· Conflicting evidence – different interpretations of the evidence;

· Reducible ignorance – knowledge that may become available in the future;

· Indeterminacy – processes that are understood but can’t be fully determined;

· Irreducible ignorance – processes that are unlikely ever to be understood.

Post-Normal Science (PNS)

This approach was developed in the mid-1980s by Funtowicz and Ravetz, but it remains very relevant today. It had become increasingly clear to them that scientific challenges were growing more complex, carrying more uncertainties and demanding more policy decisions.

They called this post-normal, following on from the ‘normal science’ described by Kuhn in the 1960s as problem-solving, in which uncertainties were barely acknowledged. In normal science the goal is knowledge, and quality equates to certainty rather than robustness; decisions can be made on the basis of probability or risk.

However, once we move away from this, decisions become more complex: the uncertainties become more ambiguous, with disagreements about the science, the assumptions, the methods, the language, and so on.

In PNS the main goal is quality rather than knowledge, and quality concerns not only the scientific outcome but also the process, the people, and the purposes, thereby encompassing the qualitative uncertainties mentioned earlier.

Funtowicz and Ravetz argued that using science for policy decisions requires quality control through the management of uncertainties. To this end they developed their NUSAP system, which builds on quantitative ways of expressing uncertainty (Numeral, Unit and Spread) and extends them with two qualitative categories (Assessment and Pedigree).
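To give a rough sense of what a NUSAP-qualified quantity might look like in code, here is a minimal Python sketch. The field names follow the five NUSAP categories, but the pedigree criteria, their 0–4 scoring scale, and the emissions figures are assumptions chosen purely for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class NusapRecord:
        """One quantity qualified with the five NUSAP categories."""
        numeral: float     # N: the number itself
        unit: str          # U: its unit of measurement
        spread: str        # S: quantitative spread, e.g. an interval or a factor
        assessment: str    # A: qualitative judgement of the estimate's reliability
        pedigree: dict[str, int] = field(default_factory=dict)  # P: criterion -> score (0 weak .. 4 strong)

    # Hypothetical emissions estimate, for illustration only
    estimate = NusapRecord(
        numeral=4.2,
        unit="Mt CO2 / year",
        spread="+/- 1.1",
        assessment="optimistic",
        pedigree={"proxy": 3, "empirical_basis": 2, "method": 3, "validation": 1},
    )
    print(estimate)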

Alongside this, they also suggest that all stakeholders should be involved in the research, forming what they describe as the ‘extended peer community’. This could include funders, community groups, or anyone else affected by a decision, so that the differing values can be discussed. It can also help to incorporate local or indigenous knowledge which might otherwise not be included.

Funtowicz and Ravetz illustrate this with a diagram showing the transition from normal or applied science through to post-normal science as uncertainties and/or decision stakes increase. Covid-19 is a good example of a complex situation in which there are lots of uncertainties and quick decisions have had to be made.

The uncertainties seen on this diagram are:

Technical uncertainties, or what Funtowicz and Ravetz call inexactness: created by errors or gaps in data, and managed using standard statistical techniques (a small worked example follows the bullet below).

· In Covid, this is the normal/applied science aspect, such as the epidemiology of the virus and data on cases, spread, deaths, etc. In the early days we saw uncertainty around the number of cases as the testing programme was rolled out.
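As an illustration of those standard statistical techniques, here is a small sketch computing a 95% confidence interval for a test-positivity rate. The testing figures are made up for illustration and are not real Covid data:

    import math

    # Hypothetical testing figures, for illustration only
    positives, tests = 120, 1_000
    p_hat = positives / tests

    # 95% normal-approximation confidence interval for the positivity rate:
    # a standard way of expressing the technical (sampling) uncertainty
    z = 1.96
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / tests)
    print(f"positivity = {p_hat:.3f} +/- {half_width:.3f} "
          f"(95% CI: {p_hat - half_width:.3f} to {p_hat + half_width:.3f})")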

Methodological uncertainties, or unreliability: potentially created by using incorrect methods. These can be managed through the scientific process of peer-reviewed papers and through assessment reports written by groups of experts. However, there can be questions around who sits on these panels and what their values are, and it should also be noted that they may not necessarily reach a consensus.

· In Covid, this could be the assessment of the data, evidence, and research by SAGE, which provides advice to the Government based on the knowledge available at the time it meets.

· A look at Retraction Watch shows that 71 Covid-related papers have already been retracted, so there are clearly issues around the peer-review process here.

Epistemological uncertainties: these cover the more psychological uncertainties, such as ethics, values, and bias.

· In Covid, this includes the other uncertainties that affect the Government’s decisions: deaths versus the economy, NHS capacity, education, etc. I would also include ambiguity here, such as the change to the way the death rate is calculated so that only deaths within 28 days of a positive test are counted, which leaves us wondering what the actual death rate is.

Ignorance: this covers the uncertainties that are difficult to predict, which we saw in the earlier diagram. For example, it is well known that viruses mutate, but predicting when, where, and how is difficult; likewise, the uptake of vaccines and people’s attitudes towards them are hard to foresee, yet all of this affects decisions and their outcomes.

It should be noted that this is all a cyclical process: as some of the uncertainties are reduced by new knowledge, that information needs to be continuously reviewed and fed back into the decision-making process.

Summary

· There are many different sources of uncertainty.

· PNS provides a framework for uncertainty management.

· Can be used to communicate the different uncertainties to decision makers – from risk to ignorance.

· Can show the importance of different uncertainties and how they can affect a decision.

· Involving stakeholders allows clarification of what is needed to enable decisions versus what research has been done, remains to be done, or can feasibly be done.

· Implications for Environmental Data Science: increased interdisciplinarity; attention to areas of potential conflict (methods, language, assumptions); and increased documentation.

Useful References:

Beven, K., 2009. Environmental modelling: An uncertain future? An introduction to techniques for uncertainty estimation in environmental prediction. London: Routledge.

Funtowicz, S.O. and Ravetz, J.R., 1990. Uncertainty and quality in science for policy (Vol. 15). Springer Science & Business Media.

Funtowicz, S.O. and Ravetz, J.R., 1993. Science for the post-normal age. Futures, 25(7), pp.739-755. https://doi.org/10.1016/0016-3287(93)90022-L.

Parsons, S., 2001. Qualitative methods for reasoning under uncertainty (Vol. 13). MIT Press.

Van Asselt, M.B.A. and Rotmans, J., 2002. Uncertainty in integrated assessment modelling. Climatic Change, 54, pp.75-105. https://doi.org/10.1023/A:1015783803445.

Wynne, B., 1992. Uncertainty and environmental learning: reconceiving science and policy in the preventive paradigm. Global Environmental Change, 2(2), pp.111-127.
