
A rendezvous with the uncertainty monster

'A rendezvous with the uncertainty monster' by Florian Knobloch, Dept. of Environmental Sciences, Radboud University Nijmegen, NL


  1. A rendezvous with the uncertainty monster. Florian Knobloch, Dept. of Environmental Sciences, Radboud University Nijmegen, NL
  2. Schrödinger's cat may be simultaneously alive and dead, because its fate is linked to a random subatomic event that may or may not occur. We understand the underlying process (quantum physics), but not the actual state in an individual case (x or y). We thus know f(x), and can statistically model the number of dead cats in a sample. The three-body problem is the problem of determining the motions of three bodies from an initial set of data. We understand the process (the laws of gravity) and can observe the initial state, but due to complex interactions, small measurement errors inflate through time. In a three-body model, uncertainty thus increases over time.
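The inflation of small initial errors can be shown in a few lines of code. This is an illustrative sketch, not part of the slides: instead of a full three-body integrator it uses the logistic map, a standard minimal example of the same chaotic behaviour, where a tiny "measurement error" in the initial state grows until the two trajectories are unrelated.

```python
# Illustrative sketch: sensitivity to initial conditions in the chaotic
# logistic map x_{n+1} = r * x_n * (1 - x_n). A three-body integrator
# shows the same effect; the logistic map keeps the code short.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)   # tiny "measurement error"

# The gap between the two trajectories inflates from 1e-9 to order 1.
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.1e}, largest gap within 50 steps: {max(gaps):.3f}")
```

The same run with a perfect initial state (`x0` identical in both calls) would show a gap of exactly zero forever: the uncertainty here is not in the process, which is fully deterministic, but in our knowledge of the initial state.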
  3. Interacting humans. Common to all studies of complexity: systems with multiple elements adapting or reacting to the pattern those elements create, such as cells in a cellular automaton, ions in a spin glass, or cells in an immune system. Ions in a spin glass always react in a simple way to their local magnetic field. Humans react with strategy and foresight, considering the outcomes that might result from behaviour they might undertake. This adds a layer of complication to economics that is not experienced in the natural sciences (Arthur 1999). How do people react to policies?
  4. So, what is uncertainty?
     • There is no commonly shared terminology or typology of uncertainties.
     • Different analysts use different terms for the same kind of uncertainty, and some use the same term to refer to different kinds.
     • Distinguish uncertainty due to lack of knowledge from uncertainty due to variability inherent in the system.
     • New information can either decrease or increase uncertainty (by revealing new complexities).
     Walker et al. (2003): uncertainty is 'any deviation from the unachievable ideal of completely deterministic knowledge of the relevant system'.
     The 'uncertainty monster': introduced by Van der Sluijs (2005) in an analysis of how scientists respond to the monstrous uncertainties they face in producing the knowledge base for complex environmental problems, including ambiguity over:
     • knowledge versus ignorance,
     • objectivity versus subjectivity,
     • facts versus values,
     • prediction versus speculation,
     • and science versus policy.
  5. Outline:
     I. Where it lives: locations of uncertainty
     II. What it looks like: levels of uncertainty
     III. Where it comes from: nature of uncertainty
     IV. How to tame it: dealing with uncertainty
     V. Examples
  6. I. The location of uncertainty (Walker et al. 2003)
     1. Context: an identification of the boundaries of the system to be modelled, and thus of the portions of the real world that lie inside the system.
     2. Model uncertainty: associated with both the conceptual model and the computer model.
        • Model structure uncertainty: uncertainty about the form of the model itself.
        • Model technical uncertainty: uncertainty arising from the computer implementation of the model (e.g., software or hardware errors).
     3. Inputs to the model: associated with the description of the reference system.
     4. Parameter uncertainty: associated with the data and the methods used to calibrate the model parameters (exact, fixed, chosen, or calibrated).
     5. Model outcome uncertainty: the accumulated uncertainty associated with the model outcomes of interest to the decision-maker.
  7. I. The location of uncertainty: model structure uncertainty (Walker et al. 2003)
     • Arises from a lack of sufficient understanding of the system (past, present, or future) that is the subject of the policy analysis.
     • Implies that any one (or none) of many model formulations might be a plausible representation.
     • Involves uncertainty associated with the relationships between inputs and variables, among variables, and between variables and output.
     • Pertains to the system boundary, functional forms, definitions of variables and parameters, equations, assumptions, and mathematical algorithms.
     • For most case studies: the coupling and downscaling of models.
  8. II. Levels of uncertainty (Walker et al. 2003)
     Levels range from the unachievable ideal of complete deterministic understanding at one end of the scale to total ignorance at the other.
     Statistical uncertainty: any uncertainty that can be described adequately in statistical terms (what is usually referred to as 'uncertainty' in the natural sciences), such as measurement uncertainty, imprecision, and sampling error.
     • Implicitly assumes that the functional relationships in the given model are reasonably good descriptions of the phenomena being simulated.
     • If this is not the case, deeper forms of uncertainty supersede statistical uncertainty.
     Scenario uncertainty: implies that there is a range of possible outcomes, but the mechanisms leading to these outcomes are not well understood, so it is not possible to formulate the probability of any particular outcome occurring.
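The contrast between these two levels can be made concrete in code. A hypothetical sketch (the quantities and numbers are invented for illustration, not taken from the slides): statistical uncertainty supports a standard error and a confidence interval, while scenario uncertainty only supports reporting a range of outcomes without probabilities.

```python
import random
import statistics

random.seed(42)

# Statistical uncertainty: repeated noisy measurements of one quantity.
# The error model is known, so the uncertainty can be summarised statistically.
true_value = 10.0
measurements = [true_value + random.gauss(0, 0.5) for _ in range(100)]
mean = statistics.mean(measurements)
se = statistics.stdev(measurements) / len(measurements) ** 0.5
print(f"estimate: {mean:.2f} +/- {1.96 * se:.2f} (95% interval)")

# Scenario uncertainty: distinct possible futures with no defensible
# probabilities, so we report the range of outcomes, not a distribution.
scenarios = {"low growth": 1.5, "baseline": 2.8, "high growth": 4.2}
print("outcome range:", min(scenarios.values()), "to", max(scenarios.values()))
```

Attaching a confidence interval to the scenario values would be a category error in Walker et al.'s terms: it would treat scenario uncertainty as if it were statistical uncertainty.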
  9. II. Levels of uncertainty (Walker et al. 2003)
     Recognised ignorance: fundamental uncertainty about the mechanisms and functional relationships being studied. We know neither the functional relationships nor the statistical properties, and the scientific basis for developing scenarios is weak.
     • Reducible ignorance: may be resolved by conducting further research.
     • Irreducible ignorance: research cannot provide sufficient knowledge (indeterminacy).
     Total ignorance: the other extreme from determinism on the scale of uncertainty; a level of uncertainty so deep that we do not even know that we do not know.
  10. III. Nature of uncertainty (Walker et al. 2003)
      Epistemic uncertainty: uncertainty due to the imperfection of our knowledge, which may be reduced by more research and empirical effort.
      Variability uncertainty: uncertainty due to inherent variability, which is especially applicable to human and natural systems and to social, economic, and technological developments.
      • Human behaviour (behavioural variability): 'non-rational' behaviour, or deviations from 'standard' behavioural patterns (micro-level behaviour).
      • Social, economic, and cultural dynamics (societal variability): the chaotic and unpredictable nature of societal processes (macro-level behaviour).
      • Technological surprise: new developments or breakthroughs in technology, or unexpected consequences ('side effects') of technologies.
      Models may use frequency distributions to represent variability uncertainty when the property falls at the level of statistical uncertainty.
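The last point can be sketched as a Monte Carlo exercise. The toy emissions model and all numbers below are assumptions made up for illustration: behavioural variability is represented by a frequency distribution over an adoption rate, which is then propagated through the model.

```python
import random
import statistics

random.seed(1)

def emissions(adoption_rate):
    """Toy model (assumed): emissions fall linearly with technology adoption."""
    return 100.0 * (1.0 - 0.4 * adoption_rate)

# Behavioural variability: the adoption rate is drawn from a distribution
# (clipped to [0, 1]) rather than fixed at a single 'rational' value.
draws = [emissions(min(max(random.gauss(0.5, 0.15), 0.0), 1.0))
         for _ in range(10_000)]

ordered = sorted(draws)
print(f"mean emissions: {statistics.mean(draws):.1f}")
print(f"5th-95th percentile: {ordered[500]:.1f} to {ordered[9500]:.1f}")
```

The output is a distribution of emissions, not a point estimate, which is exactly what the slide means by using frequency distributions once the variability is at the level of statistical uncertainty.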
  11. III. Nature of uncertainty (figure omitted; source: Van Asselt et al. 2002)
  12. III. Nature of uncertainty (Van Asselt and Rotmans 2002)
      • Descartes (1596-1650): systematic inquiry using mathematical and quantitative methods leads to certain knowledge about reality ('positivism') that can be falsified in the Popperian sense. This remains the main paradigm of natural science and modelling.
      • Hume (1711-1776): the gap between observations and reality cannot be bridged by reason.
      • Hegel (1770-1831): systematic inquiry leads to knowledge, but not to perfect and complete knowledge.
      • Post-modernism and social constructivism: science is not a purely objective, value-free activity of discovery; it is a creative process in which social and individual values interfere with observation, analysis, and interpretation. Knowledge is not equivalent to truth and certainty. This is the prominent paradigm in the social sciences.
      'What we observe is not nature itself, but nature exposed to our method of questioning.' (Werner Heisenberg)
  13. III. Nature of uncertainty (Beck and Krueger 2016)
      Krueger et al. identify multiple points in the modelling process at which expert opinion may enter informally, including:
      • the construction of the underlying perceptual model,
      • the specification of a formal model structure,
      • the setting of parameter values, and
      • the presentation and evaluation of model output.
      Risbey et al. note how choices made in early IAM studies can settle modellers' choices in subsequent studies by producing a normative benchmark for what scholars and decision-makers accept as credible. For example, one of the first estimates of global climate change damages, generated with the DICE model in 1992, put the cost of climate change at 1-2% of global GDP. This estimate became a yardstick for gauging the plausibility of subsequent IAM studies. As a result, model-policy coproduction can lead to lock-in to possibly sub-optimal models and modelling techniques.
  14. IV. Dealing with uncertainty
      Van der Sluijs (2005) distinguishes five styles of coping with monsters at the science-policy interface, with different degrees of tolerance towards the 'abnormal':
      • Monster hiding: hide the monster (the 'never admit error' strategy). Apart from the ethical issues, the monster may be too big to hide, and hiding uncertainty enrages the monster.
      • Monster exorcism: expel the monster. Focuses on reducing uncertainty by advocating more research; the IPCC's initial approach to climate science. But: 'For each head of the uncertainty monster that science chops off, several new monster heads tend to pop up due to unforeseen complexities.'
      • Monster adaptation: transform the monster (the 'quantification' strategy). Attempts to quantify uncertainties; monster adapters feel uncomfortable with anything that does not fit in a spreadsheet. Uses subjective probability and Bayesian approaches, and externalises the subjective parts and uncertainties into ranges of scenarios (the model is pure science; only the inputs are subjective).
      • Monster embracement: emphasise the monster (the 'nothing is certain' strategy). Emphasises the limits of the positivist schools of thought; distortion and magnification of uncertainties, sometimes denial of the reality of risks by pointing to all those uncertainties.
      • Monster assimilation: rethink the monster (the 'post-normal science' strategy). Scientific consensus is unlikely in complex 'post-normal' situations (facts uncertain, values in dispute, high decision stakes), so we must drop the demand for a single certain truth. Instead, strive for transparency about the various positions and learn to live with ambiguity and pluralism in risk assessment. Could create new monsters.
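The monster-adaptation style, quantification via subjective probability, can be sketched in a few lines. This is an invented example (the '7 successes in 10 trials' evidence is an assumption for illustration): a flat subjective prior over a parameter is updated with evidence via Bayes' rule on a discrete grid.

```python
# Bayesian updating on a grid: a subjective prior is transformed into a
# posterior by the evidence, turning vague uncertainty into a distribution.
prior_grid = [i / 100 for i in range(1, 100)]          # candidate values of p
prior = [1.0 / len(prior_grid)] * len(prior_grid)      # flat subjective prior

def likelihood(p, k=7, n=10):
    """Binomial likelihood (up to a constant): k 'successes' in n trials,
    e.g. a policy that worked in 7 of 10 hypothetical pilots."""
    return p ** k * (1 - p) ** (n - k)

posterior = [pr * likelihood(p) for p, pr in zip(prior_grid, prior)]
total = sum(posterior)
posterior = [w / total for w in posterior]             # normalise to sum to 1

post_mean = sum(p * w for p, w in zip(prior_grid, posterior))
print(f"posterior mean: {post_mean:.3f}")
```

This is exactly the move Van der Sluijs describes: the subjective judgement does not disappear, it is relocated into the choice of prior and likelihood, which then 'fit in a spreadsheet'.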
  15. V. Examples: multiple model routes (Van Asselt and Rotmans 2002)
      A particular form of scenario analysis: pluralistic uncertainty management implies that a model comprises a set of perspectives, instead of one (usually hidden) perspective: 'multiple perspective-based model routes'.
      • A perspective is a coherent and consistent description of the perceptual screen through which (groups of) people interpret or make sense of the world, and which guides them in acting. It has two dimensions: a worldview and a management style.
      • A perspective is reflected in choices concerning model inputs, parameter choices, model structure, and equations. In this way, uncertainty and legitimate alternative interpretations are not hidden but made explicit.
      • The implemented and calibrated model routes allow for systematic experimentation.
      • Dystopias describe what would happen to the world if reality proved not to resemble the adopted worldview following adoption of the favoured strategy, or vice versa.
      (Figure 2 of the source shows a flow chart of the multiple-model-routes methodology.)
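A minimal sketch of the idea, with a toy model and parameter values that are entirely invented for illustration: each perspective fixes its own inputs and parameters, and the same model is run once per route, so the alternative interpretations stay visible side by side instead of being averaged away.

```python
# Multiple perspective-based model routes: one run per worldview.
PERSPECTIVES = {
    # worldview -> assumed growth rate and damage coefficient (illustrative)
    "hierarchist":   {"growth": 0.02, "damage": 0.010},
    "egalitarian":   {"growth": 0.01, "damage": 0.020},
    "individualist": {"growth": 0.03, "damage": 0.005},
}

def model_route(growth, damage, years=50):
    """Toy model (assumed): GDP grows exponentially while damages
    take an increasing fraction of output as time goes on."""
    gdp = 1.0
    for year in range(years):
        gdp *= (1.0 + growth)
        gdp *= (1.0 - damage * year / years)   # damages ramp up over time
    return gdp

routes = {name: model_route(**p) for name, p in PERSPECTIVES.items()}
for name, result in sorted(routes.items()):
    print(f"{name:>13}: output after 50 years = {result:.2f}")
```

Reporting all three trajectories, rather than a single 'best' run, is what makes the hidden perspective explicit; a dystopia analysis would then rerun each strategy inside the other worldviews.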
  16. V. Examples: linking models of human behaviour and climate (Beckage et al. 2018)
      • Perceived risk stemming from extreme climate events may induce behavioural changes that alter greenhouse gas emissions.
      • A coupled climate and social model produced behavioural uncertainty of a similar magnitude to physical uncertainty (2.8 °C versus 3.5 °C for 2100 global temperature change).
      • The functional form of the behavioural response had the largest influence on temperature projections.
      • 766,656 simulations were run, each representing a unique combination of model structure and parameter values.
      • A regression tree was used to partition the variation in final projected global temperature across model parameters and structure.
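A much smaller sketch in the same spirit (the toy projection model, functional forms, and all numbers are assumptions, not the published study): sweep both model structure and parameter values, then ask how much of the spread in outcomes the structural choice explains, a crude stand-in for the paper's regression-tree partition.

```python
import math
import statistics

def linear_response(risk, strength):
    return strength * risk

def sigmoid_response(risk, strength):
    return strength / (1.0 + math.exp(-10.0 * (risk - 0.5)))

def projected_warming(response, strength, sensitivity):
    """Toy projection (assumed): behaviour offsets part of baseline warming."""
    baseline = 4.0 * sensitivity
    mitigation = response(0.6, strength)     # perceived risk fixed at 0.6
    return baseline * (1.0 - 0.3 * mitigation)

runs = []
for response in (linear_response, sigmoid_response):        # model structure
    for strength in (0.2, 0.5, 0.8):                        # parameter sweep
        for sensitivity in (0.8, 1.0, 1.2):
            runs.append((response.__name__,
                         projected_warming(response, strength, sensitivity)))

temps = [t for _, t in runs]
by_structure = {name: [t for n, t in runs if n == name]
                for name in {n for n, _ in runs}}
overall_var = statistics.pvariance(temps)
between = statistics.pvariance(
    [statistics.mean(v) for v in by_structure.values()])
print(f"spread across all runs: {min(temps):.2f} to {max(temps):.2f} degC")
print(f"share of variance from structure choice: {between / overall_var:.0%}")
```

The published study does this at scale (766,656 runs) and with a fitted regression tree rather than this simple between-group variance share, but the logic, partitioning outcome variance across structural and parametric choices, is the same.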
  17. V. Examples: where to deal with uncertainty
      1. Stakeholder consultations: elicitation of uncertainties
         • Which dynamics are important?
         • Are there divergent perspectives?
      2. Modelling: analysis of uncertainties
         • How flexible is the modelling approach?
         • Which kinds of uncertainty can be dealt with?
      3. Communication: reporting on uncertainties
         • Which uncertainties should be discussed?
         • In which form?
  18. VI. Discussion
      • How do you deal with uncertainty in your work?
      • Which 'monster strategy' do you prefer?
      • How should uncertainty be addressed in your case study?
      • What kind of input from our side could help you?
  19. VII. References
      • Arthur, W. B. (1999). Complexity and the economy. Science, 284(5411), 107-109.
      • Beck, M., & Krueger, T. (2016). The epistemic, ethical, and political dimensions of uncertainty in integrated assessment modeling. Wiley Interdisciplinary Reviews: Climate Change, 7(5), 627-645.
      • Beckage, B., Gross, L. J., Lacasse, K., Carr, E., Metcalf, S. S., Winter, J. M., ... & Kinzig, A. (2018). Linking models of human behaviour and climate alters projected climate change. Nature Climate Change.
      • Van Asselt, M. B. A., & Rotmans, J. (2002). Uncertainty in integrated assessment modelling. Climatic Change, 54(1-2), 75-105.
      • Van der Sluijs, J. (2005). Uncertainty as a monster in the science-policy interface: Four coping strategies. Water Science and Technology, 52(6), 87-92.
      • Walker, W. E., Harremoës, P., Rotmans, J., van der Sluijs, J. P., van Asselt, M. B. A., Janssen, P., & Krayer von Krauss, M. P. (2003). Defining uncertainty: A conceptual basis for uncertainty management in model-based decision support. Integrated Assessment, 4(1), 5-17.
