Lecture 22 - Model Validation

7 Terms

1

Why Validate Models?

Many environmental models are black boxes, meaning it is hard to gain any real understanding of the system and how it works. By validating models, we make them more transparent and build our understanding of the system.

There are various tools to test and evaluate environmental models:

  • Sensitivity & uncertainty analysis

  • Cross-validation

  • etc.
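Cross-validation can be sketched with a toy example: a pure-Python k-fold loop around a simple linear model fitted to made-up rainfall/runoff observations. All names and numbers below are illustrative assumptions, not from the lecture.

```python
import random
import statistics

def fit_linear(xs, ys):
    # Ordinary least squares for y = a + b*x (closed-form simple regression).
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def k_fold_mse(xs, ys, k=5, seed=0):
    # Shuffle indices, split into k folds, fit on k-1 folds,
    # and score predictions on the held-out fold each time.
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = []
    for test_fold in folds:
        held_out = set(test_fold)
        train = [i for i in idx if i not in held_out]
        a, b = fit_linear([xs[i] for i in train], [ys[i] for i in train])
        errors.extend((ys[i] - (a + b * xs[i])) ** 2 for i in test_fold)
    return statistics.fmean(errors)

# Made-up "observations": rainfall vs. runoff with Gaussian noise.
rng = random.Random(42)
xs = [rng.uniform(0, 100) for _ in range(60)]
ys = [0.6 * x + 5 + rng.gauss(0, 3) for x in xs]
print(f"cross-validated MSE: {k_fold_mse(xs, ys):.2f}")
```

Because every point is held out exactly once, the averaged error estimates how the model would perform on data it was not fitted to.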

2

Compounding Uncertainties

Uncertainties are introduced at every step of building and using a model. Remember that this includes:

  1. Conceptualisation of the model & our thinking of the problem

  2. Analysis of relationships with a chosen developed model

  3. Testing, refining, and evaluating the developed model, a feedback loop of adjusting parameters and outputs

  4. Applications of model

Uncertainty is any departure from the real world, any lack of complete determinism. It can arise even when lots of information is available and accessible; it depends on how we use the information.
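How uncertainties compound across steps can be illustrated with a small Monte Carlo sketch: sample an uncertain input and an uncertain parameter, push both through the model, and compare output spreads. The runoff model, the means, and the spreads below are made-up assumptions.

```python
import random
import statistics

def runoff(rainfall, coeff):
    # Toy model: runoff as a fixed fraction of rainfall (illustrative only).
    return coeff * rainfall

def output_spread(rain_sd, coeff_sd, n=20_000, seed=1):
    # Draw the input (rainfall) and the parameter (runoff coefficient)
    # from Gaussians, propagate each draw through the model, and report
    # the standard deviation of the resulting outputs.
    rng = random.Random(seed)
    outs = [runoff(rng.gauss(50.0, rain_sd), rng.gauss(0.6, coeff_sd))
            for _ in range(n)]
    return statistics.pstdev(outs)

print(output_spread(rain_sd=10.0, coeff_sd=0.0))   # input uncertainty alone
print(output_spread(rain_sd=0.0, coeff_sd=0.05))   # parameter uncertainty alone
print(output_spread(rain_sd=10.0, coeff_sd=0.05))  # both: the spreads compound
```

The combined run gives a wider spread than either source alone, which is the compounding the card describes.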

3

Epistemic Uncertainty

Uncertainties surrounding determinate facts

  • Measurement error, imperfections in sampling technique (instruments, improper usage, etc)

  • Systematic error, bias in the sampling technique

  • Natural variation/Inherent randomness, always present in systems (non-deterministic systems)

  • Model uncertainty from omission of information and representation

  • Subjective judgement, the ways we interpret data (our bias, understandings, interpretations, etc)

4

Linguistic Uncertainty

Uncertainties surrounding communication of models

  • Non-numeric vagueness, fuzziness in quantity words (‘a lot’, ‘many’, etc.) that permits borderline cases

  • Context dependence, failure to understand the context (relational, etc)

  • Ambiguity, the fact that words can have more than one meaning

  • Underspecificity, unwanted generality

5

Location Uncertainties

  • Context: the boundaries of the system (how closed is it, have the boundaries been identified, etc.)

  • Model uncertainty, in the structure (structural uncertainty, number of inputs, how they’re used) or from hardware/software (technical uncertainty)

  • Inputs, some of which can be controlled, others of which can’t (random). Uncertainty about the exogenous factors that control a system (& their importance) and about the data available to describe the system (amount of knowledge vs. variability of the data)

  • Parameter uncertainty, associated with the data used to parameterise and calibrate a model, which tend to be loosely constrained & difficult to measure

  • Outcome uncertainty, being the prediction errors
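Parameter and input uncertainty are what sensitivity analysis probes. A minimal one-at-a-time (OAT) sketch: perturb each parameter across its range while holding the others at baseline, and record the output swing. The three-parameter toy model and the ranges are invented for illustration.

```python
def toy_model(params):
    # Made-up model combining three parameters; stands in for any
    # environmental model with calibrated parameters.
    return params["a"] * 2.0 + params["b"] ** 2 - params["c"]

def oat_sensitivity(model, baseline, ranges):
    # For each parameter, evaluate the model at the low and high end of
    # its range (others fixed at baseline) and record the output swing.
    swings = {}
    for name, (lo, hi) in ranges.items():
        outs = []
        for value in (lo, hi):
            p = dict(baseline)
            p[name] = value
            outs.append(model(p))
        swings[name] = max(outs) - min(outs)
    return swings

baseline = {"a": 1.0, "b": 2.0, "c": 3.0}
ranges = {"a": (0.5, 1.5), "b": (1.0, 3.0), "c": (2.0, 4.0)}
print(oat_sensitivity(toy_model, baseline, ranges))
```

The parameter with the largest swing is the one whose loose constraint matters most, which tells you where better measurements would reduce uncertainty fastest.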

6

Level of Uncertainties

Depending on the system being investigated, our level of uncertainty can vary:

  • Statistical uncertainty, we can describe with statistical terms (sampling error, inaccuracies, etc)

  • Scenario uncertainty, related to uncertainty in qualitative outcomes

  • Recognised ignorance, ‘known unknowns’

  • Total ignorance, ‘unknown unknowns’
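Statistical uncertainty, unlike the deeper levels, can be quantified directly. A minimal sketch using the standard error of the mean and a normal-approximation 95% interval; the repeated measurements (say, stream nitrate in mg/L) are invented.

```python
import statistics

def mean_ci95(sample):
    # Sample mean, plus a normal-approximation 95% confidence interval
    # built from the standard error of the mean.
    m = statistics.fmean(sample)
    se = statistics.stdev(sample) / len(sample) ** 0.5
    return m, (m - 1.96 * se, m + 1.96 * se)

# Hypothetical repeated measurements of stream nitrate (mg/L).
sample = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.1, 2.0, 2.2]
m, (lo, hi) = mean_ci95(sample)
print(f"mean = {m:.2f} mg/L, 95% CI = ({lo:.2f}, {hi:.2f})")
```

Scenario uncertainty and the two kinds of ignorance cannot be summarised this way; no interval captures what we don’t know we don’t know.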

7

Nature of Uncertainties

  • Epistemic uncertainty, in what we ‘know’, and is reducible (limited & inaccurate data, measurement errors, limited understanding, subjective judgements, etc)

  • Variability uncertainty, randomness induced by variation from many potential sources (external input data, parameters, and certain model structures); the system itself is variable or chaotic, so this uncertainty is not reduced by knowing more