LOs: explain what a model is and list potential goals of modelling efforts (abstraction, simplification, prediction, explanation, explicit vs implicit modelling, theory testing); explain and apply Marr's three levels of analysis (computation (problem), algorithm (rules), implementation (physical), top-down vs bottom-up approach)
model
a simplified or idealised representation of a more complex thing
statistical models
a mathematical relationship between variables that holds under specific assumptions
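for example (an illustrative case, not from the lecture), simple linear regression is a statistical model: y = β0 + β1·x + ε, which assumes the relationship is linear and that the errors ε are independent and normally distributed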
theoretical cognitive models
a description of the relationship between different mental processes that makes assumptions about the nature of those processes
difference in aims of statistical vs theoretical models
theoretical models try to explain a phenomenon and derive further predictions from that explanation, whereas statistical models only describe the relationship
cognitive box and arrow models
models that describe the relationship between different mental processes, under the assumption that the mind operates like a multi-staged information processing machine
Broadbent (1958) model: what did he suggest
box and arrow model of attention
suggested multiple stages:
sensory input enters via the retina
that input goes through a selective filter
if it passes the filter, it can go on to higher-level processing
and eventually into working memory
posited that if an input is attended to, it will go through all of these stages in sequence (box → arrow → box → arrow)
if unattended, it will go through the first box but no further, being processed only at the sensory stage
classic example of a box and arrow model
Broadbent’s model of attention
inputs into retina → sensory buffer store (if attended, continues) → selective filter → higher level processing → working memory
sensory buffer store
identifies physical characteristics
unlimited capacity
first stage of Broadbent (1958) attention model
selective filter
selects which inputs receive further processing, based on goals
higher level processing
extracts meaning from the input
evolution of box and arrow models
mind has stages, info goes through stages
started off simple, but can gradually become very complex
e.g. people added to Broadbent's model because it was found not to fulfil its purpose
e.g. Wolfe's (2021) Guided Search model of visual search
choosing the most important, essential, central components
how can we test cognitive models
manipulating the input and observing the output can offer a glimpse into the workings of the mind, allowing us to test our models
change input and register output
types of theoretical models
informal (box and arrow)
formal (computational)
what type of model is box and arrow
informal theoretical
what type of model is computational
formal theoretical
formal cognitive models
a mathematical description of the relationship between mental processes, usually expressed through computer code
formal cognitive modelling tries to create a formula giving the exact mathematical relationship between input and output: a computer-code simulation of mental processes
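as an illustration (a minimal sketch, not a model from the lecture; the detection rule and parameter values are invented for this example):

import math

def p_detect(signal_strength, threshold=1.0, slope=2.0):
    # hypothetical logistic rule mapping input (signal strength) to output (detection probability)
    return 1 / (1 + math.exp(-slope * (signal_strength - threshold)))

print(round(p_detect(1.5), 3))  # 0.731: an exact numerical prediction, not just a direction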
why do some propose we should make formal cognitive models
some suggest box and arrow models are not fit for purpose
model distinctions
theoretical
informal (box and arrow)
formal (computational)
statistical
George Box quote (1976)
all models are wrong, but some are useful
Alfred Korzybski quote (1931)
the map is not the territory
what do all models do?
make simplifications and abstractions
when we create a model, what is acknowledged?
the model is not going to capture all the information about what we are describing, only the parts we think are critical for what we are trying to represent
therefore all models are simplifications and abstractions
simplification
make something simpler
abstraction
generating general rules and concepts from specific information
what must models in science produce
some predictions
what can prediction be
directional or numerical
Popper’s key philosophy
non scientific theories explain after the fact but cannot provide falsifiable predictions
must be falsifiable to be scientific
how accurate is a model that provides numerical predictions?
can be more or less accurate than a directional one
how do we use informal models to predict and explain? stages plus example
framework → theory → model → hypothesis → data
(with arrows looping back the other way too)
e.g.:
cognitive psych → early selection theory → Broadbent's filter model → irrelevant stimuli that contain a target-defining feature will be automatically detected → new gorilla experiment, detection rates, t-tests
framework
the conceptual system that defines terms and provides context
e.g. cog psych
theory
a scientific proposition that provides relations between phenomena
e.g. early selection theory
model
a schematic representation of a theory, more limited in its scope
e.g. broadbent’s filter model
hypothesis
a narrow testable statement
e.g. irrelevant stimuli that contain target defining feature will be automatically detected
data
collected observations, often as part of an experiment
e.g. new gorilla experiment, detection rates, t-tests
what do statistical models do
predict and do not try to explain
what do theoretical models do
predict and try to explain
explanation without exact prediction
models of schizophrenia can indicate causes but cannot yet predict individual cases
a model may be able to predict group differences but not individual cases
prediction without explanation
some models can predict whether an individual will develop Alzheimer’s, even though we are not close to understanding the factors that explain Alzheimer’s
informal cognitive models
a verbal description of the relationship between different cognitive processes
where often some assumptions are implicit
often provides only directional predictions
formal/computational models
a mathematical description of the relationship between different cognitive processes, often instantiated via a computer program/simulation
assumptions are explicit
often provides numerical predictions
informal vs formal models
verbal vs mathematical
implicit vs explicit
directional vs numerical
how do we use formal models to predict and explain?
framework → theory → specification → implementation → hypothesis → data
(with arrows looping back too)
what are parts of how formal models explain that are not part of how informal models explain?
specification and implementation
specification
a formal description of the relations described by a theory
the formal model, composed of symbolic representations
implementation
a specific instantiation of a specification
a computer program, able to simulate and predict numerical outputs from input
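a minimal sketch of the distinction (a made-up example, not from the lecture): the specification is the symbolic rule; the implementation is a runnable program instantiating it:

# specification (symbolic): output = w * input, with free parameter w
def simulate(input_value, w=0.8):
    # implementation: runnable code; w = 0.8 is an arbitrary illustrative value
    return w * input_value

print(simulate(10.0))  # simulates a numerical output (8.0) from an input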
strengths of formal models
more accurate predictions
counter-intuitive predictions
benefits of explicit assumptions
strength of formal models: more accurate predictions
numerical simulation means we can see whether a model makes unreasonable predictions, so it is easier to reject bad models
can help select which experiments to perform
can provide a more subtle form of hypothesis testing: seeing how close a model comes to predicting an actual result
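a minimal sketch of this subtler testing (all numbers are invented for illustration): measure how far a model's numerical predictions fall from the observed results:

predicted = [420, 450, 510]  # hypothetical model-predicted mean RTs (ms)
observed = [430, 470, 490]   # hypothetical observed mean RTs (ms)
rmse = (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)) ** 0.5
print(f"RMSE = {rmse:.1f} ms")  # smaller = model closer to the actual result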
strength of formal models: counter-intuitive predictions
a formal model more clearly shows which predictions follow from it
with informal models, it is hard to notice when they make counter-intuitive predictions; formal models produce such predictions explicitly
idea of implicit vs explicit assumptions
reduces hindsight bias
reveals when intuitions do not match up with theory
coherent motion: are dots going more left or more right? use of modelling + findings
a counter-intuitive case where formal modelling helps
difficult task
noise = dots moving up and down, neither left nor right
the idea is some kind of evidence accumulation process
where we sample the visual field until we have enough evidence to make the right decision
evidence accumulation can be measured using decision time
how long does it take for us to make a decision
reaction time measure
question: what happens if there is more noise
the intuition would be that decision time would be slower, because the direction is harder to determine
their computational model suggested that with more noise, evidence would reach the decision boundary more quickly: there would be more errors, but correct responses would be faster
so more noise = shorter response times
this is clearly counter-intuitive
the model revealed that, if it is correct, adding more noise would shorten response time
and this is what was found
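a minimal simulation sketch of this idea (assuming a drift-diffusion-style accumulator; parameter values are illustrative, not taken from the study):

import random

def trial(drift=0.1, noise=1.0, boundary=10.0):
    # accumulate noisy evidence until a decision boundary is crossed
    evidence, t = 0.0, 0
    while abs(evidence) < boundary:
        evidence += drift + random.gauss(0, noise)
        t += 1
    return evidence > 0, t  # (correct decision?, decision time in steps)

for noise in (0.5, 1.0, 2.0):
    results = [trial(noise=noise) for _ in range(2000)]
    correct_rts = [t for ok, t in results if ok]
    accuracy = len(correct_rts) / len(results)
    print(noise, round(accuracy, 2), round(sum(correct_rts) / len(correct_rts)))
# as noise goes up, accuracy drops but correct responses get faster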
what is the best kind of prediction? Popper
counter-intuitive
unexpected and riskier, so even more impressive if confirmed
strength of formal models: benefits of explicit assumptions
by making assumptions explicit, we can reveal unanswered questions, flaws in our reasoning, contradictory or unreasonable assumptions
“what I cannot create, I do not understand” - Feynman
when assumptions are implicit, we sometimes do not notice that they are incorrect or unreasonable
cons of formal models
require substantial expertise
transparency: transparent mostly for experts
comparison: best compared against other computational models
prediction: sometimes numerical predictions are premature
progress: changing the model is costly time wise which can limit progress
theory 1: a computational model may give the semblance of scientific validity (neural network models)
theory 2: making a model simulate a cognitive task does not necessarily teach us more about cognition
hype timeline (the Gartner hype cycle)
innovation trigger, then a rapid spike up to the peak of inflated expectations, then a rapid dip into the trough of disillusionment, then a steady climb up the slope of enlightenment, then the plateau of productivity
David Marr
1945-1980
British mathematician and neuroscientist
worked on visual processing
question: how can we understand information processing systems like the brain?
his biggest legacy was the idea that we can understand and model a system at a number of levels: computation, algorithm, implementation
problem with understanding the brain (Marr)
we can only ever hope to sample a tiny fraction of its activity, in a tiny fraction of the brain
how did Marr propose we make sense of brain data
break any brain problem into three levels
Marr’s levels of analysis
computation
algorithm
implementation
computation
the problem being solved
algorithm
the steps/rules to solve it
implementation
the actual machinery
what type of approach would Marr prefer?
top-down, because then we are not bogged down by the infinite amount of data we find (the elephant in the dark)
how should we consider Marr’s levels?
consider all three at the same time, moving up and down between them
bottom-up approach: neuroscience and AI (Kriegeskorte & Douglas, 2018)
implementation: the machinery of neural circuits
→ rules (algorithm): what representations and algorithms can we generate, given specific neural circuits?
→ problem (computation): what problems are solved by these algorithms?
exp