describe mental imagery and how it relates to perception
mental imagery: the act of “perceiving” without stimulus
Farah: imagery primes perception
Perky: imagery can absorb or mask faint perception (real percepts are mistaken for images)
imagery is more fragile and effortful but uses the same machinery as perception
explain the Kosslyn–Pylyshyn debate: spatial vs. propositional representations
evaluate evidence from mental scanning, brain imaging, and TMS
brain imaging (Kreiman et al): neurons in medial temporal lobe respond to both perceiving and imagining
fMRI (Le Bihan et al): primary visual cortex is activated during recall
Ganis et al: task: participants either perceived a faint image or imagined it
Finding: there was a near-complete overlap in the frontal and parietal areas with differences in the early visual cortex
imagery relies mainly on higher-level visual processing, while perception additionally engages lower-level (early visual) areas
suppressing the non-visual (Amedi et al): during visual imagery, non-visual (e.g., auditory) areas become suppressed, mirroring what happens during visual perception. the brain treats imagery like perceptual processing that requires protection from interference
TMS (transcranial magnetic stimulation): magnetic pulses induce an electrical current in cortical neurons, temporarily disrupting the region. logic: if disrupting x impairs y, x is causally involved in y.
articulate where current evidence leaves the imagery debate
TMS evidence argues against the epiphenomenon view.
the epiphenomenon account predicts that disrupting the visual region would not impair imagery; in fact, TMS over visual cortex impairs both imagery and visual perception
aphantasia
no visual imagery
hyperphantasia
higher levels of visual imagery than normal
dual coding theory
words that evoke an image in the mind are more easily remembered
Shepard and Metzler
mental rotation:
determine whether two rotated shapes are the same object or mirror images
finding: reaction time increases linearly with the angle of rotation, suggesting mental imagery is analog/spatial, not symbolic
Kosslyn
spatial (depictive)
Mental scanning:
people remember a picture - they mentally scan across the image to find an answer
finding: longer distances take longer to scan
mental images preserve spatial information
map:
they do this mental scanning implicitly even when told not to
Mental size:
told to imagine a rabbit's whiskers with the rabbit pictured next to an elephant vs. next to a fly; verification took longer next to the elephant (small rabbit image) and was easier next to the fly (large rabbit image)
finding: details are harder to resolve in smaller mental images, so smaller images need more processing time
Pylyshyn
propositional
imagery is an epiphenomenon
it accompanies cognition but has no causal role
mental representations are propositional abstract symbol structures
The spatial appearance of imagery is in the mind of the observer, not in the representation
Key argument:
people are not really scanning a visual image.
participants are acting out what is expected (that it takes long the further the distance)
Evidence:
when told to imagine scanning faster, participants are able to
if imagery were genuinely spatial, it would be constrained by a fixed medium
Thatcher effect
rotation of the face:
rotation of facial features is undetectable upside-down
when mentally rotating a face we cannot process all of its components at once
imagery is not a single picture - it has propositional components
Pylyshyn: this is what is expected if imagery is a structured symbol system
Farah
imagination primes perception:
task: people are told to imagine an H or a T; then one of the letters is flashed
finding: detection is better when the flashed letter matches the imagined one
finding: imagination and perception share a common representational substrate
language
communication system that uses symbols to convey meaning
allows us to learn from others
creativity of language
language is a hierarchical system
components combine into larger units: sound, words, sentences, stories
governed by rules that determine how components can be arranged
we can generate and understand sentences we have never encountered before
this creativity sets human language apart from animal communication
Universality of Language
almost everyone learns a language
deaf children with no exposure to sign language will invent their own sign systems (home sign)
early language development is similar across children
core features: word, grammar, nouns, verbs, negatives, questions, tense
psycholinguistics
the study of the cognitive processes humans use to understand, produce, and learn languages
comprehension, production, representation, acquisition
components of language
phonetics, phonology, morphology, syntax, semantics, pragmatics
phonetics → speech sounds; phonology → phonemes; morphology → words; syntax → phrases and sentences; semantics → literal meaning of sentences; pragmatics → meaning in context of discourse
phoneme: the smallest unit of sound that changes meaning
grapheme: the smallest written unit corresponding to a phoneme
morpheme: the smallest meaningful unit
lexicon: all the words a person knows
syntax: the rules for combining words into phrases and sentences
semantics: the meanings of words and sentences
phonemes and graphemes
phoneme: the shortest segment of speech that, if changed, changes the meaning of a word
grapheme is the written representation of a phoneme
languages differ in orthographic depth: how regular the mapping is between sound and letters
shallow orthographies: consistent sound-letter mapping
deep orthographies: irregular mappings - “there/their” “two/too/to”
morphemes
smallest unit of language with actual meaning
free morpheme can stand alone as words “bed” “room”
bound morphemes attach to other morphemes (un-, -ing, -s)
morphological processing: complex words are decomposed into their component morphemes during recognition
mental lexicon
your mental dictionary
lexicon size depends on education, age, and what counts as a word
starts with about one word at age one; the adult lexicon is roughly 42,000 - 50,000 words
vocabulary grows rapidly in childhood and continues to grow to adulthood
spoken word perception
recognizing spoken word is harder than it seems
sloppy pronunciation: we rarely articulate every sound clearly
different accents: the same word sounds very different across speakers
no clear boundaries: in continuous speech, words run together without pauses
the brain uses context and statistical regularities to segment the speech stream
speech spectrograms reveal enormous variation even for the same phrase
word superiority effect
a letter is recognized faster when it appears in a word than when presented alone
suggests words and letters are activated simultaneously, not serially
Rumelhart and McClelland's interactive activation model
top-down processing: word-level knowledge feeds back to help recognize letters
similarly, a sentence superiority effect: words are recognized faster in grammatical sentences
word frequency and ambiguity
word frequency effect: words that occur more frequently are recognized faster
confound: high-frequency words tend to be shorter
lexical ambiguity: many words have multiple meanings
biased dominance: one meaning is much more frequent than the other
balanced dominance: meanings are roughly equal in frequency (cast → actors in a play / plaster cast)
access speed depends on both dominance and sentence context
semantics and syntax
semantics: the meaning of words and sentences
syntax: the rules for combining words into grammatical structures
semantic violation: meaning is wrong
syntactic violation: grammar is wrong
the brain processes these two types of information through different mechanisms
Parsing and Garden path sentences
parsing: mentally grouping words into phrases to extract meaning
garden path sentences initially appear to mean one thing but turn out to mean another (e.g., "The horse raced past the barn fell")
late closure: the parser assumes each new word belongs to the current phrase
when this assumption fails, the reader must reparse causing mental confusion
context helps resolve ambiguity in real time
understanding stories
understanding text requires building coherent mental representations
we go beyond what is explicitly stated through inferences:
anaphoric inference: connecting pronouns and phrases to earlier referents
instrument inference: filling in implied tools or methods
causal inference: linking events to their causes
situation models: we represent stories as if we were experiencing the events ourselves
embodied language comprehension
situation models: suggest we mentally simulate what we read
Stanfield and Zwaan: participants responded faster to pictures matching the implied orientation in a sentence
same effect for implied shape
brain imaging: reading action words activates similar cortical areas as actually performing those actions
language understanding is deeply connected to perception and action
conversation and language production
conversation is dynamic and rapid: speakers take turns seamlessly
given-new contract: speakers structure sentences with given (shared) information before new information
common ground: mutual knowledge and assumptions between speakers
same-as-me bias: we initially assume new conversation partners share our knowledge and perspective
syntactic coordination: conversation partners tend to adopt similar grammatical constructions
syntactic priming: hearing a specific construction increases the chance you will use it too
reduces computational load making conversation flow more smoothly
language disorder
Broca's aphasia: difficulty producing sentences
Wernicke's aphasia: fluent but often meaningless speech, with impaired comprehension
Broca’s and Wernicke’s area
two brain regions for language: Wernicke's area (temporal lobe) for language comprehension and semantics
Broca’s area (frontal lobe) language production and syntax
modern neuroscience reveals a more distributed network (the perisylvian network)
Aphasia
Broca's (non-fluent): patients speak slowly and effortfully, with impaired grammar
Wernicke's (fluent): patients speak fluently but with little meaning, and comprehension is impaired
ERPs for language
N400: a negative wave at 400ms, sensitive to semantic violations, associated with temporal lobe (Near Wernicke’s area)
P600: a positive wave at 600ms sensitive to syntactic violations associated with frontal lobe (near Broca’s area)
language lateralization
language is primarily processed in the left hemisphere
lateralization has surprising consequences for perception:
Gilbert et al.: color category effects are stronger for stimuli in the right visual field, which projects to the left (language-dominant) hemisphere
how do we learn languages
three major theoretical perspectives:
behaviorist (Skinner): language is entirely learned through reinforcement
nativism (Chomsky): language is rooted in an innate universal grammar
statistical learning: language is acquired by detecting statistical regularities in the input
Skinner
Behaviorism:
language is entirely learned behavior shaped by reinforcement and conditioning
there is no need to assume underlying mental structures like grammar
the problem: children produce sentences they have never heard, including ones they would never have been rewarded for
Chomsky
universal grammar:
we possess an innate universal grammar shared across languages
children only need a social context, not explicit teaching or reinforcement
criticism: how do we explain the enormous difference in grammar between languages
Statistical learning
infants segment words by tracking transitional probabilities between syllables (Saffran et al.)
does language shape thought?
Sapir-Whorf hypothesis
the language we speak shapes thought (strong version: determines; weak version: influences)
Evidence for linguistic relativity
Color perception: speakers of languages with more color terms (e.g., Russian's distinct words for light vs. dark blue) discriminate colors across those boundaries faster
spatial reasoning: speakers of languages that use absolute directions (e.g., Guugu Yimithirr) keep track of cardinal directions remarkably well
Grammatical gender: speakers tend to describe objects in ways consistent with the noun's grammatical gender
Gilbert et al
criticisms and nuance
the Inuit snow myth: the claim that Inuit languages have an extraordinary number of words for snow is greatly exaggerated
What is an emotion?
"an inferred sequence of reactions to a stimulus, designed to have an effect upon the stimulus that initiated the complex sequence"
Plutchik’s Definition of Emotion
emotions are responses to objects and events that take place in the environment
emotions are functional, quickly and efficiently facilitating action that has an effect on the world around us
emotion takes place inside people and must be inferred rather than observed directly
emotions versus moods
emotions:
short lived, tied to an identifiable trigger, lead to a specific action
Moods:
longer-lived, less tied to identifiable triggers, and not tied to a specific action
Approach vs withdrawal
behavioral inhibition system vs behavioral activation system (BIS/BAS)
emotions can be distinguished by tendencies to either approach or withdraw
circumplex model
when people rate the similarity of emotion words, plotting closely related words near each other produces a two-dimensional circle:
the two dimensions are:
valence: how pleasant it is
arousal: how strongly you feel it
proposed criteria for basic emotions
it should be universal among humans
should have a universal innate form of non-verbal expression
should be evident early in life
should be physiologically distinct from each other in the body and in the brain
Plutchik Model
why we study emotion
originally emotion and cognition were studied separately, but it is now clear that they are intertwined
amygdala
it is in the medial temporal lobe
it is a threat detection system
receives sensory input via two routes:
fast subcortical pathway (thalamus to amygdala) for rapid detection
slower cortical pathway for detailed evaluation
emotion and attention
dot-probe task: probes appearing at the location of an emotional (especially threatening) stimulus are detected faster, showing that emotion captures attention
Broaden-and-Build hypothesis of positive emotions
negative emotions may focus attention, in order to solve problems
positive emotions may broaden attention, in order to identify new opportunities and build resources
Navon Task: global (broad) vs local (focus) attention
critical modification to Broaden-and-Build
not all positive emotions may have the same effect on attention; we need to consider the emotion's motivational intensity:
pre-goal emotions are higher in motivational intensity and involve pursuing goals and solving problems
post-goal emotions are lower in motivational intensity and involve reactions to goals that have already been attained
approach motivation and breadth of attention
emotion and memory: encoding
memory encoding: the initial formation of a new memory
questions: are emotionally arousing images better remembered?
procedure: view 60 photos of objects and events ranging from neutral to emotionally intense
arousal and memory
Yerkes-Dodson law: attention, learning, and other aspects of cognition are at their best when arousal is at an intermediate level
physiological arousal and memory
does emotional memory facilitation depend on physiological arousal?
procedure:
results:
emotions and memory consolidation
amygdala and hippocampus are closely related.
synaptic tag-and-capture hypothesis:
the brain tags new memories for enhanced consolidation later, if later events indicate that the memory is especially important
emotion and memory: retrieval
state-dependent retrieval: we’re more likely to recall a memory in a state similar to the one we were in when the memory was formed