Aphasia
a disorder of language, thought to have neurological causes, in which either language production, language reception, or both, are disrupted
Broca's aphasia (expressive aphasia / motor aphasia)
an organic disorder of aphasia with symptoms including difficulty speaking, using grammar, and finding appropriate words
Curare derivative
a widely used paralyzing agent during medical and surgical procedures
Grammar
a system of rules that produces well-formed, or "legal" entities
Gricean maxims of cooperative conversation
pragmatic rules of conversation, including moderation of quantity, quality, relevance, and clarity
Informationally encapsulated process
a process that operates independently of the beliefs and other information available to the cognitive processor, using only its own dedicated inputs
Lateralization
specialization of function of the two cerebral hemispheres
Lexical ambiguity
the fact that some words have more than one meaning
Lexical decision task
a task in which an experimental subject is presented with letter strings and asked to judge, as quickly as possible, if the strings form words
Linguistic competence
underlying knowledge that allows a cognitive processor to engage in a particular cognitive activity involving language, independent of behaviour expressing that knowledge
Linguistic performance
the behaviour or responses actually produced by a cognitive processor engaged in a particular cognitive activity involving language
Manner of articulation
the mechanics of how the airflow is obstructed, creating a particular sound
Modularity hypothesis
Fodor's proposal that some cognitive processes, in particular language and perception, operate on only certain kinds of inputs and operate independently of the beliefs and other information available to the cognitive processor or other cognitive processes
Morpheme
the smallest meaningful unit of language
Morphology
the study of the meaningful units of language and how they combine to form words
Parsing
analyzing a sentence into its component parts and describing their syntactic roles
Phoneme
the smallest unit of sound that makes a meaningful difference in a given language
Phoneme restoration effect
a perceptual phenomenon where under certain conditions, sounds actually missing from a speech signal can be restored by the brain and may appear to be heard
Phonetics
the study of speech sounds
Phonological rules
rules that govern the ways in which phonemes can be combined
Phonology
the study of the ways in which speech sounds are combined and altered in language
Phrase structure rules
rules that describe the ways in which certain symbols (phrases) can be rewritten as other symbols
Place of articulation
the place where the obstruction of airflow occurs, creating a particular sound
Pragmatics
the rules governing the social aspects of language
Preposing
taking a certain part of a sentence and moving it to the front, usually for emphasis
Propositional complexity (of a sentence)
the number of underlying distinct ideas in a sentence
Semantics
the study of meaning
Spectrogram
a graphic representation of speech, showing the frequencies of sound, in hertz (cycles per second), along the y axis, plotted against time on the x axis; darker regions indicate the intensity of sound at each frequency
Speech act theory
a subfield of pragmatics that studies how words are used not only to present information but also to carry out actions
Assertives
when the speaker asserts their belief in some proposition
Directives
instructions from the speaker to the listener
Commissives
utterances (promises) that commit the speaker to some later action
Expressives
statements that describe the psychological state of the speaker
Declarations
speech acts in which the utterance is itself the action
Speech errors
instances in which what the speaker intended to say is quite clear, but the speaker makes some substitution or reorders the elements
Syntax
the arrangement of words within sentences; the structure of sentences
Constituents
small groupings of words
Truth conditions
the circumstances that make something true
Voicing
the vibration of the vocal cords during the production of a particular sound
Wernicke's aphasia (receptive aphasia / sensory aphasia)
an organic disorder of aphasia with symptoms including difficulty in understanding speech and producing intelligible speech, although speech remains fluent and articulate
Whorfian hypothesis (of linguistic relativity)
the idea that language constrains thought and perception, so that cultural differences in cognition could be explained at least partially by differences in language
McGurk effect
an auditory-visual illusion that illustrates how perceivers merge information for speech sounds across the senses
Miller (1990)
described two fundamental problems in speech perception:
Speech is continuous
A single phoneme sounds different, depending on context
Although the actual acoustic stimulus can vary infinitely in its phonetic properties, perception of speech sounds is categorical:
in processing speech sounds, we automatically (without awareness or intention) force the sounds into discrete categories
Lisker and Abramson (1970)
used a computer to generate artificial speech sounds consisting of a bilabial stop consonant (which sounds like either a /b/ or a /p/ sound) followed by an "ah" sound. The /b/ and /p/ sounds have the same consonantal features and differ only in voice onset time (VOT)
Results:
Any syllable with a VOT of 0.03 seconds or less was heard as "ba"
Any syllable with a VOT of more than 0.03 seconds was heard as "pa"
Participants did not report differences between syllables that fell on the same side of the boundary
A syllable with a VOT of 0.10 seconds was indistinguishable from a syllable with a VOT of 0.05 seconds
Two syllables that were just as close in VOT but fell on opposite sides of the boundary (ex. 0.00 and 0.05 seconds) were identified by 100% of the participants as different sounds: a "ba" sound and a "pa" sound, respectively
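Categorical perception of this kind amounts to a step function over a continuous acoustic variable. A minimal sketch, in which the step-function rule and the 0.03-second boundary value are illustrative assumptions rather than Lisker and Abramson's own analysis:

```python
# Toy model of categorical perception of voice onset time (VOT).
# The boundary value (assumed here: 0.03 s) and the hard threshold
# are illustrative assumptions, not the study's actual analysis.

VOT_BOUNDARY = 0.03  # seconds; assumed "ba"/"pa" category boundary

def perceive(vot):
    """Force a continuous VOT value into a discrete phoneme category."""
    return "ba" if vot <= VOT_BOUNDARY else "pa"

# Within-category pair: acoustically different, perceived as the same sound.
assert perceive(0.05) == perceive(0.10)

# Cross-boundary pair with the same 0.05 s acoustic difference:
# perceived as two different sounds.
assert perceive(0.00) != perceive(0.05)
```

The point of the sketch is that equal acoustic differences are perceived unequally: differences within a category are discarded, while the same-sized difference across the boundary is heard as a change of phoneme.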
We pay attention to certain acoustic properties of speech (those that make a meaningful difference in language), but ignore others
Categorical perception has also been demonstrated for some nonspeech sounds (ex. Tones, buzzes, and musical notes played on different instruments)
Very young infants can discriminate many, if not all, of the sound distinctions used in every language
This ability begins to narrow to just the phonemes in the infant's primary language when the infant is about 6 months of age
Massaro and Cohen (1983)
examined the categorical perception of the stop consonants /b/ and /d/ (two sounds that differ only in the place of articulation). Participants heard nine computer-synthesized syllables, ranging acoustically from a clear "ba" sound to a clear "da" sound, while watching a videotape of a speaker articulating a syllable (or in a neutral, auditory-only condition).
Results:
The participants did not notice a discrepancy when the auditory information presented was "ba" but the videotaped speaker was saying "da"
What the speaker appeared to be saying influenced what was heard: syllables in the middle of the "ba" and "da" continuum were perceived slightly differently as a function of what the speaker appeared to be saying relative to the perception reported in the neutral condition
We also make use of visual information in the perception of speech
Warren (1970)
presented participants with a recording of the sentence: "The state governors met with their respective legi*latures convening in the capital city," in which a 120-millisecond portion had been replaced with a coughing sound (indicated by the asterisk)
Results:
Only 1 of 20 listeners reported detecting a missing sound covered by a cough, and the one who did misreported its location;
The other 19 demonstrated the phoneme restoration effect
Neely (1977)
included prime-target pairs (ex. "BIRD"-"sparrow") that were related in meaning. He also instructed participants that whenever they saw the prime word "BUILDING," they should expect it to be followed by a target word that named a part of the body (ex. "foot").
Likewise, whenever they saw the prime word "BODY," they were told to expect it to be followed by a target word that named a part of a building (ex. "door").
On most trials the pairings followed these switch instructions; on some trials, however, the experimenter violated the switch instructions and paired the primes and targets in the "regular" related manner (ex. "BUILDING"-"roof"; "BODY"-"foot")
Results:
Showed two types of processes that could be responsible for semantic priming:
Fast-acting automatic spread of activation
When the target word was presented very shortly after the prime word (ex. 250 ms), the instructions did not matter
Slower expectancy-driven process
When the target word was presented with an increased interval that separated the presentation of the prime and target (ex. 700 ms), the instructions mattered
Smith, Brown, Toman, and Goodman (1947)
injected Smith with a curare derivative which paralyzed all his muscles and necessitated the use of an artificial respirator.
Paralysis did not prevent him from engaging in other kinds of cognitive activity
reported remembering and thinking about events that took place while under curare
subvocal speech and thoughts are not equivalent
Fodor (1983, 1985)
argued that some cognitive processes, in particular perception and language, are modular:
Domain-specific
Operates specifically with certain kinds of input and not others
Sentence parsing involves processes that are specific to the division of phrases and words into constituents
Such processes are meant only for parsing and are of little use in other cognitive tasks
Informationally encapsulated process
Operates independently of the beliefs and the other information available to the processor
Operates relatively independently of other processes
Modular processes operate automatically and independently
Modular processes are domain specific
Specialized to work with only certain kinds of input
The syntactic parsing aspects of language are not used in other kinds of cognitive processing
Bloom (1981)
noticed that the Chinese language lacks structures equivalent to those in Indo-European languages that mark a counterfactual inference (ex. "If your grandmother had been elected president, there would be no taxation").
Gave both Chinese-speaking and English-speaking participants different stories to read in their native language
Results:
7% of Chinese-speaking participants offered counterfactual interpretations of the story
98% of English-speaking participants offered counterfactual interpretations of the story
Little evidence suggests that language constrains either perception (as demonstrated in the colour-naming studies) or higher-level forms of thinking (as demonstrated in the counterfactual reasoning studies)
Petersen, Fox, Posner, Mintun, and Raichle (1988)
examined the processing of single words using PET scans. Participants were presented with single words, either in writing or auditorily, and were asked either to make no response, to read the written stimuli, or to generate a word related to the presented word
Results:
Different areas of the brain were activated for different tasks:
The areas activated did not overlap → the area of the brain activated in written-word recognition is separate from the area activated when words are heard
Concerning Broca's area…
Not all patients with lesions in Broca's area develop Broca's aphasia
Not all patients with Broca's aphasia have damage in Broca's area
Not all Broca's aphasia patients show the same degree of impairment: many of them show an inability to process subtle nuances of language