COGSCI Exam 2

Last updated 1:30 AM on 10/23/25
198 Terms

1
New cards

Most sentences you encounter are new to you

In your whole life, you probably hear on the order of a billion sentences (give or take an order of magnitude). The chances you hear an individual sentence twice are pretty slim. It's remarkable that, despite this, you have no trouble with (most) sentences you hear!

2
New cards

How does language work?

language is a thought transmission system; it transmits thought from one mind to another via externalization.

Steps:

Person A thinks of something, then something happens, and they emit the signal.

Person B receives their signal, then something happens, and they think of something.

Essentially: meaning to form to meaning.

3
New cards

Marr's Tripartite Explanatory Framework

Functional level: problem that the capacity is supposed to solve (form to meaning)

Algorithmic level: procedures that enable the problem to be solved (form-meaning mapping/tree structures)

Physical level: neural/chemical substrates implementing the procedures

4
New cards

Noam Chomsky

The way of thinking about the scientific study of human language that we will adopt is largely due to Noam Chomsky, and it traces back to his work in the 1950s and 1960s.

We will refer to Chomsky's approach to language as cognitivism.

Cognitivism (sometimes called "generative" or "Chomskyan") provides us with a view of language as a mind-internal computational system.

5
New cards

TRUE OR FALSE: According to the chapters from Language Unlimited by David Adger, children may learn the very basics of grammar informally from their parents, but they typically won't develop complex "mental grammar" until they're taught the finer details in school.

False! We actually passively/reflexively learn many intricate things relating to language without being taught them.

6
New cards

Poverty of Stimulus: Language Edition

We have been classifying theoretical approaches in cognitive science with reference to poverty of the stimulus. In the case of language, most language learning is entirely reflexive, it happens rapidly, and the input is relatively sparse, yet you know a lot of complicated things about your language! In fact, kids know most of the interesting stuff by the time they're 4, 5, maybe even 6 years old.

7
New cards

Chomskyan view

innate structure fills the gap between the child's experience and attained knowledge.

so: input + innate structure = output

8
New cards

Associationism view

denies the reality of the poverty of stimulus problem: input is sufficiently rich for the output

so: input = output

9
New cards

nativism view on language

language is entirely innate (nature)

10
New cards

associationism view on language

language is entirely a matter of learned associations

11
New cards

chomskyan cognitivism

Nature (universal grammar) and nurture (learning) together account for language! This view combines the nativist and associationist views, taking a hybrid position between the two.

12
New cards

behaviorism

behaviorism (associated with B.F. Skinner) is a descendant of the older associationist tradition; it posits that all behavior, including language, is learned via stimulus-response associations, so there is no innate structure, just learning

13
New cards

stimulus response associations

if you frequently get a reward after a certain behavior in a given situation, you become more likely to do that behavior in that situation. For example, if a parent shows a kid an onion and the kid says "onion!", the parent approves and the word-object association is reinforced. Sentences are learned the same way: if a child sees an onion in a suitcase and says "the onion is in the suitcase," the parent approves and the sentence-situation association is reinforced.

14
New cards

according to associationism, what is "in the head" that explains language?

a bunch of associations between specific verbal actions and specific situations (plus a minimal learning apparatus that enables the learning of associations)

it got there through associationistic learning, in particular instrumental conditioning

15
New cards

what's wrong with the associationistic approach to language?

it can't explain core properties of language:

1. stimulus independence

2. novelty

3. productivity

4. systematicity

16
New cards

core property of language: stimulus independence

people routinely utter sentences in contexts that are independent of the contexts in which the sentences were learned, like people can say sentences in contexts that they weren't reinforced in

the associationist account can't explain this as it would imply that sentence production is tied to the context of learning

17
New cards

core property of language: novelty

children correctly generalize grammatical rules to made-up words, and they overgeneralize to the point that they make grammatical errors (like "go-ed" or "tooths")

the associationist account fails here too because these behaviors couldn't have been reinforced; the children never heard them before!

18
New cards

core property of language: productivity

productivity is infinite generative capacity

for example, the successor function: if x is a whole number (base case), then x+1 is a whole number (defining an infinite set from a base case plus a successor function). Likewise, 'it's raining' becomes 'Timothy thinks it's raining,' which becomes 'Mary thinks Timothy thinks it's raining.'
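The successor-style recursion in this card can be sketched in Python. The `embed` helper is a hypothetical illustration (not from the course materials): a base sentence plus one rule, applied repeatedly, yields unboundedly many sentences.

```python
def embed(sentence, speakers):
    """Apply the 'successor' rule: wrap the sentence under 'X thinks ...'
    once per speaker, producing an ever-longer grammatical sentence."""
    for speaker in speakers:
        sentence = f"{speaker} thinks {sentence}"
    return sentence

base = "it's raining"                      # base case
print(embed(base, ["Timothy"]))            # prints "Timothy thinks it's raining"
print(embed(base, ["Timothy", "Mary"]))    # prints "Mary thinks Timothy thinks it's raining"
```

Because `embed` can be applied any number of times, there is no longest sentence, which is exactly the productivity point.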

19
New cards

core property of language: systematicity

language is systematic: knowing a single expression is tantamount to knowing a group of expressions. For example, when you know the sentence "the onion is in the suitcase," you also know the sentence "the suitcase is in the onion," even though that doesn't (normally) make a whole lot of sense! Also, if you learn, for instance, "the onion is between the mercury and the syringe," then you immediately also know that "the onion is between the syringe and the mercury."

but all you heard was the first one, so according to the associationist learning story, that's all you should get

20
New cards

generative/chomskyan view

essentially, "what's in the head"/what we know when we know a language is a set of abstract combinatoric rules (for example, phrase-structure grammars)

21
New cards

how did we acquire knowledge of language according to cognitivism/chomsky?

it's substantially innate - part of universal grammar

22
New cards

universal grammar

this isn't what it sounds like: it doesn't mean a grammar that is universal across all species.

it means the species-typical capacity (and limits) of the human mind to acquire and use language (for this class, we're going to say this has something to do with phrase-structure grammar)

23
New cards

syntax

syntax is about the hierarchical organization of sentences (words form phrases, phrases form larger and larger phrases, and ultimately sentences)

24
New cards

is the syntax of a sentence its meaning?

no! for example, "colorless green ideas sleep furiously" is a well-formed sentence; it's syntactically intact but has no meaning

25
New cards

phrase-structure grammar

a phrase-structure grammar is a set of rewrite rules that defines the structure of every valid sentence in a language (if multiple rules could apply, choose one; apply rules one at a time; once no more rules can be applied, we have generated a sentence)
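The rewrite procedure can be sketched with a toy grammar in Python. The rules and vocabulary below are invented for illustration (they are not the course's grammar): each nonterminal maps to its possible right-hand sides, and we rewrite until only words remain.

```python
import random

# Toy phrase-structure grammar: nonterminal -> list of rewrite rules.
# Anything not in the dictionary is a terminal (a word).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["onion"], ["suitcase"]],
    "V":   [["sees"]],
}

def generate(symbol="S"):
    """If multiple rules could apply, choose one; apply rules one at a
    time until no more apply, yielding a list of words."""
    if symbol not in GRAMMAR:          # terminal: nothing left to rewrite
        return [symbol]
    rule = random.choice(GRAMMAR[symbol])
    words = []
    for part in rule:
        words.extend(generate(part))
    return words

print(" ".join(generate()))  # e.g. "the onion sees the suitcase"
```

Adding a recursive rule such as `"NP": [["Det", "N"], ["NP", "PP"]]` would make the set of generated sentences infinite, which is the recursion point made in the following cards.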

26
New cards

properties of phrase-structure grammars

phrase-structure grammars are:

hierarchical: a sentence is broken down into abstract constituents, which are in turn broken down into further constituents

combinatoric: the rules are defined over elements that can be recombined in open-ended ways

recursive: rules can feed into each other -- the output of one rule can be the input to another -- potentially forming "infinite loops" that allow sentences to be arbitrarily long

27
New cards

top-down vs bottom-up

top-down would be making the tree from the top down (so starting with S and then making all the branches)

bottom-up would be going from the branches back to the start, S ("given a string of words, what's its structure?")

this is a hierarchical way to look at phrase-structure

28
New cards

is syntax recursive?

yes! syntax is recursive: a recursive grammar allows a category to branch back into itself, or into a category that eventually leads back to itself

consequences of this are that a sentence can be arbitrarily long and there are infinitely many possible sentences

29
New cards

The sentence, "colorless green ideas sleep furiously" is

semantically meaningless; syntactically well-formed

30
New cards

which of the following is NOT a correct description of phrase structure grammars and phrase structure trees?

they are hierarchical

they are combinatoric

they employ abstract categories (such as noun phrases and verb phrases)

they are deciduous

they are deciduous

31
New cards

ambiguity

language is ambiguous in many different ways!

32
New cards

lexical ambiguity in language

A word can have multiple different meanings!

"I saw a bat" could mean a baseball bat or the animal bat

33
New cards

structural ambiguity in language

words may individually be unambiguous, but there can be multiple ways to assemble them into a sentence. "The scouts took the key to the cellar," for example, could mean either that they brought the key to the cellar or that they took the key that opens the cellar.
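One way to make the two readings concrete is to represent each parse as a nested structure. The bracketings below are a minimal sketch (hypothetical groupings, not the course's official trees): the same words grouped differently give different objects.

```python
# Reading 1: [took [the key] [to the cellar]]
# "to the cellar" attaches to the verb: they brought the key somewhere.
reading1 = ("took", ("the", "key"), ("to", ("the", "cellar")))

# Reading 2: [took [the key [to the cellar]]]
# "to the cellar" attaches to the noun: the key that opens the cellar.
reading2 = ("took", ("the", "key", ("to", ("the", "cellar"))))

# Same words in the same order, but different structures:
print(reading1 == reading2)  # prints False
```

The string of words underdetermines the structure; the structure, not the words, fixes the interpretation.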

34
New cards

How does Chomskyan Cognitivism capture our four "key properties" of language?

Stimulus independence: People routinely utter sentences in contexts that are independent of the contexts in which the sentences were learned.

Novelty: Many of the sentences a person knows have never been encountered before.

Productivity: There is no upper bound on the number of sentences in a language.

Systematicity: Knowing a sentence is tantamount to knowing a whole set of related sentences.

35
New cards

Novelty

Recall: Children apply regular rules to new words, and over-apply regular rules to real words that happen to be irregular.

Associationist account:

Problem! Kids have never been reinforced to do this.

Cognitivist account:

No problem! Kids are doing exactly what the cognitivist account predicts: learning and applying abstract rules.

36
New cards

Systematicity

How is it that knowing one sentence is tantamount to knowing a set of sentences?

Cognitivist answer: Knowing a sentence means knowing its structure. And the same structure can be filled in in different ways!

37
New cards

Principles & Parameters of Universal Grammar

One influential approach to the content of UG holds that UG consists of:

Principles: Abstract properties all language share

Parameters: Limited points of variation along which languages can differ.

38
New cards

Principles

In the Principles & Parameters framework, a principle is an abstract condition or rule that characterizes all languages.

Structure dependence of rules (next card) is one purported example of such a principle.

39
New cards

Structure dependence of rules

All grammatical rules are defined with reference to hierarchical structure (so not, for example, absolute linear position)

40
New cards

What about parameters?

Language variation can often be described in terms of binary choices.

The important thing: Parameter choices are not just the logical possibilities.

Example: In forming content questions, languages either move a question word to the front (English), or they leave it in place (Chinese).

Those are the only options languages use*, even though there are other logical possibilities.

41
New cards

Does UG consist of principles & parameters?

Maybe!

The Principles & Parameters approach is influential, but there are lots of debates about what the specific principles and parameters might be.

And there is criticism of the general framework that we won't talk about here.

42
New cards

Universality of Language

Language is present in all human societies, and in typical development, everyone develops it around the same time.

43
New cards

Equal Complexity of Languages

All languages are equally complex; there are no "stone-age" or "primitive" languages.

44
New cards

Comparison to Non-Innate Traits (like farming)

Non-innate traits such as farming are not universal — they are not present in all societies, not everyone develops them, and they vary in complexity.

45
New cards

Farming as a Non-Innate Trait Example

Farming is not universal, not equally complex across societies (e.g., subsistence vs. industrialized farming), and while some forms are more complex, none are inherently "better."

46
New cards

Key Difference Between Language and Non-Innate Traits

Language shows a universal pattern of development and complexity across all humans, unlike non-innate cultural traits such as farming.

47
New cards

Universality of Language

Language exists in all human societies; in typical development, everyone acquires it on a similar timeline.

48
New cards

Equal Complexity of Languages

All languages are equally complex—there are no "primitive" or "stone-age" languages.

49
New cards

What Universality Might Indicate About UG

If certain properties are universal across all languages, they might reflect innate principles of Universal Grammar (UG).

50
New cards

Alternative Explanations for Universality

Historical accident, logical necessity, or universal functionality could also explain universal traits; these alternatives must be ruled out before attributing a universal to UG.

51
New cards

Comparison: Language vs. Non-Innate Traits (like farming)

Unlike language, non-innate traits such as farming are not universal, not equally complex, and not developed by everyone.

52
New cards

Poverty of the Stimulus

The linguistic input children receive is insufficient to explain their knowledge of complex grammatical rules, implying innate linguistic knowledge.

53
New cards

Combinatoric and Abstract Rules

The rules of language combine elements using highly abstract categories; humans apply them without explicit teaching.

54
New cards

Child Language Learning Bias

Children appear to "know" certain aspects of language structure in advance, suggesting an innate bias toward phrase-structure rules.

55
New cards

Universal Grammar (UG) Model

Input → Universal Grammar → Output; UG guides how linguistic input becomes structured language output.

56
New cards

Structure-Dependent Rules

Linguistic rules depend on hierarchical syntactic structure, not simple linear order of words.

57
New cards

Question Inversion Example

"The man who is tall is running" → "Is the man who is tall running?" (Rule: move the main clause auxiliary, not the first auxiliary you see.)
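The contrast between the wrong linear rule and the correct structure-dependent rule can be sketched in Python. The bracketed word list and both helper functions are hypothetical illustrations: the relative clause is marked with brackets, and the two rules pick different auxiliaries.

```python
# "The man who is tall is running", with the relative clause bracketed.
words = ["the", "man", "[who", "is", "tall]", "is", "running"]

def first_aux_linear(words):
    """The wrong, linear rule: front the first 'is' you encounter."""
    return next(i for i, w in enumerate(words) if w.strip("[]") == "is")

def main_clause_aux(words):
    """The structure-dependent rule: ignore auxiliaries inside the
    bracketed relative clause; front the main-clause auxiliary."""
    depth = 0
    for i, w in enumerate(words):
        if w.startswith("["):
            depth += 1                 # entering the embedded clause
        if depth == 0 and w == "is":
            return i                   # auxiliary of the main clause
        if w.endswith("]"):
            depth -= 1                 # leaving the embedded clause
    return None

print(first_aux_linear(words))   # prints 3: the 'is' inside "who is tall"
print(main_clause_aux(words))    # prints 5: the main-clause 'is'
```

The linear rule would yield the ungrammatical "Is the man who tall is running?", while the structure-dependent rule yields the correct question, which is the rule children actually use.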

58
New cards

Linear vs. Structure-Dependent Hypothesis

A simpler "front the first auxiliary" rule fits many examples, but speakers consistently use the correct structure-dependent version.

59
New cards

Children's Learning Pattern

Children never even consider the simpler linear rule—showing an innate bias for structure-dependent grammatical rules.

60
New cards

Argument from Universality

Language shows a unique universal pattern—supports innateness since no learned or cultural trait behaves this way.

61
New cards

Argument from Poverty of Stimulus

Language knowledge exceeds input; supports innateness since children cannot learn all grammatical details from experience alone.

62
New cards

Combined Conclusion

Both arguments suggest humans have an innate language faculty, often referred to as Universal Grammar (UG).

63
New cards

Quote - Chomsky (Reflections on Language)

"The rich and complex language individuals acquire is hopelessly underdetermined by fragmentary evidence. This can only be explained by restrictive innate principles guiding language construction."

64
New cards

Purpose of Senghas et al. (2004) Study

To test whether core properties of language—discreteness and combinatorial patterning—can emerge naturally in children's language creation.

65
New cards

Nicaraguan Sign Language (NSL) Background

Emerged in the late 1970s-1980s when deaf children in Managua schools began creating a shared gestural language that became more complex over generations.

66
New cards

NSL Timeline

Before 1970s: Isolated home signs

1977-1981: Deaf schools established

1980s-2000s: Children create and transmit a full language system

By 2004: ~800 signers

67
New cards

Discreteness and Combinatorial Patterning

Every language combines a finite set of discrete elements (sounds/signs) into larger hierarchical structures (words, phrases, sentences).

68
New cards

Main Finding of Senghas et al. (2004)

Successive cohorts of NSL users showed increasing combinatorial and sequential structure, even though this was absent in their input.

69
New cards

Motion Event Coding in NSL

Early signers expressed manner and path simultaneously; later cohorts expressed them sequentially—showing emergent combinatorial structure.

70
New cards

Sequential Combination Bias

Even when simultaneity is possible, humans show a predisposition for linear, sequential structure—suggesting an innate bias for language-like organization.

71
New cards

Quote - Senghas et al. (2004)

"Human learning abilities are capable of creating discreteness and hierarchical combination anew, even when absent from the environment."

72
New cards

NSL & Innateness

NSL shows that children can create language with universal structural properties, supporting the existence of an innate language-learning mechanism.

73
New cards

LEVELS OF ANALYSIS (Marr's Levels Applied to Language)

Functional Level: Describes the problem that the language capacity is designed to solve—mapping meaning to form and vice versa.

Algorithmic Level: Specifies the procedures or rules that enable the problem to be solved (e.g., phrase-structure rules).

Physical Level: Refers to the neural or biological mechanisms that implement language processing.

74
New cards

Hierarchical Organization

Language is organized into nested structures (e.g., phrases within phrases) that determine meaning beyond linear order.

75
New cards

Syntactic Ambiguity

When the same sequence of words has multiple meanings because of different possible structural groupings, not because individual words are ambiguous.

76
New cards

Chapter 1: Creating Language (Reading 1)

Human language is infinitely creative: most sentences we encounter are completely new, yet we effortlessly understand them. This creativity is possible because language is structured hierarchically, not just as linear word strings. Speakers have both a mental lexicon (words) and a mental grammar (rules), which together allow for unlimited combinations. The chapter contrasts two explanations for this ability: (1) Chomskyan / Universal Grammar view — language is supported by innate, species-specific capacities that impose hierarchical structure; and (2) Darwinian / pattern-learning view — language arises from general learning abilities that detect patterns from experience. Examples like English “that”-drop patterns show that input alone doesn’t provide enough data for learners, supporting the existence of Universal Grammar. The chapter argues that self-similarity (structures within structures) gives language its limitless expressive power.

77
New cards

Chapter 3: A Sense of Structure (Reading 2)

Humans have an unconscious "sense of linguistic structure" that lets us interpret and produce sentences beyond what is audible or visible. Ambiguous phrases (e.g., "public park or playground") show that structure, not word meaning, determines interpretation. This sense operates like other perceptual systems: bistable ambiguities (e.g., "She looked up the mountain" or the Necker cube illusion) demonstrate how we select one interpretation at a time. Contextual cues (like adding "desperately") can disambiguate structure. Psychological experiments (Bock's structural priming) reveal that speakers subconsciously reuse syntactic structures, showing structural sensitivity in production. Brain studies (Pallier et al., 2011; Poeppel et al., 2016) show neural regions (notably the Inferior Frontal Gyrus) and brain rhythms track abstract syntax even with nonsense input ("Jabberwocky"). This evidence supports the idea that language involves abstract mental structures — "mental gestures" — that we cannot consciously access, yet rely on constantly.

78
New cards

Senghas et al. (2004): Children Creating Core Properties of Language - Nicaraguan Sign Language (Reading 3)

In a natural experiment on the birth of Nicaraguan Sign Language (NSL), Senghas, Kita, and Özyürek show that children—not adults—drove the emergence of core linguistic properties: discreteness and combinatorial, hierarchical structure. Comparing cohorts (1st: pre-1984; 2nd: 1984–1993; 3rd: post-1993) describing motion events, they found older hearing speakers’ co-speech gestures encode manner+path simultaneously (holistic/iconic), while successive child NSL cohorts increasingly segment manner and path into sequential elements and combine them by rules—evidence of linearization and hierarchy absent in the input. Children also innovated A-B-A embeddings (e.g., roll–descend–roll) to signal simultaneity within a larger structure, a clear hierarchical device not present in surrounding gestures. The pattern reveals that preadolescent learners possess analytic, recombinational biases that reshape input into language-like systems, supporting a sensitive period and arguing against accounts where discreteness/hierarchy arise solely via cultural transmission without being reflected in the learner’s mechanisms.

79
New cards

Hickok, Bellugi & Klima (2001/2002): How the Human Brain Processes Language (Reading 4)

Studies of deaf signers reveal that language is left-lateralized regardless of modality: left-hemisphere damage causes sign aphasias paralleling Broca’s and Wernicke’s aphasias in speech, while right-hemisphere damage mainly affects discourse coherence and spatial referencing, not core linguistic ability. ASL has phonological, morphological, and syntactic structure, and sign-language errors mirror spoken-language paraphasias. Brain imaging shows Broca’s and Wernicke’s areas activate for sign just as for speech, indicating that central linguistic computations are modality-independent, while peripheral sensory/motor processing differs. These findings support a modular brain architecture: language systems are specialized for linguistic computation, not tied to auditory–oral or visual–manual channels.

80
New cards

Samuels (2004): Innateness in Cognitive Science (Reading 5)

Argues that common “innate” glosses (non-acquired, present at birth), and biology-based accounts (genetic determination, invariance/canalization, heritability) all misfire for cognitive science. Proposes a psychological primitiveness view: a cognitive trait is innate if its acquisition isn’t explained by psychological/learning processes but by lower-level (e.g., biological) mechanisms—subject to a normalcy condition (it emerges in normal development). This reframes innateness as a tool to (1) mark the limits of psychological explanation and (2) identify building blocks for developmental theories, while noting open issues (what counts as “learning,” how to fix “normal” environments).

81
New cards

Arguments for the innateness of language

Universality:

All human communities have language, and these languages share core structural properties (e.g., hierarchical syntax, recursion, abstract categories). If language were purely learned, we’d expect wide variation. Instead, this universal pattern strongly suggests that humans are biologically predisposed to acquire language.

Poverty of the Stimulus: Children are not explicitly taught the full set of grammatical rules, and the input they receive is limited and imperfect. Yet they consistently produce rich, abstract, rule-governed language. This implies that something innate must fill the gap between the sparse input and the complex linguistic output.

82
New cards

Emergence of Hierarchical Structure (NSL)

Sequential A-B-A constructions (e.g., roll-descend-roll) show how children innovated hierarchical patterns to express simultaneity in sign language.

83
New cards

Associationism vs. Cognitivism (Language Disorders)

Associationism struggles to explain double dissociations (e.g., Broca's vs. Wernicke's aphasia), which show discrete language functions.

84
New cards

Double Dissociation

When two patients show opposite impairment patterns for two tasks. In language:

Broca's aphasia = impaired production, intact comprehension

Wernicke's aphasia = fluent but meaningless speech, impaired comprehension. This supports neural localization of language functions.

85
New cards

Broca's Aphasia

Damage to Broca's area (left frontal lobe, "syntactic region") → impaired, nonfluent, agrammatical speech but relatively preserved meaning.

86
New cards

Wernicke's Aphasia

Damage to Wernicke's area (left temporal lobe, "semantic region") → fluent but semantically incoherent speech; impaired comprehension.

87
New cards

Language and the Brain: Modality Independence

Hickok et al. (2002) showed that sign language recruits Broca's and Wernicke's areas just like speech. Language is not tied to auditory-vocal modality.

88
New cards

Hickok et al. (2002): Core Finding

Sign language processing is left-lateralized like spoken language, and independent from general visual-spatial abilities—supporting a modular view of language.

89
New cards

Language & Cognitive Ability Dissociation

Williams syndrome shows impaired general cognition but preserved linguistic structure, demonstrating language is not dependent on general intelligence.

90
New cards

Four Key Claims about Language Processing

Mandatory – Language processing happens automatically, without conscious control.

Fast & Incremental – We interpret language rapidly as it unfolds (~250–350 ms per word).

Largely Unconscious – Most processing occurs outside awareness (e.g., semantic priming).

Ambiguity is Pervasive – The brain must quickly resolve ambiguous input using multiple information sources.

91
New cards

What is the significance of sequential sign patterns like roll-descend-roll in language development?

They express simultaneity and show children innovating hierarchical linguistic structure.

92
New cards

How do syntax and semantics function in language processing?

They can function independently, as evidenced by neuropsychological dissociations such as aphasias.

93
New cards

What is double dissociation in the context of language impairment?

It refers to two patients showing opposite impairment patterns for two functions, exemplified by Broca's vs. Wernicke's aphasia.

94
New cards

What characterizes Broca's aphasia?

Damage to Broca's area leads to impaired, nonfluent, agrammatical speech while meaning comprehension remains intact.

95
New cards

What are the symptoms of Wernicke's aphasia?

Damage to Wernicke's area results in fluent but meaningless speech and impaired comprehension.

96
New cards

How does associationism relate to aphasia evidence?

Associationism struggles to explain distinct, localized deficits like those seen in Broca's and Wernicke's aphasias, supporting a modular view of language.

97
New cards

What does Williams Syndrome demonstrate about language and cognition?

It shows impaired general cognition but preserved language structure, indicating language independence from general intelligence.

98
New cards

What did Hickok et al. (2002) find regarding sign language and the brain?

Sign language recruits the same brain areas (Broca's and Wernicke's) as spoken language, indicating neural organization is not tied to sensory modality.

99
New cards

What is the modularity of language according to Hickok et al.?

The brain processes linguistic information through specialized, modality-independent modules, treating visual linguistic input like spoken input.

100
New cards

What are the four key claims about language processing?

1. Mandatory - happens automatically. 2. Fast & Incremental - processed as input unfolds. 3. Mostly Unconscious - outside awareness. 4. Handles Ambiguity - uses multiple info sources to resolve unclear input.