Rumelhart, McClelland, Hinton (Machine Learning)

Last updated 5:37 AM on 4/10/26

20 Terms

1. PDP Basic Unit

Processing occurs through a large number of simple elements called units, which send excitatory and inhibitory signals to one another

2. Parallelism

Unlike sequential (serial) computers, PDP models process many pieces of information—or constraints—simultaneously

3. The Microstructure of Cognition

While human thought may appear sequential at a macro level (ideas coming one after another), it is actually composed of vast numbers of parallel microsteps

4. Connection Strengths (Weights)

The "knowledge" in a PDP model is not a stored list of facts but the strengths of the connections between units

5. Local Representation

A system where one specific unit is reserved for one specific concept or pattern (e.g., one unit for "Lance" in the Jets and Sharks model)

6. Distributed Representation

Knowledge about a pattern is spread across the connections of many units. No single unit stands for a whole concept; instead, the concept is the pattern of activation across the network

7. Active Representation

The current pattern of activation across units

8. Stored Representation

The connection strengths that allow that pattern to be re-created later

9. Learning as Connection Tuning

Learning is not about formulating explicit rules; it is the gradual adjustment of connection strengths based on experience so the right patterns of activation emerge under the right conditions

10. Hebb Rule

A simple learning principle: adjust the strength of a connection between two units in proportion to their simultaneous activation. This allows the system to teach itself without a central "programmer"
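One Hebbian update can be sketched in a few lines of NumPy. This is only an illustration: the learning rate `eta`, the layer sizes, and the activation values are made-up numbers, not anything from the original models.

```python
import numpy as np

# Minimal Hebbian update sketch (all values illustrative).
eta = 0.25                          # learning rate (assumed)
a_in = np.array([1.0, -1.0, 1.0])   # activations of three input units
a_out = np.array([1.0, 1.0])        # activations of two output units

W = np.zeros((2, 3))                # connection strengths, initially zero
# Hebb rule: each weight grows in proportion to the simultaneous
# activation of the two units it connects.
W += eta * np.outer(a_out, a_in)
```

Note that no target or error signal appears anywhere: the weights change purely from co-activation, which is why no central "programmer" is needed.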

11. Content Addressability

The ability to retrieve a full memory from just a fragment of its attributes (e.g., remembering a person's name just by thinking of their hair color and where you met them)
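A minimal sketch of content-addressable retrieval, assuming a Hebbian autoassociator over made-up +1/-1 attribute vectors (unknown attributes are coded as 0):

```python
import numpy as np

# A full "memory" as a vector of +1/-1 attributes (illustrative values).
stored = np.array([1, -1, 1, 1, -1], dtype=float)

# Store it via Hebbian outer-product weights; knowledge lives in W,
# not in any explicit record of the pattern.
W = np.outer(stored, stored)
np.fill_diagonal(W, 0)  # no self-connections

# Probe with a fragment: only the first two attributes are known.
cue = np.array([1, -1, 0, 0, 0], dtype=float)

# One settling step: each unit takes the sign of its summed input,
# filling in the attributes the cue left blank.
recalled = np.sign(W @ cue)
```

Here the fragment alone is enough to regenerate the whole stored pattern, which is the sense in which the memory is addressed by its content.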

12. Graceful Degradation

Because information is distributed, the system does not crash if it receives slightly wrong information or if a few units are damaged; it simply produces a slightly weaker or "best guess" output
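A toy demonstration of graceful degradation, using the same kind of Hebbian autoassociator; which connections get lesioned and which cue attribute gets flipped are arbitrary choices for illustration:

```python
import numpy as np

stored = np.array([1, -1, 1, -1, 1, 1, -1, -1], dtype=float)

# Store the pattern in distributed fashion across all pairwise weights.
W = np.outer(stored, stored)
np.fill_diagonal(W, 0)

# "Lesion" a few connections...
W[0, 1] = W[1, 0] = W[2, 3] = W[3, 2] = 0.0
# ...and probe with slightly wrong information (one attribute flipped).
cue = stored.copy()
cue[7] *= -1

# The damaged system still produces its best guess rather than crashing;
# the many surviving connections outvote the damage and the bad attribute.
recalled = np.sign(W @ cue)
```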

13. Emergent Properties List

  • Spontaneous Generalization

  • Default Assignment

  • Rule-like Behavior

14. Spontaneous Generalization

If a system is probed with a general cue, it will naturally retrieve what is common to the memories matching that cue, effectively creating a "general concept" on the fly
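This can be sketched with two stored memories that share some attributes; probing with only the shared part retrieves what is common while the conflicting attributes cancel. The vectors are invented +1/-1 attribute codes:

```python
import numpy as np

# Two memories sharing their first four attributes (illustrative values).
p1 = np.array([1, 1, -1, -1, 1, -1], dtype=float)
p2 = np.array([1, 1, -1, -1, -1, 1], dtype=float)

# Superimpose both memories in one set of Hebbian weights.
W = np.outer(p1, p1) + np.outer(p2, p2)
np.fill_diagonal(W, 0)

# A general cue: just the shared attributes, the rest unknown (0).
cue = np.array([1, 1, -1, -1, 0, 0], dtype=float)
recalled = np.sign(W @ cue)
# Shared attributes come back; where the memories disagree, their
# contributions cancel to 0 -- a "general concept" created on the fly.
```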


15. Default Assignment

If information about a specific object is missing, the system fills in the blanks based on what it knows about similar instances (e.g., assuming a bird can fly because most other birds in its memory can)


16. Rule-like Behavior

PDP models can behave as if they follow linguistic or logical rules without those rules being explicitly written in the system

Ex. A model learning past tenses through examples can "overregularize" (saying "camed" instead of "came") just as children do, because it has extracted the statistical regularity of the "-ed" ending


17. Interactive Activation Model

Used for word recognition. It shows how higher-level knowledge (words) and lower-level features (letters/lines) constrain each other simultaneously
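A heavily simplified sketch of the idea, not the published model: the two-word lexicon, the single pool of position-free letter units, the weights, and the update schedule below are all invented for illustration. Letters that fit a word excite it, letters that do not fit inhibit it, and word units compete with one another:

```python
import numpy as np

# Tiny invented lexicon and letter pool (illustrative, not the real model).
words = ["cat", "can"]
letters = "abcnt"

# Letter-to-word links: excitatory if the letter appears in the word,
# inhibitory otherwise (ignoring letter position for simplicity).
L2W = np.array([[1.0 if ch in w else -1.0 for ch in letters] for w in words])

# Bottom-up evidence for the letters c, a, t.
letter_act = np.array([1.0, 0.0, 1.0, 0.0, 1.0])
word_act = np.zeros(len(words))

for _ in range(10):
    # Word units gather support from the active letter units...
    word_act += 0.1 * (L2W @ letter_act)
    # ...and compete: each word is inhibited by the others' activation.
    word_act -= 0.05 * (word_act.sum() - word_act)
    word_act = np.clip(word_act, 0.0, 1.0)

best = words[int(word_act.argmax())]
```

Even in this toy version, the constraints act simultaneously rather than in stages: every letter influences every word on every step, and the best-fitting word wins the competition.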

18. Jets and Sharks Model

A simulation of gang members used to demonstrate content addressability and default assignment

19. Pattern Associator

A simple network that learns to associate one pattern (like the sight of a rose) with another (the smell of a rose) using the Hebb Rule
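A minimal sketch of a pattern associator trained with the Hebb rule. The "sight" and "smell" vectors are arbitrary +1/-1 patterns standing in for the rose example, and `eta` is an illustrative learning rate:

```python
import numpy as np

sight = np.array([1, -1, -1, 1], dtype=float)   # e.g., the sight of a rose
smell = np.array([-1, 1, 1, -1], dtype=float)   # e.g., its smell

eta = 0.25
# One Hebbian pass: each weight grows with the co-activation of the
# input (sight) unit and the output (smell) unit it connects.
W = eta * np.outer(smell, sight)

# Presenting the sight pattern now evokes the associated smell pattern.
recalled = np.sign(W @ sight)
```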

20. Typing Model

Demonstrates that serial behavior (typing one letter after another) emerges from a parallel system where units for upcoming letters anticipate and inhibit each other
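A toy version of this mechanism can be sketched as follows; the word, activation values, and inhibition strength are invented for illustration. All letter units are active in parallel, each inhibits the letters that follow it, and typing a letter shuts its unit off and releases its inhibition:

```python
import numpy as np

word = "time"
act = np.ones(len(word))          # all upcoming letters active in parallel
# Each letter inhibits every later letter, grading activation by position.
for i in range(len(word)):
    act[i + 1:] -= 0.2

typed = []
while len(typed) < len(word):
    nxt = int(act.argmax())       # the most active unit fires next...
    typed.append(word[nxt])
    act[nxt] = -np.inf            # ...then is deactivated once typed,
    act[nxt + 1:] += 0.2          # lifting its inhibition of later letters
```

The letters come out one at a time and in order, yet nothing in the system is inherently serial: the ordering emerges from the pattern of anticipatory activation and mutual inhibition.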