Emergence
The arising of novel and complex properties or behaviors in a system from the interactions of its simpler components.
Connectionism
A computational approach to cognition that models mental processes as the emergent activity of interconnected networks of simple units.
Parallel Processing
The ability of a system to perform multiple computations or processes simultaneously.
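As an illustration (not part of the original card set), the sketch below contrasts updating network units one at a time with updating them all at once via a matrix product; the layer sizes and random weights are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 3))   # 4 output units, 3 input units (arbitrary sizes)
inputs = rng.normal(size=3)

# Sequential view: compute each unit's activation one after another.
sequential = np.array([weights[i] @ inputs for i in range(4)])

# Parallel view: a single matrix-vector product computes all units "at once".
parallel = weights @ inputs

assert np.allclose(sequential, parallel)
```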
Distributed Representation
A way of representing information where a concept is represented by a pattern of activation across multiple units in a network.
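A minimal sketch of the idea, with made-up feature units and activation values: each concept is a pattern over the same pool of units, and related concepts share parts of their patterns.

```python
import numpy as np

# Hypothetical feature units shared by all concepts.
units = ["has_fur", "barks", "meows", "has_wheels", "is_alive"]

# Each concept is a pattern of activation across the whole pool of units,
# not a single dedicated unit per concept.
dog = np.array([0.9, 0.9, 0.0, 0.0, 1.0])
cat = np.array([0.9, 0.0, 0.9, 0.0, 1.0])
car = np.array([0.0, 0.0, 0.0, 1.0, 0.0])

def similarity(a, b):
    """Cosine similarity between two activation patterns."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(similarity(dog, cat))  # high: overlapping patterns
print(similarity(dog, car))  # low: little overlap
```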
Neural Networks
Computational models inspired by biological neural networks consisting of interconnected nodes that process signals.
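For concreteness, a minimal two-layer feedforward network in plain NumPy; the sizes, random weights, and sigmoid activation are illustrative choices, not a specific model from the source.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(5, 3)), np.zeros(5)   # input (3) -> hidden (5)
W2, b2 = rng.normal(size=(2, 5)), np.zeros(2)   # hidden (5) -> output (2)

def forward(x):
    """Propagate an input signal through interconnected layers of nodes."""
    hidden = sigmoid(W1 @ x + b1)
    output = sigmoid(W2 @ hidden + b2)
    return output

print(forward(np.array([0.2, -0.4, 0.7])))
```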
Hebbian Learning
A learning rule stating that the connection strength between two neurons increases if they are active simultaneously.
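A minimal sketch of the rule ("cells that fire together wire together"): the weight change is proportional to the product of pre- and post-synaptic activity. The learning rate and activation values here are made up.

```python
import numpy as np

learning_rate = 0.1
pre = np.array([1.0, 0.0, 1.0])      # presynaptic activations
post = np.array([1.0, 1.0])          # postsynaptic activations
weights = np.zeros((2, 3))           # one weight per (post, pre) pair

# Hebbian update: delta_w[i, j] = eta * post[i] * pre[j]
weights += learning_rate * np.outer(post, pre)

print(weights)  # connections between co-active units have strengthened
```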
Backpropagation
An algorithm for training artificial neural networks by adjusting their weights and biases. It computes the error at the output layer and propagates it backwards, layer by layer, to minimize the loss.
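A compact sketch of backpropagation for a one-hidden-layer network with a squared-error loss; the architecture, data, and hyperparameters are invented for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(2)
x, target = np.array([0.5, -1.0]), np.array([1.0])
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
eta = 0.5

for step in range(100):
    # Forward pass.
    h = sigmoid(W1 @ x)          # hidden activations
    y = sigmoid(W2 @ h)          # output
    # Backward pass: error at the output, propagated layer by layer.
    delta_out = (y - target) * y * (1 - y)          # gradient at output pre-activation
    delta_hid = (W2.T @ delta_out) * h * (1 - h)    # propagated to the hidden layer
    # Gradient-descent weight updates to reduce the squared error.
    W2 -= eta * np.outer(delta_out, h)
    W1 -= eta * np.outer(delta_hid, x)

print(float(y[0]))  # approaches the target of 1.0
```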
100-Step Constraint
The argument that because basic cognitive functions are completed in a limited time while individual neurons are comparatively slow, a mental process can involve only about 100 sequential processing steps; the bulk of the work must therefore be carried out by many neurons operating in parallel.
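The arithmetic behind the constraint, using the approximate figures usually cited (neuron firing on the order of a few milliseconds, basic cognitive acts on the order of a few hundred milliseconds); the exact numbers below are illustrative.

```python
# Rough timing argument (approximate, illustrative numbers).
neuron_step_ms = 5          # time for one neuron to fire and influence the next
cognitive_task_ms = 500     # time for a basic cognitive act such as recognizing a word

max_sequential_steps = cognitive_task_ms / neuron_step_ms
print(max_sequential_steps)  # ~100 sequential steps, so the work must be massively parallel
```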
Graceful Degradation
The ability of a system to maintain performance even when parts are damaged or removed.
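A small sketch of the idea under the assumption that knowledge is spread across many weights: silencing a fraction of a layer's connections changes its output only modestly rather than catastrophically. The layer size and damage level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
weights = rng.normal(size=(10, 20))          # a layer with distributed weights
x = rng.normal(size=20)

intact = weights @ x

# "Damage" the network by silencing 20% of its connections at random.
damaged = weights.copy()
mask = rng.random(weights.shape) < 0.2
damaged[mask] = 0.0

degraded = damaged @ x

# The output drifts, but its overall pattern is typically largely preserved.
corr = np.corrcoef(intact, degraded)[0, 1]
print(f"correlation between intact and damaged outputs: {corr:.2f}")
```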
One-off Learning
The capacity to learn a new concept after a single exposure, posing challenges for many connectionist models.