Module 8: Mind as Network

Emergence: The arising of novel and complex properties or behaviors in a system from the interactions of its simpler components. These emergent properties are not predictable from the properties of the individual components alone.

Connectionism: A computational approach to cognition that models mental processes as the emergent activity of interconnected networks of simple units (nodes), often inspired by the structure of the brain.

Parallel Processing: The ability of a system to perform multiple computations or processes simultaneously, as opposed to sequentially. This is a key feature of many connectionist models and is thought to be characteristic of brain function.
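
A minimal sketch of the contrast in Python (the network size, weights, and inputs below are invented for illustration): the loop updates one unit at a time, while the single matrix-vector product computes every unit's new activation in one parallel step, which is how connectionist simulations typically express parallel updates.

```python
import numpy as np

# Hypothetical toy network: 4 units, each receiving input from 3 sources.
weights = np.array([[ 0.2, -0.5,  0.1],
                    [ 0.4,  0.3, -0.2],
                    [-0.1,  0.6,  0.5],
                    [ 0.3, -0.4,  0.2]])
inputs = np.array([1.0, 0.5, -1.0])

# Sequential version: update one unit at a time.
activations_seq = np.empty(4)
for i in range(4):
    activations_seq[i] = weights[i] @ inputs

# Parallel version: one matrix-vector product updates all units "at once".
activations_par = weights @ inputs

assert np.allclose(activations_seq, activations_par)
```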

Distributed Representation: A way of representing information where a concept or item is represented not by a single dedicated unit but by a pattern of activation across many units in a network. Because similar items share overlapping patterns, distributed representations support generalization and remain usable when individual units are lost.
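
A small sketch of the local-versus-distributed contrast (the concepts, unit counts, and patterns are invented for illustration): with one-hot "local" codes every pair of concepts is equally unrelated, while overlapping distributed codes can capture that cats are more like dogs than like cars.

```python
import numpy as np

# Local (one-hot) codes: one dedicated unit per concept.
local = {
    "cat": np.array([1, 0, 0, 0, 0, 0]),
    "dog": np.array([0, 1, 0, 0, 0, 0]),
    "car": np.array([0, 0, 1, 0, 0, 0]),
}

# Distributed codes: each concept is a pattern over many shared units.
distributed = {
    "cat": np.array([1, 1, 0, 1, 0, 0]),   # shares units with "dog"
    "dog": np.array([1, 1, 1, 0, 0, 0]),
    "car": np.array([0, 0, 1, 0, 1, 1]),
}

# Pattern overlap (dot product) as a crude similarity measure.
print(local["cat"] @ local["dog"], local["cat"] @ local["car"])  # 0 0
print(distributed["cat"] @ distributed["dog"],
      distributed["cat"] @ distributed["car"])                   # 2 0
```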

Neural Networks: Computational models inspired by the structure and function of biological neural networks. They consist of interconnected nodes (neurons) that process and transmit signals based on the strength of their connections (weights).  
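
A minimal forward pass through such a network, sketched in Python (the layer sizes, weights, and input values are arbitrary choices for illustration): each unit sums its weighted inputs and passes the sum through a squashing function.

```python
import numpy as np

def sigmoid(z):
    """Squash a unit's summed input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical wiring: 3 input units -> 4 hidden units -> 2 output units.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 3))   # input-to-hidden weights
W2 = rng.normal(scale=0.5, size=(2, 4))   # hidden-to-output weights

x = np.array([0.9, -0.3, 0.5])            # activations of the input units
hidden = sigmoid(W1 @ x)                  # each hidden unit sums weighted inputs
output = sigmoid(W2 @ hidden)             # output units do the same
print(output)
```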

Hebbian Learning: A learning rule often summarized as "neurons that fire together, wire together." It states that the connection strength between two neurons increases if they are active at the same time. This is a fundamental mechanism for learning associations in neural networks.
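
A compact sketch of the rule (the learning rate and activity patterns are invented for illustration): each weight grows in proportion to the co-activity of the units it connects, so only connections between units that "fire together" are strengthened.

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1):
    """Strengthen each connection in proportion to the co-activity of
    its presynaptic and postsynaptic units: dw_ij = lr * post_i * pre_j."""
    return w + lr * np.outer(post, pre)

pre  = np.array([1.0, 0.0, 1.0])   # presynaptic activity pattern
post = np.array([0.0, 1.0])        # postsynaptic activity pattern

w = np.zeros((2, 3))
for _ in range(5):                 # repeated pairing of the two patterns
    w = hebbian_update(w, pre, post)

print(w)   # weights grow only where both units were active together
```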

Backpropagation: A supervised learning algorithm commonly used to train artificial neural networks. It calculates the gradient of the error function with respect to the network's weights and uses this gradient to adjust the weights in order to minimize the error.
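
A compact sketch of the algorithm on the XOR task, a classic problem that needs a hidden layer (the network size, learning rate, and epoch count below are arbitrary choices, not values prescribed by this module): the error gradient is computed layer by layer via the chain rule, then every weight is nudged downhill.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

lr = 1.0
for _ in range(10000):
    # Forward pass: compute activations layer by layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error gradient back through each
    # layer (chain rule applied to squared error and the sigmoids).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Adjust every weight a little, downhill along its gradient.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```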

100-Step Constraint: The observation that fast cognitive processes, such as recognizing a word or a face, complete within a few hundred milliseconds, while individual neurons take several milliseconds per firing, so the underlying computation can involve at most roughly 100 sequential processing steps. This constraint poses a serious challenge for purely sequential models of cognition and motivates massively parallel processing approaches.
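
The back-of-envelope arithmetic behind the constraint (both timescales are rough textbook ballpark figures, not values given in this module):

```python
# Rough arithmetic behind the 100-step constraint; the timescales
# below are ballpark assumptions, not measurements from this module.
task_duration_ms = 500   # a fast cognitive act, e.g. recognizing a word
neural_step_ms = 5       # roughly one neuron-to-neuron signaling step
print(task_duration_ms / neural_step_ms)   # -> 100.0 sequential steps at most
```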

Graceful Degradation: The ability of a system, particularly a neural network, to maintain a reasonable level of performance even when parts of it are damaged or removed. This robustness arises from the distributed nature of representations and processing.
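
A small sketch of why distributed codes degrade gracefully (the concepts, pattern length, and amount of damage are invented for illustration): after silencing a quarter of the units in a stored pattern, nearest-pattern recall usually still retrieves the right concept, because no single unit is critical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical memory: three concepts, each stored as a distributed
# pattern of activity over 20 units.
patterns = {name: rng.choice([0.0, 1.0], size=20)
            for name in ["cat", "dog", "car"]}

def recall(probe):
    """Return the stored concept whose pattern best matches the probe."""
    return max(patterns, key=lambda name: probe @ patterns[name])

# "Damage" the cat pattern by silencing 5 of its 20 units.
damaged = patterns["cat"].copy()
damaged[rng.choice(20, size=5, replace=False)] = 0.0

print(recall(damaged))   # usually still "cat"
```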

One-shot Learning: The ability to learn a new concept or association from a single exposure. This is a challenge for many connectionist models, which typically need many repeated presentations and gradual weight adjustments before an association takes hold.