Connectionism
Computational Theory of Mind
Information processing involves symbol manipulation via rules.
Distinction made between:
Structure (symbols)
Process (rules, algorithms)
Types of processing:
Sentential
Propositional
Sequential
Language of thought hypothesis (mental representations have a sentence-like structure).
Connectionist Theory of Mind
Alternative view of information processing, also known as:
Parallel Distributed Processing (PDP)
Neural Networks
Historical contributions:
McCulloch & Pitts (1940s)
Rosenblatt (late 1950s) - perceptron (a two-layer network of input and output units)
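The perceptron's learning rule can be sketched in a few lines; the variable names and the AND-function example below are illustrative assumptions, not Rosenblatt's original notation.

```python
# Illustrative sketch of a perceptron learning rule (names hypothetical),
# trained on the logical AND function.

def step(x):
    return 1 if x > 0 else 0  # threshold activation

# Input patterns paired with known responses (AND).
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # connection weights
b = 0.0         # bias (threshold term)
lr = 0.1        # learning rate

for _ in range(20):  # a few passes over the training patterns
    for (x1, x2), target in data:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out     # desired minus actual output
        w[0] += lr * err * x1  # strengthen or weaken each connection
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # → [0, 0, 0, 1]
```

Because AND is linearly separable, this rule converges; two-layer perceptrons famously cannot learn non-separable functions such as XOR.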
Applications of Connectionist Networks
Commonly used to classify patterns by partitioning a pattern space into decision regions.
Examples from Churchland’s "The Engine of Reason, the Seat of the Soul" include:
Color Space: Colors classified by the activations of opponent-process cells.
Taste Space: Tastes categorized by activation levels across different receptor channels for various substances.
Face Space: Faces classified in three dimensions by parameters such as nose width, eye separation, and mouth fullness.
Connectionist Networks Overview
Definition: Networks of simple units (processors) operating simultaneously (in parallel) and inspired by biological processes.
Processing Stages
Input Phase: Input information analogous to perception.
Computation Phase: Transformation of input analogous to thinking.
Output Phase: Generates output analogous to action.
Structure of Connectionist Networks
Each processing unit behaves as a simplified neuron:
Computes the total incoming signal from units in the previous layer.
Adopts a level of internal activation based on that total.
Generates an outgoing signal, modulated by connection weights, to the next layer.
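The behavior of a single unit described above can be sketched as follows; the function name and the choice of a sigmoid squashing function are illustrative assumptions.

```python
import math

# Sketch of one simplified neuron (names and the sigmoid are assumptions).
def unit_activation(inputs, weights, bias=0.0):
    # Total incoming signal: each signal modulated by its connection weight
    # (negative weights inhibit, positive weights excite).
    total = sum(s * w for s, w in zip(inputs, weights)) + bias
    # Internal activation level, squashed into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-total))

# Two excitatory connections and one inhibitory connection:
print(unit_activation([1.0, 0.5, 1.0], [0.8, 0.3, -0.6]))  # ≈ 0.587
```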
Connection Weights
Function as communication channels between processing units (similar to synapses).
Can be:
Inhibitory (negative weights)
Excitatory (positive weights)
Connection strength is given by the weight's absolute value (weights typically range from -1 to +1).
Training Connectionist Networks
Initial connection weights assigned randomly.
Network trained by presenting patterns with known correct responses.
Error term calculated as the difference between desired and actual outputs.
Used to adjust connection weights via backpropagation.
Trained until error cannot be further reduced.
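The training procedure above can be sketched end to end on a tiny two-hidden-unit network learning XOR; the architecture, random seed, and learning rate are illustrative assumptions, and the fixed epoch count stands in for "train until the error stops shrinking."

```python
import math
import random

random.seed(0)  # reproducible random initial weights

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# Initial connection weights assigned randomly (last entry of each row is a bias).
wh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden units
wo = [random.uniform(-1, 1) for _ in range(3)]                      # output unit

def forward(x1, x2):
    h = [sig(w[0] * x1 + w[1] * x2 + w[2]) for w in wh]
    out = sig(wo[0] * h[0] + wo[1] * h[1] + wo[2])
    return h, out

# Patterns with known responses (XOR).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def total_error():
    return sum((t - forward(x1, x2)[1]) ** 2 for (x1, x2), t in data)

err_before = total_error()
lr = 0.5
for _ in range(10000):  # stopping criterion simplified to a fixed epoch count
    for (x1, x2), target in data:
        h, out = forward(x1, x2)
        err = target - out                           # desired minus actual
        do = err * out * (1 - out)                   # output-unit delta
        dh = [do * wo[i] * h[i] * (1 - h[i]) for i in range(2)]
        wo[0] += lr * do * h[0]                      # adjust output weights
        wo[1] += lr * do * h[1]
        wo[2] += lr * do
        for i in range(2):                           # adjust hidden weights
            wh[i][0] += lr * dh[i] * x1
            wh[i][1] += lr * dh[i] * x2
            wh[i][2] += lr * dh[i]

print(err_before, "->", total_error())  # error shrinks as weights are adjusted
```

A network this small can occasionally settle in a local minimum, which is why real training monitors the error term rather than assuming convergence.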