Information processing involves the manipulation of symbols according to rules.
A distinction is made between:
  - Structure (symbols)
  - Process (rules, algorithms)
Types of processing:
  - Sentential
  - Propositional
  - Sequential
This view underlies the "language of thought" concept.
An alternative view of information processing, also known as:
  - Parallel Distributed Processing (PDP)
  - Neural Networks
Historical contributions:
  - McCulloch & Pitts (1940s): formal model of the neuron as a logic unit.
  - Rosenblatt (late 1950s): the perceptron, a two-layer network commonly used to classify patterns by defining decision regions in pattern space.
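The perceptron's pattern-classifying behavior can be sketched in a few lines. This is a minimal illustration, not Rosenblatt's original formulation: the logical AND task, the learning rate, and the epoch count are illustrative choices, not taken from the notes. AND is linearly separable, so the perceptron learning rule finds a single line dividing pattern space into two decision regions.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=0.1):
    # Weights and bias start at zero; the perceptron learning rule
    # nudges them only when a pattern is misclassified.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
    return w, b

# Logical AND: linearly separable, so a two-layer perceptron can
# carve pattern space into two decision regions with one line.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]  # → [0, 0, 0, 1]
```

A pattern falls on one side of the learned line (output 1) or the other (output 0); the decision region is exactly the half-plane where the weighted sum exceeds the threshold.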
Examples from Churchland's "The Engine of Reason, the Seat of the Soul":
  - Color space: colors classified by the activation of opponent cells.
  - Taste space: tastes categorized by the activation levels different substances produce.
  - Face space: faces classified in three dimensions along parameters such as nose width, eye separation, and mouth fullness.
Definition: networks of simple units (processors) that operate simultaneously (in parallel) and are inspired by biological processes.
Phases of processing:
  - Input phase: input of information, analogous to perception.
  - Computation phase: transformation of the input, analogous to thinking.
  - Output phase: generation of output, analogous to action.
Each processing unit behaves as a simplified neuron; it:
  - computes the total incoming signal from units in the previous layer,
  - adopts a level of internal activation based on this signal, and
  - generates outgoing signals modulated by connection weights.
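The behavior of one such unit can be sketched as follows. The sigmoid squashing function and the particular weight values are illustrative assumptions, not specified in the notes; the sketch only shows the two steps named above: summing weighted inputs, then adopting an activation level.

```python
import math

def unit_activation(inputs, weights, bias=0.0):
    # 1. Sum the incoming signals, each scaled by its connection weight.
    #    Positive weights act as excitatory connections, negative as inhibitory.
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    # 2. Adopt an internal activation level: a sigmoid squashes the
    #    net input into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-net))

# Excitatory and inhibitory connections pull the unit in opposite directions:
a = unit_activation([1.0, 1.0], [0.8, -0.6])   # net = +0.2 → activation > 0.5
b = unit_activation([1.0, 1.0], [-0.8, 0.6])   # net = -0.2 → activation < 0.5
```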
Connection weights:
  - function as communication channels between processing units (similar to synapses);
  - can be inhibitory (negative weights) or excitatory (positive weights);
  - the strength of a connection is given by the weight's absolute value (range: -1 to +1).
Training:
  - Initial connection weights are assigned randomly.
  - The network is trained by presenting patterns paired with known responses.
  - An error term is calculated as the difference between the desired and the actual output, and is used to adjust the connection weights via backpropagation.
  - Training continues until the error cannot be reduced further.
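The training procedure above can be sketched as a minimal backpropagation loop. Everything task-specific here is an illustrative assumption, not from the notes: the XOR pattern set (the classic task a two-layer perceptron cannot learn), the 2-3-1 network size, the learning rate, and the epoch count. The loop follows the steps listed: random initial weights, presenting patterns with known responses, computing the error term, and adjusting weights by propagating that error backward.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the run is repeatable

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0.], [1.], [1.], [0.]])

# Initial connection weights assigned randomly, as in the notes.
W1 = rng.uniform(-1, 1, (2, 3)); b1 = np.zeros(3)
W2 = rng.uniform(-1, 1, (3, 1)); b2 = np.zeros(1)

lr, errors = 0.5, []
for _ in range(5000):
    # Forward pass: input phase → computation phase → output phase.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # Error term: difference between desired and actual output.
    E = T - Y
    errors.append(float(np.mean(E ** 2)))
    # Backpropagation: push the error back through the connection weights.
    dY = E * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += lr * H.T @ dY; b2 += lr * dY.sum(0)
    W1 += lr * X.T @ dH; b1 += lr * dH.sum(0)
```

After training, the mean squared error has dropped well below its initial value; in practice one would stop once successive epochs no longer reduce it, which is the "trained until error cannot be further reduced" criterion in the notes.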