Neuron vs computational neuron
A biological neuron is a real cell that processes electrical/chemical signals. A computational neuron is a simplified math model inspired by neurons, designed to mimic signal integration and firing in a network for tasks like pattern recognition or learning.
Biological neurons require consolidation to maintain connections; computational neurons store weights digitally and don’t naturally “forget” unless programmed to.
Computational neuron
Structure:
Dendrites → receive inputs
Soma (cell body) → sums inputs
Axon → sends output
Computation:
Add up incoming signals:
Σ(inputs)
Compare sum to a threshold
Output is binary:
Fire (1)
No fire (0)
Fundamentals
Neurons sum their inputs
Inputs may be active or inactive
Inputs may be excitatory or inhibitory
Output depends on (see the sketch after this list):
Input values
Input weights
Threshold
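A minimal sketch of this sum-and-threshold rule; the function name, inputs, weights, and threshold value below are made up for illustration (excitatory inputs get positive weights, inhibitory inputs get negative ones):

```python
# Minimal sketch of a sum-and-threshold neuron. The example inputs, weights,
# and threshold are illustrative assumptions, not values from the notes.

def neuron_fires(inputs, weights, threshold):
    """Return 1 (fire) if the weighted sum of inputs reaches the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Inputs are active (1) or inactive (0); excitatory inputs get positive weights,
# inhibitory inputs get negative weights.
print(neuron_fires(inputs=[1, 1, 0], weights=[0.6, 0.5, -1.0], threshold=1.0))  # -> 1
```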
Computational Neuron: Perceptron!
What are they?
A specific type of computational neuron
Structure
Same as a computational neuron
Computation
Same as a computational neuron
Fundamentals
Make decisions based on inputs
Example: Should I go to a concert? (sketch after this list)
Inputs:
Friends are going (+)
I know the artist (+)
Ticket is expensive (–)
Weighted sum ≥ threshold → Go (1)
Weighted sum < threshold → Stay home (0)
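A sketch of the concert decision written as a perceptron; the specific weights and threshold are assumptions chosen only to make the example run:

```python
# Sketch of the concert decision as a perceptron. The weight and threshold
# values are made up purely to illustrate the weighted-sum rule.

def perceptron(inputs, weights, threshold):
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0  # 1 = go, 0 = stay home

friends_going = 1   # true -> contributes its (positive) weight
know_artist   = 1   # true -> contributes its (positive) weight
expensive     = 1   # true -> contributes its (negative) weight

decision = perceptron(
    inputs=[friends_going, know_artist, expensive],
    weights=[2.0, 1.0, -1.5],   # assumed: friends matter most, price counts against
    threshold=1.0,              # assumed cutoff
)
print("Go (1)" if decision else "Stay home (0)")  # 2.0 + 1.0 - 1.5 = 1.5 >= 1.0 -> Go (1)
```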
Synaptic integration and spatial summation
Neurons sum all incoming EPSPs and IPSPs (excitatory and inhibitory postsynaptic potentials)
If the summed potential reaches threshold → an action potential fires (sketch below)
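A toy sketch of spatial summation, assuming typical textbook values (resting potential near -70 mV, firing threshold near -55 mV) and made-up PSP sizes:

```python
# Toy spatial summation at the soma. The millivolt values are typical textbook
# numbers used here as assumptions, not measurements.

RESTING_POTENTIAL_MV = -70.0   # assumed resting membrane potential
THRESHOLD_MV = -55.0           # assumed firing threshold

def action_potential_fires(psps_mv):
    """Sum the incoming postsynaptic potentials (EPSPs > 0, IPSPs < 0)
    and check whether the membrane potential reaches threshold."""
    membrane_potential = RESTING_POTENTIAL_MV + sum(psps_mv)
    return membrane_potential >= THRESHOLD_MV

psps = [2.0] * 7 + [-3.0] * 2                    # seven EPSPs, two IPSPs (assumed sizes)
print(action_potential_fires(psps))              # -70 + 14 - 6 = -62 mV -> False
print(action_potential_fires(psps + [2.0] * 5))  # -62 + 10 = -52 mV -> True
```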
Botox
Many pharmacological drugs influence synaptic transmission
Botox (botulinum toxin) works by blocking the release of acetylcholine, a neurotransmitter required for muscle contraction
IAC models
What is the IAC model?
Interactive Activation and Competition (IAC) network
Structure
A type of connectionist model
Nodes represent features, concepts, or people
Nodes interact via excitatory and inhibitory connections
Computation
A person such as “Art” is represented as a pattern of activation, not by a single node
Nodes representing Art’s features (e.g., occupation, hobbies, traits) are:
Excitatorily connected to the “Art” node
Inhibitorily connected to competing person nodes
When the “Art” node is activated:
Activation spreads to connected feature nodes
Competing nodes are suppressed
Memory = stable activation pattern across the network, not storage in one place.
IAC networks store knowledge in connections between units.
Partial input can activate the full memory via excitation + inhibition dynamics (see the sketch below).
This is emergent — the network doesn’t have a “lookup table”; the remembered pattern arises from the interaction of nodes.
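A toy network in the spirit of IAC, not the exact McClelland and Rumelhart update equations; the node names extend the notes' “Art” example, “Rick” is a made-up rival person node, and the weights, rate, and decay are simplifying assumptions:

```python
import numpy as np

# Toy IAC-style network: excitatory links between "Art" and his feature nodes,
# an inhibitory link to a rival person node. All numbers are assumptions.

nodes = ["Art", "Rick", "occupation", "hobby", "trait"]
n = len(nodes)

W = np.zeros((n, n))
for feature in (2, 3, 4):            # Art <-> his features: excitatory (+1)
    W[0, feature] = W[feature, 0] = 1.0
W[0, 1] = W[1, 0] = -1.0             # Art <-> rival person node: inhibitory (-1)

def run(external_input, steps=50, rate=0.1, decay=0.1):
    """Spread activation: active nodes excite/inhibit their neighbours, and every
    node decays back toward a resting activation of 0. Activations stay in [-0.2, 1]."""
    a = np.zeros(n)
    for _ in range(steps):
        net = W @ np.clip(a, 0.0, None) + external_input   # only active nodes send output
        a = np.clip(a + rate * (net - decay * a), -0.2, 1.0)
    return a

# Partial cue: switch on a single feature node and let the dynamics fill in the rest.
cue = np.zeros(n)
cue[nodes.index("hobby")] = 1.0
for name, act in zip(nodes, run(cue)):
    print(f"{name:12s} {act:+.2f}")
# "Art" and his other features end up active together; the rival "Rick" is suppressed.
```

The remembered pattern is not stored anywhere as a list; it emerges from the excitatory and inhibitory connections as activation settles.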
Artificial Neural Network
A layered network/system of computational neurons that learns by changing connection weights
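A minimal sketch of such a network, assuming a tiny two-layer architecture trained by gradient descent on XOR; the layer sizes, learning rate, and epoch count are arbitrary choices for illustration:

```python
import numpy as np

# Minimal sketch of a layered network that learns by adjusting connection weights:
# two layers trained by gradient descent on XOR. All hyperparameters are assumptions.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)      # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))   # input -> hidden weights
W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))   # hidden -> output weights

lr = 1.0
for _ in range(5000):
    # Forward pass: each layer sums its weighted inputs and squashes the result.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: nudge every weight a little to reduce the squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2).ravel())   # should approach [0, 1, 1, 0]; exact values vary with the init
```

The only thing the training loop changes is the weights (and biases), which is exactly the sense in which the network “learns.”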