Lecture Notes Review - Neural Networks

Brain-Style Computation

  • Brains function as parallel, distributed, analog computers rather than as symbolic logic systems.
  • This concept leads to neural networks (NNs) designed to mimic brain operation to compute tasks.

Simple Neural Networks

  • Neural networks fall into two categories:
    • Biological Neural Networks (BNN)
    • Artificial Neural Networks (ANN)

McCulloch-Pitts (M-P) Neurons/Networks

  • Introduced in 1943, modeled artificial neurons to compute Boolean functions.

    • Uses discrete binary values (0 or 1).
    • Calculates output (Y) from weighted inputs (W1, W2) and a threshold (Q):
      Y = f(X1 * W1 + X2 * W2 - Q)
      where f is the step function: f(s) = 1 if s >= 0, else 0.
    • Example: the AND function, which outputs 1 only when both inputs are 1.
  • Example Calculations of a Boolean AND Computation:

    • For inputs (X1, X2):
    • (0, 0): Y = f(0 * 0.3 + 0 * 0.5 - 0.6) = f(-0.6) = 0
    • (0, 1): Y = f(0 * 0.3 + 1 * 0.5 - 0.6) = f(-0.1) = 0
    • (1, 0): Y = f(1 * 0.3 + 0 * 0.5 - 0.6) = f(-0.3) = 0
    • (1, 1): Y = f(1 * 0.3 + 1 * 0.5 - 0.6) = f(0.2) = 1
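The four calculations above can be sketched in a few lines of Python. The weights (0.3, 0.5) and threshold (0.6) are the ones used in the example; f is taken to be the step function that fires when the net input is non-negative.

```python
def mp_neuron(x1, x2, w1=0.3, w2=0.5, theta=0.6):
    """McCulloch-Pitts neuron: fires (1) when the weighted sum
    of inputs minus the threshold is non-negative."""
    net = x1 * w1 + x2 * w2 - theta
    return 1 if net >= 0 else 0

# Reproduces the AND truth table from the notes:
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", mp_neuron(x1, x2))
# (0, 0) -> 0
# (0, 1) -> 0
# (1, 0) -> 0
# (1, 1) -> 1
```

Only the (1, 1) case produces a positive net input (0.2), so only it fires, matching Boolean AND.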

Learning in M-P Networks

  • Knowledge is represented in connection weights & thresholds, allowing learning through weight modification.
  • Example: a hiring rule that labels a candidate desirable if they have good grades or relevant experience (a Boolean OR), which the network can learn by adjusting its weights and threshold.
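A minimal sketch of learning by weight modification, using a perceptron-style update on the hiring rule (hire if good grades OR experience). The learning rate, number of epochs, and zero initialization are illustrative assumptions, not from the notes.

```python
def train_or(epochs=10, lr=0.1):
    """Learn OR ("hire if grades OR experience") by adjusting
    weights and threshold from prediction errors."""
    w1, w2, theta = 0.0, 0.0, 0.0  # assumed initial values
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    for _ in range(epochs):
        for (x1, x2), target in data:
            y = 1 if x1 * w1 + x2 * w2 - theta >= 0 else 0
            err = target - y          # error-driven update
            w1 += lr * err * x1
            w2 += lr * err * x2
            theta -= lr * err         # threshold moves opposite to weights
    return w1, w2, theta

w1, w2, theta = train_or()
```

After training, the learned weights and threshold reproduce the OR truth table, illustrating that the "knowledge" of the rule lives entirely in the connection weights and threshold.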

Brief History of Neural Networks

  • 1943: First neural network model by McCulloch & Pitts.
  • 1949: Hebbian Learning Rule proposed by D. Hebb, foundational for unsupervised learning.
  • 1957: Rosenblatt's Perceptron introduced a simple error-driven learning mechanism (the perceptron/delta learning rule).
  • 1986: Backpropagation learning rule popularized by Rumelhart, Hinton & Williams (in Rumelhart & McClelland's PDP volumes), enabling multi-layer networks to learn.

Deep Neural Networks (DNN)

  • Defined as networks with many hidden layers (typically 5 to 100).
  • Successful in applications like speech recognition and language translation.
  • Enabled significant advancements in AI, particularly with the work of Hinton and others in modern neural network applications.

Backpropagation and Biological Plausibility

  • Backpropagation is effective for simulating human cognitive behaviors (e.g., memory, vision) but is not considered biologically plausible.
  • Research is ongoing to find biologically informed learning algorithms.
  • Example: Average Gradient Outer Product (AGOP), a proposed backpropagation-free learning mechanism.