PSYCH 1 - Conditioning and Learning

Rationalism vs. empiricism: rationalists hold that knowledge comes from innate ideas and reason; empiricists hold that it comes from sensory experience

  • British empiricism: the tradition (e.g., John Locke) holding that all knowledge derives from experience


Associationism: Aristotle and John Locke’s Laws of Association


Habituation: a decline in responding to a stimulus after repeated presentations; the simplest form of learning


Tabula rasa doctrine w/ rationale: John Locke’s theory that people are born as a “blank slate”; all knowledge is written onto it by experience


Laws of Association: we associate things by frequency (repetition), contiguity (co-occurrence in time or space), similarity, and contrast


I. Pavlov’s conditioning methods:


Esophageal and gastric fistulae: an esophageal fistula is a surgical opening in the dog’s neck, so that swallowed food never reaches the stomach; a gastric fistula is an opening into the dog’s stomach, used to collect gastric secretions


CS, CR, US (UCS), UR (UCR): conditional reflexes (CS → CR) appear only after specific experiences; unconditional reflexes (US → UR) are wired in and happen automatically


Formation of conditioned reflex: start with an unconditioned reflex (US → UR), then have the CS predict the US over many pairings (acquisition); the CS alone comes to elicit the CR


Extinction in Pavlovian (classical) conditioning: after the CS has predicted the US many times, present the CS alone (omit the US) and check for the CR; over repeated CS-alone trials the CR weakens and disappears


Classical Conditioning and:

- Post-Traumatic Stress Disorder: CS (weapon sound) paired with US (blast) → UR (fear); later the sound alone elicits fear (CR)

- Bulimia: CS (full stomach after binge) paired with US (finger down throat) → UR (nausea, purging)

- “Pseudoinsomnia”: CS (lying in bed) paired with US (pressure to “get to sleep!!!”) → UR (anxiety); the bed alone comes to elicit anxiety and sleeplessness (CR)


Aversion therapy: fear/aversion conditioning applied clinically; cues of an unwanted behavior are paired with an aversive US so that they come to elicit an aversive CR


Systematic desensitization: phobia treatment in which the feared CS is presented gradually (least to most frightening) while the patient remains relaxed, extinguishing the fear CR


E. L. Thorndike’s puzzle-box experiment w/ rationale: placed cats in a puzzle box and timed their escapes. Escape times fell gradually and irregularly across trials, suggesting trial-and-error “stamping in” of successful responses rather than sudden insight into the right way to escape.


Thorndike’s contributions to education: Corollary: Law of Exercise – repetition strengthens a stimulus–response connection; this became the basis for drill and rote methods. He also invented the spelling bee and developed lined paper for learning handwriting.


Law of Effect: An organism whose actions lead to a satisfying state of affairs is likely to repeat those actions. An organism whose actions lead to an “annoying” state of affairs is unlikely to repeat those actions.


B. F. Skinner: animals act to change their environments; their behavior is “instrumental.” He likened the emergence and extinction of operant responses to the emergence and extinction of species: both are “adaptations” produced by a process of selection.


Operant conditioning: learning in which behavior is modified by its consequences; the animal “operates” on its environment, and the learning is adaptive


“A-B-C’s” of operant conditioning: Antecedent Stimulus, Behavior, Consequence


Reinforcement vs. punishment: defined by the effect of the consequence (C) on the behavior (B), not by whether the animal “likes” C. If C makes B more frequent, C is reinforcement; if C makes B less frequent, C is punishment.


Positive vs. negative rft / punishment: positive if a stimulus is added (presented) following the behavior, negative if a stimulus is removed. Crossing this with reinforcement vs. punishment gives four cases, e.g., negative reinforcement = removing an aversive stimulus, which increases the behavior.
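The 2×2 classification above can be written out as a tiny decision function; a minimal sketch (function name, parameters, and strings are illustrative, not from the lecture):

```python
def classify_consequence(stimulus, behavior):
    """Classify an operant consequence (sketch of the 2x2 scheme).

    stimulus: "added" or "removed"    -> positive vs. negative
    behavior: "increases" or "decreases" -> reinforcement vs. punishment
    """
    sign = "positive" if stimulus == "added" else "negative"
    kind = "reinforcement" if behavior == "increases" else "punishment"
    return f"{sign} {kind}"

# A shock that stops when the lever is pressed, making pressing more likely:
print(classify_consequence("removed", "increases"))  # negative reinforcement
```

The point the function encodes: “positive/negative” refers only to whether a stimulus is added or removed, never to whether the outcome is pleasant.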


Shaping: delivers reinforcing consequences to successive approximations of the desired behavior


Schedules of reinforcement: the pattern of reinforcing consequences following a behavior

– Continuous reinforcement

– Intermittent reinforcement: fixed- & variable-ratio schedules (reinforcer depends on the number of responses); fixed- & variable-interval schedules (reinforcer depends on the time elapsed since the last reinforcer)
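The four intermittent schedules differ only in what triggers the next reinforcer. A minimal sketch, assuming a simplified model in which each function answers whether the current response earns a reinforcer (names and parameters are illustrative):

```python
import random

def fixed_ratio(responses_since_last, n=5):
    # FR-n: reinforce every n-th response.
    return responses_since_last >= n

def variable_ratio(responses_since_last, mean_n=5):
    # VR-n: reinforce after a variable number of responses averaging n
    # (modeled here as a 1/n chance per response).
    return random.random() < 1.0 / mean_n

def fixed_interval(seconds_since_last, t=30):
    # FI-t: reinforce the first response after t seconds have elapsed.
    return seconds_since_last >= t

def variable_interval(seconds_since_last, mean_t=30):
    # VI-t: like FI, but the required interval varies around a mean of t
    # (modeled here by resampling the threshold on each call).
    return seconds_since_last >= random.uniform(0, 2 * mean_t)
```

The variable schedules are the ones that produce steady, extinction-resistant responding, because the animal cannot predict which response will pay off.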


Operant extinction: withhold the reinforcing consequence; the behavior gradually declines and disappears


Special cases of learning:

- conditioned taste aversions

- birdsong learning

- social learning (imitation, vicarious reinforcement and punishment)


Skinner’s views on thinking and free will: Thinking as verbal behavior, Free will as fiction


Applications of Skinner’s findings: applied behavior analysis (used for behavior modification), assertiveness training, biofeedback, programmed instruction