Chapter 5

B.F. Skinner: Major Theoretical Concepts

A. Radical Behaviorism

  • Rejects mentalistic events as explanations of behavior.

  • Focused on observable behaviors and their consequences.

B. Respondent vs Operant Behavior

  1. Respondent:

    • Elicited by known stimuli.

  2. Operant:

    • Emitted by the organism with no known stimulus.

C. Type S and Type R Conditioning

  1. Type S:

    • Classical conditioning of respondent behavior; reinforcement is associated with the stimulus (hence "S").

  2. Type R:

    • Operant conditioning; reinforcement is contingent on the response (hence "R").

D. Operant Conditioning Principles

  1. Responses followed by a reinforcer tend to be repeated.

  2. Reinforcing Stimulus:

    • Any event that, when it follows a response, increases the operant rate.

E. Skinner Box

  • Used to study operant conditioning.

F. Cumulative Recording

  1. A record of accumulated responses plotted as a function of time; the slope of the line reflects the response rate.



G. Conditioning the Lever-Pressing Response

  1. Deprivation:

    • Defined operationally (e.g., hours without food); no inferred state such as "drive" is assumed.

  2. Magazine Training:

    • The animal learns to approach the food tray at the sound of the food dispenser (magazine), which is paired with food delivery.

  3. Lever Pressing:

    • Behavior learned through operant conditioning.

H. Shaping

  1. Accelerates lever-pressing behavior by reinforcing successive approximations.

I. Extinction

  1. When responses no longer produce the reinforcer, the response rate returns to its operant level.

J. Spontaneous Recovery

  1. After a rest period following extinction, the extinguished operant response reappears without further reinforcement.



K. Superstitious Behavior

  1. Behavior strengthened by noncontingent (accidental) reinforcement; the reinforcer happens to follow the response but is not actually produced by it.

L. Discriminative Operant

  1. SD (Discriminative Stimulus):

    • Signals reinforcer availability.

  2. SΔ (S-Delta):

    • Signals no reinforcer available.

M. Secondary Reinforcer

  1. A previously neutral stimulus that acquires reinforcing value through pairing with a primary reinforcer (e.g., money).

N. Generalized Reinforcer

  1. A secondary reinforcer that is paired with multiple primary reinforcers.

O. Chaining

  1. A sequence of responses in which each response produces a stimulus that both reinforces the previous response and signals the next one.

  2. Typically analyzed (and trained) working backward from the primary reinforcer, through successive secondary reinforcers.

P. Positive and Negative Reinforcers

  1. All reinforcement increases the operant rate.

  2. Positive Reinforcer:

    • Stimulus added contingent upon a response.

  3. Negative Reinforcer:

    • Stimulus removed contingent upon a response.

Q. Punishment

  1. Decreases the operant rate.

  2. Positive Punishment:

    • Stimulus added contingent upon a response.

  3. Negative Punishment:

    • Stimulus removed contingent upon a response.

  4. Alternatives to Punishment:

    • Change the environment.

    • Ignore undesirable behavior.



R. Schedules of Reinforcement

  1. Continuous Reinforcement:

    • Reinforcing every response.

  2. Partial Reinforcement:

    • Reinforcing some responses, but not others.

    • Types of Partial Reinforcement:

      • Fixed Interval: the first response after a fixed amount of time is reinforced.

      • Fixed Ratio: reinforcement follows a fixed number of responses.

      • Variable Interval: the first response after an unpredictable amount of time is reinforced.

      • Variable Ratio: reinforcement follows an unpredictable number of responses.

Relativity of Reinforcement

A. Premack's Principle

  1. Responses can serve as reinforcers.

  2. More preferred responses may reinforce less preferred responses.

B. Misbehavior of Organisms

  1. Instinctual Drift:

    • Organisms may revert to instinctive behaviors.

  2. Interaction between instinctual behavior and learned behavior.

C. Autoshaping

  1. Involves instinctive behavior patterns elicited by specific stimuli.

B.F. Skinner: Major Theoretical Concepts (Examples)

A. Radical Behaviorism

  • Example: A behavior analyst focuses on a child's tantrum, observing how it leads to parent attention without considering the child's internal thoughts.

B. Respondent vs Operant Behavior

  • Respondent: A dog salivating at the sound of a bell (a known stimulus).

  • Operant: A person spontaneously practicing writing without a prompt from a teacher; the behavior is emitted rather than elicited by a known stimulus and is shaped by its consequences.

C. Type S and Type R Conditioning

  • Type S: A child feeling fear (respondent behavior) when they hear thunder.

  • Type R: A rat pressing a lever (operant behavior) to receive food.

D. Operant Conditioning Principles

  • Example: A student studies for tests (response) and receives high grades (reinforcer), increasing their study behavior.
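
  • Illustration (a toy Python sketch, not from the text; all numbers are assumptions): the probability of the studied response is simply nudged upward each time the response is followed by a reinforcer, so responding climbs above its initial operant level.

        import random

        # Toy illustration: a response followed by a reinforcer tends to be repeated.
        random.seed(1)

        p_respond = 0.2        # initial operant level (arbitrary)
        step = 0.1             # arbitrary increase per reinforced response

        for trial in range(1, 11):
            responded = random.random() < p_respond
            if responded:
                # Here every response is reinforced (continuous reinforcement).
                p_respond = min(1.0, p_respond + step)
            print(f"trial {trial:2d}: responded={responded}, p(respond)={p_respond:.2f}")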

E. Skinner Box

  • Example: A rat is placed in a Skinner box, where it learns to press a lever to receive food as a reward.

F. Cumulative Recording

  • Example: A pigeon's pecks are plotted on a cumulative record; each response steps the line up, so a steep slope means rapid pecking and a flat stretch means no responding (a minimal sketch of such a record follows below).
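
  • Illustration (a toy Python sketch with made-up pecking data): the cumulative count only ever rises, one step per response, which is exactly what makes the slope a direct picture of response rate.

        # Hypothetical pecks per minute for a pigeon (assumed numbers).
        pecks_per_minute = [2, 3, 5, 5, 0, 0, 4, 6]

        total = 0
        cumulative = []
        for minute, pecks in enumerate(pecks_per_minute, start=1):
            total += pecks                    # the record can only rise, never fall
            cumulative.append(total)
            print(f"minute {minute}: +{pecks} pecks, cumulative = {total}")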

G. Conditioning the Lever-Pressing Response

  • Deprivation: A rat goes without food for a set number of hours (stated operationally, with no inferred "drive").

  • Magazine Training: The sound of a food dispenser is paired with food delivery until the rat approaches at the sound.

  • Lever Pressing: The rat learns to press the lever to get food.

H. Shaping

  • Example: A trainer reinforces a dog first for approaching a ball, then only for touching it, then only for picking it up, until the dog retrieves it on command (see the shaping sketch below).
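
  • Illustration (a toy Python sketch of successive approximations; all numbers are assumptions): only responses that meet the current criterion distance to the ball are reinforced, and each reinforcement tightens the criterion, so the behavior drifts toward the final target.

        import random

        random.seed(2)
        criterion = 9.0     # reinforce any approach closer than 9 units (arbitrary start)
        best = 10.0         # the dog currently stays about 10 units from the ball

        for trial in range(1, 21):
            distance = max(0.0, best + random.uniform(-2.0, 1.0))  # behavior varies a bit
            if distance <= criterion:
                best = distance                       # reinforced approximation is strengthened
                criterion = max(0.0, best - 0.5)      # demand a slightly closer approach next time
                outcome = "reinforced"
            else:
                outcome = "not reinforced"
            print(f"trial {trial:2d}: distance={distance:.1f}, criterion={criterion:.1f}, {outcome}")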

I. Extinction

  • Example: A student stops receiving praise for answering questions, leading them to stop answering altogether.

J. Spontaneous Recovery

  • Example: After hand-raising has undergone extinction (praise is no longer given) and the student has been away from school for a while, the student raises their hand again even though no reinforcement has been reintroduced; the sketch below illustrates both extinction and spontaneous recovery.
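
  • Illustration (a toy Python sketch with made-up rates): once the reinforcer is withheld, the response rate decays back toward its operant level; after a rest period a fraction of the responding briefly reappears without any further reinforcement.

        operant_level = 1.0      # responses per minute before conditioning (assumed)
        conditioned_rate = 12.0  # rate at the end of conditioning (assumed)
        decay = 0.7              # fraction of above-baseline responding kept each minute (assumed)

        rate = conditioned_rate
        for minute in range(1, 11):
            rate = operant_level + (rate - operant_level) * decay
            print(f"extinction minute {minute:2d}: {rate:.2f} responses/min")

        # Spontaneous recovery: after a rest, part of the extinguished responding returns.
        rate = operant_level + 0.4 * (conditioned_rate - operant_level)
        print(f"after a rest period: {rate:.2f} responses/min")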

K. Superstitious Behavior

  • Example: An athlete wears a ‘lucky’ shirt that they believe helps them win, even though it has no effect on their performance.

L. Discriminative Operant

  • SD: A green traffic light (signals reinforcer availability – ability to go).

  • SΔ: A red traffic light (signals no reinforcer – must stop).
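
  • Illustration (a toy Python sketch; all probabilities are assumptions): lever presses are reinforced only when the SD is present and never under SΔ, so the tendency to respond under the two stimuli gradually pulls apart (stimulus control).

        import random

        random.seed(3)
        p_press = {"SD": 0.5, "S_delta": 0.5}   # same tendency to press at first
        step = 0.05

        for trial in range(200):
            stimulus = random.choice(["SD", "S_delta"])
            pressed = random.random() < p_press[stimulus]
            if pressed:
                if stimulus == "SD":
                    p_press[stimulus] = min(1.0, p_press[stimulus] + step)  # press reinforced
                else:
                    p_press[stimulus] = max(0.0, p_press[stimulus] - step)  # press extinguished

        print(p_press)   # pressing ends up likely under SD and rare under S_delta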

M. Secondary Reinforcer

  • Example: A child is given a gold star (secondary reinforcer) after completing their homework; the star has acquired its value by being paired with established reinforcers such as treats and praise.

N. Generalized Reinforcer

  • Example: Money, which can be exchanged for various primary reinforcers like food, clothing, or entertainment.

O. Chaining

  • Example: A pigeon is taught to peck a disk (first link), then turn in a circle (second link), and finally flap its wings (final link) to receive food; each completed link produces the cue for the next link and reinforces the previous one (a backward-chaining sketch follows below).
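
  • Illustration (a toy Python sketch of backward chaining, using the links from the example above): training starts with the link closest to the primary reinforcer, so each newly added earlier link is reinforced by the chance to run the already-learned remainder of the chain.

        chain = ["peck the disk", "turn in a circle", "flap wings"]   # final link is followed by food

        learned = []
        for link in reversed(chain):          # train the last link first, then work backward
            learned.insert(0, link)
            print(" -> ".join(learned) + " -> food")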

P. Positive and Negative Reinforcers

  • Positive: A child receives a cookie for cleaning their room.

  • Negative: A teenager’s curfew is lifted for completing chores.

Q. Punishment

  • Positive Punishment: A child touches a hot stove and feels pain (added stimulus).

  • Negative Punishment: A teenager loses phone privileges for breaking curfew (removal of stimulus).

R. Schedules of Reinforcement

  • Continuous Reinforcement: A dog is given a treat every time it sits on command.

  • Fixed Interval: A student is rewarded with a pizza party every two weeks for completing homework.

  • Fixed Ratio: A factory worker is paid for every 10 items produced.

  • Variable Interval: A person checks their email at random times; sometimes they receive responses immediately, other times not for hours.

  • Variable Ratio: A slot machine pays out after a random number of pulls, leading to high engagement.
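
  • Illustration (a toy Python sketch; the schedule parameters and the variable-interval approximation are assumptions): each function answers one question for its schedule, namely whether this particular response, given the history so far, is reinforced.

        import random

        random.seed(4)

        def fixed_ratio(responses_since_reinf, ratio=10):
            # Reinforce every 10th response (like piecework pay).
            return responses_since_reinf >= ratio

        def variable_ratio(responses_since_reinf, mean_ratio=10):
            # Reinforce after an unpredictable number of responses (like a slot machine).
            return random.random() < 1.0 / mean_ratio

        def fixed_interval(seconds_since_reinf, interval=60):
            # Reinforce the first response made after a fixed amount of time.
            return seconds_since_reinf >= interval

        def variable_interval(seconds_since_reinf, mean_interval=60):
            # Rough approximation: the longer it has been, the more likely a
            # reinforcer has become available for the next response.
            return random.random() < min(1.0, seconds_since_reinf / mean_interval)

        # One response checked under each schedule:
        print(fixed_ratio(10), variable_ratio(3), fixed_interval(75), variable_interval(30))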
