Reinforcement

Schedules of Reinforcement

  • Definition: Schedules of reinforcement are rules that determine when and how often a behavior will be reinforced.

Types of Schedules of Partial Reinforcement

  • Four Main Types:
    • Fixed Ratio
    • Variable Ratio
    • Fixed Interval
    • Variable Interval

Ratio vs. Interval Schedules

  • Ratio schedules:
    • Involve the number of behaviors that must be performed prior to reward.
  • Interval schedules:
    • Involve the amount of time that must pass before a behavior can be rewarded.

Fixed vs. Variable Schedules

  • Fixed Schedule:
    • The number of behaviors or the amount of time is always the same.
    • Delivery of the reinforcer is predictable.
  • Variable Schedule:
    • The required number of behaviors or amount of time changes from one reinforcement to the next.
    • Delivery of the reinforcement is not predictable.

Fixed Ratio Schedule

  • Definition: Provides reinforcement after a set number of behaviors.
  • Examples:
    • Getting a chocolate bar after inserting 6 quarters into a vending machine.
    • Earning a bonus after selling 20 electric spoons.
  • Characteristics:
    • Common in the business world to increase production.
    • Example: A factory worker must produce a certain number of items to get paid.
  • Slot Machine Analysis:
    • If a slot machine ran on a fixed ratio (say, a win every 20 plays), the rewards would be predictable.
    • That predictability would make the schedule easy to learn, which is exactly why real casinos use variable schedules instead (described in the next section).
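
A minimal Python sketch of a fixed ratio schedule, using a hypothetical FixedRatio class invented here for illustration: reinforcement is delivered only after a set number of responses.

class FixedRatio:
    def __init__(self, ratio):
        self.ratio = ratio        # e.g., FR-6: every 6th response is reinforced
        self.responses = 0

    def respond(self):
        """Record one response; return True if it earns the reinforcer."""
        self.responses += 1
        if self.responses == self.ratio:
            self.responses = 0    # the count resets after each reinforcement
            return True
        return False

# Inserting 6 quarters into a vending machine (FR-6): only the 6th response pays off.
machine = FixedRatio(6)
print([machine.respond() for _ in range(6)])
# [False, False, False, False, False, True]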

Variable Ratio Schedule

  • Definition: Rewards are given on an unpredictable basis after a variable number of behaviors.
  • Examples:
    • Slot machines may pay off every 20 plays on average, but the exact number of plays between wins varies greatly.
  • Behavioral Impact:
    • Produces high and steady rates of behavior.
    • More resistant to extinction than the other reinforcement schedules.
    • Requires continued responding (e.g., players must keep playing because the next win could come at any time).
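
A minimal Python sketch of a variable ratio schedule, using a hypothetical VariableRatio class invented here for illustration: each reinforcer requires a random number of responses that only averages out to the nominal ratio, so no single win can be predicted.

import random

class VariableRatio:
    def __init__(self, mean_ratio):
        self.mean_ratio = mean_ratio
        # The required count is random but averages out to mean_ratio.
        self.required = random.randint(1, 2 * mean_ratio - 1)
        self.responses = 0

    def respond(self):
        """Record one response; return True if this response is reinforced."""
        self.responses += 1
        if self.responses >= self.required:
            self.responses = 0
            self.required = random.randint(1, 2 * self.mean_ratio - 1)
            return True
        return False

# A slot machine that pays off every 20 plays on average (VR-20).
slot = VariableRatio(20)
wins = sum(slot.respond() for _ in range(1000))
print(f"{wins} wins in 1000 plays")   # roughly 50, but any single win is unpredictable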

Fixed Interval Schedule

  • Definition: Reinforces the first appropriate behavior after a fixed time period.
  • Examples:
    • Class exams occurring every three weeks.
    • Pets anticipating dinner at 5 PM.
  • Behavior Pattern:
    • Rate of behavior increases as the time of reinforcement approaches (e.g., checking on baking cookies more and more frequently as the timer nears).
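
A minimal Python sketch of a fixed interval schedule, using a hypothetical FixedInterval class driven by a simulated clock (both invented here for illustration): the first response after the interval has elapsed is reinforced, and earlier responses earn nothing.

class FixedInterval:
    def __init__(self, interval):
        self.interval = interval
        self.last_reinforcement = 0.0

    def respond(self, current_time):
        """Record a response at current_time; reinforce the first one after the interval."""
        if current_time - self.last_reinforcement >= self.interval:
            self.last_reinforcement = current_time   # the clock restarts after reinforcement
            return True
        return False

# Exams every three weeks (FI-3, measured in weeks): responses right after an
# exam go unrewarded; the first response once three weeks have passed pays off.
exams = FixedInterval(3)
print([exams.respond(week) for week in range(7)])
# [False, False, False, True, False, False, True]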

Variable Interval Schedule

  • Definition: Reinforcement occurs after variable amounts of time.
  • Examples:
    • Pop quizzes and random drug tests.
    • Fishing (fish bite at unpredictable times).
  • Behavior Impact:
    • Produces slow, steady rates of behavior; for example, unpredictable pop quizzes encourage regular study habits rather than cramming before scheduled exams.
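
A minimal Python sketch of a variable interval schedule, using a hypothetical VariableInterval class on the same kind of simulated clock (invented here for illustration): reinforcement becomes available after a random delay that only averages out to the nominal interval, so slow, steady checking is the best strategy.

import random

class VariableInterval:
    def __init__(self, mean_interval):
        self.mean_interval = mean_interval
        # Reinforcement becomes available after a random delay averaging mean_interval.
        self.next_available = random.uniform(0, 2 * mean_interval)

    def respond(self, current_time):
        """Record a response at current_time; return True if it is reinforced."""
        if current_time >= self.next_available:
            # Schedule the next unpredictable availability from now.
            self.next_available = current_time + random.uniform(0, 2 * self.mean_interval)
            return True
        return False

# Fishing on a VI-10 schedule: bites become available at unpredictable times,
# so checking the line steadily (here, every 2 time units) is what pays off.
line = VariableInterval(10)
bites = sum(line.respond(t) for t in range(0, 100, 2))
print(f"{bites} bites across 50 checks")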

Summary of Schedules

  • Fixed Ratio: Reinforcement after a set number of responses.
  • Variable Ratio: Reinforcement after a variable number of responses.
  • Fixed Interval: Reinforcement after a fixed amount of time.
  • Variable Interval: Reinforcement after a variable amount of time.