Continuous Reinforcement
Every instance of the behavior is reinforced (the behavior extinguishes easily once reinforcement stops).
Intermittent Reinforcement
Reward is given only part of the time the behavior is displayed.
Fixed Interval Schedule
Reinforcement of first response after a fixed amount of time has passed (you know how much time).
Variable Interval Schedule
Reinforcement of first response after varying amounts of time.
Fixed Ratio Schedule
Reinforcement after a fixed number of responses (you know how many responses must be made).
Variable Ratio Schedule
Reinforcement after a varying number of responses.
Ratio
Means NUMBER OF RESPONSES.
Interval
Means AMOUNT OF TIME.
Fixed
Means A FIXED NUMBER OF RESPONSES OR A FIXED AMOUNT OF TIME.
Variable
Means A VARYING NUMBER OF RESPONSES OR A VARYING AMOUNT OF TIME.
Behavior at Fixed Interval Schedule
Little responding at the beginning of the interval, but responding increases as the end of the interval approaches.
Example of Fixed Interval Schedule
Checking your watch during a lecture, more and more often as the end of the lecture approaches.
Example of Variable Interval Schedule
Surprise quizzes in class.
Example of Fixed Ratio Schedule
Being paid for every 10 pizzas made.
Example of Variable Ratio Schedule
Playing the slot machines.
Questions to Ask Yourself
Does the reinforcement occur after a number of responses (ratio) or after a period of time (interval)?
Skinner
Psychologist most closely associated with the study of reinforcement schedules and operant conditioning.