Lecture 7: Schedules of Reinforcement


21 Terms

1

Schedules of Reinforcement

Rule specifying which occurrences of a given behaviour, if any, will be reinforced

2

Continuous Reinforcement

Every correct response is reinforced; fast learning and fast extinction

3

Partial/Intermittent Reinforcement

Only some correct responses are reinforced; slower learning and slower extinction (greater resistance to extinction)

4

Learning Something New

  • Acquisition – more reinforcement

    • The more you reinforce, the quicker acquisition happens

  • Maintenance – less reinforcement

    • Give fewer reinforcers once the behaviour is established

  • Schedule thinning

    • The process of gradually reducing reinforcement of a behaviour

5

Partial/Intermittent Reinforcement Advantages

a) The reinforcer remains effective longer because satiation takes place more slowly.

b) Behaviour that has been reinforced intermittently tends to take longer to extinguish.

c) Individuals work more consistently on certain intermittent schedules.

d) Behaviour that has been reinforced intermittently is more likely to persist after being transferred to reinforcement in the natural environment.

6

Ratio Schedules

Reinforcement based on number of responses emitted

7

Fixed ratio schedule

  • Reinforcement occurs each time a set number of responses of a particular type are emitted

    • Produces a high, steady rate of responding until reinforcement, followed by a post-reinforcement pause

    • Initially produces a high rate of responding during extinction

    • Produces high resistance to extinction

8

Ratio strain

Deterioration of responding from increasing an FR schedule too rapidly

9

Variable-ratio (VR) schedule

  • The number of responses required to produce reinforcement changes unpredictably from one reinforcement to the next

  • Produces a high steady rate of responding

  • Produces no (or at least very small) post-reinforcement pause

10

Differences between VR and FR schedules

VR schedules can be increased more abruptly than FR schedules without producing ratio strain

Values of VR that can maintain a behaviour are somewhat higher than those of FR

VR produces higher resistance to extinction than an FR of the same value does

11

Both VR and FR schedules

  • used when a high rate of responding is desired, and each response can be monitored.

  • It is necessary to count the responses to know when to deliver reinforcement.

  • FR is more commonly used than VR in behavioural programs because it is simpler to administer
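A minimal sketch (not from the lecture) of how the two ratio rules decide when a response earns the reinforcer, assuming a simple response counter; the class and method names are illustrative only:

import random

class FixedRatio:
    """FR n: reinforce every nth response."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def record_response(self):
        """Return True if this response earns the reinforcer."""
        self.count += 1
        if self.count >= self.n:
            self.count = 0  # a new ratio run begins after reinforcement
            return True
        return False

class VariableRatio:
    """VR n: the number of responses required changes unpredictably,
    averaging n across reinforcements."""
    def __init__(self, mean):
        self.mean = mean
        self._start_new_run()

    def _start_new_run(self):
        # Draw the next requirement at random; the average works out to `mean`.
        self.requirement = random.randint(1, 2 * self.mean - 1)
        self.count = 0

    def record_response(self):
        self.count += 1
        if self.count >= self.requirement:
            self._start_new_run()
            return True
        return False

fr5 = FixedRatio(5)
print([fr5.record_response() for _ in range(10)])
# [False, False, False, False, True, False, False, False, False, True]

The only difference between the two classes is whether the response requirement is constant or redrawn after each reinforcer, which is what makes VR pauses so short.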

12

Interval Schedules

Schedules based on time

13

Fixed Interval (FI) Schedule

  • The first response after a fixed amount of time following the previous reinforcement is reinforced and a new interval begins

  • Size of FI schedule: amount of time that must elapse

  • No limit on how long after the end of the interval a response can occur in order to be reinforced

14

FI schedules (without access to a clock) produce:

  • A rate of responding that increases gradually near the end of the interval until reinforcement

  • A post-reinforcement pause

    • Length depends on value of FI: higher value = longer pause

15

Variable Interval (VI) Schedule

  • A response is reinforced after unpredictable intervals of time

  • Length of the interval changes from one reinforcement to the next: varies around some mean value

  • Produces a moderate steady rate of responding and no post-reinforcement pause

  • Produces high resistance to extinction

  • Lower rates of responding than FR or VR

16

Simple Interval Schedules

  • “Simple” because the only requirement is that time must pass

  • Interval schedules are not often used because:

    • FI produces long post-reinforcement pauses

    • VI generates lower response rates than ratio schedules

    • Simple interval schedules require continuous monitoring of behaviour after each interval until a response occurs

17

Limited Hold

  • a deadline for meeting the response requirement of a schedule of reinforcement.

  • Ex. FR 30/LH 2 minutes

  • It is not common for a limited hold to be added to ratio schedules

18

Interval Schedules with Limited Hold

  • There is a finite time after a reinforcer becomes available during which a response will produce it.

    • FI/LH

    • VI/LH

  • Example: a bus arrives at a specific stop every 20 minutes and you have a limited time (about 1 minute) to get on it (sketched in the code below)

  • Interval schedules with short limited holds produce results similar to ratio schedules

  • For small FIs, FI/LH produce results similar to FR schedules

  • VI/LH – similar results to VR schedules

  • Limited holds are used when we want ratio-like behaviour but are unable to count each instance of the behaviour
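A minimal sketch (not from the lecture) of the bus example above, treating each bus as a reinforcer that becomes available on a fixed interval and stays available only for the limited hold; the numbers mirror the card and the function name is illustrative:

def catches_bus(arrival_minute, interval=20.0, limited_hold=1.0):
    """FI 20 min / LH 1 min in bus terms: a bus (the reinforcer) becomes
    available every `interval` minutes and waits `limited_hold` minutes.
    The response (showing up at the stop) pays off only inside that window."""
    minutes_since_last_bus = arrival_minute % interval
    return minutes_since_last_bus <= limited_hold

print(catches_bus(20.5))  # True  -> within 1 minute of the 20-minute bus
print(catches_bus(25.0))  # False -> the bus has left; nothing until the next interval

With a short limited hold, the only way to be reinforced reliably is to respond steadily around the end of each interval, which is why short holds push behaviour toward ratio-like patterns.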

19

Duration Schedules

  • Reinforcement occurs after the behaviour has been engaged in for a continuous period of time.

    • Fixed Duration (FD) Ex. FD 5 minutes

    • Variable-Duration (VD) – the required duration changes unpredictably

  • Used only when target behaviour can be measured continuously

20

Guidelines for Effective Use of Intermittent Reinforcement

  • Choose an appropriate schedule for behaviour you wish to strengthen and maintain.

  • Choose a schedule that is convenient to administer.

  • Use appropriate instruments and materials to determine accurately and conveniently when the behaviour should be reinforced.

  • Frequency of reinforcement should initially be high enough to maintain desired behaviour, then decrease gradually.

  • Inform the individual of what schedule you are using.

21

Concurrent Schedules of Reinforcement

  • Different schedules of reinforcement that are in effect at the same time

  • Herrnstein’s (1961) matching law:

    – The response rate or the time devoted to an activity in a concurrent schedule is proportional to the rate of reinforcement of that activity relative to the rates of reinforcement on the other concurrent activities (see the equation below).

  • Research findings on factors influencing choice of reinforcement:

    – Types of schedules that are operating

    – The immediacy of reinforcement

    – The magnitude of reinforcement

    – Response effort involved in different options
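One common textbook statement of Herrnstein's matching law for two concurrent alternatives (the symbols below are standard notation, not taken from these slides):

\[
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
\]

where \(B_1\) and \(B_2\) are the rates of responding on (or time devoted to) the two activities, and \(R_1\) and \(R_2\) are the rates of reinforcement each activity produces.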
