RZ

Chapter 7: Operant Conditioning: Schedules and Theories of Reinforcement 

  1. What is a schedule of reinforcement? Contrast continuous and intermittent schedules. 

A schedule of reinforcement is the response requirement that must be met to obtain reinforcement. A continuous reinforcement schedule is one in which each specified response is reinforced. EX: each time a rat presses the lever, it obtains a food pellet. An intermittent reinforcement schedule is one in which only some responses are reinforced. EX: only some of the rat's lever presses result in a food pellet. 

  2. Define each of the following schedules, as well as describe the pattern of responding generated by each (see Table 7.1 for a nice summary of response patterns): FR; VR; FI; VI.  

Fixed ratio (FR) = a schedule in which reinforcement is contingent upon a fixed, predictable number of responses. Produces a high, steady rate of responding with a short post-reinforcement pause after each reinforcer.  

Variable ratio (VR) = reinforcement is contingent upon a varying, unpredictable number of responses. Produces a high, steady rate of responding with little or no post-reinforcement pause.  

Fixed interval (FI) = reinforcement is contingent upon the first response after a fixed, predictable period of time. Produces an initially slow rate of responding that increases as the interval draws to a close (a "scalloped" pattern).  

Variable interval (VI) = reinforcement is contingent upon the first response after a varying, unpredictable period of time. Produces a moderate, steady rate of responding with little or no post-reinforcement pause.  

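As a rough computational illustration (my own sketch, not from the textbook; all function names and parameters are hypothetical), the response requirements of the four basic schedules can be written as simple decision rules that return True when a response earns a reinforcer:

```python
import random

def fixed_ratio(n, responses_since_last_reinforcer):
    """FR n: every nth response is reinforced."""
    return responses_since_last_reinforcer >= n

def variable_ratio(mean_n, rng=random):
    """VR mean_n: each response has a 1/mean_n chance of reinforcement,
    so on average every mean_n-th response is reinforced."""
    return rng.random() < 1.0 / mean_n

def fixed_interval(interval_s, seconds_since_last_reinforcer):
    """FI: the first response made after interval_s seconds is reinforced."""
    return seconds_since_last_reinforcer >= interval_s

def variable_interval(current_interval_s, seconds_since_last_reinforcer):
    """VI: like FI, except the required interval varies unpredictably;
    the caller draws current_interval_s around the schedule's mean,
    e.g. with random.expovariate(1 / mean_interval_s)."""
    return seconds_since_last_reinforcer >= current_interval_s
```

For instance, on an FR 5 schedule the fifth lever press since the last pellet is reinforced (fixed_ratio(5, 5) is True) while the fourth is not; the ratio functions count responses, and the interval functions watch the clock, which is exactly the FR/VR vs. FI/VI distinction.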
  3. What is meant by the terms “rich” & “lean” as applied to schedules? Define “stretching the ratio” and “ratio strain”. 

Rich = schedules in which the reinforcer is easily obtained; lean = schedules in which the reinforcer is difficult to obtain. Stretching the ratio refers to moving from a low ratio requirement (dense/rich schedule) to a high ratio requirement (lean schedule); this should be done gradually. Ratio strain is the disruption in responding (e.g., long pauses or erratic responding) that results when the ratio is stretched too quickly or made too demanding.  

 

  4. Define FD and VD schedules, and identify the drawbacks associated with their use. 

Fixed duration (FD) = reinforcement is contingent upon continuous performance of a behavior for a fixed, predictable period of time. 

Variable duration (VD) = reinforcement is contingent upon continuous performance of a behavior for a varying, unpredictable period of time. 

Drawback: because the requirement is defined only by time, the behavior may be performed in a half-hearted or inconsistent manner, and it can be difficult to verify that it is actually being performed continuously throughout the interval. 

 

  5. Explain how to implement DRH & DRL schedules. Give examples of situations in which it would be appropriate to use each of these schedules. 

Differential reinforcement of high rates (DRH) = reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time; more generally, reinforcement is provided for responding at a fast rate. EX: a child struggles to do their homework quickly; instead of rewarding them simply for completing it, set a timer and reward them (e.g., with praise or a small reward like extra screen time) when they finish within a specific time frame. This reinforces the behavior when it is done quickly, encouraging the child to work at a faster pace. 

Differential reinforcement of low rates (DRL) = a response is reinforced only if a minimum amount of time has passed since the previous response; more generally, reinforcement is provided for responding at a slow rate. EX: a student who blurts out answers could be called on only after waiting quietly for several seconds, reinforcing slower, more deliberate responding. 

 
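These two contingencies can likewise be sketched as simple checks (a hypothetical illustration; the names are mine, not the textbook's): DRH counts responses within a window, DRL times the pause since the last response.

```python
def drh_met(responses_in_window, required_responses):
    """DRH: reinforce only if at least required_responses occurred
    within the observation window (i.e., responding was fast)."""
    return responses_in_window >= required_responses

def drl_met(seconds_since_last_response, minimum_pause_s):
    """DRL: reinforce a response only if at least minimum_pause_s seconds
    have passed since the previous response (i.e., responding was slow)."""
    return seconds_since_last_response >= minimum_pause_s
```

On a DRL schedule requiring a 10-second pause, a response made 12 seconds after the previous one is reinforced (drl_met(12, 10) is True), while one made after only 4 seconds is not.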

  6. How is a noncontingent reinforcement schedule different from the other schedules discussed in this chapter?  Define FT & VT schedules. Provide a real-life example in which noncontingent reinforcement created superstitious behavior.   

A noncontingent schedule of reinforcement is one in which the reinforcer is delivered independently of any response; unlike the other schedules in this chapter, there is no response requirement that must be met. There are two types of noncontingent schedules: fixed time and variable time. FT = the reinforcer is delivered following a fixed, predictable period of time, regardless of the organism's behavior. VT = the reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior. EX of superstitious behavior: a gambler who happened to be wearing a certain shirt during a big win may keep wearing that "lucky" shirt, because the noncontingent reinforcer (the win) accidentally followed the behavior of wearing it. 

 
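Because FT and VT schedules ignore behavior entirely, a sketch of them needs no response input at all, only a clock (my own hypothetical illustration; the function name and arguments are not from the textbook):

```python
import random

def next_delivery_time(last_delivery_time, schedule, interval_s, rng=random):
    """Return when the next noncontingent reinforcer will be delivered.
    FT: exactly interval_s after the last delivery.
    VT: an unpredictable interval averaging interval_s."""
    if schedule == "FT":
        return last_delivery_time + interval_s
    elif schedule == "VT":
        return last_delivery_time + rng.expovariate(1.0 / interval_s)
    raise ValueError("schedule must be 'FT' or 'VT'")
```

Note that no response argument appears anywhere: delivery depends only on time, which is what allows accidental response-reinforcer pairings and hence superstitious behavior.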

  7. Summarize the argument regarding the possible clinical benefits of providing noncontingent reinforcement.  

The argument is that providing plenty of noncontingent ("free") reinforcement can reduce problem behaviors: if reinforcers are delivered regardless of behavior, the person no longer needs to engage in maladaptive behavior to obtain them, and an environment rich in freely available reinforcement may also improve mood and general well-being. 

  8. Define conjunctive, adjusting, and chained schedules. 

Conjunctive schedule = the requirements of two or more simple schedules must be met before a reinforcer is delivered.  

Adjusting schedule = the response requirement changes as a function of the organism's performance while responding for the previous reinforcer.  

Chained schedule = a sequence of two or more simple schedules, each of which has its own S^D and the last of which results in a terminal reinforcer.  

 

 

Chapter 8:  Extinction and Stimulus Control 

  1. Define extinction and describe each side effect.   

Extinction is the nonreinforcement of a previously reinforced response, the result of which is a decrease in the strength of that response. Side effects include an extinction burst (a temporary increase in the frequency and intensity of responding), increased variability in responding, emotional behavior, aggression, resurgence (the reappearance of other, previously reinforced behaviors), and depression (low activity).  

  2. What is resistance to extinction?  

Resistance to extinction is the extent to which responding persists after an extinction procedure has been implemented. EX: a dog that continues to beg for food at the dinner table for 20 minutes after everyone has stopped feeding it is displaying much higher resistance to extinction than a dog that stops begging after 5 minutes.  

  3. Define and give an example of a DRO procedure. Why would someone want to use DRO to reduce behavior (e.g., as opposed to simply using an extinction procedure)? Also know the “general rule” in the last sentence on p. 305. 

Differential reinforcement of other behavior (DRO) is the reinforcement of any behavior other than the target behavior that is being extinguished. EX: a parent might ignore a child's tantrums (extinction) while praising the child whenever they ask for something calmly. This procedure tends to be more effective than a simple extinction procedure because the target behavior is weakened both by the lack of reinforcement for that behavior and by the reinforcement of alternative behaviors that come to replace it.  

  • Whenever one attempts to extinguish an unwanted behavior, one should also provide plenty of reinforcement for more appropriate behavior. 

 
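As a rough sketch (my own illustration; the function name and arguments are hypothetical), a DRO contingency amounts to reinforcing the passage of time without the target behavior, with each occurrence of that behavior resetting the clock:

```python
def dro_reinforcer_times(behavior_times, dro_interval_s, session_length_s):
    """Return the times at which a DRO reinforcer is delivered: one each
    time dro_interval_s elapses with no target behavior. Each occurrence
    of the target behavior resets the interval clock."""
    deliveries = []
    clock_start = 0
    # treat the session end as a final (harmless) clock reset
    for event in sorted(behavior_times) + [session_length_s]:
        next_delivery = clock_start + dro_interval_s
        while next_delivery <= event:
            deliveries.append(next_delivery)        # interval passed behavior-free
            next_delivery += dro_interval_s         # interval restarts after delivery
        clock_start = event                         # target behavior resets the clock
    return deliveries
```

With a 10-second DRO interval, a 40-second session, and one tantrum at 25 s, reinforcers are delivered at 10 s and 20 s (behavior-free intervals) and then at 35 s, i.e., 10 s after the reset caused by the tantrum.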

  4. How do you know when you have stimulus control?  

We have stimulus control when the presence of a particular stimulus reliably affects the probability of a behavior, i.e., when the behavior consistently occurs in the presence of that stimulus because responding has been reinforced in its presence.  

E.g.: at red lights, we stop; at green lights, we proceed; in an elevator, we stand facing the front rather than the back. 

  5. Identify the critical features of a generalization gradient (carefully look at the gradients in Figure 8.4). How do you interpret the stimulus control displayed on a gradient (e.g., appropriately use the terms generalization and discrimination based on the steepness or flatness of a gradient)? 

A generalization gradient plots the strength of responding to stimuli that vary along a continuum around the training stimulus. A relatively flat gradient indicates more generalization (weaker stimulus control, less discrimination); a relatively steep gradient indicates less generalization (stronger stimulus control, more discrimination).  

 

 

 

  6. Explain the procedure involved in discrimination training and apply this to teaching a person to discriminate between two stimuli. 

Discrimination training = reinforcement of responding in the presence of one stimulus (the S^D) and not in the presence of another (the S^Δ).  

An example would be teaching a person to discriminate between two similar tones: reinforce (e.g., with praise or points) only responses made to the target tone, and withhold reinforcement for responses made to the other tone, until the person responds only to the target. 

  7. What is a peak shift?  Note that discrimination training is needed in order to produce the shift.  

Following discrimination training, the peak of a generalization gradient shifts from the S^D to a stimulus that is further removed from the S^Δ; that is, the strongest responding occurs not to the S^D itself but to a stimulus beyond it, in the direction away from the S^Δ.  

 

  8. Define multiple schedule. Explain how it is different from a chained schedule.  

A multiple schedule consists of two or more independent schedules presented in sequence, each resulting in reinforcement and each having a distinctive S^D. In contrast, a chained schedule requires that all of the component schedules be completed before the sought-after (terminal) reinforcer is delivered.  

 

  9. What are the lessons derived from the contrast literature (in the first two sentences of 317,2)? 

 

The occurrence of these contrast effects indicates that behaviors should not be viewed in isolation. Consequences for behavior in one setting can greatly affect the strength of behavior in another setting.  

 

  10. After carefully reading 323-325, evaluate the degree of stimulus control that you have established for your own studying behavior.  Explain how the stimulus control for your studying behavior could be improved.