Aversive Stimulus
A stimulus whose removal will function as a negative reinforcer and whose presentation will function as a positive punisher
SS
shock-shock interval, the time between shocks when no response is made
RS
response-shock interval, the time between the response and the next shock
Contingency
The greater the contingency between a response and the punishing event, the faster behavior decreases
Most effective way to eliminate behavior
Contiguity
The more immediate a response is followed by the punisher, the faster the behavior will decrease
Punishment intensity
The greater the intensity of the punisher, the faster the response will decrease
Motivation level
The greater the motivation level, the less effective punishment will be
Extinction
Stop providing reinforcement for problem behaviors
Response prevention
Physical or environmental controls that prevent the behavior from occurring
Noncontingent reinforcement
Used when problem behavior is maintained by attention or some other identifiable reinforcer
Identify the reinforcer of the unwanted behavior and frequently deliver the reinforcer regardless of what the subject is doing
Reduced motivation to engage in the behavior
DRO
Differential reinforcement of other behavior
Reinforcer is delivered if the subject does not engage in the unwanted behavior for some period of time
Every instance of unwanted behavior resets the time
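The DRO timer described above can be sketched as code; a minimal Python sketch, assuming a fixed-length session and a resetting interval (all names are illustrative, not from any real procedure manual):

```python
def dro_deliveries(behavior_times, interval, session_end):
    """Return the times at which the reinforcer is delivered under DRO.

    behavior_times: times (seconds) at which the unwanted behavior occurred
    interval:       seconds the subject must go without the behavior
    session_end:    total session length in seconds
    """
    deliveries = []
    events = sorted(behavior_times)
    i = 0
    t = 0  # when the current behavior-free timer started
    while t + interval <= session_end:
        due = t + interval
        # every instance of the unwanted behavior resets the timer
        while i < len(events) and events[i] < due:
            t = events[i]
            due = t + interval
            i += 1
        if due > session_end:
            break
        deliveries.append(due)
        t = due
    return deliveries
```

For example, on a 10-second DRO in a 30-second session with no unwanted behavior, reinforcers arrive at 10, 20, and 30 seconds; a single instance of the behavior at second 5 pushes the first delivery back to second 15.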
DRA
Differential reinforcement of alternative behavior
Identify the reinforcer for the problem behavior
Extinguish the unwanted behavior while also providing the reinforcer when a more desirable behavior occurs
Example:
Kim throws a temper tantrum to get out of class assignments
When she throws a tantrum, Kim remains in class and the assignment is not removed
If Kim raises her hand, she receives help and/or gets to take a break from the assignment
DRL
Differential reinforcement of low rates
Behavior itself is not problematic, but the frequency of behavior is
Used if you want to reduce, but not eliminate a behavior
Reinforcer is delivered if the time between two consecutive responses (the inter-response time, IRT) is greater than some value
Example:
Student always raises his hand and tries to answer every question
Teacher does not want to punish or extinguish the behavior, so she calls on him only if some minimum time has passed since his last answer
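The DRL rule above (reinforce only when the IRT exceeds a minimum) can be sketched in code; a minimal Python sketch, with the function name and the convention that the first response is reinforced both being illustrative assumptions:

```python
def drl_reinforced(response_times, min_irt):
    """Return the response times that earn the reinforcer under DRL.

    A response is reinforced only if the inter-response time (IRT)
    since the previous response is at least `min_irt` seconds.
    The first response has no prior response, so this sketch
    reinforces it by convention (an assumption, not a fixed rule).
    """
    reinforced = []
    last = None
    for t in response_times:
        if last is None or t - last >= min_irt:
            reinforced.append(t)
        last = t  # every response, reinforced or not, restarts the IRT clock
    return reinforced
```

For example, with responses at 0, 3, 10, 12, and 20 seconds and a 5-second minimum IRT, only the responses at 0, 10, and 20 are reinforced: the responses at 3 and 12 came too soon after the previous one.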
Concurrent schedule of reinforcement
Two (or more) schedules of reinforcement operate at the same time
Impulsivity
Choosing the small immediate reinforcer
Self-control
Forgoing the small immediate reinforcer and instead choosing the larger more delayed reinforcer
Between-subjects design
Participants are randomly assigned to one of two or more groups
Random assignment helps ensure that the groups are identical on all other variables
Groups differ on the level of the IV
Within-subjects design
Independent variable is manipulated within subjects
Each participant is exposed to all the experimental conditions
Often in randomized order
IV (independent variable)
The variable that is manipulated by the experimenter
Levels of IV
No maximum number of levels
Minimum of two levels
DV (dependent variable)
The variable that is measured
AB design
contains one baseline (A) and one treatment (B)
ABA design
a single-case design in which the response to the treatment condition is compared to baseline responses recorded before and after the treatment
Independent Variable vs. Dependent Variable
IV – what is being manipulated (x-axis); DV – what is being measured (y-axis)
Frequency
number of responses over time
Rate
responses/time
Latency
time between the opportunity to respond and the start of the response
Intensity
strength
Topography
the form a behavior takes
Habituation
Decreased response to a stimulus as a result of repeated exposure to it
Dishabituation
Return of a previously habituated response to a stimulus due to the presentation of a different, more intense (dishabituating) stimulus
Spontaneous Recovery
Return of a previously habituated response after a period of time without exposure to the stimulus
Sensitization
an increased response to a relatively weak stimulus as a result of prior exposure to that stimulus or a more intense stimulus
Ex: After experiencing a small earthquake, you may become more sensitive to minor vibrations or sounds, reacting strongly to things that previously wouldn't have bothered you
The Law of Effect
Responses closely followed by satisfaction are more likely to recur
Responses closely followed by discomfort are less likely to recur
The three-term contingency (A-B-C)
Antecedent, behavior, consequence
Ex: curfew is in effect (A) → stay out past curfew (B) → parents take away my keys (C)
Primary versus secondary versus generalized reinforcers
Primary – food, sex, water
Secondary – praise, applause, a smile, good grades
Generalized – money, because it can buy other reinforcers
The Premack Principle
More probable behaviors will reinforce less probable behaviors
Homme et al.: access to a high-probability behavior (yelling and running around) reinforced a low-probability behavior (sitting and listening)
Money for favors
Forward chaining
Train step 1 first; then require steps 1 and 2, then 1, 2, and 3, and so on. The reinforcer is delivered each time the last trained step is completed.
Backward chaining
First train behavior C and provide the reinforcer when C occurs. Once that is learned, require B then C; then require A, then B, then C, to receive reinforcement.
Why subjects pause on FR schedules but not VR schedules?
On an FR schedule, the next reinforcer is always a full ratio of responses away, so subjects take a post-reinforcement pause. On a VR schedule, the very next response might produce the reinforcer, so there is little incentive to pause.
Why the scalloped shaped pattern of responding occurs under FI schedules?
Animals are sensitive to the passage of time: responding is slow just after reinforcement, when a response cannot yet pay off, and accelerates as the end of the interval approaches, producing the scallop.
Stimulus generalization
the same response occurs to stimuli similar to the training stimulus
Response generalization
similar (but not identical) responses occur to the same stimulus
FR
A reinforcer is delivered upon the occurrence of the nth response.
In a fixed ratio 10 schedule (FR 10), a reinforcer is delivered upon the occurrence of the 10th response
Ex: piecework pay; a seasonal worker earns $5 for every bushel of apples he picks
VR
A reinforcer is delivered on average after n responses.
In a variable ratio 10 schedule (VR 10), a reinforcer is delivered on average after 10 responses.
Ex: playing slot machines; guys approaching girls at a bar
behavior of salespeople
on average, they might have to approach X people to make a sale, but the very next person might be the one to make a purchase
FI
A reinforcer is delivered upon the occurrence of the first response after n seconds have elapsed
In a fixed interval 30-second schedule, a reinforcer is delivered upon the occurrence of the first response after 30 seconds.
Ex: checking your watch as you get closer and closer to the end of class
VI
In a variable interval, a reinforcer is delivered upon the occurrence of the first response after a variable amount of time has elapsed.
On a VI 60-s schedule, on average the amount of time that must elapse before a response produces a reinforcer is 60 s, but the actual intervals vary from reinforcer to reinforcer.
Ex: checking e-mail (the times at which mail arrives are semi-unpredictable)
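The ratio and interval schedules above are simple decision rules, and sketching them as code makes the contrast clear. A hedged Python sketch covering FR, VR, and FI (function names, the geometric approximation for VR, and the `clock` callback for FI are all illustrative assumptions; VI would be FI with a randomized interval):

```python
import random

def fr(n):
    """Fixed ratio: reinforce every nth response."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True
        return False
    return respond

def vr(n):
    """Variable ratio: reinforce after a random number of responses
    that averages n (approximated here as each response paying off
    with probability 1/n)."""
    def respond():
        return random.random() < 1.0 / n
    return respond

def fi(interval, clock):
    """Fixed interval: reinforce the first response made after
    `interval` seconds have elapsed since the last reinforcer.
    `clock` is a callable returning the current session time."""
    available_at = interval
    def respond():
        nonlocal available_at
        if clock() >= available_at:
            available_at = clock() + interval
            return True
        return False
    return respond
```

On `fr(10)`, exactly every 10th call to `respond()` is reinforced; on `vr(10)`, any call might be; on `fi(30, clock)`, responses before the 30-second mark earn nothing no matter how many are made, which is why responding on FI drops off right after reinforcement.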