What is Operant Conditioning?
A BEHAVIOR associated with a CONSEQUENCE!!!!
The CONSEQUENCE influences whether that BEHAVIOR occurs again in the future.
Who invented Instrumental Learning?
Edward L. Thorndike
-Demonstrated the power of changing a behavior by manipulating the consequences of that behavior.
REJECTED the idea of ANTHROPOMORPHISM!!!
-Wanted to explain animal behavior in terms of elementary stimulus and response events.
Thorndike’s Cat in the Box
Consequences serve to STRENGTHEN OR WEAKEN S-R CONNECTION
Thorndike’s Cat in the Box demonstrated “voluntary” learning, which he called Instrumental Conditioning.
The cats in Thorndike’s puzzle boxes were able to escape more quickly over successive trials. Thorndike interpreted this learning as a strengthening between the _____ . Stimulus and Response
Anthropomorphism
Attributing human characteristics to animals.
(S) - (R) - (O)
Outcomes serve to STRENGTHEN OR WEAKEN the “box-lever” connection
Example: Cat in the box scratches the box and is disappointed.
Context (S) - Behavior (R) - Outcome (O)
The outcome serves to STRENGTHEN OR WEAKEN the “box-lever” connection.
Example: Cat in the box pushes the lever and is happy.
The association is made between the S and R; the Outcome simply STRENGTHENS the S-R connection.
What is the Law of Effect?
If a response in the presence of a stimulus is followed by a satisfying event, the association between the stimulus and response is STRENGTHENED. If the response is followed by an annoying event, the association is WEAKENED.
Consequences serve to STRENGTHEN OR WEAKEN the S-R connection.
Thorndike called this type of “voluntary” learning Instrumental Conditioning.
What are the two types of learning?
Pavlovian Conditioning
Instrumental Conditioning
What is Pavlovian Conditioning?
S-S associations
Experimenter has control of CS and US; Subject is Passive
CR is “involuntary”; response is elicited
What is Instrumental Conditioning?
R - Reinforcer associations
Subject is active; the subject's behavior produces the outcome
Contingency is necessary
What is the Behavioral Approach?
Invented by B.F. Skinner
Distinguished between Classical Conditioning and Instrumental Conditioning
Skinner got rid of the Stimulus: in operant learning, responses are emitted, not elicited.
No longer measure S-R strength
Rewards and Punishments
What is reinforcement?
Response followed by a PLEASANT consequence
Response is more likely to occur
Increase in responding
What is punishment?
Response followed by an UNPLEASANT consequence
Response is less likely to occur
Decrease in responding
How do you reward a behavior that never occurs?
Answer: Shaping - reinforce successive approximations until the goal behavior is mastered
What is Continuous reinforcement?
Reinforcement occurs after every target response
What is Partial reinforcement?
Sometimes the target response is reinforced, and sometimes it is not reinforced.
Partial Reinforcement Schedules:
Fixed Ratio (FR) - Reinforcement after a set number of responses
Fixed Interval (FI) - Reinforcement for the first response after a set interval of time
Variable Ratio (VR) - Reinforcement after a varying (average) number of responses
Variable Interval (VI) - Reinforcement for the first response after a varying (average) amount of time
What is the Partial Reinforcement Extinction effect?
When the Response-to-Outcome contingency is broken, the organism eventually stops making the target response (extinction). Responses trained on a partial reinforcement schedule take longer to extinguish than responses trained on a continuous reinforcement schedule.
PARADOXICAL RESULT - The opposite of what you would expect happens!!!! One would guess that less reinforcement during a partial schedule would lead to weaker learning, and that the learning would be easier to extinguish, but the opposite is true.
A pigeon is trained to peck a keylight for food. After acquisition, the pigeon is changed to phase II, during which pecking the keylight no longer results in food. Eventually, the pigeon stops pecking the keylight. What phenomenon occurred during phase II? Extinction
Why do we see the partial reinforcement extinction effect?
Answer:
Amsel - Frustration Theory
Capaldi - Sequential Theory
What is Amsel’s Frustration Theory?
When an organism makes a response and does not receive reinforcement, the organism experiences frustration.
Response and no food leads to frustration
What is Capaldi’s Sequential Theory?
Animals have the memory of a recent, non-rewarded trial when experiencing a rewarded trial, so the memory of nonreward becomes a cue for responding and sustains responding during extinction.
What is Instinctive Drift?
The tendency for animals to revert to old, instinctive behaviors even after they have learned new behaviors.
The competition between natural responses and the responses required by the experimenter sometimes leads to the development of behaviors that interfere with an animal making an instrumental response. The development of these behaviors is called…Instinctive Drift
What are Contrast Effects?
Sudden shifts in behavior after a change in the value of the reinforcer.
What did B. F. Skinner believe?
He believed that scientists should avoid theoretical speculation about mental constructs.
TRUE/FALSE: Ratio schedules tend to support higher rates of responding compared to interval schedules.
Answer: True
Learning as the result of reinforcement (such as positive and negative reinforcement) means that responding is_____in the future.
more likely to occur
What is the difference between positive punishment and negative reinforcement?
Punishment decreases the target response, while negative reinforcement increases the target response.
Learning as the result of punishment (such as positive punishment and negative punishment) means that responding is_____in the future.
less likely to occur
What is an example of positive reinforcement?
Your parents buy you dinner after you win an award.
Suzie attends class so she won’t get behind and fail the test. Suzie’s behavior of attending class is an example of____
Negative reinforcement
Carson is stranded on a deserted island. One day a suitcase washes up on shore and he finds a novel among the items in the case. Carson was never much of a reader, but he is elated to find this book and savors reading every page. Why does Carson find reading this book so rewarding (now that he is stranded)?
According to the Matching Law, the impact of a reinforcement schedule depends on the other options that are available.
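In symbols, the matching law for two response alternatives is commonly written as below, where B_1 and B_2 are the response rates and R_1 and R_2 the reinforcement rates earned on each alternative (notation chosen here for illustration):
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
For Carson, the novel is effectively the only alternative available, so nearly all of his behavior (and the book's reinforcing impact) is allocated to reading it.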
Drive-reduction theory of reinforcement states that___
any outcome that satisfies a physical need (or drive) will reinforce responding
Tom is "channel surfing." In other words, he is trying to choose which channel on his TV he should watch. If he finds a show boring, then he switches to another channel. This is an example of____
a concurrent schedule
Most of the time choice behavior falls in line with what is predicted by the matching law. What two factors may cause "undermatching?"
differing sensitivities and differing biases
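These two factors correspond to the parameters of the generalized matching law, written here with sensitivity a and bias b (a standard formulation, added for illustration):
\log\frac{B_1}{B_2} = a\,\log\frac{R_1}{R_2} + \log b
Undermatching corresponds to a < 1 (preference is less extreme than the reinforcement ratios predict), while b \neq 1 reflects a constant bias toward one alternative.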
A child enjoys reading a book more than playing video games. According to the Premack Principle, we should be able to increase the child's reading behavior by using video game playing as a reinforcer.
Answer: False
The relative rate of responding on an alternative typically matches the relative rate of reinforcement on that alternative. This statement is also known as _____.
Answer: The Matching Law
According to Amsel's frustration theory, the partial reinforcement extinction effect occurs because____
during a partial reinforcement schedule, subjects were reinforced while frustrated, so frustration became a cue for responding. Therefore, they keep responding during extinction training.
Spontaneous Recovery after extinction refers to_____
when the Instrumental behavior returns after some period of time has passed
I train a pigeon to peck at a green light that has been paired with food. Next, I present different colors of light to the pigeon and measure how much he pecks at each one. The results are posted below. What can you conclude about the pigeon?
He cannot tell the difference between the green light and the other colors.
Colwill and Rescorla showed that animals can associate responses to their outcomes (R-O associations). To do this, they trained rats to push a lever to the right for sugar water, and push the lever to the left for food. What did Colwill and Rescorla do next to show that rats associate each response (lever to the right or left) with a different outcome (water or food)?
Phase II: sugar water is paired with illness. Test: Do the rats choose to press the lever to the right or left?
When Timmy (5 years old) has a tantrum, his mother does not respond (she ignores it). Over the past 3 months, Timmy has stopped having tantrums at home. Recently, Timmy has started kindergarten. While Timmy does not have tantrums when he is at home, he has started to have them when he is at school. This example illustrates the reoccurrence of an extinguished behavior due to ____.
Answer: renewal
When S+ and S- are very similar during training, the stimulus generalization gradient will be _____.
Answer: steep
Extinction may be enhanced by____
repeating the extinction training over and over, with each session separated by a few days.
The fact that we learn that it is "okay" to undress in our bedroom and "not okay" to undress during class in room 102, means that undressing behavior is____
Answer: under stimulus control
Recovering opiate addicts often warn doctors not to administer opiates for pain while they are hospitalized because they fear ____.
Answer: Reinstatement
A rat is trained to press a lever only when he is "high" on cocaine using the following procedure:
cocaine --> press lever --> food
saline --> press lever --> no food
The next week, we give the rat a new experimental drug and find that he starts pressing the lever. What can we conclude?
The new drug produces the same interoceptive cues as cocaine
What does the following experiment by Colwill and Rescorla show?
Animals can make S-O associations