Associative Learning II
_________ learning is the learning of the connection between a behaviour and its consequences
Instrumental
Instrumental learning is a form of associative learning, and the consequence that follows is a direct result of behaviour.
a. true
b. false
a. true
______ learning is used to train very complex behaviours, such as those of sniffer dogs at the airport and circus animals
instrumental
According to instrumental learning, organisms learn to make a specific response to obtain/avoid a specific consequence.
a. True
b. False
a. True
With instrumental conditioning, the organism is reinforced if _______
it produces a particular response in a specific stimulus situation
With instrumental conditioning:
a. R → S → C
b. S → R → C
c. C → S → R
d. C → R → S
b. S → R → C
With classical conditioning there is:
a. CS + US → R
b. CS + R → US
c. US + R → CS
d. all of the above
a. CS + US → R
The key difference between classical and instrumental conditioning is _______
Unlike CC, IC involves a reward/reinforcement following the display of a response
Learning is what has allowed us to adapt to the environment, and it needs to be universal and applicable across species
a. True
b. False
a. True
Edward Thorndike developed the Cat Puzzle Box experiment.
a. True
b. False
a. True
In the Cat Puzzle Experiment, Thorndike knew that the learning occurred based on _______
how long the cats took to escape the puzzle box
In the Cat Puzzle Box experiment, the behaviours required of the cats were within their behavioural repertoire.
a. True
b. False
b. False
the behaviours required of the cats were not within their behavioural repertoire so that Thorndike could see how behavioural responses developed with practice
During trial and error, the unsuccessful behaviours drop off and the behaviour that precedes the opening of the door becomes more frequent.
a. True
b. False
a. True
In the puzzle box experiment, learning was demonstrated when each time the cat was placed into the box __________ would occur
the behaviour triggering the opening of the door
Thorndike’s Law of Effect was based on the ________ experiment
Cat Puzzle Box
According to the Law of Effect, responses accompanied or closely followed by satisfaction will be _________ while responses followed by discomfort will be __________
a. strongly connected to the situation; weakly connected to the situation
b. weakly connected to the situation; strongly connected to the situation
a. strongly connected to the situation; weakly connected to the situation
The law of effect is a statement of __________
the principle of reinforcement
In instrumental conditioning, the escape response is controlled by _________
the environment at the present time
Once an association is formed between the behaviour and the stimulus, the stimulus then becomes the ______
a. conditioned stimulus
b. unconditioned stimulus
c. discriminative stimulus
d. instrumental stimulus
c. discriminative stimulus
Once an association is formed between the behaviour and the stimulus, the response which produced the reinforcement is called the ______
a. conditioned response
b. unconditioned response
c. discriminative response
d. instrumental response
d. instrumental response
The ________ stimulus signals when and where reinforcement is available.
a. conditioned stimulus
b. unconditioned stimulus
c. discriminative stimulus
d. instrumental stimulus
c. discriminative stimulus
According to instrumental conditioning, ______ is the formation of an association between a stimulus and a response which is strengthened by the reinforcement
learning
Thorndike believed learning to be mechanistic, without conscious thought or reasoning.
a. True
b. False
a. True
an opposing view to Thorndike’s view of learning was ______
intellectual reasoning or insight
_______ implies that an animal would suddenly know what is required behaviourally
insight (intellectual reasoning)
Learning occurs through trial and error and is a gradual process.
a. True
b. False
a. True
Skinner started discrete experimental trials with precise control over the ________
a. discriminative stimulus
b. the reinforcers
c. the recording of responses
d. all of the above
d. all of the above
_______ designed a mechanism where if the animal pressed a lever/bar, a pellet of food will be released into the chamber
a. Pavlov
b. Thorndike
c. Watson
d. Skinner
d. Skinner
_________ indicated that the animal’s response operates within the environment to produce a certain outcome
operant response
________ is a subclass of instrumental conditioning
operant conditioning
_______ conditioning involves the use of separate subjects [animals] per trial
instrumental
Skinner’s approach took a functionalist perspective which removed inferences about unobservable concepts.
a. True
b. False
a. True
A ________ is anything that can be detected by the animal, and is the first part of the chain that triggers the response and consequence
stimulus
Discriminative stimuli can be so important that the operant response becomes ______
a. automatic
b. mechanic
c. stagnant
d. extinct
a. automatic
An example of a ______ is when well-trained rats familiar with a maze ran through an unexpected pile of food while displaying the operant behaviour.
habit slip
The _____ effect was demonstrated where pigeons were trained to peck at a lighted disk to receive food [grain]. In the cage was an empty food container. After the pigeons were trained, the food container was filled with grain. The pigeons occasionally ate the grain from the container, but the main source of food was from pecking responses
Protestant ethic
Instrumental conditioning requires a response that is specifically a pattern of motor responses.
a. True
b. False
b. False
Instrumental conditioning does not require the response to be a specific pattern of motor responses
A response is defined by its ______
a. magnitude
b. strength
c. effect on the environment
d. all of the above
c. effect on the environment
Complex behaviours can be trained through the process of _____
shaping
Rewarding successive approximations of the desired behaviours is a part of the ______ process
shaping
A process where one is rewarded to form an association, followed by an immediate rule change, is called ______
shaping
the _______ method teaches animals one step of the chain at a time
a. shaping
b. conditioning
c. chaining
d. instrumental learning
c. chaining
Skinner was able to teach a rat to pull a string which releases a marble, then to pick up the marble with its forepaws and carry it over to a tube and then to drop the marble in the tube via _______
a. shaping
b. conditioning
c. chaining
d. instrumental learning
c. chaining
a ______ is a consequence that increases the likelihood of the behaviour it follows occurring in the future
reinforcer
Examples of ______ include food, sleep, and sex
a. reinforcers
b. primary reinforcers
c. secondary reinforcers
d. physiological needs
b. primary reinforcers
_______ proposes that we have an innate biological need to obtain primary reinforcers
drive reduction theory
_________ have no intrinsic value unless paired with primary reinforcers, so they can be traded for a primary reinforcer
a. reinforcers
b. primary reinforcers
c. secondary reinforcers
d. physiological needs
c. secondary reinforcers
_______ have been successful in token economies
a. reinforcers
b. primary reinforcers
c. secondary reinforcers
d. physiological needs
c. secondary reinforcers
_______ reinforcement has a powerful effect on human beings, e.g., praise, attention, physical contact
a. reinforcers
b. primary
c. secondary
d. social
d. social
Social reinforcers have a powerful effect because:
1. they are primary reinforcers
2. they are secondary reinforcers that can be paired with a primary reinforcer
a. 1 only
b. 2 only
c. 1 and 2
d. none of the above
c. 1 and 2
Which of the following is not an advantage of social reinforcers:
a. they can be given immediately
b. they do not disrupt ongoing behaviours
c. attention cannot be given to a wide range of behaviours
d. attention can be given to a wide range of behaviours
c. attention cannot be given to a wide range of behaviours
the term ______ is an oversimplification of the actual reinforcement used to train instrumental learning
reward
________ is defined by the presence of a response-reinforcement contingency, where the contingency is a rule
positive reinforcement
The nature of the reward rule relates ________ to ________
an instrumental behaviour to a particular outcome (the positive reinforcer)
_______ are important to defining instrumental conditioning
contingencies - a reinforcer is contingent on the occurrence of a response
When training instrumental learning, a _______ may be used to assure that the responding is due to the contingency and is not incidental
control condition
Which of the following are the dimensions of reinforcement:
a. quantity and quality of reinforcement
b. positive and negative contrast
c. drive and schedules
d. all of the above
d. all of the above
the reinforcement dimension _______ states that larger rewards produce better performance
quantity of reinforcement
the reinforcement dimension _______ states that a better quality reinforcement improves performance
quality of reinforcement
the reinforcement dimension _______ states that the effectiveness of a current reward is influenced by experience with previous rewards that differed in amount and quality
contrast effects
a _____ is a motivational state which indicates a desire for a reinforcer and can increase responding for reinforcers that are relevant to it
drive
When the reinforcer is withheld, it _______ the drive, and results in similar changes to the behaviour necessary to fulfil the drive
increases
Habit strength is determined by _______
the number of training trials reinforced
Motivation is to ________ as Incentive is to _______
drive; reinforcer
The degree of responding is based on the combination of all except:
a. drive
b. habit strength
c. contingency
d. incentive
c. contingency
Reinforcements need to be given each time the behaviour occurs.
a. True
b. False
b. False
Reinforcements do not have to be given each time the behaviour occurs, but can instead be done on a schedule
A schedule of reinforcement refers to the specific contingency between which of the following:
a. timing and response frequency
b. timing and reward delivery
c. response frequency and reward delivery
d. timing, response frequency, and reward delivery
d. timing, response frequency, and reward delivery
__________ reinforcement pertains to reinforcement each time the behaviour occurs
a. schedules of
b. continuous
c. partial
d. all of the above
b. continuous
__________ reinforcement pertains to reinforcement based on some percentage of the responses
a. schedules of
b. continuous
c. partial
d. all of the above
c. partial
a fixed ratio schedule results in _______
a. a fixed pattern of responses
b. a stable pattern of responses
c. an infrequent pattern of responses that increases closer to the time of receiving reinforcement
d. none of the above
a. a fixed pattern of responses
a fixed interval schedule results in _______
a. a fixed pattern of responses
b. a stable pattern of responses
c. an infrequent pattern of responses that increases closer to the time of receiving reinforcement
d. none of the above
c. an infrequent pattern of responses that increases closer to the time of receiving reinforcement
a variable ratio schedule results in _______
a. a fixed pattern of responses
b. a stable pattern of responses
c. an infrequent pattern of responses that increases closer to the time of receiving reinforcement
d. none of the above
b. a stable pattern of responses
a variable interval schedule results in _______
a. a fixed pattern of responses
b. a stable pattern of responses
c. an infrequent pattern of responses that increases closer to the time of receiving reinforcement
d. none of the above
b. a stable pattern of responses
reinforcement given after a fixed number of responses pertains to a _______ schedule of reinforcement
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval
a. fixed ratio
reinforcement given after a variable number of responses pertains to a _______ schedule of reinforcement
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval
c. variable ratio
reinforcement given after a fixed time period pertains to a _______ schedule of reinforcement
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval
b. fixed interval
reinforcement given after a variable time period pertains to a _______ schedule of reinforcement
a. fixed ratio
b. fixed interval
c. variable ratio
d. variable interval
d. variable interval
When reinforcement is delayed, ____ may not occur because of other behaviours and forgetfulness
learning
_____ can reduce the risk of learning not occurring during reinforcement delays
persistent memory over the delay interval
According to the Premack Principle, the opportunity to perform a highly frequent behaviour can ________
reinforce a less frequent one
The Premack principle was developed from observing animals that spent hours doing something for which there was no reinforcement.
a. True
b. False
b. False
The Premack principle was developed from observing animals and humans that spent hours doing something for which there was no reinforcement.
In Premack’s rat experiment, the rats spent ____ seconds on the wheel and ____ seconds drinking the water.
a. 100;50
b. 50;100
c. 240;20
d. 250;50
d. 250;50
In Premack’s rat experiment, restricting access to the wheel, making it available only if a certain amount of water was consumed, resulted in _______
a. less water consumption, more wheel time
b. more water consumption, more wheel time
c. more water consumption, less wheel time
d. less water consumption, less wheel time
c. more water consumption, less wheel time
An extension of the Premack principle states that it is not which behaviour is the preferred or the most frequent, but which behaviour is restricted that leads to response deprivation.
a. True
b. False
a. True
By restricting the ability to perform any other response, you can make the opportunity to perform a specific response ______
a. extinct
b. reinforcing
c. delayed
d. conditioned
b. reinforcing
The response component of the 3-way association in instrumental conditioning occurs in the _______
a. basal ganglia
b. motor cortex
c. prefrontal cortex
d. somatosensory cortex
b. motor cortex
The motor cortex receives input from the _______
a. sensory cortex
b. somatosensory cortex
c. prefrontal cortex
d. all of the above
d. all of the above
the motor cortex sends signals to the ______
a. motor neurons
b. inter neurons
c. sensory neurons
d. all of the above
a. motor neurons
Information from the sensory cortex can also travel indirectly to the motor cortex via the ______
a. basal ganglia
b. frontal cortex
c. prefrontal cortex
d. somatosensory cortex
a. basal ganglia
The _________ receives highly processed information from the sensory cortex and projects it to the motor cortex and stores stimulus-response associations.
a. basal ganglia
b. frontal cortex
c. prefrontal cortex
d. somatosensory cortex
a. basal ganglia
The presence of the lever activates the _____ to send signals to the ______ which triggers the pressing of the lever which is the response.
a. visual; motor
b. motor; visual
c. sensory; visual
d. sensory; motor
a. visual; motor
Learning occurs when the link between the _______ cortex and the ________ system is altered in a way that changes the probability that future encounters with the stimulus will evoke the same response.
a. visual; motor
b. motor; visual
c. sensory; visual
d. sensory; motor
a. visual; motor
Food only activates the reinforcement system of the brain when it tastes good.
a. True
b. False
a. True
When stimulus inputs and the reinforcement system are active at the same time, the link between the stimulus input and the motor neurons is ________
a. weakened
b. strengthened
c. multiplied
d. severed
b. strengthened
The _________ was previously referred to as the pleasure centre of the brain
Ventral Tegmental Area (VTA)
The VTA produces pleasure in anticipation of the reinforcement.
a. True
b. False
b. False
the VTA produces excitement in anticipation of the reinforcement and not pleasure
The neurons of the VTA release _______ into the nucleus accumbens
dopamine
When dopamine is blocked, rats ________ to press the lever to trigger an electrical stimulation of the VTA.
a. continue
b. do not continue
b. do not continue
The neurotransmitter _______ is a part of the brain’s reinforcement system
dopamine
In human studies using PET and fMRI, there was greater activation in dopamine target sites when participants were presented with _________ reinforcers
a. primary
b. secondary
b. secondary