BLP

40 Terms

1

Q: What is an example of negative reinforcement?

A: Removing chores as a reward for getting good grades.

2

Q: What type of reinforcement is given after varying amounts of time?

A: Variable interval.

3

Q: In operant conditioning, what does the term 'discriminative stimulus' refer to?

A: A stimulus that signals the availability of reinforcement.

4

Q: What does the term 'fixed ratio' refer to in reinforcement schedules?

A: A schedule that delivers reinforcement after a set number of responses.

5

Q: How does instant gratification relate to operant conditioning?

A: It can lead to high rates of responding due to immediate reinforcement.

6

Q: What is an example of operant extinction?

A: A child stops whining after being ignored.

7

Q: Which type of reinforcement schedule typically leads to choppy or up-and-down response rates?

A: Fixed interval.

8

Q: What is stimulus generalization in operant conditioning?

A: Responding similarly to similar stimuli.

9

Q: What is an example of a variable interval reinforcement schedule?

A: Receiving a text message at unpredictable times.

10

Q: What concept suggests that the more a behavior is reinforced, the more likely it is to occur?

A: The Law of Effect.

11

Q: What is the primary distinction between operant and classical conditioning?

A: Operant conditioning associates behavior with consequences, while classical conditioning associates stimuli.

12

Q: According to Thorndike’s Law of Effect, behaviors followed by positive outcomes are:

A: More likely to recur.

13

Q: Which reinforcement schedule is known for producing a high, steady rate of response?

A: Variable ratio.

14

Q: In operant conditioning, “shaping” refers to:

A: Reinforcing successive approximations toward the target behavior.

15

Q: What is another example of negative reinforcement?

A: Removing restrictions when a teen improves grades.

16

Q: A rat in an experiment stops pressing a lever after reinforcement is no longer provided. This is an example of:

A: Extinction.

17

Q: Positive punishment involves:

A: Adding something unpleasant to decrease behavior.

18

Q: The type of learning in which a response is strengthened or weakened by its consequences is known as:

A: Operant conditioning.

19

Q: What type of reinforcement schedule produces the highest resistance to extinction?

A: Variable ratio.

20

Q: Which theory explains that behaviors are reinforced by drive reduction?

A: Drive-reduction theory.

21

Q: In the context of instrumental conditioning, what does the term “reinforcer” refer to?

A: Any consequence that increases behavior.

22

Q: The Premack Principle states that:

A: A more probable behavior can reinforce a less probable behavior.

23

Q: What is the purpose of “discrimination training” in stimulus control?

A: To help organisms distinguish between different stimuli.

24

Q: Which is an example of positive punishment?

A: Extra chores for not completing homework.

25

Q: Herrnstein’s Matching Law predicts that the relative rate of responding will match:

A: The relative frequency of reinforcement.
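
For reference, Herrnstein's Matching Law is commonly written in its strict, two-alternative form (the symbols below are the conventional textbook ones) as

\[
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
\]

where B_1 and B_2 are the response rates on the two alternatives and R_1 and R_2 are the rates of reinforcement earned on them.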

26

Q: A reinforcement schedule that provides a reward after a fixed amount of time has passed is called:

A: Fixed interval.

27

Q: What is an example of instinctive drift?

A: A raccoon washing food rather than eating it when trained to pick up coins.

28

Q: According to Amsel’s Frustration Theory, the persistence of behavior under partial reinforcement is due to:

A: The association of frustration with reinforcement.

29

Q: Which reinforcement schedule would produce 'scalloped' response patterns?

A: Fixed interval.

30

Q: A pigeon responds similarly to two different tones. This is an example of:

A: Generalization.

31

Q: What is an example of a fixed ratio reinforcement schedule?

A: Giving a reward after every fifth response.

32

Q: Define extinction in operant conditioning.

A: The reduction of a behavior when reinforcement is no longer provided.

33

Q: Which type of reinforcement schedule produces the quickest learning?

A: Continuous reinforcement.

34

Q: What does the Matching Law predict?

A: Relative response rates will match relative reinforcement rates.

35

Q: In shaping, what is the term for reinforcing steps that are increasingly close to the desired behavior?

A: Successive approximations.

36

Q: Which reinforcement schedule is often associated with high response rates and resistance to extinction?

A: Variable ratio.

37

Q: Describe the partial reinforcement extinction effect (PREE).

A: Persistence of behavior is greater when it has been partially reinforced.

38

Q: A pigeon trained to peck at a red light also pecks at a similar orange light. This demonstrates:

A: Stimulus generalization.

39

Q: Which type of reinforcement involves providing a reward after a set period of time, regardless of response rate?

A: Fixed interval.

40

Q: Which term refers to a behavior reappearing after it has been extinguished?

A: Spontaneous recovery.
