BLP
40 Terms

1

Q: What is an example of negative reinforcement?

A: Removing chores as a reward for getting good grades.

2

Q: What type of reinforcement is given after varying amounts of time?

A: Variable interval.

3

Q: In operant conditioning, what does the term 'discriminative stimulus' refer to?

A: A stimulus that signals the availability of reinforcement.

4

Q: What does the term 'fixed ratio' refer to in reinforcement schedules?

A: A schedule that delivers reinforcement after a set number of responses.

5

Q: How does instant gratification relate to operant conditioning?

A: It can lead to high rates of responding due to immediate reinforcement.

6

Q: What is an example of operant extinction?

A: A child stops whining after being ignored.

7

Q: Which type of reinforcement schedule typically leads to choppy or up-and-down response rates?

A: Fixed interval.

8

Q: What is stimulus generalization in operant conditioning?

A: Responding similarly to similar stimuli.

9

Q: What is an example of a variable interval reinforcement schedule?

A: Receiving a text message at unpredictable times.

10

Q: What concept suggests that the more a behavior is reinforced, the more likely it is to occur?

A: The Law of Effect.

11

Q: What is the primary distinction between operant and classical conditioning?

A: Operant conditioning associates behavior with consequences, while classical conditioning associates stimuli.

12

Q: According to Thorndike’s Law of Effect, behaviors followed by positive outcomes are:

A: More likely to recur.

13

Q: Which reinforcement schedule is known for producing a high, steady rate of response?

A: Variable ratio.

14

Q: In operant conditioning, “shaping” refers to:

A: Reinforcing successive approximations toward the target behavior.

15

Q: Which of the following is an example of negative reinforcement?

A: Removing restrictions when a teen improves grades.

16

Q: A rat in an experiment stops pressing a lever after reinforcement is no longer provided. This is an example of:

A: Extinction.

17

Q: Positive punishment involves:

A: Adding something unpleasant to decrease behavior.

18

Q: The type of learning in which a response is strengthened or weakened by its consequences is known as:

A: Operant conditioning.

19

Q: What type of reinforcement schedule produces the highest resistance to extinction?

A: Variable ratio.

20

Q: Which theory explains that behaviors are reinforced by drive reduction?

A: Hull's drive-reduction theory.

21

Q: In the context of instrumental conditioning, what does the term “reinforcer” refer to?

A: Any consequence that increases behavior.

22

Q: The Premack Principle states that:

A: A more probable behavior can reinforce a less probable behavior.

23

Q: What is the purpose of “discrimination training” in stimulus control?

A: To help organisms distinguish between different stimuli.

24

Q: Which is an example of positive punishment?

A: Extra chores for not completing homework.

25

Q: Herrnstein’s Matching Law predicts that the relative rate of responding will match:

A: The relative frequency of reinforcement.
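The Matching Law's prediction can be written as a simple ratio equation; the reinforcement rates in the comment are made-up illustrative numbers, not from the card set:

```latex
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
% B_1, B_2: response rates on options 1 and 2
% R_1, R_2: reinforcement rates earned on options 1 and 2
% Example: if option 1 yields 30 reinforcers per hour and option 2 yields 10,
% the predicted share of responding on option 1 is 30/(30+10) = 0.75.
```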

26

Q: A reinforcement schedule that provides a reward after a fixed amount of time has passed is called:

A: Fixed interval.

27

Q: Which of the following is an example of instinctive drift?

A: A raccoon washing food rather than eating it when trained to pick up coins.

28

Q: According to Amsel’s Frustration Theory, the persistence of behavior under partial reinforcement is due to:

A: The association of frustration with reinforcement.

29

Q: Which reinforcement schedule would produce 'scalloped' response patterns?

A: Fixed interval.

30

Q: A pigeon responds similarly to two different tones. This is an example of:

A: Generalization.

31

Q: What is an example of a fixed ratio reinforcement schedule?

A: Giving a reward after every fifth response.
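The fixed-ratio example above (a reward after every fifth response, FR-5) can be contrasted with a variable-ratio schedule in a tiny simulation; this is a minimal sketch with illustrative function names and numbers, not anything from the card set:

```python
import random

def fixed_ratio(responses, n=5):
    # FR-n schedule: one reinforcer after every n-th response,
    # so the count of rewards is fully predictable.
    return responses // n

def variable_ratio(responses, mean=5, seed=0):
    # VR-mean schedule: the number of responses required for the
    # next reinforcer varies (here uniform on 1..2*mean-1, averaging mean).
    rng = random.Random(seed)
    rewards = 0
    needed = rng.randint(1, 2 * mean - 1)
    for _ in range(responses):
        needed -= 1
        if needed == 0:
            rewards += 1
            needed = rng.randint(1, 2 * mean - 1)
    return rewards

# FR-5 is perfectly predictable: 100 responses always earn exactly 20 rewards.
print(fixed_ratio(100))
# VR-5 averages about 20 rewards per 100 responses, but any single
# response might pay off -- the unpredictability behind the high,
# extinction-resistant responding the cards describe.
print(variable_ratio(100))
```

The contrast makes the flashcard point concrete: under FR the organism can "count down" to the next reward (producing post-reinforcement pauses), while under VR every response carries the same chance of payoff.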

32

Q: Define extinction in operant conditioning.

A: The reduction of a behavior when reinforcement is no longer provided.

33

Q: Which type of reinforcement schedule produces the quickest learning?

A: Continuous reinforcement.

34

Q: What does the Matching Law predict?

A: Relative response rates will match relative reinforcement rates.

35

Q: In shaping, what is the term for reinforcing steps that are increasingly close to the desired behavior?

A: Successive approximations.

36

Q: Which reinforcement schedule is often associated with high response rates and resistance to extinction?

A: Variable ratio.

37

Q: Describe the partial reinforcement extinction effect (PREE).

A: Persistence of behavior is greater when it has been partially reinforced.

38

Q: A pigeon trained to peck at a red light also pecks at a similar orange light. This demonstrates:

A: Stimulus generalization.

39

Q: Which type of reinforcement involves providing a reward after a set period of time, regardless of response rate?

A: Fixed interval.

40

Q: Which term refers to when a behavior resurfaces after it was thought to be extinct?

A: Spontaneous recovery.