Psych of Learning - Test 2: Ch. 6-9

73 Terms

1
New cards

Instrumental Behavior

occurs because it was previously effective in producing certain consequences (aka operant behavior)

2
New cards

Operant conditioning

modifying behavior through changing the consequences of that behavior

3
New cards

Describe how operant behavior functions through the SD → R → SR model.

SD

  • Discriminative stimulus that precedes the response and signals that a certain consequence is now available (ex. a tone signals that a lever press will now produce food)

R

  • Response that produces a certain consequence (lever pressing produces food)

SR

  • Consequence that serves to increase or decrease the probability of the response that preceded it (the consequence of a food pellet increases the rat’s tendency to press the lever again)

4
New cards

What is the difference between respondent (classical) conditioning and operant conditioning?

Respondent

  • elicited by antecedent stimuli (food elicits salivation)

Operant

  • response is emitted by organism and maintained by the consequences (the rat emits lever presses which produce food)

5
New cards

What is the Law of Effect?

if response (R) in presence of stimulus (S) is followed by a satisfying event, association between S and R becomes strengthened

if the response is followed by an annoying event, the S-R association is weakened

6
New cards

A punisher ——- behavior. A reinforcer ——— behavior.

weakens, strengthens

7
New cards

What is the three-term contingency (ABC)

  • Antecedent - salient stimulus in the environment (thing or event that exists before or logically precedes another)

  • Behavior - response to antecedent

  • Consequence - addition or removal of stimulus from environment

8
New cards

What are examples of unconditioned reinforcers?

Biologically determined

  • Food

  • Water

  • Oxygen

  • Escape from heat or cold

9
New cards

What is a conditioned reinforcer?

  • A previously neutral stimulus that has been repeatedly paired with an established reinforcer

  • Will then function as a reinforcer itself

  • Ex:

    • Parents’ smile, tone of voice, attention, praise

    • Grades, positive evaluations

    • Money

10
New cards

An aversive stimulus is also called a ————.

punisher

11
New cards

What are the four types of contingencies?

  • Positive reinforcement

  • Negative reinforcement

  • Positive punishment

  • Negative punishment

12
New cards

Difference between an appetitive stimulus and an aversive stimulus

Appetitive

  • pleasant event

Aversive

  • unpleasant stimulus

13
New cards

Free Operant Procedures

Allow the animal to repeat the instrumental response over and over, without constraint, and without being taken out of the apparatus until the end of an experimental session.

  • animal is allowed to freely respond or not

  • animal is not forced to respond

14
New cards

What is a schedule of reinforcement?

A program or rule that determines which occurrences of a behavior are followed by a reinforcer

15
New cards

What is the difference between a dense (thick) and lean (thin) schedule of reinforcement?

Dense (thick)

  • reinforcement is delivered frequently (ex. for every response there’s a reinforcement)

Lean (thin)

  • reinforcement delivered rarely (ex. every 100th response gets a reinforcement)

16
New cards

Continuous reinforcement

Reinforcement is delivered after every instance of a behavior

  • results in sharp increase in frequency of behavior over time

17
New cards

Partial Reinforcement

Reinforcement is not delivered after every occurrence of a behavior (aka intermittent reinforcement)

18
New cards

What are the four partial reinforcement schedules?

  • Fixed Ratio (FR)

  • Fixed Interval (FI)

  • Variable Ratio (VR)

  • Variable Interval (VI)

19
New cards

Shaping

reinforcing successive approximations of a target behavior

  • ex. clapping as the subject gets closer to desired location

20
New cards

Establishing Operation (EO)

an antecedent event that makes a reinforcer more potent

  • makes a behavior that produces that reinforcer more likely to occur at that time

    • e.g. deprivation

21
New cards

Abolishing Operation (AO)

an antecedent event that makes a reinforcer less potent

  • makes a behavior that produces that reinforcer less likely to occur at that time

    • e.g. satiation

22
New cards

Natural vs. Contrived Consequences

Natural

  • typically provided for a certain behavior; expected consequence of the behavior within that setting (ex. money is a natural consequence of selling merchandise)

Contrived

  • reinforcers that have been deliberately arranged to modify a behavior; not a typical consequence of that behavior within that setting (ex. giving candy to teach potty training, which is not a directly related reinforcer)

23
New cards

Ratio Schedule

Reinforcement is contingent upon the number of responses performed

  • ex. FR 5: Rat must press a lever five times to obtain a food pellet

24
New cards

What is a fixed ratio schedule?

Reinforcement is delivered after n instances of a behavior

  • FR 10 - the 10th response

  • FR 100 - the 100th response

25
New cards

Post-Reinforcement Pause

Zero rate of responding that typically occurs just after reinforcement on a fixed ratio schedule

26
New cards

Ratio Run

high and steady rate of responding that completes each ratio requirement

27
New cards

Ratio Strain

if the ratio requirement is suddenly increased, the animal is likely to pause periodically before completing the ratio requirement

28
New cards

Habituation

reduction in the magnitude of a response after repeated presentations of an unconditioned stimulus

  • classical conditioning

  • multiple presentations of US until they get used to it

29
New cards

Extinction

omitting the US or reinforcer

30
New cards

Forgetting

decline in responding that occurs because of the passage of time; does not require nonreinforcement of the CS or instrumental response

  • Memory, cognitive

31
New cards

Classical Extinction

  • presentation of CS without the US

  • results in decreased CR in response to CS

32
New cards

Operant conditioning extinction

  • A previously reinforced behavior no longer produces reinforcement

  • involves withholding consequences

33
New cards

Side effects of Operant Extinction

  • Aggression

  • Extinction burst

    • large spike in responding before it rapidly decreases

  • Variability

  • Resurgence

34
New cards

Extinction burst

temporary increase in frequency and intensity of responding when extinction first implemented (try harder to obtain the reinforcer)

35
New cards

Increase in Variability

increase in variability of a behavior (try different ways of obtaining the reinforcer)

  • trying out different behaviors to get the same reinforcer

36
New cards

Resurgence

reappearance during extinction of other behaviors that had once been effective in obtaining reinforcement (trying an old way of obtaining the reinforcer)

37
New cards

Factors Affecting Resistance to Extinction

  • Partial Reinforcement Effect

  • History of Reinforcement

  • Magnitude of the Reinforcer

  • Degree of Deprivation

  • Previous Experience with Extinction

  • Distinctive Signal for Extinction

38
New cards

Partial Reinforcement Effect

Behavior maintained on a partial reinforcement schedule will extinguish more slowly than behavior maintained on a continuous schedule

39
New cards

History of Reinforcement

A longer history of reinforcement makes the behavior more resistant to extinction

40
New cards

Magnitude of the Reinforcer

Large reinforcers often result in greater resistance to extinction

41
New cards

Variable Ratio Schedule (VR)

Reinforcement is delivered after a variable number of responses; the schedule value specifies the average number required to obtain successive reinforcers

  • Ex. VR 10 - Pigeon must make 10 responses to earn first reinforcer, 13 for second, 7 for third, etc.

    • Average = 10 responses per reinforcer

42
New cards

——— schedules = fairly steady rate

  • VR

  • FR

  • VI

  • FI

VR

43
New cards

Predictable pauses in the rate of responding are less likely with ——- than —— schedules.

VR, FR

44
New cards

  • Results in extremely high and steady responding

  • Easiest way to teach rapid responding

Variable Ratio Responding

45
New cards

What type of reinforcement usually produces a high rate of response with little or no post-reinforcement pause?

  • often involved in gambling addiction

Variable Ratio (VR)

46
New cards

Interval Schedule

Reinforcement depends on first response after a certain amount of time has passed

47
New cards

Fixed Interval

  • Reinforcement is delivered after the first response following a set period of time

    • Ex. FI 10 - the first bar press after 10 minutes has passed results in a food pellet

48
New cards

Describe what the Fixed-Interval scallop shows.

  • Increase in response rate evident as acceleration in cumulative record toward end of each fixed interval

    • post-reinforcement pause followed by a gradually increasing rate of response

49
New cards

Variable Interval Schedule

Reinforcement is delivered after the first response following a random amount of time

  • Subject has to respond to obtain the reinforcer after an unpredictable set-up time

  • ex. VI 10 - on average, the first bar press after 10 minutes has passed results in a food pellet.

    • Would vary between 1 and 20 minutes

50
New cards

———— schedules maintain steady and stable rates of responding without regular pauses.

  • VR

  • VI

  • FR

  • FI

VI

51
New cards

Motivates the most vigorous instrumental behavior, due to reinforcement of short inter-response times and/or the relationship between response rates and reinforcement rates

  • VR

  • VI

  • FR

  • FI

VR

52
New cards

Interval between successive responses

Inter-Response Time (IRT)

53
New cards

Conjunctive Schedule

Requirements of two or more simple schedules must be met before a reinforcer is delivered

  • e.g. wages for work are contingent upon putting in the required hours and doing the work that needs to be done during that time

54
New cards

Adjusting Schedule

Response requirement changes as a function of the organism’s performance while responding for the previous reinforcer

  • e.g. increasing the difficulty of material being presented in a chemistry class as students learn more

55
New cards

Drive Reduction Theory

  • Event is reinforcing to the extent that it is associated with reduction in physiological drive

    • E.g. food is a reinforcer for going to the cafeteria because it reduces a hunger drive

56
New cards

Incentive Motivation

Motivation derived from properties of the reinforcer itself rather than from an internal drive state

  • e.g. playing a video game for the fun of it

57
New cards

Premack Principle

high-probability (or frequency) behavior (HPB) can be used to reinforce low-probability behavior (LPB)

  • e.g. First you work, then you play

58
New cards

Does extinction erase what was originally learned?

No

59
New cards

Spontaneous Recovery

responding comes back after a rest period is introduced following extinction training; nothing specific is done during the rest period to produce the recovery

60
New cards

Resurgence

reappearance of extinguished target response when another reinforced response is extinguished

61
New cards

Differential Reinforcement of Other Behavior (DRO)

reinforcement of any behavior other than the target behavior that is being extinguished

  • more effective than simple extinction procedures

  • reduces unwanted side effects of extinction

62
New cards

Functional Communication Training

  • A variant of DRO

  • Assumes that misbehavior often occurs because the person is trying to achieve some type of reinforcer

  • If so, training the person to instead verbally communicate what they want should reduce the amount of misbehavior

63
New cards

Stimulus Control

When operant behavior is under the control of stimulus cues

64
New cards

Difference between escape and avoidance

Escape

  • when response terminates an aversive stimulus

Avoidance

  • When response prevents an aversive stimulus from occurring

65
New cards

Two-Factor (process) Theory of Avoidance

Classical conditioning of fear to the CS + instrumental reinforcement of the avoidance response through fear reduction

66
New cards

What are the problems with Two-Process Theory of Avoidance?

  • Avoidance responses are extremely persistent; why do they not extinguish?

  • Once the rat gets used to the procedure, it seems to show no fear but continues to avoid anyway

67
New cards

Why is avoidance in animals somewhat different from phobias in humans?

Avoidance conditioning in animals usually requires a few conditioning trials, but phobic conditioning in humans requires only a single trial

  • avoidance conditioning in animals may eventually extinguish, but phobic conditioning in humans is extremely resistant to extinction

68
New cards

What are the types of punishment?

  • Intrinsic Punishment

  • Extrinsic Punishment

  • Primary (unconditioned) Punisher

  • Secondary (conditioned) Punisher

  • Generalized (generalized secondary)

69
New cards

Intrinsic Punishment

inherent aspect of the behavior being punished; activity itself is punishing (e.g. smoking makes you feel nauseous)

70
New cards

Extrinsic Punishment

not an inherent aspect of the behavior being punished; simply follows the behavior (e.g. being told “You smoke? Is that ever disgusting!”)

71
New cards

Primary (unconditioned) Punisher

event that is innately punishing (e.g. being jabbed with a needle)

72
New cards

Secondary (conditioned) Punisher

event that is punishing because of past association with other punishers (e.g. going to a doctor who often gives you a needle)

73
New cards

Generalized (generalized secondary) Punisher

event that is punishing because of its past association with many other punishers (e.g. a mean look from someone.)