Psych 381 - Chapter 6 Reinforcement & Choice

35 Terms

1
New cards

Reinforcers

Primary and secondary, and intrinsic and extrinsic

2
New cards

Intrinsic reinforcement

Reinforcement provided by the mere act of performing the behavior; the performance of the behavior is inherently reinforcing.

Reinforcers that fulfill our internal desires (ex. running for its own sake, reading because you want to learn something new)

3
New cards

Extrinsic reinforcement

The reinforcement provided by a consequence that is external to the behavior, that is, an extrinsic reinforcer.

You do it because it gets you something external (ex. working for money, studying for good grades)

4
New cards

Primary Reinforcers

Unconditioned stimuli (USs)

5
New cards

Secondary Reinforcers

Conditioned stimuli (CSs)

Including tokens (similar to money)

6
New cards

Most reinforcers for humans are _____

secondary

7
New cards

Aversives ____ reinforce behaviour

can

ex. the Morris water maze

8
New cards

Morris water maze

A rat is placed in a tank of water that has been coloured so it is not transparent. Rats are good swimmers but dislike swimming, so the water is an aversive stimulus.

But escaping the water reinforces finding the hidden platform.

Shows that being put in a situation you don't like can still reinforce behaviour.

9
New cards

Continuous Reinforcement (CRF)

Behaviour is reinforced every time it occurs

10
New cards

Intermittent Reinforcement

Only some of the responses made are followed by reinforcement

11
New cards

Fewer reinforcers are needed for ______ schedules

intermittent

12
New cards

Why is intermittent reinforcement more resistant to extinction?

You learn to keep performing the behaviour even when some responses go unreinforced, because you expect to be reinforced eventually; the start of extinction is therefore harder to distinguish from the ordinary gaps in the schedule.

13
New cards

Cumulative Record

Based on the old cumulative recorder device (1957). Every time the animal made a response, the device would move the pen.

• Constant paper output, pen jumps up with each response

Flat lines = no responses

Increases = responses made (the steeper the slope, the higher the response rate)

Tiny 45-degree lines = a reinforcer was given
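Below is a minimal sketch (my own illustration, not from the course) of how a cumulative record is built from a list of response times: the count steps up by one for each response, so steeper stretches mean faster responding, and reinforcer deliveries are marked separately, much like the 45-degree hash marks on the paper record. The times used are made up.

```python
# Hypothetical response and reinforcer times (seconds); illustrative only.
response_times = [1, 2, 3, 5, 8, 9, 10, 15, 16, 17]
reinforcer_times = {3, 10, 17}

def cumulative_record(responses):
    """Return (time, cumulative response count) pairs, like the pen's height over time."""
    record = []
    count = 0
    for t in responses:
        count += 1            # the pen steps up one unit per response
        record.append((t, count))
    return record

for t, count in cumulative_record(response_times):
    mark = "  <- reinforcer" if t in reinforcer_times else ""
    print(f"t={t:>2}s  cumulative responses={count}{mark}")
```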

14
New cards

Ratio Schedules

Reinforcer is given after the animal makes the required NUMBER of responses

Not every response is rewarded; reinforcement comes only after the required number of responses.

15
New cards

If required number for ratio schedules is 1, then it's called ___

Continuous Reinforcement (CRF)

16
New cards

2 types of Ratio Schedules

Fixed Ratio (FR) and Variable Ratio (VR)

17
New cards

Fixed Ratio (FR)

Fixed ratio between the number of responses made and reinforcers delivered (e.g., FR10)

Key elements: postreinforcement pause, ratio run, and ratio strain
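As a rough sketch of the fixed-ratio rule (my own illustration, not from the text): a counter tracks responses since the last reinforcer, and every Nth response is reinforced, e.g., every 10th response on FR 10.

```python
# Minimal fixed-ratio (FR) schedule: reinforce every n-th response.
class FixedRatio:
    def __init__(self, n):
        self.n = n           # responses required per reinforcer (e.g., 10 for FR 10)
        self.count = 0       # responses since the last reinforcer

    def respond(self):
        """Register one response; return True if it earns a reinforcer."""
        self.count += 1
        if self.count >= self.n:
            self.count = 0   # the counter resets after each reinforcer
            return True
        return False

fr10 = FixedRatio(10)
outcomes = [fr10.respond() for _ in range(30)]
print(outcomes.count(True))  # 3 reinforcers for 30 responses
```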

18
New cards

Postreinforcement pause

Period of time right after a reinforcer is delivered during which the animal doesn't respond

Also called procrastination in this context.

19
New cards

Ratio Run

The steady, high-rate run of responding that completes the ratio requirement after the postreinforcement pause; it appears as a steady slope on the cumulative record.

20
New cards

Ratio Strain

A breakdown in responding (long pauses) that occurs when the ratio requirement is set too high or increased too quickly.

21
New cards

Real life FR examples

A paper boy being paid after delivering every 15 papers.

Sweatshop work - paid by the number of pieces of clothing made.

Dialling a phone number back when you always had to dial it yourself: after 10 responses (digits), you get the reward of being able to talk to someone.

22
New cards

Variable Ratio (VR)

Different number of responses are required for the delivery of each reinforcer

23
New cards

VR value is equal to the ____ number of responses made to receive a reinforcer (ex. on a VR 5 schedule, the average has to be 5).

average
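A minimal sketch of the variable-ratio rule, assuming the requirement for each reinforcer is drawn at random so that it averages out to the schedule value (the uniform draw below is just one illustrative choice, not a standard from the course):

```python
import random

# Minimal variable-ratio (VR) schedule: each reinforcer needs a different
# number of responses, averaging the schedule value (e.g., 5 for VR 5).
class VariableRatio:
    def __init__(self, mean_ratio):
        self.mean_ratio = mean_ratio
        self.required = self._draw_requirement()
        self.count = 0

    def _draw_requirement(self):
        # Varies between 1 and 2*mean - 1, so the average requirement is mean_ratio.
        return random.randint(1, 2 * self.mean_ratio - 1)

    def respond(self):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = self._draw_requirement()  # new requirement each time
            return True
        return False

vr5 = VariableRatio(5)
print(sum(vr5.respond() for _ in range(1000)))  # roughly 1000 / 5 = 200 reinforcers
```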

24
New cards

Fixed ratio/interval graphs look like _____, Variable ratio/interval graphs look like ______

Stairs; a straighter upward line

The VI record shows more reinforcer marks and a slightly flatter slope.

25
New cards

Real life VR examples

Sports - being on a streak.

Gambling - you keep playing because you know that at some point you'll be rewarded.

Sales - not every pitch ends in a sale, but eventually one does, so you keep pitching.

Scrolling on social media - keep scrolling even though you're bored because you know you'll find something.

26
New cards

Interval schedules

Responses are only reinforced if the response occurs after a certain TIME INTERVAL.

27
New cards

Fixed Interval (FI)

A response is reinforced only if it occurs after a set amount of time (responses during the interval do NOT matter)

Key element: Fixed interval scallop

28
New cards

Fixed interval scallop

A gradual increase in the rate of responding, reaching a high rate just before reinforcement becomes available. No responding occurs for some time after reinforcement.

29
New cards

Real life FI schedules

Laundry - no matter how often you check your clothes, they're only ready when the time is up.

Studying - shows the fixed interval scallop, because you study more and more the closer the exam gets.

30
New cards

Variable interval (VI)

Responses are reinforced if they occur after a variable interval of time

Based on an average (ex. an average of 5 mins).
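A minimal sketch of both interval rules (my own illustration): reinforcement only becomes available once the interval since the last reinforcer has elapsed, responses before that point earn nothing, and the interval is constant for FI but varies around an average for VI (the uniform draw is an assumption made purely for illustration).

```python
import random

# Minimal interval schedule: the first response AFTER the interval has
# elapsed is reinforced; responses during the interval do not matter.
class IntervalSchedule:
    def __init__(self, interval, variable=False):
        self.interval = interval          # seconds (the average, if variable)
        self.variable = variable          # False = FI, True = VI
        self.available_at = self._next_interval()

    def _next_interval(self):
        if self.variable:
            # VI: a varying interval whose mean is self.interval (illustrative choice)
            return random.uniform(0, 2 * self.interval)
        return self.interval              # FI: the same interval every time

    def respond(self, t):
        """Register a response at time t (seconds); return True if reinforced."""
        if t >= self.available_at:
            self.available_at = t + self._next_interval()
            return True
        return False

fi30 = IntervalSchedule(30)               # FI 30 s
print([fi30.respond(t) for t in (5, 20, 29, 31, 40, 65)])
# -> [False, False, False, True, False, True]
```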

31
New cards

Real life VI's

Waiting for the rain

Pop quizzes

Waiting for the bus

32
New cards

Ratio and Interval Schedules Compared: Reynolds (1975)

• Compared rates of key pecking of pigeons on VR and VI schedules

• Opportunities for reinforcement were made identical for each bird

• The VI bird could receive reward when the VR bird was within one response of its reward. This was to make them comparable.

The cumulative record curve was steeper for the VR bird (a higher rate of responding).

With equivalent rate of reinforcement, variable ratio schedules produce a higher rate of responding than variable interval schedules
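Here is a rough back-of-the-envelope sketch (my own numbers, not Reynolds' yoked procedure) of why ratio schedules pay off for responding fast while interval schedules do not: on a ratio schedule, reinforcers earned grow in direct proportion to responses made, whereas on an interval schedule they are capped by elapsed time.

```python
SESSION_SECONDS = 600  # a hypothetical 10-minute session

def ratio_reinforcers(responses_per_min, ratio=10):
    # FR/VR: one reinforcer per `ratio` responses, no matter how fast they come.
    total_responses = responses_per_min * SESSION_SECONDS / 60
    return total_responses / ratio

def interval_reinforcers(responses_per_min, interval=30):
    # FI/VI: at most one reinforcer per `interval` seconds, assuming the animal
    # responds at least once whenever reinforcement becomes available.
    max_by_time = SESSION_SECONDS / interval
    total_responses = responses_per_min * SESSION_SECONDS / 60
    return min(total_responses, max_by_time)

for rate in (10, 30, 60, 120):
    print(f"{rate:>3} resp/min -> ratio: {ratio_reinforcers(rate):5.1f}  "
          f"interval: {interval_reinforcers(rate):5.1f}")
# The ratio payoff keeps climbing with response rate; the interval payoff
# plateaus at 20 reinforcers, so only ratio schedules reward faster responding.
```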

33
New cards

_____ schedules produce steadier responding compared to _____

Variable, Fixed

34
New cards

_____ schedules produce higher rates of responding than ______

Ratio, Interval

35
New cards

How to tell which schedule a graph shows

Higher rates of responding mean a Ratio schedule;

then Fixed schedules have more "stairs", and Variable schedules give a straighter upward line.