Radical Behaviorism:
Rejects mentalistic events as explanations of behavior.
Focuses on observable behaviors and their consequences.
Respondent:
Elicited by known stimuli.
Operant:
Emitted by the organism with no known stimulus.
Type S:
Respondent behavior associated with classical conditioning.
Type R:
Operant behavior.
Operant Conditioning:
Responses followed by a reinforcer tend to be repeated.
Reinforcing Stimulus:
Any event that increases the operant rate.
Skinner Box:
Apparatus used to study operant conditioning.
Cumulative Recording:
A record of accumulated responses plotted as a function of time.
Deprivation:
Withholding food for a period; no inferred state such as "drive" is assumed.
Magazine Training:
Training the animal to associate the sound of the food dispenser with food delivery.
Lever Pressing:
Behavior learned through operant conditioning.
Shaping:
Accelerates acquisition of the lever-pressing response by reinforcing successive approximations.
Extinction:
When responses no longer produce the reinforcer, responding declines back to its operant level.
Spontaneous Recovery:
The reappearance of an extinguished operant response after a rest period, without further reinforcement.
Superstitious Behavior:
Responding maintained by noncontingent reinforcers (behavior strengthened by accidental, response-independent reinforcement).
SD (Discriminative Stimulus):
Signals reinforcer availability.
SΔ (S-Delta):
Signals no reinforcer available.
Secondary Reinforcer:
A stimulus that becomes reinforcing by being paired with a primary reinforcer.
Generalized Reinforcer:
A secondary reinforcer paired with multiple primary reinforcers (e.g., money).
Chaining:
Training works backward from the primary reinforcer.
Each response in the chain is maintained by successive secondary reinforcers.
Reinforcement:
All reinforcement, positive or negative, increases the operant rate.
Positive Reinforcer:
Stimulus added contingent upon a response.
Negative Reinforcer:
Stimulus removed contingent upon a response.
Punishment:
Decreases the operant rate.
Positive Punishment:
Stimulus added contingent upon a response.
Negative Punishment:
Stimulus removed contingent upon a response.
Alternatives to Punishment:
Change the environment.
Ignore undesirable behavior.
Continuous Reinforcement:
Reinforcing every response.
Partial Reinforcement:
Reinforcing some responses, but not others.
Types of Partial Reinforcement (see the sketch after this list):
Fixed Interval (time-based).
Fixed Ratio (response-based).
Variable Interval (unpredictable time-based).
Variable Ratio (unpredictable response-based).
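The four partial schedules can be read as simple decision rules: reinforce based on a count of responses or on time elapsed since the last reinforcer. The Python sketch below is a hypothetical illustration; the function names and the ratio/interval values are assumptions, not taken from the notes.

    import random

    def fixed_ratio(responses_since_reinforcer, ratio=10):
        # Fixed Ratio: reinforce every `ratio`-th response (e.g., every 10th item produced).
        return responses_since_reinforcer >= ratio

    def variable_ratio(mean_ratio=10):
        # Variable Ratio: each response has a 1/mean_ratio chance of paying off, so the
        # reinforcer arrives after an unpredictable number of responses (slot machine).
        return random.random() < 1 / mean_ratio

    def fixed_interval(seconds_since_reinforcer, interval=120):
        # Fixed Interval: the first response made after `interval` seconds is reinforced.
        return seconds_since_reinforcer >= interval

    def variable_interval(seconds_since_reinforcer, current_interval):
        # Variable Interval: same rule as above, but `current_interval` is redrawn at
        # random (e.g., random.uniform(30, 300)) after each reinforcer.
        return seconds_since_reinforcer >= current_interval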
Premack Principle:
Responses themselves can serve as reinforcers.
More preferred (more probable) responses may reinforce less preferred responses.
Instinctual Drift:
Organisms may revert to instinctive behaviors.
Interaction between instinctual behavior and learned behavior.
Involves instinctive behavior patterns elicited by specific stimuli.
A. Radical Behaviorism
Example: A behavior analyst focuses on a child's tantrum, observing how it leads to parent attention without considering the child's internal thoughts.
B. Respondent vs Operant Behavior
Respondent: A dog salivating at the sound of a bell (a known stimulus).
Operant: A person learning to write correctly without a prompt from a teacher.
C. Type S and Type R Conditioning
Type S: A child feeling fear (respondent behavior) when they hear thunder.
Type R: A rat pressing a lever (operant behavior) to receive food.
D. Operant Conditioning Principles
Example: A student studies for tests (response) and receives high grades (reinforcer), increasing their study behavior.
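As a rough illustration of the principle above, the hypothetical Python sketch below lets the probability of emitting the response climb each time the response is reinforced; the starting operant level (0.2) and the increment (0.05) are arbitrary values chosen for the sketch.

    import random

    p_response = 0.2                      # initial operant level of studying
    for trial in range(60):
        emitted = random.random() < p_response
        if emitted:                       # every emitted response is reinforced (a high grade)
            p_response = min(1.0, p_response + 0.05)
    print(round(p_response, 2))           # the response probability has risen well above 0.2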
E. Skinner Box
Example: A rat is placed in a Skinner box, where it learns to press a lever to receive food as a reward.
F. Cumulative Recording
Example: A pigeon's pecks are tallied as a running total on a cumulative record; the slope of the record shows the response rate over time (see the sketch below).
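A cumulative record is simply a running total of responses plotted against time: steep stretches mean fast responding, flat stretches mean no responding. The sketch below builds such a record from made-up per-minute peck counts (hypothetical data, not from the notes).

    from itertools import accumulate

    pecks_per_minute = [0, 2, 5, 5, 6, 1, 0, 0, 3, 7]    # hypothetical counts
    cumulative_record = list(accumulate(pecks_per_minute))
    print(cumulative_record)   # [0, 2, 7, 12, 18, 19, 19, 19, 22, 29]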
G. Conditioning the Lever-Pressing Response
Deprivation: A rat is not given food for a period (no "drive" state is inferred).
Magazine Training: The sound of a food dispenser is paired with food delivery until the rat approaches at the sound.
Lever Pressing: The rat learns to press the lever to get food.
H. Shaping
Example: A trainer rewards a dog first for moving toward a ball, then only for touching it, and finally only for retrieving it on command (see the sketch below).
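One way to picture shaping is as a loop that reinforces any response meeting the current criterion and then tightens the criterion toward the target behavior. The sketch below is purely illustrative: the distances, the step size, and the use of random numbers to stand in for the dog's behavior are all assumptions.

    import random

    target = 0.0        # "touching the ball"
    criterion = 5.0     # start by reinforcing anything within 5 m of the ball
    for trial in range(100):
        distance_to_ball = random.uniform(0.0, 6.0)    # stand-in for the dog's behavior
        if distance_to_ball <= criterion:
            # deliver the reinforcer, then require a closer approximation next time
            criterion = max(target, criterion - 0.25)
    print(round(criterion, 2))   # the criterion has tightened toward the target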
I. Extinction
Example: A student stops receiving praise for answering questions, leading them to stop answering altogether.
J. Spontaneous Recovery
Example: After a break from school, a student whose question-answering had been extinguished spontaneously raises their hand and answers again, even though no new reinforcement has been given.
K. Superstitious Behavior
Example: An athlete wears a ‘lucky’ shirt that they believe helps them win, even though it has no effect on their performance.
L. Discriminative Operant
SD: A green traffic light (signals that reinforcement is available – going is reinforced).
SΔ: A red traffic light (signals that no reinforcer is available – going is not reinforced).
M. Secondary Reinforcer
Example: A child is given a gold star (secondary reinforcer) after completing their homework; the star gets its reinforcing value from being paired with other reinforcers such as praise.
N. Generalized Reinforcer
Example: Money, which can be exchanged for various primary reinforcers like food, clothing, or entertainment.
O. Chaining
Example: A pigeon is taught to peck (first link in the chain), then turn in a circle (second link), and finally flap its wings (final link) to receive food.
P. Positive and Negative Reinforcers
Positive: A child receives a cookie for cleaning their room.
Negative: A teenager’s curfew is lifted for completing chores.
Q. Punishment
Positive Punishment: A child touches a hot stove and feels pain (added stimulus).
Negative Punishment: A teenager loses phone privileges for breaking curfew (removal of stimulus).
R. Schedules of Reinforcement
Continuous Reinforcement: A dog is given a treat every time it sits on command.
Fixed Interval: A student is rewarded with a pizza party every two weeks for completing homework.
Fixed Ratio: A factory worker is paid for every 10 items produced.
Variable Interval: A person checks their email at random times; sometimes they receive responses immediately, other times not for hours.
Variable Ratio: A slot machine pays out after a random number of pulls, leading to high engagement.