Behavioral

8/22/23

psychology and learning

what is psychology

  • scientific study of behavior and mental processes

    • behaviors are public/objective.

    • mental processes (cognitive and feelings) are private/subjective

      • inferred from behaviors

        • e.g., do dogs dream?

      • can I know what a person dreams about

        • introspection: looking inward and reporting

        • problem?

  • behaviorists tried to eliminate reference to the mind

what is learning

  • a lasting change in behavior due to experience

    • experience = sensory stimulation; practice; training

      • ≠ “instincts” (unlearned behaviors)

  • why lasting

    • ≠ sickness; fatigue; drug-induced states; sleepiness

types of associative learning

  • why does the dog sit

    • sitting associated with a consequence (food)

    • operant conditioning

  • why does the dog salivate when I say sit

    • word/sound “sit” was associated with food

    • classical conditioning

    • behaviorist → “Sit” was repeatedly associated with food in the past for that dog

    • a behaviorist would not say

      • the dog “expected” food

      • he knew food was coming

      • the signal made the dog think of food or reminded the dog of food

      • the dog wanted food

  • learning can involve:

    • acquiring/modifying a behavior (or skill)

    • maintain a behavior/skill

    • inhibit/stop doing something

08/24/23

History of psychology

Plato Vs. Aristotle

  • Plato → nativist → knowledge is inborn

  • Aristotle → empiricist → all knowledge derives from experiences/learning/association

    • Laws of associating ideas:

      • similarity

      • contiguity

        • frequency

      • contrast

Rise of psych.: René Descartes (17th century)

  • new philosophy

    • Search for first principle

      • “I cannot deny that I can deny things.”

      • “I think, therefore I am”

    • deduced that the mind is separate from the body.

      • “Mind-body dualism”

        • subjective (conscious) - objective (physical) reality

evidence for dualism:

  • Some “properties” of objects exist only in the mind

    • as conscious experiences

Descartes’ Mind-Body dualism

  • mind

    • causes voluntary behaviors

    • humans only

    • studied philosophically

      • logic; intuition

  • body

    • complex machine

    • involuntary behaviors (reflexes)

      • stimulus-response

    • studied scientifically → physiology.

after Descartes

  • philosophy studied mind

    • debated the origins of ideas

      • innate or acquired

    • British empiricists (e.g., John Locke)

      • tabula rasa (all knowledge derived from senses)

      • learning = association

  • Physiology studied body.

    • scientific methods

      • nervous system; reflexes (S-R)

      • Sensation (color vision & hearing tones)

Wilhelm Wundt

  • founder of psychology

    • 1879 - 1st psych. lab (Leipzig, Germany)

      • and more

    • psych = study of conscious experience

structuralism

  • structuralism (1890)

    • psych = study of conscious experience

    • searched for elements

    • experimental introspection only

functionalism

  • Functionalism (1890)

    • psych = study of conscious experience

    • influenced by Darwin

      • adaptation; individual differences; continuity of species

    • Study the adaptive functions of consciousness

    • individual differences

    • introspection + observation + tests

roots of behaviorism

  • Problems with introspection

    • unreliable; impossible to validate

  • Studies of animal behavior

    • Too much anthropomorphism

John Watson (1913)

  • “Psychology as the behaviorist views it”

example

  • can humans tell the difference between?

    • introspection → No

  • Pavlovian cond.:

Watson's theory: “methodological behaviorism”

8/29/23

behaviorism

  • natural science that studies environmental influences on observable behavior

  • all behaviorists agreed:

    • rejected introspection.

    • avoided going “mental.”

    • studied learning.

    • used animals.

  • disagreed on:

    • role of intervening variables (internal, personal factors)

      • acceptable if operationalized

        • Hunger

behaviorists

  • Clark Hull - physiological intervening variables

  • Edward Tolman - cognitive intervening variables

    • expectation (number of trials food is on the right)

Tolman and Honzik

  • group one: rewarded with food

  • group two: never rewarded

  • group three: not rewarded until day 11

  • they looked at the errors the rats made in the maze to get food

  • latent learning

    • learning without immediate change in behavior

  • rats learned “the layout” of the maze

    • cognitive map

B.F. Skinner (1904-1990)

  • radical behaviorism

  • distinguished between

    • pavlovian vs. operant conditioning:

      • involuntary behaviors vs. voluntary behaviors

  • operant chamber

radical behaviorism

  • the best predictor of future behavior is knowing

    • consequences of past behaviors in similar situations

  • on mental events:

    • they are real (private behaviors)

      • they can be studied scientifically (but not now)

    • unnecessary in predicting and controlling behavior

terminology

  • radical behaviorism is a philosophical approach

    • behavior analysis → applied behavior analysis (ABA)

skinner myths

  • myth: he believed we are helpless pawns of our environment

  • myth: he argued for punishing behaviors

  • myth: he claimed that all behaviors are learned

Ch 2 research methods

terminology

  • response = reaction to a stimulus

    • overt or covert/private

    • can be involuntary (“elicited”) or voluntary

  • stimulus = a sensory experience that stimulates a response

    • seeing X; Hearing X; Smelling X

terms

  • appetitive and aversive stimuli

    • motivating operations

    • deprivation can increase the value of an appetitive stimulus

    • satiation can make a stimulus aversive

  • contiguity vs. contingency

    • contiguity = closeness in space and time

    • contingency = predictive (or dependent) relationship

      • “if…then..”

8/31/23

assignment due Tuesday

measure of behavior

(types of data)

  • rate/frequency

  • duration

  • speed

  • latency

  • intensity

  • errors

  • topography (qualitative)

recording methods in ABA

  • data = % of intervals a behavior occurs (not rate, duration, etc.)

  • interval recording:

    • continuous intervals

  • time-sample

    • spaced intervals
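The interval methods above reduce to one quick calculation. A minimal sketch, assuming made-up session data and a function name of my own (not from the lecture):

```python
# Sketch of interval recording: for each interval, the observer marks
# whether the target behavior occurred at all; the reported datum is
# the % of intervals containing the behavior (not rate or duration).

def percent_of_intervals(observations):
    """observations: one boolean per interval
    (True = behavior occurred at least once in that interval)."""
    return 100.0 * sum(observations) / len(observations)

# Hypothetical session: 10 consecutive intervals, behavior seen in 4.
session = [True, False, True, False, False, True, False, False, True, False]
print(percent_of_intervals(session))  # 40.0
```

Time-sample recording would use the same calculation, just over spaced rather than continuous intervals.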

research designs

  • descriptive:

    • systematically observe behavior

    • only measure things

    • look for correlations b/w behaviors and conditions

    • generate hypotheses

    • limitation → cannot determine cause-effect

  • types of descriptive:

    • naturalistic observations

    • survey research

    • case studies

experimental designs

  • experimental

    • aspects:

      • systematic manipulation (of independent variables)

        • create comparison conditions.

      • experimental control

        • to rule out alternative explanations (“confounds”)

    • 2 types: group vs. single subject

  • control group

    • experimental group vs. control group

      • groups should be identical except for IV:

        • HOW? to rule out confounds

          • random assignment to conditions

          • counterbalance all conditions

    • limitations:

      • subjects

      • look at group averages (lose the individual)

  • single-subject or small n

    • repeated testing of a single or few subjects across time

  • single subject

    • simple comparison (AB design)

    • cannot infer causation

  • reversal design (ABAB)

    • when not to use ABAB

      • when treatment will have long lasting effects

        • there must be a return to baseline

      • ethical issues of removing treatment

  • multiple baseline

    • across people

      • (same target behavior; same setting)

    • across behaviors

      • (same person; same setting)

    • across settings

      • (same person; same behavior)

9/5/23

  • changing criterion design

    • good for when you want to gradually change a behavior.

elicited behaviors

elicited stimuli and responses

  • elicited → triggered: involuntary.

    • hearing the bell can elicit numerous kinds of involuntary responses (reflexes)

      → orient to it

      → startle: unlearned (“unconditioned”) ← universal

      → salivate: learned (“conditioned”) via association ← varies between people

    • simple: reflexes

      • some occur to specific eliciting stimuli.

        • salivatory, eye blink, startle, flexion, etc.

        • orienting response (aka ‘pay attention’)

      • all start as unlearned (“unconditioned”)

        • found in newborns.

      • later the response might be conditioned to other stimuli.

non-associative learning

  • sensitization: increased elicited responding to a stimulus

    • when stimulus is intense or meaningful (annoying, threatening)

    • increased responding readily generalizes to other stimuli.

  • habituation: reduced elicited responding.

    • (stop “paying attention”)

    • happens when stimulus is weak, moderate or irrelevant.

    • more specific to the eliciting stimulus

      • but some generalization to similar stimuli (you don’t notice the change)

    • how to recover a habituated response to an unchanged stimulus

      • present an unrelated, extraneous stimulus.

        • called dishabituation.

      • time away from stimulus

        • called spontaneous recovery.

    • short-term vs. long term habituation

opponent process theory of emotion

  • emotional reactions have two phases:

    • primary reaction → opposing reaction

      • example

        • shock a dog → increased HR

        • shock ends → decreased HR (below baseline)

    • with repeated exposure:

      • primary reaction weakens (habituates)

      • opposing reaction strengthens (sensitizes)

        • two things to explain.

          • the dynamics in a single exposure

          • changes over time
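Both things to explain fit in one toy simulation. This is only an illustrative sketch of the opponent-process idea, with invented numbers: the a-process stays fixed while the b-process sensitizes across exposures, so the net reaction during the stimulus shrinks and the rebound after it grows.

```python
# Toy opponent-process dynamics (all parameter values invented):
# net reaction during the stimulus = a-process minus b-process;
# after the stimulus ends, only the opposing b-process remains.
a = 10.0   # primary reaction strength (constant across exposures)
b = 2.0    # opposing reaction strength (grows with repeated exposure)

for trial in range(1, 6):
    net_during = a - b      # felt reaction while the stimulus is on
    rebound_after = -b      # after-reaction once the stimulus ends
    print(trial, net_during, rebound_after)
    b += 1.5                # b-process sensitizes with each exposure
# trial 1: net +8.0, rebound -2.0  ...  trial 5: net +2.0, rebound -8.0
```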

9/12/23

excitatory vs. inhibitory conditioning

  • excitatory: “positive contingency” (CS+)

    • CS+ associated with onset or increase of US

  • inhibitory: “negative contingency” (CS-)

    • CS- associated with the absence, termination or reduction of US

    • in the presence of this CS-, you inhibit an excitatory response to a different CS+

temporal arrangements of NS and US

  • delayed conditioning procedure (draw graph from book)

  • trace conditioning procedure (draw graph from book)

  • simultaneous conditioning procedure (draw from book)

  • backward conditioning procedure (draw from book)

  • the importance of contiguity and contingency

  • NS/CS must be “informative”

  • NS/CS must signal something about the US

pseudoconditioning

  • When an apparent CR is really due to sensitization

  • need controls:

      1. Include numerous other NSs during test trials

      2. include a control group where NS and US are randomly presented

Ch 4

acquisition

  • acquire a CR to a stimulus

    • pair NS with US

    • maximum strength is called asymptote of conditioning

  • extinction

    • how to weaken a CR

      • disassociate CS with US

        • process = repeatedly present CS without US

        • effect= Weakened CR

    • extinction ≠ forgetting

    • “extinction” term is misleading

    • Pavlov: extinction is not unlearning (CS → NS)

      • instead, you learn to inhibit the CR (CS+ → CS-)

9/14/23

Extinction vs. Habituation

  • both involve reduced responding to a repeated stimulus

  • differences

    • Habituation involves UR

      • habituation due to simple repetition of the US

    • extinction involves CR

      • b/c CS is dissociated from US

    • different patterns (e.g., spontaneous recovery)

generalization and discrimination

  • stimulus generalization: CRs to stimuli that are physically similar to a CS.

  • discrimination: opposite of generalization (involves training)

  • experimental neurosis: neurotic symptoms following unpredictable events

Theoretical issue

  • can a different looking NS become a CS without ever being directly paired with a US

  • 2 ways:

    • sensory preconditioning

    • high-order conditioning

  • failures to condition despite pairing NS with US:

  • blocking effect

    • NS must provide new info about US

    • the US must be surprising

  • latent inhibition

    • “CS pre-exposure effect”

  • overshadowing

9/19/23

Chapter 5: processes and applications

Pavlov’s theory

  • the CS substitutes for the US

    • brain treats CS as if it were the US

  • problem:

    • CR can sometimes be different than the UR

      • e.g., shock

preparatory-response view of classical conditioning

  • the CR prepares the animal for the onset of the US

  • Application: Compensatory response Theory (Siegel)

compensatory response theory

  • classical conditioning + opponent process theory

  • US → a-process → b-process

    • shock → increase HR → decrease in HR

  • 3 options for CR:

    • CS→ a-process → b-process

    • CS → a-process

    • CS → b-process

    • tone → decrease in HR

  • CS elicits opposing-reaction (called conditioned compensatory response)

  • a) conditioned decrease in HR occurs before US

  • b) CR reduces the impact of the US

  • cues (CS) associated with taking drugs trigger conditioned compensatory response

    • Example:

    • function: “prepare” body for the drug

      • counteract/reduce the effects of the drug

        • need more drug

  • imagine taking the drug in a “new” place

    • fewer conditioned compensatory responses

      • overdose

  • limitations

    • sometimes CS for a drug elicits the primary response of the drug

    • examples:

      • cocaine and HR

10-5-23

phobias

  • classical conditioning of fears

    • e.g. little Albert

biological factors that influence phobias

  • temperament

    • individual differences

    • book → “genetically determined”

  • preparedness

    • evolved, “biological” predisposition to make/retain certain associations

      • easy to learn to fear some things over other things

  • selective sensitization

    • high stress can make normally minor anxiety events trigger strong anxiety

  • incubation

    • when brief exposures to fearful CSs somehow increases your conditioned fear

treatments for fears/phobias

  • goal: to reduce aversion to CSs

    • (a) extinction-based

      • exposure therapy

        • gradual exposure

      • flooding

        • prolonged exposure to the CSs

        • prevent avoidance response

    • (b) counterconditioning

      • pair feared CSs with appetitive USs

      • systematic desensitization

        • relaxation training + hierarchy of feared CSs

        • but is this counterconditioning or extinction?

additional application

aversion therapy

  • Goal: reduce the appeal of appetitive stimuli

  • counterconditioning

    • examples:

      • alcohol odors/flavors: emetic → nausea

        • alcohol odors/flavors → aversion

        • note: “preparedness” (alcohol: shock does not work)

      • rapid smoking

      • thumb sucking or nail biting?

medical applications

  • allergies:

    • flower: pollen → allergic reaction

      • flowers → weak allergic reaction

  • placebo effect:

    • pills (NS): medication (US) → pain relief (UR)

      • pill (CS) → pain relief (CR)

  • immune system:

    • hospital: chemotherapy → weakened immune system

      • hospital → weakened immune system

    • sherbet flavor: adrenaline → immune boost

      • sherbet flavor → immune boost

10/10/23

History: “instrumental” conditioning

Edward Thorndike

  • 19th century

    • animals were believed capable of intellect (e.g., reasoning, insight)

    • full of anecdotal evidence

      • anthropomorphism too

    • few systematic/ experimental studies

  • puzzle box experiments (1890s)

    • learning is irregular and gradual, not smooth or all at once

      • no insight

      • trial and error instead

      • law of effect: behaviors that produce pleasure get “stamped in” to that situation

      • for Thorndike: a reflex (S→R) is created.

Skinner (1930s)

  • new method - operant chambers (Skinner boxes)

  • operant conditioning

    • consequences

    • behaviors NOT reflexive (not S→R)

      • “voluntary” → flexible and goal-directed

“three-term contingency” or “functional analysis”

  • antecedent

    • motivating operations

      • restriction

    • discriminative stimuli

      • cues for when to behave.

      • signal consequence (Sd) or no consequence (SΔ)

  • behavior

    • behavior operates on the environment.

    • “emitted”

  • consequence

    • influences future probability in similar settings

      • reinforce (increase/maintain) or punish (decrease)

      • Sr or Sp

consequences of behavior

operant contingencies

(pictures on phone)

  • you reinforce or punish behaviors, not individuals.

  • Technically speaking, you must know how a behavior changes in response to its consequences.

    • after Tommy broke five dishes this week, his mother took away his TV privileges. the next week Tommy breaks:

      • zero dishes - negative punishment

      • five dishes - none

      • 10 dishes - negative reinforcement

  • ignore whether the stimulus sounds appetitive or aversive.

    • usually determined after the fact

    • can differ between people.

  • ignore someone’s intent.
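The rules above (look only at whether a stimulus was added or removed and whether the behavior later increased or decreased; ignore intent and apparent pleasantness) can be written out as a small decision function. A hedged sketch; the function name and encoding are mine:

```python
# Sketch: classifying an operant contingency after the fact.
# Two observations are needed: was a stimulus added or removed following
# the behavior, and did the behavior's future frequency go up or down?

def classify(stimulus_change, behavior_change):
    """stimulus_change: 'added' or 'removed'
    behavior_change: 'increased', 'decreased', or 'unchanged'"""
    if behavior_change == "unchanged":
        return "no contingency demonstrated"
    effect = "reinforcement" if behavior_change == "increased" else "punishment"
    sign = "positive" if stimulus_change == "added" else "negative"
    return f"{sign} {effect}"

# Tommy's mother removed TV privileges after dish-breaking; the label
# depends entirely on what Tommy does the next week:
print(classify("removed", "decreased"))  # negative punishment
print(classify("removed", "unchanged"))  # no contingency demonstrated
print(classify("removed", "increased"))  # negative reinforcement
```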

10/12/23

How to teach new behaviors using positive reinforcement

shaping

  • reinforcing successive approximations to a new behavior

    • the best technique with children and animals

other ways to teach new behaviors.

  • luring → guiding a behavior using an Sr.

  • physical guidance

  • modeling

  • verbal instruction

positive reinforcement

factors influencing effectiveness.

  • deprivation level

  • timing:

    • immediate = more effective

      • e.g., dieting

  • absolute magnitude/quality

    • $12.00/hr vs. $8.00/hr

  • relative magnitude/quality*** (contrast effect)

    • who will work harder?

      • previously paid $16.00, now $12.00

      • VS

      • previously paid $8.00, now $12.00

classifying reinforcers

  • primary vs. secondary

    • primary = unconditioned Sr

      • reinforcing for unlearned/biological reasons

    • secondary = conditioned Sr

      • reinforcing because associated with other reinforcers.

        • money; attention; smiles; good grades

  • extrinsic vs. intrinsic (motivation)

    • extrinsic → external reasons

    • intrinsic → “self-reinforcing”

    • can external rewards influence intrinsic motivation?

      • it depends.

        • intrinsic interest reduced when external rewards are:

          • expected

          • tangible

          • given regardless of performance quality.

  • natural vs. contrived

    • natural - consequences that are typical in a given natural or social setting.

    • contrived - artificial consequences.

    • examples:

      • waving →

      • eating veggies→

      • studying →

    • Technique in ABA:

      • change behavior initially using artificial reinforcement.

      • gradually withdraw the artificial Srs to allow natural Srs to take over.

ch7

schedules and theories of positive reinforcement

schedules of reinforcement

  • continuous reinforcement schedule - CRF

  • intermittent/partial schedules of reinforcement

    • most common in social life

    • can be “dense” or “lean.”

  • “Steady-state” response patterns

    • different schedules produce distinctive behavior patterns.

      • some can be seen on a “cumulative recorder.”

intro

  • Notes:

    • most real-life schedules are not “pure.”

      • can be “pure” in a lab.

    • we will look at “un-cued” steady-state patterns.

      • in real life, most behaviors have cues (Sd)

    • more difficult to see “steady-state patterns” in humans.

simple schedules

measures of behavior

  • rate/ frequency

  • duration

  • speed

  • latency

(picture on phone about fixed and variable ratio)

fixed ratio: a set amount of work needed for a reward (unchanging/unvarying)

(picture on phone about fixed and variable interval)

(picture on phone about fixed and variable duration)

(picture on phone about high low rate)

(time schedules)

10/19/23

assignment due Tuesday

complex schedules

  • adjusting schedule

    • requirement changes

    • Notes:

      • only one behavior/operant/task is involved.

      • increasing the requirement (e.g., “stretching the ratio”) should be done gradually.

        • otherwise “ratio strain”

    • changing-criterion research design

    • shaping

  • conjunctive

    • must meet the demands of 2 or more simple schedules.

      • examples:

        • chores

        • making coffee

  • chained

    • series (“chain”) of responses required to obtain a reinforcer.

    • in an operant chamber:

    • Notes:

      • best way to train this series:

        • backward chaining

      • responding gets stronger in links closer to the end.

        • goal-gradient effect

          • why?

            • green and blue lights are secondary reinforcers.

              • via higher-order conditioning

                • green light 1st order

                • blue light 2nd order

  • concurrent schedules (ch10)

    • 2 or more simultaneous, independent schedules/tasks each with its own reinforcer

    • used to study “choice behavior.”

    • what about VI schedules?

      • key difference → the occasional response will pay off

    • Herrnstein (1961)

10/24/23

Matching law

  • real-life examples

    • 1) animals foraging for food

    • 2) human social interaction

deviations from matching

  • when the reinforcers are the same:

    • undermatching

      • less different than expected

    • overmatching

      • more different than expected.

  • ( picture on phone)

  • when the reinforcers are different:

deviations

  • bias

    • occurs when reinforcers differ

    • one schedule consistently gets a higher % of responding than predicted.

    • ( picture on phone)
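Matching, undermatching, overmatching, and bias are usually summarized by the generalized matching law, B1/B2 = b · (R1/R2)^s, where s is sensitivity and b is bias. A minimal sketch with invented numbers:

```python
# Generalized matching law: the ratio of responses on two concurrent
# schedules tracks the ratio of reinforcers, adjusted by bias (b)
# and sensitivity (s):  B1/B2 = b * (R1/R2) ** s
#   s = 1, b = 1 -> strict matching (Herrnstein, 1961)
#   s < 1        -> undermatching (less different than expected)
#   s > 1        -> overmatching  (more different than expected)
#   b != 1       -> bias toward one alternative

def response_ratio(r1, r2, bias=1.0, sensitivity=1.0):
    """Predicted B1/B2 given reinforcement rates r1 and r2."""
    return bias * (r1 / r2) ** sensitivity

print(response_ratio(60, 20))                   # strict matching: 3.0
print(response_ratio(60, 20, sensitivity=0.8))  # undermatching: < 3
print(response_ratio(60, 20, sensitivity=1.2))  # overmatching:  > 3
```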

exam: ch. 5, 6, 7, and 10

10/31/23

theories of positive reinforcement

  • reinforcement = consequence that increases future probability of a behavior

    • note that reinforcers are identified after using them

  • can we identify them before using them?

ch 8 extinction and stimulus control

extinction in operant conditioning

  • process= stop reinforcing a behavior

  • effect = behavior decreases (partially or fully) in that setting

side effects of extinction

  • (1) extinction burst

  • (picture on phone)

  • (2) resurgence - reappearance of an “old” (previously extinguished) behavior

  • (3) emotional behavior - agitation (frustration)

  • (4) aggression

    • example:

      • a rat is likely to bite you (or attack another rat)

  • after all those initial side effects:

  • (5) depression

    • reduced (aka depressed) activity overall

      • happens when the animal cannot get the reinforcer elsewhere

resistance to extinction

  • how long will the rat keep pressing the lever despite not getting food?

11/2/23

  • the degree to which responding persists (despite nonreinforcement)

  • we can control (or predict) when an animal will show high or low resistance.

  • (picture on phone)

influences on resistance

  • previous schedule of reinforcement:

    • as the example shows, CRF is less resistant than intermittent.

    • in comparing different intermittent schedules:

      • the more intermittent, the more resistant

        • e.g., more resistance on FR-20 than FR-10

      • variable schedules more resistant than comparable fixed schedules

        • e.g., VR-20 more resistant than FR-20

        • or VI- 15 sec more resistant than FI-15 sec

  • history

  • size of reward

  • degree of deprivation

  • presence of a signal

    • SΔ (pronounced “S-delta”)

      • signals “no reinforcer”

      • speeds up extinction

    • these signals are common in our lives:

      • Out-of-order sign

  • other way to speed up extinction

    • reinforce the non-occurrence of the target behavior

      • differential reinforcement of other behaviors (DRO)

      • always pair extinction of unwanted behaviors with reinforcement of alternatives

spontaneous recovery

  • reappearance of extinguished operant after a break away from setting

stimulus control

  • signals/cues for when to behave.

operant conditioning: three-term contingency

antecedent → behavior → consequence

  • generalization of operant behaviors to similar discriminative stimuli

  • discrimination: over time, you begin treating different stimuli differently

generalization gradients

(picture on phone)

peak shift effect

(picture on phone)

multiple schedules

  • sequential presentation of 2 or more independent schedules, each with its own Sd and Sr.

behavioral contrast

  • changing the amount of reinforcement on one schedule (x) can influence behavior on the other schedule (y)

    • up reinforcement on x, down in behavior on y (negative contrast)

    • down reinforcement on x, up in behavior on y (positive contrast)

  • anticipatory contrast

    • if an SΔ signaling upcoming extinction is presented, rate of responding increases.

fading and errorless discrimination

  • basic discrimination training has limitations.

11/9/23

assignment due Tuesday

  • during initial training, an animal will respond to both the Sd and the SΔ

    • nonreinforcement may frustrate the animal → disrupt training.

  • solution = errorless discrimination training

    • helps to minimize “errors.”

  • 2 procedures:

    • make the SΔ as dissimilar as possible to start.

      • then gradually change its intensity/noticeability (fading) during training.

    • example: getting a pigeon to discriminate red (Sd) vs. green (SΔ)

applications of stimulus control

  • basic matching to sample:

    • animal must choose the option (green) that matches the target (green)

    • great for studying perception.

  • delayed matching to sample:

    • when testing memory

  • can also measure STM capacity

transitivity (reasoning) test

(pages 504-506)

  • if A>B and B>C, which is greater, A or C?

  • discrimination training:

    • A+ vs. B-

    • B+ vs. C- Pairs randomly presented (x 20 each)

    • C+ vs. D-

    • D+ vs. E-

    • critical test?

      • B vs. D

        • tend to choose B

ch9 aversion

escape and avoidance

  • shuttle box escape/avoidance learning.

  • escape easy to understand:

    • tone: shock: jump barrier → removes shock

      • what contingency?

        • negative reinforcement through pain reduction

          • note: from aversive to non-aversive situations

  • avoidance more difficult ( avoidance paradox)

    • tone: jump barrier → no shock

      • note: from non-aversive to non-aversive

        • how can exposure to nothing be reinforcing?

signaled avoidance procedure.

  • signal (CS) associated with shock (US)

Mowrer's two-process theory (1940)

  • process 1: conditioned fear of a CS (the signal)

    • tone: shock → pain/fear

      • tone → fear

  • process 2: negative reinforcement through fear reduction.

    • tone: jump barrier → escape tone (reduced fear)

  • note: avoidance learning is actually escape learning.

problem #1

  • eventually, animals appear “nonchalant.”

    • fear of CS must be present for the 2-process theory to hold true

    • retort: fear of CS is reduced, but not eliminated

      • how can we tell?

        • CS suppressed lever pressing in a different context

problem #2

  • avoidance responses are extremely persistent

    • after many avoidance trials, conditioned fear to CS should extinguish

      • tone: jump → no shock

  • retort:

    • anxiety conservation hypothesis: (not Mowrer)

      • exposure to the full CS is too brief for the conditioned fear to fully extinguish.

application to OCD

  • (picture on phone)

  • what contingency

    • negative reinforcement

  • therapy: exposure and response prevention:

    • gradual exposure to feared CSs

    • prevent compulsion.

11/14/23

punishment

  • positive or negative (response-cost vs. time-out)

  • classification

    • intrinsic vs extrinsic

    • primary vs secondary

  • influences

    • timing

    • schedule effects

      • continuous schedules lead to lasting change (intermittent ones do not); punishment is often used ineffectively

    • absolute strength

    • relative strength: (contrast effects)

      • initially “weak” makes “strong” less effective

      • initially “strong” makes “Weak” more effective

  • additional suggestions:

    • explain why.

      • punishment only teaches what not to do

    • reinforce appropriate behavior.

positive (corporal) punishment

  • problems:

    • general suppression of behavior

    • strong emotional responses

      • interfere with learning.

    • aggression ( “punishment-induced”)

      • in the moment

    • teaches/models aggression.

      • long term

    • avoidance of punishing person.

    • punishing person as a discriminative stimulus

      • → unwanted behavior still occurs elsewhere.

other effects of an aversive stimulus

( pic on the phone)

(pic on the phone)

learned helplessness. (1960s)

  • learning impairment following predictable, uncontrollable aversive events

    • why?

      • expectation and belief of lack of control

        • note: cognitive explanation.

  • similar to depression

  • treatments:

    • place an animal in a situation where it cannot fail (drag the dog over the barrier)

    • anti-depressant medication

  • immunization effect: prior contingent escape experience

    • counteracts effects of inescapable shock

Masserman's experimental neurosis

  • experimental neurosis

    • definition

      • unpredictable stimulus

      • Pavlov’s procedure?

      • appetitive

    • Masserman's procedure?

      • aversive

  • symptoms

    • most cats: anxiety (phobia-like; hyper-alert)

      • quiet cats → restless, agitated.

      • active cats → withdrawn, passive.

  • similar to PTSD

    • behavior

      • fear and avoidance of trauma-related events

      • agitated or passive.

      • hypervigilance

    • setting:

      • unpredictable aversive events

      • more easily develops when trauma happens in an appetitive context.

ch 12 biological dispositions in learning

elicited responses (ch3)

  • simple: reflexes

  • complex: fixed action patterns

    • unlearned sequence of movements

    • triggered by sign stimuli.

    • historically → “instinct”

history

  • in behaviorism, the role of “instinct” was discussed in terms of biological “preparedness” in learning.

    • the first studies within behaviorism to show preparedness came from studying taste aversion learning.

taste-aversion learning

  • taste/smell (NS): poison/irradiation (US) → sick (UR)

  • taste/smell (CS) → sickness and avoidance (CR)

  • unusual aspects:

    • one trial learning

    • long delay learning (b/w NS and US/UR)

    • “selective associations”

      • some associations more easily formed than others

        • e.g., alcohol aversions

preparedness in operant conditioning

  • rats pressing levers for food (easy) vs. avoiding shock (hard)

  • hamsters: food reinforces food-related behavior

biological influences: instinctive drift

  • Breland and Breland:

instinctive drift

  • when classically conditioned fixed action patterns eventually displace operantly conditioned behaviors

  • initial training (operant conditioning)

    • coin (CS): deposit → food (US) → chewing and rooting (UR; FAPs)

  • over time (classical conditioning takes over)

    • coin (CS) → rooting and chewing (CR; FAPs)

unexpected results

sign tracking

  • approaching a CS that signals an appetitive US

    • shows FAP’s to the CS

  • autoshaping in pigeons

Adjunctive Behavior

  • Side-effect of FI or FT schedules

    • what do these have in common

  • FI-2 min for food for 3 hr.:

    • in rats → gnawing, running in wheel, polydipsia (1/2 body weight); aggression

activity anorexia

  • rats:

    • restrict access to food once per day (90 min/day)

    • provide running wheel

    • results: as wheel running increased, food intake decreased

      • death

  • similarities to anorexia nervosa:

    • restricted feeding

    • food not aversive

    • high activity levels

    • endorphin highs

      • in rats, blocking endorphins stops wheel running

    • more common in adolescents

  • difference:

    • free access to food eliminates the condition in rats

11/21/23

history of psych

  • history

    • decline of behavioral approach in the 1960s: why?

      • “Cognitive revolution”

    • results: 1960s → rise in cognitive explanations in behaviorism

includes theories of positive reinforcement

Premack principle

response deprivation hypothesis

Hull's theory

multiple schedules

contrast effects

stimulus control

negative reinforcement

Mowrer's two-process theory

punishment