Acquisition.
That part of operant conditioning in which an operant response is followed by a reinforcer, thereby increasing the rate with which the response occurs.
Avoidance contingency.
Situation in which the organism can avoid an aversive stimulus by engaging in appropriate activity.
Behavior therapy.
Approach to treating behavior disorders that is based on any one of several learning theories.
Chaining.
Situation in which one response brings the organism into contact with stimuli that (1) reinforce that response and (2) stimulate the next response. Chaining can also involve other people; for example, one person's response can both reinforce another person's response and determine the next course of action.
Classical conditioning.
Type of conditioning studied by Ivan Pavlov and used by J. B. Watson as a model for his version of behaviorism.
Conditioned response (CR).
Response similar to an unconditioned response that is elicited by a previously neutral stimulus (CS).
Conditioned stimulus (CS).
Stimulus that, before classical conditioning principles are applied, is biologically neutral; that is, it does not elicit a natural reaction from an organism.
Contingency contracting.
Agreement between two people that when one acts in an appropriate way, the other one gives him or her something of value.
Contingency management.
Purposive manipulation of reinforcement contingencies so they encourage desirable behaviors.
Contingent reinforcement.
Situation in which a certain response must be made before a reinforcer is obtained; that is, no response, no reinforcer.
Continuous reinforcement schedule (also called a 100% schedule of reinforcement).
Schedule of reinforcement that reinforces a desired response each time it occurs.
Cultural engineering.
Use of contingency management in designing a culture.
Culture.
According to Skinner, a set of reinforcement contingencies.
Differential reinforcement.
Situation in which some responses are reinforced and others are not.
Discriminative operant.
Operant response that is made under one set of circumstances but not under others.
Discriminative stimulus (SD).
Cue indicating that if a certain response is made it will be followed by reinforcement.
Echoic behavior.
Accurate repetition of what someone else has said.
Escape contingency.
Situation in which an organism must respond in a certain way to escape from an aversive stimulus. All negative reinforcement involves an escape contingency.
Extinction.
Weakening of an operant response by removing the reinforcer that had been following the response during acquisition. When a response returns to its operant level, it has been extinguished.
Fixed interval reinforcement schedule (FI).
Reinforcement schedule that reinforces the first response made after a specified interval of time has passed.
Fixed ratio reinforcement schedule (FR).
Reinforcement schedule that reinforces every nth response. For example, every fifth response the organism makes is reinforced (FR5).
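As a concrete illustration of the two fixed schedules just defined (not part of the original glossary), the following minimal Python sketch shows the rule each schedule uses to decide whether a given response earns a reinforcer; the class and method names are invented for the example.

```python
import time

class FixedRatio:
    """FR(n): a reinforcer is delivered after every nth response."""
    def __init__(self, n):
        self.n = n
        self.responses = 0

    def respond(self):
        """Record one response; return True if it earns a reinforcer."""
        self.responses += 1
        return self.responses % self.n == 0

class FixedInterval:
    """FI(t): the first response made after t seconds have elapsed is reinforced."""
    def __init__(self, t_seconds):
        self.t = t_seconds
        self.last = time.monotonic()

    def respond(self):
        """Record one response; reinforce it only if the interval has passed."""
        now = time.monotonic()
        if now - self.last >= self.t:
            self.last = now
            return True
        return False
```

For instance, FixedRatio(5).respond() returns True on every fifth call, matching the FR5 example in the definition above, while responses made before a FixedInterval's interval elapses go unreinforced.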
Functional analysis.
Skinner's approach to research that attempted to relate measurable environmental events to measurable behavior and bypass cognitive and physiological processes altogether.
Generalized reinforcers.
Class of secondary reinforcers that have been paired with more than one primary reinforcer.
Mand.
Verbal response that demands something and is reinforced when what is demanded is obtained.
Negative reinforcement.
Type of reinforcement that occurs when a response removes a primary or secondary negative reinforcer.
Noncontingent reinforcement.
Situation in which no relationship exists between an organism's behavior and the availability of reinforcement.
Operant behavior.
Behavior that cannot be linked to any known stimulus and therefore appears to be emitted rather than elicited.
Operant conditioning (also called type R conditioning).
Modification of response strength by manipulation of the consequences of the response. Responses that are followed by a reinforcer gain in strength; responses not followed by a reinforcer become weaker.
Operant level.
Frequency with which an operant response is made before it is systematically reinforced.
Partial reinforcement effect (PRE).
Fact that a partially or intermittently reinforced response will take longer to extinguish than a response on a continuous or 100% schedule of reinforcement.
Partial reinforcement schedule.
Schedule of reinforcement that sometimes reinforces a desired response and sometimes does not. In other words, the response is maintained on a schedule of reinforcement somewhere between 100% and 0%.
Positive reinforcement.
Type of reinforcement that occurs when a response makes available a primary or secondary positive reinforcer.
Primary negative reinforcer.
Negative reinforcer that threatens an organism's survival, such as pain or oxygen deprivation.
Primary positive reinforcer.
Positive reinforcer that enhances an organism's survival, such as food or water.
Primary reinforcer.
Any stimulus that is positively or negatively related to an organism's survival.
Punishment.
Either removing a positive reinforcer or presenting a negative reinforcer.
Radical behaviorism.
The version of behaviorism proposed by J. B. Watson according to which only directly observable events, such as stimuli and responses, should constitute the subject matter of psychology. Reference to all internal events can be, and should be, avoided. Skinner accepted this version of behaviorism.
Rate of responding.
Measure of response frequency used by Skinner to demonstrate operant conditioning. If a response is followed by a reinforcer, the rate or frequency with which it is made will increase; if a response is not followed by a reinforcer, its rate or frequency will stay the same (if it is at its operant level) or will decrease.
Respondent behavior.
Behavior that is elicited by a known stimulus.
Respondent conditioning (also called type S conditioning).
Another term for classical or Pavlovian conditioning.
Secondary negative reinforcer.
Negative reinforcer that derives its reinforcing properties through its association with a primary negative reinforcer.
Secondary positive reinforcer.
Positive reinforcer that derives its reinforcing properties through its association with a primary positive reinforcer.
Secondary reinforcer.
Object or event that acquires reinforcing properties through its association with a primary reinforcer.
Shaping.
Gradual development of a response that an organism does not normally make. Shaping requires differential reinforcement and successive approximations. See also Differential reinforcement and Successive approximations.
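The following toy Python sketch is not from the text; it only illustrates how the two ingredients named in the definition can combine. A response is "reinforced" (and thus becomes the organism's typical response) only when it is a closer approximation to the target than any response reinforced before: differential reinforcement applied to successive approximations. All names and numbers are invented.

```python
import random

def shape(target=100.0, start=0.0, trials=1000):
    """Toy shaping loop: a response is reinforced only if it is a closer
    approximation to the target than any response reinforced so far."""
    typical = start                     # the organism's current typical response
    best_gap = abs(target - typical)    # closest approximation reinforced so far
    for _ in range(trials):
        response = typical + random.gauss(0, 10)   # emitted behavior varies
        gap = abs(target - response)
        if gap < best_gap:              # differential reinforcement of a
            best_gap = gap              # successive approximation
            typical = response          # reinforced responses recur more often
    return typical
```

Running shape() shows the typical response drifting toward the target, which is the sense in which a response the organism "does not normally make" is gradually built up.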
Skinner box.
Small experimental chamber that Skinner invented to study operant conditioning.
Stimulus generalization.
The tendency to emit operant responses in situations other than those in which the responses were learned. As the similarity between the original reinforcing situation and other situations increases, so does the probability of responding to them in a similar manner.
Successive approximations.
Situation in which only those responses that are increasingly similar to the one ultimately desired are reinforced.
Superstitious behavior.
Behavior that develops under noncontingent reinforcement in which the organism seems to believe that a relationship exists between its actions and reinforcement, when in fact no such relationship exists.
Tact.
That part of verbal behavior that accurately names objects and events in the environment.
Time out from reinforcement.
A form of punishment in which an organism is denied access, for a specified interval of time, to the positive reinforcers that are normally available in the situation.
Token economies.
Example of Skinnerian behavior therapy that usually occurs within an institutional setting such as a psychiatric hospital or a school. Within a token economy, desirable behavior is reinforced by tokens (or sometimes points or cards) that can subsequently be traded for desirable objects or events such as food, cigarettes, privacy, or a choice of a television program.
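Purely as an illustration (not part of the original glossary), a minimal Python sketch of the bookkeeping behind a token economy: tokens are earned contingent on target behaviors and later exchanged for backup reinforcers. The behaviors, prices, and names below are all invented.

```python
class TokenEconomy:
    """Toy ledger for a token economy: earn tokens for target behaviors,
    spend them on backup reinforcers."""
    def __init__(self, earn_rates, prices):
        self.earn_rates = earn_rates  # behavior -> tokens earned
        self.prices = prices          # backup reinforcer -> token cost
        self.balance = 0

    def record(self, behavior):
        """Award tokens if the observed behavior is a target behavior."""
        self.balance += self.earn_rates.get(behavior, 0)

    def exchange(self, reinforcer):
        """Trade tokens for a backup reinforcer if the balance covers its cost."""
        cost = self.prices[reinforcer]
        if self.balance >= cost:
            self.balance -= cost
            return True
        return False

# Hypothetical ward example; the values are invented for illustration.
ward = TokenEconomy(earn_rates={"made bed": 2, "attended session": 5},
                    prices={"tv time": 10})
ward.record("made bed")
ward.record("attended session")
print(ward.exchange("tv time"))  # False: only 7 tokens earned so far
```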
Type R conditioning (also called operant conditioning).
Term Skinner used to describe the conditioning of operant or emitted behavior to emphasize the importance of the response (R) to such conditioning.
Type S conditioning (also called respondent conditioning).
Term Skinner used to describe classical conditioning to emphasize the importance of the stimulus (S) to such conditioning.
Unconditioned response (UR).
Natural, automatic response elicited by an unconditioned stimulus (US).
Unconditioned stimulus (US).
Stimulus that elicits an automatic, natural response from an organism. Also called a primary reinforcer because conditioning ultimately depends on the presence of a US.
Variable interval reinforcement schedule (VI).
Reinforcement schedule in which a certain average time interval must pass before a response will be reinforced. For example, the organism is reinforced on average once every 30 seconds.
Variable ratio reinforcement schedule (VR).
Reinforcement schedule in which a certain average number of responses must be made before reinforcement is obtained. For example, the organism is reinforced, on average, after every fifth response (VR5).
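As with the fixed schedules earlier, here is a small illustrative Python sketch (not from the text) of one way the "average" requirement in a variable ratio schedule can be realized: the number of responses required is redrawn at random around the mean after every reinforcer. A variable interval schedule could be sketched the same way by drawing a waiting time around the mean interval instead of a response count.

```python
import random

class VariableRatio:
    """VR(n): on average every nth response is reinforced; the exact
    requirement is redrawn at random (mean n) after each reinforcer."""
    def __init__(self, n):
        self.n = n
        self.required = self._draw()
        self.count = 0

    def _draw(self):
        # Uniform on 1 .. 2n-1, so the mean requirement is n.
        return random.randint(1, 2 * self.n - 1)

    def respond(self):
        """Record one response; return True if it earns a reinforcer."""
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = self._draw()
            return True
        return False

# VariableRatio(5) approximates the VR5 example in the definition above.
```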
Verbal behavior.
Skinner's term for language.
Walden Two.
Novel written by Skinner to show how his learning principles could be applied to cultural engineering.