signal detection theory
a theory predicting how and when we detect the presence of a faint stimulus amid background stimulation. Assumes there is no single absolute threshold and that detection depends partly on a person's experience, expectations, motivation, and alertness.
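Because detection under this theory depends on the observer and not on a fixed threshold, researchers separate sensitivity from response bias. A minimal sketch of the standard sensitivity measure d′ (computed from hit and false-alarm rates; the example rates are illustrative, not from the text):

```python
# Signal detection theory sketch: d' = z(hit rate) - z(false-alarm rate).
# Higher d' means better discrimination of signal from background noise.
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# An alert, experienced observer: many hits, few false alarms.
print(round(d_prime(0.90, 0.10), 2))  # 2.56
```

The same hit rate paired with more false alarms yields a lower d′, which is how the theory captures the role of expectations and motivation separately from raw sensitivity.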
sensation
the process by which our sensory receptors and nervous system receive and represent stimulus energies from our environment
sensory receptors
sensory nerve endings that respond to stimuli
perception
the process of organizing and interpreting sensory information, enabling us to recognize meaningful objects and events
bottom-up processing
analysis that begins with the sensory receptors and works up to the brain's integration of sensory information
priming
the activation, often unconsciously, of certain associations, thus predisposing one's perception, memory, or response
Weber's law
the principle that, to be perceived as different, two stimuli must differ by a constant minimum percentage (rather than a constant amount)
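The constant-percentage idea can be shown with a short calculation. The just-noticeable difference (JND) is a fixed fraction k of the standard stimulus; k = 0.02 for lifted weights is a commonly cited approximate value, used here only for illustration:

```python
# Weber's law sketch: JND = k * intensity (a constant proportion,
# not a constant amount).
def jnd(intensity, k):
    return k * intensity

print(jnd(100, 0.02))  # 2.0  -> a 100 g weight needs ~2 g more to feel heavier
print(jnd(500, 0.02))  # 10.0 -> a 500 g weight needs ~10 g more
```

The heavier the standard, the larger the absolute change required, but the proportion stays the same.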
perceptual set
a mental predisposition to perceive one thing and not another
intensity
the amount of energy in a light wave or sound wave, which influences what we perceive as brightness or loudness. Intensity is determined by the wave's amplitude (height)
cornea
the eye's clear, protective outer layer, covering the pupil and iris
pupil
the adjustable opening in the center of the eye through which light enters
lens
the transparent structure behind the pupil that changes shape to help focus images on the retina
retina
the light-sensitive inner surface of the eye, containing the receptor rods and cones plus layers of neurons that begin the processing of visual information
accommodation
(1) in sensation and perception, the process by which the eye's lens changes shape to focus near or far objects on the retina. (2) in developmental psychology, adapting our current understandings (schemas) to incorporate new information
rods
retinal receptors that detect black, white, and gray, and are sensitive to movement; necessary for peripheral and twilight vision, when cones don't respond
cones
retinal receptors that are concentrated near the center of the retina and that function in daylight or in well-lit conditions. Cones detect fine detail and give rise to color sensations
optic nerve
the nerve that carries neural impulses from the eye to the brain
blind spot
the point at which the optic nerve leaves the eye, creating a "blind" spot because no receptor cells are located there
fovea
the central focal point in the retina, around which the eye's cones cluster
opponent-process theory
the theory that opposing retinal processes (red-green, blue-yellow, white-black) enable color vision
feature detectors
nerve cells in the brain's visual cortex that respond to specific features of the stimulus, such as shape, angle, or movement
hue
the dimension of color that is determined by the wavelength of light
figure-ground
the organization of the visual field into objects (the figures) that stand out from their surroundings (the ground)
grouping
the perceptual tendency to organize stimuli into coherent groups
depth perception
the ability to see objects in three dimensions although the images that strike the retina are two-dimensional; allows us to judge distance
visual cliff
a laboratory device for testing depth perception in infants and young animals
binocular cue
a depth cue, such as retinal disparity, that depends on the use of two eyes
retinal disparity
a binocular cue for perceiving depth. By comparing retinal images from the two eyes, the brain computes distance—the greater the disparity (difference) between the two images, the closer the object
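The inverse relation between disparity and distance can be sketched with the triangulation formula used in machine stereo vision; the baseline (eye separation) and focal-length values below are toy numbers chosen for illustration, not measurements from the text:

```python
# Stereo triangulation sketch: depth is inversely proportional to
# disparity, so a greater disparity implies a closer object.
def depth_from_disparity(baseline, focal_length, disparity):
    return baseline * focal_length / disparity

near = depth_from_disparity(6.3, 1.7, 0.5)   # large disparity
far = depth_from_disparity(6.3, 1.7, 0.05)   # small disparity
print(near < far)  # True: the larger-disparity object is closer
```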
monocular cue
a depth cue, such as interposition or linear perspective, available to either eye alone
phi phenomenon
an illusion of movement created when two or more adjacent lights blink on and off in quick succession
perceptual constancy
perceiving objects as unchanging (having consistent color, brightness, shape, and size) even as illumination and retinal images change
color constancy
perceiving familiar objects as having consistent color, even if changing illumination alters the wavelengths reflected by the object
perceptual adaptation
the ability to adjust to changed sensory input, including an artificially displaced or even inverted visual field
frequency
the number of complete wavelengths that pass a point in a given time
pitch
a tone's experienced highness or lowness; depends on frequency
middle ear
the chamber between the eardrum and cochlea containing three tiny bones (hammer, anvil, and stirrup) that concentrate the vibrations of the eardrum on the cochlea's oval window
cochlea
a coiled, bony, fluid-filled tube in the inner ear; sound waves traveling through the cochlear fluid trigger nerve impulses
inner ear
the innermost part of the ear, containing the cochlea, semicircular canals, and vestibular sacs
sensorineural hearing loss
hearing loss caused by damage to the cochlea's receptor cells or to the auditory nerves
conduction hearing loss
a less common form of hearing loss, caused by damage to the mechanical system that conducts sound waves to the cochlea
cochlear implant
a device for converting sounds into electrical signals and stimulating the auditory nerve through electrodes threaded into the cochlea
place theory
in hearing, the theory that links the pitch we hear with the place where the cochlea's membrane is stimulated
frequency theory
in hearing, the theory that the rate of nerve impulses traveling up the auditory nerve matches the frequency of a tone, thus enabling us to sense its pitch
gate-control theory
the theory that the spinal cord contains a neurological "gate" that blocks pain signals or allows them to pass on to the brain. The "gate" is opened by the activity of pain signals traveling up small nerve fibers and is closed by activity in larger fibers or by information coming from the brain
vestibular sense
our sense of body movement and position that enables our sense of balance
sensory interaction
the principle that one sense may influence another, as when the smell of food influences its taste
embodied cognition
the influence of bodily sensations, gestures, and other states on cognitive preferences and judgments
learning
the process of acquiring through experience new and relatively enduring information or behaviors
associative learning
learning that certain events occur together. The events may be two stimuli (as in classical conditioning) or a response and its consequence (as in operant conditioning)
stimulus
any event or situation that evokes a response
respondent behavior
behavior that occurs as an automatic response to some stimulus
operant behavior
behavior that operates on the environment, producing consequences
cognitive learning
the acquisition of mental information, whether by observing events, by watching others, or through language
classical conditioning
a type of learning in which we link two or more stimuli; as a result, the first stimulus comes to elicit behavior in anticipation of the second. In Pavlov's classic experiment, for example, a tone came to elicit drooling in anticipation of food
behaviorism
the view that psychology (1) should be an objective science that (2) studies behavior without reference to mental processes. Most psychologists today agree with (1) but not with (2)
neutral stimulus
in classical conditioning, a stimulus that elicits no response before conditioning
unconditioned response
in classical conditioning, an unlearned, naturally occurring response (such as salivation) to an unconditioned stimulus (US) (such as food in the mouth)
unconditioned stimulus
in classical conditioning, a stimulus that unconditionally—naturally and automatically—triggers an unconditioned response (UR)
conditioned response
in classical conditioning, a learned response to a previously neutral (but now conditioned) stimulus (CS)
conditioned stimulus
in classical conditioning, an originally neutral stimulus that, after association with an unconditioned stimulus (US), comes to trigger a conditioned response (CR)
acquisition
in classical conditioning, the initial stage, when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned response. In operant conditioning, the strengthening of a reinforced response
higher-order conditioning
a procedure in which the conditioned stimulus in one conditioning experience is paired with a new neutral stimulus, creating a second (often weaker) conditioned stimulus. For example, an animal that has learned that a tone predicts food might then learn that a light predicts the tone and begin responding to the light alone
extinction
the diminishing of a conditioned response; occurs in classical conditioning when an unconditioned stimulus (US) does not follow a conditioned stimulus (CS); occurs in operant conditioning when a response is no longer reinforced
spontaneous recovery
the reappearance, after a pause, of an extinguished conditioned response
generalization
the tendency, once a response has been conditioned, for stimuli similar to the conditioned stimulus to elicit similar responses. (In operant conditioning, generalization occurs when responses learned in one situation occur in other, similar situations.)
discrimination
(1) in classical conditioning, the learned ability to distinguish between a conditioned stimulus and similar stimuli that do not signal an unconditioned stimulus. (In operant conditioning, the ability to distinguish responses that are reinforced from similar responses that are not reinforced.) (2) in social psychology, unjustifiable negative behavior toward a group and its members
operant conditioning
a type of learning in which a behavior becomes more likely to recur if followed by a reinforcer or less likely to recur if followed by a punisher
law of effect
Thorndike's principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely
operant chamber
in operant conditioning research, a chamber (also known as a Skinner box) containing a bar or key that an animal can manipulate to obtain a food or water reinforcer; attached devices record the animal's rate of bar pressing or key pecking
reinforcement
in operant conditioning, any event that strengthens the behavior it follows
shaping
an operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of the desired behavior
discriminative stimulus
in operant conditioning, a stimulus that elicits a response after association with reinforcement (in contrast to related stimuli not associated with reinforcement)
positive reinforcement
increasing behaviors by presenting positive reinforcers. A positive reinforcer is any stimulus that, when presented after a response, strengthens the response
negative reinforcement
increasing behaviors by stopping or reducing aversive stimuli. A negative reinforcer is any stimulus that, when removed after a response, strengthens the response. (Note: Negative reinforcement is not punishment.)
primary reinforcer
an innately reinforcing stimulus, such as one that satisfies a biological need
conditioned reinforcer
a stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a secondary reinforcer
reinforcement schedule
a pattern that defines how often a desired response will be reinforced
continuous reinforcement schedule
reinforcing the desired response every time it occurs
partial reinforcement schedule
reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement
fixed-ratio schedule
in operant conditioning, a reinforcement schedule that reinforces a response only after a specified number of responses
variable-ratio schedule
in operant conditioning, a reinforcement schedule that reinforces a response after an unpredictable number of responses
fixed-interval schedule
in operant conditioning, a reinforcement schedule that reinforces a response only after a specified time has elapsed
variable-interval schedule
in operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals
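The ratio schedules above can be sketched in a toy simulation: both deliver a reinforcer after a number of responses, but under a variable-ratio schedule that number is unpredictable (averaging the stated ratio). The simulation parameters are illustrative only:

```python
# Sketch comparing fixed-ratio and variable-ratio reinforcement schedules.
import random

def reinforcers(n_responses, ratio, variable=False, seed=0):
    rng = random.Random(seed)
    count, since_last = 0, 0
    target = ratio
    for _ in range(n_responses):
        since_last += 1
        if since_last >= target:
            count += 1
            since_last = 0
            # variable-ratio: the next requirement is unpredictable but
            # averages `ratio`; fixed-ratio: always exactly `ratio`
            target = rng.randint(1, 2 * ratio - 1) if variable else ratio
    return count

print(reinforcers(1000, 10))                 # fixed-ratio 10 -> exactly 100
print(reinforcers(1000, 10, variable=True))  # variable-ratio 10 -> roughly 100
```

Both schedules pay off at about the same rate over many responses; the behavioral difference is that the unpredictability of the variable-ratio schedule sustains high, steady responding and greater resistance to extinction.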
punishment
an event that tends to decrease the behavior that it follows
biofeedback
a system for electronically recording, amplifying, and feeding back information regarding a subtle physiological state, such as blood pressure or muscle tension
preparedness
a biological predisposition to learn associations, such as between taste and nausea, that have survival value
instinctive drift
the tendency of learned behavior to gradually revert to biologically predisposed patterns
cognitive map
a mental representation of the layout of one's environment. For example, after exploring a maze, rats act as if they have learned a cognitive map of it
latent learning
learning that occurs but is not apparent until there is an incentive to demonstrate it
insight
a sudden realization of a problem's solution; contrasts with strategy-based solutions
intrinsic motivation
a desire to perform a behavior effectively for its own sake
extrinsic motivation
a desire to perform a behavior to receive promised rewards or avoid threatened punishment
problem-focused coping
attempting to alleviate stress directly—by changing the stressor or the way we interact with that stressor
emotion-focused coping
attempting to alleviate stress by avoiding or ignoring a stressor and attending to emotional needs related to our stress reaction
personal control
our sense of controlling our environment rather than feeling helpless
learned helplessness
the hopelessness and passive resignation an animal or person learns when unable to avoid repeated aversive events
external locus of control
the perception that chance or outside forces beyond our personal control determine our fate
internal locus of control
the perception that we control our own fate
self-control
the ability to control impulses and delay short-term gratification for greater long-term rewards
observational learning
learning by observing others