64. ecological systems theory: a theory of the social environment's influence
on human development, using five nested systems (microsystem, mesosystem,
exosystem, macrosystem, and chronosystem) ranging from direct to indirect influences.
65. stranger anxiety: the fear of strangers that infants commonly display,
beginning by about 8 months of age.
66. attachment: an emotional tie with others; shown in young children by
their seeking closeness to caregivers and showing distress on separation.
67. imprinting: the process by which certain animals form strong
attachments during early life.
68. strange situation: a procedure for studying child-caregiver attachment; a
child is placed in an unfamiliar environment while their caregiver leaves and then
returns, and the child's reactions are observed.
69. secure attachment: demonstrated by infants who comfortably explore
environments in the presence of their caregiver, show only temporary distress
when the caregiver leaves, and find comfort in the caregiver's return.
70. insecure attachment: demonstrated by infants who display a clinging,
anxious attachment; an avoidant attachment that resists closeness; or a
disorganized attachment with no consistent behavior when separated from or
reunited with caregivers.
71. temperament: a person's characteristic emotional reactivity and intensity.
72. basic trust: according to Erik Erikson, a sense that the world is predictable
and trustworthy; said to be formed during infancy by appropriate experiences
with responsive caregivers.
73. self-concept: all our thoughts and feelings about ourselves, in answer to
the question, "Who am I?"
74. identity: our sense of self; according to Erikson, the adolescent's task is
to solidify a sense of self by testing and integrating various roles.
75. social identity: the "we" aspect of our self-concept; the part of our answer
to "Who am I?" that comes from our group memberships.
76. intimacy: in Erikson's theory, the ability to form close, loving relationships;
a primary developmental task in young adulthood.
77. emerging adulthood: a period from about age 18 to the mid-twenties, when
many persons in prosperous Western cultures are no longer adolescents but
have not yet achieved full independence as adults.
78. social clock: the culturally preferred timing of social events such as
marriage, parenthood, and retirement.
79. learning: the process of acquiring through experience new and relatively
enduring information or behaviors.
80. associative learning: learning that certain events occur together. The events
may be two stimuli (as in classical conditioning) or a response and its
consequence (as in operant conditioning).
81. stimulus: any event or situation that evokes a response.
82. respondent behavior: behavior that occurs as an automatic response to
some stimulus.
83. operant behavior: behavior that operates on the environment, producing
a consequence.
84. cognitive learning: the acquisition of mental information, whether by
observing events, by watching others, or through language.
85. classical conditioning: a type of learning in which we link two or more
stimuli; as a result, to illustrate with Pavlov's classic experiment, the first stimulus
(a tone) comes to elicit behavior (drooling) in anticipation of the second stimulus
(food).
86. behaviorism: the view that psychology (1) should be an objective science
that (2) studies behavior without reference to mental processes. Most
research psychologists today agree with (1) but not with (2).
87. neutral stimulus (NS): in classical conditioning, a stimulus that elicits no
response before conditioning.
88. unconditioned response (UCR): in classical conditioning, an unlearned,
naturally occurring response (such as salivation) to an unconditioned stimulus
(UCS) (such as food in the mouth).
89. unconditioned stimulus (UCS): in classical conditioning, a stimulus that unconditionally - naturally and automatically - triggers an unconditioned response
(UCR).
90. conditioned response (CR): in classical conditioning, a learned response to
a previously neutral (but now conditioned) stimulus (CS).
91. conditioned stimulus (CS): in classical conditioning, an originally neutral
stimulus that, after association with an unconditioned stimulus (UCS), comes to
trigger a conditioned response (CR).
92. acquisition: in classical conditioning, the initial stage - when one links a
neutral stimulus and an unconditioned stimulus so that the neutral stimulus
begins triggering the conditioned response. (In operant conditioning, the
strengthening of a reinforced response.)
93. higher-order conditioning: a procedure in which the conditioned stimulus in
one conditioning experience is paired with a new neutral stimulus, creating a
second (often weaker) conditioned stimulus. For example, an animal that has
learned that a tone predicts food might then learn that a light predicts the tone and
begin responding to the light alone. (Also called second-order conditioning.)
94. extinction: in classical conditioning, the diminishing of a conditioned response
when an unconditioned stimulus does not follow a conditioned stimulus. (In
operant conditioning, when a response is no longer reinforced.)
95. spontaneous recovery: the reappearance, after a pause, of a weakened
conditioned response.
96. generalization: (also called stimulus generalization) in classical
conditioning, the tendency, once a response has been conditioned, for stimuli
similar to the conditioned stimulus to elicit similar responses. (In operant
conditioning, when responses learned in one situation occur in other, similar situations.)
97. discrimination: in classical conditioning, the learned ability to
distinguish between a conditioned stimulus and similar stimuli that do not
signal an unconditioned stimulus. (In operant conditioning, the ability to
distinguish responses that are reinforced from similar responses that are
not reinforced.)
98. preparedness: a biological predisposition to learn associations, such as
between taste and nausea, that have survival value.
99. operant conditioning: a type of learning in which a behavior becomes more
likely to recur if followed by a reinforcer or less likely to recur if followed by a
punisher.
100. law of effect: Thorndike's principle that behaviors followed by favorable
(or reinforcing) consequences become more likely, and that behaviors followed
by unfavorable (or punishing) consequences become less likely.
101. operant chamber: in operant conditioning research, a chamber (also known
as a Skinner box) containing a bar or key that an animal can manipulate to obtain
a food or water reinforcer; attached devices record the animal's rate of bar
pressing or key pecking.
102. reinforcement: in operant conditioning, any event that strengthens the
behavior it follows.
103. shaping: an operant conditioning procedure in which reinforcers guide
behavior toward closer and closer approximations of the desired behavior.
104. discriminative stimulus: in operant conditioning, a stimulus that elicits
a response after association with reinforcement (in contrast to related stimuli
not associated with reinforcement).
105. positive reinforcement: increasing behaviors by presenting a pleasurable
stimulus. A positive reinforcer is any stimulus that, when presented after a
response, strengthens the response.
106. negative reinforcement: increasing behaviors by stopping or reducing an
aversive stimulus. A negative reinforcer is any stimulus that, when removed
after a response, strengthens the response. (Note: Negative reinforcement is
not punishment.)
107. primary reinforcer: an innately reinforcing stimulus, such as one that
satisfies a biological need.
108. conditioned reinforcer: a stimulus that gains its reinforcing power through
its association with a primary reinforcer. (Also known as a secondary reinforcer.)
109. reinforcement schedule: a pattern that defines how often a desired
response will be reinforced.
110. continuous reinforcement schedule: reinforcing the desired response
every time it occurs.
111. partial (intermittent) reinforcement schedule: reinforcing a response
only part of the time; results in slower acquisition of a response but much greater
resistance to extinction than does continuous reinforcement.
112. fixed-ratio schedule: in operant conditioning, a reinforcement schedule
that reinforces a response only after a specified number of responses.
113. variable-ratio schedule: in operant conditioning, a reinforcement
schedule that reinforces a response after an unpredictable number of
responses.
114. fixed-interval schedule: in operant conditioning, a reinforcement
schedule that reinforces a response only after a specified time has elapsed.
115. variable-interval schedule: in operant conditioning, a reinforcement
schedule that reinforces a response at unpredictable time intervals.
116. punishment: an event that tends to decrease the behavior that it follows.
117. instinctive drift: the tendency of learned behavior to gradually revert to
biologically predisposed patterns.
118. cognitive map: a mental representation of the layout of one's environment.
For example, after exploring a maze, rats act as if they have learned a cognitive
map of it.
119. latent learning: learning that occurs but is not apparent until there is
an incentive to demonstrate it.
120. insight learning: solving problems through sudden insight; contrasts
with strategy-based solutions.
121. observational learning: learning by observing others. (Also called
social learning.)
122. modeling: the process of observing and imitating a specific behavior.
123. mirror neurons: neurons that some scientists believe fire when we perform
certain actions or observe another doing so. The brain's mirroring of another's
action may enable imitation and empathy.
124. prosocial behavior: positive, constructive, helpful behavior. The opposite
of antisocial behavior.
125. antisocial behavior: negative, destructive, harmful behavior. The opposite of prosocial behavior.