Physical definition of sound
sound is pressure changes
Perceptual definition of sound
the experience we have when we hear
Frequency
how many times a sound wave repeats (vibrates) each second
Frequency is measured in…
Hertz (Hz)
Sound amplitude
height of peaks in a sound wave (difference between high and low pressures in air)
Sound amplitude is measured in…
decibels
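The decibel scale behind this card can be sketched numerically. A minimal Python sketch, assuming the standard reference pressure for air of 20 µPa (a textbook constant, not stated on the cards):

```python
import math

# Sound pressure level (SPL) in decibels, relative to the standard
# reference pressure for air: 20 micropascals, roughly the threshold
# of human hearing at 1 kHz. (Assumed constant, not from the cards.)
P_REF = 20e-6  # pascals

def spl_db(pressure_pa: float) -> float:
    """Convert a sound pressure (in pascals) to dB SPL."""
    return 20 * math.log10(pressure_pa / P_REF)

print(spl_db(20e-6))  # 0.0 dB: the reference pressure itself
print(spl_db(2e-3))   # ~40 dB: a pressure 100x the reference
```

Because the scale is logarithmic, every 10x increase in pressure adds 20 dB.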
3 perceptual aspects of sound
pitch, loudness, and timbre
Pitch
perceived from frequency (high or low)
Human hearing ranges from…
20-20,000 Hz
Loudness
perceived volume from sound amplitude
Timbre
other perceptual aspects of sound beyond loudness, pitch, and duration (ex. why a piano and a trumpet playing the same note sound different)
2 thresholds for loudness
absolute threshold (minimum) and magnitude estimation
Thresholds depend on…
loudness and frequency
Audibility curve
curve showing the minimum sound pressure level (SPL) needed to hear each frequency
Two places frequency is represented in auditory pathway…
The cochlea and auditory cortex (A1)
Frequency in cochlea
different hair cells activate in response to different sound frequencies (place theory)
Hair cells near base of cochlea
high frequencies
Hair cells near apex
low frequencies
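The base-to-apex frequency mapping on these cards can be illustrated with the Greenwood function, a published fit relating position along the basilar membrane to characteristic frequency. The constants below are the standard human-fit values, not taken from the cards:

```python
# Place theory sketch: the Greenwood function maps a fractional position
# along the basilar membrane (x = 0 at the apex, x = 1 at the base) to
# its characteristic frequency. Constants are the published human fit.
def greenwood_hz(x: float) -> float:
    """Characteristic frequency (Hz) at fractional position x from the apex."""
    return 165.4 * (10 ** (2.1 * x) - 0.88)

print(round(greenwood_hz(0.0)))  # 20 Hz at the apex (low frequencies)
print(round(greenwood_hz(1.0)))  # 20677 Hz at the base (high frequencies)
```

Note how the endpoints land almost exactly on the 20-20,000 Hz range of human hearing.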
Frequency in A1
tonotopic map (different parts of A1 respond to different frequencies in organized pattern)
Cochlea
tiny coiled, fluid-filled structure in the inner ear that converts sound vibrations into neural signals
3 fluid filled canals in cochlea
scala vestibuli, scala media, scala tympani
What do the canals of the cochlea do?
carry waves causing the basilar membrane to move
How does the basilar membrane respond to different sound frequencies?
sound waves travel and reach their strongest point at the spot tuned to that sound’s frequency, making the membrane vibrate
The basilar membrane translates…
frequency of sounds into neural activity for the brain
Organ of Corti
structure holding hair cells
Hair cells
detect vibrations and release neurotransmitters that send auditory signals to the brain for perception
Inner ear
key to both hearing and maintaining equilibrium
Differences in basilar membrane fibers
those at base of cochlea are short and stiff for high frequencies
those at the apex are longer and looser for low frequencies
Parts of basilar membrane fibers vibrate…
based on frequency of the sound coming through
Short vs. long basilar membrane fibers
short fibers → high frequency pressure waves
long fibers → low frequency waves
Transduction of sound begins when…
part of the basilar membrane moves and fibers tickle the organ of Corti
Brain detects frequency of sound based on…
location of hair cells being triggered
Cerebral cortex function
interprets electrical signals and plugs them into stored memories to recognize sounds and sources
Steps in auditory pathway
hair cells → auditory nerve → cochlear nucleus → superior olivary complex → inferior colliculus → medial geniculate nucleus (MGN) in thalamus → A1
Auditory location based on…
left vs. right (azimuth coordinates), up vs. down (elevation coordinates), and distance coordinates
Localization of sounds is…
calculated using ITD and ILD
Binaural cues
location cues based on comparison of sound info received by the right and left ears
Interaural time difference (ITD)
differences in timing of sounds reaching the right ear and left ear
No ITD =
source is equidistant from both ears
ITD =
source is to one side
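The ITD cue on these cards can be approximated with a simple spherical-head model (the Woodworth formula). The head radius and speed of sound below are illustrative assumptions, not values from the cards:

```python
import math

# Simplified interaural time difference (ITD), assuming a spherical head:
# ITD = (r / c) * (theta + sin(theta)), the classic Woodworth approximation
# for a distant source. Constants below are illustrative assumptions.
HEAD_RADIUS_M = 0.09    # ~9 cm head radius
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def itd_seconds(azimuth_deg: float) -> float:
    """ITD for a distant source at the given azimuth (0 = straight ahead)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(itd_seconds(0))   # 0.0: no ITD, source equidistant from both ears
print(itd_seconds(90))  # ~6.7e-4 s (~670 microseconds), source fully to one side
```

Even at its maximum, the ITD is under a millisecond, which is why the brain needs dedicated coincidence-detecting neurons to read it.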
Interaural level difference (ILD)
difference in sound pressure level reaching the right ear and the left ear (intensity of sound)
ITD is for…
low-frequency sounds (timing differences become ambiguous at high frequencies)
ILD is for…
high-frequency sounds; the head casts an acoustic shadow, decreasing the sound level at the farther ear
ILD and ITD are not effective for…
elevation judgements
ITD Detectors
neurons that fire when signals reach them from both ears
Brain areas involved in locating sound
A1 and posterior belt area
Two auditory processing pathways
dorsal auditory processing stream and ventral auditory processing stream
Dorsal auditory processing stream
where pathway—toward the parietal lobe
Ventral auditory processing stream
what pathway—temporal lobe
Both processing streams lead to…
the frontal lobe
Evidence methods for auditory processing pathways
Lesion method, neuroimaging, neural recordings
Direct sound
sounds that reach the ear directly
Indirect sounds
sound that reaches the ear after reflecting off of another surface
Problem with direct vs. indirect
both direct and indirect sounds may hit the ear at different times; do we hear them as separate sounds?
Precedence effect
when two identical or similar sounds reach the ears within a very short time gap, we perceive only the first sound and hear it as coming from that sound's location
When more than one sound is detected…
precedence effect makes us perceive only the first sound
Architectural acoustics
study of how sounds reflect in rooms
Auditory scene analysis (ASA)
process where the sound stimuli made by each source are separated into distinct perceptual streams
Two types of grouping
simultaneous and sequential
Simultaneous grouping
situation where sounds are perceptually grouped together because they occur at the same time
Sequential grouping
grouping when sounds come one after another in time
3 cues for simultaneous grouping in ASA
location, onset synchrony, timbre and pitch
Onset synchrony
if the onset is at different times, likely from different sources; if onset at same time, likely grouped together
Timbre and pitch
sounds with the same timbre and pitch are likely produced by the same source
5 cues for sequential grouping in ASA
Gestalt principle of proximity, similarity of pitch, auditory continuity, experience, auditory stream segregation
Auditory continuity (principle of good continuity)
sounds that stay constant or change smoothly are usually from the same source
Melody schema
representation of a familiar melody in memory (part of experience in ASA)
Auditory Stream Segregation
perception of a string of sounds as belonging together
Acoustic stimulus/signal
pattern of frequencies and intensities of the sound stimulus
Sound produced is based on…
the shape of the vocal tract and articulators
Articulators
tongue, lips, teeth, etc.
Speech sounds are described by (3)
manner of articulation, place of articulation, voice onset time
Manner of articulation
interaction of articulators in speech production
Place of articulation
location of articulators during a speech sound
Voice onset time (VOT)
time delay between when a sound begins and when the vocal cords start vibrating
Phoneme
shortest segment of speech that, if changed, changes the meaning of a word
Phonemes are the…
basic units of speech (ex. /b/, /a/, /t/)
Acoustic signals are…
variable → sounds can change properties frequently but represent the same thing
There is no simple relationship between a…
phoneme and acoustic signal
2 sources of variability in acoustic signal
context and pronunciation
Forms of context variability
coarticulation and perceptual constancy
Forms of pronunciation variability
between speaker differences and within speaker differences
Coarticulation
overlapping articulation that happens when different phonemes follow each other in speech (ex. /b/ in boot different from /b/ in beat)
Perceptual constancy
we perceive the sound of a phoneme the same despite acoustic signals being different from coarticulation
Variability from pronunciation
changes the acoustic signal but not the meaning of the word
Between speaker differences
from person to person saying words differently (ex. regional accents)
Within speaker differences
within individual person, talking differently in different contexts (ex. laryngitis losing your voice, baby talk)
Application of variability in pronunciation
differences in IDS and ADS
Infant directed speech (IDS)
“baby talk” with special characteristics that attract an infant's attention, making it easier for the infant to recognize individual words (also called motherese or parentese)
IDS differs from adult directed speech by being…
slower, higher in pitch with a larger pitch range, more separated and repeated words, and more positive affect
4 sources for speech perception
acoustic signal, visual info from face/lip movements of others, knowledge of language, experience
Phonemic restoration effect
occurs when listeners perceive a phoneme in a word even though the phoneme is obscured by another sound (ex. white noise, cough)
Forms of knowledge of language
phonemic restoration effect and knowing meaning of words in sentences
Experience in speech perception
speech perception improved with more experience even with distortion or an accent
6 brain areas for language
Broca’s area
Wernicke’s area
Facial area of primary motor cortex
Angular gyrus
Arcuate fasciculus
A1 and V1
Broca’s area
speech production
Wernicke’s area
speech comprehension
Angular gyrus
reading and writing
Arcuate fasciculus
white matter tract connecting Broca’s and Wernicke’s areas
Lateralization of language function
language is localized to the left hemisphere in the majority of right-handed individuals
Left hemisphere is the…
language dominant hemisphere