LECTURE 3 BOOK

Introduction
- Standard dictionary definition of language: "the particular form of speech of a group of people."
- Cognitive neuroscience definition: "a symbolic system used to communicate concrete or abstract meanings, irrespective of the sensory modality employed or the particular means of expression." This includes spoken, written, and sign language.
- Speech and language are intensively studied because of their importance in human societies.
- Languages vary widely (6,000-7,000 in use), but all include a vocabulary, a grammar, and syntax rules.
- Written forms exist for only about 200 languages.
- Animal communication systems are studied to understand the origins of language.

Speech

Producing Speech
- The human vocal tract extends from the larynx to the lips (Figure 12.1A).
- Air from the lungs passes through the glottis, the opening between the vocal cords.
- Acceleration of the airstream lowers the local pressure, causing the vocal cords to vibrate (Bernoulli's principle).
- Vibration frequency is determined by vocal cord tension, typically about 100-400 Hz; this fundamental frequency varies with the gender and size of the speaker.
- The vocal tract shapes and filters the sound, much like a guitar body.
- Speech formants are the peaks of acoustic power produced by this source-filter mechanism (Figure 12.1B).
- The shape of the vocal tract is changed by muscles of the pharynx, tongue, and lips, producing different speech sounds and formant frequencies; the relative formant frequencies define the voiced speech sounds.
- Source-filter model of speech (a brief code sketch of this model appears at the end of this section):
  - Lungs: air reservoir
  - Diaphragm and chest muscles: motive force
  - Vocal cords: periodic vibration for voiced sounds
  - Pharynx and oral/nasal cavities: filter

Comprehending Speech
- Phones are the basic speech sound stimuli; phonemes are the perceptions of phones.
- Syllables are made up of one or more phones, words are made up of syllables, and sentences are made up of words.
- About 200 phones exist, with 30-100 used in any given language.
- Difficulty learning new languages arises largely from unfamiliar phones; accents persist when a second language is learned after about age 8 because phoneme production and perception are already entrenched.
- Phones are divided into vowels and consonants; English has about 40 phones, nearly equally divided between the two.
- Vowel sounds: voiced elements generated by vocal cord oscillations; tonal qualities that elicit pitch perception; carry the majority of the acoustic power in speech.
- Consonant sounds: begin and/or end syllables; are briefer, with rapid changes in energy; are categorized by place and manner of articulation.
- Click languages use consonants produced by double closures of the tongue (clicks).
- Consonants carry the main information in speech.

Interpreting Speech Sounds
- Speech does not have discrete breaks between syllables or words; neural processing proceeds holistically.
- Speech percepts are actively created, not simply neural translations of the physical stimuli.
- Eye movements during reading do not follow syllabic or word boundaries; syllables and words are not natural units of speech processing.
- Alvin Liberman proposed that we perceive underlying "vocal gestures" corresponding to vocal tract movements.
- Coarticulation: vocal-tract changes overlap in time and influence each other.
- The acoustic characteristics of phones overlap between speakers.
- Speech perception relates more to vocal intention and meaning than to the physical sound itself.

Sentences, Grammar, and Syntax
- Sentences are word sequences that express a complete thought.
- Grammar: the rules by which words are formed and combined. Syntax: the rules describing combinations of grammatically correct words and phrases.
- Grammar and syntax change over time and vary among languages; English uses subject-verb-object order, but other arrangements exist.
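The source-filter model above lends itself to a small worked example. The following Python sketch is only illustrative, not anything given in the lecture or textbook: a periodic "source" (a pulse train at an assumed fundamental frequency of 120 Hz, standing in for vocal cord vibration) is passed through two simple resonators whose assumed center frequencies (700 Hz and 1,100 Hz, roughly the first two formants of a vowel such as /a/) stand in for the vocal-tract filter. The helper name (resonator) and all numeric values are illustrative choices.

```python
# Minimal sketch of the source-filter idea: pulse-train source + formant filter.
# All parameter values below are assumptions chosen for illustration only.
import numpy as np

fs = 16000                           # sampling rate (Hz)
dur = 0.5                            # seconds of synthesized sound
f0 = 120                             # assumed fundamental frequency of the "vocal cords"
formants = [(700, 80), (1100, 90)]   # assumed (center frequency, bandwidth) pairs in Hz

# Source: impulse train at the fundamental frequency (one pulse per glottal cycle).
n = int(fs * dur)
source = np.zeros(n)
period = int(fs / f0)
source[::period] = 1.0

def resonator(x, fc, bw, fs):
    """Second-order all-pole resonator: boosts energy near fc (a crude formant)."""
    r = np.exp(-np.pi * bw / fs)          # pole radius set by the bandwidth
    theta = 2 * np.pi * fc / fs           # pole angle set by the center frequency
    a1, a2 = -2 * r * np.cos(theta), r * r
    y = np.zeros_like(x)
    for i in range(len(x)):               # y[i] = x[i] - a1*y[i-1] - a2*y[i-2]
        y[i] = x[i]
        if i >= 1:
            y[i] -= a1 * y[i - 1]
        if i >= 2:
            y[i] -= a2 * y[i - 2]
    return y

# Filter: cascade of resonators standing in for the vocal tract.
signal = source
for fc, bw in formants:
    signal = resonator(signal, fc, bw, fs)

signal = signal / np.max(np.abs(signal))  # normalize amplitude
```

Changing the formant frequencies (the filter) changes which vowel-like sound results, while changing f0 (the source) changes the perceived pitch, which is the essential point of the source-filter account.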
The Importance of Context
- Phone, syllable, word, phrase, and sentence meanings are often ambiguous.
- Homonyms: words with the same spelling/sound but multiple meanings (e.g., bank). Homophones: words with the same sound but different meanings and spellings (e.g., kernel/colonel).
- Understanding depends on context and experience.
- William Bagley's work showed that correct identification of syllables depends on their immediate surroundings; words are easier to recognize in sentences, and recognition increases with usage frequency.
- McGurk effect: what we see influences what we hear. Speech sounds are influenced by seen lip and tongue movements, and the integration occurs in the region of the superior temporal sulcus.
- Speech perception is based on the empirical significance of speech sounds within a broader context.

Acquiring Speech and Language

Learning a Vocabulary
- Learning a language is a remarkable feat that requires knowing word meanings (vocabulary acquisition).
- The Oxford English Dictionary includes ~500,000 words; vocabulary is in constant flux, with words being lost and added.
- A highly verbal person knows ~50,000 words, but only ~10,000 are used in ordinary discourse.
- Learning also involves grammar and syntax, complicated by context, and proceeds largely by trial and error in infancy and childhood.

The Shaping of Phonemes and Phones
- An infant's perception and production of speech are shaped by the sounds heard from early postnatal life.
- Languages use different subsets of the ~200 phones.
- Infants can initially perceive and discriminate among all speech sounds; this ability diminishes, causing difficulties for older children and adults in perceiving and uttering unfamiliar phones.
- Native Japanese speakers cannot reliably distinguish /r/ and /l/ in English, yet 4-month-old Japanese infants can make this discrimination.
- Infants show preferences for native-language phonemes by 6 months; by the end of the first year, they no longer respond to non-native phonetic elements.
- "Baby talk" (motherese) emphasizes phonetic distinctions, helping infants learn.
- The loss of acoustic discrimination is specific to speech sounds; adults can still discriminate non-speech sounds with similar acoustic characteristics.

A Critical Period for Language Acquisition
- The ability to learn another language fluently persists for only some years; fluent learning requires experience relatively early in life.
- Neural circuitry is especially susceptible to modification during early development; this malleability diminishes with maturation.
- The critical (sensitive) period is the window during which extensive neural modification is possible.
- Jacqueline Johnson and Elissa Newport studied second-language acquisition in Asian Americans: learning before age 7 results in native-speaker performance.
- Effects on language skills are more marked when deafness occurs early in life.
- Differences in brain activation between children and adults indicate the pertinent neural regions.
- Normal acquisition is subject to a roughly decade-long critical period; some ability to learn persists into adulthood, but at a reduced level.
- Early experience is important for cognitive abilities generally.

Mechanisms of Language Learning
- Extensive exposure and practice are the most obvious aspects: proficiency requires repeated activation of the relevant neural circuits.
- Exposure and practice strengthen language-relevant circuits, while absence of exposure weakens the connections representing non-native sounds; used circuitry is retained, unused circuitry weakens.
- These changes arise from neural activity influencing synaptic connections.
- Paul Bloom suggests children can