multimodal view of language
language is a form of social interaction and joint action
speech, prosody, vocalisations, mouth movements, eye gaze, gestures
gesture first hypothesis
iconicity as the bridge to language
shifts in larynx and breathing control allowed control of sounds
language evolution Levinson and Holler 2014
ritualised gestures - first part of an action
pointing - draws attention
iconic gestures (Homo erectus)
speech and gestures (Homo heidelbergensis)
child gestures Rowe and Goldin Meadow 2009
gesture vocab at 18mo predicts spoken vocab at 42mo
gesture and speech combinations at 18mo predict syntactic ability at 42mo
quality of caregiver input Cartmill et al 2013
high quality input = when an adult can guess a beeped-out word from the parent's gestures and the environment
high quality input at 18mo predicts the child's vocab 3 years later
concurrent vs displaced learning Motamedi et al 2024
cues for absent objects are more cognitively demanding but essential for abstract language dev
parents shift from pointing at present objects to iconic gestures as the child gets older
gestures
integrated with speech
convey more spatial info
follow language-specific timing constraints
time locked to speech
support prediction of speech
processed automatically along with speech
beat gestures are synchronised with stressed syllables
why do people gesture?
if learned - blind people wouldn't gesture
if to help the listener - speakers shouldn't gesture when talking to blind people
if to help thinking and speaking - speakers should always gesture
why do people gesture Iverson and Goldin Meadow 1999 2001
sighted people gesture more when blindfolded, as they are working harder to visualise concepts in mind
= gesture helps package visuospatial info to be translated into speech