What is Language?
Symbolic, Structured, Generative
Symbolic Nature of Language
Language relies on symbols to represent concepts, objects, actions, and ideas
Structural Nature of Language
We organise sentences according to syntax; when we don't, a sentence can become difficult to understand or simply look off in some way
Syntax
The set of rules governing word order within sentences in a language
Semantics
The study of meaning in language
Behaviourist View
You learn words and grammar because you hear them over and over again; the brain is trained through repetition, like forming a habit
Chomsky’s Universal Grammar
Children acquire language astonishingly fast. If language were learned purely through conditioning, we would expect slow progress and more errors, but this is not the case. In addition, children hear only a small, restricted sample of language, too little to learn a full grammar so quickly. These observations led to the theory of Universal Grammar.
Language Acquisition Device
The brain's built-in language toolkit: an innate mental mechanism that gives humans the ability to acquire language. It contains the universal principles of grammar that help children learn any human language.
Support of UG
Uniformity/convergence: language is universal, and human languages share certain properties. Across cultures, children acquire language in the same predictable stages and in the same order.
Poverty of the stimulus: simply being exposed to the world isn't enough to explain how quickly and effortlessly children learn language.
Contributions of UG
Shifted focus to internal mental processes.
Connected nature and nurture
Influenced psycholinguistics/psychology: provided a framework for studying how the mind represents and processes language
Criticism of UG
Chomsky claims languages are underlyingly identical, which downplays diversity; typologists show that languages differ fundamentally in sound, grammar, lexicon, and meaning.
Underestimation of input: children receive vast amounts of language (a conservative estimate is 42 million words between ages 1 and 5). In addition, research shows that acquisition is not just a matter of exposure plus a UG: mere input does not guarantee acquisition.
Large Language Models
Built using different 'architectures', meaning different ways of organising how the model learns and processes language
Autoregressive models
Generate text one token at a time (a token is usually a word or part of a word). The model looks at all the previous tokens, guesses what the next token should be using probability, and then adds that token to the text.
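The loop described above can be sketched in a few lines of Python. This is a toy illustration, not a real LLM: it uses a tiny, made-up bigram probability table (each token's possible next tokens and their probabilities are invented for the example), but the generation step is genuinely autoregressive: look at the previous token, sample the next one by probability, append, repeat.

```python
import random

# Toy "model": for each token, a made-up probability distribution over
# possible next tokens. A real LLM learns these from huge datasets and
# conditions on the full context, not just the last token.
bigram_probs = {
    "<start>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.4, "ran": 0.6},
    "sat": {"<end>": 1.0},
    "ran": {"<end>": 1.0},
}

def generate(max_tokens=10, seed=0):
    """Generate text one token at a time, autoregressively."""
    rng = random.Random(seed)
    tokens = ["<start>"]
    for _ in range(max_tokens):
        # Look at the previous token's distribution over next tokens,
        choices = bigram_probs[tokens[-1]]
        # sample the next token according to its probability,
        next_token = rng.choices(list(choices), weights=list(choices.values()))[0]
        if next_token == "<end>":
            break
        # and append it to the text before repeating the loop.
        tokens.append(next_token)
    return " ".join(tokens[1:])

print(generate())  # e.g. "the cat sat"
```

Real models replace the lookup table with a neural network that scores every token in its vocabulary given all preceding tokens, but the generate-one-token-then-append loop is the same.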
AI insight into human language
Show that language-like patterns can emerge from statistical learning over large datasets, without innate grammar. This suggests that language may not be entirely hardwired or exclusively innate; experience and input may play a larger role than previously assumed.
What AI does well
Often generate coherent sentences
Follow stylistic patterns
Produce text very quickly
Imitate tone or genre
What AI doesn’t do well
Often dismissed as 'glorified statistical parrots': they do not think and have no genuine creativity.