Flashcards covering key concepts from the lecture notes on Artificial Intelligence, Applied AI, Generative AI, Narrow AI, Expert Systems, Data Ethics, Text as Data, Distant Reading, and NLP Core Concepts such as Tokenization, Frequency Analysis, KWIC, N-Grams, Collocations, Keyness, and Topic Modeling, with the NLP methods organized by their descriptive, comparative, and predictive functions.
What is Artificial Intelligence (AI)?
The field of computer science focused on building systems that perform tasks requiring human-like intelligence (learning, reasoning, problem-solving, and language use).
What is Applied AI?
The practical use of AI systems in real-world tasks (e.g., voice assistants, recommendation engines, text analysis tools).
What is Generative AI?
A type of AI that creates new content (text, images, audio, video) based on patterns it has learned from large datasets.
What is Narrow AI (ANI)?
AI designed to perform one specific task very well (e.g., spam filters, translation, image recognition).
What are Expert Systems in AI?
Early AI systems built around explicit rules and “if-then” logic for decision-making in specialized domains.
What is Data Ethics?
The responsibility to handle data with fairness, accountability, transparency, and respect for privacy.
What does 'Text as Data' mean?
Treating language as measurable information that can be counted, compared, and analyzed mathematically, where words become “tokens” and patterns become data.
What is Distant Reading?
Analyzing large collections of texts through computational or statistical methods to see patterns, rather than reading individual texts closely.
How does AI reflect the idea of 'digital information'?
Methods like tokenization and frequency analysis represent language as numbers, demonstrating the idea that any information can be expressed numerically.
What ethical risks are associated with interpreting themes from topic models?
Misinterpreting or overstating what the statistics actually show, along with the question of who decides how the resulting themes are labeled and interpreted.
How does the move from descriptive to predictive analysis shape interaction with language data?
It might give an illusion of understanding or open new ways to 'vibe' with text, depending on interpretation.
What is Tokenization in NLP?
Breaking text into smaller units called tokens (typically words, word pieces, or punctuation marks).
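A minimal sketch of regex-based word tokenization in Python; the pattern and function name are illustrative assumptions, and real toolkits such as NLTK or spaCy apply more sophisticated rules:

```python
import re

def tokenize(text):
    # Lowercase the text and pull out alphabetic word tokens,
    # allowing internal apostrophes (e.g., "don't").
    return re.findall(r"[a-z]+(?:'[a-z]+)*", text.lower())

print(tokenize("It was the best of times, it was the worst of times."))
# ['it', 'was', 'the', 'best', 'of', 'times',
#  'it', 'was', 'the', 'worst', 'of', 'times']
```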
What is Frequency Analysis in NLP?
Counting how often each word appears in a text or corpus.
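A frequency count is just a tally over tokens; this sketch uses Python's collections.Counter on the toy token list from the tokenization example above:

```python
from collections import Counter

tokens = ["it", "was", "the", "best", "of", "times",
          "it", "was", "the", "worst", "of", "times"]

counts = Counter(tokens)      # word -> number of occurrences
print(counts.most_common(3))  # e.g. [('it', 2), ('was', 2), ('the', 2)]
```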
What is KWIC (Key Word in Context)?
Displaying each occurrence of a search word together with the words that surround it.
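One way to sketch a KWIC concordance, assuming already-tokenized input; the function name and default window size are illustrative:

```python
def kwic(tokens, keyword, window=3):
    # For each hit, gather `window` tokens of context on each side.
    lines = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            lines.append(f"{left:>20} [{keyword}] {right}")
    return lines

tokens = "it was the best of times it was the worst of times".split()
for line in kwic(tokens, "times"):
    print(line)
# the hits, aligned on the keyword:
#          the best of [times] it was the
#         the worst of [times]
```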
What are N-Grams in NLP?
Contiguous sequences of n words (bigrams, trigrams, etc.), often used to identify common phrases.
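Extracting n-grams amounts to sliding a window over the token list; a minimal sketch (the helper name is an illustrative assumption):

```python
def ngrams(tokens, n):
    # Slide a window of length n across the token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "it was the best of times".split()
print(ngrams(tokens, 2))
# [('it', 'was'), ('was', 'the'), ('the', 'best'),
#  ('best', 'of'), ('of', 'times')]
```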
What are Collocations in NLP?
Words that appear together more often than expected.
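"More often than expected" is usually made precise with an association score; this sketch uses pointwise mutual information (PMI) over bigrams. The function name, min_count threshold, and toy corpus are invented for illustration:

```python
import math
from collections import Counter

def pmi_collocations(tokens, min_count=2):
    # PMI = log2( P(w1, w2) / (P(w1) * P(w2)) ): high scores mean
    # the pair co-occurs more often than chance would predict.
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    scores = {
        (w1, w2): math.log2((c / n) / ((unigrams[w1] / n) * (unigrams[w2] / n)))
        for (w1, w2), c in bigrams.items() if c >= min_count
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

tokens = "new york is big and new york never sleeps".split()
print(pmi_collocations(tokens)[0])  # ('new', 'york') scores highest
```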
What is Keyness in text analysis?
Determining what makes one text distinctive compared to another.
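Keyness is commonly scored with a statistic such as Dunning's log-likelihood (G2), which compares a word's frequency in a target corpus against a reference corpus; a minimal sketch, with the function name and toy corpora as illustrative assumptions:

```python
import math
from collections import Counter

def keyness(target, reference):
    # Dunning log-likelihood: how unexpectedly frequent is each
    # target-corpus word, given the two corpora combined?
    tc, rc = Counter(target), Counter(reference)
    tn, rn = len(target), len(reference)
    scores = {}
    for word, a in tc.items():
        b = rc.get(word, 0)
        e1 = tn * (a + b) / (tn + rn)   # expected count in target
        e2 = rn * (a + b) / (tn + rn)   # expected count in reference
        g2 = 2 * (a * math.log(a / e1) + (b * math.log(b / e2) if b else 0))
        scores[word] = g2
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

whaling = "whale sea ship whale harpoon sea whale".split()
romance = "love heart love letter heart love rose".split()
print(keyness(whaling, romance)[0][0])  # 'whale'
```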
What is Topic Modeling?
Discovering clusters of words that suggest themes across documents.
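A minimal sketch using scikit-learn's LatentDirichletAllocation (one common topic-modeling algorithm; others exist), assuming a recent scikit-learn is installed. The four-document corpus is invented for illustration:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the ship sailed across the stormy sea",
    "sailors feared the sea and the storm",
    "the court heard the case and the judge ruled",
    "the judge and the jury decided the case",
]

# Build a document-term matrix, then fit LDA with 2 topics.
vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

# Print the top words for each discovered topic.
words = vec.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top = [words[j] for j in weights.argsort()[::-1][:4]]
    print(f"Topic {i}: {', '.join(top)}")
```

With a corpus this tiny the topics are unstable; meaningful topic modeling requires many documents.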
What is the purpose of Descriptive methods in text analysis (e.g., Tokenization, Frequency, KWIC, N-Grams, Collocations)?
They show what’s common and how language is structured, answering questions like 'What words/phrases appear most often?'
What is the purpose of Comparative methods in text analysis (e.g., Keyness)?
They show what’s distinctive between corpora, answering questions like 'What makes one set of texts different from another?'
What is the purpose of Predictive/Inferential methods in text analysis (e.g., Topic Modeling)?
They surface hidden themes and structures, answering questions like 'What themes organize this collection?'