What does GPT stand for?
Generative Pre-trained Transformer.
What is a Transformer architecture?
A type of deep-learning model that processes entire input sequences in parallel, weighing the importance of different words.
How do Transformers differ from older models?
Transformers analyze entire sequences at once using attention mechanisms.
How does parallel processing benefit Transformers?
speeds up training
allows handling of large datasets
enables scalability to massive models
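The attention mechanism behind this parallelism can be sketched in a few lines. This is a minimal, illustrative scaled dot-product self-attention in plain NumPy (toy dimensions, random vectors), not any model's actual implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Score every query against every key at once -- the whole
    sequence is processed in parallel, not word by word."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq, seq) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                # weighted mix of values

# Toy sequence: 3 tokens, each a 4-dimensional vector.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)           # self-attention
print(out.shape)  # (3, 4): one updated vector per token
```

Every row of the score matrix is computed in one matrix multiply, which is why training parallelizes so well on GPUs.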
What is an LLM (Large Language Model)?
An AI system capable of understanding and generating human-like text by processing vast amounts of text data.
What makes LLMs "large"?
The number of model parameters and the massive amount of training data.
What is an n-gram model?
A statistical model that predicts the next word based on a fixed number (N) of previous words
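For contrast with LLMs, an n-gram model is simple enough to build directly. This is a toy bigram model (N=2) over a made-up corpus, predicting the next word from the single previous word:

```python
from collections import Counter, defaultdict

# Tiny made-up training corpus for illustration.
corpus = "the cat sat on the mat the cat ate".split()

# Count which word follows which: counts[prev][next] = frequency.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict(prev_word):
    # Return the most frequent follower seen in training.
    return counts[prev_word].most_common(1)[0][0]

print(predict("the"))  # 'cat' -- follows "the" twice, vs "mat" once
```

The fixed window is the model's core limitation: it can never use context further back than N-1 words, which is exactly what Transformer attention removes.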
What are tokens in LLMs?
Units of text that LLMs process and generate.
A short word is typically 1 token; a longer word may split into 2-3 tokens.
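A toy splitter can illustrate the short-word vs. long-word behavior. This is not a real LLM tokenizer (real ones learn subword vocabularies such as BPE); it just chunks words into at most 4 characters so short words stay whole and long words split:

```python
def toy_tokenize(text):
    """Illustrative only: chunk each word into pieces of <= 4 chars."""
    tokens = []
    for word in text.split():
        tokens += [word[i:i + 4] for i in range(0, len(word), 4)]
    return tokens

print(toy_tokenize("cat"))                    # ['cat'] -- one token
print(toy_tokenize("internationalization"))   # several subword-like pieces
```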
What are word embeddings?
Numerical vector representations of words that capture their meaning, context, and relationships.
How do word embeddings help LLMs?
They improve language understanding
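"Capturing relationships" can be made concrete with cosine similarity between vectors. These 3-dimensional embeddings are made up for illustration (real models use hundreds or thousands of dimensions):

```python
import numpy as np

# Hypothetical toy embeddings: related words get nearby vectors.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 = same direction, 0.0 = unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(emb["king"], emb["queen"]))  # high: similar meanings
print(cosine(emb["king"], emb["apple"]))  # low: unrelated meanings
```

Because meaning is encoded as geometry, the model can generalize: words it has seen in similar contexts end up near each other.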
What is Generative AI (GenAI)?
A broad category of AI that creates new content, including text, images, audio, video, and 3D models
What is Prompt Engineering?
The practice of designing and optimizing prompts to get the best possible responses from LLMs
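A prompt template makes the idea tangible. This hypothetical example layers three common prompt-engineering ingredients (a role, a constraint, and a worked example) around the user's question:

```python
def build_prompt(question: str) -> str:
    """Wrap a bare question in role, constraint, and example text."""
    return (
        "You are a concise technical tutor.\n"          # role
        "Answer in at most two sentences.\n"            # constraint
        "Example: Q: What is an API? "                  # worked example
        "A: An interface that lets programs talk to each other.\n"
        f"Q: {question} A:"
    )

print(build_prompt("What is a token?"))
```

The same question phrased this way typically yields more consistent answers than the bare question alone.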
What is RAG?
Retrieval-Augmented Generation.
A process where LLMs retrieve relevant information from external sources to improve response accuracy and contextual relevance.
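The retrieve-then-generate flow can be sketched end to end. This toy version uses keyword overlap over a made-up three-document corpus and builds the augmented prompt; real RAG systems use embedding similarity for retrieval and pass the prompt to an actual LLM:

```python
# Hypothetical knowledge base.
docs = [
    "GPT stands for Generative Pre-trained Transformer.",
    "Word embeddings are numerical vector representations of words.",
    "RAG retrieves external documents to ground LLM answers.",
]

def retrieve(query, k=1):
    """Rank documents by how many query words they share (toy scoring)."""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_rag_prompt(query):
    """Prepend the retrieved context so the LLM can ground its answer."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(build_rag_prompt("What does GPT stand for?"))
```

The accuracy gain comes from the context block: the model answers from retrieved text instead of relying only on what it memorized during training.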