Generative Artificial Intelligence (GenAI)


13 Terms

1

What does GPT stand for?

Generative Pre-trained Transformer.

2

What is a Transformer architecture?

A type of deep-learning model that processes entire input sequences in parallel, weighing the importance of different words.

3

How do Transformers differ from older models?

Older recurrent models (e.g., RNNs and LSTMs) process text one token at a time; Transformers analyze entire sequences at once using attention mechanisms.
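The attention mechanism behind the cards above can be sketched in a few lines. This is a minimal, illustrative version of scaled dot-product attention with toy random data; the shapes and values are assumptions for demonstration, not a full Transformer layer.

```python
# Minimal sketch of scaled dot-product attention, the core Transformer
# mechanism. Toy data only: 3 tokens, embedding dimension 4.
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    # Every query is scored against every key at once -- this is why
    # Transformers can weigh all positions of a sequence in parallel.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into attention weights that sum to 1 per row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))      # 3 token embeddings (random toy values)
out = attention(x, x, x)         # self-attention: Q = K = V
print(out.shape)                 # one context-mixed vector per token
```

Because the score matrix covers all token pairs simultaneously, nothing has to wait for earlier positions, which is what enables the parallelism discussed in the next card.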

4

How does parallel processing benefit Transformers?

  1. speeds up training

  2. allows handling of large datasets

  3. enables scalability to massive models

5

What is an LLM (Large Language Model)?

An AI system capable of understanding and generating human-like text, trained on vast amounts of text data.

6

What makes LLMs "large"?

The number of model parameters and the massive amount of training data.

7

What is an n-gram model?

A statistical model that predicts the next word based on a fixed number (N) of previous words.
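The n-gram idea fits in a few lines of code. Below is a toy bigram model (N = 2) that predicts the next word from the single previous word; the tiny corpus is invented for illustration, not a real dataset.

```python
# Toy bigram (N=2) model: predict the next word from the previous word.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict(prev_word):
    # Return the most frequent follower of prev_word in the corpus.
    return counts[prev_word].most_common(1)[0][0]

print(predict("the"))  # "cat" follows "the" twice, "mat" once
```

A larger N gives more context but needs exponentially more data, which is one reason neural LLMs displaced n-gram models.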

8

What are tokens in LLMs?

The units of text that LLMs process and generate.

A short word is typically 1 token; a longer word may be split into 2-3 tokens.
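Splitting words into tokens can be sketched with a greedy longest-match tokenizer. The vocabulary below is invented for illustration; real LLMs learn theirs from data (e.g., via byte-pair encoding), so the exact splits differ.

```python
# Toy greedy tokenizer: split a word into the longest known subword pieces.
# VOCAB is a hand-made stand-in for a learned subword vocabulary.
VOCAB = {"un", "break", "able", "cat"}

def tokenize(word):
    tokens, i = [], 0
    while i < len(word):
        # Take the longest vocabulary piece matching at position i.
        for j in range(len(word), i, -1):
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # fall back to a single character
            i += 1
    return tokens

print(tokenize("cat"))          # short word -> 1 token
print(tokenize("unbreakable"))  # longer word -> 3 tokens
```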

9

What are word embeddings?

Numerical vector representations of words that capture their meaning, context, and relationships.
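"Capturing meaning and relationships" becomes concrete with cosine similarity. The 3-D vectors below are hand-made assumptions for illustration; real embeddings have hundreds of learned dimensions.

```python
# Toy word embeddings: similar words get nearby vectors, so their
# cosine similarity is high. Vectors are invented for illustration.
import math

emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.1, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine(emb["king"], emb["queen"]))  # high: related meanings
print(cosine(emb["king"], emb["apple"]))  # lower: unrelated words
```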

10

How do word embeddings help LLMs?

They give the model a numerical notion of meaning: words with related meanings get nearby vectors, so the model can generalize across similar words and contexts.

11

What is Generative AI (GenAI)?

A broad category of AI that creates new content, including text, images, audio, video, and 3D models.

12

What is Prompt Engineering?

The practice of designing and optimizing prompts to get the best possible responses from LLMs.
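A common prompt-engineering pattern is a template that adds a role, constraints, and worked examples ("few-shot" prompting) around the user's question. The template wording and the `build_prompt` helper below are illustrative assumptions, not a prescribed format.

```python
# Sketch of a few-shot prompt template. The role line, constraints, and
# Q/A formatting are all design choices the prompt engineer tunes.
def build_prompt(question, examples):
    shots = "\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
    return (
        "You are a concise technical assistant. Answer in one sentence.\n\n"
        f"{shots}\n"
        f"Q: {question}\nA:"
    )

prompt = build_prompt(
    "What does GPT stand for?",
    [("What does LLM stand for?", "Large Language Model.")],
)
print(prompt)
```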

13

What is RAG?

Retrieval Augmented Generation.

A process where LLMs retrieve relevant information from external sources to improve response accuracy and contextual relevance.
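The retrieve-then-generate flow can be sketched end to end. This toy version scores documents by word overlap (real systems typically use embedding similarity) and prepends the best match to the prompt; the document list and helper names are assumptions for illustration.

```python
# Minimal RAG sketch: retrieve the best-matching document, then build a
# prompt that grounds the LLM's answer in that retrieved context.
DOCS = [
    "GPT stands for Generative Pre-trained Transformer.",
    "RAG retrieves external documents to ground LLM answers.",
    "Word embeddings map words to numerical vectors.",
]

def retrieve(query):
    # Toy relevance score: count of shared lowercase words.
    q = set(query.lower().split())
    return max(DOCS, key=lambda d: len(q & set(d.lower().split())))

def build_rag_prompt(query):
    context = retrieve(query)
    return f"Context: {context}\n\nQuestion: {query}\nAnswer:"

print(build_rag_prompt("What does GPT stand for?"))
```

Because the retrieved passage sits in the prompt, the model can answer from it instead of relying only on what it memorized during training.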