Intro to AI Unit 4


Last updated 4:02 PM on 12/12/25

20 Terms

1

What do most machine learning systems use?

Neural networks.

2

What makes RNNs special?

They understand order in data and can remember previous information.

3

What are RNNs useful for?

Sequential data like language, speech, and time-based data where order matters.

4

What makes RNNs useful for sequential data?

They keep track of old information and use it to give more context to new input.

5

How are RNNs different from feed-forward networks?

Feed-forward networks only move forward; RNNs loop back to reuse previous information.
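The "loop back" above can be sketched in a few lines of NumPy. This is a toy illustration with made-up sizes (3-dim inputs, 4-dim hidden state) and random, untrained weights, not a real model: each step mixes the current input with the previous hidden state, so earlier inputs influence later ones.

```python
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 4))   # input -> hidden weights
W_hh = rng.normal(size=(4, 4))   # hidden -> hidden: the "loop back"
h = np.zeros(4)                  # hidden state = the network's memory

def rnn_step(x, h):
    # The new state depends on both the current input and the old state,
    # which is how the RNN "remembers" previous information.
    return np.tanh(x @ W_xh + h @ W_hh)

# Process a 3-step sequence one item at a time (this sequential loop
# is exactly what makes RNNs slow compared to transformers).
for x in rng.normal(size=(3, 3)):
    h = rnn_step(x, h)

print(h.shape)  # (4,)
```

A feed-forward network would drop the `h @ W_hh` term, so each input would be processed with no memory of the ones before it.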

6

What’s one major problem with RNNs?

They can only process one word or token at a time and forget earlier information in long sequences.

7

What is a transformer?

A type of neural network that, like an RNN, processes sequential data, but more efficiently.

8

Why were transformers created?

RNNs used too much memory and processed information slowly, one item at a time.

9

What idea is at the core of transformers?

Attention—the model learns which parts of a sequence matter most to each other.

10

What does “attention” mean?

Measuring the relative importance of parts of the input to one another (e.g., which words in a sentence are most related).
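Scaled dot-product attention, the mechanism this card describes, can be sketched in NumPy. The dimensions (5 "words", 8 features each) and random inputs are made up for illustration: each row of the weight matrix says how much each word attends to every other word, and the whole sequence is handled in one matrix multiply rather than step by step.

```python
import numpy as np

def attention(Q, K, V):
    # Scores measure how related each query is to each key.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into weights that sum to 1 per row.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of all values at once.
    return w @ V, w

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 8))       # 5 "words", 8-dim each
out, w = attention(X, X, X)       # self-attention: Q = K = V = X
print(out.shape)                  # (5, 8)
```

Because `Q @ K.T` compares every word with every other word in a single operation, there is no sequential loop to wait on, which is the parallelization advantage the later cards mention.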

11

What paper introduced transformers?

“Attention Is All You Need.”

12

What is the main advantage of transformers over RNNs?

  • They can process entire sequences in parallel (parallelization).

  • They use less memory.

  • They don’t forget earlier information.

13

What happens when transformers remove the recurrence from RNNs?

The model can analyze the whole text at once, using only attention.

14

Why are transformers faster?

They perform many calculations simultaneously instead of one step at a time.

15

What types of data do transformers handle well?

Text, speech, and images—any sequence where relationships between elements matter.

16

What are most machine learning systems built with?

Neural networks.

17

What is a Recurrent Neural Network (RNN)?

A neural network that understands order or sequence in data.

18

How are RNNs and transformers related?

Transformers build on the same core idea as RNNs but are more efficient.

19

Why are transformers more efficient than RNNs?

RNNs take up more space and time since they process one step at a time; transformers handle sequences faster.
