What do most machine learning systems use?
Neural networks.
What makes RNNs special?
They understand order in data and can remember previous information.
What are RNNs useful for?
Sequential data like language, speech, and time-based data where order matters.
What makes them useful for this purpose?
They keep track of old information and use it to give more context to new input.
How are RNNs different from feed-forward networks?
Feed-forward networks only move forward; RNNs loop back to reuse previous information.
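To make that loop concrete, here is a minimal sketch of a single recurrent step in Python with NumPy; the parameter names (W_xh, W_hh, b_h) and sizes are illustrative assumptions, not from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, input_size = 4, 3
W_xh = rng.normal(size=(hidden_size, input_size))   # input -> hidden weights
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden -> hidden: the "loop back"
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    # The new hidden state mixes the current input with the previous state,
    # which is how the network carries earlier information forward.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = np.zeros(hidden_size)                      # start with no memory
for x_t in rng.normal(size=(5, input_size)):   # a 5-step toy sequence
    h = rnn_step(x_t, h)                       # steps must run one after another
print(h)
```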
What’s one major problem with RNNs?
They can only process one word or token at a time and forget earlier information in long sequences.
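A toy NumPy demonstration of that forgetting (a sketch with deliberately small, untrained weights, not a claim about any real model): perturb the very first input and watch its effect on the final hidden state shrink as the sequence grows.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16
W = rng.normal(size=(d, d)) * 0.1   # small weights so the updates contract

def final_state(x0, length):
    h = np.tanh(W @ x0)
    for _ in range(length - 1):
        h = np.tanh(W @ h)          # each step partly overwrites the memory
    return h

x = rng.normal(size=d)
for length in (2, 10, 50):
    gap = np.linalg.norm(final_state(x, length) - final_state(x + 0.1, length))
    print(length, gap)              # the gap shrinks: the early change is forgotten
```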
What is a transformer?
A type of neural network that, like an RNN, handles ordered data, but does so more efficiently.
Why were transformers created?
RNNs used too much memory and processed information slowly, one item at a time.
What idea is at the core of transformers?
Attention—the model learns which parts of a sequence matter most to each other.
What does “attention” mean?
Measuring the relative importance of parts of the input to one another (e.g., which words in a sentence are most related).
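As a hedged illustration, here is a minimal NumPy sketch of scaled dot-product attention (the form transformers use); the random Q, K, V matrices stand in for learned projections of the same input.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 4, 8                  # 4 tokens, 8-dimensional vectors
Q = rng.normal(size=(seq_len, d))  # queries: what each token is looking for
K = rng.normal(size=(seq_len, d))  # keys: what each token offers
V = rng.normal(size=(seq_len, d))  # values: the information to blend

scores = Q @ K.T / np.sqrt(d)      # relevance of every token to every other
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # row softmax
output = weights @ V               # each token becomes a weighted mix of all values

print(weights.round(2))            # each row sums to 1: relative importance
```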
What paper introduced transformers?
“Attention Is All You Need.”
What are the main advantages of transformers over RNNs?
They can process entire sequences in parallel (parallelization), use less memory, and don’t forget earlier information in long sequences.
What happens when transformers remove the recurrence from RNNs?
The model can analyze the whole text at once, using only attention.
Why are transformers faster?
They perform many calculations simultaneously instead of one step at a time.
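A toy NumPy contrast (a sketch, not a benchmark) of why that matters: the recurrent loop below cannot be parallelized, because each hidden state waits on the previous one, while the attention-style scores come from a single matrix product over the whole sequence.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 256, 64
X = rng.normal(size=(seq_len, d))    # a toy sequence of 256 vectors
W = rng.normal(size=(d, d)) * 0.1    # one shared weight matrix for both cases

# Recurrent: inherently sequential, one step at a time.
h = np.zeros(d)
for x_t in X:
    h = np.tanh(W @ x_t + h)

# Attention-style: every position scored against every other in one shot,
# which hardware can compute in parallel.
scores = (X @ W) @ X.T / np.sqrt(d)  # shape (seq_len, seq_len)
```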
What types of data do transformers handle well?
Text, speech, and images—any sequence where relationships between elements matter.
What are most machine learning systems built with?
Neural networks.
What is a Recurrent Neural Network (RNN)?
A neural network that understands order or sequence in data.
How are RNNs and transformers related?
Transformers tackle the same problem as RNNs, modeling sequential data, but replace recurrence with attention to do it more efficiently.
Why are transformers more efficient than RNNs?
RNNs cost more memory and time because they process one step at a time; transformers process whole sequences in parallel, so they are faster.