Backpropagation through Time
The process for BPTT is as follows:
1. Feed the network a series of input and output pairs
2. Calculate the errors across each pair
3. Update the network’s weights
4. Repeat
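The four steps above can be sketched on a toy one-weight recurrent cell (h_t = w * h_{t-1} + x_t with a squared-error loss); all names and data here are illustrative, not from any particular library.

```python
# Minimal BPTT sketch on a one-weight linear recurrent cell.
def bptt_step(w, xs, ys, lr=0.1):
    # 1. Forward pass over the unrolled sequence of input/output pairs.
    hs = [0.0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    # 2. Calculate the error at each time step (squared-error loss).
    errs = [h - y for h, y in zip(hs[1:], ys)]
    loss = sum(e * e for e in errs)
    # 3. Backward pass: propagate gradients back through time to get dloss/dw.
    grad, dh_next = 0.0, 0.0
    for t in reversed(range(len(xs))):
        dh = 2 * errs[t] + w * dh_next   # gradient flowing into h_t
        grad += dh * hs[t]               # hs[t] is h_{t-1} in the recurrence
        dh_next = dh
    # Update the weight; return the loss from before the update.
    return w - lr * grad, loss

# 4. Repeat: the loss shrinks as w approaches the value (0.5) that
# generated the targets.
xs, ys = [1.0, 0.0, 0.0], [1.0, 0.5, 0.25]
w, losses = 0.0, []
for _ in range(50):
    w, loss = bptt_step(w, xs, ys)
    losses.append(loss)
```

Note that the backward loop visits time steps in reverse order, which is what "through time" refers to: the error at a late step contributes to the gradient at every earlier step.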
Bag-of-Words
A model that represents text as an unordered collection of words and their occurrence frequencies. Unlike an ordered representation, it discards grammar and word order. This representation is commonly used for document categorization, where the frequencies of specific words indicate a document's contents.
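A bag-of-words representation can be sketched with Python's standard library; real pipelines would use a proper tokenizer and often drop stopwords.

```python
from collections import Counter

def bag_of_words(text):
    # Lowercase and split on whitespace, then count occurrences.
    # Word order is discarded; only frequency survives.
    return Counter(text.lower().split())

counts = bag_of_words("the cat sat on the mat")
# counts["the"] == 2, counts["cat"] == 1
```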
Biases
When a model outputs results that are unfair to certain groups.
Linguistic Bias
A bias based on stereotypes and prejudice.
Confirmation Bias
The tendency to search for, interpret, and remember information in a way that confirms one’s preexisting beliefs or hypotheses.
Labelling
Humans add labels to raw data to train an algorithm.
Sampling Bias
A bias in which a sample is collected in a way that gives some members of the population a higher or lower probability of being sampled than others.
Selection Bias
A bias that happens when examples of a dataset are not representative of how they’re distributed in real life.
Dataset
A collection of related data used to train or evaluate a model.
Deep learning
A subset of machine learning focused on neural networks with multiple layers, enabling them to learn complex patterns from large datasets.
Graphical Processing Unit
A specialized processor designed to accelerate digital image processing and graphics rendering; its highly parallel design also makes it well suited to training neural networks.
Hyperparameter Tuning
The problem of optimizing the hyperparameters of a learning algorithm.
Hyperparameter
A configuration value that is not learned by the algorithm itself but is set beforehand, outside the training process, to control how learning proceeds. Examples include learning rate, batch size, and number of epochs.
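Hyperparameter tuning can be sketched as a grid search over such values. Here `validation_score` is a made-up stand-in for training a model and scoring it on held-out data.

```python
# Illustrative grid search; in practice each call would train and
# validate a real model, which is far more expensive.
def validation_score(lr, batch_size):
    # Made-up landscape: pretend lr=0.1 with batch_size=32 validates best.
    return -abs(lr - 0.1) - abs(batch_size - 32) / 100

grid = [(lr, bs) for lr in (0.01, 0.1, 1.0) for bs in (16, 32, 64)]
best = max(grid, key=lambda pair: validation_score(*pair))
# best == (0.1, 32)
```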
Large Language Model
An AI model for natural language processing, typically with a very large number of parameters and trained using self-supervised learning.
Latency
The delay before a transfer of data begins following an instruction for its transfer.
Long Short-Term Memory
A type of recurrent neural network whose gated memory cells retain information across long sequences, mitigating the vanishing gradient problem.
Loss function
A function that measures how far a model's predictions deviate from the true values; training seeks to minimize this value.
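One common example is mean squared error, sketched here in plain Python:

```python
def mse(predictions, targets):
    # Mean squared error: the average squared difference between
    # predictions and true values; lower means a better fit.
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

error = mse([2.0, 3.0, 4.0], [1.0, 3.0, 5.0])
# (1 + 0 + 1) / 3
```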
Memory Cell State
The internal state of an LSTM cell at a given time, responsible for storing information the network has learned over time.
Natural Language Understanding
A subcategory of natural language processing that deals specifically with AI reading comprehension.
Discourse Integration
The process of analyzing text by considering the broader context surrounding a specific word, phrase, or sentence.
Lexical Analysis
The process of breaking down a string into tokens.
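A toy lexer can be written with a single regular expression; real NLP tokenizers handle contractions, unicode, and subwords far more carefully.

```python
import re

def tokenize(text):
    # Split into word tokens (\w+) and single punctuation tokens.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("Tokens aren't hard!")
# ["Tokens", "aren", "'", "t", "hard", "!"]
```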
Pragmatic Analysis
The interpretation of language based on context, speaker intent, and real-world knowledge.
Semantic Analysis
The process which allows computers to interpret the correct context of words or phrases with multiple meanings.
Syntactical Analysis
The analysis of syntax and grammar.
Pre-Processing
The process of changing the raw data into a clean data set.
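A minimal text pre-processing sketch: lowercasing, stripping punctuation, and removing stopwords. The stopword list here is an illustrative subset, not a standard one.

```python
import re

STOPWORDS = {"the", "a", "an", "is"}  # illustrative subset only

def preprocess(text):
    # Lowercase, keep alphabetic word runs, and drop stopwords.
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if w not in STOPWORDS]

clean = preprocess("The cat is on a mat!")
# ["cat", "on", "mat"]
```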
Recurrent Neural Network
A type of neural network designed for processing sequential data where order is important.
Self-Attention Mechanism
A mechanism for determining the importance of parts of an input sequence.
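Scaled dot-product self-attention can be sketched in pure Python. To keep the sketch small, the queries, keys, and values are the raw input vectors themselves, with no learned projection matrices.

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(seq):
    # Each position's query scores every position's key; the softmaxed
    # scores then weight the values into the output for that position.
    d = len(seq[0])
    outputs = []
    for q in seq:
        scores = [sum(a * b for a, b in zip(q, k)) / math.sqrt(d) for k in seq]
        weights = softmax(scores)
        outputs.append([sum(w * v[i] for w, v in zip(weights, seq))
                        for i in range(d)])
    return outputs

out = self_attention([[1.0, 0.0], [0.0, 1.0]])
```

With these one-hot inputs, each position attends mostly to itself, so the first component of `out[0]` is larger than the second.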
Synthetic Data
Artificially generated data.
Tensor Processing Unit
A chip specialized for accelerating machine learning workloads.
Transformer Neural Network
A neural network architecture that uses self-attention, rather than recurrence, to weight more important tokens more heavily.
Vanishing Gradient Problem
When the gradients used to update a network’s weights become extremely small during backpropagation.
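The effect is easy to see numerically: backpropagation multiplies one per-layer derivative per layer, and repeated multiplication by a value below 1 shrinks the result exponentially with depth (0.25 is the maximum derivative of the sigmoid function).

```python
# Gradient reaching the earliest layer of a 50-layer sigmoid network,
# assuming the maximum per-layer derivative of 0.25 at every layer.
grad = 1.0
for _ in range(50):
    grad *= 0.25
# grad is roughly 8e-31: far too small to update early layers meaningfully
```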
Weights
Parameters in a neural network that adjust how input signals are processed, influencing the model’s learning process.
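A single artificial neuron shows the role weights play; the names here are illustrative.

```python
def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias term; training adjusts
    # `weights` and `bias` to reduce the loss.
    return sum(x * w for x, w in zip(inputs, weights)) + bias

output = neuron([1.0, 2.0], [0.5, -0.25], 0.1)
# 1.0*0.5 + 2.0*(-0.25) + 0.1 = 0.1
```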