Flashcards covering essential concepts and terms from the lecture on Deep Learning, including types of networks, mathematical operations, and fundamental challenges.
Deep Learning
A subset of machine learning involving complex networks with many layers that create flexible models from massive datasets.
Convolutional Neural Networks (CNN)
A type of deep learning model used primarily for image recognition that applies convolution operations to combine nearby pixels into higher-level features such as edges and shapes.
Autoencoder
A type of neural network trained to reconstruct its own input; in its simplest form it has a single hidden (bottleneck) layer between the input and output.
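A minimal numpy sketch (not from the lecture; the toy data, shapes, and learning rate are illustrative) of the simplest linear autoencoder: a one-unit hidden layer squeezes 2-D inputs through a bottleneck, and the network is trained to predict its own input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data (an assumption): 2-D points near the line y = x, so a
# single hidden unit can capture most of the structure.
t = rng.normal(size=(100, 1))
X = np.hstack([t, t]) + 0.05 * rng.normal(size=(100, 2))

W_enc = 0.1 * rng.normal(size=(2, 1))   # input -> 1-unit hidden code
W_dec = 0.1 * rng.normal(size=(1, 2))   # hidden code -> reconstruction
lr = 0.05

initial_loss = np.mean((X @ W_enc @ W_dec - X) ** 2)
for _ in range(500):
    H = X @ W_enc                        # encode (the hidden layer)
    R = H @ W_dec                        # decode: predict the input itself
    err = R - X
    grad_dec = H.T @ err / len(X)        # gradients of the squared error
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final_loss = np.mean((X @ W_enc @ W_dec - X) ** 2)
print(initial_loss, final_loss)          # reconstruction error drops
```

Because the data lie near a line, the 1-unit bottleneck loses little information; the same idea scales to deeper, nonlinear encoders and decoders.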
Recurrent Neural Networks (RNN)
A class of neural networks designed to recognize patterns in sequences of data, using loops to retain information over time.
Long Short-Term Memory (LSTM)
An advanced type of RNN that includes memory cells and gates to combat the vanishing gradient problem and retain information over longer periods.
Convolution
A mathematical operation on two functions that produces a third function, often used in neural networks to extract features from input data.
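The definition above can be seen concretely in one dimension with numpy's built-in `np.convolve` (the example values are illustrative): the kernel is flipped and slid across the signal, and the products are summed at each position.

```python
import numpy as np

# Discrete 1-D convolution: slide the (flipped) kernel across the
# signal and sum the products at each overlap position.
signal = np.array([1.0, 2.0, 3.0])
kernel = np.array([0.0, 1.0, 0.5])
result = np.convolve(signal, kernel)
print(result)  # [0.  1.  2.5 4.  1.5]
```

In neural networks the same sliding-window idea is applied in two dimensions, with the kernel weights learned from data.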
Vanishing Gradient Problem
A phenomenon in which gradients shrink as they are propagated backward through many layers, becoming too small for the early layers to learn effectively; common in deep and recurrent networks.
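A short numpy illustration of why this happens (a sketch, not the lecture's derivation): backpropagation multiplies one derivative factor per layer, and the sigmoid's derivative is at most 0.25, so the product shrinks geometrically with depth.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One derivative factor per layer; 0.25 is the sigmoid's best case,
# reached at z = 0.
z = 0.0
factor = sigmoid(z) * (1.0 - sigmoid(z))   # = 0.25
for depth in (5, 20, 50):
    print(depth, factor ** depth)          # shrinks toward zero
```

At 50 layers the gradient signal is on the order of 1e-31, effectively zero in floating point; LSTMs and ReLU activations are two standard mitigations.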
Unsupervised Learning
A type of machine learning where models learn from unlabeled data, identifying patterns without specific output targets.
Sobel Filter
A popular edge-detection filter used in image processing that approximates the image's intensity gradient, emphasizing rapid changes in brightness.
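A numpy sketch tying the Sobel filter back to convolution (the 6x6 test image is a made-up example): convolving the horizontal Sobel kernel with an image containing a vertical edge produces a strong response exactly at the edge and zero in the flat regions.

```python
import numpy as np

# 6x6 image: dark left half (0), bright right half (1) -> vertical edge
img = np.zeros((6, 6))
img[:, 3:] = 1.0

# Sobel kernel for horizontal gradients (responds to vertical edges)
sobel_x = np.array([[-1.0, 0.0, 1.0],
                    [-2.0, 0.0, 2.0],
                    [-1.0, 0.0, 1.0]])

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution: flip the kernel, slide it, sum products."""
    k = np.flipud(np.fliplr(kernel))     # convolution flips the kernel
    h, w = image.shape
    kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * k)
    return out

edges = conv2d_valid(img, sobel_x)
print(np.abs(edges))   # large only in the columns straddling the edge
```

The flat halves of the image give zero response; only windows that straddle the brightness change light up, which is exactly the "emphasize gradient changes" behavior the card describes.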
Backpropagation
A method for training neural networks that propagates the output error backward through the network, layer by layer, to compute gradients and update the weights.
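A minimal end-to-end sketch of that loop in numpy (the toy task, network size, and learning rate are assumptions for illustration): a one-hidden-layer network is fit to y = 2x, with the output error sent backward through each layer to produce the weight updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (an assumption): learn y = 2x.
x = rng.normal(size=(20, 1))
y = 2.0 * x
W1 = 0.5 * rng.normal(size=(1, 4))   # input -> hidden
W2 = 0.5 * rng.normal(size=(4, 1))   # hidden -> output
lr = 0.1

def loss():
    return np.mean((np.tanh(x @ W1) @ W2 - y) ** 2)

initial_loss = loss()
for _ in range(200):
    h = np.tanh(x @ W1)              # forward pass
    err = h @ W2 - y                 # output error
    # backward pass: send the error back through each layer
    grad_W2 = h.T @ err / len(x)
    grad_h = (err @ W2.T) * (1.0 - h ** 2)   # tanh derivative
    grad_W1 = x.T @ grad_h / len(x)
    W2 -= lr * grad_W2               # gradient-descent weight updates
    W1 -= lr * grad_W1

final_loss = loss()
print(initial_loss, final_loss)      # squared error decreases
```

The two `grad_*` lines are the chain rule in matrix form; deep learning frameworks automate exactly this bookkeeping for arbitrarily deep networks.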