Pretrained Models
Models that have already been trained on a large dataset for a general task, such as image classification, and can be reused as a starting point for new tasks.
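A minimal sketch of loading a pretrained model with fastai; the Imagenette dataset and resnet18 backbone are illustrative choices, not ones taken from this glossary:

```python
from fastai.vision.all import *

# Download a small sample dataset and build dataloaders (dataset choice is illustrative)
path = untar_data(URLs.IMAGENETTE_160)
dls = ImageDataLoaders.from_folder(path, valid='val', item_tfms=Resize(160))

# vision_learner downloads ImageNet-pretrained weights for resnet18 by default
learn = vision_learner(dls, resnet18, metrics=accuracy)
```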
Vision Transformers
A type of deep learning model that uses self-attention to process images, capturing long-range dependencies and global context.
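As a sketch, a pretrained Vision Transformer can be loaded through the timm library; the specific model name below is just one of many available:

```python
import timm
import torch

# Load a small Vision Transformer pretrained on ImageNet (model name is illustrative)
vit = timm.create_model('vit_small_patch16_224', pretrained=True)
vit.eval()

# The model splits a 224x224 image into 16x16 patches and applies self-attention over them
x = torch.randn(1, 3, 224, 224)   # a dummy image batch
with torch.no_grad():
    logits = vit(x)               # shape: (1, 1000) ImageNet class scores
print(logits.shape)
```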
Fine-tuning
A transfer learning technique where parameters of a pretrained model are updated by training for additional epochs on a different task.
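In fastai this is a single call on a Learner; the sketch below reuses the illustrative Imagenette setup, and the epoch count is arbitrary:

```python
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(untar_data(URLs.IMAGENETTE_160), valid='val', item_tfms=Resize(160))
learn = vision_learner(dls, resnet18, metrics=accuracy)

# fine_tune trains the new head for one frozen epoch, then unfreezes and
# trains the whole pretrained model for three more epochs
learn.fine_tune(3)
```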
Fit One Cycle
A training schedule that involves gradually increasing and then decreasing the learning rate during training.
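fastai exposes this schedule as fit_one_cycle; a sketch where the epoch count and peak learning rate are arbitrary values:

```python
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(untar_data(URLs.IMAGENETTE_160), valid='val', item_tfms=Resize(160))
learn = vision_learner(dls, resnet18, metrics=accuracy)

# The learning rate warms up towards lr_max, then anneals back down over 5 epochs
learn.fit_one_cycle(5, lr_max=3e-3)
```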
Half-precision (to_fp16())
A technique that uses 16-bit floating-point numbers instead of 32-bit to speed up training and reduce memory usage.
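In fastai, to_fp16() wraps the Learner so most computation runs in 16-bit (mixed) precision; a sketch on the same illustrative setup:

```python
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(untar_data(URLs.IMAGENETTE_160), valid='val', item_tfms=Resize(160))

# to_fp16() switches training to mixed precision: activations and gradients in
# 16-bit floats, with a 32-bit master copy of the weights kept for stability
learn = vision_learner(dls, resnet18, metrics=accuracy).to_fp16()
learn.fine_tune(1)
```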
Fastkaggle
A Python library that simplifies working with Kaggle competitions, offering features like automatic data downloads.
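A sketch of typical usage, assuming the setup_comp helper and a competition name as seen in public fastai Kaggle notebooks:

```python
from fastkaggle import setup_comp

# Downloads and unpacks the competition data when run locally, and installs the
# listed packages when run on Kaggle; the competition name is just an example
comp = 'paddy-disease-classification'
path = setup_comp(comp, install='fastai "timm>=0.6.2.dev0"')
print(path)
```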
Convolutional Neural Network (CNN)
A specialized artificial neural network designed for processing data with a grid-like structure, such as images.
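A minimal PyTorch sketch of a small CNN for 3-channel images; all layer sizes are arbitrary:

```python
import torch
from torch import nn

# A tiny CNN: two conv blocks followed by a linear classifier
cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),   # assumes 32x32 inputs, pooled down to 8x8
)

x = torch.randn(4, 3, 32, 32)    # a dummy batch of four 32x32 RGB images
print(cnn(x).shape)              # torch.Size([4, 10])
```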
Convolutional Layers
Layers in a CNN that perform convolutions to extract features from the input data using learnable filters.
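A sketch of a single convolutional layer in PyTorch:

```python
import torch
from torch import nn

# 16 learnable 3x3 filters slide over a 3-channel input; padding=1 keeps
# the spatial size unchanged
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

x = torch.randn(1, 3, 64, 64)   # one dummy RGB image
features = conv(x)
print(features.shape)           # torch.Size([1, 16, 64, 64]) -- one feature map per filter
```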
Pooling Layers
Layers that downsample feature maps, reducing spatial dimensions to cut computation, help limit overfitting, and make the network more robust to small shifts in the input.
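A sketch of max pooling in PyTorch, which halves each spatial dimension:

```python
import torch
from torch import nn

pool = nn.MaxPool2d(kernel_size=2)   # keep the largest value in each 2x2 window

x = torch.randn(1, 16, 64, 64)       # a dummy feature map
print(pool(x).shape)                 # torch.Size([1, 16, 32, 32]) -- spatially halved
```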
Learning Rate Finder
A technique used to determine a suitable learning rate for training a neural network.
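fastai implements this as lr_find, which trains briefly with an increasing learning rate and suggests a value from the loss curve; the .valley attribute below assumes a recent fastai version:

```python
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(untar_data(URLs.IMAGENETTE_160), valid='val', item_tfms=Resize(160))
learn = vision_learner(dls, resnet18, metrics=accuracy)

# Runs a mock training pass with an exponentially growing learning rate and
# suggests a value from the resulting loss curve
suggestion = learn.lr_find()
print(suggestion.valley)   # .valley assumes a recent fastai release
```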
Stochastic Gradient Descent (SGD)
An iterative optimization algorithm that minimizes a loss function by updating model parameters in the direction of the negative gradient, computed on randomly selected mini-batches.
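A sketch of one SGD update in PyTorch on a toy regression problem:

```python
import torch
from torch import nn

model = nn.Linear(10, 1)                        # a toy model
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x, y = torch.randn(32, 10), torch.randn(32, 1)  # one randomly selected mini-batch

opt.zero_grad()               # clear gradients from the previous step
loss = loss_fn(model(x), y)   # forward pass and loss on this mini-batch
loss.backward()               # compute gradients
opt.step()                    # parameters -= lr * gradients
```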
Batch vs. Mini-batch
'Batch' gradient descent uses the entire dataset for each update, while 'mini-batch' uses small subsets, reducing the cost of each update and adding noise that often helps generalization.
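In PyTorch the mini-batch size is set on the DataLoader; a sketch with a toy dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(1000, 10), torch.randn(1000, 1))

# Full-batch: one update per pass over all 1000 examples
full_batch = DataLoader(dataset, batch_size=len(dataset))

# Mini-batch: ~16 updates per epoch, each using 64 randomly shuffled examples
mini_batch = DataLoader(dataset, batch_size=64, shuffle=True)
```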
Parameter Sharing
Using the same set of weights for different parts of the input image to reduce the number of parameters in a CNN.
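A sketch comparing parameter counts: a conv layer reuses the same small filters at every position of the image, whereas a fully connected layer needs a separate weight for every input-output pair:

```python
import torch
from torch import nn

def n_params(m):
    return sum(p.numel() for p in m.parameters())

# The same 3x3 filters are shared across every position of the image
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)

# A dense layer mapping a flattened 64x64 RGB image to 16 outputs
dense = nn.Linear(3 * 64 * 64, 16)

print(n_params(conv))    # 448 parameters (16 * 3*3*3 weights + 16 biases)
print(n_params(dense))   # 196,624 parameters
```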
Overfitting
When a model learns the training data too well, causing it to perform poorly on unseen data.
Transfer Learning
Reusing a model pretrained on one task as the starting point for a different task, often reducing the training time and data required.
Validation Set
A portion of the dataset held out from training and used to evaluate model performance during development and to tune hyperparameters.
Test Set
A completely separate part of the dataset used for final evaluation of the model's performance, unseen during training.
Epoch
One complete pass through the entire training dataset during the training process.
Loss Function
A function that quantifies the difference between predicted and actual values, guiding model optimization.
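A sketch with cross-entropy loss, the standard choice for classification in PyTorch:

```python
import torch
from torch import nn

loss_fn = nn.CrossEntropyLoss()

logits = torch.tensor([[2.0, 0.5, -1.0]])   # raw model scores for 3 classes
target = torch.tensor([0])                  # the true class index

# Loss is small when the highest score matches the target, large otherwise
print(loss_fn(logits, target))
```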
Metric
A human-interpretable measure used to evaluate a model's performance, different from the loss function.
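In fastai, metrics are passed to the Learner alongside the loss and reported each epoch; a sketch on the same illustrative setup:

```python
from fastai.vision.all import *

dls = ImageDataLoaders.from_folder(untar_data(URLs.IMAGENETTE_160), valid='val', item_tfms=Resize(160))

# The loss (cross-entropy here) drives optimization; accuracy and error_rate
# are only reported so a human can judge progress
learn = vision_learner(dls, resnet18, metrics=[accuracy, error_rate])
learn.fine_tune(1)
```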