Untitled Flashcards Set

20 Terms

1

Pretrained Models

Models that have already been trained on a large dataset, used for a general task such as image classification.

2

Vision Transformers

A type of deep learning model that uses self-attention to process images, capturing long-range dependencies and global context.

3

Fine-tuning

A transfer learning technique where parameters of a pretrained model are updated by training for additional epochs on a different task.

4

Fit One Cycle

A training schedule that involves gradually increasing and then decreasing the learning rate during training.
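The rising-then-falling schedule can be sketched in plain Python. This is a linear-ramp simplification for illustration; fastai's `fit_one_cycle` actually uses cosine annealing and also cycles momentum:

```python
def one_cycle_lr(step, total_steps, lr_max, pct_warmup=0.3):
    """Linear ramp up to lr_max, then linear decay back toward zero."""
    warmup = int(total_steps * pct_warmup)
    if step < warmup:
        return lr_max * step / warmup          # warm-up phase
    return lr_max * (total_steps - step) / (total_steps - warmup)  # decay phase

# Learning rate at each of 100 training steps, peaking at 0.01.
schedule = [one_cycle_lr(s, 100, 0.01) for s in range(100)]
```

The warm-up lets training stabilize before the largest steps are taken; the long decay lets the model settle into a minimum.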

5

Half-precision (to_fp16())

A technique that uses 16-bit floating-point numbers instead of 32-bit to speed up training and reduce memory usage.
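The trade-off is easy to see with Python's `struct` module, which supports IEEE 754 half precision via the `'e'` format. This is only a stand-in for what `to_fp16()` does with tensors:

```python
import struct

x = 0.1
half = struct.pack('e', x)     # IEEE 754 half precision: 2 bytes
single = struct.pack('f', x)   # single precision: 4 bytes

x_half = struct.unpack('e', half)[0]
x_single = struct.unpack('f', single)[0]
# Half precision halves memory but loses accuracy, which is why
# mixed-precision training keeps some computations in 32-bit.
```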

6

Fastkaggle

A Python library that simplifies working with Kaggle competitions, offering features like automatic data downloads.

7

Convolutional Neural Network (CNN)

A specialized artificial neural network designed for processing data with a grid-like structure, such as images.

8

Convolutional Layers

Layers in a CNN that perform convolutions to extract features from the input data using learnable filters.
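A minimal pure-Python sketch of what one filter in a convolutional layer computes; frameworks vectorize this heavily, but the arithmetic is the same:

```python
def conv2d(image, kernel):
    """Valid cross-correlation (what CNN 'convolution' layers compute)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Dot product of the kernel with the patch under it.
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A [-1, 1] kernel responds where pixel values jump: a vertical edge detector.
edge = conv2d([[0, 0, 1, 1],
               [0, 0, 1, 1]], [[-1, 1]])
```

In a real layer the kernel values are learned, and many filters run in parallel to produce multiple feature maps.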

9

Pooling Layers

Layers that downsample feature maps, shrinking their spatial dimensions to cut computation, help control overfitting, and make the network more robust to small shifts in the input.
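A 2×2 max-pooling sketch in plain Python (assumes even height and width; real layers also handle strides and padding):

```python
def max_pool2x2(fmap):
    """Keep the largest value in each non-overlapping 2x2 block."""
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]), 2)]
            for i in range(0, len(fmap), 2)]

pooled = max_pool2x2([[1, 2, 3, 4],
                      [5, 6, 7, 8],
                      [9, 10, 11, 12],
                      [13, 14, 15, 16]])
```

Each output value summarizes a 2×2 region, so a one-pixel shift in the input often leaves the pooled output unchanged.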

10

Learning Rate Finder

A technique used to determine a suitable learning rate for training a neural network.
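A toy sketch of the idea on a one-parameter quadratic loss: try exponentially spaced learning rates and see which still reduce the loss. This is a simplification; fastai's `lr_find` instead raises the rate batch by batch during real training and plots the loss curve:

```python
def loss(w):                 # toy one-parameter loss, minimum at w = 3
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

# Exponentially spaced candidate rates from 1e-4 up to about 1.0.
candidates = [1e-4 * 10 ** (k / 4) for k in range(17)]

w0 = 0.0
# Loss after a single gradient step at each candidate rate.
results = [(lr, loss(w0 - lr * grad(w0))) for lr in candidates]
```

Too-small rates barely move the loss, too-large rates overshoot; a good choice sits just below where the loss stops improving.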

11

Stochastic Gradient Descent (SGD)

An iterative optimization algorithm that minimizes a cost function by updating model parameters using randomly selected batches.
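A self-contained mini-batch SGD sketch fitting `y = 2x`; the toy data and hyperparameters are chosen purely for illustration:

```python
import random

def sgd_fit(data, lr=0.01, epochs=100, batch_size=2, seed=0):
    """Fit y = w*x by minimizing mean squared error with mini-batch SGD."""
    rng = random.Random(seed)
    data = list(data)          # copy so shuffling doesn't touch the caller's list
    w = 0.0
    for _ in range(epochs):                      # one epoch = one full pass
        rng.shuffle(data)                        # the "stochastic" part
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # Gradient of mean((w*x - y)^2) with respect to w, over the batch.
            g = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * g                          # parameter update
    return w

w = sgd_fit([(x, 2.0 * x) for x in [1.0, 2.0, 3.0, 4.0]])  # true slope is 2
```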

12

Batch vs. Mini-batch

'Batch' gradient descent computes each update from the entire dataset, while 'mini-batch' uses small subsets, which cuts the cost per update and gives more frequent (if noisier) updates that usually speed up training.

13

Parameter Sharing

Using the same set of weights for different parts of the input image to reduce the number of parameters in a CNN.
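The savings show up directly in parameter counts. The sizes below (a 28×28 input, 100 dense units, one 3×3 filter) are hypothetical choices for illustration:

```python
h, w = 28, 28                  # hypothetical grayscale input
units, k = 100, 3              # dense layer width / conv kernel size (assumed)

# Fully connected: every unit gets its own weight for every pixel.
dense_params = h * w * units + units          # weights + biases
# Convolution: one k x k filter is reused at every image position.
conv_params = k * k + 1                       # filter weights + one bias
```

The dense layer needs 78,500 parameters; the shared filter needs 10, regardless of image size.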

14

Overfitting

When a model learns the training data too well, causing it to perform poorly on unseen data.

15

Transfer Learning

Reusing a model pretrained on one task as the starting point for a different task, often reducing the data and training time needed.

16

Validation Set

A portion of the dataset held out from training, used to evaluate the model during development and to tune hyperparameters.

17

Test Set

A completely separate part of the dataset used for final evaluation of the model's performance, unseen during training.

18

Epoch

One complete pass through the entire training dataset during the training process.

19

Loss Function

A function that quantifies the difference between predicted and actual values, guiding model optimization.
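Mean squared error is one common example, sketched here in plain Python:

```python
def mse(preds, targets):
    """Mean squared error: average squared gap between prediction and truth."""
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)
```

Because it is smooth, its gradient tells the optimizer which direction reduces the error.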

20

Metric

A human-interpretable measure of a model's performance, such as accuracy. Unlike the loss function, it is chosen for readability rather than optimization, so it need not be differentiable.
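Accuracy is the classic metric; a plain-Python sketch:

```python
def accuracy(preds, targets):
    """Fraction of predictions matching the labels -- easy to interpret,
    but flat almost everywhere, so it can't drive gradient-based training."""
    return sum(p == t for p, t in zip(preds, targets)) / len(targets)
```

This is why training optimizes a smooth loss while humans track a metric like accuracy.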