Module 5 - Building Artificial Neural Networks


33 Terms

1

Activation Function

A function that computes the output of an artificial neuron, enabling the network to model non-linear tasks

2

ANN (Artificial Neural Network)

A machine approximation of biological neural networks, used in deep learning. A popular choice for solving complex problems in natural language processing (NLP) and computer vision, as well as other domains where large volumes of training data are available.

3

Backpropagation

A method of training a neural network that computes the error gradient at the output layer first, then propagates it to the last hidden layer, the next-to-last hidden layer, and so on, until reaching the input layer. The connection weights between neurons are then updated.
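
As a minimal illustration of the chain rule behind this process (the weight, input, and target values here are invented), the gradient for a single sigmoid neuron can be computed and checked numerically:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# One sigmoid neuron: y = sigmoid(w * x), squared-error loss (y - t)^2.
w, x, t = 0.5, 1.0, 1.0
y = sigmoid(w * x)

# Backward pass: chain rule from the loss back to the weight.
dL_dy = 2 * (y - t)   # derivative of the squared error w.r.t. y
dy_dz = y * (1 - y)   # derivative of the sigmoid w.r.t. its input
dz_dw = x             # derivative of w * x w.r.t. w
grad = dL_dy * dy_dz * dz_dw

# Sanity check against a finite-difference estimate of the same gradient.
eps = 1e-6
num = ((sigmoid((w + eps) * x) - t) ** 2
       - (sigmoid((w - eps) * x) - t) ** 2) / (2 * eps)
print(abs(grad - num) < 1e-6)  # True
```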

4

Bag-of-words

An approach to representing textual content as a list of individual words, irrespective of other language components like grammar and punctuation.
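
A toy sketch of this representation (the vocabulary and sentence are invented for illustration):

```python
from collections import Counter

def bag_of_words(text, vocabulary):
    """Count how often each vocabulary word occurs, ignoring order and punctuation."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(words)
    return [counts[w] for w in vocabulary]

vocab = ["the", "cat", "dog", "sat"]
print(bag_of_words("The cat sat. The dog sat!", vocab))  # [2, 1, 1, 2]
```
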
5

BPTT (Backpropagation Through Time)

A method of training a recurrent neural network (RNN) in which the time sequence of RNN layers is first unrolled, and then backpropagation is performed. The error gradients between the predicted values and the actual values are identified starting at the last time step (t), then these gradients are propagated backwards for the next layer’s error calculation, until reaching the first time step.

6

CNN (Convolutional Neural Network)

A type of artificial neural network (ANN) most commonly used to process pixel data. This approach owes its name to convolution, a mathematical operation that enables it to perceive images by assembling small, simple patterns into larger, more complex patterns. This approach was inspired by the way visual information is processed in the visual cortex of animals.
7

Convolutional layer

A type of layer in a convolutional neural network (CNN) in which the neurons scan a portion of the input image for data that is within the neurons’ filter. Each neuron reacts to a portion of the image, and each layer builds on the neurons near it.

Also called a convolution.
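
A pure-Python sketch of this scanning operation (stride 1, no padding; the image and filter values are invented):

```python
def conv2d(image, kernel):
    """Slide the kernel across the image and sum elementwise products
    at each position (stride 1, no padding)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A small vertical-edge filter reacting to the edge in the middle of the image.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))  # [[0, 2, 0], [0, 2, 0]] -- strongest response at the edge
```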

8

Discriminator

One half of a generative adversarial network (GAN) that predicts a label given a set of features. It tries to determine whether the image created by the generator is real or fake.

9

Embedding

In a recurrent neural network (RNN), the process of condensing a language vocabulary into vectors of relatively small dimensions. These vectors are very efficient and easy to process.
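
In its simplest form an embedding is a learned lookup table; the 3-dimensional vectors below are invented placeholders for values a real network would learn during training:

```python
# Toy embedding table mapping each vocabulary word to a small dense vector.
# The values are invented for illustration; a real RNN learns them.
embedding = {
    "cat": [0.2, -0.1, 0.7],
    "dog": [0.3, -0.2, 0.6],
    "car": [-0.5, 0.8, 0.1],
}

sentence = ["cat", "dog"]
vectors = [embedding[w] for w in sentence]
print(len(vectors), len(vectors[0]))  # 2 words, 3 dimensions each
```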

10

Feature map

A representation of an image that highlights the features a convolution filter searches for in a convolutional neural network (CNN)

11

Filter

The portion of the receptive field that a convolutional layer neuron uses to scan the image at prior layers

12

FNN (Feedforward Neural Network)

A type of artificial neural network (ANN) in which information flows through artificial neurons in a single direction, from input to output
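
A minimal forward pass through one hidden layer (weights invented for illustration) shows the single direction of flow:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def forward(x, w_hidden, w_out):
    """One feedforward pass: input -> hidden layer -> output, never looping back."""
    hidden = [sigmoid(sum(xi * wi for xi, wi in zip(x, ws))) for ws in w_hidden]
    return sigmoid(sum(h * wi for h, wi in zip(hidden, w_out)))

# Two inputs, two hidden neurons, one output neuron; weights are invented.
y = forward([1.0, 0.5], [[0.4, -0.6], [0.7, 0.1]], [0.5, -0.3])
print(0.0 < y < 1.0)  # True -- a single scalar output
```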

13

GAN (Generative Adversarial Network)

A neural network architecture that pits two different neural networks against each other, typically to generate images

14

Generator

One half of a generative adversarial network (GAN) that predicts features given a label. It creates an image and tries to “fool” the discriminator into believing that it is real.

15

Hidden Layer

A layer of neurons in a neural network that is not directly exposed to the input and requires additional analysis. Its purpose is to add complexity and sophistication to the neural network

16

Input layer

A layer of neurons in a neural network that deals with information that is directly exposed to the input

17

LSTM cell (Long Short-Term Memory)

A type of memory cell in a recurrent neural network (RNN) that preserves input that is significant to the training process, while “forgetting” input that is not.
18

LSTM Cell process

(image-only card)
19

Memory cell

A component of a recurrent neural network (RNN) that maintains a certain state in time.
20

MLP (Multilayer Perceptron)

A neural network algorithm that has multiple distinct layers of threshold logic units (TLUs)
21

Output layer

A layer of neurons in a neural network that formats and outputs data that is relevant to the problem

22

Padding

The practice of adding pixels around an input image to preserve its dimensions, while enabling a convolutional layer to be the same size as the actual input. Padding values are zeros in most cases.

23

Perceptron

An algorithm used in ANNs to solve binary classification problems

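
A sketch of the classic perceptron learning rule on the linearly separable AND function (the learning rate and epoch count are arbitrary choices):

```python
def train_perceptron(samples, epochs=10, lr=0.1):
    """Perceptron learning rule for binary classification (labels 0/1)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Predict with the current weights, then nudge them toward the target.
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn AND: output is 1 only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
print(preds)  # [0, 0, 0, 1]
```
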
24

Pooling layer

A type of layer in a convolutional neural network (CNN) that applies an aggregation function to input features in order to make a more efficient selection. A max pooling layer, for example, passes only the highest value in each region on to the next layer.
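
A sketch of 2×2 max pooling over an invented feature map:

```python
def max_pool(feature_map, size=2):
    """Keep only the largest value in each non-overlapping size x size window."""
    return [[max(feature_map[i + a][j + b]
                 for a in range(size) for b in range(size))
             for j in range(0, len(feature_map[0]), size)]
            for i in range(0, len(feature_map), size)]

fm = [[1, 3, 2, 0],
      [4, 2, 1, 5],
      [0, 1, 9, 2],
      [3, 2, 4, 6]]
print(max_pool(fm))  # [[4, 5], [3, 9]] -- each 2x2 window reduced to its maximum
```
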
25

ReLU function (Rectified Linear Unit Function)

An activation function that calculates a linear function of the inputs. If the result is positive, it outputs that result. If it is negative, it outputs 0.
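
The function itself is a one-liner:

```python
def relu(z):
    """Pass positive values through unchanged; clamp negatives to zero."""
    return max(0.0, z)

print([relu(z) for z in (-2.0, -0.5, 0.0, 1.5)])  # [0.0, 0.0, 0.0, 1.5]
```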

26

RNN (Recurrent Neural Network)

A type of artificial neural network in which information can flow to and from artificial neurons in a loop, rather than just a single direction. It incorporates time as an important component.
27

Stride

The step size by which a convolution filter moves as it scans an image
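
Stride and padding together determine a convolutional layer's output size via the standard formula (n + 2p − f) / s + 1; the input and filter widths below are invented for illustration:

```python
def conv_output_size(n, f, padding=0, stride=1):
    """Output width for input width n and filter width f: (n + 2p - f) // s + 1."""
    return (n + 2 * padding - f) // stride + 1

# Padding of 1 with a 3-wide filter preserves the width at stride 1...
print(conv_output_size(28, 3, padding=1, stride=1))  # 28
# ...while a stride of 2 halves it.
print(conv_output_size(28, 3, padding=1, stride=2))  # 14
```
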
28

tanh function (hyperbolic tangent function)

An activation function whose output values are constrained between -1 and 1
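
A quick numeric check of that constraint using the standard library:

```python
import math

# tanh squashes any real input into the open interval (-1, 1), centered at 0.
for z in (-10, -1, 0, 1, 10):
    assert -1 < math.tanh(z) < 1

print(round(math.tanh(1), 4))  # 0.7616
```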

29

Sigmoid Function

The original activation function used with MLPs.

Outputs an S-shaped curve to account for non-linear data

Output range is in between 0 and 1

It is a good choice for the output layer of a classifier, but if applied to hidden layers, the backpropagation of the gradient error may become too slow
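
A short sketch of the function, plus a hint at why hidden-layer gradients shrink: its derivative peaks at only 0.25.

```python
import math

def sigmoid(z):
    """S-shaped curve mapping any real input into (0, 1)."""
    return 1 / (1 + math.exp(-z))

print(round(sigmoid(-4), 2), round(sigmoid(0), 2), round(sigmoid(4), 2))  # 0.02 0.5 0.98

# The gradient sigmoid(z) * (1 - sigmoid(z)) never exceeds 0.25, so stacking
# sigmoid hidden layers multiplies small factors and slows backpropagation.
s = sigmoid(0)
print(s * (1 - s))  # 0.25
```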

30

Guidelines for building MLPs

(image-only card)
31

Guidelines for building CNNs

(image-only card)
32

Guidelines for building RNNs

(image-only card)
33

TLU (Threshold Logic Unit)

An output neuron that calculates the weighted sum of input neurons and then implements a step function
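
A TLU sketch computing logical OR (the weights and threshold are invented for illustration):

```python
def tlu(inputs, weights, threshold=0.0):
    """Weighted sum of the inputs followed by a step function."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return 1 if s >= threshold else 0

# With weights (1, 1) and threshold 0.5, the unit computes logical OR.
print([tlu((a, b), (1, 1), threshold=0.5) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 1]
```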