IBCS AI Case Study

32 Terms

1

Backpropagation through Time

The process for BPTT is as follows:
1. Input a series of input and output pairs

2. Calculate the errors across each pair

3. Update the network’s weights

4. Repeat
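The steps above can be sketched with a toy one-weight recurrent model, h_t = w·h_{t−1} + x_t, trained on squared error (the model, the sequence, and the learning rate are illustrative assumptions, not part of the case study):

```python
def bptt_grad(w, xs, ys):
    # Step 1: input a series of input/output pairs; the forward pass
    # runs the recurrence and stores every hidden state.
    hs = [0.0]  # h_0
    for x in xs:
        hs.append(w * hs[-1] + x)  # h_t = w * h_{t-1} + x_t
    # Steps 2-3: walk backwards through time, combining the squared-error
    # gradient at each step with the gradient flowing in from later steps.
    grad, dh = 0.0, 0.0
    for t in range(len(xs), 0, -1):
        dh = 2 * (hs[t] - ys[t - 1]) + w * dh  # dL/dh_t
        grad += dh * hs[t - 1]                 # accumulate dL/dw
    return grad

# Step 4: repeat -- one weight update per pass over the sequence.
w = 0.0
for _ in range(200):
    w -= 0.05 * bptt_grad(w, [1.0, 0.0, 0.0], [1.0, 0.5, 0.25])
```

With these targets the loss is exactly zero at w = 0.5, so repeated updates drive w toward 0.5.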

2

Bag-of-Words

A model that represents text as an unordered collection of words, measuring only how often each word occurs. Unlike an ordered representation, it captures no grammar or word order. It is commonly used for document categorization, where the frequencies of specific words indicate a document's contents.
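A minimal sketch of the idea using Python's `collections.Counter` (splitting on whitespace is a simplifying assumption about tokenization):

```python
from collections import Counter

def bag_of_words(text):
    # Word order is discarded; only occurrence frequencies survive.
    return Counter(text.lower().split())

bow = bag_of_words("the cat sat on the mat")
# bow["the"] -> 2; "the cat sat" and "sat the cat" give identical bags
```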

3

Biases

When a model outputs results that are unfair to certain groups.

4

Linguistic Bias

A bias arising from stereotypes and prejudice embedded in language, which a model can absorb from its training text.

5

Confirmation Bias

The tendency to search for, interpret, and remember information in a way that confirms one’s preexisting beliefs or hypotheses.

6

Labelling

The process in which humans add labels to raw data so it can be used to train an algorithm.

7

Sampling Bias

A bias introduced when a sample is collected in such a way that some members of the population have a higher or lower probability of being sampled than others.

8

Selection Bias

A bias that happens when examples of a dataset are not representative of how they’re distributed in real life.

9

Dataset

A collection of data.

10

Deep learning

A subset of machine learning focused on neural networks with multiple layers, enabling them to learn complex patterns from large datasets.

11

Graphical Processing Unit

A specialized processor designed to accelerate digital image processing and graphics rendering; its highly parallel design also makes it well suited to training neural networks.

12

Hyperparameter Tuning

The problem of optimizing the hyperparameters of a learning algorithm.

13

Hyperparameter

A parameter of the algorithm that is not part of the layers of the algorithm, but is rather determined outside of the algorithm itself, and is used to affect the learning process. Examples include learning rate, batch size, and number of epochs.
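An illustrative sketch: the learning rate and epoch count below are hyperparameters set outside the model, while `w` is a parameter the algorithm learns (the tiny quadratic objective is an assumption chosen only for demonstration):

```python
# Hyperparameters: fixed before training begins, not learned.
learning_rate = 0.1
epochs = 50

# Parameter: learned during training (here, minimising (w - 3)^2).
w = 0.0
for _ in range(epochs):
    gradient = 2 * (w - 3)       # slope of the objective at the current w
    w -= learning_rate * gradient
```

Changing `learning_rate` or `epochs` changes how training proceeds, without being part of the learned model itself.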

14

Large Language Model

An AI model for natural language processing with a very large number of parameters, typically trained using self-supervised learning.

15

Latency

The delay before a transfer of data begins following an instruction for its transfer.

16

Long Short-Term Memory

A type of recurrent neural network whose gated memory cells let it retain information over long sequences, mitigating the vanishing gradient problem.

17

Loss function

A function that quantifies how far a model's predictions are from the target values; training aims to minimize it.
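One common example is mean squared error, sketched here (the sample predictions and targets are arbitrary):

```python
def mse(predictions, targets):
    # Mean squared error: average squared gap between prediction and target.
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

mse([2.0, 3.0], [1.0, 3.0])  # 0.5 -- a lower value means a better fit
```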

18

Memory Cell State

The internal state of an LSTM cell at a given time, responsible for storing information that the network has learned over the sequence so far.

19

Natural Language Understanding

A subcategory of natural language processing that deals specifically with AI reading comprehension.

20

Discourse Integration

The process of analyzing text by considering the broader context surrounding a specific word, phrase, or sentence.

21

Lexical Analysis

The process of breaking down a string into tokens.
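A minimal tokenizer sketch; the rule used here (lowercase runs of letters and digits, via Python's `re` module) is one simple assumption among many possible tokenization schemes:

```python
import re

def tokenize(text):
    # Break the string into tokens: maximal runs of letters/digits.
    return re.findall(r"[a-z0-9]+", text.lower())

tokenize("The cat sat!")  # ['the', 'cat', 'sat']
```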

22

Pragmatic Analysis

The interpretation of language based on context, speaker intent, and real-world knowledge.

23

Semantic Analysis

The process which allows computers to interpret the correct context of words or phrases with multiple meanings.

24

Syntactical Analysis

The analysis of syntax and grammar.

25

Pre-Processing

The process of transforming raw data into a clean data set ready for training.

26

Recurrent Neural Network

A type of neural network designed for processing sequential data where order is important.

27

Self-Attention Mechanism

A mechanism for determining the importance of parts of an input sequence.
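A stripped-down sketch of scaled dot-product self-attention in NumPy. Omitting the learned query/key/value projections is a deliberate simplification; real models learn those matrices:

```python
import numpy as np

def self_attention(x):
    # Similarity of every position with every other, scaled by sqrt(dim).
    scores = x @ x.T / np.sqrt(x.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    # Each output row is a mix of all input rows, weighted by importance.
    return weights @ x, weights
```

Positions that are more similar to a given position receive larger weights, so they contribute more to its output.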

28

Synthetic Data

Artificially generated data.

29

Tensor Processing Unit

A chip designed by Google specifically to accelerate machine-learning workloads.

30

Transformer Neural Network

A neural network architecture that uses self-attention to weight more important tokens more heavily. Unlike an RNN, it processes a whole sequence in parallel rather than step by step.

31

Vanishing Gradient Problem

When the gradients used to update a network’s weights become extremely small during backpropagation.
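A toy illustration of why this happens: a sigmoid activation's derivative is at most 0.25, and backpropagation multiplies one such factor per layer (the 20-layer depth is an arbitrary assumption):

```python
# Each layer multiplies the backpropagated gradient by at most 0.25
# (the sigmoid's maximum derivative).
grad = 1.0
for _ in range(20):
    grad *= 0.25

# grad is now about 9.1e-13 -- far too small to update early layers usefully.
```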

32

Weights

Parameters in a neural network that adjust how input signals are processed, influencing the model’s learning process.