IB HL CS P3 2025

Description

2025 case study key terminology


33 Terms

1

Backpropagation Through Time (BPTT)

An algorithm for training Recurrent Neural Networks (RNNs) by unfolding them through time and applying backpropagation at each time step to update weights.
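
As a minimal illustration (not from the case study), the sketch below unrolls a toy one-unit RNN over three time steps and backpropagates a squared error from the final step; all weights and inputs are made-up demo values. Note how the shared weights w and u accumulate gradient contributions from every time step.

```python
import numpy as np

# Toy one-unit RNN unrolled over three steps: h_t = tanh(w*x_t + u*h_{t-1}).
w, u = 0.5, 0.9
xs = [1.0, -0.5, 0.25]

hs = [0.0]                                    # h_0
for x in xs:                                  # forward pass: unfold through time
    hs.append(np.tanh(w * x + u * hs[-1]))

target = 1.0
dL_dh = 2 * (hs[-1] - target)                 # gradient of squared error at h_T
dw = du = 0.0
for t in reversed(range(len(xs))):            # backward pass through time
    pre = w * xs[t] + u * hs[t]               # pre-activation at step t
    dL_dpre = dL_dh * (1 - np.tanh(pre) ** 2) # tanh'(z) = 1 - tanh(z)^2
    dw += dL_dpre * xs[t]                     # shared weights accumulate
    du += dL_dpre * hs[t]                     #   gradients from every step
    dL_dh = dL_dpre * u                       # pass gradient back to h_{t-1}

print(dw, du)                                 # gradients for the weight updates
```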

2

Bag-of-Words

A text representation model that treats a document as an unordered collection (bag) of its words, ignoring grammar and word order but keeping word frequency.
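
A minimal Python sketch of the idea, using a simple whitespace split as the tokenizer (a real pipeline would tokenize more carefully):

```python
from collections import Counter

def bag_of_words(document: str) -> Counter:
    """Map each word to its frequency, discarding order and grammar."""
    tokens = document.lower().split()
    return Counter(tokens)

print(bag_of_words("the cat sat on the mat"))
# Counter({'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1})
```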

3

Confirmation Bias

A type of AI bias where a model tends to favor data or interpretations that confirm existing beliefs or assumptions.

4

Historical Bias

A type of AI bias inherited from past data, where the model reflects inequalities or patterns that existed historically.

5

Labelling Bias

A type of AI bias caused by inconsistently or subjectively labelled data, leading the model to learn misleading patterns.

6

Linguistic Bias

A type of AI bias where language differences like dialects or phrasing affect how the model interprets inputs.

7

Sampling Bias

A type of AI bias that arises when the data used is not representative of the population it’s meant to model.

8

Selection Bias

A type of AI bias caused by how data is chosen, potentially excluding important information or skewing the results.

9

Dataset

A structured collection of data used for training or evaluating machine learning models, often consisting of input-output pairs.

10

Deep Learning

A subset of machine learning that uses neural networks with multiple layers to learn complex features and patterns.

11

Graphics Processing Unit (GPU)

A specialized processor optimized for handling parallel operations, crucial for training deep learning models quickly.

12

Hyperparameter Tuning

The process of adjusting settings that sit outside a model's learned weights (such as the learning rate or batch size) to improve its performance.
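
One common approach is a grid search over candidate values. The sketch below is a runnable toy: train_and_evaluate is a hypothetical stand-in that returns a fake validation loss in place of a real training run.

```python
from itertools import product

def train_and_evaluate(lr: float, batch_size: int) -> float:
    """Stand-in for a real training run; returns a fake validation loss
    so the grid search below is runnable end to end."""
    return abs(lr - 0.01) + abs(batch_size - 32) / 1000

best_loss, best_cfg = float("inf"), None
for lr, bs in product([0.1, 0.01, 0.001], [16, 32, 64]):
    loss = train_and_evaluate(lr, bs)      # try every combination
    if loss < best_loss:
        best_loss, best_cfg = loss, (lr, bs)

print("best (learning_rate, batch_size):", best_cfg)
```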

13

Large Language Model (LLM)

A type of deep learning model trained on massive text datasets to perform language tasks such as generation and comprehension.

14

Latency

The time delay between a user’s input and the chatbot’s response; lower latency improves user experience.
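
A small sketch of how latency might be measured around a chatbot call; chatbot_reply here is a hypothetical stand-in, not a real API.

```python
import time

def timed_reply(chatbot_reply, prompt):
    """Return the reply plus the latency of the call in seconds."""
    start = time.perf_counter()
    reply = chatbot_reply(prompt)
    return reply, time.perf_counter() - start

# hypothetical stand-in for a real chatbot call
reply, latency = timed_reply(lambda p: p.upper(), "hello")
print(f"latency: {latency * 1000:.3f} ms")
```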

15

Long Short-Term Memory (LSTM)

A special type of RNN that uses memory cells and gates to retain long-term dependencies in sequential data.
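
A minimal NumPy sketch of a single LSTM time step, showing how the forget, input, and output gates update the memory cell state; weight shapes and values are illustrative toys, not a production implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W: (4n, d), U: (4n, n), b: (4n,), n = hidden size."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    f = sigmoid(z[0:n])            # forget gate: what to erase from memory
    i = sigmoid(z[n:2*n])          # input gate: what new info to write
    o = sigmoid(z[2*n:3*n])        # output gate: what to expose as h
    g = np.tanh(z[3*n:4*n])        # candidate values to write
    c = f * c_prev + i * g         # updated memory cell state
    h = o * np.tanh(c)             # new hidden state
    return h, c

rng = np.random.default_rng(0)
d, n = 3, 4                        # toy input and hidden sizes
h, c = lstm_step(rng.normal(size=d), np.zeros(n), np.zeros(n),
                 rng.normal(size=(4 * n, d)), rng.normal(size=(4 * n, n)),
                 np.zeros(4 * n))
```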

16

Loss Function

A mathematical function that measures the error between predicted outputs and actual values during training, guiding weight updates.
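
For example, mean squared error, one common loss function, can be written in a few lines of NumPy:

```python
import numpy as np

def mse_loss(predicted, actual):
    """Mean squared error: average squared gap between outputs and targets."""
    return float(np.mean((np.asarray(predicted) - np.asarray(actual)) ** 2))

print(mse_loss([0.9, 0.2], [1.0, 0.0]))   # (0.1**2 + 0.2**2) / 2 ≈ 0.025
```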

17

Memory Cell State

A component of LSTM networks that holds and updates the internal memory of the network over time.

18

Natural Language Processing (NLP)

A field of artificial intelligence focused on enabling machines to understand, interpret, and generate human language; includes lexical analysis, syntactical analysis, semantic analysis, discourse integration, and pragmatic analysis.

19

Discourse Integration

An NLP subfield that deals with how the meaning of one sentence depends on previous sentences in a conversation or text.

20

Lexical Analysis

An NLP subfield focused on breaking down text into individual words or tokens and analyzing them.

21

Pragmatic Analysis

An NLP subfield that interprets text based on context and intended meaning, such as sarcasm or indirect requests.

22

Semantic Analysis

An NLP subfield that focuses on determining the meaning of words and sentences, including resolving ambiguities.

23

Syntactical Analysis (Parsing)

An NLP subfield that examines the grammatical structure of sentences, checking how words combine according to the rules of the language.

24

Natural Language Understanding (NLU)

A subfield of NLP that focuses on the interpretation and extraction of meaning from human language input.

25

Pre-processing

The steps taken to clean and prepare raw data for model input, such as tokenization, removing stop words, or lowercasing text.
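
A minimal sketch of these steps in Python; the stop-word list is an illustrative subset, not a standard one.

```python
import re

STOP_WORDS = {"the", "a", "an", "is", "to", "of"}  # illustrative subset only

def preprocess(text: str) -> list[str]:
    """Lowercase, tokenize on letter runs, and drop stop words."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

print(preprocess("The chatbot is able to answer questions."))
# ['chatbot', 'able', 'answer', 'questions']
```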

26

Recurrent Neural Network (RNN)

A neural network model where connections form loops, allowing it to process sequences and remember prior inputs.
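
A minimal NumPy sketch of a vanilla RNN forward pass, where the hidden state h is fed back in at each step so it carries information about prior inputs; the dimensions are arbitrary toy values.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Vanilla RNN over a sequence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b)."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # loop: h remembers the past
        states.append(h)
    return states

rng = np.random.default_rng(0)
xs = [rng.normal(size=3) for _ in range(5)]     # 5 time steps, 3 features each
states = rnn_forward(xs, rng.normal(size=(4, 3)),
                     rng.normal(size=(4, 4)), np.zeros(4))
```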

27

Self-Attention Mechanism

A method used in transformer models to weigh the importance of different words in a sequence, regardless of position.
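
A minimal NumPy sketch of scaled dot-product self-attention, the core of this mechanism; the projection matrices are random toy values rather than trained weights.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Each position attends to every other position in the sequence."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    return weights @ V                               # weighted mix of values

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))                          # 6 tokens, 8-dim embeddings
out = self_attention(X, *(rng.normal(size=(8, 8)) for _ in range(3)))
```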

28

Synthetic Data

Artificially generated data used to augment training datasets or address issues like bias and data scarcity.

29

Tensor Processing Unit (TPU)

A hardware chip designed by Google to accelerate deep learning workloads, especially for tensor operations.

30

Transformer Neural Network (Transformer NN)

A deep learning architecture that uses self-attention to process sequences more efficiently than RNNs or LSTMs.

31

Vanishing Gradient

A problem during deep network training where gradients become too small to update early layers effectively, hindering learning.
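
A back-of-the-envelope illustration: the derivative of the sigmoid activation is at most 0.25, so a gradient multiplied by it once per layer shrinks geometrically as it travels back through the network.

```python
# Gradient passed back through 20 sigmoid layers, worst case per layer.
grad = 1.0
for layer in range(20):
    grad *= 0.25            # sigmoid derivative never exceeds 0.25
print(grad)                 # ~9.1e-13: early layers barely receive any signal
```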

32

Weights

Parameters in a neural network that determine the strength of connections between neurons and are adjusted during training to minimize loss.
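
A minimal illustration with toy numbers of how one weight is adjusted during training: a single gradient-descent step moves it against the gradient of a squared-error loss.

```python
w, lr = 0.8, 0.1                  # current weight and learning rate
x, target = 2.0, 1.0              # one training example
pred = w * x                      # forward pass: 1.6
grad = 2 * (pred - target) * x    # d/dw of (w*x - target)^2 = 2.4
w -= lr * grad                    # update: 0.8 - 0.1 * 2.4
print(w)                          # 0.56, moving the prediction toward target
```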

33

Hyperparameters

Settings chosen before training a machine learning model (such as the learning rate or number of layers) that control how the model learns.