Flashcards covering key concepts in Decision Theory, specifically focusing on expected value and decision trees.
Expected Value (EV)
A criterion that computes the expected payoff of each alternative so the one with the best expected outcome can be chosen.
Decision Tree
A schematic representation used to analyze sequential decisions with decision nodes and chance nodes.
Certainty
An environment where relevant parameters have known values.
Uncertainty
An environment where it is impossible to assess the likelihood of various possible future events.
Risk
An environment where future events have several possible outcomes whose probabilities are known or can be estimated.
Probability
A numerical measure, between 0 and 1, of the likelihood that an event will occur.
Expected Monetary Value (EMV)
The probability-weighted average payoff of an alternative, used to compare alternatives under risk.
Payoff
The return received from a particular decision or action.
Chance Node
Represents uncertain outcomes in a decision tree, depicted by a circular node.
Decision Node
Represents a decision point in a decision tree, depicted by a square node.
Analyzing from Right to Left
The fold-back method for evaluating a decision tree: starting from the end nodes, compute expected values at chance nodes and select the best alternative at decision nodes until the initial decision is reached.
Nature
An uncontrollable state of the environment that determines which outcome of a decision actually occurs.
Decision Making Under Risk
The process of making decisions when probabilities of outcomes are known but exact outcomes are uncertain.
EMV Formula
Calculated by multiplying each possible payoff by its probability and summing the results: EMV = Σ (probability × payoff).
Schematic Representation
A visual depiction of available alternatives along with their possible consequences, as seen in decision trees.
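The EMV formula and the right-to-left (fold-back) analysis above can be sketched in a few lines of code. This is a minimal illustration with made-up alternatives and payoffs (the plant-size names and all numbers are hypothetical, not from the cards):

```python
# Decision making under risk: each alternative leads to a chance node,
# represented here as a list of (probability, payoff) pairs.

def emv(outcomes):
    """Expected Monetary Value: sum of probability * payoff over all outcomes."""
    return sum(p * payoff for p, payoff in outcomes)

# Hypothetical decision node with two alternatives (illustrative numbers only).
alternatives = {
    "build large plant": [(0.6, 200_000), (0.4, -180_000)],
    "build small plant": [(0.6, 100_000), (0.4, -20_000)],
}

# Fold back from right to left: first evaluate each chance node's EMV,
# then choose the alternative with the best EMV at the decision node.
emvs = {name: emv(outs) for name, outs in alternatives.items()}
best = max(emvs, key=emvs.get)
```

With these numbers the large plant has an EMV of 48,000 and the small plant 52,000, so the fold-back step selects the small plant. For a multi-stage tree the same two operations (average at chance nodes, maximize at decision nodes) are simply applied recursively.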