Markov Models and Transition Models

Decision Analysis with Markov Models

Introduction

  • This week focuses on Markov models, which are better suited than decision trees to real-world decisions in which people move between states over time.

  • Using Excel is required for calculations, but interpreting the results is the main challenge.

Decision Trees Recap

  • Decision trees break potential outcomes into branches, each with an assigned probability.

  • Probabilities are assigned to estimate costs for alternatives within a program.

  • The sum of expected values provides the expected total cost for a program alternative.

  • A key requirement is that probabilities at each chance node must sum up to 1 (or 100%).
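The expected-value calculation at a chance node can be sketched in a few lines of Python; the branch probabilities and costs below are illustrative assumptions, not figures from the lecture:

```python
# Expected cost at a chance node: the probability-weighted sum of branch costs.
# The probabilities and costs here are illustrative only.
branches = [
    (0.6, 1_000.0),   # 60% chance of a mild outcome costing $1,000
    (0.3, 5_000.0),   # 30% chance of a moderate outcome costing $5,000
    (0.1, 20_000.0),  # 10% chance of a severe outcome costing $20,000
]

# Key requirement: probabilities at a chance node must sum to 1.
assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9

expected_cost = sum(p * cost for p, cost in branches)
print(expected_cost)  # 0.6*1000 + 0.3*5000 + 0.1*20000 = 4100.0
```

Summing such expected values across the nodes of an alternative gives its expected total cost.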

Introducing Markov Models

  • Markov models address the limitations of decision trees by accounting for movement between disease states or conditions.

  • They can be illustrated as tree diagrams and are sometimes combined with decision trees.

Defining Health States
  • The initial step involves defining the health states relevant to the program being evaluated.

  • Example health states: well, ill, or dead; relief from a condition or no relief; headache or no headache.

Transitions Between States
  • Individuals move between states over time, sometimes returning to previous states.

  • Markov models, also known as transition models, evaluate the proportion or percentage of a population moving between states.

  • They are frequently used in survival analysis.

Probabilities and Cycles
  • As in decision trees, Markov models assign probabilities; here they represent the chance of transitioning between states, such as progressing in or improving from a disease state.

  • Transitions occur over defined time periods called cycles, often set to a year but can vary depending on the condition.
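Because cycle length can vary, probabilities drawn from the literature may need rescaling to the model's cycle. A common approach (not part of the lecture itself, so treat it as an assumption) converts the probability to a rate, rescales, and converts back; this assumes the underlying rate is constant over the cycle:

```python
import math

def rescale_probability(p, from_cycles, to_cycles):
    """Rescale a transition probability between cycle lengths,
    assuming a constant underlying rate within the cycle."""
    rate = -math.log(1.0 - p) / from_cycles   # probability -> rate per unit time
    return 1.0 - math.exp(-rate * to_cycles)  # rate -> probability

# E.g., turning the yearly A -> B monotherapy probability into a monthly one.
p_annual = 0.202
p_monthly = rescale_probability(p_annual, 12, 1)  # 12 months -> 1 month
print(round(p_monthly, 4))
```

Note that simply dividing the annual probability by 12 would be wrong, because it ignores compounding across the shorter cycles.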

Costs and Benefits
  • Each health state can have associated costs and benefits, with benefits often termed rewards.

  • Rewards are equivalent to payoffs in decision tree modeling.
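As a sketch of how state costs and rewards combine with a cohort distribution (the dollar figures, utility weights, and cohort proportions below are illustrative assumptions, not lecture values):

```python
# Hypothetical per-cycle cost and reward (e.g., a utility weight) for each
# health state; these numbers are illustrative, not from the lecture.
state_cost = {"A": 2_000.0, "B": 4_000.0, "C": 8_000.0, "D": 0.0}
state_reward = {"A": 1.00, "B": 0.75, "C": 0.50, "D": 0.00}

# Proportion of the cohort in each state during one cycle (also illustrative).
cohort = {"A": 0.70, "B": 0.20, "C": 0.08, "D": 0.02}

# As with payoffs in a decision tree, the cycle totals are
# probability-weighted sums over the states.
cycle_cost = sum(cohort[s] * state_cost[s] for s in cohort)
cycle_reward = sum(cohort[s] * state_reward[s] for s in cohort)
print(cycle_cost, cycle_reward)
```

Summing these per-cycle totals over all cycles (often with discounting) yields the model's overall expected costs and rewards.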

Example: HIV Treatment
  • A Markov model is applied to an HIV treatment scenario (combination therapy vs. monotherapy).

  • Health states include:

    • State A: Good health.

    • State B: Illness without full-blown AIDS.

    • State C: Full-blown AIDS.

    • State D: Death.

  • A one-year cycle is used for this illustration.

Recursive Model
  • The Markov model is recursive, allowing individuals to stay in one state across multiple cycles or, in general, return to a previous state.

  • Starting from state A, individuals can either remain in state A or transition to state B.

  • From state B, they could in principle return to state A, stay in state B, progress to full-blown AIDS, or transition to death.

  • Individuals in the final state (death) remain there.

  • Note that an individual can remain in the same state for multiple cycles.

One-Way Transition Assumption
  • The model assumes a one-way transition for simplicity, where patients progressing from state A to state B cannot revert to state A.

Transition Probabilities
  • Transition probabilities must sum up to 1 (100%) horizontally.

  • This ensures that all individuals in the initial population are accounted for.

  • In the example:

    • Monotherapy:

      • State A: 0.721 + 0.202 + 0.067 + 0.01 = 1

      • State B: 0.581 + 0.407 + 0.012 = 1

      • State C: 0.75 + 0.25 = 1

      • State D: 1 (death is final, so the probability of remaining is 1)
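The monotherapy figures above can be arranged as a transition matrix and the row-sum requirement checked mechanically; this is a sketch, with zeros filled in for the backward transitions the one-way assumption rules out:

```python
# Monotherapy transition probabilities from the example, states ordered
# A, B, C, D. Each row is a "from" state; the zeros encode the one-way
# assumption (no backward transitions).
P_mono = [
    [0.721, 0.202, 0.067, 0.010],  # from A: stay, -> B, -> C, -> D
    [0.000, 0.581, 0.407, 0.012],  # from B: stay, -> C, -> D
    [0.000, 0.000, 0.750, 0.250],  # from C: stay, -> D
    [0.000, 0.000, 0.000, 1.000],  # from D: death is final
]

# Verify the horizontal sums: every row must equal 1 so that the whole
# population is accounted for each cycle.
for i, row in enumerate(P_mono):
    assert abs(sum(row) - 1.0) < 1e-9, f"row {i} does not sum to 1"
```

A row that fails this check means some of the cohort would silently appear or disappear during that cycle.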

Comparison of Monotherapy and Combination Therapy
  • Combination therapy appears more effective due to lower transition probabilities from state A to state B.

    • State A to B: 10% (combination) vs. 20% (monotherapy).

  • Combination therapy results in fewer progressions from state B to C.

  • Combination therapy keeps more people alive in the AIDS state (about 12 percentage points more remain in state C each cycle) and correspondingly reduces transitions to the final death state.

Importance of Probabilities Summing to One
  • Always verify that the probabilities add up to one horizontally to account for everyone in the initial population.

  • Once in the final state, individuals remain there: there is no chance of recovery or transition to other states.
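A short cohort simulation makes this concrete. The sketch below uses the monotherapy probabilities from the example: each one-year cycle, the cohort distribution is multiplied through the transition matrix, the row sums guarantee no one is lost, and over enough cycles everyone accumulates in the final state:

```python
# One-way monotherapy transition matrix (states A, B, C, D) from the example.
P = [
    [0.721, 0.202, 0.067, 0.010],
    [0.000, 0.581, 0.407, 0.012],
    [0.000, 0.000, 0.750, 0.250],
    [0.000, 0.000, 0.000, 1.000],
]

def cycle(dist, P):
    """Advance the cohort one cycle: new_j = sum over i of dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0, 0.0]  # the whole cohort starts in state A (good health)
for _ in range(20):
    dist = cycle(dist, P)
    # The distribution still sums to 1 every cycle: everyone is accounted for.
    assert abs(sum(dist) - 1.0) < 1e-9

# After many cycles most of the cohort has reached the final state D,
# and no one ever leaves it.
print([round(x, 3) for x in dist])
```

This absorbing behaviour is exactly why survival analysis is a natural application for these models.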

Simplifications in the Model
  • The model simplifies the situation by not modeling backward transitions (e.g., from state B to state A).

  • A zero in the transition model implies no movement backward.

Obtaining Probabilities and Rewards

  • Probabilities are derived from research and literature review.

  • It's important to understand the disease states involved in the area being evaluated.

  • Decision trees can be used initially to build the model.

  • Recursive decisions involving staying in a state can complicate decision trees, making Markov models more suitable.