Lecture Notes

  • Title: Stochastic processes and simulation in the natural sciences

  • Instructor: Giacomo Zanella

  • Date: March 4, 2025

Contents

  1. Discrete-time Markov chains

    • 1.1 Introduction to Stochastic Processes

    • 1.2 Markov Chains

      • 1.2.1 Chapman-Kolmogorov equations

    • 1.3 Classification of states and recurrence

      • 1.3.1 Communication classes

      • 1.3.2 Transience and recurrence

      • 1.3.3 Random walks in d dimensions

    • 1.4 Convergence to stationarity

      • 1.4.1 Stationary distributions

      • 1.4.2 Convergence in distribution

      • 1.4.3 Convergence of asymptotic frequencies

      • 1.4.4 Ergodic averages and law of large numbers for MCs

    • 1.5 Counter-examples where the convergence theorem fails

      • 1.5.1 Infinite state spaces and non-existence of stationary distributions

      • 1.5.2 Lack of irreducibility and hitting times

      • 1.5.3 Time-inhomogeneity and self-reinforcing processes

    • 1.6 Reversibility and detailed balance

  2. Illustrative examples and applications of discrete-time Markov chains

    • 2.1 Random walks on graphs and PageRank

    • 2.2 Sequential testing

    • 2.3 Hidden Markov models

  3. Continuous-time Markov chains

    • 3.1 Background and tools

      • 3.1.1 Exponential distribution

      • 3.1.2 Hazard rates

      • 3.1.3 First events properties

    • 3.2 Markov property and transition probabilities

      • 3.2.1 Generator/Jump rates

    • 3.3 Examples of CTMCs

      • 3.3.1 Binary chain

      • 3.3.2 Poisson process

      • 3.3.3 Birth process with linear birth

      • 3.3.4 Process with birth, death and immigration

    • 3.4 Kolmogorov differential equations

      • 3.4.1 Kolmogorov backward and forward equations

    • 3.5 Jump chain and holding time construction

    • 3.6 Limiting behavior of CTMCs

    • 3.7 Birth and death processes

    • 3.8 The Poisson Process

      • 3.8.1 One-dimensional, homogeneous case

      • 3.8.2 One-dimensional, non-homogeneous case

      • 3.8.3 General case

Detailed Notes

Discrete-time Markov Chains

1.1 Introduction to Stochastic Processes

  • Stochastic processes represent models for random quantities that change or evolve over time.

  • Formally defined as a collection of random variables {Xt, t ∈ T} where Xt maps outcomes to states.

1.2 Markov Chains

  • Definition: A discrete-time stochastic process is a Markov chain if, given the present state, the future is conditionally independent of all prior states.

  • Transition Matrix: The matrix P with entries Pij = Pr(X(t+1) = j | X(t) = i) characterizes the behavior of the Markov chain; each row of P sums to 1.

  • Examples include the Gambler's ruin and random walks.
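
The transition-matrix mechanics above can be sketched in a short simulation. Everything here is a hypothetical toy instance: a gambler's ruin chain on {0, …, 4} with fair bets and absorbing boundaries at 0 and 4.

```python
import random

# Hypothetical toy instance: gambler's ruin on {0, 1, 2, 3, 4} with
# fair bets; states 0 and 4 are absorbing.
P = [
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 1.0],
]

def simulate(P, x0, n_steps, rng):
    """Simulate n_steps of the chain with transition matrix P from x0."""
    path = [x0]
    for _ in range(n_steps):
        u, cum = rng.random(), 0.0
        for j, p in enumerate(P[path[-1]]):
            cum += p        # sample the next state from row P[current]
            if u < cum:
                break
        path.append(j)
    return path

path = simulate(P, x0=2, n_steps=50, rng=random.Random(0))
```

Once the path hits 0 or 4 it stays there forever, which is the "ruin" (or "win") event of the example.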

1.3 Classification of States and Recurrence

1.3.1 Communication Classes

  • Two states communicate if each is accessible from the other.

  • States partition into classes based on communication.

1.3.2 Transience and Recurrence

  • A state is recurrent if, starting from that state, it will eventually return with probability 1; otherwise, it's transient.

1.3.3 Random Walks in d Dimensions

  • The simple symmetric random walk on Z^d is recurrent in dimensions d = 1 and d = 2 and transient in dimensions d ≥ 3 (Pólya's theorem).
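
The dimension dependence is easy to see by Monte Carlo: estimate the fraction of walks that revisit the origin within a finite horizon. The trial counts and horizon below are arbitrary choices.

```python
import random

def returns_to_origin(d, n_steps, rng):
    """Run a simple symmetric random walk on Z^d and report whether it
    revisits the origin within n_steps."""
    pos = [0] * d
    for _ in range(n_steps):
        axis = rng.randrange(d)          # pick a coordinate direction
        pos[axis] += rng.choice((-1, 1)) # step +1 or -1 along it
        if all(c == 0 for c in pos):
            return True
    return False

rng = random.Random(1)
trials, horizon = 500, 1000
frac = {d: sum(returns_to_origin(d, horizon, rng) for _ in range(trials)) / trials
        for d in (1, 3)}
# Polya: recurrent for d <= 2, transient for d >= 3, so the d = 1
# fraction should be near 1 and the d = 3 fraction well below it.
```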

1.4 Convergence to Stationarity

  • Stationary Distribution: A distribution π satisfying πP = π; if the chain starts in π, its distribution remains π at every subsequent time.

  • Convergence theorems establish conditions under which a Markov chain approaches this stationary distribution over time.
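
The defining property πP = π can be checked numerically: for an irreducible, aperiodic finite chain, iterating the update π ← πP from any starting distribution converges to the stationary distribution. The 3-state transition matrix below is a hypothetical example.

```python
# Hypothetical 3-state chain; iterate pi <- pi P until it stabilizes.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
]

def step(pi, P):
    """One application of the transition matrix: (pi P)_j = sum_i pi_i P_ij."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0, 0.0]   # start from a point mass at state 0
for _ in range(200):
    pi = step(pi, P)
# pi is now (numerically) stationary: step(pi, P) == pi.
```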

1.5 Counter-Examples Where the Convergence Theorem Fails

  • Discusses scenarios such as infinite state spaces where stationary distributions may not exist.

1.6 Reversibility

  • A Markov chain is reversible with respect to a distribution π if it satisfies the detailed balance equations πi Pij = πj Pji for all states i and j; detailed balance implies that π is stationary.
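
Detailed balance is straightforward to verify numerically. The tridiagonal (birth-death type) chain below is a hypothetical example that is reversible with respect to the uniform distribution; a cyclic chain, by contrast, violates detailed balance.

```python
# Hypothetical birth-death chain on {0, 1, 2}; doubly stochastic, so the
# uniform distribution is stationary, and tridiagonal chains of this
# form satisfy detailed balance.
P = [
    [0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 0.5, 0.5],
]
pi = [1 / 3, 1 / 3, 1 / 3]

def detailed_balance_holds(pi, P, tol=1e-12):
    """Check pi_i P_ij == pi_j P_ji for every pair of states."""
    n = len(P)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < tol
               for i in range(n) for j in range(n))
```

A deterministic cycle 0 → 1 → 2 → 0 also has the uniform distribution as stationary, yet fails detailed balance, showing that stationarity alone does not imply reversibility.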

Illustrative Examples and Applications

2.1 Random Walks on Graphs and PageRank

  • PageRank relies on the structure of directed graphs to rank web pages based on the probability of random walks.
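
A sketch of PageRank as a power iteration on a hypothetical 4-page link graph, with the usual damping factor d = 0.85 modelling a random surfer who follows a random outgoing link and occasionally teleports to a uniformly chosen page:

```python
# Hypothetical link graph: page -> list of pages it links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85

rank = [1 / n] * n
for _ in range(100):
    # Teleportation mass, then damped redistribution along outlinks.
    new = [(1 - d) / n] * n
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    rank = new
# rank approximates the stationary distribution of the random surfer.
```

Page 2, which receives the most inbound links, ends up with the highest rank, while the unlinked-to page 3 ranks lowest.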

2.2 Sequential Testing

  • Involves frameworks for comparing the effectiveness of drugs or treatments using randomized trials.

2.3 Hidden Markov Models

  • These models involve a latent Markov chain that is observed only through noisy emissions, and are widely applicable in signal processing.
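
As one concrete illustration (a hypothetical 2-state model with binary observations), the forward recursion computes the likelihood of an observation sequence by summing over all hidden paths in linear time:

```python
# Hypothetical 2-state HMM with binary observations.
init = [0.6, 0.4]                    # initial hidden-state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]     # hidden-state transition matrix
emit = [[0.9, 0.1], [0.2, 0.8]]      # emit[s][o] = P(obs = o | state = s)

def forward_likelihood(obs):
    """Forward algorithm: alpha[s] = P(obs so far, hidden state = s)."""
    alpha = [init[s] * emit[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(2)) * emit[j][o]
                 for j in range(2)]
    return sum(alpha)
```

Summing the likelihood over all observation sequences of a fixed length gives 1, a useful sanity check on the recursion.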

Continuous-Time Markov Chains

3.1 Background and Tools

3.1.1 Exponential Distribution

  • Continuous random variable with density f(t) = λe^(−λt) for t ≥ 0; it is memoryless and is the unique continuous distribution with a constant hazard rate λ.
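
Both properties can be checked by simulation. The sketch below samples Exp(λ) variates by inverse transform and compares the conditional survival probability P(X > s + t | X > s) with P(X > t); all parameter values are arbitrary.

```python
import math
import random

def sample_exponential(rate, u):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    -log(U) / rate has density rate * exp(-rate * t)."""
    return -math.log(u) / rate

rng = random.Random(42)
rate, n = 2.0, 100_000
samples = [sample_exponential(rate, rng.random()) for _ in range(n)]
mean = sum(samples) / n   # should be close to 1 / rate = 0.5

# Memorylessness: P(X > s + t | X > s) = P(X > t).
s, t = 0.3, 0.5
survivors = [x for x in samples if x > s]
cond = sum(x > s + t for x in survivors) / len(survivors)
uncond = sum(x > t for x in samples) / n
```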

3.2 Markov Property

  • A continuous-time Markov chain is a continuous-time stochastic process satisfying the Markov property: conditional on the present state, the future evolution is independent of the past.

3.3 Examples of CTMCs

  • Discusses practical applications of CTMCs such as birth-death processes and Poisson processes.

3.4 Kolmogorov Equations

  • Define and derive the differential equations governing the transition probabilities in CTMCs.
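
For the binary chain the forward equation P′(t) = P(t)Q can be integrated numerically and compared against the known closed-form solution; the rates below are arbitrary.

```python
import math

# Binary chain: jump 0 -> 1 at rate lam, 1 -> 0 at rate mu.
lam, mu = 1.5, 0.5
Q = [[-lam, lam], [mu, -mu]]   # generator matrix

def euler_P(t, n_steps=100_000):
    """Integrate the forward equation P'(t) = P(t) Q with Euler steps,
    starting from P(0) = I."""
    h = t / n_steps
    P = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(n_steps):
        P = [[P[i][j] + h * sum(P[i][k] * Q[k][j] for k in range(2))
              for j in range(2)] for i in range(2)]
    return P

t = 1.0
P = euler_P(t)
# Closed-form solution for the binary chain:
p00_exact = mu / (lam + mu) + lam / (lam + mu) * math.exp(-(lam + mu) * t)
```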

3.5 Jump Chain and Holding Time Construction

  • Introduces techniques to simulate trajectories of CTMCs using sequences of jump chains and holding times.

3.6 Limiting Behavior

  • Discusses how to derive the long-run behavior of CTMCs and identifies limiting distributions.

3.7 Birth and Death Processes

  • Explains modeling techniques for populations and queues using birth-death chains.
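
For a birth-death chain with constant rates (an M/M/1 queue, used here as a hypothetical example) the stationary distribution follows the product formula πn ∝ ∏_{k<n} λk/μ_{k+1}, which reduces to the geometric πn = (1 − ρ)ρ^n with ρ = λ/μ < 1:

```python
# M/M/1 queue: constant birth rate lam, death rate mu; stable when rho < 1.
lam, mu = 1.0, 2.0
rho = lam / mu
N = 200   # truncation level; the geometric tail beyond N is negligible

weights = [rho ** n for n in range(N)]   # product formula: rho^n here
Z = sum(weights)
pi = [w / Z for w in weights]

mean_queue = sum(n * p for n, p in enumerate(pi))
# Theory: pi_n = (1 - rho) rho^n, with mean queue length rho / (1 - rho).
```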

3.8 Poisson Process

3.8.1 Non-Homogeneous Poisson Process (NHPP)

  • Discusses the non-homogeneous Poisson process and its definition in the context of counting processes.
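
A standard way to simulate an NHPP with intensity λ(t) bounded above by λmax is Lewis-Shedler thinning: generate a homogeneous Poisson process of rate λmax and keep the point at time t with probability λ(t)/λmax. The linear intensity below is a made-up example.

```python
import math
import random

def thinning(rate_fn, rate_max, t_max, rng):
    """Lewis-Shedler thinning: simulate a homogeneous Poisson process of
    rate rate_max and accept each point at time t w.p. rate_fn(t)/rate_max."""
    t, events = 0.0, []
    while True:
        t += -math.log(rng.random()) / rate_max   # Exp(rate_max) gap
        if t >= t_max:
            return events
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)

# Hypothetical intensity: rate rises linearly over [0, 10], bounded by 5.
rate_fn = lambda t: 0.5 * t
events = thinning(rate_fn, rate_max=5.0, t_max=10.0, rng=random.Random(7))
```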
