Digital Communication – Random Processes to M-ary PSK

Random Processes & Noise

  • Random Variable (RV) vs Random Process (RP)

    • Random Variable (RV): A function that maps each outcome in the sample space of a random experiment to a real number. For example, the outcome of a single coin flip mapped to 0 or 1.

    • Random Process (RP) X(t): A family of random variables indexed by time t (or sometimes space); for each outcome of the underlying experiment, a complete function of time (a waveform) is generated. Unlike an RV, which gives a single value, an RP describes the evolution of a random phenomenon over time. Each outcome yields a unique sample function or realization.

  • Ensemble & Sample Function

    • Ensemble: The collection of all possible sample functions (waveforms) that a random process can produce. It represents all possible 'histories' of the random phenomenon.

    • Sample function: A single specific waveform or realization obtained from the random process for one particular outcome of the underlying experiment. For instance, if you measure noise on a wire over time, one such measurement trace is a sample function.

  • Stationarity

    • Strict-sense Stationarity (SSS): A random process is SSS if all its joint cumulative distribution functions (CDFs) remain invariant to any shift in time. This means the statistical properties (like mean, variance, higher-order moments) are constant over time.

    • Wide-sense Stationarity (WSS): A less restrictive form of stationarity. A random process is WSS if and only if:

      1. Its mean m_x = E[X(t)] is constant and independent of time.

      2. Its autocorrelation function R_x(τ) = E[X(t)X(t+τ)] depends only on the time difference τ (lag) and not on the absolute time t. WSS is commonly assumed in practical communication systems because it simplifies analysis while still capturing essential statistical behaviors.

    • Cyclostationary Process: A type of non-stationary process whose statistical properties (mean, autocorrelation) vary periodically with time. Often observed in modulated signals.
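A small NumPy sketch (all parameters illustrative, not from the text) of why a modulated signal is cyclostationary rather than WSS: multiplying i.i.d. symbols by a carrier gives an instantaneous power that repeats with the carrier period.

```python
import numpy as np

rng = np.random.default_rng(0)

# x[n] = s[n] * cos(2*pi*f0*n) with i.i.d. unit-variance symbols s[n] has
# E[x[n]] = 0 but time-varying power E[x[n]^2] = cos^2(2*pi*f0*n), which
# repeats with the carrier period: the process is cyclostationary, not WSS.
M, N, f0 = 5000, 16, 1 / 8                    # realizations, samples, carrier frequency
n = np.arange(N)
s = rng.normal(size=(M, N))                   # random symbols, one row per realization
x = s * np.cos(2 * np.pi * f0 * n)            # ensemble of M sample functions

power = np.mean(x**2, axis=0)                 # ensemble-average power vs time
print(power.round(2))   # follows cos^2(2*pi*n/8): large at n = 0, 4, 8, ..., near 0 at n = 2, 6, ...
```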

  • Ergodicity (in mean / autocorrelation)

    • A random process is ergodic (e.g., in mean or autocorrelation) if its time averages (calculated over a single, infinitely long sample function) are equal to its ensemble averages (calculated across all sample functions at a fixed time). This property is crucial in practice as it allows us to infer statistical properties of a random process by performing measurements on just a single sufficiently long sample function, rather than requiring an infinitely large ensemble of measurements.
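A minimal NumPy sketch (illustrative values) contrasting an ergodic process, where one long realization suffices, with a non-ergodic counterexample:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ergodic example: X[n] = 3 + white Gaussian noise. The time average of one
# long realization converges to the ensemble mean E[X] = 3.
x = 3.0 + rng.normal(size=100_000)
print(x.mean())                       # time average of ONE realization, near 3.0

# Non-ergodic counterexample: X[n] = A for all n, where A ~ N(0, 1) is drawn
# once per realization. The ensemble mean is E[A] = 0, but the time average
# of any single realization is just that realization's A, so they disagree.
A = rng.normal()
y = np.full(100_000, A)
print(y.mean())                       # equals A, generally not 0
```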

  • Moments for WSS RP

    • Mean m_x = E[X(t)]: The average value of the random process, which is constant for a WSS process.

    • Autocorrelation R_x(τ) = E[X(t)X(t+τ)]: Measures the statistical similarity between the process at time t and at time t+τ. For a WSS process, this depends only on the time lag τ. It provides information about the power spectral density of the process.

    • Auto-covariance C_x(τ) = R_x(τ) − m_x²: Similar to autocorrelation but measures the correlation around the mean. For a zero-mean process, C_x(τ) = R_x(τ).
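The lag-0 relationships can be checked numerically; a short sketch assuming X[n] = m + W[n] with hypothetical values m = 2, σ = 0.5, so that R_x(0) = σ² + m² and C_x(0) = σ²:

```python
import numpy as np

rng = np.random.default_rng(2)

# For X[n] = m + W[n] with white noise W of variance sigma^2:
#   R_x(0) = E[X^2] = sigma^2 + m^2   and   C_x(0) = R_x(0) - m^2 = sigma^2.
m, sigma = 2.0, 0.5
x = m + sigma * rng.normal(size=200_000)

R0 = np.mean(x * x)            # autocorrelation at lag 0
C0 = R0 - np.mean(x) ** 2      # auto-covariance at lag 0
print(R0, C0)                  # near 4.25 and 0.25 respectively
```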

  • Einstein–Wiener–Khintchine Theorem
    S_x(f) = ∫_{−∞}^{∞} R_x(τ) e^{−j2πfτ} dτ,   R_x(τ) = ∫_{−∞}^{∞} S_x(f) e^{j2πfτ} df
    This theorem states that the Power Spectral Density (PSD) S_x(f) and the autocorrelation function R_x(τ) of a WSS random process form a Fourier transform pair. This fundamental relationship allows us to analyze the frequency content of a random process from its time-domain correlation properties, and vice versa.
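A discrete-time sanity check of the theorem (NumPy): for a finite sequence, the DFT of the circular sample autocorrelation equals the periodogram |X(f)|²/N exactly, the sampled analogue of the Fourier pair above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Discrete-time check of the Wiener-Khintchine relationship: the DFT of the
# circular sample autocorrelation of a sequence equals its periodogram
# |X(f)|^2 / N, i.e. correlation and power spectrum form a Fourier pair.
N = 256
x = rng.normal(size=N)

# Circular sample autocorrelation r[k] = (1/N) * sum_n x[n] x[(n+k) mod N]
r = np.array([np.dot(x, np.roll(x, -k)) / N for k in range(N)])

psd_from_r = np.fft.fft(r).real               # DFT of the autocorrelation
periodogram = np.abs(np.fft.fft(x))**2 / N
print(np.allclose(psd_from_r, periodogram))   # True
```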

  • White Gaussian Noise (WGN)

    • Power Spectral Density (PSD): S_n(f) = N_0/2 (flat across all frequencies). This means WGN has equal power per unit bandwidth at every frequency, analogous to white light containing all colors. N_0 is the noise power per unit bandwidth (W/Hz).

    • Autocorrelation Function: R_n(τ) = (N_0/2) δ(τ). The autocorrelation function is an impulse at τ = 0, implying that noise samples at different time instants are uncorrelated (and statistically independent if the noise is Gaussian). The 'Gaussian' part means its amplitude distribution is Gaussian.

    • WGN is a crucial model in communication systems because it simplifies analysis and accurately approximates thermal noise encountered in electronic components.
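A quick NumPy illustration (σ chosen arbitrarily) of the impulse-like autocorrelation of sampled white Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(4)

# Discrete-time white Gaussian noise: samples are i.i.d. N(0, sigma^2), so the
# sample autocorrelation is approximately an impulse: sigma^2 at lag 0 and
# near zero at every other lag.
sigma = 1.5
n = rng.normal(scale=sigma, size=100_000)

r0 = np.mean(n * n)                 # lag 0: the noise power sigma^2
r1 = np.mean(n[:-1] * n[1:])        # lag 1: should be near zero
r5 = np.mean(n[:-5] * n[5:])        # lag 5: should be near zero
print(r0, r1, r5)                   # near 2.25, 0, 0
```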

  • Linear filtering of WSS RP
    When a WSS random process X(t) with mean m_x and PSD S_x(f) passes through a Linear Time-Invariant (LTI) filter with impulse response h(t) and frequency response H(f), the output process Y(t) is also WSS:

    • Output mean m_y = m_x H(0). The input mean (a DC component) is scaled by the filter's DC gain.

    • Output PSD S_y(f) = |H(f)|² S_x(f). The filter shapes the input PSD: the output power spectrum is the magnitude-squared of the filter's frequency response multiplied by the input PSD.

    • Output power E[Y²] = ∫_{−∞}^{∞} |H(f)|² S_x(f) df

      • For zero-mean processes, power equals variance: E[Y²] = σ_y² = R_y(0). This integral gives the total power of the output process, i.e. the area under the output PSD.
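The discrete-time analogue can be sketched with an FIR filter (hypothetical taps): for white input noise, sum(h²) plays the role of ∫|H(f)|² df via Parseval's theorem.

```python
import numpy as np

rng = np.random.default_rng(5)

# White noise with variance sigma^2 through an FIR filter h: the output is WSS
# with power E[Y^2] = sigma^2 * sum(h^2), the discrete analogue of
# E[Y^2] = integral of |H(f)|^2 S_x(f) df (by Parseval, sum(h^2) = integral of |H|^2).
sigma = 1.0
h = np.array([0.5, 0.3, 0.2])            # hypothetical FIR impulse response
x = rng.normal(scale=sigma, size=500_000)
y = np.convolve(x, h, mode="valid")

predicted = sigma**2 * np.sum(h**2)      # 0.25 + 0.09 + 0.04 = 0.38
measured = np.mean(y * y)
print(predicted, round(measured, 3))     # measured power is close to 0.38
```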

Integrate-and-Dump (I&D) Receiver
  • An Integrate-and-Dump (I&D) receiver is an optimal filter for detecting rectangular baseband pulses in the presence of AWGN. It works by integrating the received signal over the duration of one bit period, T_b, and then sampling the integrator output and resetting ('dumping') the integrator before the next bit begins.
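A minimal simulation sketch of the idea (NumPy; the bit count, amplitude, samples per bit, and noise level are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(6)

# Sketch of an integrate-and-dump receiver for bipolar rectangular pulses in
# AWGN: sum (integrate) the noisy samples over each bit period, decide by the
# sign of the integral, then reset (dump) for the next bit.
bits = rng.integers(0, 2, size=200)          # random data bits
A, Tb_samples, noise_std = 1.0, 50, 2.0      # amplitude, samples per bit, noise level

tx = np.repeat(2.0 * bits - 1.0, Tb_samples) * A       # rectangular +/-A pulses
rx = tx + rng.normal(scale=noise_std, size=tx.size)    # add white Gaussian noise

# Integrate over each bit period, then threshold at zero.
integrals = rx.reshape(-1, Tb_samples).sum(axis=1)
decisions = (integrals > 0).astype(int)
print(np.mean(decisions == bits))            # fraction of bits recovered
```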