Study Notes: Analysis of Algorithms and Asymptotic Analysis
Recurrences and Running Time
A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs. Example: T(n) = T(n-1) + n
Recurrences arise when an algorithm calls itself recursively
The goal is to solve the recurrence: obtain an explicit formula, and bound the recurrence by an expression that involves n
Typical complexity targets: \Theta(n^2), \Theta(\lg n), \Theta(n), etc. (focus on asymptotic bounds)
Common Recurrences and Their Theta Bounds
T(n) = T(n-1) + n → \Theta(n^2)
Interpretation: recursive algorithm that loops through the input to eliminate one item at a time
T(n) = T(n/2) + c → \Theta(\lg n)
Recursive algorithm that halves the input in one step
T(n) = T(n/2) + n → \Theta(n)
Recursive algorithm that halves the input but must examine every item in the input
T(n) = 2T(n/2) + 1 → \Theta(n)
Recursive algorithm that splits the input into 2 halves and does a constant amount of other work
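These bounds can be sanity-checked numerically. The sketch below is illustrative only (the base case T(1) = 1 and the additive constant c = 1 are assumptions, not from the notes): it evaluates each recurrence directly and divides by its claimed bound, and each ratio settles near a constant as n grows.

```python
import math

def t_dec(n):                 # T(n) = T(n-1) + n, assumed base T(1) = 1
    return 1 if n == 1 else t_dec(n - 1) + n

def t_half_c(n):              # T(n) = T(n/2) + c, with c = 1 assumed
    return 1 if n == 1 else t_half_c(n // 2) + 1

def t_half_n(n):              # T(n) = T(n/2) + n
    return 1 if n == 1 else t_half_n(n // 2) + n

def t_split(n):               # T(n) = 2T(n/2) + 1
    return 1 if n == 1 else 2 * t_split(n // 2) + 1

n = 2 ** 9                    # a power of two keeps the halving exact
print(t_dec(n) / n ** 2)          # near 1/2 -> Theta(n^2)
print(t_half_c(n) / math.log2(n)) # near 1   -> Theta(lg n)
print(t_half_n(n) / n)            # near 2   -> Theta(n)
print(t_split(n) / n)             # near 2   -> Theta(n)
```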
Binary Search (Recursive Algorithm)
Given a sorted array A[1..n], determines whether x occurs in the subarray A[lo..hi]
Algorithm: BINARY-SEARCH(A, lo, hi, x)
if (lo > hi) return FALSE
mid ← ⌊(lo+hi)/2⌋
if x = A[mid] return TRUE
if (x < A[mid]) then return BINARY-SEARCH(A, lo, mid-1, x)
if (x > A[mid]) then return BINARY-SEARCH(A, mid+1, hi, x)
Example: A[1..8] = {1, 2, 3, 4, 5, 7, 9, 11}
lo = 1, hi = 8, x = 7
mid = 4, A[mid] = 4; since x > A[mid], search right half: lo = 5, hi = 8
mid = 6, A[mid] = 7; x = A[mid] → FOUND, return TRUE
Another example with x = 6 (not in the array):
mid = 4, A[mid] = 4; since x > A[mid], search right half: lo = 5, hi = 8
mid = 6, A[mid] = 7; since x < A[mid], search left half: lo = 5, hi = 5
mid = 5, A[mid] = 5; since x > A[mid], search right half: lo = 6, hi = 5
lo > hi → NOT FOUND, return FALSE
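The pseudocode can be sketched in Python (0-indexed, unlike the 1-indexed A[1..n] above; a minimal illustration, not the notes' own code), run on the example array:

```python
def binary_search(A, lo, hi, x):
    if lo > hi:                            # empty range: x is not present
        return False
    mid = (lo + hi) // 2                   # floor of the midpoint
    if x == A[mid]:
        return True
    if x < A[mid]:                         # x can only lie in the left half
        return binary_search(A, lo, mid - 1, x)
    return binary_search(A, mid + 1, hi, x)

A = [1, 2, 3, 4, 5, 7, 9, 11]              # the example array from the notes
print(binary_search(A, 0, len(A) - 1, 7))  # True
print(binary_search(A, 0, len(A) - 1, 6))  # False
```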
Analysis of Binary Search
Recurrence: T(n) = c + T(n/2)
Solution: T(n) = \Theta(\lg n)
Intuition: at each step the problem size halves, so the depth of the recursion is \Theta(\lg n), and the work per level is constant
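This intuition can be checked by instrumentation. The helper below is hypothetical (not in the notes): it mirrors the search but returns the number of recursive calls, which never exceeds ⌊lg n⌋ + 1 across every possible query.

```python
import math

def search_depth(A, lo, hi, x):            # counts recursive calls made
    if lo > hi:
        return 0
    mid = (lo + hi) // 2
    if x == A[mid]:
        return 1
    if x < A[mid]:
        return 1 + search_depth(A, lo, mid - 1, x)
    return 1 + search_depth(A, mid + 1, hi, x)

n = 1024
A = list(range(n))                         # a sorted input of size n
worst = max(search_depth(A, 0, n - 1, x) for x in range(-1, n + 1))
assert worst <= math.floor(math.log2(n)) + 1   # Theta(lg n) depth
```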
Methods for Solving Recurrences
Iteration method
Substitution method
Recursion-tree method
Master method
The Iteration Method
Convert the recurrence into a summation and bound it using known series
Iterate until the initial condition is reached
Back-substitute to express in terms of n and the initial condition
Example 1: T(n) = c + T(n/2)
Expand:
T(n) = c + T(n/2)
= c + c + T(n/4)
= c + c + c + T(n/8)
Assume n = 2^k
After k steps: T(n) = k c + T(1) = c \log_2 n + T(1) = \Theta(\log n)
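The unrolled formula can be verified numerically; in the sketch below the constants c = 3 and T(1) = 5 are arbitrary assumptions chosen for the demo, and T(2^k) = c·k + T(1) holds exactly.

```python
# Check the unrolled closed form: for n = 2^k, T(n) = c*k + T(1).
def T(n, c=3, t1=5):
    return t1 if n == 1 else c + T(n // 2, c, t1)

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == 3 * k + 5      # c * log2(n) + T(1), exactly
```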
Example 2: T(n) = n + 2T(n/2)
Expand:
T(n) = n + 2T(n/2)
T(n/2) = n/2 + 2T(n/4)
Substitute: T(n) = n + 2(n/2 + 2T(n/4)) = n + n + 4T(n/4)
Continue: T(n) = n + n + n + 8T(n/8)
Pattern after i steps: T(n) = i n + 2^i T(n/2^i)
Let n = 2^k, after k steps: T(n) = k n + 2^k T(1) = n \log_2 n + n T(1) = \Theta(n \log n)
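The same numeric check for this recurrence (T(1) = 1 is an assumed base case): for n = 2^k the pattern gives T(n) = n·log₂n + n·T(1) exactly.

```python
# Check the pattern T(n) = i*n + 2^i * T(n/2^i) at i = k, i.e. n*log2(n) + n*T(1).
def T(n):
    return 1 if n == 1 else n + 2 * T(n // 2)

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n      # n*log2(n) + n*T(1) with T(1) = 1
```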
The Substitution Method
Steps:
Guess a solution: T(n) = O(g(n))
Induction goal: T(n) \le d\, g(n) for some constant d > 0 and all n \ge n_0
Induction hypothesis: for all smaller inputs, T(k) \le d\, g(k)
Prove the induction goal by using the hypothesis to bound the recurrence
Example: Binary Search (Recurrence T(n) = c + T(n/2))
Guess: T(n) = O(\lg n)
Induction goal: T(n) \le d \lg n, for some d, n_0
Induction hypothesis: T(n/2) \le d \lg(n/2)
Proof:
T(n) = T(n/2) + c \le d \lg(n/2) + c = d \lg n - d + c
To ensure T(n) \le d \lg n, require -d + c \le 0 \Rightarrow d \ge c
Base case must be checked
Example 2: T(n) = T(n-1) + n
Guess: T(n) = O(n^2)
Induction goal: T(n) \le c n^2
Induction hypothesis: T(n-1) \le c (n-1)^2
Proof: T(n) = T(n-1) + n \le c (n-1)^2 + n = c n^2 - 2c n + c + n
Require -2c n + c + n \le 0 for sufficiently large n. One choice: c \ge 1, since then -2c n + c + n \le -n + c \le 0 for all n \ge c.
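The bound can also be confirmed directly. With an assumed base case T(1) = 1, unrolling gives T(n) = n(n+1)/2, so the induction goal T(n) ≤ c·n² holds with c = 1 for every n ≥ 1:

```python
# Iteratively apply T(n) = T(n-1) + n and check both the closed form
# and the induction goal with c = 1.
T = 1                             # assumed base case T(1) = 1
for n in range(2, 1001):
    T = T + n                     # T(n) = T(n-1) + n
    assert T == n * (n + 1) // 2  # closed form n(n+1)/2
    assert T <= 1 * n * n         # the induction goal with c = 1
```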
Example 3: T(n) = 2T(n/2) + n
Guess: T(n) = O(n \log n)
Induction goal: T(n) \le c n \log n
Induction hypothesis: T(n/2) \le c (n/2) \log(n/2)
Proof: T(n) = 2T(n/2) + n \le 2 c (n/2) \log(n/2) + n = c n \log n - c n + n \le c n \log n if -c n + n \le 0 \Rightarrow c \ge 1
Changing variables (Renaming) – A trick to simplify
Example: T(n) = 2T(\sqrt{n}) + \log_2 n
Let m = \log_2 n, so n = 2^m and \sqrt{n} = 2^{m/2}
Then T(2^m) = 2 T(2^{m/2}) + m
Rename: S(m) = T(2^m)
So S(m) = 2 S(m/2) + m
From the previous methods, S(m) = O(m \log m)
Therefore T(n) = S(\log_2 n) = O(\log n \log \log n)
Idea: transform the recurrence to one you have already solved
The Recursion-Tree Method
Convert the recurrence into a recursion tree: each node represents the cost at a level
The total cost is the sum of costs across levels
Used to guide the guess of the solution
Example 1: W(n) = 2W(n/2) + n^2
Subproblem size at level i: n/2^i
Number of nodes at level i: 2^i
Cost per node at level i: \left(\frac{n}{2^i}\right)^2 = \frac{n^2}{4^i}
Total cost at level i: 2^i \cdot \frac{n^2}{4^i} = \frac{n^2}{2^i}
Depth: \log_2 n levels
Total work: W(n) = \sum_{i=0}^{\log n} \frac{n^2}{2^i} = n^2 \sum_{i=0}^{\log n} \left(\frac{1}{2}\right)^i \le 2 n^2 = O(n^2)
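A quick cross-check of the tree (taking W(1) = 1 as an assumed base case): summing the per-level costs plus the leaves reproduces the recursive value exactly, and the geometric series caps it at 2n².

```python
def W(n):                          # the recurrence, assumed base W(1) = 1
    return 1 if n == 1 else 2 * W(n // 2) + n * n

def level_sum(n):                  # sum the recursion tree level by level
    k = n.bit_length() - 1         # log2(n) for a power of two
    internal = sum(2**i * (n // 2**i) ** 2 for i in range(k))
    leaves = 2**k * 1              # 2^k leaves, each costing W(1) = 1
    return internal + leaves

for k in range(1, 11):
    n = 2 ** k
    assert W(n) == level_sum(n)    # the tree accounts for all the work
    assert W(n) <= 2 * n * n       # geometric series bound: W(n) = O(n^2)
```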
Example 2: T(n) = 3T(n/4) + c n^2
Subproblem size at level i: n/4^i
Cost per node at level i: c\left(\frac{n}{4^i}\right)^2 = c \frac{n^2}{16^i}
Number of nodes at level i: 3^i
Total cost at level i: c n^2 \left(\frac{3}{16}\right)^i
Depth: \log_4 n
Total cost: c n^2 \sum_{i=0}^{\log_4 n} \left(\frac{3}{16}\right)^i \le \frac{16}{13} c n^2, a convergent geometric series → T(n) = O(n^2)
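The same cross-check for this tree (c = 1 and T(1) = 1 are assumed constants): the internal levels sum to less than (16/13)·n², and the 3^k leaves add only a lower-order n^{\log_4 3} term.

```python
def T(n):                          # the recurrence with c = 1, base T(1) = 1
    return 1 if n == 1 else 3 * T(n // 4) + n * n

for k in range(1, 8):
    n = 4 ** k
    internal = sum(3**i * (n // 4**i) ** 2 for i in range(k))
    assert T(n) == internal + 3 ** k              # levels + leaves, exactly
    assert T(n) <= (16 / 13) * n * n + 3 ** k     # geometric series bound
```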
The Substitution Method (Summary)
The technique: guess a bound, then prove it by induction using the inductive hypothesis
Key idea: pick a target bound and prove it for all sufficiently large n, with appropriately chosen constants
Master Method (Cookbook for T(n) = a T(n/b) + f(n))
Form: T(n) = a\,T\left(\frac{n}{b}\right) + f(n) where a \ge 1, b > 1 and f(n) = \Theta(n^d) for some d \ge 0
Compare the growth of f(n) with n^{\log_b a}; equivalently, compare a with b^d
Results:
If a < b^d then T(n) = \Theta(n^d)
If a = b^d then T(n) = \Theta(n^d \log n)
If a > b^d then T(n) = \Theta(n^{\log_b a})
This is the standard Master Theorem used to quickly classify many recurrences
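As a worked illustration, the cookbook can be coded directly. The helper below is hypothetical (not a standard library routine) and implements exactly the three cases stated above:

```python
import math

def master(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) via the simplified master theorem."""
    if a < b ** d:
        return f"Theta(n^{d})"
    if a == b ** d:
        return f"Theta(n^{d} log n)"
    return f"Theta(n^{math.log(a, b):.3g})"

print(master(2, 2, 1))   # 2T(n/2) + n (merge sort)    -> Theta(n^1 log n)
print(master(1, 2, 0))   # T(n/2) + c (binary search)  -> Theta(n^0 log n) = Theta(log n)
print(master(3, 4, 2))   # 3T(n/4) + cn^2              -> Theta(n^2)
print(master(4, 2, 1))   # 4T(n/2) + n                 -> Theta(n^2)
```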