Algorithm Analysis Study Notes
Algorithms (Algorithm Analysis)
Lecturer: Pramod Ganapathi
Affiliation: Department of Computer Science, State University of New York at Stony Brook
Date: September 2, 2025
Contents
Asymptotic Analysis
Computing Complexity
Comparing Complexities
Comparing Complexities (Advanced)
Questions in Algorithm Analysis
Problem:
Which is better among the two given computer programs A and B that solve the same problem?
Solution:
Compute the goodness $G_A$ of A and the goodness $G_B$ of B
Compare $G_A$ and $G_B$ and choose the better program
What can the goodness of a computer program be?
Correctness
Time
Space
Comparing Correctness and Runtime Complexity
Problem:
Which is faster among the two given correct computer programs A and B that solve the same problem?
Solution:
Compute the times $T_A$ of A and $T_B$ of B
Compare and choose the smaller among $T_A$ and $T_B$
Measure Actual Running Times
Solution:
Measure actual running times $T_A$ (in seconds) of A and $T_B$ (in seconds) of B
Compare and choose the smaller among $T_A$ and $T_B$
However, this is a bad idea!
Reasons:
Actual program runtime depends on several factors:
Algorithm
Data structure
Input size
Input data distribution
Computing machine
Operating system
Compiler
Programming language
Library
Coding style
Therefore, comparing actual program runtimes is problematic!
Better Alternative for Comparison
Problem:
Which is faster among the two given correct computer programs A and B that solve the same problem?
Solution:
Compute the numbers of basic operations $T_A(n)$ of A and $T_B(n)$ of B as functions of the input size $n$
Compare the functions $T_A(n)$ and $T_B(n)$ and choose the smaller
Why is this a better idea?
It gives a mathematical model for comparing computational complexity while removing the external influences listed above (machine, operating system, compiler, and so on).
Asymptotic Analysis
Definition:
Asymptotic analysis is a mathematical framework for analyzing the complexity of algorithms.
Key aspects of asymptotic analysis:
Models complexities of algorithms as functions
Compares these functions using operators analogous to the comparison operators for real numbers: $<, \le, =, \ge, >$, as well as analogues of min and max.
Examples of Complexity Notation
Time Complexity:
Space Complexity:
Query Complexity:
Comparing Functions
How can you compare functions using asymptotic operators?
The operators $o, O, \Theta, \Omega, \omega$ play roles analogous to $<, \le, =, \ge, >$ for real numbers; their precise definitions are given in the Asymptotic Notations section below.
Examples of Asymptotic Analysis
Running Times of Algorithms:
Search in a sorted array (binary search): $\Theta(\log n)$
Search in an unsorted array: $\Theta(n)$
Integer addition ($n$ digits): $\Theta(n)$
Generate primes up to $n$ (sieve of Eratosthenes): $\Theta(n \log \log n)$
Sorting (e.g., Quick Sort): $\Theta(n \log n)$ on average
Fast Fourier Transform: $\Theta(n \log n)$
Integer multiplication ($n$ digits): $\Theta(n^2)$ by the school method; asymptotically faster algorithms exist
Matrix multiplication: $\Theta(n^3)$ by the standard method; asymptotically faster algorithms exist
Linear programming: polynomial class
Primality test: polynomial class
Satisfiability problem: exponential class (no polynomial-time algorithm known)
Traveling salesperson problem: exponential class
Sudoku, Chess, Checkers, Go: exponential class
Simulation problem, Halting problem: infinite (undecidable)
Program correctness, Program equivalence: infinite (undecidable)
Integral roots of a polynomial: infinite (undecidable)
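The gap between the first two entries can be made concrete with a short Python sketch (an illustration added to these notes, not part of the lecture) that counts array probes for linear and binary search:

```python
def linear_search_probes(a, key):
    # Count array probes (loop iterations) made by sequential search.
    count = 0
    for x in a:
        count += 1
        if x == key:
            break
    return count

def binary_search_probes(a, key):
    # Count array probes made by binary search on a sorted array.
    lo, hi, count = 0, len(a) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        count += 1
        if a[mid] == key:
            break
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return count

n = 1_000_000
a = list(range(n))
# Worst case for linear search: the key is the last element (n probes);
# binary search halves the range each time, so it needs about log2(n) probes.
worst_linear = linear_search_probes(a, n - 1)
worst_binary = binary_search_probes(a, n - 1)
```

Searching a million-element array, the linear scan probes every element while binary search probes roughly twenty.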
Basic Operations and Computational Time
Definition:
Computational time = number of basic operations
Basic Operation Definition:
The most fundamental operation of the algorithm, which takes constant time. This includes:
Arithmetic operation: (+, -, ×, ÷)
Comparison operation: (<, ≤, =, ≠, >, ≥)
Memory operation: assignment (e.g., $a \leftarrow b$) or array access (e.g., $C[i]$)
Function invocation and return
Example of Computational Time
Calculation of Sum of Array Elements:
```plaintext
Sum(A[1..n])
  sum ← 0
  for i ← 1 to n do
    sum ← sum + A[i]
  return sum
```
Computational time involves:
Approximately $n$ comparisons
Approximately $n$ additions
Approximately $n$ memory index accesses
Approximately $n$ assignments
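These counts can be checked with an instrumented Python version of Sum (a sketch added to these notes, not part of the lecture):

```python
def sum_with_counts(A):
    # Instrumented version of the Sum pseudocode: tallies each kind
    # of basic operation as it happens.
    ops = {"comparisons": 0, "additions": 0,
           "memory_accesses": 0, "assignments": 0}
    total = 0
    ops["assignments"] += 1          # sum ← 0
    n = len(A)
    i = 1
    ops["assignments"] += 1          # i ← 1
    while True:
        ops["comparisons"] += 1      # loop test: i ≤ n ?
        if i > n:
            break
        ops["memory_accesses"] += 1  # read A[i]
        ops["additions"] += 1        # sum + A[i]
        ops["assignments"] += 1      # sum ← ...
        total += A[i - 1]
        i += 1
    return total, ops

total, ops = sum_with_counts(list(range(10)))
# Every counter grows linearly with n, so the total work is Θ(n).
```

For $n = 10$ this yields 11 comparisons, 10 additions, 10 memory accesses, and 12 assignments, matching the "approximately $n$" counts above.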
Complexity Analysis: Worst-case, Best-case, and Average-case
Worst-case Complexity
Definition: Complexity for the worst-case input of size $n$ which takes the longest to process among all inputs of that size.
Best-case Complexity
Definition: Complexity for the best-case input of size $n$ which takes the shortest time among all inputs of that size.
Average-case Complexity
Definition: Complexity for a typical or random input of size $n$.
Amortized Complexity
Definition: Average complexity over a sequence of operations.
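The classic illustration of amortized complexity (an example added here, not from the lecture) is appending to a dynamic array that doubles its capacity when full: an individual append occasionally costs $\Theta(n)$ for a resize, yet the average cost per append over the whole sequence is $O(1)$.

```python
def append_costs(num_appends):
    # Simulate a dynamic array that doubles its capacity when full.
    # Each append costs 1 write, plus the number of elements copied
    # when a resize is triggered.
    capacity, size, costs = 1, 0, []
    for _ in range(num_appends):
        cost = 1
        if size == capacity:
            cost += size      # copy all existing elements to the new array
            capacity *= 2
        size += 1
        costs.append(cost)
    return costs

costs = append_costs(10_000)
# A single append can cost Θ(n) (at the resizes), yet the average cost
# per append over the whole sequence stays below a small constant.
```

The copy costs $1 + 2 + 4 + \cdots$ form a geometric series bounded by $2n$, so the amortized cost per append is below 3.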
Sequential Search Algorithm
Problem:
What are the worst-case, best-case, and average-case analyses for the sequential search algorithm?
```plaintext
Sequential-Search(A[1..n], key)
Input: input array A and search key
Output: the index of the first element in A that matches key, or -1 if no match is found
  for i ← 1 to n do
    if A[i] = key then return i
  return -1
```
Solution:
Worst-case: $T_{worst}(n) = n$ (all elements must be checked)
Best-case: $T_{best}(n) = 1$ (key is found at the first position)
Average-case: assume the key, if present, is equally likely to match first at any of the $n$ positions, and let $p$ be the probability that the search succeeds. A successful search inspects $(n+1)/2$ elements on average, and an unsuccessful search inspects all $n$, so
$$T_{avg}(n) = p \cdot \frac{n+1}{2} + (1 - p) \cdot n$$
What happens when you set $p = 1$ (search successful) or $p = 0$ (search unsuccessful)?
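The two extreme cases of the formula can be checked empirically with a short Python simulation (an illustration added to these notes, not part of the lecture):

```python
import random

def probes(A, key):
    # Number of elements sequential search inspects before stopping.
    for i, x in enumerate(A, start=1):
        if x == key:
            return i
    return len(A)

n, trials = 101, 200_000
A = list(range(n))
random.seed(0)

# p = 1 (always successful): key drawn uniformly from the array,
# so the average probe count should approach (n + 1) / 2 = 51.
avg = sum(probes(A, random.choice(A)) for _ in range(trials)) / trials
assert abs(avg - (n + 1) / 2) < 1

# p = 0 (always unsuccessful): the key is absent, so all n elements
# are inspected every time.
assert probes(A, -1) == n
```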
Asymptotically Positive Functions
Definition
Let $T(n)$ be asymptotically positive, meaning it is positive for all sufficiently large $n$:
$$\exists n_0 \ \text{such that} \ \forall n \ge n_0, \; T(n) > 0$$
Examples of Asymptotically Positive Functions
Number of computations
Extra space
Number of cache misses
Number of queries
Note: From this point onward, it is assumed that $T(n)$ and similar functions are asymptotically positive.
Asymptotic Notations
Definitions
Let us define the binary relations over functions. For two asymptotically positive functions $f(n)$ and $g(n)$:
$f(n) = O(g(n))$ if there exist constants $c > 0$ and $n_0$ such that $f(n) \le c \cdot g(n)$ for all $n \ge n_0$ (asymptotic $\le$)
$f(n) = \Omega(g(n))$ if there exist constants $c > 0$ and $n_0$ such that $f(n) \ge c \cdot g(n)$ for all $n \ge n_0$ (asymptotic $\ge$)
$f(n) = \Theta(g(n))$ if both $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$ (asymptotic $=$)
$f(n) = o(g(n))$ if for every constant $c > 0$ there exists $n_0$ such that $f(n) < c \cdot g(n)$ for all $n \ge n_0$ (asymptotic $<$)
$f(n) = \omega(g(n))$ if for every constant $c > 0$ there exists $n_0$ such that $f(n) > c \cdot g(n)$ for all $n \ge n_0$ (asymptotic $>$)
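As a worked example of exhibiting the witness constants (added here for illustration): to show $3n^2 + 10n = O(n^2)$, take $c = 4$ and $n_0 = 10$.

```latex
% Since 10n <= n * n = n^2 whenever n >= 10,
\[
  3n^2 + 10n \;\le\; 3n^2 + n^2 \;=\; 4n^2
  \quad \text{for all } n \ge 10,
  \qquad \therefore \; 3n^2 + 10n = O(n^2).
\]
```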
Computing Limits with L'Hôpital's Rule
Rule:
Suppose functions $f$ and $g$ are differentiable on an open interval $I$, except possibly at a point $c$ contained in $I$. If
$$\lim_{x \to c} f(x) = \lim_{x \to c} g(x) = 0 \ \text{(or both are} \ \pm\infty\text{)},$$
$g'(x) \neq 0$ for all $x$ in $I$ with $x \neq c$, and $\lim_{x \to c} f'(x)/g'(x)$ exists, then
$$\lim_{x \to c} \frac{f(x)}{g(x)} = \lim_{x \to c} \frac{f'(x)}{g'(x)}$$
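A typical application in asymptotic analysis (an example added here): L'Hôpital's rule shows that $\ln n$ grows strictly slower than $n$.

```latex
\[
  \lim_{n \to \infty} \frac{\ln n}{n}
  = \lim_{n \to \infty} \frac{(\ln n)'}{(n)'}
  = \lim_{n \to \infty} \frac{1/n}{1}
  = 0
  \quad \Longrightarrow \quad \ln n = o(n).
\]
```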
Non-decreasing Order of Functions
Problem:
Prove that the following non-decreasing order applies:
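As an illustration (the particular list used in the lecture may differ), one commonly proved chain of growth rates is:

```latex
% f \prec g denotes f(n) = o(g(n)); each relation can be verified
% with the limit definitions, e.g., via L'Hopital's rule.
\[
  1 \;\prec\; \log\log n \;\prec\; \log n \;\prec\; \sqrt{n} \;\prec\; n
  \;\prec\; n \log n \;\prec\; n^2 \;\prec\; n^3 \;\prec\; 2^n
  \;\prec\; n! \;\prec\; n^n
\]
```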
Practice Problems
Problem:
Evaluate the complexity of the Loooooooooop kernel based on given parameters.
Kernel: Loooooooooop(n)
```plaintext
for i ← I_start; i ≤ I_end; I_incr do
  for j ← J_start; j ≤ J_end; J_incr do
    do nothing
```
Various values for $I_{start}$, $I_{end}$, $I_{incr}$, $J_{start}$, $J_{end}$, and $J_{incr}$ are given, for example:
$I_{e} \leftarrow I_{incr}$
$J_{s} \leftarrow J_{e}$
$J_{i} \leftarrow 2$
Example Evaluations:
The iteration counts of the inner and outer loops combine to give the total complexity. For instance, the time complexity can be derived by substituting the loop bounds into a nested summation and analyzing the resulting series.
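One concrete instance (with hypothetical parameter choices, since the lecture's exact settings are not given here): let the outer loop run $i \leftarrow 1$ to $n$ with $i \leftarrow i + 1$, and the inner loop run $j \leftarrow 1$ to $n$ with the doubling update $j \leftarrow 2j$. The inner loop then executes $\lfloor \log_2 n \rfloor + 1$ times per outer iteration, for a total of $\Theta(n \log n)$.

```python
import math

def loop_count(n):
    # Iterations of a nested loop with the hypothetical parameters:
    #   outer: i ← 1; i ≤ n; i ← i + 1
    #   inner: j ← 1; j ≤ n; j ← 2j   (the inner index doubles)
    count = 0
    i = 1
    while i <= n:
        j = 1
        while j <= n:
            count += 1
            j *= 2
        i += 1
    return count

# The inner loop runs floor(log2 n) + 1 times per outer iteration,
# so the total is n * (floor(log2 n) + 1) = Θ(n log n).
for n in (1, 10, 100, 1000):
    assert loop_count(n) == n * (math.floor(math.log2(n)) + 1)
```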
Important Additional Concepts
Polynomial Class
If $f(n)$ is a polynomial of degree $k$, i.e., $f(n) = a_k n^k + a_{k-1} n^{k-1} + \cdots + a_1 n + a_0$ with $a_k > 0$, then $f(n) = \Theta(n^k)$: a polynomial's asymptotic growth is determined by its highest-degree term.
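A short limit-based proof sketch (added here for completeness):

```latex
\[
  \lim_{n \to \infty} \frac{f(n)}{n^k}
  = \lim_{n \to \infty}
    \left( a_k + \frac{a_{k-1}}{n} + \cdots + \frac{a_0}{n^k} \right)
  = a_k .
\]
% Since 0 < a_k < infinity, the limit criterion for asymptotic
% notation gives f(n) = Theta(n^k).
```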
Comparative Problems
These problems establish relationships between growth rates, for example, where logarithmic functions fall relative to polynomial ones ($\log^a n = o(n^b)$ for any constants $a, b > 0$).
Example Evaluations
These involve verifying the required conditions and computing limits using algebra and series analysis.
Additional Resources
Important Links for Review:
Brush up on mathematical skills
Derivatives and differentiation rules
List of integrals
Maclaurin series
List of mathematical series
Conclusion
These notes provide both a framework for understanding algorithmic complexity and efficiency and practical exercises that cement the material through hands-on analysis. Together, the foundational principles and proofs form an essential toolkit for analyzing program performance and selecting algorithms.