Algorithm Complexity

1. What is Algorithm Complexity?

Algorithm complexity measures the efficiency of an algorithm in terms of:

  • Time Complexity – Number of steps executed by an algorithm.

  • Memory Complexity – Amount of memory used during execution.

Factors Affecting Algorithm Running Time:

  1. Input size and type.

  2. Efficiency of the compiler.

  3. Speed of the executing machine.

  4. Complexity of the algorithm itself.
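To make the distinction concrete, here is a minimal Python sketch (illustrative, not from the original notes) contrasting two algorithms for the same task. Both compute the sum 1 + 2 + … + n, but the loop performs n additions while the closed-form formula performs a fixed amount of work regardless of n:

```python
def sum_linear(n):
    # O(n) time: one loop iteration per value of i
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_constant(n):
    # O(1) time: Gauss's formula, same work for any n
    return n * (n + 1) // 2

# Both produce the same result; only the step count differs.
print(sum_linear(1000))    # 500500
print(sum_constant(1000))  # 500500
```

This shows why complexity is about how work grows with input size, not about the answer itself: the machine speed and compiler affect the constant factors, but the growth rate belongs to the algorithm.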


2. Measuring Efficiency of Algorithms

Efficiency is categorized based on how execution time increases with the input size n.

Basic Complexity Categories:

| Complexity Type | Growth Rate | Example |
| --- | --- | --- |
| Constant (O(1)) | Fixed time, does not depend on n | Arithmetic operations, assignments, comparisons |
| Linear (O(n)) | Execution time grows proportionally with n | Printing an array of size n |
| Logarithmic (O(log n)) | Execution time grows slowly as n increases | Binary search |
| Linear Logarithmic (O(n log n)) | Grows faster than linear but slower than quadratic | Merge Sort, Quick Sort (average case) |
| Quadratic (O(n²)) | Execution time grows rapidly with n | Nested loops (e.g., Bubble Sort) |
| Cubic (O(n³)) | More expensive than quadratic; practical only for small inputs | Naive matrix multiplication |
| Exponential (O(2ⁿ)) | Grows exponentially; infeasible for large n | Towers of Hanoi |
| Factorial (O(n!)) | Grows the fastest of these classes | Generating all permutations |
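Binary search, the table's logarithmic example, can be sketched as follows (a minimal illustrative implementation with a comparison counter added so the growth rate is visible):

```python
def binary_search(arr, target):
    """Search a sorted list; return (index, comparisons) or (-1, comparisons).

    Each iteration halves the remaining range, so the loop runs
    at most about log2(n) times -- hence O(log n).
    """
    lo, hi = 0, len(arr) - 1
    comparisons = 0
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, comparisons
        elif arr[mid] < target:
            lo = mid + 1   # discard the lower half
        else:
            hi = mid - 1   # discard the upper half
    return -1, comparisons

data = list(range(1_000_000))
index, comparisons = binary_search(data, 424_242)
# Even with a million elements, comparisons stays around log2(1,000,000) ≈ 20.
print(index, comparisons)
```

Doubling the input size adds only about one extra comparison, which is exactly the "grows slowly as n increases" behavior described in the table.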


3. Understanding Big-O Notation

Big-O notation describes the upper bound (worst-case scenario) of an algorithm’s complexity.

Rules for Determining Big-O:

  1. Keep the dominant term (largest term in the function).

  2. Ignore coefficients (constants don’t affect complexity ranking).

  3. Rank of common complexities (from best to worst performance):

    • O(1) → O(log n) → O(n) → O(n log n) → O(n²) → O(n³) → O(2ⁿ) → O(n!)
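Rules 1 and 2 can be checked numerically. Taking an illustrative step-count function f(n) = 3n² + 5n + 2 (a made-up example, not from the notes), the ratio f(n)/n² settles toward the constant 3 as n grows, which is why only the dominant n² term matters and the coefficient is dropped: f(n) is O(n²).

```python
def f(n):
    # Hypothetical step count of some algorithm: 3n^2 + 5n + 2
    return 3 * n**2 + 5 * n + 2

# As n grows, the lower-order terms 5n + 2 become negligible
# and f(n)/n^2 approaches the constant coefficient 3.
for n in (10, 100, 1_000, 10_000):
    print(n, f(n) / n**2)
```

The printed ratios shrink toward 3 (3.52, 3.05, 3.005, …), confirming that the n² term dominates; Big-O then discards the coefficient 3 as well, leaving O(n²).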