Comprehensive Algorithms & Complexity: Detailed Bullet-Point Study Notes

Core Concepts

  • Problem: A well-defined task with input + expected output.

    • Problem Instance: A problem with specific input values.

    • Algorithm: A step-by-step procedure to solve all instances of a problem.

    • Efficiency: Measured by time complexity (steps) and space complexity (memory).

  • Analysis Types

    • T(n): running time as a function of input size n (the general time-complexity function)

    • W(n): worst case complexity

    • A(n): average case complexity

    • B(n): best case complexity

  • Overhead vs Control

    • Overhead: operations independent of input size

    • Control: operations dependent on input size

  • Asymptotic Notations

    • O: upper bound on growth rate

    • Ω: lower bound on growth rate

    • Θ: tight bound (both upper and lower bound)

  • Complexity Classes (growth order)

    • O(\log n) < O(n) < O(n \log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)

  • Space Complexity

    • Memory required: instructions, constants, variables, calls

  • Auxiliary Space

    • Temporary extra storage (excludes input)

  • In-place Algorithm

    • Uses only O(1) extra memory (e.g., array reversal; see the sketch below)
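A minimal Python sketch of an in-place algorithm: reversing an array with two pointers, using O(1) auxiliary space regardless of n (the name reverse_in_place is illustrative):

```python
def reverse_in_place(a):
    """Reverse list `a` in place using O(1) auxiliary space."""
    lo, hi = 0, len(a) - 1
    while lo < hi:
        a[lo], a[hi] = a[hi], a[lo]  # swap the ends, move inward
        lo += 1
        hi -= 1

xs = [1, 2, 3, 4, 5]
reverse_in_place(xs)
print(xs)  # [5, 4, 3, 2, 1]
```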

  • Key Techniques

    • Recursion: solve problem via smaller subproblems

    • Divide & Conquer (D&C), illustrated by the merge sort sketch after this list:

    • Divide: split the problem into subproblems

    • Conquer: recursively solve the subproblems

    • Combine: merge the sub-results into the final answer

    • Recurrence Solving Methods:

    • Substitution

    • Recursion Tree

    • Master Method
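To make the Divide / Conquer / Combine steps concrete, here is a minimal merge sort sketch in Python (function names are illustrative):

```python
def merge_sort(a):
    """Divide & Conquer: split the list, sort halves recursively, merge."""
    if len(a) <= 1:               # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # Divide + Conquer (left half)
    right = merge_sort(a[mid:])   # Divide + Conquer (right half)
    return merge(left, right)     # Combine

def merge(left, right):
    """Merge two sorted lists in O(len(left) + len(right)) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```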

  • Equations & Recurrence Relations

    • A recurrence relation expresses an algorithm’s running time in terms of its running time on smaller subproblems; common examples follow.

    • Linear search: W(n)=n, B(n)=1 → O(n)

    • Binary search: T(n)=T(n/2)+\Theta(1) → \Theta(\log n)

    • Fibonacci (recursive): T(n) > 2^{n/2}

    • Fibonacci (iterative): O(n)

    • Merge Sort: T(n)=2T(n/2)+O(n) → O(n \log n)

    • Quick Sort (worst): T(n)=T(n-1)+n → O(n^2)

    • Quick Sort (avg.): T(n)=2T(n/2)+n → O(n \log n)

    • Power Problem: T(n)=T(n/2)+O(1) → O(\log n) (see the fast-power sketch below)
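A sketch of the divide-and-conquer power computation behind T(n)=T(n/2)+O(1); fast_pow is an illustrative name and n is assumed to be a non-negative integer:

```python
def fast_pow(x, n):
    """Compute x**n with O(log n) multiplications: T(n) = T(n/2) + O(1)."""
    if n == 0:
        return 1
    half = fast_pow(x, n // 2)   # one recursive call on half the exponent
    return half * half if n % 2 == 0 else half * half * x

print(fast_pow(3, 10))  # 59049
```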

Recurrence Solving: The Master Theorem & Examples

  • Master Theorem (for T(n)=aT(n/b)+f(n)):

    • Case 1: If f(n)=O(n^{\log_b a-\epsilon}) for some \epsilon>0, then T(n)=\Theta(n^{\log_b a})

    • Explanation: the non-recursive work f(n) is polynomially smaller than the total work of the recursive calls, so the subproblem cost n^{\log_b a} dominates.

    • Case 2: If f(n)=\Theta(n^{\log_b a}), then T(n)=\Theta(n^{\log_b a}\log n)

    • Case 3: If f(n)=\Omega(n^{\log_b a + \epsilon}) for some \epsilon>0 and the regularity condition a\,f(n/b) \le c\,f(n) holds for some constant c<1 and all large n, then T(n)=\Theta(f(n))

  • Applications:

    • Example: If a=2, b=2, f(n)=\Theta(n), then n^{\log_b a}=n and Case 2 applies: T(n)=\Theta(n\log n)

    • Example: If a=4, b=2, f(n)=O(n), then n^{\log_b a}=n^{2} and Case 1 applies: T(n)=\Theta(n^{2})
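The case analysis can be mechanized for the common special case f(n)=\Theta(n^k). The helper below is a simplified sketch under that assumption (for a plain polynomial driving function, Case 3's regularity condition holds automatically):

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the Master Theorem.

    Simplified sketch: assumes the driving function is the plain
    polynomial n^k, so Case 3's regularity condition always holds.
    """
    crit = math.log(a, b)                 # critical exponent log_b(a)
    if math.isclose(k, crit):             # tolerate floating-point logs
        return f"Case 2: Theta(n^{crit:g} log n)"
    if k < crit:
        return f"Case 1: Theta(n^{crit:g})"
    return f"Case 3: Theta(n^{k:g})"

print(master_case(2, 2, 1))  # Case 2: Theta(n^1 log n)  (e.g., merge sort)
print(master_case(4, 2, 1))  # Case 1: Theta(n^2)
```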

Data Structures & Properties

  • Each entry lists the structure or approach, its time and space complexity, and brief notes

  • Array ops

    • Time and space depend on the array size n

    • Basic operations (e.g., traversal, copying) scale with input size

  • Iterative factorial

    • Time: O(n)

    • Space: O(1)

    • Rationale: one multiplication per loop iteration (n in total) with a single accumulator, so only constant extra storage

  • Recursive factorial

    • Time: O(n)

    • Space: O(n)

    • Rationale: the call stack grows to depth n, one frame per call (see the side-by-side sketch below)
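A side-by-side sketch of the two factorial variants (illustrative names), showing where the space difference comes from:

```python
def factorial_iter(n):
    """O(n) time, O(1) extra space: a single accumulator."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

def factorial_rec(n):
    """O(n) time, O(n) space: one stack frame per recursive call."""
    return 1 if n <= 1 else n * factorial_rec(n - 1)

print(factorial_iter(5), factorial_rec(5))  # 120 120
```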

  • In-place

    • Time: variable; Space: O(1)

    • Example: reversal in place

  • Merge Sort

    • Time: O(n \log n); Space: O(n) for the auxiliary merge array (see the merge sort sketch under Key Techniques)

  • Quick Sort

    • Time: Average O(n \log n), Worst O(n^2)

    • Space: recursion stack only (average O(\log n), worst O(n)); see the sketch below
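A minimal in-place Quick Sort sketch; the Lomuto partition used here is an assumed choice, since the notes do not fix a partition scheme. The only extra space is the recursion stack:

```python
def quick_sort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place; extra space is the recursion stack only."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[hi]                      # Lomuto partition: last element as pivot
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]          # pivot lands at its final index i
    quick_sort(a, lo, i - 1)
    quick_sort(a, i + 1, hi)

xs = [5, 2, 4, 6, 1, 3]
quick_sort(xs)
print(xs)  # [1, 2, 3, 4, 5, 6]
```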

Algorithms & Complexities (summary table format, bullet-friendly)

  • Searching

    • Linear Search: Time O(n), Space O(1)

    • Binary Search: Time O(\log n), Space O(1) for the iterative version (see the sketch below)
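An iterative binary search sketch (the input list is assumed sorted); O(\log n) time with O(1) extra space:

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the search range each step
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```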

  • Sorting

    • Bubble Sort: Time O(n^2), Space O(1)

    • Merge Sort: Time O(n \log n), Space O(n)

    • Quick Sort: Time (Avg) O(n \log n), Time (Worst) O(n^2); Space (Avg) O(\log n), Space (Worst) O(n)

  • Math

    • Power: x^n

    • Naïve: O(n)

    • Divide & Conquer: O(\log n) (via recursion)

  • Fibonacci

    • Recursive: Exponential time, high stack usage

    • Iterative: O(n) time, O(1) extra space (see the sketch below)
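A sketch of the iterative variant, keeping only the last two values so space stays O(1):

```python
def fib_iter(n):
    """O(n) time, O(1) space: only the last two Fibonacci values are kept."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib_iter(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```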

  • Matrix Multiplication

    • Standard multiplication: T(n)=O(n^3) time, S(n)=O(n^2) space (see the sketch after this list)

    • Strassen’s Algorithm: T(n)=O(n^{\log_2 7}) \approx O(n^{2.81}) time, S(n)=O(n^2) space
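A sketch of the standard triple-loop multiplication for n \times n matrices (Strassen's recursive scheme is omitted for brevity):

```python
def mat_mul(A, B):
    """Standard O(n^3) multiplication of two n x n matrices (lists of lists)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]   # O(n^2) space for the result
    for i in range(n):
        for j in range(n):
            for k in range(n):        # inner dot-product loop
                C[i][j] += A[i][k] * B[k][j]
    return C

print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```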

Problem-Solving Patterns

  • Replace recursion with iteration or caching when recomputation is wasteful (e.g., Fibonacci memoization or iterative DP; see the sketch below).
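A minimal memoization sketch using Python's functools.lru_cache: each value is computed once, so the exponential recursion collapses to O(n):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Naive recursion plus caching: each fib_memo(k) is computed once."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(50))  # 12586269025
```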

  • Use Divide & Conquer when subproblems are independent and easily recombined.

  • Avoid Divide & Conquer if the subproblems are nearly the same size as the original; a recurrence such as T(n)=2T(n-1)+O(1) solves to O(2^n).

  • Prefer in-place solutions for space efficiency when possible.

  • Rely on worst-case analysis in critical applications (e.g., safety-critical systems) to ensure guarantees.

Notes on Notation and Conventions

  • Formulas in these notes are written in LaTeX syntax.

  • Common symbols:

    • O(\cdot) for upper bounds

    • \Omega(\cdot) for lower bounds

    • \Theta(\cdot) for tight bounds

    • \log_b a for logarithms with base b

Quick reference (at a glance)

  • Growth order: O(\log n) < O(n) < O(n \log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)

  • Master Theorem as a decision tree (Case 1, Case 2, Case 3) with corresponding results

  • Major algorithm families and typical complexities:

    • Linear search: O(n) time; Binary search: O(\log n) time

    • Sorting: O(n^2) for elementary sorts vs. O(n \log n) for Merge Sort and average-case Quick Sort

    • Matrix multiplication: O(n^3) (standard) vs. O(n^{\log_2 7}) \approx O(n^{2.81}) (Strassen)

    • Fibonacci: exponential without optimization; linear with iteration/DP
