
ITE131: Data Structures and Algorithms - Asymptotic Notation

Overview

  • Asymptotic Notation provides a way to analyze the efficiency and performance of algorithms in terms of their running time and input size.

Key Concepts

  • Code vs Data Structures

    • Quote by Linus Torvalds: bad programmers worry about the code; good programmers worry about data structures and their relationships.

  • Algorithm Comparison

    • Implementing and empirically benchmarking both algorithms is often expensive and error-prone.

    • Prefer mathematical analysis of algorithms to determine which performs better.

  • Asymptotic Analysis Basics

    • Two key factors to consider are:

    • Running Time (T): Speed of the algorithm.

    • Input Size (n): Quantity of data processed.

Types of Growth Analysis

  • Maximum Value (Worst Case)

    • Example: Finding the maximum in an array of n integers:

  int find_max(int *array, int n) {
      int max = array[0];               /* assumes n >= 1 */
      for (int i = 1; i < n; ++i) {     /* exactly n - 1 comparisons */
          if (array[i] > max) {
              max = array[i];
          }
      }
      return max;
  }
  • The loop always performs $n - 1$ comparisons regardless of the input values, so the running time is $\Theta(n)$, and in particular $O(n)$ in the worst case.

    • Linear Search vs Binary Search

  • Performance comparison shows binary search ($O(\log n)$) is significantly faster than linear search ($O(n)$) for larger sorted arrays.
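The comparison above can be made concrete with a minimal binary search sketch (the function name and signature are illustrative, not from the original notes). Each iteration halves the remaining search range, which is where the $O(\log n)$ bound comes from:

```c
/* Binary search over a sorted array: returns the index of `key`,
   or -1 if it is absent. The search range halves every iteration,
   so at most about log2(n) iterations run: O(log n). */
int binary_search(const int *array, int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (array[mid] == key)
            return mid;
        else if (array[mid] < key)
            lo = mid + 1;               /* discard the left half */
        else
            hi = mid - 1;               /* discard the right half */
    }
    return -1;                          /* key not found */
}
```

Note that binary search requires the array to be sorted, whereas linear search does not.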

Asymptotic Notation Types

  1. Big O Notation (O)

    • Describes an upper bound on time complexity.

    • Example: If $f(n)$ grows at most as fast as $g(n)$, we write $f(n) = O(g(n))$.

  2. Theta Notation (Θ)

    • Describes asymptotically tight bounds: both upper and lower.

  3. Omega Notation (Ω)

    • Describes a lower bound on time complexity.
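The three notations above can be stated precisely using the standard textbook definitions (the constants $c$ and $n_0$ are the usual witnesses):

```latex
\begin{aligned}
f(n) &= O(g(n))      &&\iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\,g(n) \ \text{for all } n \ge n_0, \\
f(n) &= \Omega(g(n)) &&\iff \exists\, c > 0,\ n_0 : f(n) \ge c\,g(n) \ge 0 \ \text{for all } n \ge n_0, \\
f(n) &= \Theta(g(n)) &&\iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n)).
\end{aligned}
```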

Orders of Growth

  • Common Orders:

    • $O(1)$: Constant

    • $O(\log n)$: Logarithmic

    • $O(n)$: Linear

    • $O(n \log n)$: Linearithmic

    • $O(n^2)$: Quadratic

    • $O(n^3)$: Cubic

    • $O(2^n)$: Exponential
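To make some of these orders concrete, the sketch below counts the basic operations executed by three common loop patterns (the function names are illustrative, not part of the original notes):

```c
/* Operation counters for three loop shapes, illustrating how
   the loop structure determines the order of growth. */

long linear_ops(int n) {            /* single loop: O(n) */
    long count = 0;
    for (int i = 0; i < n; ++i)
        ++count;
    return count;
}

long quadratic_ops(int n) {         /* nested loops: O(n^2) */
    long count = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            ++count;
    return count;
}

long logarithmic_ops(int n) {       /* halving loop: O(log n) */
    long count = 0;
    while (n > 1) {
        n /= 2;
        ++count;
    }
    return count;
}
```

For example, doubling $n$ doubles the count for the linear loop, quadruples it for the nested loops, and adds only one step for the halving loop.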

Examples of Sorting Algorithms

  • Selection Sort:

    • Time Complexity: Average and Worst-case $O(n^2)$

    • Repeatedly selects the smallest value from the unsorted elements and moves it into place.
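As a sketch, Selection Sort can be written as follows; the nested loops always run, which is why both the average and worst cases are $O(n^2)$:

```c
/* Selection sort: on each pass, find the smallest remaining element
   and swap it to the front of the unsorted region. The inner loop
   always runs, giving ~n^2/2 comparisons: O(n^2) in every case. */
void selection_sort(int *array, int n) {
    for (int i = 0; i < n - 1; ++i) {
        int min = i;
        for (int j = i + 1; j < n; ++j)
            if (array[j] < array[min])
                min = j;
        int tmp = array[i];             /* swap the minimum into place */
        array[i] = array[min];
        array[min] = tmp;
    }
}
```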

  • Bubble Sort:

    • Time Complexity: Worst-case $O(n^2)$

    • Compares adjacent elements and swaps them when out of order.
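A minimal Bubble Sort sketch: on each pass, the largest unsorted value "bubbles" to the end, so the inner loop can shrink by one each time:

```c
/* Bubble sort: compare adjacent pairs and swap when out of order.
   After pass i, the largest i+1 elements are in their final
   positions. Worst case: O(n^2) comparisons and swaps. */
void bubble_sort(int *array, int n) {
    for (int i = 0; i < n - 1; ++i)
        for (int j = 0; j < n - 1 - i; ++j)
            if (array[j] > array[j + 1]) {
                int tmp = array[j];
                array[j] = array[j + 1];
                array[j + 1] = tmp;
            }
}
```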

  • Insertion Sort:

    • Mimics hand sorting playing cards. Average and worst-case complexity is $O(n^2)$.
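The card-sorting idea translates directly into code: each new element is inserted into its correct position within the already-sorted prefix. A sketch:

```c
/* Insertion sort: shift larger elements of the sorted prefix to the
   right, then drop the new key into its place. Average and worst
   case O(n^2); nearly sorted input runs in close to O(n). */
void insertion_sort(int *array, int n) {
    for (int i = 1; i < n; ++i) {
        int key = array[i];
        int j = i - 1;
        while (j >= 0 && array[j] > key) {
            array[j + 1] = array[j];    /* shift right */
            --j;
        }
        array[j + 1] = key;             /* insert the key */
    }
}
```

The near-linear behavior on almost-sorted input is one reason Insertion Sort can beat asymptotically faster algorithms for small $n$.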

  • Quicksort:

    • A divide-and-conquer algorithm: pick a pivot, partition the array around it, and recursively sort each side.

    • Performance: Average-case $O(n \log n)$, making it faster than Insertion or Bubble Sort for large datasets.
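A sketch of Quicksort using the Lomuto partition scheme with the last element as pivot (one common choice; many pivot strategies exist). Note the worst case is still $O(n^2)$, e.g., on already-sorted input with this pivot rule:

```c
/* Lomuto partition: place the pivot (last element) into its final
   position and return that index; smaller elements end up left of
   it, larger elements right of it. */
static int partition(int *array, int lo, int hi) {
    int pivot = array[hi];
    int i = lo - 1;
    for (int j = lo; j < hi; ++j)
        if (array[j] < pivot) {
            ++i;
            int tmp = array[i]; array[i] = array[j]; array[j] = tmp;
        }
    int tmp = array[i + 1]; array[i + 1] = array[hi]; array[hi] = tmp;
    return i + 1;
}

/* Quicksort: partition, then recurse on each side of the pivot.
   Average case O(n log n); worst case O(n^2). */
void quicksort(int *array, int lo, int hi) {
    if (lo < hi) {
        int p = partition(array, lo, hi);
        quicksort(array, lo, p - 1);
        quicksort(array, p + 1, hi);
    }
}
```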

Performance Analysis of Algorithms

  • Input Size Effects:

    • Algorithm efficiency must be judged across varying input sizes (n), not at a single data point.

    • Example comparisons between Insertion Sort and Quicksort illustrate that, while Insertion Sort is faster for small n, Quicksort becomes superior as n increases.

  • Runtime Comparisons:

    • For small inputs (e.g., $n \le 21$), Bubble Sort may run faster in practice thanks to its low constant factors, despite its higher asymptotic complexity.

Summary of Asymptotic Analysis

  • Asymptotic analysis helps estimate algorithm efficiency without needing empirical execution.

  • Key focus is on how performance changes as n increases, primarily for large datasets.

  • Constant factors and small inputs are disregarded because they do not affect the Big O classification.