Asymptotic Notation provides a way to analyze the efficiency and performance of algorithms in terms of their running time and input size.
Code vs Data Structures
Quote by Linus Torvalds: "Bad programmers worry about the code. Good programmers worry about data structures and their relationships."
Algorithm Comparison
Implementing and benchmarking candidate algorithms head-to-head is often expensive and error-prone.
Prefer mathematical (asymptotic) analysis to determine which performs better.
Asymptotic Analysis Basics
Two key factors to consider are:
Running Time (T): Speed of the algorithm.
Input Size (n): Quantity of data processed.
Maximum Value (Worst Case)
Example: Finding the maximum in an array of n integers:
```c
/* Scan the array once, tracking the largest value seen so far. */
int find_max(int *array, int n) {
    int max = array[0];
    for (int i = 1; i < n; ++i) {
        if (array[i] > max) {
            max = array[i];
        }
    }
    return max;
}
```
The loop examines each of the n elements exactly once, so the time complexity is $O(n)$ in the worst case (indeed, in every case).
Linear Search vs Binary Search
Performance comparison shows binary search ($O(\log n)$) is significantly faster than linear search ($O(n)$) for larger arrays, provided the array is already sorted.
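The comparison can be made concrete with a minimal C sketch of both searches (the function names `linear_search` and `binary_search` are illustrative, not from the original notes):

```c
/* Linear search: O(n) worst case. Returns the index of target, or -1. */
int linear_search(const int *array, int n, int target) {
    for (int i = 0; i < n; ++i) {
        if (array[i] == target) {
            return i;
        }
    }
    return -1;
}

/* Binary search: O(log n) worst case. Requires a sorted array;
   halves the search interval on every comparison. */
int binary_search(const int *array, int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  /* avoids overflow of lo + hi */
        if (array[mid] == target) {
            return mid;
        } else if (array[mid] < target) {
            lo = mid + 1;
        } else {
            hi = mid - 1;
        }
    }
    return -1;
}
```

Because binary search discards half of the remaining elements per step, an array of a million sorted integers needs at most about 20 comparisons instead of up to a million.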
Big O Notation (O)
Describes an upper bound on time complexity.
Example: If f(n) grows at most as fast as g(n), we write f(n) = O(g(n)).
Theta Notation (Θ)
Describes asymptotically tight bounds: both upper and lower.
Omega Notation (Ω)
Describes a lower bound on time complexity.
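As a worked example tying the three notations together (the function and constants below are chosen purely for illustration):

```latex
f(n) = 3n^2 + 2n + 1 \le 6n^2 \text{ for } n \ge 1
    \;\Rightarrow\; f(n) = O(n^2) \\
f(n) = 3n^2 + 2n + 1 \ge 3n^2 \text{ for } n \ge 0
    \;\Rightarrow\; f(n) = \Omega(n^2) \\
\text{Since } f(n) = O(n^2) \text{ and } f(n) = \Omega(n^2),
    \; f(n) = \Theta(n^2)
```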
Common Orders:
$O(1)$: Constant
$O(\log n)$: Logarithmic
$O(n)$: Linear
$O(n \log n)$: Linearithmic
$O(n^2)$: Quadratic
$O(n^3)$: Cubic
$O(2^n)$: Exponential
Selection Sort:
Time Complexity: Average and Worst-case $O(n^2)$
Repeatedly selects the smallest value from the unsorted elements and moves it to the end of the sorted portion.
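A minimal C sketch of selection sort (the name `selection_sort` is an assumption, not from the original notes):

```c
/* Selection sort: on pass i, find the smallest element in the
   unsorted region [i, n) and swap it into position i.
   Always performs ~n^2/2 comparisons: O(n^2) in every case. */
void selection_sort(int *array, int n) {
    for (int i = 0; i < n - 1; ++i) {
        int min_idx = i;
        for (int j = i + 1; j < n; ++j) {
            if (array[j] < array[min_idx]) {
                min_idx = j;
            }
        }
        int tmp = array[i];
        array[i] = array[min_idx];
        array[min_idx] = tmp;
    }
}
```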
Bubble Sort:
Time Complexity: Worst-case $O(n^2)$
Repeatedly compares adjacent elements and swaps them when they are out of order, so the largest unsorted value "bubbles" to the end on each pass.
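A minimal C sketch of bubble sort with an early-exit optimization (the name `bubble_sort` and the `swapped` flag are illustrative choices):

```c
/* Bubble sort: each pass compares adjacent pairs and swaps
   out-of-order ones. After pass i, the last i elements are in
   place. If a pass makes no swaps, the array is sorted and we
   stop early; worst case remains O(n^2). */
void bubble_sort(int *array, int n) {
    for (int i = 0; i < n - 1; ++i) {
        int swapped = 0;
        for (int j = 0; j < n - 1 - i; ++j) {
            if (array[j] > array[j + 1]) {
                int tmp = array[j];
                array[j] = array[j + 1];
                array[j + 1] = tmp;
                swapped = 1;
            }
        }
        if (!swapped) {
            break;  /* no swaps: already sorted */
        }
    }
}
```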
Insertion Sort:
Mimics hand sorting playing cards. Average and worst-case complexity is $O(n^2)$.
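The card-sorting analogy can be sketched in C as follows (the name `insertion_sort` is an assumption, not from the original notes):

```c
/* Insertion sort: like picking up cards one at a time, take the
   next element (the "key") and shift larger already-sorted
   elements right until the key's slot is found. Worst case
   O(n^2), but O(n) on already-sorted input. */
void insertion_sort(int *array, int n) {
    for (int i = 1; i < n; ++i) {
        int key = array[i];
        int j = i - 1;
        while (j >= 0 && array[j] > key) {
            array[j + 1] = array[j];  /* shift larger element right */
            --j;
        }
        array[j + 1] = key;
    }
}
```

The near-linear behavior on almost-sorted input is one reason insertion sort is attractive for small n.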
Quicksort:
A divide-and-conquer algorithm: choose a pivot, partition the array so smaller elements precede it and larger ones follow, then recursively sort each partition.
Performance: Average-case $O(n \log n)$, making it faster than Insertion or Bubble Sort for large datasets.
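A minimal C sketch of quicksort using Lomuto partitioning with the last element as pivot (one common simple choice; the function names are illustrative):

```c
/* Partition array[lo..hi] around pivot array[hi]: afterwards the
   pivot sits at its final index p, with smaller elements to its
   left and larger ones to its right. */
static int partition(int *array, int lo, int hi) {
    int pivot = array[hi];
    int i = lo - 1;
    for (int j = lo; j < hi; ++j) {
        if (array[j] < pivot) {
            ++i;
            int tmp = array[i]; array[i] = array[j]; array[j] = tmp;
        }
    }
    int tmp = array[i + 1]; array[i + 1] = array[hi]; array[hi] = tmp;
    return i + 1;
}

/* Recursively sort array[lo..hi]. Average case O(n log n);
   worst case O(n^2) when the pivot choice is consistently bad
   (e.g., already-sorted input with this pivot rule). */
void quicksort(int *array, int lo, int hi) {
    if (lo < hi) {
        int p = partition(array, lo, hi);
        quicksort(array, lo, p - 1);
        quicksort(array, p + 1, hi);
    }
}
```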
Input Size Effects:
Discuss the efficiency of algorithms for varying sizes of input (n).
Example comparisons between Insertion Sort and Quicksort illustrate that, while Insertion Sort is faster for small n, Quicksort becomes superior as n increases.
Runtime Comparisons:
For small inputs (e.g., $n \le 21$), Bubble Sort may be faster in practice because its low constant overhead outweighs its worse asymptotic complexity.
Asymptotic analysis helps estimate algorithm efficiency without needing empirical execution.
Key focus is on how performance changes as n increases, primarily for large datasets.
Often disregards small inputs because they do not significantly impact the overall Big O classification.