Big O Notation (O)
O is Big O, which describes the upper bound (the maximum running time).
Informally: t(n) ∈ O(g(n)) means g(n) is a function that, as n
increases, is an upper bound of t(n)
•Formally: t(n) ∈ O(g(n)), if g(n) is a function and c × g(n) is an
upper bound on t(n), for some c > 0 and for "large" n
Big Omega Notation (Ω)
Ω is Big Omega, which describes the lower bound (the minimum running time).
A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈ Ω(g(n)), if t(n) is bounded below by some positive constant multiple of g(n) for all large n, i.e., if there exist some positive constant c and some nonnegative integer n₀ such that t(n) ≥ c·g(n) for all n ≥ n₀
Big Theta
Θ is Big Theta, which describes the exact bound (the tightest possible range for the running time).
A function t(n) is said to be in Θ(g(n)), denoted t(n) ∈ Θ(g(n)), if t(n) is bounded both above and below by some positive constant multiples of g(n) for all large n, i.e., if there exist some positive constants c₁ and c₂ and some nonnegative integer n₀ such that c₂·g(n) ≤ t(n) ≤ c₁·g(n) for all n ≥ n₀
Asymptotic Complexity, Lower Bound (Ω(n))
Informally: t(n) ∈ Ω(g(n)) means g(n) is a function that, as n increases, is a lower bound of t(n)
Formally: t(n) ∈ Ω(g(n)), if g(n) is a function and c × g(n) is a lower bound on t(n) for some c > 0 and for "large" n.
Asymptotic Complexity, Upper Bound
Let c × g(n) be a function that is an upper bound on t(n) for some c > 0 and for "large" n.
Asymptotic Complexity, Exact Bounds (Θ(n))
Informally: t(n) ∈ Θ(g(n)) means g(n) is a function that, as n increases, is both an upper and a lower bound of t(n)
Formally: t(n) ∈ Θ(g(n)), if g(n) is a function and c₁ × g(n) is an upper bound on t(n) and c₂ × g(n) is a lower bound on t(n), for some c₁ > 0 and c₂ > 0 and for "large" n
Equivalence Classes
Constant - O(1) : Accessing an array element
•Logarithmic - O(log n) : Binary search
•Linear - O(n) : Linked-list search
•Linearithmic (supralinear) - O(n log n) : Merge sort
•Quadratic - O(n²) : Selection sort
•Exponential - O(2ⁿ) : Generating all subsets
•Factorial - O(n!) : Generating all permutations
Selection Sort
Scan the entire list and find the smallest element
Exchange the smallest element with the first element
Then scan the list starting from the second element
Find the smallest element among the last n-1 elements
Exchange this new smallest element with the second element, and so on
The list is sorted after n-1 passes
Time complexity: O(n²)
Not a stable sorting algorithm
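The passes described above can be sketched in Python (a minimal illustration, not tied to any particular textbook implementation):

```python
def selection_sort(a):
    """Sort list a in place by repeatedly selecting the minimum. O(n^2)."""
    n = len(a)
    for i in range(n - 1):            # n-1 passes
        min_idx = i
        for j in range(i + 1, n):     # scan the unsorted suffix
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]  # swap minimum into position i
    return a
```

Note that the long-distance swap is what breaks stability: an element equal to the minimum can be jumped over.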
Bubble Sort
Compare adjacent elements of the list
Exchange those elements if they are out of order
The list is sorted after n-1 passes
Time complexity: O(n²)
Is a stable sorting algorithm
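A minimal sketch of the adjacent-compare-and-exchange passes:

```python
def bubble_sort(a):
    """Sort list a in place by exchanging out-of-order neighbors. O(n^2)."""
    n = len(a)
    for i in range(n - 1):            # n-1 passes
        for j in range(n - 1 - i):    # the last i elements are already in place
            if a[j] > a[j + 1]:       # strict >: equal keys are never swapped,
                a[j], a[j + 1] = a[j + 1], a[j]  # which keeps the sort stable
    return a
```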
Insertion Sort
We start with the second element of the array as the first element is assumed to be sorted.
Compare the second element with the first element; if the second element is smaller, swap them.
Move to the third element, compare it with the first two elements, and put it in its correct position.
Repeat until the entire array is sorted.
Time complexity: O(n²) in the worst and average case; O(n) when the input is already sorted
Is a stable sorting algorithm
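The steps above can be sketched as follows, shifting larger elements right to open the insertion point:

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the sorted prefix."""
    for i in range(1, len(a)):        # a[0] alone is trivially sorted
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # strict >: equal keys keep their order (stable)
            a[j + 1] = a[j]           # shift right to make room
            j -= 1
        a[j + 1] = key                # drop key into its correct position
    return a
```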
Topological Sort
Topological sorting for Directed Acyclic Graph (DAG) is a linear ordering of vertices such that for every directed edge u-v, vertex u comes before v in the ordering.
Note: Topological Sorting for a graph is not possible if the graph is not a DAG.
Time complexity: O(V + E), where:
V: represents the number of vertices (nodes) in the directed acyclic graph (DAG).
E: represents the number of edges in the DAG.
Not a stable sorting algorithm
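One common way to compute a topological ordering in O(V + E) is Kahn's algorithm (repeatedly removing vertices with in-degree zero); the card above does not name an algorithm, so this is one illustrative choice:

```python
from collections import deque

def topological_sort(vertices, edges):
    """Kahn's algorithm. Returns an ordering, or None if the graph is not a DAG."""
    indegree = {v: 0 for v in vertices}
    adj = {v: [] for v in vertices}
    for u, v in edges:                # edge u -> v: u must come before v
        adj[u].append(v)
        indegree[v] += 1
    queue = deque(v for v in vertices if indegree[v] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for w in adj[u]:              # removing u lowers its successors' in-degree
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)
    # if a cycle exists, some vertices never reach in-degree 0
    return order if len(order) == len(vertices) else None
```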
Binary Search
Divide the search space into two halves by finding the middle index "mid".
Compare the middle element of the search space with the key.
If the key is found at the middle element, the process terminates.
If the key is not found at the middle element, choose which half will be used as the next search space.
-> If the key is smaller than the middle element, then the left side is used for next search.
-> If the key is larger than the middle element, then the right side is used for next search.
This process is continued until the key is found or the total search space is exhausted.
Time complexity: O(log n)
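The halving process above, sketched iteratively over a sorted list:

```python
def binary_search(a, key):
    """Return the index of key in sorted list a, or -1 if absent. O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:                   # search space is a[lo..hi]
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid                # successful search
        elif key < a[mid]:
            hi = mid - 1              # continue in the left half
        else:
            lo = mid + 1              # continue in the right half
    return -1                         # search space exhausted
```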
Merge Sort
Divide: Divide the list or array recursively into two halves until each half contains a single element.
Conquer: Each subarray is sorted individually using the merge sort algorithm.
Merge: The sorted subarrays are merged back together in sorted order. The process continues until all elements from both subarrays have been merged.
Is a stable sorting algorithm
Time complexity: O(n log n)
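The divide/conquer/merge steps can be sketched as:

```python
def merge_sort(a):
    """Return a sorted copy of list a. O(n log n)."""
    if len(a) <= 1:                   # base case: a single element is sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])        # divide and conquer each half
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge in sorted order
        if left[i] <= right[j]:       # <= takes from the left on ties: stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # one of these is empty; append the rest
    merged.extend(right[j:])
    return merged
```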
Quick Sort
Choose a Pivot: Select an element from the array as the pivot. The choice of pivot can vary (e.g., first element, last element, random element, or median).
Partition the Array: Rearrange the array around the pivot. After partitioning, all elements smaller than the pivot will be on its left, and all elements greater than the pivot will be on its right. The pivot is then in its correct position, and we obtain the index of the pivot.
Recursively Call: Recursively apply the same process to the two partitioned sub-arrays (left and right of the pivot).
Base Case: The recursion stops when there is only one element left in the sub-array, as a single element is already sorted.
Time complexity: O(n log n) on average; O(n²) in the worst case
Not a stable sorting algorithm
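A sketch of the pivot/partition/recurse steps, using the last element as the pivot (one of the choices the card mentions; Lomuto-style partition):

```python
def partition(a, lo, hi):
    """Partition a[lo..hi] around a[hi]; return the pivot's final index."""
    pivot = a[hi]                     # last element as pivot
    i = lo - 1                        # boundary of the "smaller than pivot" region
    for j in range(lo, hi):
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]   # grow the smaller-than-pivot region
    a[i + 1], a[hi] = a[hi], a[i + 1]  # place pivot in its correct position
    return i + 1

def quick_sort(a, lo=0, hi=None):
    """Sort list a in place. Average O(n log n), worst case O(n^2)."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:                       # base case: 0 or 1 elements is sorted
        p = partition(a, lo, hi)
        quick_sort(a, lo, p - 1)      # recurse on the left sub-array
        quick_sort(a, p + 1, hi)      # recurse on the right sub-array
    return a
```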
Stable Sorting
A sorting method is stable if it preserves the relative order of duplicate keys in the file
Sequential Search
The algorithm compares elements in a given list with a given search key until:
A match is encountered (successful search)
The list is exhausted without finding a match (unsuccessful search)
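The two outcomes above map directly onto a linear scan:

```python
def sequential_search(a, key):
    """Scan list a left to right for key. O(n)."""
    for i, x in enumerate(a):
        if x == key:
            return i                  # match encountered: successful search
    return -1                         # list exhausted: unsuccessful search
```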
Log definition
a^b = c IS EQUIVALENT TO b = log_a(c) (e.g., 2³ = 8 is equivalent to 3 = log₂ 8)