Vocabulary flashcards covering Big O notation, Landau symbols, and algorithmic time complexity concepts from the lecture notes.
Big O notation
A notation describing the upper bound on the growth rate of a function as input size grows; used in complexity theory, computer science, and mathematics; ignores constants and lower-order terms.
Landau symbol
Another name for Big O notation; named after Edmund Landau; used to describe asymptotic growth.
O(g(x)) formal definition (x → ∞)
f(x) = O(g(x)) means there exist constants C > 0 and N such that |f(x)| ≤ C|g(x)| for all x > N.
O(g(x)) for x → a
f(x) = O(g(x)) means there exist constants d > 0 and C > 0 such that |f(x)| ≤ C|g(x)| for all x with 0 < |x − a| < d.
O(1)
Constant time; the growth is bounded by a fixed constant, independent of input size.
O(log n)
Logarithmic growth; time grows proportionally to the logarithm of n.
O((log n)^c) (polylogarithmic)
Polylogarithmic growth; the log term is raised to a constant power c.
O(n)
Linear time; growth proportional to n.
O(n^2)
Quadratic time; growth proportional to n squared.
O(n^c)
Polynomial time; growth proportional to n raised to the power c (c > 0).
O(c^n)
Exponential time; growth proportional to c^n for some constant base c > 1.
Difference between O(n^c) and O(c^n)
O(n^c) is polynomial growth; O(c^n) is exponential growth and dominates for large n.
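A quick numeric sanity check of the card above (my own illustrative values, not from the lecture notes): n^3 stays ahead of 2^n for small n, but the exponential overtakes it for good around n = 10.

```python
# Compare the polynomial n**3 with the exponential 2**n.
# The polynomial is larger for a while, but the exponential
# dominates once n is large enough.
for n in (2, 5, 9, 10, 15):
    print(n, n ** 3, 2 ** n)
# At n = 9:  729 vs 512   (polynomial still ahead)
# At n = 10: 1000 vs 1024 (exponential overtakes for good)
```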
Superpolynomial
Growth faster than any polynomial n^k for fixed k, but not necessarily exponential.
Subexponential
Growth slower than every exponential c^n (c > 1), e.g., 2^(√n) or n^(log n); a subexponential function can still be superpolynomial.
Dominant term
In a sum of functions, the fastest-growing term dictates the overall order as n grows (assuming a constant number of terms).
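A minimal sketch of the dominant-term idea, using a made-up function f(n) = n² + 100n + 1000 (not from the notes): the ratio f(n)/n² approaches 1 as n grows, so only the n² term matters asymptotically.

```python
# The n**2 term dominates: the lower-order terms 100*n and 1000
# become negligible as n grows, so f(n) is Theta(n**2).
def f(n: int) -> int:
    return n ** 2 + 100 * n + 1000

for n in (10, 1_000, 1_000_000):
    print(n, f(n) / n ** 2)  # ratio shrinks toward 1
```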
Log bases equivalence
Logarithms with different bases differ only by a constant factor (log_b n = log n / log b), so the base can be dropped inside O: O(log_2 n) = O(log_10 n) = O(log n). Likewise log(n^c) = c·log n, so O(log(n^c)) = O(log n).
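The base-equivalence card can be checked numerically, a small sketch with values of my own choosing: the ratio log_2(n) / log_10(n) is the same constant for every n.

```python
import math

# log_b(n) = ln(n) / ln(b): changing the base only multiplies by a
# constant, so all log bases fall into the same O(log n) class.
for n in (16, 1024, 10 ** 6):
    print(n, math.log2(n) / math.log10(n))
# The ratio is always log(10)/log(2) ≈ 3.3219, independent of n.
```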
Worst-case time
The maximum number of operations that might be performed for a given problem size.
Problem size / input size
The size of the problem, typically denoted n, used as the argument of complexity expressions.
Basic operation
A minimal unit of work such as one arithmetic operation, one assignment, one test, one read, or one write.
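To tie "basic operation" and "worst-case time" together, a hypothetical example of my own (not from the lecture notes): linear search, instrumented to count its element comparisons. In the worst case, when the target is absent, every one of the n elements is tested, so the worst-case count is exactly n.

```python
def linear_search(items, target):
    """Return (index or -1, number of element comparisons made)."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1          # one basic operation: a test
        if x == target:
            return i, comparisons
    return -1, comparisons

# Worst case: target absent, so all 5 elements are compared.
print(linear_search([3, 1, 4, 1, 5], 9))   # (-1, 5)
```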
Theta notation
A tight bound: f(n) = Θ(g(n)) means f(n) is both O(g(n)) and Ω(g(n)).
Omega notation
A lower bound: f(n) = Ω(g(n)) means f grows at least as fast as g(n) up to a constant factor for large n.
Little o notation
f(n) = o(g(n)) means f grows much slower than g; formally, lim_{n→∞} f(n)/g(n) = 0 (when g(n) > 0).
Little omega notation
f(n) = ω(g(n)) means f grows faster than g by more than a constant factor; equivalently, g(n) = o(f(n)).
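The little-o and little-omega cards can be illustrated with a simple pair of functions of my own choosing, f(n) = n and g(n) = n²: the ratio f(n)/g(n) tends to 0, so f = o(g) and, symmetrically, g = ω(f).

```python
# f(n)/g(n) = n / n**2 = 1/n shrinks toward 0 as n grows,
# which is the limit condition for f = o(g).
def ratio(n: int) -> float:
    return n / n ** 2

for n in (10, 1_000, 1_000_000):
    print(n, ratio(n))  # 1/n, heading to 0
```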
Time complexity
The relationship between input size and resource usage (time/space) of an algorithm, typically described with O-notation.