Flashcards covering key concepts in algorithm efficiency and computational complexity.
Algorithm Efficiency
A measure of how well an algorithm uses resources such as time and space.
O(n) Complexity
The number of operations grows in direct proportion to the input size n; for example, a loop that performs a fixed amount of work on each of its n iterations.
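Illustration (not part of the original deck): a minimal Python sketch of an O(n) loop.

```python
def sum_list(values):
    # One pass over the input: the loop body runs exactly len(values) times,
    # so the operation count grows linearly with the input size n.
    total = 0
    for v in values:
        total += v
    return total
```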
Worst-Case Efficiency
The maximum number of operations required for inputs of a given size.
Average-Case Efficiency
The average number of operations required for inputs of a given size.
Best-Case Efficiency
The minimum number of operations required for inputs of a given size.
Computational Complexity
A measure of the effort needed to apply an algorithm, often in terms of time and space.
Time Complexity
A measure of the time required to execute an algorithm.
Space Complexity
A measure of the amount of memory required by an algorithm.
Asymptotic Complexity
A simplified way to represent algorithm efficiency, focusing on how performance scales with input size, denoted as Big O.
Big O Notation
Gives an asymptotic upper bound for a function; used to express an upper bound on an algorithm's complexity.
Linear Relationship (Complexity)
If t = cn, then increasing the size of the data increases the execution time by the same factor.
Logarithmic Relationship (Complexity)
If t = log2 n, then doubling the size n increases t by one time unit.
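A quick numeric check of the two relationships above (an illustrative sketch, not from the original cards): with t = cn, doubling n doubles t; with t = log2 n, doubling n adds one.

```python
import math

n = 1024
c = 3
# Linear: t = c * n, so doubling n doubles t (3072 -> 6144).
print(c * n, c * (2 * n))
# Logarithmic: t = log2 n, so doubling n adds one time unit (10.0 -> 11.0).
print(math.log2(n), math.log2(2 * n))
```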
Running Time
The number of steps an algorithm requires to solve a specific problem.
Upper Bound
The maximum number of steps an algorithm requires.
Lower Bound
The minimum number of steps that any algorithm must execute in order to solve the problem.
Quadratic Complexity
O(n^2) - Number of operations proportional to the square of the size of the input data
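A hedged sketch of a typical O(n^2) pattern (illustrative only): comparing every pair of elements.

```python
def has_duplicate(values):
    # Two nested passes over the input: about n * (n - 1) / 2 comparisons,
    # so the work grows with the square of the input size.
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:
                return True
    return False
```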
Cubic Complexity
O(n^3) - Number of operations proportional to the cube of the size of the input data
Exponential Complexity
O(2^n) - The number of operations roughly doubles with each additional element of input, so it grows extremely fast.
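A sketch of O(2^n) growth (illustrative, not from the original cards): enumerating every subset of a collection of size n produces 2^n results.

```python
def all_subsets(items):
    # Each element is either in or out of a subset, giving 2 ** len(items)
    # subsets in total, so the work grows exponentially with n.
    subsets = [[]]
    for item in items:
        subsets = subsets + [s + [item] for s in subsets]
    return subsets

# len(all_subsets([1, 2, 3])) == 8 == 2 ** 3
```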
Constant Complexity
O(1) - Number of operations does not depend on the size of the problem.
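A minimal illustration of O(1) (added here, not part of the deck): indexing into a list costs the same regardless of the list's length.

```python
def first_and_last(values):
    # Two index operations: the cost does not depend on len(values).
    return values[0], values[-1]
```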
Calculating Average Case Complexity
Exact average-case analysis is often difficult; calculating approximations in the form of big-O, big-Ω and big-Θ simplifies the task.
Big-Ω
Represents an algorithm's lower bound.
Big-Θ
Means a tight bound: the function is bounded both above (big-O) and below (big-Ω) asymptotically.
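A small worked example of a tight bound (added for clarity, not from the original cards):

```latex
f(n) = 3n^2 + 5n
% choose c_1 = 3,\ c_2 = 4,\ n_0 = 5; then for every n \ge n_0:
c_1 n^2 \;\le\; 3n^2 + 5n \;\le\; c_2 n^2
% hence f(n) = \Theta(n^2): f is bounded above (big-O) and below (big-\Omega) by n^2.
```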
Logarithmic Loops
f(n) = log2 n, where 2 is the factor by which the loop control variable is multiplied or divided on each iteration.
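A sketch of a logarithmic loop (illustrative only) in which the control variable is multiplied by 2 each pass:

```python
def log_loop(n):
    # The control variable doubles each iteration, so the loop body runs
    # about log2(n) times before i reaches n.
    count = 0
    i = 1
    while i < n:
        i *= 2
        count += 1
    return count  # ~ log2(n)
```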
Linear Loops
f(n) = n
Nested Loops
Total iterations = outer loop iterations × inner loop iterations.
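A hedged sketch demonstrating the multiplication rule for nested loops:

```python
def nested_iterations(outer, inner):
    # The inner body executes outer * inner times in total.
    count = 0
    for _ in range(outer):
        for _ in range(inner):
            count += 1
    return count  # == outer * inner
```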
Linear Logarithmic
f(n) = n log2 n
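A sketch of an n log2 n pattern (illustrative only): a linear outer loop whose inner loop doubles its control variable.

```python
def linearithmic_steps(n):
    # Outer loop runs n times; inner loop runs about log2(n) times,
    # giving roughly n * log2(n) steps in total.
    count = 0
    for _ in range(n):
        j = 1
        while j < n:
            j *= 2
            count += 1
    return count  # ~ n * log2(n)
```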
Quadratic
f(n) = n^2