The Efficiency of Algorithms

Introduction to Algorithm Efficiency
  • Efficient code remains vital due to time and space constraints, despite faster computers.

  • Efficiency is measured by time complexity and space complexity.

  • The study of these constraints is called analysis of algorithms.

Measuring Algorithm Efficiency
  • Focus on counting basic operations, which are the most significant contributors to an algorithm's total execution time.

  • Example (Sum 1 + 2 + … + n):

    • Algorithm A (loop): Requires n additions.

    • Algorithm B (nested loop): Requires (n^2 + n)/2 additions.

    • Algorithm C (formula): Requires 1 multiplication and 1 division.

  • Algorithm execution time can depend on both data set size and the nature of the data, leading to best, worst, and average case analyses.
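The three algorithms above can be sketched in Python; the implementations are assumed from the operation counts given (a single loop, a nested loop, and the closed-form formula):

```python
def sum_a(n):
    """Algorithm A: a single loop -- performs n additions."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_b(n):
    """Algorithm B: a nested loop -- performs (n^2 + n)/2 additions."""
    total = 0
    for i in range(1, n + 1):
        for _ in range(i):   # add 1, i times, instead of adding i once
            total += 1
    return total

def sum_c(n):
    """Algorithm C: closed-form formula -- 1 multiplication and 1 division."""
    return n * (n + 1) // 2
```

All three return the same result; they differ only in how many basic operations they perform as n grows.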

Growth-Rate Functions
  • Describe how the number of basic operations scales with increasing input size n.

  • Common growth rates include log(log n), log n, n, n log n, n^2, n^3, 2^n, and n!. These illustrate varying rates of increase in operations as n grows.
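A quick way to see these varying rates of increase is to tabulate each growth function for a few small values of n (a minimal sketch; the `rates` table is a hypothetical helper, not part of the original notes):

```python
import math

# Common growth-rate functions, from slowest- to fastest-growing.
rates = {
    "log(log n)": lambda n: math.log2(math.log2(n)),
    "log n":      lambda n: math.log2(n),
    "n":          lambda n: n,
    "n log n":    lambda n: n * math.log2(n),
    "n^2":        lambda n: n ** 2,
    "n^3":        lambda n: n ** 3,
    "2^n":        lambda n: 2 ** n,
    "n!":         lambda n: math.factorial(n),
}

for n in (4, 8, 16):
    row = ", ".join(f"{name}={fn(n):.0f}" for name, fn in rates.items())
    print(f"n={n}: {row}")
```

Even at n = 16, the gap between log n (4) and n! (about 2 × 10^13) is already enormous.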

Big Oh Notation
  • A function f(n) is O(g(n)) (order at most g(n)) if a positive real number c and a positive integer N exist such that f(n) ≤ c × g(n) for all n ≥ N.

  • This means c × g(n) acts as an upper bound on f(n) for sufficiently large values of n.

  • Identities for Big Oh Notation:

    • O(k × g(n)) = O(g(n)) for any positive constant k

    • O(g1(n)) + O(g2(n)) = O(g1(n) + g2(n))

    • O(g1(n)) × O(g2(n)) = O(g1(n) × g2(n))

    • O(g1(n) + … + gm(n)) = O(max(g1(n), …, gm(n)))
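The definition can be checked numerically for a concrete case. The sketch below (an illustration, not a proof) verifies that f(n) = (n^2 + n)/2, the operation count of Algorithm B, is O(n^2), using the witnesses c = 1 and N = 1:

```python
# f(n) = (n^2 + n)/2 is O(n^2): f(n) <= c * g(n) for all n >= N,
# with c = 1 and N = 1, since n^2 + n <= 2n^2 whenever n >= 1.

def f(n):
    return (n * n + n) / 2

def g(n):
    return n * n

c, N = 1, 1
assert all(f(n) <= c * g(n) for n in range(N, 10_000))
```

The assertion only samples finitely many n, of course; the algebraic argument (n ≤ n^2 for n ≥ 1) is what makes the bound hold for all n ≥ N.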

Impact of Efficiency
  • Doubling the problem size (n) significantly increases execution time for less efficient algorithms:

    • O(n) doubles.

    • O(n^2) quadruples.

    • O(n^3) multiplies by 8.

    • O(2^n) squares its time requirement.

  • For large inputs (e.g., 10^6 items), even polynomial algorithms (n^2, n^3) can require impractically long times (e.g., days to thousands of years), while logarithmic (log n) and linear (n) algorithms remain efficient.
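The doubling behavior listed above can be confirmed directly by computing the ratio of work at 2n to work at n for each growth rate (a small sketch; the `growth` table is a hypothetical helper):

```python
# Ratio of operations at input size 2n versus n, for each growth rate.
growth = {
    "n":   lambda n: n,
    "n^2": lambda n: n ** 2,
    "n^3": lambda n: n ** 3,
    "2^n": lambda n: 2 ** n,
}

n = 10
for name, fn in growth.items():
    factor = fn(2 * n) / fn(n)
    print(f"{name}: doubling n multiplies the work by {factor:g}")
```

For O(n) the factor is 2, for O(n^2) it is 4, for O(n^3) it is 8, while for O(2^n) the factor is 2^n itself (here 1024), which is why the time requirement squares.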