
Analysis of Algorithms

Chapter Scope

  • Efficiency Goals: why algorithm efficiency matters.
  • Algorithm Analysis: an introduction to the concept of algorithm analysis.
  • Big-Oh Notation: expressing algorithm complexity with Big-Oh notation.
  • Asymptotic Complexity: the concept of asymptotic complexity.
  • Comparing Growth Functions: how various growth functions compare as n increases.

Algorithm Efficiency

  • The efficiency of an algorithm is generally expressed in terms of CPU time.
  • The analysis of algorithms categorizes algorithms by efficiency.
  • Dishwashing Example: Washing a dish takes 30 seconds, drying takes 30 seconds.
    • Washing and drying n dishes takes n minutes. This implies a linear relationship between the number of dishes and the time required.
  • A less efficient approach: after washing each dish, redry all of the dishes washed so far (compare the sketch below).
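
  • The difference can be made concrete with a small simulation (a minimal sketch with hypothetical names, assuming 30 seconds per wash and per dry as above) that counts the total seconds each strategy spends:

    public class Dishwashing {
        static final int WASH = 30, DRY = 30;   // seconds per dish

        // Efficient strategy: wash and dry each dish once -> 60n seconds (n minutes)
        static int washAndDry(int n) {
            int seconds = 0;
            for (int dish = 1; dish <= n; dish++)
                seconds += WASH + DRY;
            return seconds;
        }

        // Less efficient strategy: after washing dish i, redry all i dishes
        // washed so far -> 30n + 30(1 + 2 + ... + n) = 15n^2 + 45n seconds
        static int washAndRedry(int n) {
            int seconds = 0;
            for (int dish = 1; dish <= n; dish++) {
                seconds += WASH;          // wash the current dish
                seconds += dish * DRY;    // redry every dish washed so far
            }
            return seconds;
        }

        public static void main(String[] args) {
            for (int n : new int[] {1, 10, 30})
                System.out.println(n + " dishes: " + washAndDry(n)
                        + " s vs " + washAndRedry(n) + " s");
        }
    }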

Problem Size

  • For algorithm analysis, define the problem size.
  • Dishwashing: Size n = number of dishes.
  • Search Algorithm: Size = size of the search pool.
  • Sorting Algorithm: Size = number of elements to be sorted.
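
  • For example (a minimal sketch, not from the text), in a linear search the problem size is the length of the array being searched, and in the worst case the loop body runs once per element:

    // Problem size n = data.length, the size of the search pool.
    static int linearSearch(int[] data, int target) {
        for (int index = 0; index < data.length; index++)
            if (data[index] == target)
                return index;   // found: report its position
        return -1;              // searched the whole pool without success
    }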

Growth Functions

  • An algorithm can be optimized for time complexity (CPU time) or for space complexity (memory used).
  • CPU time is generally the focus.
  • A growth function shows the relationship between the problem size n and the value we want to optimize (here, time).
  • For example, the growth function of the first dishwashing algorithm is t(n) = 60n seconds, i.e., n minutes.

Asymptotic Complexity

  • The growth function of the second (redrying) dishwashing algorithm is t(n) = 15n^2 + 45n seconds (derived below).
  • Knowing the exact growth function is usually unnecessary.
  • Asymptotic complexity describes the general behavior of the algorithm as n increases.
  • It is based on the dominant term of the growth function, the term that increases most quickly as n increases.
  • For the second dishwashing algorithm, the dominant term is n^2.
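
  • Derivation of the growth function (assuming 30 seconds per wash and per dry, as in the example): washing all n dishes takes 30n seconds, and redrying after the i-th wash takes 30i seconds, so

    t(n) = 30n + 30(1 + 2 + ... + n)
         = 30n + 30 * n(n + 1)/2
         = 15n^2 + 45n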

Big-Oh Notation

  • Coefficients and lower order terms become less relevant as n increases.
  • The second dishwashing algorithm is therefore of order n^2, written O(n^2).
  • This is Big-Oh notation.
  • Common Big-Oh categories include O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), and O(n!).
  • Algorithms in the same category have comparable efficiency, but they need not have equal growth functions or behave identically for every n: for example, 100n is O(n) and n^2 is O(n^2), yet n^2 is the smaller of the two whenever n < 100.

Comparing Growth Functions

  • Faster processors do not diminish the importance of efficient algorithms.
  • A faster CPU multiplies the growth function by a constant; it does not change the dominant term. With a processor ten times faster, an O(n^2) algorithm can handle only about three times (sqrt(10)) the problem size in the same amount of time.
  • As n increases, the growth functions diverge dramatically, and for large values of n the differences become even more pronounced (see the sketch below).
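
  • A small program (purely illustrative; exponential and factorial functions grow too quickly to tabulate this way) makes the divergence visible by printing a few growth functions side by side:

    public class GrowthFunctions {
        public static void main(String[] args) {
            System.out.printf("%8s %10s %12s %14s%n", "n", "log n", "n log n", "n^2");
            for (int n = 10; n <= 100000; n *= 10) {
                double log = Math.log(n) / Math.log(2);   // log base 2
                System.out.printf("%8d %10.1f %12.0f %14d%n",
                        n, log, n * log, (long) n * n);
            }
        }
    }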

Analyzing Loop Execution

  • Determine the order of the loop body, then multiply by the number of executions.

  • Example:

    for (int count = 0; count < n; count++)
        // some sequence of O(1) steps
    
    • n loop executions * O(1) operations = O(n) efficiency.

Analyzing Loop Execution

  • Consider the following loop:

    count = 1;
    while (count < n) {
        count *= 2;
        // some sequence of O(1) steps
    }
    
    • Because count doubles on each iteration, it reaches n after about log_2 n iterations, so the loop executes log_2 n times and the complexity is O(log n).

Analyzing Nested Loops

  • For nested loops, multiply the complexity of the outer loop by the complexity of the inner loop.

  • Example:

    for (int count = 0; count < n; count++)
        for (int count2 = 0; count2 < n; count2++) {
            // some sequence of O(1) steps
        }
    
    • Both inner and outer loops have complexity O(n).
    • The overall efficiency is O(n^2).

Analyzing Method Calls

  • The body of a loop may contain a method call.
  • To determine the order of the loop body, the order of the method must be taken into account.
  • The overhead of the method call itself is generally ignored (see the sketch below).
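
  • For example (a minimal sketch with hypothetical method names), if the body of a loop calls a method that is itself O(n), the loop contributes a factor of n and the result is O(n^2):

    public class MethodCallAnalysis {
        // printAll is O(n): it visits every element of the array once.
        static void printAll(int[] data) {
            for (int value : data)
                System.out.println(value);
        }

        // The loop body calls an O(n) method n times,
        // so the overall complexity is n * O(n) = O(n^2).
        static void printAllNTimes(int[] data) {
            for (int count = 0; count < data.length; count++)
                printAll(data);
        }
    }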