Lecture 2 - Analysis of Algorithms


1. Comparing Algorithms

Algorithms are compared in two main ways:

  1. Experimental Analysis:

    • Implement the algorithm and measure its performance on various input sizes.

    • Use tools like System.currentTimeMillis() for timing.

    • Challenges:

      • Environment-dependent (hardware/software).

      • Hard to scale testing for all inputs.

  2. Theoretical Analysis:

    • Analyze pseudo-code.

    • Running time is characterized as a function of input size n.

    • Independent of specific environments.
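The experimental approach above can be sketched in a few lines. This is a minimal illustration (the class name and the summing workload are invented for the demo); it uses System.currentTimeMillis(), as mentioned earlier, to time the same operation at several input sizes:

```java
public class TimingDemo {
    // Hypothetical workload: sum the integers 0 .. n-1.
    static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Measure the same workload on growing input sizes.
        for (int n : new int[]{1_000, 100_000, 10_000_000}) {
            long start = System.currentTimeMillis();
            work(n);
            long elapsed = System.currentTimeMillis() - start;
            System.out.println("n = " + n + ": " + elapsed + " ms");
        }
    }
}
```

Note that the reported times depend on the machine and JVM warm-up, which is exactly the environment-dependence problem the lecture points out.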


2. Primitive Operations

Primitive operations include:

  • Assigning a value.

  • Accessing array elements.

  • Arithmetic operations.

  • Comparing two values.

Example: Finding the Maximum in an Array

Java Code:
public static int arrayMax(int[] A) {
    int currentMax = A[0];                 // Initialize maximum with the first element
    for (int i = 1; i < A.length; i++) {   // Loop through the array
        if (A[i] > currentMax) {           // Check if the current element is larger
            currentMax = A[i];             // Update the maximum
        }
    }
    return currentMax;                     // Return the maximum value
}
  • Operation Count: 4n − 1 primitive operations.

  • Time Complexity: O(n).


3. Growth Rates

Growth rates describe how the running time of an algorithm increases with input size n.

Common Growth Rates:

Type          Example T(n)    Growth for n = 10, 100, 1000

Constant      1               1, 1, 1

Logarithmic   log n           3, 6, 9

Linear        n               10, 100, 1000

Quadratic     n^2             100, 10,000, 1,000,000

Exponential   2^n             1024, massive, astronomical
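The table values can be reproduced with a short sketch (the class name is illustrative). The logarithmic row corresponds to the floor of log base 2, and 2^n is only printable for small n:

```java
public class GrowthRates {
    // Floor of log base 2 (matches the 3, 6, 9 row in the table).
    static long log2(long n) {
        return (long) (Math.log(n) / Math.log(2));
    }

    public static void main(String[] args) {
        for (long n : new long[]{10, 100, 1000}) {
            // Print one column of the table per growth rate.
            System.out.printf("n=%-5d  log n=%-3d  n=%-5d  n^2=%-8d%n",
                    n, log2(n), n, n * n);
        }
        System.out.println("2^10 = " + (1L << 10)); // exponential growth at n = 10
    }
}
```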


4. Asymptotic Notations

These notations describe bounds on an algorithm's running time:

  1. Big-O (O): Upper bound for worst-case growth.

  2. Omega (Ω): Lower bound for best-case growth.

  3. Theta (Θ): Tight bound for exact growth.

Big-O Rules:
  • Drop constants and lower-order terms.

    • 3n + 5 becomes O(n).

    • 2n + 1000 becomes O(n).
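These rules follow directly from the definition of Big-O: f(n) is O(g(n)) if there exist constants c > 0 and n₀ ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n₀. A quick check for the first example:

```latex
3n + 5 \;\le\; 3n + n \;=\; 4n \quad \text{for all } n \ge 5,
```

so taking c = 4 and n₀ = 5 shows that 3n + 5 is O(n). The same argument with c = 3 and n₀ = 1000 handles 2n + 1000.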


5. Algorithm Analysis Examples

Selection Sort

Selection sort repeatedly finds the smallest remaining element and swaps it into its correct position in the sorted prefix.

Java Code:

public static void selectionSort(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n - 1; i++) {
        int indexOfCurrentMin = i;                   // Assume current index has the smallest value
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[indexOfCurrentMin]) {   // Check for a smaller value
                indexOfCurrentMin = j;               // Update the index of the smallest value
            }
        }
        // Swap the smallest element with the first unsorted element
        int currentMin = arr[indexOfCurrentMin];
        arr[indexOfCurrentMin] = arr[i];
        arr[i] = currentMin;
    }
}

  • Time Complexity: O(n^2) due to nested loops.
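A quick usage check (the method is repeated here, in slightly compressed form, so the snippet compiles on its own; the input array is invented for the demo):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Same selection sort as above, repeated so this snippet is self-contained.
    public static void selectionSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j] < arr[min]) min = j;     // Track the smallest remaining value
            }
            int tmp = arr[min];                     // Swap it into position i
            arr[min] = arr[i];
            arr[i] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] data = {29, 10, 14, 37, 13};
        selectionSort(data);
        System.out.println(Arrays.toString(data)); // [10, 13, 14, 29, 37]
    }
}
```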


Computing Prefix Averages
  1. Naive Approach:

    • Nested loops calculate the sum of prefixes.

Java Code:

public static double[] prefixAverages1(int[] X) {
    int n = X.length;
    double[] A = new double[n];
    for (int i = 0; i < n; i++) {         // Outer loop for each prefix
        int sum = 0;
        for (int j = 0; j <= i; j++) {    // Inner loop to calculate the sum
            sum += X[j];
        }
        A[i] = (double) sum / (i + 1);    // Compute the average
    }
    return A;
}

  • Time Complexity: O(n^2).

  2. Optimized Approach:

    • Use a running sum to avoid redundant calculations.

Java Code:

public static double[] prefixAverages2(int[] X) {
    int n = X.length;
    double[] A = new double[n];
    int sum = 0;
    for (int i = 0; i < n; i++) {         // Single loop for prefix computation
        sum += X[i];                      // Maintain a running sum
        A[i] = (double) sum / (i + 1);    // Compute the average
    }
    return A;
}

  • Time Complexity: O(n).
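To see the running-sum idea in action, here is a self-contained sketch (the method is repeated from above; the sample input is invented). For {4, 2, 6} the prefix sums are 4, 6, 12, so the averages are 4.0, 3.0, 4.0:

```java
import java.util.Arrays;

public class PrefixAveragesDemo {
    // Same O(n) method as above, repeated so this snippet runs on its own.
    public static double[] prefixAverages2(int[] X) {
        int n = X.length;
        double[] A = new double[n];
        int sum = 0;
        for (int i = 0; i < n; i++) {
            sum += X[i];                      // Running sum avoids re-adding the prefix
            A[i] = (double) sum / (i + 1);    // Average of the first i+1 elements
        }
        return A;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(prefixAverages2(new int[]{4, 2, 6})));
        // [4.0, 3.0, 4.0]
    }
}
```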


6. Algorithm Complexity vs. Problem Complexity

  1. Algorithm Complexity: Measures the running time of a specific algorithm.

    • Example: Sorting with selection sort has O(n^2) complexity.

  2. Problem Complexity: Determines the best possible complexity for solving a problem.

    • Example: Comparison-based sorting requires Ω(n log n) time in the worst case; no comparison sort can beat this bound.


Key Tips for Studying Algorithm Analysis

  • Focus on understanding growth rates and Big-O notation.

  • Practice counting primitive operations in code.

  • Compare algorithms with theoretical and experimental methods.

  • Rewrite algorithms to optimize performance where possible.

To determine the time complexity of Java code, follow these steps:

  1. Identify Input Size: Understand what the input size (often denoted as 'n') is within the context of the problem.

  2. Analyze Loop Structures: Examine any loops in the code:

    • A single for or while loop typically contributes O(n) if it iterates n times.

    • Nested loops contribute multiplicatively; for example, two nested loops each iterating over n would contribute O(n^2).

  3. Count Primitive Operations: Keep track of key operations such as:

    • Assignments (e.g., int x = 5;)

    • Array access (e.g., array[i])

    • Arithmetic operations (e.g., x + y)

    • Comparisons (e.g., if (x < y)), each of which takes constant time.

  4. Examine Conditionals: Check how many times certain blocks of code are executed, as conditional statements can affect the overall complexity based on input size.

  5. Sum up All Contributions: Compile the contributions from loops, conditionals, and primitive operations to get a total count of operations.

  6. Express in Big-O Notation: Finally, represent the overall time complexity using Big-O notation which provides an upper limit on performance. For instance, if the combination of outer and inner loops results in operations growing quadratically, denote this as O(n^2).

Example Java Code Analysis

public static int arraySum(int[] arr) {
    int sum = 0;                             // O(1) - constant time operation
    for (int i = 0; i < arr.length; i++) {   // O(n)
        sum += arr[i];                       // O(1)
    }
    return sum;                              // O(1)
}
  • In this example, the loop runs 'n' times, leading to a time complexity of O(n). The overall contribution is the sum of O(1) operations for initializing and returning a value plus O(n) from the loop, resulting in a total time complexity of O(n).