Algorithmic Design Thinking Notes

  • Algorithm: A set of clear instructions to solve a problem in a finite amount of time.

    • Must be clear and precise.

    • Should work for a well-defined set of inputs.

    • Can be shown in different ways.

    • The same problem can have many different algorithms that solve it.

Why Analyze an Algorithm?
  • To see if it's good for different uses and compare it to others.

  • Helps us understand and make it better (shorter, simpler).

Computational Complexity
  • How we classify algorithms by how well they work and how hard problems are to solve.

  • Usually focuses on the worst-case performance.

Analysis of Algorithms: Steps
  • Write out the algorithm.

  • See how long each step takes.

  • Find out how often each step is done.

  • Make a simple model for the input.

  • Analyze how often steps are done with the model.

  • Add up the total time by multiplying step time by frequency.
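The steps above can be sketched on a toy function (a hypothetical example): count the cost of each step, multiply by how often it runs, and add up the total.

```python
def array_sum(a):
    """Sum the elements of a list, annotated with an operation count."""
    total = 0            # 1 assignment, done once
    for x in a:          # loop runs n = len(a) times
        total += x       # 1 addition + 1 assignment per iteration
    return total         # 1 return, done once

# Total cost for input of size n: roughly 2n + 2 basic operations,
# which grows linearly with n, i.e. O(n).
```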

Basic Analysis Operations
  • Time Complexity: How long it takes to run as input grows.

    • Use notations like O (Big O), Ω (Omega), and Θ (Theta).

  • Space Complexity: How much memory it needs as input grows.

    • Use Big O notation.

  • Correctness Analysis: Making sure it always gives the right answer.

    • Use logic and math to prove it.

  • Worst-case: The most time or space it could need.

  • Best-case: The least time or space it could need.

  • Average-case: What to expect on average.

  • Asymptotic Analysis: How it acts when the input is really big.

    • Helps compare algorithms.

  • Empirical Analysis: Testing it out with real data.

Order of Growth
  • How the time or space an algorithm needs grows as the input gets very large.

  • Big O notation tells us the upper limit.

Asymptotic Notations
  • Big O Notation (O): An upper bound on running time; most often used to describe the worst case.

    • Common growth orders:

    • O(1): Constant time.

    • O(log n): Logarithmic time.

    • O(n): Linear time.

    • O(n log n): Log-linear time.

    • O(n^2): Quadratic time.

    • O(2^n): Exponential time.

    • O(n!): Factorial time.

  • Omega Notation (Ω): A lower bound on running time; most often used to describe the best case.

  • Theta Notation (Θ): Both an upper and a lower bound (a tight bound on growth); often loosely associated with the average case.

Space and Time Complexity
  • Space: How much space an algorithm needs.

  • Time: How long an algorithm takes to finish.

  • We want algorithms that take less time and memory.

Complexity of Algorithm
  • How many steps an algorithm needs to solve a problem.

  • We look at the order of operations, not exact counts.

  • O(f) (Big O) shows the complexity.

Typical Complexities of an Algorithm
  • Constant: O(1). Stays the same no matter the input size.

  • Logarithmic: O(log(N)). Grows slowly.

  • Linear: O(N). Grows directly with the input size.

  • Quadratic: O(n^2). Grows faster.

  • Cubic: O(n^3). Grows even faster.

  • Exponential: O(2^n). Grows extremely fast; factorial O(N!) grows even faster. (Note: O(n^k) with constant k is polynomial, not exponential.)
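The first few growth orders can be illustrated with short sketches (hypothetical examples):

```python
def first_element(a):
    """O(1): one operation, regardless of how long the list is."""
    return a[0]

def contains(a, target):
    """O(N): in the worst case, every element is checked once."""
    return any(x == target for x in a)

def has_duplicate(a):
    """O(n^2): every pair of positions is compared in the worst case."""
    n = len(a)
    return any(a[i] == a[j] for i in range(n) for j in range(i + 1, n))
```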

Time and Space Complexity Examples
  • Linear Search: Time - O(n), Space - O(1).

  • Binary Search: Time - O(log n), Space - O(1) (iterative version).

  • Recursive Factorial: Space - O(n) due to function calls.
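The first two examples can be sketched as follows; both are standard algorithms, shown here as minimal Python versions:

```python
def linear_search(a, target):
    """O(n) time, O(1) space: check each element in turn."""
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1  # not found

def binary_search(a, target):
    """O(log n) time on a sorted list: halve the search range each step."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1   # target must be in the right half
        else:
            hi = mid - 1   # target must be in the left half
    return -1  # not found
```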

Brute Force
  • Trying every possible solution until you find the right one.

  • Good for small problems.

  • Bad for big problems because it takes too long.

Pros:
  • Always finds the solution.

  • Simple to use.

Cons:
  • Takes a lot of time.

  • Not creative.

Uses:
  • String matching.

  • Solving the traveling salesman problem.
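A minimal sketch of brute-force string matching: try the pattern at every possible starting position until one matches.

```python
def brute_force_match(text, pattern):
    """Return the first index where pattern occurs in text, or -1.
    Tries every starting position: O(n * m) time in the worst case,
    where n = len(text) and m = len(pattern)."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):
        if text[i:i + m] == pattern:  # compare pattern at position i
            return i
    return -1
```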

Divide and Conquer
  • Breaking a big problem into smaller parts, solving them, and putting the solutions back together.

Steps:
  1. Divide problem into smaller problems.

  2. Solve each small problem.

  3. Combine the solutions.

Uses:
  • Binary Search.

  • Quicksort.

  • Merge Sort.

Advantages:
  • Solves hard problems.

  • Fast.

  • Uses memory well.

Disadvantages:
  • Needs good memory management.

  • Can crash if it uses too much memory.

Merge Sort
  • A sorting method that divides, sorts, and merges.

Steps:
  1. Divide the array in half until you can’t divide anymore.

  2. Sort each half.

  3. Merge the sorted halves.

Efficiency:
  • Time: O(n log n).

  • Space: O(n) for the temporary arrays, plus O(log n) recursion depth.
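The three steps above map directly onto a minimal merge sort sketch:

```python
def merge_sort(a):
    """Divide, sort each half recursively, then merge: O(n log n) time."""
    if len(a) <= 1:              # base case: can't divide anymore
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])   # divide and sort the left half
    right = merge_sort(a[mid:])  # divide and sort the right half
    # Combine: merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])      # append any leftovers
    merged.extend(right[j:])
    return merged
```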

Greedy Algorithm
  • Picking the best choice at each step.

Steps:
  1. Find the best small problem.

  2. Decide what the solution looks like.

  3. Keep picking the best option.

Limitations:
  • Doesn’t always find the best solution.

  • Can fail on problems with negative values (e.g., Dijkstra’s shortest-path algorithm does not work with negative edge weights).

Uses:
  • Finding the shortest paths.

  • Compressing files.

  • Solving scheduling problems.
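A classic small example is greedy coin change: at each step, take the largest coin that still fits. This is optimal for canonical coin systems like US coins, but it illustrates the limitation too, since for other coin sets the greedy choice can miss the best answer.

```python
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Make change by always picking the largest coin that fits.
    coins is assumed to be sorted in descending order."""
    result = []
    for coin in coins:
        while amount >= coin:   # greedy step: take this coin while possible
            amount -= coin
            result.append(coin)
    return result
```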

Backtracking Algorithm
  • Trying solutions until you find one that works.

  • Uses recursion.

When to use:
  • When you don’t have enough information upfront.

  • When each choice leads to more choices.

How it works:
  • Tries a path, and if it doesn’t work, undoes the last choice (backtracks) and tries a different one.
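A minimal backtracking sketch, using subset-sum as a hypothetical example: at each element we branch on two choices (include it or skip it), and a branch that reaches a dead end simply returns, backtracking to try the other choice.

```python
def subset_sum(nums, target, i=0):
    """Return True if some subset of nums[i:] sums to target."""
    if target == 0:
        return True          # found a valid subset
    if i == len(nums):
        return False         # dead end: backtrack
    # Choice 1: include nums[i]; Choice 2: skip it.
    return (subset_sum(nums, target - nums[i], i + 1)
            or subset_sum(nums, target, i + 1))
```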