A recursive procedure p(x), where the input x has size n, consists of two main components: the Base Case and the Recursive Case. The Base Case states that if n is less than some constant k, the procedure solves the problem directly, without further recursion. In the Recursive Case, the input x is split into subproblems, each of size n/b; procedure p is called recursively on each of these subproblems, and their results are then combined into a solution for x.
The core subroutine of Merge Sort merges two sorted arrays. For example, given the sorted inputs {1,2,3,4,5,6,7,9} and {2,3,7,9}, the merge step repeatedly takes the smaller of the two front elements, combining the arrays into a single sorted array.
The time complexity for Merge Sort is O(n log n). The algorithm works as follows: if the length of the array A is less than or equal to 1, it returns A; otherwise, it returns the result of merging the sorted left half of A with the sorted right half of A.
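The recursive structure just described can be sketched in Python (an illustrative implementation, not code from the source):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of these tails is non-empty
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Sort a list in O(n log n) time by recursive halving."""
    if len(a) <= 1:        # base case: a list of length <= 1 is already sorted
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

For example, `merge_sort([5, 2, 9, 1])` returns `[1, 2, 5, 9]`.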
The work done at each depth i of the recursion encompasses both splitting and merging: at depth i there are 2^i subproblems, each of size n/2^i, so the work per level is O(n) in total. Since the recursion depth is O(log n), the total work across all levels is O(n log n).
The closest pair of points in a plane problem illustrates quadratic time complexity, O(n²). Given n points (x1, y1), …, (xn, yn), an iterative approach using two nested for loops compares every pair of points to find the minimum distance, yielding O(n²) time. Divide-and-conquer methods can improve on this bound.
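The nested-loop approach can be sketched as follows (the function name is illustrative, not from the source):

```python
import math

def closest_pair_distance(points):
    """Return the minimum distance over all pairs of points, in O(n^2) time."""
    best = math.inf
    n = len(points)
    for i in range(n):              # outer loop: each point in turn
        for j in range(i + 1, n):   # inner loop: every later point
            best = min(best, math.dist(points[i], points[j]))
    return best
```

Each unordered pair is examined exactly once, for n(n−1)/2 distance computations.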
In the Largest Interval Sum Problem, one must find indices i ≤ j in an array A[0], …, A[n − 1] such that the sum A[i] + ... + A[j] is maximized. The naïve algorithm initializes maximum_sum to −∞ and uses nested loops to iterate over all pairs (i, j), updating the maximum sum found.
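The naïve algorithm can be written with a running sum so each pair (i, j) costs O(1), for O(n²) in total (a sketch; names are illustrative):

```python
def max_interval_sum_naive(a):
    """Try every start index i; extend j rightward, keeping a running sum."""
    maximum_sum = float("-inf")    # initialize maximum_sum to -infinity
    for i in range(len(a)):
        running = 0
        for j in range(i, len(a)):
            running += a[j]        # running = a[i] + ... + a[j]
            maximum_sum = max(maximum_sum, running)
    return maximum_sum
```

Without the running sum, recomputing each interval from scratch would cost O(n³).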
Kadane's Algorithm optimizes the calculation of interval sums by maintaining a running sum of the best subarray ending at the current position and tracking the best sum seen so far, giving O(n) time. For example, when applied to the array [2, -3, 4, 2, 5, 7, -10, -8, 12], it finds the maximum subarray sum 18, achieved by the subarray [4, 2, 5, 7].
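Kadane's Algorithm itself fits in a few lines (an illustrative Python sketch):

```python
def kadane(a):
    """Maximum subarray sum in O(n): extend the current run or restart it."""
    best = current = a[0]
    for x in a[1:]:
        current = max(x, current + x)  # extend the run, or start fresh at x
        best = max(best, current)
    return best

kadane([2, -3, 4, 2, 5, 7, -10, -8, 12])  # 18, from the subarray [4, 2, 5, 7]
```

Unlike the naïve version, each element is inspected exactly once.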
Recurrence relations are leveraged to determine computational complexities, such as for Merge Sort, where T(1) = 1 and T(n) = 2T(n/2) + n for n > 1.
To verify this, one assumes the induction hypothesis T(m) ≤ cm log m for all m < n and proves the bound by substitution, using T(2) and T(3) as base cases (T(1) cannot serve as a base case, since n log n = 0 at n = 1).
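The substitution step can be written out as follows (a sketch, taking logarithms base 2 and assuming the hypothesis holds at n/2):

```latex
\begin{aligned}
T(n) &= 2\,T(n/2) + n \\
     &\le 2 \cdot c\,\tfrac{n}{2}\log\tfrac{n}{2} + n \\
     &= c\,n(\log n - 1) + n \\
     &= c\,n\log n - (c - 1)\,n \\
     &\le c\,n\log n \qquad \text{for } c \ge 1,
\end{aligned}
```

which is exactly the claimed bound, completing the inductive step.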
The Master Theorem provides a framework for solving recurrences of the form T(n) = aT(n/b) + f(n), where "a" represents the number of subproblems, "b" is the factor by which the problem size is reduced, and f(n) describes the additional work outside of recursive calls. The steps to apply the Master Theorem are: identify a, b, and f(n); calculate n^(log_b(a)); and compare f(n) to this value. Depending on the comparison, three cases define the asymptotic bounds of T(n). For Merge Sort, a = 2 and b = 2, so n^(log_2(2)) = n, and f(n) = n matches it exactly; this is Case 2, where f(n) = Θ(n^(log_b(a))), giving T(n) = Θ(n log n).
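For the common situation where f(n) is a plain polynomial n^d, these three steps reduce to comparing d against the critical exponent log_b(a). A small illustrative helper (not part of any standard library; names are assumptions):

```python
import math

def master_case(a, b, d, tol=1e-9):
    """Classify T(n) = a*T(n/b) + Theta(n^d) into Master Theorem cases 1-3.

    Case 1: d < log_b(a)  -> T(n) = Theta(n^{log_b a})   (recursion dominates)
    Case 2: d = log_b(a)  -> T(n) = Theta(n^d log n)     (balanced)
    Case 3: d > log_b(a)  -> T(n) = Theta(n^d)           (regularity assumed)
    """
    crit = math.log(a, b)          # the critical exponent log_b(a)
    if abs(d - crit) < tol:
        return 2
    return 1 if d < crit else 3
```

For Merge Sort, `master_case(2, 2, 1)` returns 2, matching the Θ(n log n) result above.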
Common algorithms include Binary Search, with a complexity of O(log n) derived from T(n) = T(n/2) + O(1), and Merge Sort, with a complexity of O(n log n) evaluated from T(n) = 2T(n/2) + O(n).
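Binary Search's recurrence T(n) = T(n/2) + O(1) reflects discarding half the remaining range at each step; a minimal sketch, shown here in iterative form:

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent, in O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1      # target can only lie in the right half
        else:
            hi = mid - 1      # target can only lie in the left half
    return -1
```

Each iteration halves hi − lo, so at most ⌈log₂ n⌉ + 1 comparisons are made.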
Certain recurrence relations are not manageable by the Master Theorem. For example, in T(n) = 2^n T(n/2) + n^b the number of subproblems a = 2^n is not a constant, so the theorem does not apply.
To handle intractability in algorithm problems, techniques like approximation, which finds nearly optimal solutions, and randomization, which employs probabilistic algorithms, can be utilized.
To solve the recurrence relation T(n) = 3T(n/2) + n² using the Master Theorem, we first identify the parameters:
a = 3 (the number of subproblems),
b = 2 (the factor by which the problem size is reduced),
f(n) = n² (the additional work outside recursive calls).
Next, we calculate n^(log_b(a)):
log_b(a) = log_2(3) ≈ 1.585,
therefore, n^(log_b(a)) = n^(log_2(3)) ≈ n^1.585.
Now, we compare f(n) with n^(log_b(a)):
Since f(n) = n² is polynomially larger than n^1.585 (because 2 > 1.585), we fall into Case 3 of the Master Theorem, which states:
If f(n) is polynomially larger than n^(log_b(a)) and satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1, then:
T(n) = Θ(f(n)) = Θ(n²).
Here a·f(n/b) = 3·(n/2)² = (3/4)n², so the regularity condition holds with c = 3/4.
Thus, the solution of the recurrence T(n) = 3T(n/2) + n² is T(n) = Θ(n²).