Merge Sort and Quick Sort Overview
Merge Sort Process
- Merge Step:
- Combine two sorted halves by comparing their elements.
- Use two pointers to track positions in each half.
- Place the smaller element into the correct position in the merged array and move the relevant pointer.
- Continue until all elements are merged; if one half is exhausted, copy the remaining elements from the other half (see the sketch below).
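A minimal sketch of the merge step just described, assuming Python and two already-sorted lists as input (the function name and structure are illustrative, not from the notes):

```python
def merge(left, right):
    """Combine two sorted lists by repeatedly taking the smaller front element."""
    merged = []
    i = j = 0                          # one pointer per half
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # smaller (or equal) element goes first
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One half is exhausted: copy whatever remains from the other half.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```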
In-Place Sorting
- In-Place Definition:
- An algorithm is in-place if it uses only a constant amount of extra memory (a fixed number of variables) beyond the input.
- Merge Sort:
- Not an in-place sorting algorithm.
- Requires additional temporary arrays to store intermediate results during merging.
Time Complexity Analysis (Big O Notation)
Divide Stage:
- Each division operation costs $O(1)$ (finding the midpoint).
- The number of divisions for an array of size $n$ is $n - 1$, so the divide stage costs $O(n)$ in total.
Merge Stage:
- Merging two halves costs $O(n)$, since in the worst case every element must be compared and each element is copied exactly once.
- Merging happens once at each level of the recursion tree, and the height of the tree is $\log n$.
- The total merging cost is therefore $O(n \log n)$.
Summary of Complexity
- Overall Complexity:
- Since merging dominates the splitting stage, the overall complexity of merge sort is $O(n \log n)$.
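As a quick check (a standard textbook recurrence, not spelled out in the notes above), the running time satisfies

$$
T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + O(n), \qquad T(1) = O(1),
$$

which unrolls into $\log_2 n$ levels of $O(n)$ work each, giving $T(n) = O(n \log n)$.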
Example Run of Merge Sort
- Start with the full array and repeatedly find midpoints to split it.
- Split until only single elements remain, then begin merging.
- Use temporary arrays to store intermediate merges, ensuring no data is lost.
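A compact sketch of such a run, reusing the `merge` helper sketched earlier (the input values are illustrative, not from the notes):

```python
def merge_sort(values):
    """Sort by splitting at the midpoint, then merging the sorted halves."""
    if len(values) <= 1:               # a single element is already sorted
        return values
    mid = len(values) // 2             # find the midpoint
    left = merge_sort(values[:mid])    # slicing creates the temporary arrays
    right = merge_sort(values[mid:])
    return merge(left, right)          # merge() as sketched in the merge-step section

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]
```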
Quick Sort Overview
- Quick Sort Process:
- Select a pivot (can be the first, last, or any element).
- Partition the array into elements less than and greater than the pivot.
- Recursively apply quick sort to the partitions.
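A minimal sketch of this process, assuming Python, the simplest (non-in-place) partitioning, and the first element as pivot (names are illustrative):

```python
def quick_sort(values):
    """Pick a pivot, partition around it, and recurse on each side."""
    if len(values) <= 1:
        return values
    pivot = values[0]                                 # pivot choice: first element
    less = [v for v in values[1:] if v < pivot]       # elements smaller than the pivot
    greater = [v for v in values[1:] if v >= pivot]   # the rest
    return quick_sort(less) + [pivot] + quick_sort(greater)
```

The in-place, two-pointer partitioning discussed below avoids building these extra lists.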
Time Complexity for Quick Sort
Best Case:
- When the pivot splits the array into two roughly equal halves, the complexity is $O(n \log n)$.
Worst Case:
- When the pivot repeatedly produces highly unbalanced partitions (e.g., an already-sorted array with the first element as pivot), the complexity degrades to $O(n^2)$.
Average Case Complexity:
- On average, partitions are balanced enough that the expected complexity is $O(n \log n)$.
Recall: Choosing good pivots leads to better performance, whereas repeated poor pivot choices can degrade performance significantly.
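The same style of recurrence (standard analysis, not from the notes) makes the best and worst cases concrete:

$$
\text{best/average: } T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + O(n) = O(n \log n), \qquad
\text{worst: } T(n) = T(n-1) + O(n) = O(n^2).
$$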
Recap on Partitioning Strategy in Quick Sort
- Use two pointers to track elements from both ends.
- Compare and swap values to ensure elements are partitioned correctly around the pivot.
- When pointers cross, place the pivot in its final sorted position.
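A sketch of this two-pointer partitioning, assuming Python, the first element as pivot, and inclusive bounds `lo..hi` (function names are illustrative):

```python
def partition(arr, lo, hi):
    """Partition arr[lo..hi] around arr[lo]; return the pivot's final index."""
    pivot = arr[lo]
    i, j = lo + 1, hi                       # pointers from both ends
    while True:
        while i <= j and arr[i] <= pivot:   # advance left pointer past small values
            i += 1
        while i <= j and arr[j] >= pivot:   # move right pointer past large values
            j -= 1
        if i > j:                           # pointers have crossed: partition done
            break
        arr[i], arr[j] = arr[j], arr[i]     # swap the misplaced pair
    arr[lo], arr[j] = arr[j], arr[lo]       # place the pivot in its final position
    return j

def quick_sort_in_place(arr, lo=0, hi=None):
    """Recursive driver: sort arr[lo..hi] in place."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort_in_place(arr, lo, p - 1)
        quick_sort_in_place(arr, p + 1, hi)
```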
Key Takeaways
- Both Merge Sort and Quick Sort utilize divide and conquer strategies but with differing methods and complexities.
- Merge Sort guarantees consistent $O(n \log n)$ complexity at the cost of additional memory, whereas Quick Sort is often faster in practice but can degrade to $O(n^2)$ unless good pivots are chosen.