Define the concept of recursion in computer science and explain how it is used in
sorting algorithms. Give details about
1. The basic idea behind recursive sorting algorithms
Recursion-based sorting algorithms break the collection down into smaller sub-collections before finally sorting it. In the context of merge sort, the algorithm keeps splitting a large collection into smaller sub-collections until each one is sorted; this usually happens when a sub-collection has only one element, which is automatically sorted, and is known as the base case. It then repeatedly combines the smaller sub-collections until the length of the original collection is reached and all elements inside are sorted.
Recursion is a concept in computer science where a large problem is broken down into smaller instances of the same problem, each solved with the same algorithm, until the instances are simple enough to be solved directly; this simplest instance is considered the base case.
Recursive sorting algorithms use recursion in a Divide and Conquer fashion: the collection that needs to be sorted is divided into smaller sub-collections before the algorithm sorts (conquers) them. In the context of merge sort, the collection is divided until each sublist contains only one element, meaning it is inherently sorted (conquered); this is also the base case of merge sort. After this happens, merge sort merges the sublists into a temporary list while ensuring the elements added remain sorted. After this merge process, the list is sorted.
2. How they divide the original problem
Recursive sorting algorithms divide the original problem by breaking the bigger list down into smaller sublists, usually until the sublists are small enough to be trivially sorted.
For Merge Sort, the algorithm keeps dividing the input list until each sublist has a length of 1, which means it is already sorted (with only one element, it is automatically sorted). This is the base case for Merge Sort.
For Quick Sort, the algorithm divides the input list using a “pivot”: it selects an element in the list and divides the remaining elements into a left list and a right list, ensuring that the elements in the left list are smaller than the pivot and the elements in the right list are greater than the pivot.
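As a rough sketch of the two dividing strategies just described (the pivot choice here is deliberately simplistic and only for illustration):

def split_in_half(items):
    """Merge-sort-style divide: cut the list at its midpoint."""
    mid = len(items) // 2
    return items[:mid], items[mid:]

def split_around_pivot(items):
    """Quick-sort-style divide: pick a pivot, separate smaller and greater elements."""
    pivot = items[0]                           # simplistic pivot choice for illustration
    left = [x for x in items[1:] if x <= pivot]
    right = [x for x in items[1:] if x > pivot]
    return left, pivot, right

print(split_in_half([5, 2, 8, 1]))             # ([5, 2], [8, 1])
print(split_around_pivot([5, 2, 8, 1]))        # ([2, 1], 5, [8])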
3. How they combine the results
Merge Sort combines the results by merging the sublists into one bigger list, sorting the elements in the process.
QuickSort combines the results by concatenating the sorted left sublist, the pivot and the sorted right sublist back into the bigger list. Because every element in the left sublist is smaller than the pivot and every element in the right sublist is greater, the entire list ends up sorted once each sublist is sorted.
Give an example of a recursive sorting algorithm and explain how it works.
An example of a recursive sorting algorithm would be Merge Sort.
1. Merge sort works by breaking the input list down into smaller sublists, all the way until each sublist consists of only one element, which means the sublist is sorted. This is the base case of Merge Sort.
2. When merge sort reaches its base case, it starts merging the sublists back together, ensuring the elements stay in sorted order. It continues this merging process until all the sublists have been merged, returning the original list fully sorted (a sketch in Python follows below).
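A minimal Merge Sort sketch following the two steps above (assuming a list of mutually comparable elements; this is one possible illustration, not the only way to write it):

def merge_sort(items):
    """Recursively split the list, then merge the sorted halves back together."""
    if len(items) <= 1:                   # base case: 0 or 1 elements is already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])        # sort each half recursively
    right = merge_sort(items[mid:])
    return merge(left, right)             # combine step

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:           # taking from the left on ties keeps the sort stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])               # one of these is empty; the other is appended whole
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 8, 1, 9, 3]))     # [1, 2, 3, 5, 8, 9]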
Briefly describe the QuickSort algorithm, giving details about
1. How does it sort a given array
QuickSort sorts a given array by first dividing the array into subarrays. This is done by picking a pivot, which ideally is the median of the array. It then rearranges the elements so that all elements in the left subarray are less than the pivot, while all elements in the right subarray are greater than the pivot. By doing so, the pivot is now in its final position in the actual array. The next step is to apply the same process to each of the subarrays, breaking them down into even smaller subarrays, until every subarray has length 1 (or 0) and is therefore sorted. When this is done, the array is fully sorted and the QuickSort is complete.
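A minimal in-place QuickSort sketch along these lines (it uses the last element as the pivot via a Lomuto-style partition; real implementations usually choose the pivot more carefully, as discussed below):

def quick_sort(items, lo=0, hi=None):
    """Sort items in place by partitioning around a pivot, then recursing on each side."""
    if hi is None:
        hi = len(items) - 1
    if lo >= hi:                          # base case: 0 or 1 elements in this range
        return
    p = partition(items, lo, hi)          # pivot ends up in its final position p
    quick_sort(items, lo, p - 1)          # sort the elements smaller than the pivot
    quick_sort(items, p + 1, hi)          # sort the elements greater than the pivot

def partition(items, lo, hi):
    """Lomuto partition: move everything <= pivot to the left, return the pivot's index."""
    pivot = items[hi]                     # simplistic pivot choice: last element
    i = lo - 1
    for j in range(lo, hi):
        if items[j] <= pivot:
            i += 1
            items[i], items[j] = items[j], items[i]
    items[i + 1], items[hi] = items[hi], items[i + 1]
    return i + 1

data = [5, 2, 8, 1, 9, 3]
quick_sort(data)
print(data)                               # [1, 2, 3, 5, 8, 9]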
What is the best and worst case time complexity of QuickSort and explain
why? No explanation, no marks.
Best Case Complexity: O(N log N)
- N is the number of elements in the list
Explanation: The best case of QuickSort happens when the pivot chosen at every step is (close to) the median element. In that scenario the list is always partitioned roughly in half, so the depth of recursion is logarithmic, O(log N). Since partitioning all the sublists at one level of recursion costs O(N) in total, multiplying the O(N) work per level by the O(log N) levels gives a final time complexity of O(N log N).
Worst Case Complexity: O(N^2)
- N is the number of elements in the list
Explanation: The worst case of QuickSort happens when the pivot chosen is always either the smallest or largest element in the list, because the depth of recursion then grows linearly with the input size. When we partition around such a pivot, one of the two partitions is always empty (the left partition is empty if the pivot is the smallest element, the right partition if it is the largest), so each recursive call only reduces the problem by one element, giving a recursion depth of O(N). Since partitioning a list costs O(N), we get a complexity of O(N*N), or O(N^2).
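One compact way to state both results (using the same N as above): in the best case the running time obeys the recurrence T(N) = 2*T(N/2) + O(N), which solves to O(N log N), while in the worst case it obeys T(N) = T(N-1) + O(N), which sums to O(N^2).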
Does the choice of the pivot matter in a QuickSort algorithm? Why/why not?
The choice of pivot matters in QuickSort: a bad selection of pivots can push the algorithm to its worst-case time complexity of O(N^2), while a good selection of pivots gives its best case of O(N log N). This happens because a “good” pivot partitions the list in a balanced way, with roughly equal numbers of elements in the left and right partitions, whereas a “bad” pivot partitions it in an unbalanced way; in the worst case one partition is empty while the other holds N-1 elements.
If the maximum or minimum element is always chosen as the pivot in the QuickSort algorithm, the partitioning becomes highly unbalanced, with one partition having N-1, then N-2, and so on elements, while the other partition has 0 elements. This leads to O(N) recursive levels because each recursive call reduces the size of the problem by only one element. Since partitioning at each level has O(N) time complexity, the total time complexity in this worst-case scenario for QuickSort is O(N^2).
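One common mitigation (added here as a sketch, not part of the original answer) is a median-of-three pivot: take the median of the first, middle and last elements, which avoids the degenerate splits caused by always picking the first or last element of an already-sorted list.

def median_of_three_index(items, lo, hi):
    """Return the index of the median of items[lo], items[mid], items[hi].

    Using this index as the pivot in the partition step avoids the
    worst case on already-sorted or reverse-sorted input.
    """
    mid = (lo + hi) // 2
    a, b, c = items[lo], items[mid], items[hi]
    if (a <= b <= c) or (c <= b <= a):
        return mid
    if (b <= a <= c) or (c <= a <= b):
        return lo
    return hi

print(median_of_three_index([8, 3, 5, 9, 2], 0, 4))  # 2, since 5 is the median of 8, 5, 2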
Briefly describe the MergeSort algorithm, giving details about
1. How it sorts a given array?
1. MergeSort first partitions the array by splitting it down the middle, breaking it down until all the subarrays are of length 1.
2. Since arrays of length 1 are inherently sorted, MergeSort does not need to do any sorting within these subarrays.
3. MergeSort then merges all the subarrays while ensuring the result stays sorted. This is done by comparing the values at the front of each subarray being merged and appending them in sorted order.
4. After all subarrays have been merged, the array is sorted and MergeSort is done.
What is the best and worst case time complexity of MergeSort and explain
why? No explanation, no marks.
Best case and Worst case: O(N log N)
- N is the number of elements in the list
Explanation: Both the best and worst case of MergeSort are O(N log N), as MergeSort has no early exits. The complexity arises because we keep splitting the list in half, which gives a recursive call depth of log N, and the merging done across all subarrays at each level of recursion costs O(N). Multiplying the O(N) merge work per level by the log N levels gives O(N log N) for MergeSort.
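Stated as a recurrence, this is the same relation as QuickSort's best case, T(N) = 2*T(N/2) + O(N), except that MergeSort always splits exactly in half, so it holds for every input and solves to O(N log N) unconditionally.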
In which scenario would MergeSort be a good choice of sorting algorithm and
explain why
MergeSort would be a good choice of sorting algorithm when dealing with large sets of data, as its complexity is always O(N log N), with N being the length of the input list. Furthermore, it is a stable sorting algorithm, which can be important in some cases because it preserves the relative order of equal elements, as opposed to a similarly fast but unstable sorting algorithm such as QuickSort.
Describe the stable and unstable versions of QuickSort and MergeSort. What are the
differences between them? Compare the stability of QuickSort and MergeSort and
discuss which one would be a better choice in a scenario where stability is important
In a stable sorting algorithm, the relative order of elements which are equal is maintained. For example, if the input list is [(a,0),(c,2),(d,3),(b,0)] and is sorted according to the second element of the tuple, the output list should be [(a,0),(b,0),(c,2),(d,3)].
Merge Sort is inherently a stable sorting algorithm, as it always maintains the initial
relative order of items if they have the same value. In the merge step, as we make
comparisons to create a sorted array, if two items have equal value, the left hand array’s
item will be picked before the right hand array’s item.
On the other hand, QuickSort is inherently unstable because items are moved around the
pivot without considering their original position, which can disrupt the relative order of
equal elements.
That is, if a pivot element 'p' has the same value as an element 'a' that originally appears before it, 'p' may end up before 'a' after sorting, disrupting the order. However, QuickSort can be made stable by altering the partitioning step so that elements of equal value keep their original relative order.
Within the stable version of QuickSort/MergeSort, elements of equal value are sorted in
the same order as they were in the original array. That is to say, if 'a' comes before 'b' and
'a' and 'b' have the same value, 'a' should be before 'b' in the sorted list.
In the unstable version of QuickSort/MergeSort, elements of equal value are not guaranteed to keep the same relative order as in the original array. For example, if the input list is [(a,2),(b,1),(d,0),(c,2)] and is sorted according to the second element of the tuple, the output list could be [(d,0),(b,1),(c,2),(a,2)] under the unstable version of an algorithm, with (c,2) now ahead of (a,2) even though (a,2) appeared first in the input.
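A quick way to see stable behaviour in practice (a small demo added here, not from the original answer) is Python's built-in sorted(), which is guaranteed stable; an in-place partitioning QuickSort, by contrast, can swap equal-keyed items past each other and gives no such guarantee.

# Records share the keys 2, 1, 0, 2; we sort only by the numeric key.
records = [("a", 2), ("b", 1), ("d", 0), ("c", 2)]

stable = sorted(records, key=lambda pair: pair[1])   # sorted() is a stable sort
print(stable)  # [('d', 0), ('b', 1), ('a', 2), ('c', 2)] -- ('a', 2) stays ahead of ('c', 2)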
Overall, in scenarios where stability is important, MergeSort is the better choice as it guarantees stability inherently, while QuickSort would require specific modifications to remain stable.