What is the goal of sorting?
Arrange records or elements into a defined order (typically ascending or descending) based on keys or comparison criteria.
Why study multiple sorting algorithms?
They illustrate performance trade-offs, memory usage patterns, stability, and design paradigms like divide-and-conquer and in-place operations.
What is a comparison sort?
A sort that orders elements solely by comparing keys (e.g., insertion, selection, bubble, merge, quick, heap).
What is a non-comparison sort?
A sort that uses key structure rather than comparisons (e.g., counting, bucket, radix) to achieve linear-time behavior under constraints.
What is algorithmic stability in sorting?
If two items have equal keys, a stable sort preserves their original relative order.
T/F: Stability preserves the original relative order of equal-key records.
TRUE
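Stability can be seen directly with Python's built-in sort, which is stable (the record names and keys here are illustrative):

```python
records = [("carol", 2), ("alice", 1), ("bob", 1)]
# Python's built-in sorted() is stable: records with equal keys
# keep the relative order they had in the input.
by_key = sorted(records, key=lambda r: r[1])
# "alice" still precedes "bob" among the key == 1 records.
```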
What does “in-place” mean for sorting?
The algorithm uses O(1) or small extra space beyond the input array (e.g., quick sort, heap sort, insertion sort).
T/F: An in-place sort uses only a constant amount of extra memory.
TRUE
Which complexities matter when judging sorts?
Time complexity (best/average/worst), space complexity, stability, constant factors, and cache behavior.
What is the decision-tree lower bound for comparison sorts?
Ω(n log n) comparisons in the worst case.
T/F: No comparison sort can beat Ω(n log n) in the worst case.
TRUE
When are O(n²) sorts acceptable?
For very small n, nearly sorted data, or when constants and simplicity dominate.
What is insertion sort?
A simple comparison sort that builds a sorted prefix by inserting each element into its correct position.
How does insertion sort operate?
It scans leftward through the sorted prefix, shifting larger elements one position right, and places the current element just after the first key that is less than or equal to it.
What is the time complexity of insertion sort?
Worst/average O(n²); best O(n) on nearly sorted input.
Is insertion sort stable and in-place?
Yes, it is stable and in-place with O(1) extra space.
T/F: Insertion sort runs in O(n) time on an already nearly sorted array.
TRUE
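A minimal Python sketch of insertion sort as described above (the function name is illustrative):

```python
def insertion_sort(a):
    """Sort list a in place; stable, O(1) extra space."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's position is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

On nearly sorted input the inner while loop rarely runs, which is where the O(n) best case comes from.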
What is bubble sort?
A simple comparison sort that repeatedly swaps adjacent out-of-order pairs, bubbling extremes toward the end.
What is bubble sort’s time complexity?
Worst/average O(n²); best O(n) with early-exit optimization when no swaps occur on a pass.
Is bubble sort stable and in-place?
Yes, bubble sort is stable and in-place.
T/F: Bubble sort’s average time complexity is O(n log n).
FALSE
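A sketch of bubble sort with the early-exit optimization mentioned above (names are illustrative):

```python
def bubble_sort(a):
    """Stable, in-place bubble sort with early exit (O(n) best case)."""
    n = len(a)
    for end in range(n - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:  # no swaps on this pass: already sorted
            break
    return a
```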
What is selection sort?
A comparison sort that repeatedly selects the minimum from the unsorted region and places it at the boundary.
What is selection sort’s time complexity?
Always Θ(n²) comparisons regardless of input order.
Is selection sort stable and in-place?
In-place yes; not stable in its basic form (equal elements may be reordered).
T/F: Selection sort makes Θ(n²) comparisons even on sorted input.
TRUE
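A sketch of selection sort, showing why it always makes Θ(n²) comparisons but only O(n) swaps (names are illustrative):

```python
def selection_sort(a):
    """In-place selection sort: Theta(n^2) comparisons, at most n-1 swaps."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):  # scan the entire unsorted region
            if a[j] < a[m]:
                m = j
        if m != i:
            a[i], a[m] = a[m], a[i]  # at most one swap per position
    return a
```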
What is merge sort?
A stable divide-and-conquer comparison sort that recursively splits, sorts, and merges subarrays.
What is merge sort’s time complexity?
Θ(n log n) in best/average/worst cases due to balanced divide-and-conquer.
What extra space does merge sort require?
Θ(n) additional space for temporary arrays during merging.
Is merge sort stable and in-place?
Stable yes; not in-place in the standard array implementation.
T/F: Merge sort guarantees Θ(n log n) time in the worst case.
TRUE
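A sketch of merge sort returning a new list, which makes the Θ(n) extra space explicit (names are illustrative):

```python
def merge_sort(a):
    """Stable merge sort; Theta(n log n) time, Theta(n) extra space."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # <= keeps equal keys in their original order (stability).
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```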
What is quick sort?
An in-place divide-and-conquer comparison sort that partitions around a pivot, then sorts partitions.
What determines quick sort performance?
Pivot quality; balanced partitions yield O(n log n), highly unbalanced partitions yield O(n²).
What is quick sort’s average and worst-case time?
Average O(n log n); worst-case O(n²) (e.g., already sorted with poor pivot).
Is quick sort stable and in-place?
Typically in-place; not stable in standard in-place implementations.
T/F: Randomized pivot selection helps avoid quick sort’s worst case in practice.
TRUE
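A sketch of in-place quick sort with a randomized pivot and Lomuto partitioning, illustrating the point above (names are illustrative):

```python
import random

def quick_sort(a, lo=0, hi=None):
    """In-place quick sort; randomized pivot avoids the sorted-input worst case."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    p = random.randint(lo, hi)       # random pivot choice
    a[p], a[hi] = a[hi], a[p]
    pivot, i = a[hi], lo
    for j in range(lo, hi):          # Lomuto partition around the pivot
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # pivot lands at its final index i
    quick_sort(a, lo, i - 1)
    quick_sort(a, i + 1, hi)
    return a
```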
What is heap sort?
A comparison sort that builds a heap and repeatedly extracts the max (or min) to produce sorted order.
What is heap sort’s time and space complexity?
O(n log n) time; O(1) extra space (in-place with array-based heap).
Is heap sort stable?
No, heap sort is not stable in standard implementations.
T/F: Heap sort is in-place and runs in O(n log n) worst-case time.
TRUE
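A sketch of array-based heap sort: build a max-heap bottom-up, then repeatedly move the max to the end (names are illustrative):

```python
def heap_sort(a):
    """In-place heap sort: O(n log n) worst case, O(1) extra space."""
    n = len(a)

    def sift_down(root, size):
        while True:
            child = 2 * root + 1
            if child >= size:
                return
            # Pick the larger of the two children.
            if child + 1 < size and a[child + 1] > a[child]:
                child += 1
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):  # build max-heap bottom-up
        sift_down(i, n)
    for end in range(n - 1, 0, -1):      # repeatedly extract the max
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```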
What is bucket sort?
A non-comparison sort that distributes keys into a fixed number of buckets, then sorts buckets and concatenates.
When does bucket sort run in linear time?
When keys are uniformly distributed, so each bucket receives O(1) elements on average; expected total time is O(n + k) for n elements and k buckets.
Is bucket sort stable?
If stable sorting is used within buckets and concatenation preserves order, bucket sort can be stable.
T/F: Bucket sort’s performance depends on key distribution across buckets.
TRUE
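A sketch of bucket sort, assuming float keys uniformly distributed in [0, 1) (the assumption and names are illustrative):

```python
def bucket_sort(a, num_buckets=10):
    """Bucket sort for floats in [0, 1); expected O(n + k) on uniform keys."""
    buckets = [[] for _ in range(num_buckets)]
    for x in a:
        buckets[int(x * num_buckets)].append(x)  # map key to its bucket
    out = []
    for b in buckets:
        out.extend(sorted(b))  # stable sort within each bucket
    return out
```

Concatenating buckets in order, with a stable sort inside each, keeps the overall result stable.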
What is radix sort?
A non-comparison sort that processes keys digit-by-digit using a stable subroutine (often counting sort).
What is radix sort’s time complexity?
O(d·(n + k)) for d digits and radix k; can be O(n) when d and k are bounded.
Is radix sort stable and what space does it use?
Stable if the digit pass is stable; extra space depends on the counting/bucket arrays used.
T/F: Radix sort assumes keys can be decomposed into digits from a finite alphabet.
TRUE
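A sketch of LSD radix sort for non-negative integers, using a stable counting-sort pass per digit (names are illustrative):

```python
def radix_sort(a, base=10):
    """LSD radix sort for non-negative ints; O(d*(n + k)) for d digits, radix k."""
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:
        counts = [0] * base
        for x in a:
            counts[(x // exp) % base] += 1
        for d in range(1, base):      # prefix sums -> end positions
            counts[d] += counts[d - 1]
        out = [0] * len(a)
        for x in reversed(a):         # reverse scan keeps each pass stable
            counts[(x // exp) % base] -= 1
            out[counts[(x // exp) % base]] = x
        a = out
        exp *= base
    return a
```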
What is external sorting?
Sorting designed for data sets larger than memory, minimizing I/O by creating sorted runs and merging them.
How does external merge sort work?
Create sorted runs that fit in memory, then perform multiway merges to produce the final sorted output.
What dominates external sorting cost?
I/O operations; algorithms aim to minimize disk reads/writes and passes.
T/F: External sorting emphasizes multiway merging and I/O efficiency over CPU comparisons.
TRUE
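A simplified sketch of external merge sort over an iterator of newline-terminated lines; the run-size budget, helper name, and file handling are illustrative, and a real implementation would tune block I/O:

```python
import heapq
import os
import tempfile

def _write_run(sorted_lines):
    """Write one sorted run to a temporary file and return its path."""
    f = tempfile.NamedTemporaryFile("w", delete=False, suffix=".run")
    f.writelines(sorted_lines)
    f.close()
    return f.name

def external_sort(lines_iter, run_size=100_000):
    """Yield lines in sorted order: sorted runs on disk, then a k-way merge."""
    run_files, run = [], []
    for line in lines_iter:
        run.append(line)
        if len(run) >= run_size:              # memory budget reached:
            run_files.append(_write_run(sorted(run)))  # flush a sorted run
            run = []
    if run:
        run_files.append(_write_run(sorted(run)))
    files = [open(p) for p in run_files]
    try:
        yield from heapq.merge(*files)        # k-way merge of sorted runs
    finally:
        for f in files:
            f.close()
            os.remove(f.name)
```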
When would you choose insertion over merge?
On very small or nearly sorted arrays, where insertion’s low overhead and O(n) best case excel.
When would you choose merge over quick?
When worst-case guarantees and stability are required, or when sorting linked lists.
When would you choose quick over merge?
For in-memory arrays when average performance and low extra space are priorities.
When would you choose heap over quick?
When worst-case O(n log n) and in-place are required, and stability is not needed.
T/F: Merge sort is generally preferred for linked lists; quick sort is often preferred for arrays.
TRUE
Which sorts are stable by default?
Merge sort, insertion sort, bubble sort (and radix/bucket when stable subroutines are used).
Which sorts are in-place by default?
Quick sort, heap sort, insertion sort, selection sort, bubble sort (standard array forms).
Which sorts guarantee O(n log n) worst-case time?
Merge sort and heap sort (and some engineered quicksort variants with introspection).
T/F: Radix sort can achieve linear time when digit count and radix are bounded.
TRUE
Which sorts are best for nearly sorted data?
Insertion sort (O(n) best case) and bubble sort with early-exit optimization.
Which sorts are best for very large data on disk?
External merge sort with multiway merges and large block I/O.
T/F: Stability is essential when secondary keys must preserve the order of equal primary keys.
TRUE
Why might heap sort be slower in practice than quick sort?
Heap sort's parent/child jumps across the array have poor cache locality, and its constant factors are higher than quick sort's sequential partition scans.
Why is merge sort cache-friendly?
It processes arrays sequentially during merge, improving spatial locality compared to random heap accesses.
T/F: Quick sort’s partitioning typically has good cache locality on contiguous arrays.
TRUE
What is the best general-purpose in-memory sort?
Optimized quick sort (often with introsort fallback) due to average speed and small constant factors.
What hybrid techniques improve quick sort?
Randomized or median-of-three pivots, cutoff to insertion sort for small partitions, and introspective fallback to heap sort.
T/F: Many library sorts use hybrids like introsort for robust performance.
TRUE
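A sketch combining two of the hybrid techniques above: median-of-three pivoting and an insertion-sort cutoff for small partitions (the cutoff value and names are illustrative; a full introsort would also fall back to heap sort on deep recursion):

```python
CUTOFF = 16  # illustrative threshold; real libraries tune this

def hybrid_sort(a, lo=0, hi=None):
    """Quick sort with median-of-three pivot and insertion-sort cutoff."""
    if hi is None:
        hi = len(a) - 1
    while hi - lo > CUTOFF:
        mid = (lo + hi) // 2
        # Median-of-three: order a[lo], a[mid], a[hi] so a[mid] is the median.
        if a[mid] < a[lo]: a[lo], a[mid] = a[mid], a[lo]
        if a[hi] < a[lo]:  a[lo], a[hi] = a[hi], a[lo]
        if a[hi] < a[mid]: a[mid], a[hi] = a[hi], a[mid]
        a[mid], a[hi - 1] = a[hi - 1], a[mid]   # stash the pivot at hi-1
        pivot, i = a[hi - 1], lo
        for j in range(lo, hi - 1):             # partition lo..hi-2
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi - 1] = a[hi - 1], a[i]
        hybrid_sort(a, lo, i - 1)   # recurse on one side,
        lo = i + 1                  # loop on the other (shallower stack)
    # Small partition: finish with insertion sort on a[lo..hi].
    for k in range(lo + 1, hi + 1):
        key, j = a[k], k - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```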
How do you preserve stability when using bucket or radix sort?
Use stable counting within passes and concatenate buckets without reordering.
What affects radix sort’s practical performance?
Digit width, radix choice, memory bandwidth, and stable counting implementation.
T/F: Counting sort is a stable linear-time subroutine often used inside radix sort passes.
TRUE
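A sketch of counting sort as a stable subroutine over records with small integer keys, the kind of pass radix sort relies on (parameter names are illustrative):

```python
def counting_sort(records, key, k):
    """Stable counting sort of records by an integer key in range(k)."""
    counts = [0] * (k + 1)
    for rec in records:
        counts[key(rec) + 1] += 1
    for d in range(k):            # prefix sums -> start positions
        counts[d + 1] += counts[d]
    out = [None] * len(records)
    for rec in records:           # forward scan preserves input order
        out[counts[key(rec)]] = rec
        counts[key(rec)] += 1
    return out
```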
When is selection sort favored?
When writes are expensive and minimizing swaps matters, since selection sort performs O(n) swaps.
Which algorithm minimizes auxiliary memory among O(n log n) sorts?
Heap sort (O(1) extra space).
T/F: Merge sort’s Θ(n) extra space is a trade-off for stability and guaranteed O(n log n) time.
TRUE
Which sort is preferred for small subproblems in divide-and-conquer?
Insertion sort is commonly used below a cutoff threshold due to low overhead.
What risks lead to quick sort’s O(n²) behavior?
Poor pivot choices leading to highly unbalanced partitions (e.g., sorted input with first-element pivot).
T/F: Randomization mitigates adversarial inputs for quick sort.
TRUE
Which sort is easiest to implement correctly for guaranteed performance?
Merge sort (stable, predictable Θ(n log n) time).
Which comparison sort does the least total work on average?
Optimized quick sort typically wins on average thanks to cheap in-place moves and good locality, though merge sort actually performs slightly fewer comparisons (≈ n log n vs. quick sort's ≈ 1.39 n log n).
T/F: Any comparison sort that always beats O(n log n) worst-case would violate the decision-tree lower bound.
TRUE
How do you choose between merge and radix for large numeric keys?
If keys fit fixed-width digits and memory supports linear passes, radix can be faster; merge is more general and stable for arbitrary comparators.
When is bucket sort a good fit?
When keys are uniformly distributed over a known range that maps well to buckets.
T/F: Mapping keys poorly to buckets can degrade bucket sort’s performance.
TRUE
Which sort is best for stable secondary sorting after a primary sort?
Any stable sort (e.g., merge or insertion) to preserve prior order on equal keys.
Which sort is recommended for sorting linked lists?
Merge sort, because splitting and merging lists are efficient without random access.
T/F: Heap sort’s pattern of parent/child jumps can hurt cache performance compared to sequential merges.
TRUE
Which sort balances speed, space, and simplicity for medium datasets?
Quick sort with good pivoting and small-subarray insertion cutoff.
Sorting is studied in computer science for what three main reasons?
It illustrates creative approaches to problem solving, practices fundamental programming techniques, and demonstrates algorithm performance.
What does the insertion-sort algorithm do?
It sorts a list by repeatedly inserting a new element into a sorted sublist until the whole list is sorted.
T/F: Insertion sort works by repeatedly inserting elements into a sorted sublist.
TRUE
What is the worst-case time complexity of insertion sort?
O(n²).
T/F: The insertion sort has O(n²) worst-case time complexity.
TRUE
How does bubble sort operate?
It makes several passes through an array, comparing neighboring pairs and swapping them if they are not in order.
T/F: A bubble sort repeatedly swaps adjacent elements if they are out of order.
TRUE
What happens after each pass in bubble sort?
The largest unsorted element moves to its correct position at the end of the array.
What is bubble sort’s best-case scenario?
When the array is already sorted, requiring only one pass.
T/F: The bubble sort can finish in one pass if the array is already sorted.
TRUE
What is bubble sort’s worst-case time complexity?
O(n²).
What defines merge sort’s main strategy?
It divides the array into halves recursively and merges sorted halves.