CH 23 (SORTING) - FLASH

1
New cards

What is the goal of sorting?

Arrange records or elements into a defined order (typically ascending or descending) based on keys or comparison criteria.

2
New cards

Why study multiple sorting algorithms?

They illustrate performance trade-offs, memory usage patterns, stability, and design paradigms like divide-and-conquer and in-place operations.

3
New cards

What is a comparison sort?

A sort that orders elements solely by comparing keys (e.g., insertion, selection, bubble, merge, quick, heap).

4
New cards

What is a non-comparison sort?

A sort that uses key structure rather than comparisons (e.g., counting, bucket, radix) to achieve linear-time behavior under constraints.

5
New cards

What is algorithmic stability in sorting?

If two items have equal keys, a stable sort preserves their original relative order.

6
New cards

T/F: Stability preserves the original relative order of equal-key records.

TRUE

7
New cards

What does “in-place” mean for sorting?

The algorithm uses O(1) or small extra space beyond the input array (e.g., quick sort, heap sort, insertion sort).

8
New cards

T/F: An in-place sort uses only a constant amount of extra memory.

TRUE

9
New cards

Which complexities matter when judging sorts?

Time complexity (best/average/worst), space complexity, stability, constant factors, and cache behavior.

10
New cards

What is the decision-tree lower bound for comparison sorts?

Ω(n log n) comparisons in the worst case.
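
A brief sketch of the counting argument behind this card: a decision tree that distinguishes all n! input orderings needs at least n! leaves, so its height h (the worst-case number of comparisons) satisfies

```latex
h \;\ge\; \log_2(n!) \;\ge\; \log_2\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) \;=\; \tfrac{n}{2}\log_2\tfrac{n}{2} \;=\; \Omega(n \log n)
```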

11
New cards

T/F: No comparison sort can beat Ω(n log n) in the worst case.

TRUE

12
New cards

When are O(n²) sorts acceptable?

For very small n, nearly sorted data, or when constants and simplicity dominate.

13
New cards

What is insertion sort?

A simple comparison sort that builds a sorted prefix by inserting each element into its correct position.

14
New cards

How does insertion sort operate?

It scans leftward through the sorted prefix, shifting larger elements one position to the right, and inserts the current element just after the first element that is smaller than or equal to it.
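
A minimal Java sketch of this insertion step, assuming an int array; the class and method names are illustrative.

```java
// A minimal insertion sort sketch; class and method names are illustrative.
class InsertionSortSketch {
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int current = a[i];                 // next element to place into the sorted prefix
            int j = i - 1;
            while (j >= 0 && a[j] > current) {  // shift strictly larger elements right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = current;                 // lands just after the first element <= current
        }
    }
}
```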

15
New cards

What is the time complexity of insertion sort?

Worst/average O(n²); best O(n) on nearly sorted input.

16
New cards

Is insertion sort stable and in-place?

Yes, it is stable and in-place with O(1) extra space.

17
New cards

T/F: Insertion sort runs in O(n) time on an already nearly sorted array.

TRUE

18
New cards

What is bubble sort?

A simple comparison sort that repeatedly swaps adjacent out-of-order pairs, so each pass bubbles the largest remaining element to the end of the unsorted region.

19
New cards

What is bubble sort’s time complexity?

Worst/average O(n²); best O(n) with early-exit optimization when no swaps occur on a pass.
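
A minimal Java sketch of bubble sort with the early-exit flag described above; names are illustrative. If no swaps occur on a pass, the array is already sorted and the loop stops, giving the O(n) best case.

```java
// A bubble sort sketch with the early-exit flag; names are illustrative.
class BubbleSortSketch {
    static void bubbleSort(int[] a) {
        boolean swapped = true;
        for (int pass = 1; pass < a.length && swapped; pass++) {
            swapped = false;
            for (int i = 0; i < a.length - pass; i++) {  // largest unsorted value bubbles to the end
                if (a[i] > a[i + 1]) {
                    int tmp = a[i]; a[i] = a[i + 1]; a[i + 1] = tmp;
                    swapped = true;
                }
            }
        }
    }
}
```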

20
New cards

Is bubble sort stable and in-place?

Yes, bubble sort is stable and in-place.

21
New cards

T/F: Bubble sort’s average time complexity is O(n log n).

FALSE

22
New cards

What is selection sort?

A comparison sort that repeatedly selects the minimum from the unsorted region and places it at the boundary.
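
A minimal Java sketch of selection sort, assuming an int array; names are illustrative. Note the single swap at the end of each pass, which is why selection sort performs only O(n) swaps overall.

```java
// A selection sort sketch; one swap per pass gives O(n) swaps in total.
class SelectionSortSketch {
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j] < a[min]) min = j;               // index of smallest unsorted element
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;  // place it at the sorted boundary
        }
    }
}
```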

23
New cards

What is selection sort’s time complexity?

Always Θ(n²) comparisons regardless of input order.

24
New cards

Is selection sort stable and in-place?

In-place yes; not stable in its basic form (equal elements may be reordered).

25
New cards

T/F: Selection sort makes Θ(n²) comparisons even on sorted input.

TRUE

26
New cards

What is merge sort?

A stable divide-and-conquer comparison sort that recursively splits, sorts, and merges subarrays.

27
New cards

What is merge sort’s time complexity?

Θ(n log n) in best/average/worst cases due to balanced divide-and-conquer.

28
New cards

What extra space does merge sort require?

Θ(n) additional space for temporary arrays during merging.
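
A Java sketch of merge sort that makes the Θ(n) auxiliary space visible as temporary copies of the two halves; class and method names are illustrative.

```java
import java.util.Arrays;

// A merge sort sketch that allocates temporary copies of each half before merging.
class MergeSortSketch {
    static void mergeSort(int[] a) {
        if (a.length < 2) return;
        int mid = a.length / 2;
        int[] left = Arrays.copyOfRange(a, 0, mid);        // Theta(n) auxiliary space overall
        int[] right = Arrays.copyOfRange(a, mid, a.length);
        mergeSort(left);
        mergeSort(right);
        merge(left, right, a);
    }

    // Taking from the left half on ties keeps equal keys in their original order (stability).
    static void merge(int[] left, int[] right, int[] a) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) a[k++] = left[i++];
        while (j < right.length) a[k++] = right[j++];
    }
}
```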

29
New cards

Is merge sort stable and in-place?

Stable yes; not in-place in the standard array implementation.

30
New cards

T/F: Merge sort guarantees Θ(n log n) time in the worst case.

TRUE

31
New cards

What is quick sort?

An in-place divide-and-conquer comparison sort that partitions around a pivot, then sorts partitions.

32
New cards

What determines quick sort performance?

Pivot quality; balanced partitions yield O(n log n), highly unbalanced partitions yield O(n²).
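
A Java sketch of quick sort using Lomuto partitioning with the last element as the pivot; names are illustrative. With this fixed pivot, already-sorted input produces maximally unbalanced partitions and the O(n²) case, which is what randomized or median-of-three pivoting guards against.

```java
// A quick sort sketch using Lomuto partitioning around the last element.
class QuickSortSketch {
    static void quickSort(int[] a, int low, int high) {
        if (low < high) {
            int p = partition(a, low, high);   // pivot ends up at its final index p
            quickSort(a, low, p - 1);
            quickSort(a, p + 1, high);
        }
    }

    static int partition(int[] a, int low, int high) {
        int pivot = a[high];
        int i = low - 1;                       // boundary of the "less than pivot" region
        for (int j = low; j < high; j++) {
            if (a[j] < pivot) {
                i++;
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            }
        }
        int tmp = a[i + 1]; a[i + 1] = a[high]; a[high] = tmp;
        return i + 1;
    }
}
```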

33
New cards

What is quick sort’s average and worst-case time?

Average O(n log n); worst-case O(n²) (e.g., already sorted with poor pivot).

34
New cards

Is quick sort stable and in-place?

Typically in-place; not stable in standard in-place implementations.

35
New cards

T/F: Randomized pivot selection helps avoid quick sort’s worst case in practice.

TRUE

36
New cards

What is heap sort?

A comparison sort that builds a heap and repeatedly extracts the max (or min) to produce sorted order.

37
New cards

What is heap sort’s time and space complexity?

O(n log n) time; O(1) extra space (in-place with array-based heap).
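
A Java sketch of heap sort that builds a max-heap directly in the array and repeatedly moves the root to the end; names are illustrative.

```java
// A heap sort sketch: build a max-heap in place, then repeatedly swap the root
// to the end of the array and re-sift the shrinking heap prefix.
class HeapSortSketch {
    static void heapSort(int[] a) {
        int n = a.length;
        for (int i = n / 2 - 1; i >= 0; i--)              // build the max-heap bottom-up
            siftDown(a, i, n);
        for (int end = n - 1; end > 0; end--) {
            int tmp = a[0]; a[0] = a[end]; a[end] = tmp;  // move current max into final place
            siftDown(a, 0, end);                          // restore the heap on a[0..end-1]
        }
    }

    static void siftDown(int[] a, int i, int heapSize) {
        while (2 * i + 1 < heapSize) {
            int child = 2 * i + 1;
            if (child + 1 < heapSize && a[child + 1] > a[child]) child++;  // larger child
            if (a[i] >= a[child]) break;
            int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
            i = child;
        }
    }
}
```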

38
New cards

Is heap sort stable?

No, heap sort is not stable in standard implementations.

39
New cards

T/F: Heap sort is in-place and runs in O(n log n) worst-case time.

TRUE

40
New cards

What is bucket sort?

A non-comparison sort that distributes keys into a fixed number of buckets, sorts each bucket, and concatenates the buckets in order.

41
New cards

When does bucket sort run in linear time?

When keys are uniformly distributed so each bucket receives O(1) elements on average, giving an expected total time of O(n + k) for n elements and k buckets.
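
A Java sketch of bucket sort, assuming keys are doubles uniformly distributed in [0, 1) so that bucket index ⌊k·key⌋ spreads elements evenly; names are illustrative.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// A bucket sort sketch for doubles in [0, 1): distribute, sort each bucket, concatenate.
class BucketSortSketch {
    static void bucketSort(double[] a, int k) {
        List<List<Double>> buckets = new ArrayList<>();
        for (int i = 0; i < k; i++) buckets.add(new ArrayList<>());
        for (double x : a)
            buckets.get((int) (x * k)).add(x);        // bucket index = floor(k * key)
        int idx = 0;
        for (List<Double> bucket : buckets) {
            Collections.sort(bucket);                 // each bucket is expected to be small
            for (double x : bucket) a[idx++] = x;     // concatenate buckets in order
        }
    }
}
```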

42
New cards

Is bucket sort stable?

If stable sorting is used within buckets and concatenation preserves order, bucket sort can be stable.

43
New cards

T/F: Bucket sort’s performance depends on key distribution across buckets.

TRUE

44
New cards

What is radix sort?

A non-comparison sort that processes keys digit-by-digit using a stable subroutine (often counting sort).

45
New cards

What is radix sort’s time complexity?

O(d·(n + k)) for d digits and radix k; can be O(n) when d and k are bounded.
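
A Java sketch of LSD radix sort for non-negative ints with radix 10, using a stable counting pass per decimal digit; names and the radix choice are illustrative.

```java
// An LSD radix sort sketch: one stable counting pass per decimal digit.
class RadixSortSketch {
    static void radixSort(int[] a) {
        int max = 0;
        for (int x : a) max = Math.max(max, x);
        for (int exp = 1; max / exp > 0; exp *= 10)      // least significant digit first
            countingPass(a, exp);
    }

    static void countingPass(int[] a, int exp) {
        int[] output = new int[a.length];
        int[] count = new int[10];
        for (int x : a) count[(x / exp) % 10]++;               // histogram of this digit
        for (int d = 1; d < 10; d++) count[d] += count[d - 1]; // prefix sums = end positions
        for (int i = a.length - 1; i >= 0; i--)                // backward scan preserves stability
            output[--count[(a[i] / exp) % 10]] = a[i];
        System.arraycopy(output, 0, a, 0, a.length);
    }
}
```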

46
New cards

Is radix sort stable and what space does it use?

Stable if the digit pass is stable; extra space depends on the counting/bucket arrays used.

47
New cards

T/F: Radix sort assumes keys can be decomposed into digits from a finite alphabet.

TRUE

48
New cards

What is external sorting?

Sorting designed for data sets larger than memory, minimizing I/O by creating sorted runs and merging them.

49
New cards

How does external merge sort work?

Create sorted runs that fit in memory, then perform multiway merges to produce the final sorted output.
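
A Java sketch of the multiway-merge phase only, with the sorted runs modeled as in-memory lists standing in for run files on disk; a min-heap (PriorityQueue) always yields the smallest current element across the k runs. Names are illustrative, and records assume a recent Java version.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.PriorityQueue;

// A multiway-merge sketch: each heap entry pairs a run's current value with its iterator.
class MultiwayMergeSketch {
    record RunHead(int value, Iterator<Integer> rest) {}

    static List<Integer> merge(List<List<Integer>> sortedRuns) {
        PriorityQueue<RunHead> heap =
            new PriorityQueue<>((x, y) -> Integer.compare(x.value(), y.value()));
        for (List<Integer> run : sortedRuns) {
            Iterator<Integer> it = run.iterator();
            if (it.hasNext()) heap.add(new RunHead(it.next(), it));
        }
        List<Integer> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            RunHead head = heap.poll();          // smallest current element across all runs
            out.add(head.value());
            if (head.rest().hasNext())           // refill the heap from the same run
                heap.add(new RunHead(head.rest().next(), head.rest()));
        }
        return out;
    }
}
```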

50
New cards

What dominates external sorting cost?

I/O operations; algorithms aim to minimize disk reads/writes and passes.

51
New cards

T/F: External sorting emphasizes multiway merging and I/O efficiency over CPU comparisons.

TRUE

52
New cards

When would you choose insertion over merge?

On very small or nearly sorted arrays, where insertion’s low overhead and O(n) best case excel.

53
New cards

When would you choose merge over quick?

When worst-case guarantees and stability are required, or when sorting linked lists.

54
New cards

When would you choose quick over merge?

For in-memory arrays when average performance and low extra space are priorities.

55
New cards

When would you choose heap over quick?

When worst-case O(n log n) and in-place are required, and stability is not needed.

56
New cards

T/F: Merge sort is generally preferred for linked lists; quick sort is often preferred for arrays.

TRUE

57
New cards

Which sorts are stable by default?

Merge sort, insertion sort, bubble sort (and radix/bucket when stable subroutines are used).

58
New cards

Which sorts are in-place by default?

Quick sort, heap sort, insertion sort, selection sort, bubble sort (standard array forms).

59
New cards

Which sorts guarantee O(n log n) worst-case time?

Merge sort and heap sort (and some engineered quicksort variants with introspection).

60
New cards

T/F: Radix sort can achieve linear time when digit count and radix are bounded.

TRUE

61
New cards

Which sorts are best for nearly sorted data?

Insertion sort (O(n) best case) and bubble sort with early-exit optimization.

62
New cards

Which sorts are best for very large data on disk?

External merge sort with multiway merges and large block I/O.

63
New cards

T/F: Stability is essential when secondary keys must preserve the order of equal primary keys.

TRUE

64
New cards

Why might heap sort be slower in practice than quick sort?

Heap sort's scattered parent/child accesses hurt cache locality, and its larger constant factors make it slower in practice than quick sort's sequential partitioning.

65
New cards

Why is merge sort cache-friendly?

It processes arrays sequentially during merge, improving spatial locality compared to random heap accesses.

66
New cards

T/F: Quick sort’s partitioning typically has good cache locality on contiguous arrays.

TRUE

67
New cards

What is the best general-purpose in-memory sort?

Optimized quick sort (often with introsort fallback) due to average speed and small constant factors.

68
New cards

What hybrid techniques improve quick sort?

Randomized or median-of-three pivots, cutoff to insertion sort for small partitions, and introspective fallback to heap sort.
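
A Java sketch combining two of these techniques, median-of-three pivot selection and a cutoff to insertion sort for small partitions; the cutoff value of 16 and all names are illustrative, and the introspective heap-sort fallback is omitted for brevity.

```java
// A hybrid quick sort sketch: median-of-three pivot plus an insertion-sort cutoff.
class HybridQuickSortSketch {
    static final int CUTOFF = 16;   // assumed small-partition threshold

    static void sort(int[] a, int low, int high) {
        if (high - low < CUTOFF) {               // small range: insertion sort has less overhead
            insertionSort(a, low, high);
            return;
        }
        // Median-of-three: leave the median of a[low], a[mid], a[high] at a[high] as the pivot.
        int mid = low + (high - low) / 2;
        if (a[mid] < a[low]) swap(a, mid, low);
        if (a[high] < a[low]) swap(a, high, low);
        if (a[mid] < a[high]) swap(a, mid, high);
        int p = partition(a, low, high);
        sort(a, low, p - 1);
        sort(a, p + 1, high);
    }

    static int partition(int[] a, int low, int high) {  // Lomuto partition around a[high]
        int pivot = a[high], i = low - 1;
        for (int j = low; j < high; j++)
            if (a[j] < pivot) swap(a, ++i, j);
        swap(a, i + 1, high);
        return i + 1;
    }

    static void insertionSort(int[] a, int low, int high) {
        for (int i = low + 1; i <= high; i++) {
            int cur = a[i], j = i - 1;
            while (j >= low && a[j] > cur) a[j + 1] = a[j--];
            a[j + 1] = cur;
        }
    }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}
```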

69
New cards

T/F: Many library sorts use hybrids like introsort for robust performance.

TRUE

70
New cards

How do you preserve stability when using bucket or radix sort?

Use stable counting within passes and concatenate buckets without reordering.

71
New cards

What affects radix sort’s practical performance?

Digit width, radix choice, memory bandwidth, and stable counting implementation.

72
New cards

T/F: Counting sort is a stable linear-time subroutine often used inside radix sort passes.

TRUE

73
New cards

When is selection sort favored?

When writes are expensive and minimizing swaps matters, since selection sort performs O(n) swaps.

74
New cards

Which algorithm minimizes auxiliary memory among O(n log n) sorts?

Heap sort (O(1) extra space).

75
New cards

T/F: Merge sort’s Θ(n) extra space is a trade-off for stability and guaranteed O(n log n) time.

TRUE

76
New cards

Which sort is preferred for small subproblems in divide-and-conquer?

Insertion sort is commonly used below a cutoff threshold due to low overhead.

77
New cards

What risks lead to quick sort’s O(n²) behavior?

Poor pivot choices leading to highly unbalanced partitions (e.g., sorted input with first-element pivot).

78
New cards

T/F: Randomization mitigates adversarial inputs for quick sort.

TRUE

79
New cards

Which sort is easiest to implement correctly for guaranteed performance?

Merge sort (stable, predictable Θ(n log n) time).

80
New cards

Which sort offers the fewest comparisons on average among comparison sorts?

Optimized quick sort typically keeps comparisons and data moves low on average, making it the fastest comparison sort in practice; merge sort comes closer to the theoretical minimum comparison count but does more data movement.

81
New cards

T/F: Any comparison sort that always beats O(n log n) worst-case would violate the decision-tree lower bound.

TRUE

82
New cards

How do you choose between merge and radix for large numeric keys?

If keys fit fixed-width digits and memory supports linear passes, radix can be faster; merge is more general and stable for arbitrary comparators.

83
New cards

When is bucket sort a good fit?

When keys are uniformly distributed over a known range that maps well to buckets.

84
New cards

T/F: Mapping keys poorly to buckets can degrade bucket sort’s performance.

TRUE

85
New cards

Which sort is best for stable secondary sorting after a primary sort?

Any stable sort (e.g., merge or insertion) to preserve prior order on equal keys.
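
A short Java illustration: Arrays.sort on an object array is stable, so sorting by the secondary key first and then by the primary key leaves equal primary keys in secondary-key order. The Person record and its field values are made up for the example.

```java
import java.util.Arrays;
import java.util.Comparator;

// Stable secondary sorting: sort by the secondary key, then stably by the primary key.
class StableSecondarySortSketch {
    record Person(String first, String last) {}

    public static void main(String[] args) {
        Person[] people = {
            new Person("Ada", "Smith"), new Person("Bob", "Jones"),
            new Person("Carol", "Smith"), new Person("Dan", "Jones")
        };
        Arrays.sort(people, Comparator.comparing(Person::first)); // secondary key pass
        Arrays.sort(people, Comparator.comparing(Person::last));  // stable primary key pass
        System.out.println(Arrays.toString(people));
        // Equal last names keep first-name order: Bob Jones, Dan Jones, Ada Smith, Carol Smith.
    }
}
```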

86
New cards

Which sort is recommended for sorting linked lists?

Merge sort, because splitting and merging lists are efficient without random access.

87
New cards

T/F: Heap sort’s pattern of parent/child jumps can hurt cache performance compared to sequential merges.

TRUE

88
New cards

Which sort balances speed, space, and simplicity for medium datasets?

Quick sort with good pivoting and small-subarray insertion cutoff.

89
New cards

Sorting is studied in computer science for what three main reasons?

It illustrates creative approaches to problem solving, provides practice with fundamental programming techniques, and offers clear examples for demonstrating and comparing algorithm performance.

90
New cards

What does the insertion-sort algorithm do?

It sorts a list by repeatedly inserting a new element into a sorted sublist until the whole list is sorted.

91
New cards

T/F: Insertion sort works by repeatedly inserting elements into a sorted sublist.

TRUE

92
New cards

What is the worst-case time complexity of insertion sort?

O(n²).

93
New cards

T/F: The insertion sort has O(n²) worst-case time complexity.

TRUE

94
New cards

How does bubble sort operate?

It makes several passes through an array, comparing neighboring pairs and swapping them if they are not in order.

95
New cards

T/F: A bubble sort repeatedly swaps adjacent elements if they are out of order.

TRUE

96
New cards

What happens after each pass in bubble sort?

The largest remaining unsorted element settles into its final position at the end of the unsorted region.

97
New cards

What is bubble sort’s best-case scenario?

When the array is already sorted, requiring only one pass.

98
New cards

T/F: The bubble sort can finish in one pass if the array is already sorted.

TRUE

99
New cards

What is bubble sort’s worst-case time complexity?

O(n²).

100
New cards

What defines merge sort’s main strategy?

It recursively divides the array into halves, sorts each half, and merges the sorted halves back together.