Sorting Algorithms to Know for AP Computer Science A
What You Need to Know
Sorting is a classic AP CSA topic because it tests your ability to trace loops, reason about array/index behavior, and compare algorithm efficiency.
On the AP Computer Science A exam, the “big three” sorting algorithms to know are:
- Selection Sort
- Insertion Sort
- Merge Sort
You should be able to:
- Describe what each sort does in plain English.
- Trace a few iterations/passes on an array.
- Identify time complexity (best/average/worst) using O(\cdot) notation.
- Know key properties: in-place vs extra memory, stable vs unstable.
- Apply comparisons for primitives and objects (using Comparable / compareTo).
AP CSA vibe: You’re rarely asked to invent a new sort from scratch, but you are often asked to predict intermediate states or explain why a given implementation works.
Core definitions (must-know vocabulary)
- Sorted (ascending): for all adjacent indices k, a[k] \le a[k+1].
- In-place: uses only O(1) extra memory (besides a few variables).
- Stable sort: if two items are “equal” by the comparison, their relative order stays the same after sorting.
- Comparison sort: sorting by comparing elements (all three here are comparison sorts).
Comparable comparisons in Java
- For primitives like int, you compare with the relational operators (<, <=, >, >=, ==).
- For objects (like String), typical AP style uses x.compareTo(y):
  - Negative means x < y, zero means equal, positive means x > y.
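As a quick sanity check on the sign convention, here is a minimal sketch (the class name CompareToDemo and the string literals are just for illustration):

```java
public class CompareToDemo {
    public static void main(String[] args) {
        // Only the sign of compareTo matters, not its magnitude.
        System.out.println("apple".compareTo("bat") < 0);  // true: "apple" < "bat"
        System.out.println("cat".compareTo("cat") == 0);   // true: equal
        System.out.println("cat".compareTo("bat") > 0);    // true: "cat" > "bat"
    }
}
```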
Step-by-Step Breakdown
Below are “exam-traceable” versions of each algorithm. Focus on what becomes sorted after each outer loop / recursion level.
1) Selection Sort (repeat: find min, swap into place)
Idea: Build a sorted prefix from left to right. On pass i, find the minimum in the unsorted suffix [i..n-1] and swap it into index i.
Steps (array length n):
- For i = 0 to n-2:
- Set minIndex = i.
- For j = i+1 to n-1:
- If a[j] < a[minIndex], set minIndex = j.
- Swap a[i] with a[minIndex].
- After pass i, indices 0..i are sorted and finalized.
Mini trace example: sort [5, 2, 4, 1]
- Pass i=0: min is 1 at index 3 → swap with index 0 → [1, 2, 4, 5]
- Pass i=1: min of suffix [2, 4, 5] is already at index 1 → no change
Key decision point: selection sort does one swap per outer pass, after scanning.
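The steps above translate directly into Java. This is one standard sketch, assuming the common AP-style loop bounds (the class name SelectionSortDemo is just for illustration):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {      // outer: i = 0 to n-2
            int minIndex = i;                          // assume current spot holds the min
            for (int j = i + 1; j < a.length; j++) {   // inner: j = i+1 to n-1
                if (a[j] < a[minIndex]) {
                    minIndex = j;                      // remember where the min is
                }
            }
            int temp = a[i];                           // one swap per pass, after the scan
            a[i] = a[minIndex];
            a[minIndex] = temp;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 1};
        selectionSort(a);
        System.out.println(Arrays.toString(a));        // [1, 2, 4, 5]
    }
}
```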
2) Insertion Sort (repeat: insert current into sorted left side)
Idea: Treat the left side as a sorted “hand of cards.” Take the next element (the “key”) and shift larger elements right until you can insert the key.
Steps:
- For i = 1 to n-1:
- Set key = a[i], j = i-1.
- While j \ge 0 and a[j] > key:
- Shift right: a[j+1] = a[j]
- Decrement: j--
- Place key: a[j+1] = key
- After iteration i, indices 0..i are sorted.
Mini trace example: sort [5, 2, 4, 1]
- i=1, key=2: shift 5 right → [5, 5, 4, 1], insert 2 → [2, 5, 4, 1]
- i=2, key=4: shift 5 → [2, 5, 5, 1], insert 4 → [2, 4, 5, 1]
- i=3, key=1: shift 5, 4, 2 → insert 1 → [1, 2, 4, 5]
Key decision point: insertion sort’s inner loop does shifts, not repeated swaps (though swap-based versions exist).
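In Java, the shifting version above looks like this sketch (the class name InsertionSortDemo is just for illustration):

```java
import java.util.Arrays;

public class InsertionSortDemo {
    public static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {   // outer: i = 1 to n-1
            int key = a[i];                    // save the key before shifting
            int j = i - 1;
            while (j >= 0 && a[j] > key) {     // strict > keeps the sort stable
                a[j + 1] = a[j];               // shift right, not swap
                j--;
            }
            a[j + 1] = key;                    // key lands at j+1
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 1};
        insertionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 4, 5]
    }
}
```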
3) Merge Sort (divide, sort halves, merge)
Idea: Recursively split the array into halves until size 1, then merge sorted halves back together.
High-level steps:
- Split array segment [low..high] at mid.
- Recursively merge sort [low..mid].
- Recursively merge sort [mid+1..high].
- Merge the two sorted halves into one sorted segment.
Typical recursion base case:
- If low \ge high, return (segment length \le 1 is already sorted).
Merging procedure (two pointers):
- You have sorted left half [low..mid] and sorted right half [mid+1..high].
- Use pointers i (left), j (right), and k (temp array index).
- Repeatedly copy the smaller of a[i] and a[j] into temp.
- Copy any leftovers from left or right.
- Copy temp back into a[low..high].
Tiny merge example: merging [2,5] and [1,4]
- Compare 2 vs 1 → take 1
- Compare 2 vs 4 → take 2
- Compare 5 vs 4 → take 4
- Leftover 5 → take 5
- Result: [1, 2, 4, 5]
Critical reminder: merge sort’s speed comes from merging in linear time after splitting.
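The split, base case, and two-pointer merge described above can be sketched as follows, assuming the usual temp-array merge (the class name MergeSortDemo is just for illustration):

```java
import java.util.Arrays;

public class MergeSortDemo {
    public static void mergeSort(int[] a, int low, int high) {
        if (low >= high) return;               // base case: segment length <= 1
        int mid = (low + high) / 2;            // integer division
        mergeSort(a, low, mid);                // sort left half [low..mid]
        mergeSort(a, mid + 1, high);           // sort right half [mid+1..high]
        merge(a, low, mid, high);              // stitch the sorted halves
    }

    private static void merge(int[] a, int low, int mid, int high) {
        int[] temp = new int[high - low + 1];
        int i = low, j = mid + 1, k = 0;       // left, right, and temp pointers
        while (i <= mid && j <= high) {
            if (a[i] <= a[j]) {                // <= takes from the left first (stable)
                temp[k++] = a[i++];
            } else {
                temp[k++] = a[j++];
            }
        }
        while (i <= mid)  temp[k++] = a[i++];  // copy leftovers from the left half
        while (j <= high) temp[k++] = a[j++];  // then from the right half
        for (int t = 0; t < temp.length; t++) {
            a[low + t] = temp[t];              // copy temp back into a[low..high]
        }
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 7, 4};
        mergeSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [3, 4, 7, 8]
    }
}
```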
Key Formulas, Rules & Facts
Complexity + properties (the big comparison table)
| Algorithm | Best time | Average time | Worst time | Extra space | In-place? | Stable? | What becomes sorted when? |
|---|---|---|---|---|---|---|---|
| Selection sort | O(n^2) | O(n^2) | O(n^2) | O(1) | Yes | Usually no | After pass i: position i is final |
| Insertion sort | O(n) (already sorted) | O(n^2) | O(n^2) (reverse) | O(1) | Yes | Yes (with shifting version) | After i: prefix [0..i] sorted |
| Merge sort | O(n\log n) | O(n\log n) | O(n\log n) | O(n) | No (needs temp) | Yes (typical merge) | After merge: that segment sorted |
When each one is “the right mental model”
| Concept you’re tested on | Use this algorithm | Notes |
|---|---|---|
| “Find the next smallest and put it in the right spot” | Selection sort | Easy to trace; one swap per pass |
| “Build a sorted section as you go; shift to insert” | Insertion sort | Great for nearly sorted arrays; stable |
| “Divide and conquer; combine sorted halves” | Merge sort | Fastest here asymptotically; uses extra memory |
Loop bounds you should memorize (common AP implementations)
- Selection sort
- Outer: i = 0 to n-2
- Inner: j = i+1 to n-1
- Insertion sort
- Outer: i = 1 to n-1
- Inner while: j \ge 0 and out-of-order condition
- Merge sort
- Base case: low \ge high
- Midpoint: often mid = (low + high) / 2 (integer division)
Stability details (subtle but testable)
- Insertion sort is stable if you only shift items when a[j] > key (strictly greater). If you shift on \ge, you can break stability.
- Merge sort is stable if, when elements are equal, you take from the left half first.
- Selection sort is usually unstable because swapping the min into place can move equal items past each other.
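To see that instability concretely, here is a hypothetical sketch that selection-sorts strings by length only, so equal-length strings count as "equal." The pass-0 swap carries "bb" past "aa", flipping their original order:

```java
import java.util.Arrays;

public class StabilityDemo {
    // Selection sort on strings, comparing by length only.
    public static void selectionSortByLength(String[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j].length() < a[minIndex].length()) {
                    minIndex = j;
                }
            }
            String temp = a[i];                 // this swap can jump an element
            a[i] = a[minIndex];                 // past equal items to its right
            a[minIndex] = temp;
        }
    }

    public static void main(String[] args) {
        // "bb" and "aa" are "equal" by length, with "bb" first.
        String[] words = {"bb", "aa", "c"};
        selectionSortByLength(words);
        // Pass 0 swaps "bb" with "c", moving "bb" after "aa":
        System.out.println(Arrays.toString(words)); // [c, aa, bb]
    }
}
```

A stable sort would have kept "bb" before "aa", since they compare as equal.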
Comparing objects (AP CSA style)
- For ascending order using compareTo:
  - "Out of order" check: arr[j].compareTo(key) > 0
- For descending order:
  - Flip the comparison direction.
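Putting this together, one sketch of ascending insertion sort for objects (the class name ObjectSortDemo is hypothetical) uses the sign of compareTo as the out-of-order check:

```java
public class ObjectSortDemo {
    // Insertion sort for Strings, ascending, using compareTo.
    public static void insertionSort(String[] arr) {
        for (int i = 1; i < arr.length; i++) {
            String key = arr[i];
            int j = i - 1;
            // Out-of-order check uses the sign of compareTo, never < or ==.
            while (j >= 0 && arr[j].compareTo(key) > 0) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }
}
```

For descending order, flip the check to arr[j].compareTo(key) < 0.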
Examples & Applications
Example 1: Identify which algorithm matches the behavior
Array state changes:
- Start: [4, 1, 3, 2]
- After pass 1: [1, 4, 3, 2]
- After pass 2: [1, 2, 3, 4]
Key insight: After pass 1, the smallest element is in position 0. After pass 2, the second smallest is in position 1. That’s classic selection sort behavior (min-search + swap each pass).
Example 2: Insertion sort intermediate state question
Start: [2, 5, 4, 6, 1]
- After i=2 (inserting the element at index 2), the left prefix becomes sorted: [2, 4, 5, 6, 1]
Key insight: insertion sort ensures the prefix [0..i] is sorted after each outer iteration.
Example 3: Merge sort split/merge reasoning
Given [8, 3, 7, 4]:
- Split into [8, 3] and [7, 4]
- Sort halves → [3, 8] and [4, 7]
- Merge → [3, 4, 7, 8]
Key insight: merge sort’s “magic step” is merging two sorted lists in linear time.
Example 4: Sorting objects with compareTo
If you have String[] words = {"bat", "apple", "cat"};
- Lexicographic order uses compareTo.
- Sorted result: ["apple", "bat", "cat"]
Key insight: you don’t use < with objects; you use compareTo and check sign.
Common Mistakes & Traps
Off-by-one in selection sort loops
- Wrong: running outer loop to n-1 or inner loop starting at i.
- Why wrong: redundant comparisons or missing needed comparisons.
- Fix: outer i = 0..n-2, inner j = i+1..n-1.
Swapping inside the inner loop in selection sort
- Wrong: swapping every time you see a new minimum.
- Why wrong: it can still sort, but it’s no longer the standard selection sort trace and complicates intermediate states.
- Fix: track minIndex during inner scan; do one swap at the end.
Insertion sort: forgetting to save the key
- Wrong: shifting overwrites a[i] before you store it.
- Why wrong: you lose the value you're trying to insert.
- Fix: store key = a[i] before shifting.
Insertion sort: placing the key at the wrong index
- Wrong: doing a[j] = key after the loop.
- Why wrong: when the loop stops, j is either -1 or the index of the last element \le key, so the correct insertion position is j+1.
- Fix: always assign a[j + 1] = key.
Merge sort: incorrect base case
- Wrong: using something like if (low == high) return; which can skip or loop incorrectly with certain mid calculations.
- Why wrong: for safety, you want to stop for any segment of length \le 1.
- Fix: if (low >= high) return;
Merge step: not copying leftovers
- Wrong: after the main merge loop ends, not copying remaining elements from one half.
- Why wrong: you drop values and the result is missing elements.
- Fix: after merging while both halves have elements, copy remaining left half, then remaining right half.
Stability misunderstandings
- Wrong: assuming all sorts are stable or that stability “doesn’t matter.”
- Why wrong: stability can affect results when sorting objects by one key then another.
- Fix: remember: insertion/merge are stable (typical), selection usually isn’t.
Using == to compare objects
- Wrong: if (a[j] == key) or using < on objects.
- Why wrong: == compares references, not ordering.
- Fix: use compareTo (or equals for equality checks only).
Memory Aids & Quick Tricks
| Trick / mnemonic | Helps you remember | When to use |
|---|---|---|
| Selection = “Select min, Swap” | Scan for smallest in the unsorted part, then swap into place | Tracing selection sort passes |
| Insertion = “Cards in hand” | You pull one card (key), shift bigger cards right, insert key | Visualizing insertion’s shifts |
| Merge = “Split then Stitch” | Divide array into halves, then merge (stitch) in sorted order | Keeping merge sort steps straight |
| Sorted prefix rule | Selection and insertion both grow a sorted left side | Quick check during tracing |
| Merge pointers: i, j, k | Left pointer i, right pointer j, temp pointer k | Avoid merge-index bugs |
Quick Review Checklist
- You can explain and trace selection, insertion, and merge sort.
- You know what’s sorted after each step:
- Selection: position i finalized each pass.
- Insertion: prefix [0..i] sorted each iteration.
- Merge: after merge, that subarray segment sorted.
- You know time complexities:
- Selection: O(n^2) always.
- Insertion: best O(n), worst O(n^2).
- Merge: O(n\log n) always (typical).
- You know space/stability:
- Selection: in-place, usually unstable.
- Insertion: in-place, stable (shifting version).
- Merge: O(n) extra, stable (typical merge).
- You can compare objects correctly using compareTo.
- You avoid classic indexing mistakes: loop bounds, j+1 insertion, copying merge leftovers.
You’ve got this—if you can trace these three cleanly, you’re in great shape for AP CSA sorting questions.