Sorting Algorithms to Know for AP Computer Science A
What You Need to Know
Sorting is a classic AP CSA topic because it tests your ability to trace loops, reason about array/index behavior, and compare algorithm efficiency.
On the AP Computer Science A exam, the “big three” sorting algorithms to know are:
- Selection Sort
- Insertion Sort
- Merge Sort
You should be able to:
- Describe what each sort does in plain English.
- Trace a few iterations/passes on an array.
- Identify time complexity (best/average/worst) using notation.
- Know key properties: in-place vs extra memory, stable vs unstable.
- Apply comparisons for primitives and objects (using Comparable / compareTo).
AP CSA vibe: You’re rarely asked to invent a new sort from scratch, but you are often asked to predict intermediate states or explain why a given implementation works.
Core definitions (must-know vocabulary)
- Sorted (ascending): for all adjacent indices `i`, `a[i] <= a[i + 1]`.
- In-place: uses only `O(1)` extra memory (besides a few variables).
- Stable sort: if two items are “equal” by the comparison, their relative order stays the same after sorting.
- Comparison sort: sorting by comparing elements (all three here are comparison sorts).
Comparable comparisons in Java
- For primitives like `int`, you compare with `<`, `>`, `==`.
- For objects (like `String`), typical AP style uses `x.compareTo(y)`:
  - Negative means `x` comes before `y`, zero means equal, positive means `x` comes after `y`.
Step-by-Step Breakdown
Below are “exam-traceable” versions of each algorithm. Focus on what becomes sorted after each outer loop / recursion level.
1) Selection Sort (repeat: find min, swap into place)
Idea: Build a sorted prefix from left to right. On pass `i`, find the minimum in the unsorted suffix `a[i..n-1]` and swap it into index `i`.
Steps (array length `n`):
- For `i = 0` to `n - 2`:
  - Set `minIndex = i`.
  - For `j = i + 1` to `n - 1`:
    - If `a[j] < a[minIndex]`, set `minIndex = j`.
  - Swap `a[i]` with `a[minIndex]`.
- After pass `i`, indices `0..i` are sorted and finalized.
Mini trace example: sort [5, 2, 4, 1]
- Pass `i = 0`: min `1` is at index 3 → swap with index 0 → `[1, 2, 4, 5]`
- Pass `i = 1`: min of the suffix `[2, 4, 5]` is already at index 1 → no change
Key decision point: selection sort does one swap per outer pass, after scanning.
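The steps above can be sketched as a short Java method (the class and method names here are illustrative, not official exam code):

```java
public class SelectionSortDemo {
    // Sorts a in ascending order with selection sort:
    // scan the unsorted suffix for the minimum, then do ONE swap per pass.
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int minIndex = i;                       // assume current position holds the min
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[minIndex]) {
                    minIndex = j;                   // remember where the smaller value is
                }
            }
            int temp = a[i];                        // single swap at the end of the pass
            a[i] = a[minIndex];
            a[minIndex] = temp;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 1};
        selectionSort(a);
        System.out.println(java.util.Arrays.toString(a)); // prints [1, 2, 4, 5]
    }
}
```

Note that the swap happens after the inner scan finishes, which is exactly why position `i` is final after each pass.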
2) Insertion Sort (repeat: insert current into sorted left side)
Idea: Treat the left side as a sorted “hand of cards.” Take the next element (the “key”) and shift larger elements right until you can insert the key.
Steps:
- For `i = 1` to `n - 1`:
  - Set `key = a[i]`, `j = i - 1`.
  - While `j >= 0` and `a[j] > key`:
    - Shift right: `a[j + 1] = a[j]`
    - Decrement: `j--`
  - Place key: `a[j + 1] = key`
- After iteration `i`, indices `0..i` are sorted.
Mini trace example: sort [5, 2, 4, 1]
- `i = 1`, key = 2: shift 5 right → `[5, 5, 4, 1]`, insert 2 → `[2, 5, 4, 1]`
- `i = 2`, key = 4: shift 5 → `[2, 5, 5, 1]`, insert 4 → `[2, 4, 5, 1]`
- `i = 3`, key = 1: shift 5, 4, 2 → insert 1 → `[1, 2, 4, 5]`
Key decision point: insertion sort’s inner loop does shifts, not repeated swaps (though swap-based versions exist).
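The shifting version of those steps looks like this in Java (an illustrative sketch, not official exam code):

```java
public class InsertionSortDemo {
    // Sorts a in ascending order with insertion sort:
    // shift larger elements right, then drop the key into the gap.
    public static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];                  // save the key BEFORE shifting overwrites it
            int j = i - 1;
            while (j >= 0 && a[j] > key) {   // strict > keeps the sort stable
                a[j + 1] = a[j];             // shift, don't swap
                j--;
            }
            a[j + 1] = key;                  // j stopped one spot left of the key's position
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 1};
        insertionSort(a);
        System.out.println(java.util.Arrays.toString(a)); // prints [1, 2, 4, 5]
    }
}
```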
3) Merge Sort (divide, sort halves, merge)
Idea: Recursively split the array into halves until each segment has size ≤ 1, then merge sorted halves back together.
High-level steps:
- Split the array segment at `mid = (low + high) / 2`.
- Recursively merge sort the left half `a[low..mid]`.
- Recursively merge sort the right half `a[mid + 1..high]`.
- Merge the two sorted halves into one sorted segment.
Typical recursion base case:
- If `low >= high`, return (a segment of length ≤ 1 is already sorted).
Merging procedure (two pointers):
- You have sorted left half `a[low..mid]` and sorted right half `a[mid + 1..high]`.
- Use pointers `i` (left), `j` (right), and `k` (temp array index).
- Repeatedly copy the smaller of `a[i]` and `a[j]` into temp.
- Copy any leftovers from left or right.
- Copy temp back into `a[low..high]`.
Tiny merge example: merging [2,5] and [1,4]
- Compare 2 vs 1 → take 1
- Compare 2 vs 4 → take 2
- Compare 5 vs 4 → take 4
- Leftover 5 → take 5
- Result: `[1, 2, 4, 5]`
Critical reminder: merge sort’s speed comes from merging in linear time after splitting.
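A runnable sketch of the recursion plus the two-pointer merge (the `low`/`high` inclusive bounds follow the common AP style described above; names are illustrative):

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Recursively sorts a[low..high] (inclusive) in ascending order.
    public static void mergeSort(int[] a, int low, int high) {
        if (low >= high) {               // base case: segment of length <= 1
            return;
        }
        int mid = (low + high) / 2;      // integer division
        mergeSort(a, low, mid);          // sort left half
        mergeSort(a, mid + 1, high);     // sort right half
        merge(a, low, mid, high);        // stitch the sorted halves together
    }

    // Merges sorted halves a[low..mid] and a[mid+1..high] in linear time.
    private static void merge(int[] a, int low, int mid, int high) {
        int[] temp = new int[high - low + 1];
        int i = low, j = mid + 1, k = 0;
        while (i <= mid && j <= high) {
            if (a[i] <= a[j]) {          // <= takes from the left first: stable
                temp[k++] = a[i++];
            } else {
                temp[k++] = a[j++];
            }
        }
        while (i <= mid)  temp[k++] = a[i++];   // leftovers from the left half
        while (j <= high) temp[k++] = a[j++];   // leftovers from the right half
        for (int t = 0; t < temp.length; t++) {
            a[low + t] = temp[t];        // copy temp back into the original segment
        }
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 7, 4};
        mergeSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // prints [3, 4, 7, 8]
    }
}
```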
Key Formulas, Rules & Facts
Complexity + properties (the big comparison table)
| Algorithm | Best time | Average time | Worst time | Extra space | In-place? | Stable? | What becomes sorted when? |
|---|---|---|---|---|---|---|---|
| Selection sort | O(n²) | O(n²) | O(n²) | O(1) | Yes | Usually no | After pass i: position i is final |
| Insertion sort | O(n) (already sorted) | O(n²) | O(n²) (reverse) | O(1) | Yes | Yes (with shifting version) | After iteration i: prefix 0..i sorted |
| Merge sort | O(n log n) | O(n log n) | O(n log n) | O(n) | No (needs temp) | Yes (typical merge) | After merge: that segment sorted |
When each one is “the right mental model”
| Concept you’re tested on | Use this algorithm | Notes |
|---|---|---|
| “Find the next smallest and put it in the right spot” | Selection sort | Easy to trace; one swap per pass |
| “Build a sorted section as you go; shift to insert” | Insertion sort | Great for nearly sorted arrays; stable |
| “Divide and conquer; combine sorted halves” | Merge sort | Fastest here asymptotically; uses extra memory |
Loop bounds you should memorize (common AP implementations)
- Selection sort
  - Outer: `i = 0` to `n - 2`
  - Inner: `j = i + 1` to `n - 1`
- Insertion sort
  - Outer: `i = 1` to `n - 1`
  - Inner while: `j >= 0` and out-of-order condition
- Merge sort
  - Base case: `low >= high`
  - Midpoint: often `mid = (low + high) / 2` (integer division)
Stability details (subtle but testable)
- Insertion sort is stable if you only shift items when `a[j] > key` (strictly greater). If you shift on `a[j] >= key`, you can break stability.
- Merge sort is stable if, when elements are equal, you take from the left half first.
- Selection sort is usually unstable because swapping the min into place can move equal items past each other.
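To see selection sort's instability concretely, here is a sketch that sorts strings by length (the length key and names are illustrative assumptions, not from the exam): the pass-0 swap jumps one equal-length item past another.

```java
import java.util.Arrays;

public class StabilityDemo {
    // Selection sort keyed on string length: the swap can move
    // equal-length items past each other, breaking their original order.
    public static void selectionByLength(String[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j].length() < a[minIndex].length()) {
                    minIndex = j;
                }
            }
            String temp = a[i];
            a[i] = a[minIndex];
            a[minIndex] = temp;
        }
    }

    public static void main(String[] args) {
        // "bb" and "cc" are equal by length, with "bb" first.
        String[] a = {"bb", "cc", "a"};
        selectionByLength(a);
        // Pass 0 swapped "bb" to the end, past "cc": equals flipped.
        System.out.println(Arrays.toString(a)); // prints [a, cc, bb]
    }
}
```

A stable sort (insertion or merge) on the same input would keep "bb" before "cc".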
Comparing objects (AP CSA style)
- For ascending order using `compareTo`:
  - "Out of order" check: `arr[j].compareTo(key) > 0`
- For descending order:
  - Flip the comparison direction: `arr[j].compareTo(key) < 0`
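Putting the check in context, here is an object version of insertion sort for `String` (a sketch; the array name and sample words are illustrative):

```java
public class CompareToDemo {
    // Insertion sort for Strings, ascending, using compareTo.
    public static void insertionSort(String[] arr) {
        for (int i = 1; i < arr.length; i++) {
            String key = arr[i];
            int j = i - 1;
            // "Out of order" check for ascending: arr[j].compareTo(key) > 0.
            // For descending order, flip it to < 0.
            while (j >= 0 && arr[j].compareTo(key) > 0) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        String[] arr = {"pear", "fig", "kiwi"};
        insertionSort(arr);
        System.out.println(java.util.Arrays.toString(arr)); // prints [fig, kiwi, pear]
    }
}
```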
Examples & Applications
Example 1: Identify which algorithm matches the behavior
Array state changes:
- Start: `[4, 1, 3, 2]`
- After pass 1: `[1, 4, 3, 2]`
- After pass 2: `[1, 2, 3, 4]`
Key insight: After pass 1, the smallest element is in position 0. After pass 2, the second smallest is in position 1. That’s classic selection sort behavior (min-search + swap each pass).
Example 2: Insertion sort intermediate state question
Start: `[2, 5, 4, 6, 1]`
- After `i = 2` (inserting the element at index 2):
  - Left prefix becomes sorted: `[2, 4, 5, 6, 1]`
Key insight: insertion sort ensures the prefix is sorted after each outer iteration.
Example 3: Merge sort split/merge reasoning
Given [8, 3, 7, 4]:
- Split into `[8, 3]` and `[7, 4]`
- Sort halves → `[3, 8]` and `[4, 7]`
- Merge → `[3, 4, 7, 8]`
Key insight: merge sort’s “magic step” is merging two sorted lists in linear time.
Example 4: Sorting objects with compareTo
If you have String[] words = {"bat", "apple", "cat"};
- Lexicographic order uses `compareTo`.
- Sorted result: `["apple", "bat", "cat"]`
Key insight: you don’t use `<` or `>` with objects; you use `compareTo` and check the sign.
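The sign convention is easy to check directly (note that `compareTo` guarantees only the sign, not specifically -1/0/1):

```java
public class CompareSignDemo {
    public static void main(String[] args) {
        // Check the sign with < 0, == 0, > 0 rather than comparing to -1 or 1.
        System.out.println("apple".compareTo("bat") < 0);  // true: "apple" comes first
        System.out.println("bat".compareTo("bat") == 0);   // true: equal
        System.out.println("cat".compareTo("bat") > 0);    // true: "cat" comes after
    }
}
```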
Common Mistakes & Traps
Off-by-one in selection sort loops
- Wrong: running the outer loop to `i = n - 1` or starting the inner loop at `j = i`.
- Why wrong: redundant comparisons or missing needed comparisons.
- Fix: outer `i = 0` to `n - 2`, inner `j = i + 1` to `n - 1`.
Swapping inside the inner loop in selection sort
- Wrong: swapping every time you see a new minimum.
- Why wrong: it can still sort, but it’s no longer the standard selection sort trace and complicates intermediate states.
- Fix: track `minIndex` during the inner scan; do one swap at the end.
Insertion sort: forgetting to save the key
- Wrong: shifting overwrites `a[i]` before you store it.
- Why wrong: you lose the value you’re trying to insert.
- Fix: store `key = a[i]` before shifting.
Insertion sort: placing the key at the wrong index
- Wrong: doing `a[j] = key` after the loop.
- Why wrong: when the loop stops, `j` is at the last index that is `<=` key (or `-1`), so the correct position is `j + 1`.
- Fix: always assign `a[j + 1] = key`.
Merge sort: incorrect base case
- Wrong: using something like `if (low == high) return;` — splitting can then skip elements or recurse incorrectly with certain mid calculations.
- Why wrong: for safety, you want to stop for any segment of length ≤ 1.
- Fix: `if (low >= high) return;`
Merge step: not copying leftovers
- Wrong: after the main merge loop ends, not copying remaining elements from one half.
- Why wrong: you drop values and the result is missing elements.
- Fix: after merging while both halves have elements, copy remaining left half, then remaining right half.
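A standalone merge helper makes the leftover-copy loops explicit (the method and class names are illustrative):

```java
import java.util.Arrays;

public class MergeLeftoversDemo {
    // Merges two already-sorted arrays into one sorted array,
    // including the leftover-copy loops that are easy to forget.
    public static int[] merge(int[] left, int[] right) {
        int[] temp = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            if (left[i] <= right[j]) temp[k++] = left[i++];
            else                     temp[k++] = right[j++];
        }
        while (i < left.length)  temp[k++] = left[i++];   // leftovers from the left
        while (j < right.length) temp[k++] = right[j++];  // leftovers from the right
        return temp;
    }

    public static void main(String[] args) {
        // [2, 5] runs out first; the leftover 9 must still be copied.
        int[] merged = merge(new int[]{2, 5}, new int[]{1, 4, 9});
        System.out.println(Arrays.toString(merged)); // prints [1, 2, 4, 5, 9]
    }
}
```

Dropping either leftover loop would silently lose elements, which is exactly the bug described above.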
Stability misunderstandings
- Wrong: assuming all sorts are stable or that stability “doesn’t matter.”
- Why wrong: stability can affect results when sorting objects by one key then another.
- Fix: remember: insertion/merge are stable (typical), selection usually isn’t.
Using `==` to compare objects
- Wrong: `if (a[j] == key)` or using `<` / `>` on objects.
- Why wrong: `==` compares references, not ordering.
- Fix: use `compareTo` (or `equals` for equality checks only).
Memory Aids & Quick Tricks
| Trick / mnemonic | Helps you remember | When to use |
|---|---|---|
| Selection = “Select min, Swap” | Scan for smallest in the unsorted part, then swap into place | Tracing selection sort passes |
| Insertion = “Cards in hand” | You pull one card (key), shift bigger cards right, insert key | Visualizing insertion’s shifts |
| Merge = “Split then Stitch” | Divide array into halves, then merge (stitch) in sorted order | Keeping merge sort steps straight |
| Sorted prefix rule | Selection and insertion both grow a sorted left side | Quick check during tracing |
| Merge pointers: `i`, `j`, `k` | Left pointer `i`, right pointer `j`, temp pointer `k` | Avoid merge-index bugs |
Quick Review Checklist
- You can explain and trace selection, insertion, and merge sort.
- You know what’s sorted after each step:
- Selection: position `i` finalized each pass.
- Insertion: prefix `0..i` sorted each iteration.
- Merge: after merge, that subarray segment sorted.
- You know time complexities:
- Selection: O(n²) always.
- Insertion: best O(n), worst O(n²).
- Merge: O(n log n) always (typical).
- You know space/stability:
- Selection: in-place, usually unstable.
- Insertion: in-place, stable (shifting version).
- Merge: O(n) extra, stable (typical merge).
- You can compare objects correctly using `compareTo`.
- You avoid classic indexing mistakes: loop bounds, insertion placement, copying merge leftovers.
You’ve got this—if you can trace these three cleanly, you’re in great shape for AP CSA sorting questions.