Sorting Algorithm Time Complexities/Details

Last updated 1:22 AM on 4/29/26

56 Terms

1
New cards

Bubble Sort (worst time complexity)

O(n²); We compare every element to every other element, which is approximately n times n work.

2
New cards

Bubble Sort (average time complexity)

O(n²); On average, it performs more closely to its worst case.

3
New cards

Bubble Sort (best time complexity)

O(n); If we have a separate boolean flag to check if the list is already sorted, we can detect this in one pass. Otherwise, this is O(n²).

4
New cards

in-place

The sorting algorithm uses O(1) auxiliary space

5
New cards

adaptive

The sorting algorithm is adaptive if it takes advantage of pre-existing order in the input data (aka if the data is “nearly” sorted, it takes less time to sort than if the data is completely unsorted)

6
New cards

stable

A sorting algorithm is stable if it preserves the relative order of elements that are equal (for example, in the list [4, 6, 2L, 3, 1, 2R, 7], the sorted list should be [1, 2L, 2R, 3, 4, 6, 7])

7
New cards

Bubble Sort (characteristics)

bubble sort is adaptive, in-place, and stable if you include the rule that two items with the same value can’t be swapped.

8
New cards

[5, 4, 2, 3, 1, 6, 10, 7] → what does the list look like after three passes of bubble sort?

[2, 1, 3, 4, 5, 6, 7, 10]

9
New cards

Bubble Sort (description)

arranges a list of elements by repeatedly swapping adjacent items if they are in the wrong order
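The repeated-adjacent-swap idea, plus the early-exit flag mentioned in the best-case card, can be sketched in Python (a minimal illustration, not part of the original deck):

```python
def bubble_sort(a):
    """Bubble sort: repeatedly swap adjacent out-of-order items."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):   # the last i items are already in place
            if a[j] > a[j + 1]:      # swap only when strictly greater (stable)
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:              # no swaps: already sorted, O(n) best case
            break
    return a
```

The `swapped` flag is what turns an already-sorted input into a single O(n) pass; without it, every pass still runs.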

10
New cards

Cocktail Shaker Sort (description)

sorting algorithm that does bubble sort in both directions (forward and backward); allows small elements near the end of the list to be “bubbled down” to the front quickly
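A minimal Python sketch of the two-direction sweep (not from the original deck; it assumes each pair of sweeps shrinks the unsorted window from both ends):

```python
def cocktail_shaker_sort(a):
    """Cocktail shaker sort: bubble sort sweeping in both directions."""
    a = list(a)
    lo, hi = 0, len(a) - 1
    swapped = True
    while swapped and lo < hi:
        swapped = False
        for i in range(lo, hi):            # forward sweep: largest bubbles up
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        hi -= 1
        for i in range(hi, lo, -1):        # backward sweep: smallest bubbles down
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
                swapped = True
        lo += 1
    return a
```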

11
New cards

Cocktail Shaker Sort (best time complexity)

O(n); If we have a separate boolean flag to check if the list is already sorted, we can detect this in one pass. Otherwise, this is O(n²).

12
New cards

Cocktail Shaker Sort (average time complexity)

O(n²); On average, it performs more closely to its worst case.

13
New cards

Cocktail Shaker Sort (worst time complexity)

O(n²); We may have to do up to n swaps for each of n items.

14
New cards

Cocktail Shaker Sort (characteristics)

Same as bubble sort (stable, adaptive, in place)

15
New cards

[5, 4, 2, 3, 1, 6, 10, 7] → what does this list look like after two passes of cocktail shaker sort?

[1, 2, 3, 4, 5, 6, 7, 10] → the list is fully sorted after the forward sweep of pass #2

16
New cards

Insertion Sort

builds the sorted list one element at a time
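The one-element-at-a-time idea can be sketched in Python (a minimal illustration, not part of the original deck):

```python
def insertion_sort(a):
    """Insertion sort: grow a sorted prefix one element at a time."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger items right; strict > keeps it stable
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop key into its sorted spot
    return a
```

On an already-sorted input the inner `while` never runs, which is the O(n) best case; on a reverse-sorted input every element shifts all the way left, giving the O(n²) worst case.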

17
New cards

insertion sort (worst case time complexity)

O(n²); If the initial array is reverse-sorted, we have to move each element to the beginning of the array one spot at a time, which is approximately n times n work.

18
New cards

insertion sort (average case time complexity)

O(n²); On average, it performs more closely to its worst case.

19
New cards

Insertion sort (best case time complexity)

O(n); If the initial array is already sorted, we check each element once and do not have to move any towards the front.

20
New cards

Insertion Sort (characteristics)

stable (if we add the rule to avoid swapping equal items), adaptive, and in-place

21
New cards

[10,9,9,2,4,12] → what does the list look like after

do later

22
New cards

Selection Sort (best case time complexity)

O(n²); Finding the largest or smallest element in the unsorted portion of the array requires n comparisons, and we do this n times.

23
New cards

Selection Sort (worst case time complexity)

O(n²); Finding the largest or smallest element in the unsorted portion of the array requires n comparisons, and we do this n times.

24
New cards

Selection Sort (average case time complexity)

O(n²); Finding the largest or smallest element in the unsorted portion of the array requires n comparisons, and we do this n times.

25
New cards

Selection Sort (description)

repeatedly finds the smallest (or largest) element in the unsorted portion and moves it to the beginning; performs fewer swaps than any other sort
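The find-the-minimum-then-swap structure can be sketched in Python (a minimal illustration, not part of the original deck); note the single swap per pass, which is why selection sort minimizes swaps:

```python
def selection_sort(a):
    """Selection sort: move the minimum of the unsorted part to the front."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):     # scan the unsorted portion for the minimum
            if a[j] < a[m]:
                m = j
        if m != i:
            a[i], a[m] = a[m], a[i]   # at most one swap per pass
    return a
```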

26
New cards

Selection Sort (characteristics)

in place

27
New cards

Selection Sort (Use Cases)

Has the fewest swaps out of all the sorts; if swapping is expensive, use selection sort.

Allocating memory is expensive, so if each swap requires some allocation, we’d want to use selection sort.

28
New cards

selection sort example - do this later

do later

29
New cards

Merge Sort (worst case time complexity)

O(n log n); Merge sort always recursively divides the array into halves (for a total of log n levels) and merges each level (which is an O(n) operation).

30
New cards

Merge Sort (average case time complexity)

O(n log n); Merge sort always recursively divides the array into halves (for a total of log n levels) and merges each level (which is an O(n) operation).

31
New cards

Merge Sort (best case time complexity)

O(n log n); Even in the best case, merge sort still recursively divides the array into halves (for a total of log n levels) and merges each level (which is an O(n) operation).

32
New cards

the “worst” sort

bubble; wastes multiple comparisons on the same data

33
New cards

MergeSort (characteristics)

stable

34
New cards

MergeSort (description)

Divide and Conquer Sort; break array into smaller arrays, sort the smaller arrays, then combine the sorted smaller arrays into a larger sorted array
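The divide-and-conquer structure can be sketched in Python (a minimal top-down version, not part of the original deck):

```python
def merge_sort(a):
    """Merge sort: split in half, sort each half, merge the results."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step: O(n) per level
        if left[i] <= right[j]:               # <= keeps equal items in order (stable)
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```

The halving gives log n levels and each level does O(n) merge work, which is where O(n log n) comes from in every case.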

35
New cards

merge sort (use cases)

scenarios where data access is restricted, stability is required, or datasets exceed available memory

36
New cards

Quick Sort (description)

pick a pivot element, move all elements smaller than the pivot to its left and all larger elements to its right, then recursively pick a pivot from the left and right sublists and repeat
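The pivot-and-partition step can be sketched in Python (a minimal in-place version using Lomuto partitioning with the last element as pivot; this is one common scheme, not necessarily the one the deck assumes):

```python
def quick_sort(a, lo=0, hi=None):
    """In-place quicksort with Lomuto partitioning (last element as pivot)."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:               # smaller-than-pivot items go left
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]          # pivot lands between the two partitions
    quick_sort(a, lo, i - 1)
    quick_sort(a, i + 1, hi)
    return a
```

If the pivot is always the min or max, one side of each partition is empty and the recursion degenerates to the O(n²) worst case.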

37
New cards

QuickSort (characteristics)

in place

38
New cards

QuickSort (worst case time complexity)

O(n²); The worst pivot (min or max) is chosen each partition, devolving into a selection sort.

39
New cards

QuickSort (average case time complexity)

O(n log n); On average, the data is split roughly in half each partition.

40
New cards

QuickSort (best case time complexity)

O(n log n); The pivot is perfect (the median) in every partition.

41
New cards

Quicksort use cases

quicksort is used for most general-purpose sorting libraries because of its efficiency

42
New cards

Kth Select (description)

works very similarly to quicksort, but instead of recursively sorting both sides of the pivot, it only recurses into the side containing the kth element

basically, compare the pivot’s index (after partitioning) to k. if k is less, recurse left; if k is greater, recurse right; if they’re equal, the pivot is the kth element.
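That comparison of the pivot index against k can be sketched in Python (an iterative version with Lomuto partitioning, not from the original deck; k is 0-indexed here):

```python
def quickselect(a, k):
    """Return the k-th smallest element (0-indexed) of an unsorted list."""
    a = list(a)
    lo, hi = 0, len(a) - 1
    while True:
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):        # partition a[lo..hi] around the pivot
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]      # pivot is now at its final index i
        if k == i:
            return a[i]
        elif k < i:
            hi = i - 1                 # keep only the left side
        else:
            lo = i + 1                 # keep only the right side
```

Discarding one side each round is what drops the average cost from O(n log n) to O(n).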

43
New cards

kth select (worst case time complexity)

O(n²); The minimum or maximum value of the subarray is always chosen.

44
New cards

kth select (average case time complexity)

O(n); Data is split in half on average during each partition. We can discard the partition which does not contain the target.

45
New cards

kth select (best case time complexity)

O(n); The pivot is always the median value of the subarray.

46
New cards

quickselect use cases

when you want to find the kth smallest/largest element in an unsorted list without sorting the whole list

47
New cards

HeapSort Description

heap sort calls build heap on the unsorted list of data and then removes data from the heap one at a time. build heap is O(n) and removing n data items from a heap is O(n log n), so the whole thing is O(n log n)
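The build-heap-then-remove structure described here maps directly onto Python's `heapq` module (a minimal sketch, not part of the original deck):

```python
import heapq

def heap_sort(a):
    """Heap sort: build a heap, then pop the minimum n times."""
    heap = list(a)
    heapq.heapify(heap)            # build heap: O(n)
    # each pop is O(log n), and we do n of them: O(n log n) total
    return [heapq.heappop(heap) for _ in range(len(heap))]
```

Note this version copies the data into a separate heap, matching the deck's "not in-place" characterization.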

48
New cards

heap sort (characteristics)

not stable because it makes large swaps over multiple elements, which can swap the positions of equal elements

not adaptive because build heap takes O(n) time no matter what, and removing from the heap is always O(log n), so the time complexity isn’t better if the data is already mostly sorted

not in-place because this version builds a separate heap, which requires O(n) auxiliary space

49
New cards

heap sort (best time complexity)

O(n log n); Removing an element from a heap is O(log n), and we do this n times.

50
New cards

heap sort (average time complexity)

O(n log n); Removing an element from a heap is O(log n), and we do this n times.

51
New cards

heap sort (worst time complexity)

O(n log n); Removing an element from a heap is O(log n), and we do this n times.

52
New cards

Radix Sort (description)

a non-comparative sorting algorithm that avoids direct comparisons between elements by grouping them based on individual digits or characters

Sorts all of the data by its least significant digit, then its second least significant digit, etc.
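The digit-by-digit bucketing can be sketched in Python (a minimal LSD version for non-negative integers with 10 buckets, not part of the original deck):

```python
def radix_sort(a):
    """LSD radix sort: bucket by each digit, least significant first. O(kn)."""
    a = list(a)
    if not a:
        return a
    k = len(str(max(a)))             # k = digits in the longest number
    place = 1
    for _ in range(k):
        buckets = [[] for _ in range(10)]
        for x in a:                  # appending preserves order: stable
            buckets[(x // place) % 10].append(x)
        a = [x for bucket in buckets for x in bucket]
        place *= 10
    return a
```

Every element visits a bucket once per digit, giving the O(kn) bound in all cases.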

53
New cards

Radix Sort (characteristics)

stable, but not adaptive (nearly sorted data isn’t any faster to sort) or in-place (you need 10 buckets/queues)

54
New cards

Radix Sort (worst case time complexity)

O(kn); One or a few numbers in the array are much longer than the rest. The longest number has k digits.

55
New cards

Radix Sort (best case time complexity)

O(kn); Each number in the array will be added to and removed from a bucket k times, where k is the length of the longest number.

56
New cards

Radix Sort (average case time complexity)

O(kn); All of the numbers in the array have relatively few digits.