COMP 3270-002 - Exam 2 - Dr. Heaton


119 Terms

1

Quick Sort Pseudo Code

procedure QUICK_SORT(A) {

if n <= 1 then { return }

pivot <- rand_int(0, n)

pivot_pos <- partition(A, pivot)

quick_sort(A[0:pivot_pos])

quick_sort(A[pivot_pos+1:n])

}

2

Partitioning Pseudo Code

procedure PARTITION(A, pivot) {

swap(A, pivot, 0)

pivot_val <- A[0]

low <- 1

high <- n-1

while low <= high do {

while low <= high and A[low] <= pivot_val do {

low += 1

}

while high >= low and A[high] > pivot_val do {

high -= 1

}

if high > low then {

swap(A, low, high)

}

}

swap(A, 0, high)

return high

}
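The two procedures above can be checked with a short runnable sketch. This is an assumption-laden translation to Python: `rand_int` becomes `random.randrange`, `swap` becomes tuple assignment, and `quick_sort` returns a new list since Python slices are copies.

```python
import random

def partition(A, pivot):
    """Swap the pivot to the front, pair off out-of-place elements,
    then drop the pivot into its final position; return that index."""
    A[0], A[pivot] = A[pivot], A[0]
    pivot_val = A[0]
    low, high = 1, len(A) - 1
    while low <= high:
        while low <= high and A[low] <= pivot_val:
            low += 1
        while high >= low and A[high] > pivot_val:
            high -= 1
        if high > low:
            A[low], A[high] = A[high], A[low]
    A[0], A[high] = A[high], A[0]
    return high

def quick_sort(A):
    """Return a sorted copy of A."""
    if len(A) <= 1:
        return A
    pivot = random.randrange(len(A))
    p = partition(A, pivot)
    return quick_sort(A[:p]) + [A[p]] + quick_sort(A[p + 1:])
```

Expected runtime is O(n log n) with a random pivot; the worst case remains O(n^2) when unlucky pivots pile up.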

3

What is quickselect?

An algorithm for finding the kth smallest item in a list. Commonly it is used to find the median, i.e., the len(A)//2-th smallest element.

4

How is quickselect different than quicksort?

Uses the same idea as quick sort, but only needs to recurse on one side of the partition

5

What is expected runtime of quick select?

Linear time

6

Quick Select Pseudo Code

procedure QUICK_SELECT(A, k) {

pivot <- rand_int(0, n)

pivot_pos <- partition(A, pivot)

if pivot_pos == k-1 then {

return A[pivot_pos]

} else if pivot_pos > k-1 then {

return quick_select(A[0:pivot_pos], k)

} else {

return quick_select(A[pivot_pos+1:n], k - pivot_pos - 1)

}

}
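A runnable Python sketch of the same recursion. Rather than reusing partition, this version splits the list with comprehensions (an implementation choice for brevity, not the lecture's), which also makes duplicates easy to reason about; k = 1 asks for the minimum.

```python
import random

def quick_select(A, k):
    """Return the k-th smallest element of A (1-indexed)."""
    pivot_val = random.choice(A)
    less = [x for x in A if x < pivot_val]
    equal = [x for x in A if x == pivot_val]
    if k <= len(less):
        return quick_select(less, k)
    elif k <= len(less) + len(equal):
        return pivot_val
    else:
        greater = [x for x in A if x > pivot_val]
        return quick_select(greater, k - len(less) - len(equal))
```

Only one side of the split is recursed on, which is why the expected runtime drops to linear.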

7

what operators do we have for comparison model of computation?

>, <, <=, >=, !=, ==

8

How can any algorithm in the comparison model be visualized?

As a binary tree where the algorithm's run path is represented by a path from the root to a leaf node, with the leaf nodes being the output of the algorithm.

9

In the context of sorting, how many possible outputs or leaf nodes does an algorithm have?

An algorithm for sorting has n! possible outputs or leaf nodes, where n is the number of items being sorted.

10

What is the height of the binary tree for sorting algorithms?

The height of the tree is log₂(n!)

11

What is a simpler upper bound for the height of the sorting binary tree?

Since n! <= n^n, an upper bound for the height is log₂(n^n) = n log₂(n)

12

How can we find a lower bound for the height of the sorting tree?

By noticing that at least n/2 of the factors in n! are greater than or equal to n/2, so n! >= (n/2)^(n/2)

13

What is the lower bound derived from the observation about n/2 items?

The lower bound is (n/2) log₂(n/2), since log₂((n/2)^(n/2)) = (n/2) log₂(n/2)

14

What is the simplified form of the lower bound in asymptotic notation?

The lower bound is Ω(n log n)

15

what does log(n/2) break down into?

log(n/2) = log(n) - log(2) = log(n) - 1 when logs are base 2

16

What is the main question regarding the random access model of computation compared to the comparison model?

Can we do better than the comparison model using random access, similar to how we did with hashing vs. binary search?

17

What data structure is suggested for efficient sorting in the random access model?

A direct access array, which is a large array that can fit any key in the universe of keys (assumed to be positive integers).

18

How are keys inserted into the direct access array?

Keys are inserted at their respective indices in the direct access array.

19

What condition must be met for keys inserted in the direct access array?

The keys must be unique.

20

How can the sorted order be obtained from the direct access array?

By walking through every position in the array and collecting the non-null items.

21

What is the time complexity for sorting if u <= cn in the direct access model?

The sorting can be done in O(n) time

22

What if u <= n^2? How can the keys be handled?

Break down the key into two components, a and b, where both are <= n: a = k//n and b = k%n

23

How can the original key k be recovered from a and b?

The original key can be recovered using the formula k = an + b
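A quick Python check of the split and the recovery, using the list [17, 3, 24, 22, 12] with n = 5 (the helper name `split_key` is ours):

```python
def split_key(k, n):
    # high digit and low digit of k in base n; both are < n when k < n*n
    return k // n, k % n

n = 5
keys = [17, 3, 24, 22, 12]
tuples = [split_key(k, n) for k in keys]      # [(3, 2), (0, 3), (4, 4), (4, 2), (2, 2)]
recovered = [a * n + b for (a, b) in tuples]  # k = a*n + b gives back the originals
```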

24

Given the example [17, 3, 24, 22, 12], what would the tuples (a,b) be after breaking down the keys?

The tuples would be (3,2),(0,3),(4,4),(4,2),(2,2)

25

What base is used for the concatenated keys in the example, and what do the keys convert to?

The base used is n (base 5 in this case), converting the numbers to [32, 03, 44, 42, 22].

26

What is the method called when sorting items by one element of a tuple at a time?

Tuple sort.

27

Which sorting order is correct when sorting tuples?

The sort must be stable, meaning that elements with equal keys retain their relative positions.
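Python's built-in sort is stable, so tuple sort is easy to demonstrate: stable-sort on the low component first, then on the high component, and the list ends up in full tuple order. (The data is the tuple list from the base-5 example.)

```python
pairs = [(3, 2), (0, 3), (4, 4), (4, 2), (2, 2)]

# pass 1: stable sort on the second (low) component
by_low = sorted(pairs, key=lambda t: t[1])
# pass 2: stable sort on the first (high) component;
# stability preserves the low-component order within equal high components
result = sorted(by_low, key=lambda t: t[0])
```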

28

What sorting algorithm can we use under the hood?

We can use any stable sorting algorithm, but now that each key is <= n, we can use a direct access array sort

29

What is the main principle behind counting sort?

Counting sort combines direct access sorting with chaining to handle collisions while maintaining the input order.

30

How can collisions be managed in counting sort?

By using a dynamic array that appends colliding elements to the end, thus retaining the original order.

31

What is the first step in performing radix sort when u <= n^2?

Split the keys into two components, a and b, and then use counting sort to sort the items using tuple sort, first on b and then on a

32

How do the tuples relate to the base in the context of radix sort?

The concatenation of the tuples corresponds to the numbers expressed in base n.

33

What condition allows us to convert bases in radix sort?

If u <= n^c, we can convert bases and create tuples of length c.

34

what is the time complexity of radix sort when c is constant?

O(n)

35

counting sort Pseudo code

procedure COUNTING_SORT(A) {

u <- max([x.key for x in A])

D <- [[] for i in 0..u]

for x in A do {

D[x.key].append(x)

}

i <- 0

for chain in D do {

for x in chain do {

A[i++] <- x

}

}

}
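The pseudocode above translates almost line-for-line into runnable Python; the optional `key` parameter (our addition, not the lecture's) lets it sort arbitrary records by an integer key.

```python
def counting_sort(A, key=lambda x: x):
    """Stable, in-place sort of A by non-negative integer key; O(n + u) time."""
    u = max(key(x) for x in A)
    D = [[] for _ in range(u + 1)]  # one chain per possible key value
    for x in A:
        D[key(x)].append(x)          # appending preserves input order -> stable
    i = 0
    for chain in D:
        for x in chain:
            A[i] = x
            i += 1
```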

36

Radix Sort Pseudo Code

procedure RADIX_SORT(A) {

u <- max([x.key for x in A])

c <- ceil(log_n(u + 1))

D <- [Obj() for i in 0..n]

for i in 0..n do {

D[i].digits <- digit_tuple(A[i], n, c)

D[i].item <- A[i]

}

for i in 0..c do {

for j in 0..n do {

D[j].key <- D[j].digits[i]

}

counting_sort(D)

}

for i in 0..n do {

A[i] <- D[i].item

}

}

37

Base digit tuple pseudo code

procedure DIGIT_TUPLE(x, n, c) {

high <- x.key

digits <- []

for i in 0..c do {

low <- high % n

high <- high // n

digits.append(low)

}

return digits

}
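Putting the last three cards together, here is a hedged Python sketch of radix sort: `digit_tuple` matches the card above, the digit count c is computed with a loop instead of a logarithm (to avoid floating-point issues), and a small list-based counting sort stands in for the lecture's object-based bookkeeping.

```python
def digit_tuple(k, n, c):
    """The c base-n digits of k, least significant first."""
    digits = []
    high = k
    for _ in range(c):
        digits.append(high % n)
        high //= n
    return digits

def counting_sort_by(items, key_of, u):
    """Stable sort of items by key_of(x), a key in range(u)."""
    D = [[] for _ in range(u)]
    for x in items:
        D[key_of(x)].append(x)
    return [x for chain in D for x in chain]

def radix_sort(A):
    """Sort a list of non-negative integers; O(n·c) with c digits."""
    n = len(A)
    if n <= 1:
        return A[:]
    u = max(A)
    c = 1                      # number of base-n digits needed to write u
    while n ** c <= u:
        c += 1
    items = [(digit_tuple(k, n, c), k) for k in A]
    for i in range(c):         # tuple sort: least significant digit first
        items = counting_sort_by(items, lambda t: t[0][i], n)
    return [k for _, k in items]
```

When c is a constant (u <= n^c), the whole thing runs in O(n) time.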

38

What are the properties of a binary tree

A rooted tree,

each node has a left child, a right child, and a parent (each of which may be null),

each node holds an item (value),

at this point, the traversal order has nothing to do with the items' values.

39

What is the parent node of the root node?

The root node has a null parent node.

40

What defines a leaf node in a tree?

A leaf node has all null children.

41

How is the depth of a node defined?

The depth of a node is the length of the path from that node to the root node.

42

What does the height of a node represent?

The height of a node is the length of the longest path from that node down to a leaf.

43

What is traversal order in the context of tree structures?

The traversal order (in-order) visits everything in a node's left subtree, then the node itself, then everything in its right subtree.

44

How does the traversal order depend on tree properties?

The traversal order with respect to item values depends on the properties used when building, inserting, or deleting from the tree.

45

What is the traversal order of the first tree given in Lecture Note 7?

The current traversal order is G, D, B, E, A, C, F.

46

Traversal Order Pseudo Code

Procedure ITER(T) {

if T == null then {

return

}

iter(T.left)

visit(T)

iter(T.right)

}
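A runnable Python version of ITER, written as a generator, run on a tree whose traversal order matches the one quoted from the lecture notes (the exact shape of the tree in the notes is an assumption here):

```python
class Node:
    def __init__(self, item, left=None, right=None):
        self.item, self.left, self.right = item, left, right

def iter_tree(T):
    """Yield items in traversal (in-order) order: left subtree, node, right subtree."""
    if T is None:
        return
    yield from iter_tree(T.left)
    yield T.item
    yield from iter_tree(T.right)

# a tree with traversal order G, D, B, E, A, C, F
root = Node('A',
            Node('B', Node('D', Node('G')), Node('E')),
            Node('C', None, Node('F')))
```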

47

How do you find the first element in a tree T?

If T has a left child, recursively return the find first of that subtree. Otherwise, T is the first element.

48

What is the first element of the tree rooted at A in Lecture Notes 7?

The first element is G, so first(A) = G

49

What is the first element of the tree rooted at C in Lecture Notes 7?

The first element is C, so first(C) = C

50

What is the time complexity of the find first operation?

The time complexity is O(h), where h is the height of the tree

51

Find First Pseudo Code

procedure FIRST(T) {

if T.left != null then {

return first(T.left)

} else {

return T

}

}

52

How do you find the last element in a tree T?

If T has a right child, recursively return the find last of that subtree. Otherwise, T is the last element.

53

What is the last element of the tree rooted at A?

The last element is F, so last(A) = F

54

What is the time complexity of the find last operation?

The time complexity is O(h), where h is the height of the tree.

55

Find Last Pseudo Code

procedure LAST(T) {

if T.right != null then {

return last(T.right)

} else {

return T

}

}

56

How do you find the next element in a tree T?

If T has a right child, recursively return the find first of that subtree. Otherwise, go up the parent nodes until the node is not a right child of its parent. Return that parent node.

57

What is the next element after A?

The next element is C, so next(A) = C.

58

What is the next element after E?

The next element is A, so next(E) = A

59

What is the time complexity of the find next operation?

The time complexity is O(h), where h is the height of the tree.

60

Find next Pseudo Code

procedure NEXT(T) {

if T.right != null then {

return first(T.right)

} else {

while T.parent and T.parent.right == T do {

T <- T.parent

}

return T.parent

}

}

61

How do you find the previous element in a tree T?

If T has a left child, return the find last on that node. Otherwise, go up the parent nodes until the node is not a left child of its parent. Return that parent node.

62

What is the previous element before A?

The previous element is E, so previous(A) = E

63

What is the time complexity of the find previous operation?

The time complexity is O(h), where h is the height of the tree.

64

Find previous Pseudo Code

procedure PREVIOUS(T) {

if T.left != null then {

return last(T.left)

} else {

while T.parent and T.parent.left == T do {

T <- T.parent

}

return T.parent

}

}

65

What is the purpose of the insert_before(T, new_node) function in a binary tree?

It inserts a new node into the position that makes the new node the previous node of node T

66

What happens if T.left is null during the insert_before operation?

If T.left is null, then T.left is set to new_node, and new_node.parent is set to T.

67

How does the insert_before function handle the situation when T.left is not null?

If T.left is not null, the function finds previous(T) and makes new_node its right child, setting new_node.parent to that node.

68

What is the time complexity of the insert_before operation?

The time complexity of the insert_before operation is O(h), where h is the height of the tree.

69

In the context of the insert_before function, what does "previous(T)" refer to?

"Previous(T)" refers to the node that comes before node T in the binary tree structure.

70

What relationships are established when a new node is inserted before node T?

The new node becomes either the left child of T (when T has no left child) or the right child of previous(T); its parent pointer is set to whichever node it was attached to.

71

Insert Before Pseudo Code

procedure INSERT_BEFORE(T, new_node) {

if T.left == null then {

T.left <- new_node

new_node.parent <- T

} else {

T <- previous(T)

T.right <- new_node

new_node.parent <- T

}

}

72

What does insert_after(T, new_node) do?

insert a new node into the position that will make new node be the next of node T

73

What is the time complexity of insert_after?

O(h) where h is the height of the tree

74

Insert After Pseudo Code

procedure INSERT_AFTER(T, new_node) {

if T.right == null then {

T.right <- new_node

new_node.parent <- T

} else {

T <- next(T)

T.left <- new_node

new_node.parent <- T

}

}

75

What is delete(T)?

delete node T

76

what is the time complexity of delete?

O(h) where h is the height of the tree

77

Delete Pseudo Code

procedure DELETE(T) {

if T.left == null and T.right == null then {

if T.parent.left == T then {

T.parent.left <- null

} else {

T.parent.right <- null

}

return

}

if T.left != null then {

X <- previous(T)

} else {

X <- next(T)

}

T.item <- X.item

delete(X)

}

78

How is the order of items in a binary tree-backed sequence determined?

The order is determined by the sequence order dictated by operations like insert_at, delete, and insert_last, rather than by increasing value.

79

Why is the insert_at(x, i) operation in a binary tree potentially more efficient than in other data structures?

Because insert_at(x, i) can be O(h) in a binary tree, whereas in other structures, it has typically been linear time (O(n)).

80

What bookkeeping is required to efficiently find the position of index i in a binary tree-backed sequence?

Every node must keep track of the size of the subtree rooted at that node, denoted as node.size.

81

How is the size of subtrees updated during insertions and deletions?

The size of the subtrees is updated by going up to the parent nodes and adding or subtracting one depending on whether a node was inserted or deleted, which takes O(h).

82

How do you find the element at index i in a binary tree-backed sequence?

Check node.left.size vs i.

If node.left.size == i, return the node.

If node.left.size > i, recurse on node.left with index i.

If node.left.size < i, recurse on node.right with index i - node.left.size - 1.
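The three cases above in runnable Python, with a minimal node class that stores subtree sizes (the class itself is our scaffolding, not from the notes; a null left child counts as size 0):

```python
class Node:
    def __init__(self, item, left=None, right=None):
        self.item, self.left, self.right = item, left, right
        # size of the subtree rooted at this node
        self.size = 1 + (left.size if left else 0) + (right.size if right else 0)

def get(T, i):
    """Return the node at index i of the sequence represented by T."""
    left_size = T.left.size if T.left else 0
    if left_size == i:
        return T
    elif left_size > i:
        return get(T.left, i)
    else:
        return get(T.right, i - left_size - 1)

# sequence ['a', 'b', 'c', 'd'] stored as a tree rooted at 'c'
root = Node('c', Node('b', Node('a')), Node('d'))
```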

83

What is the runtime for retrieving an element at index i using the get(i) function?

The runtime for get(i) is O(h), where h is the height of the tree.

84

What is the overall runtime for performing both get(i) and insert_at(x, i) operations in a binary tree-backed sequence?

The overall runtime is O(2h), which simplifies to O(h).

85

Get Item at Index Pseudo Code

procedure GET(T, i) {

left_size <- if T.left == null then 0 else T.left.size

if left_size == i then {

return T

} else if left_size > i then {

return get(T.left, i)

} else {

return get(T.right, i - left_size - 1)

}

}

86

What is the height of a balanced binary tree?

O(log n)

87

Name some types of balanced binary trees.

Red-black trees, splay-trees, 2-3 trees, AVL trees.

88

Which type of tree is popular for database construction but is not binary?

B-trees.

89

What was the first proposed balanced binary tree and when was it introduced?

The AVL tree (Adelson-Velsky and Landis, 1962).

90

When is a tree considered height balanced?

When, at every node, the heights of the left and right subtrees differ by at most 1.

91

How is the skew of a node calculated?

Skew of a node = node.right.height - node.left.height.

92

What does it mean for a tree to have a height of O(log n)?

It means the tree's height grows logarithmically with the number of nodes, ensuring efficient operations like search, insert, and delete.

93

Why are balanced binary trees important in computer science?

They ensure efficient operations such as searching, inserting, and deleting elements by keeping the tree's height logarithmic.

94

What do rotations change in a binary tree?

Rotations change the structure of the tree without changing its traversal order.

95

How many pointer changes does it take to perform a tree rotation?

O(1) pointer changes.

96

What is the impact of rotations on the height of a tree?

Rotations change the height of a subtree by at most one: +1, 0, or -1.

97

What does inserting or deleting a node do to the height of ancestor nodes?

It changes the height of the ancestor nodes, requiring us to increment or decrement their height.

98

How much additional work does insertion or deletion add, due to rebalancing?

O(log n) work.

99

By how much does the skew of a parent node change when a child node is inserted or deleted?

The skew changes by at most 1.

100

What must be done if a parent's skew becomes +2 or -2 after insertion or deletion?

The subtree must be rebalanced.