what is a rotation like, intuitively?
rotate toward the lighter side and work on the heavier side: the lower node hands a subtree to the higher node, and the lower node moves up
a self-balancing binary tree
when the tree is unbalanced, we rotate it
a valid AVL tree has for each node, the difference between its left and right subtree cannot exceed 1
empty tree has a height of -1
for each node, its height = max of its left and right subtree heights + 1
AVL tree
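The height and balance rules above can be sketched directly; this is a minimal illustration (the `Node` struct and function names are assumptions, not from the deck):

```cpp
#include <algorithm>

// Illustrative AVL node; real AVL nodes usually cache their height.
struct Node {
    int key;
    Node* left = nullptr;
    Node* right = nullptr;
};

// Empty tree has height -1; otherwise 1 + max height of the two subtrees.
int height(const Node* n) {
    if (!n) return -1;
    return 1 + std::max(height(n->left), height(n->right));
}

// AVL-valid if |height(left) - height(right)| <= 1 at every node.
bool isAVL(const Node* n) {
    if (!n) return true;
    int diff = height(n->left) - height(n->right);
    if (diff < -1 || diff > 1) return false;
    return isAVL(n->left) && isAVL(n->right);
}
```

Note this recomputes heights from scratch for clarity; a real AVL tree stores each node's height and updates it on insert/delete.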
whats the best case for the height of an AVL tree
O(logn)
whats the best case for the height of an AVL tree
O(logn)
what is the run time of a rotation in an AVL tree, given that you’re only updating pointers?
O(1)
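A single left rotation makes the constant pointer work concrete; this is a sketch (names like `rotateLeft` are illustrative, not from the deck):

```cpp
// Illustrative binary-tree node for the rotation sketch.
struct Node {
    int key;
    Node* left = nullptr;
    Node* right = nullptr;
};

// Single left rotation around x: x's right child y moves up, x moves down.
// Only a constant number of pointers change, hence O(1).
Node* rotateLeft(Node* x) {
    Node* y = x->right;   // the node that will move up
    x->right = y->left;   // y hands its left subtree over to x
    y->left = x;          // x becomes y's left child
    return y;             // y is the new subtree root
}
```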
(T/F) The lower bound for comparison-based sorting is Ω(nlogn)
True
Search time for a hash table is
O(1) expected (average case)
a way to handle collisions in a table (two different keys are assigned the same index)
“stack” the elements in the same index of the hash table
essentially a list of linked lists
separate chaining
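A toy separate-chaining table shows the “list of linked lists” idea; the struct, hash (mod table size), and sizes here are illustrative assumptions:

```cpp
#include <list>
#include <string>
#include <utility>
#include <vector>

// Toy separate-chaining hash table: a vector of linked lists of
// (key, value) pairs. Colliding keys "stack" in the same bucket.
struct ChainedTable {
    std::vector<std::list<std::pair<int, std::string>>> buckets;
    explicit ChainedTable(std::size_t size) : buckets(size) {}

    std::size_t index(int key) const {
        return static_cast<std::size_t>(key) % buckets.size();
    }

    void insert(int key, const std::string& value) {
        buckets[index(key)].push_back({key, value});
    }

    // Returns a pointer to the value, or nullptr if the key is absent.
    const std::string* find(int key) const {
        for (const auto& kv : buckets[index(key)])
            if (kv.first == key) return &kv.second;
        return nullptr;
    }
};
```

With a low load factor, each bucket's chain stays short, which is why the expected insert/find is O(1).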
a way to handle collisions in a table (two different keys are assigned the same index)
find a different location to insert
open addressing
want to keep the load factor (number of elements / table size) low so that
collisions are rare
what is the expected insert/find runtime for separate chaining
O(1) bc hash/mod division is constant and don’t expect many collisions
check spot if empty, linear search from that spot until find empty spot or until find key
linear probing
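Linear probing as described above can be sketched like this; the struct is illustrative, with no resizing or deletion, and it assumes the table never fills:

```cpp
#include <optional>
#include <vector>

// Toy open-addressing table with linear probing.
struct ProbingTable {
    std::vector<std::optional<int>> slots;  // each slot: a key, or empty
    explicit ProbingTable(std::size_t size) : slots(size) {}

    void insert(int key) {
        std::size_t i = static_cast<std::size_t>(key) % slots.size();
        while (slots[i].has_value())     // spot taken: scan forward
            i = (i + 1) % slots.size();  // wrap around at the end
        slots[i] = key;
    }

    bool find(int key) const {
        std::size_t i = static_cast<std::size_t>(key) % slots.size();
        while (slots[i].has_value()) {   // stop at the first empty slot
            if (*slots[i] == key) return true;
            i = (i + 1) % slots.size();
        }
        return false;
    }
};
```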
what is the expected insert/find runtime for linear probing and why?
O(1) bc hash/mod division is constant and don’t expect many collisions
for AVL trees vs Hashing, what are the positives and negatives of each?
AVL trees: + n amount of nodes, - O(logn) lookup
Hashing: + O(1) lookup, - table size is bigger than # items
for open addressing vs chaining, what are the positives and negatives of each?
open addressing: + choices to handle collisions, - more difficult to implement
separate chaining: + easy to handle collisions/maintain, - not many options to handle collisions
maintains a large table while only using a few of its slots
has a hash function that takes a key and returns some integer
insert/find an entry
hashing
a 1D array of key, priority value pairs
organize the elements by priority value
implement as a binary min heap (each node <= children)
1D array emulates a binary tree (1-indexed; index 0 is unused)
left child = 2 * i
right child = 2 * i + 1
parent is i // 2
priority queue
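The 1-indexed index arithmetic on the card can be written out directly (function names are illustrative):

```cpp
// 1-indexed heap index math from the card above (index 0 unused).
int leftChild(int i)  { return 2 * i; }
int rightChild(int i) { return 2 * i + 1; }
int parent(int i)     { return i / 2; }  // integer division, like Python's //
```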
how does push_back work in priority queue
push the element onto the back of the heap array and bubble it up if needed
how does pop_front work in priority queue
replace first node with last node, pop the last node, bubble down if needed
what is the run time for push_back in priority queue
O(logn) bc in worst case bubble up all the way to root of a balanced binary tree
what are two ways to update an existing element’s priority in a priority queue?
linearly searching O(n) the array for the element and then bubbling up if needed O(logn) => O(n)
use a key-to-index unordered map for O(1) lookup and then bubble if needed O(logn) => O(logn)
what is the run time for pop_front() in priority queue
O(logn) bc in worst case bubble down all the way to leaf of a balanced binary tree
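The push (append + bubble up) and pop (swap last to root + bubble down) operations described above can be sketched as a minimal binary min-heap; note this sketch is 0-indexed, unlike the 1-indexed cards, and `pop` assumes the heap is non-empty:

```cpp
#include <utility>
#include <vector>

// Minimal binary min-heap (0-indexed: parent of i is (i-1)/2).
struct MinHeap {
    std::vector<int> a;

    void push(int v) {
        a.push_back(v);                           // append at the end
        std::size_t i = a.size() - 1;
        while (i > 0 && a[(i - 1) / 2] > a[i]) {  // bubble up while parent is bigger
            std::swap(a[(i - 1) / 2], a[i]);
            i = (i - 1) / 2;
        }
    }

    int pop() {                                   // assumes non-empty heap
        int top = a.front();
        a.front() = a.back();                     // move last node to the root
        a.pop_back();
        std::size_t i = 0;
        while (true) {                            // bubble down toward the smaller child
            std::size_t l = 2 * i + 1, r = 2 * i + 2, m = i;
            if (l < a.size() && a[l] < a[m]) m = l;
            if (r < a.size() && a[r] < a[m]) m = r;
            if (m == i) break;
            std::swap(a[i], a[m]);
            i = m;
        }
        return top;
    }
};
```

Both loops walk one level of a balanced binary tree per iteration, which is where the O(logn) bounds on the cards come from.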