What must be done if a class has one or more abstract methods?
The class must be declared abstract to avoid compile errors.
Can objects of an abstract type be instantiated?
No, objects of an abstract type cannot be instantiated.
What is required for a subclass inheriting abstract methods?
The subclass must implement all inherited abstract methods or be declared abstract.
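For reference, a minimal Java sketch of these abstract-class rules; the class names Shape and Circle are made up for illustration:

```java
// A class with one or more abstract methods must itself be declared abstract.
abstract class Shape {
    abstract double area();          // no body: concrete subclasses must supply one

    void describe() {                // abstract classes may still have concrete methods
        System.out.println("Area: " + area());
    }
}

// A concrete subclass must implement every inherited abstract method,
// otherwise it must also be declared abstract.
class Circle extends Shape {
    private final double radius;

    Circle(double radius) { this.radius = radius; }

    @Override
    double area() { return Math.PI * radius * radius; }
}

class Demo {
    public static void main(String[] args) {
        // Shape s = new Shape();    // compile error: cannot instantiate an abstract type
        Shape s = new Circle(2.0);   // OK: instantiate a concrete subclass
        s.describe();
    }
}
```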
What is the best and worst case time complexity for adding an element in a linked list?
Best case and worst case are both O(1) (add at the head, or at the tail when a tail reference is kept)
What is the best and worst case time complexity for adding an element in an array list?
Best case is O(1). Worst case is O(N) when the backing array must be resized (all elements are copied)
Order of Insert for Singly-Linked List
Best Case: O(1) if inserting at the beginning (or at the end with a tail reference). Average Case: O(N). Worst Case: O(N) if inserting immediately before the last element, since you must traverse almost the whole list.
Order of Insert for Doubly-Linked List
Worst Case: O(N). The traversal time T(N) can be cut in half by starting from either the front or the back, depending on whether the position is in the first or second half of the list.
Order of Insert for Array List
Worst Case: O(N) because of shifting all the elements.
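A minimal sketch of insert-at-index for a singly-linked list, showing the traversal that makes the average/worst case O(N); the class is hypothetical and assumes a valid index:

```java
// Hypothetical minimal singly-linked list to illustrate the insert costs above.
class SinglyLinkedList<T> {
    private static class Node<T> {
        T data;
        Node<T> next;
        Node(T data) { this.data = data; }
    }

    private Node<T> head;

    // O(1) best case (index 0); O(N) on average/worst case because we must
    // walk to the node just before the insertion point. Assumes 0 <= index <= size.
    void insert(int index, T value) {
        Node<T> node = new Node<>(value);
        if (index == 0) {
            node.next = head;
            head = node;
            return;
        }
        Node<T> prev = head;
        for (int i = 0; i < index - 1; i++) {   // the O(N) traversal
            prev = prev.next;
        }
        node.next = prev.next;
        prev.next = node;
    }
}
```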
Doubly Linked List - Advantages
Can traverse the list backwards; remove() is easier to implement; reaching an element by position is faster because traversal can start from whichever end is closer
Doubly Linked List - Disadvantages
Uses 50% more storage (an extra reference per node); must update twice as many references when inserting/removing
Should we make the front or back of the Linked List the "top" of the stack?
Front, because then all stack operations are O(1); using the back would make pop() O(N).
For a doubly linked list?
It doesn't matter, all functions will be O(1)
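A sketch of a stack backed by a singly-linked list, with the front of the list as the top so push() and pop() are both O(1); names are illustrative:

```java
import java.util.NoSuchElementException;

// Stack backed by a singly-linked list: the front of the list is the top.
class LinkedStack<T> {
    private static class Node<T> {
        T data;
        Node<T> next;
        Node(T data, Node<T> next) { this.data = data; this.next = next; }
    }

    private Node<T> top;

    void push(T value) {          // O(1): link a new node at the front
        top = new Node<>(value, top);
    }

    T pop() {                     // O(1): unlink the front node
        if (top == null) throw new NoSuchElementException("stack is empty");
        T value = top.data;
        top = top.next;
        return value;
    }
}
```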
Should we make the front or back of the list the front of the queue?
The front of the linked list should be the front of the queue, because dequeuing from the back would be O(N)
For a doubly linked list?
It doesn't matter, all functions will be O(1)
ArrayList?
Neither choice is good: making the front of the array the front of the queue gives an O(N) dequeue; making the back the front of the queue gives an O(N) enqueue (elements must shift either way)
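A sketch of a queue backed by a singly-linked list with head and tail references, so enqueue (at the back) and dequeue (at the front) are both O(1); names are illustrative:

```java
import java.util.NoSuchElementException;

// Queue backed by a singly-linked list with head and tail references:
// the front of the list is the front of the queue.
class LinkedQueue<T> {
    private static class Node<T> {
        T data;
        Node<T> next;
        Node(T data) { this.data = data; }
    }

    private Node<T> head;   // dequeue here, O(1)
    private Node<T> tail;   // enqueue here, O(1)

    void enqueue(T value) {
        Node<T> node = new Node<>(value);
        if (tail == null) { head = tail = node; }
        else { tail.next = node; tail = node; }
    }

    T dequeue() {
        if (head == null) throw new NoSuchElementException("queue is empty");
        T value = head.data;
        head = head.next;
        if (head == null) tail = null;
        return value;
    }
}
```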
Best-case time complexity for Selection Sort
O(N^2)
Worst-case time complexity for Selection Sort
O(N^2)
Average-case time complexity for Selection Sort
O(N^2)
Is Selection Sort a comparison sort?
Yes
Is Selection Sort an in-place sort?
Yes
Is Selection Sort a stable sort?
No
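A standard selection sort sketch matching the properties above (always O(N^2), comparison-based, in place, not stable):

```java
class SelectionSort {
    // Repeatedly select the smallest remaining element and swap it into place.
    // Always O(N^2) comparisons; in place; not stable because the swap can
    // move an element past an equal element.
    static void sort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            int tmp = a[i];
            a[i] = a[min];
            a[min] = tmp;
        }
    }
}
```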
Best-case time complexity for Insertion Sort
O(N) when the data is already sorted.
Worst-case time complexity for Insertion Sort
O(N^2)
Average-case time complexity for Insertion Sort
O(N^2)
Is Insertion Sort a comparison sort?
Yes
Is Insertion Sort an in-place sort?
Yes
Is Insertion Sort a stable sort?
Yes
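A standard insertion sort sketch matching the properties above (O(N) best case on sorted input, O(N^2) otherwise, in place, stable):

```java
class InsertionSort {
    // Grow a sorted prefix, shifting larger elements right to make room.
    // Best case O(N) on already-sorted input (inner loop never runs);
    // average/worst case O(N^2); in place; stable (equal keys never pass each other).
    static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }
}
```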
Best-case time complexity for Radix Sort
O(N * log10(maxVal))
Worst-case time complexity for Radix Sort
O(N * log10(maxVal))
Average-case time complexity for Radix Sort
O(N * log10(maxVal))
Is Radix Sort a comparison sort?
No
Is Radix Sort an in-place sort?
No
Is Radix Sort a stable sort?
Yes
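A sketch of LSD radix sort in base 10 matching the properties above (O(N * log10(maxVal)) in every case, not comparison-based, not in place, stable); assuming non-negative int values:

```java
class RadixSort {
    // LSD radix sort, base 10, for non-negative integers.
    // One stable counting-sort pass per decimal digit of the largest value,
    // so the running time is O(N * log10(maxVal)) in every case.
    static void sort(int[] a) {
        if (a.length == 0) return;
        int max = a[0];
        for (int v : a) max = Math.max(max, v);

        int[] out = new int[a.length];                 // extra buffer: not in place
        for (int exp = 1; max / exp > 0; exp *= 10) {
            int[] count = new int[10];
            for (int v : a) count[(v / exp) % 10]++;
            for (int d = 1; d < 10; d++) count[d] += count[d - 1];
            // Walk backwards so equal digits keep their relative order (stable).
            for (int i = a.length - 1; i >= 0; i--) {
                out[--count[(a[i] / exp) % 10]] = a[i];
            }
            System.arraycopy(out, 0, a, 0, a.length);
        }
    }
}
```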
Best-case time complexity for Merge Sort
O(NlogN)
Worst-case time complexity for Merge Sort
O(NlogN)
Average-case time complexity for Merge Sort
O(NlogN)
Is Merge Sort a comparison sort?
Yes
Is Merge Sort an in-place sort?
No
Is Merge Sort a stable sort?
Yes
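A standard merge sort sketch matching the properties above (always O(NlogN), stable, not in place because of the auxiliary array):

```java
class MergeSort {
    // Split in half, sort each half recursively, then merge.
    // O(NlogN) in every case; stable; not in place (auxiliary array for merging).
    static void sort(int[] a) {
        if (a.length < 2) return;
        sort(a, new int[a.length], 0, a.length - 1);
    }

    private static void sort(int[] a, int[] aux, int lo, int hi) {
        if (lo >= hi) return;
        int mid = (lo + hi) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid + 1, hi);
        merge(a, aux, lo, mid, hi);
    }

    private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
        System.arraycopy(a, lo, aux, lo, hi - lo + 1);
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid)              a[k] = aux[j++];
            else if (j > hi)          a[k] = aux[i++];
            else if (aux[j] < aux[i]) a[k] = aux[j++];
            else                      a[k] = aux[i++];   // take left half on ties: stable
        }
    }
}
```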
Best-case time complexity for Quick Sort
O(NlogN)
Worst-case time complexity for Quick Sort
O(N^2)
Average-case time complexity for Quick Sort
O(NlogN)
Is Quick Sort a comparison sort?
Yes
Is Quick Sort an in-place sort?
Yes
Is Quick Sort a stable sort?
No
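A quick sort sketch using Lomuto partitioning with the last element as pivot (one common choice, not the only one), matching the properties above (average O(NlogN), worst case O(N^2), in place, not stable):

```java
class QuickSort {
    // Average O(NlogN); worst case O(N^2) when the pivot is repeatedly the
    // smallest/largest element (e.g. already-sorted input with this pivot choice);
    // in place; not stable because partitioning can reorder equal keys.
    static void sort(int[] a) {
        sort(a, 0, a.length - 1);
    }

    private static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(a, lo, hi);
        sort(a, lo, p - 1);
        sort(a, p + 1, hi);
    }

    private static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];                  // last element as pivot (Lomuto scheme)
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) swap(a, i++, j);
        }
        swap(a, i, hi);
        return i;
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}
```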
Full Binary Tree
A tree in which each node has 0 or 2 children
Complete Binary Tree
A binary tree in which every level is completely filled, except possibly the last, which is filled from left to right
BST Insertion - Best Case
Balanced BST with a height of O(logN)
BST Insertion - Worst Case
A degenerate BST with a height of O(N); this removes the advantages of using a BST
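A minimal BST insert sketch: the cost is O(height), i.e. O(logN) for a balanced tree and O(N) for a degenerate one (for example, keys inserted in sorted order):

```java
class BST {
    private static class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    private Node root;

    // Cost is O(height): O(logN) if the tree stays balanced,
    // O(N) if it degenerates into a "linked list" (e.g. sorted insertions).
    void insert(int key) {
        root = insert(root, key);
    }

    private Node insert(Node node, int key) {
        if (node == null) return new Node(key);
        if (key < node.key)      node.left = insert(node.left, key);
        else if (key > node.key) node.right = insert(node.right, key);
        return node;             // duplicate keys are ignored
    }
}
```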
put() (HashMap)
O(1) - uses the key's hashCode() to pick the bucket to insert into
get() (HashMap)
O(1)
remove() (HashMap)
O(1)
put() (TreeMap)
O(log n) - inserts keys in sorted order
get() (TreeMap)
O(log n) - finds key in tree
remove() (TreeMap)
O(log n) - finds key in tree, removes
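A short usage comparison of java.util.HashMap and java.util.TreeMap reflecting the costs above; the keys and values are made-up examples:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

class MapDemo {
    public static void main(String[] args) {
        // HashMap: expected O(1) put/get/remove via hashCode(); no key ordering.
        Map<String, Integer> hash = new HashMap<>();
        hash.put("banana", 2);
        hash.put("apple", 5);
        hash.remove("banana");
        System.out.println(hash.get("apple"));   // 5

        // TreeMap: O(log n) put/get/remove; keys kept in sorted order
        // (backed by a red-black tree).
        Map<String, Integer> tree = new TreeMap<>();
        tree.put("banana", 2);
        tree.put("apple", 5);
        System.out.println(tree.keySet());       // [apple, banana]
    }
}
```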
Red-Black Tree rules
1. Tree is a binary search tree
2. Every node is labeled red or black
3. The root of the tree is black
4. Red rule: If a node is red, its children must be black
5. Path rule: Every path from a node to a null link must contain the same number of black nodes
Red-Black Tree Insertion
1. Insert the new node as a red node (a standard BST insertion)
2. While traversing down the tree to the insertion point, swap colors of any black node that has two red children (the node becomes red, its children black)
3. Case 1: If the parent of the new node is black, done!
Case 2: If the parent of the new node is red (which breaks the red rule), extra work is needed to restore the red-black rules: Case 2A: If the new node is an outside node and the sibling of its parent is black (or doesn't exist), perform a single rotation. Case 2B: If the new node is an inside node and the sibling of its parent is black (or doesn't exist), perform a double rotation.
If the root becomes red, just recolor it black
Recolor after each rotation so the red rule and path rule still hold
Dense graph
contains a "large" number of edges - use an adjacency matrix
Sparse Graph
number of edges is "much less" than the maximum number of possible edges - use an adjacency list because storing a mostly 0s matrix is not space efficient
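A sketch of the two representations for a small undirected graph: an adjacency matrix uses O(V^2) space regardless of edge count (fine for dense graphs), while an adjacency list uses O(V + E) space (better for sparse graphs). The vertex count and edges are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

class GraphRepresentations {
    public static void main(String[] args) {
        int v = 4;                                  // illustrative vertex count

        // Adjacency matrix: O(V^2) space even when few edges exist -> dense graphs.
        boolean[][] matrix = new boolean[v][v];
        matrix[0][1] = matrix[1][0] = true;         // undirected edge 0-1
        matrix[1][2] = matrix[2][1] = true;         // undirected edge 1-2

        // Adjacency list: O(V + E) space -> sparse graphs.
        List<List<Integer>> list = new ArrayList<>();
        for (int i = 0; i < v; i++) list.add(new ArrayList<>());
        list.get(0).add(1); list.get(1).add(0);     // edge 0-1
        list.get(1).add(2); list.get(2).add(1);     // edge 1-2

        System.out.println("Neighbors of 1: " + list.get(1));
    }
}
```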
How to determine the "central" vertex
Pick the vertex that:
1. Is connected to the most vertices
2. Has the shortest average path to its connected vertices
Open addressing
Probing
Closed addressing
Chaining
Load Factor
Number of elements divided by the number of buckets (N/B)
Remove from a Heap
1. Remove the element at a given index (for a priority queue, always remove the highest-priority element at the root)
2. Move the last element in the array to that index
3. If this element is larger than either of its children (min-heap), swap it with its smaller child
4. Repeat step 3 until the heap property is satisfied or the element has no more children
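A sketch of these removal steps for an array-backed min-heap (remove the root, move the last element to the root, then sift it down by swapping with the smaller child); add() is included only so the class is self-contained:

```java
import java.util.NoSuchElementException;

class MinHeap {
    private final int[] heap;
    private int size;

    MinHeap(int capacity) { heap = new int[capacity]; }

    void add(int value) {                       // sift up, O(logN); assumes spare capacity
        heap[size] = value;
        int i = size++;
        while (i > 0 && heap[i] < heap[(i - 1) / 2]) {
            swap(i, (i - 1) / 2);
            i = (i - 1) / 2;
        }
    }

    // Remove the highest-priority element (the root), following the steps above.
    int removeMin() {
        if (size == 0) throw new NoSuchElementException("heap is empty");
        int min = heap[0];
        heap[0] = heap[--size];                 // step 2: move the last element to the root
        int i = 0;
        while (true) {                          // steps 3-4: sift down
            int left = 2 * i + 1, right = 2 * i + 2, smallest = i;
            if (left < size && heap[left] < heap[smallest]) smallest = left;
            if (right < size && heap[right] < heap[smallest]) smallest = right;
            if (smallest == i) break;           // heap property restored
            swap(i, smallest);
            i = smallest;
        }
        return min;
    }

    private void swap(int i, int j) {
        int tmp = heap[i];
        heap[i] = heap[j];
        heap[j] = tmp;
    }
}
```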
Efficiency of adding a word to a trie given the word has N characters
O(N)
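A minimal trie sketch: adding a word of N characters touches one node per character, so add() is O(N); restricting keys to lowercase 'a'-'z' is an assumption for illustration:

```java
class Trie {
    private static class Node {
        Node[] children = new Node[26];   // assumption: lowercase 'a'-'z' only
        boolean isWord;
    }

    private final Node root = new Node();

    // O(N) for a word of N characters: one step (and at most one new node) per character.
    void add(String word) {
        Node cur = root;
        for (char c : word.toCharArray()) {
            int i = c - 'a';
            if (cur.children[i] == null) cur.children[i] = new Node();
            cur = cur.children[i];
        }
        cur.isWord = true;
    }
}
```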