what is the term that encompasses the ratio k/N (items/buckets) when hashing is discussed
load factor
is it a requirement that k be less than N
no it is not a requirement
how would you calculate the average number of items in a single bucket
load factor
in a worst case scenario, what is the maximum number of items that a single bucket could accommodate
unlimited - if all items hash to the same bucket they can become a linked list of items with no theoretical upper limit
if k exceeds N could any of the buckets potentially remain empty
yes - a bucket stays empty if no item happens to hash to it; note the load factor k/N would already be greater than 1 in this case
in the event we decide to resize the table to a size of 2N, what is the asymptotic time complexity concerning k and N for transferring all items to a newly sized table
O(k) - there are only k elements, and each one is rehashed and moved exactly once to the new table
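A minimal Python sketch of the O(k) resize: every old bucket is visited and each of the k items is rehashed into the new table exactly once (the example keys and table sizes are made up for illustration).

```python
# Sketch: resizing a chained hash table from N to 2N buckets.
# Each of the k items is rehashed and placed exactly once, so the move is O(k).
def resize(buckets, new_size):
    new_buckets = [[] for _ in range(new_size)]
    for bucket in buckets:          # visit every old bucket
        for item in bucket:         # each of the k items is touched once
            new_buckets[hash(item) % new_size].append(item)
    return new_buckets

old = [[] for _ in range(4)]
for x in [3, 7, 11, 19, 22]:
    old[hash(x) % 4].append(x)
new = resize(old, 8)                # same 5 items, now spread over 8 buckets
```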
how does inserting keys into a hash table with collision resolved by chaining work
inserting keys into a hash table with chaining stores each key in a linked list at the index determined by the hash function allowing multiple keys to share the same index
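A small sketch of chained insertion in Python, where each bucket is just a list and colliding keys share the bucket (the keys and table size are chosen so that 1, 5, and 9 all collide in a table of size 4):

```python
# Sketch of separate chaining: colliding keys share a bucket list.
def chain_insert(table, key):
    index = hash(key) % len(table)
    if key not in table[index]:     # skip duplicate keys
        table[index].append(key)

table = [[] for _ in range(4)]
for k in [1, 5, 9]:                 # 1, 5, 9 all map to bucket 1 when N = 4
    chain_insert(table, k)
```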
how does inserting keys into a hash table with collision resolved by linear probing work
inserting keys into a hash table with linear probing places each key in the next available slot when a collision occurs by checking subsequent indices until an empty one is found
how does inserting keys into a hash table with collision resolved by quadratic probing work
inserting keys into a hash table with quadratic probing places each key in a new slot by checking indices at quadratically growing offsets (eg 1, 4, 9, 16) from the home slot when a collision occurs
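Both probing schemes can be sketched with one helper: linear probing tries offsets 1, 2, 3, ... while quadratic probing tries 1, 4, 9, ... (the keys 7, 14, 21 are picked so they all collide at index 0 in a table of size 7).

```python
# Sketch of open addressing: linear probing checks index+1, index+2, ...;
# quadratic probing checks index+1, index+4, index+9, ... (offsets i^2).
def probe_insert(table, key, quadratic=False):
    n = len(table)
    for i in range(n):
        offset = i * i if quadratic else i
        slot = (hash(key) + offset) % n
        if table[slot] is None:
            table[slot] = key
            return slot
    raise RuntimeError("table full")

linear = [None] * 7
for k in [7, 14, 21]:               # all collide at index 0 when n = 7
    probe_insert(linear, k)

quad = [None] * 7
for k in [7, 14, 21]:
    probe_insert(quad, k, quadratic=True)
```

Note that quadratic offsets mod n can revisit slots, so real implementations choose table sizes and probe sequences that guarantee every slot is reachable.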
which of the following affects the retrieval speed in a hash table?
a. the number of stored elements
b. the size of stored elements
c. the quality of the hash function
c. the quality of the hash function – a good hash function reduces collisions, leading to faster retrieval.
The size of stored elements (option b) generally does not impact retrieval speed since the hash table focuses on locating the element's position, not the size of the element itself.
In a well-designed hash table, the number of stored elements (option a) should ideally not impact retrieval speed, as hash tables are designed for constant-time complexity, O(1), on average.
what is the inorder traversal of this tree
a b c d e f g
what is the postorder traversal of this tree
a c b e g f d
what is the preorder traversal of this tree
d b a c f e g
what is the successor and predecessor of a
successor: b
predecessor: none
what is the successor and predecessor of e
successor: f
predecessor: d
what is the successor and predecessor of c
successor: d
predecessor: b
what is the successor and predecessor of f
successor: g
predecessor: e
what is the height of a bst
the length of the longest path from root to leaf
describe a full binary tree
a full binary tree is a tree in which every node has either zero or exactly two children with no nodes having only one child
describe a complete binary tree
a complete binary tree is a binary tree in which all levels are fully filled except possibly the last level which is filled from left to right
if node 10 is deleted using the binary search tree deletion procedure what will be the new root node
the node 10 will be replaced by 12 and the root node will still be the same, 15
how do you perform build max-heap
start from the last non leaf node
apply max-heapify up to the root
Result: max-heap structure in the array
how do you perform max-heapify
Compare the node with its children.
Swap with the largest child if needed.
Repeat until the node is larger than its children or reaches a leaf.
practice build max-heap by inserting the following elements: {27, 17, 3, 16, 13, 10, 1, 5, 7, 12, 4, 8, 9, 0}
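One worked sketch of the practice prompt above in Python (indices assume a 0-based array, so the last non-leaf node is at n//2 - 1):

```python
# max-heapify (sift down): compare a node with its children and swap with the
# largest child until the node dominates both children or reaches a leaf.
def max_heapify(a, i, n):
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:            # heap property holds here
            return
        a[i], a[largest] = a[largest], a[i]
        i = largest                 # keep sifting down

# build-max-heap: heapify every non-leaf, from the last one up to the root.
def build_max_heap(a):
    for i in range(len(a) // 2 - 1, -1, -1):
        max_heapify(a, i, len(a))

heap = [27, 17, 3, 16, 13, 10, 1, 5, 7, 12, 4, 8, 9, 0]
build_max_heap(heap)
```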
what is the most appropriate data structure for an app where tracking multiple objects is required but only the object with the largest value of a specified attribute needs retrieval and processing
a max-heap - the object with the largest value is always stored at the root, so it can be retrieved in O(1)
what is the most appropriate data structure for an application where tracking multiple objects is required but only the object that has been tracked for the longest needs retrieval and processing
a queue - the object that has been in the queue the longest is at the front and new items are added at the end
what is the degree of a vertex in a graph
the number of edges connected to the vertex
You're working on the backend of a social media platform. Users in this platform can connect with each other as friends, and you need to represent and manage the connections between users. Given the scenario which works best, Adjacency matrix or Adjacency list.
In this scenario, an adjacency list is the better choice for representing and managing user connections on a social media platform
In a weighted graph with 5 nodes, each edge represents the distance between two cities. If there are 10 edges, what is the maximum number of cities that can be connected in this graph?
the maximum number of cities that can be connected is 5 - with 5 nodes there are at most C(5, 2) = 10 possible edges, so 10 edges form a complete graph connecting all 5 cities
in a directed graph with 4 nodes, if each node has an out-degree of 2, what is the total number of edges in the graph?
each node has 2 outgoing edges therefore for 4 nodes there will be 4 × 2 = 8 edges in the graph
Consider an array of n elements that is to be sorted using a comparison-based sorting algorithm. In asymptotic (big-Oh) terms, what is the maximum number of comparisons needed to sort the array? Looking for the fastest worst case time complexity
O(n log n)
This is because the most efficient comparison-based sorting algorithms (such as Merge Sort, Heap Sort, and Quick Sort on average) require O(n log n) comparisons in the worst case.
using a comparison-based sorting algorithm, what is the upper bound for the max number of comparisons required to check if the array is already sorted
O(n) - each element is compared to its next neighbor in a single pass (n - 1 comparisons) to verify the sorted order
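The single O(n) pass can be sketched in a couple of lines of Python: n - 1 adjacent comparisons decide the question.

```python
# O(n) sortedness check: compare each element with its next neighbor.
def is_sorted(a):
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))
```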
Which of the following sorting tasks would not be appropriate for Counting Sort?
a) Sorting a list of students by their unique identification numbers (ID numbers assigned by the school administration).
b) Sorting a list of books by their titles in alphabetical order.
c) Sorting a list of songs by their durations (in seconds).
d) Sorting a list of countries by their populations
d) Sorting a list of countries by their populations (large range of values makes Counting Sort impractical)
In the context of sorting, what is the lower bound for the number of comparisons needed to sort an array of n elements with a comparison-based sorting algorithm in big-Oh notation?
O(n log n)
This bound comes from the fact that a comparison-based sort must distinguish all n! possible orderings, requiring at least log2(n!) ≈ n log n comparisons in the worst case
You are given an array of integers, and you need to implement a sorting algorithm to arrange the elements in ascending order. Choose one of the following sorting algorithms (Quick sort, selection sort, insertion sort, counting sort) and provide a step-by-step description of how the chosen algorithm can be applied to sort the given array. Additionally, calculate the time complexity for the chosen algorithm in terms of the number of comparisons and swaps. Array to Sort: [5, 2, 8, 9, 1, 4, 7]
Here's the Quick Sort process for [5, 2, 8, 9, 1, 4, 7]:
1. Choose pivot: select 7 (last element) as the pivot.
   - Partition: [5, 2, 1, 4, 7, 9, 8]
2. Left subarray [5, 2, 1, 4]:
   - Pivot 4: partition to [2, 1, 4, 5]
3. Left subarray [2, 1]:
   - Pivot 1: partition to [1, 2]
4. Right subarray [9, 8]:
   - Pivot 8: partition to [8, 9]
5. Sorted array: [1, 2, 4, 5, 7, 8, 9]
Time Complexity:
- Average case: O(n log n)
- Worst case: O(n^2)
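The walkthrough above matches a Lomuto partition with the last element as pivot; a Python sketch of that scheme:

```python
# Quick sort with Lomuto partitioning, last element as pivot.
def quick_sort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot, i = a[hi], lo - 1
    for j in range(lo, hi):            # move elements <= pivot to the front
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]  # drop the pivot into its final slot
    quick_sort(a, lo, i)               # recurse on the left partition
    quick_sort(a, i + 2, hi)           # recurse on the right partition

arr = [5, 2, 8, 9, 1, 4, 7]
quick_sort(arr)
```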
list all the sorting algorithms we talked about in class
quick, selection, insertion, merge, heap, counting, radix, bucket
which sorting algorithm builds the sorted array 1 element at a time by repeatedly picking the next element and inserting it into its correct position among the already sorted elements
insertion sort
what is insertion sort's best, avg, worst case time complexity
best: O(n) - already sorted
avg: O(n^2)
worst: O(n^2)
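A short Python sketch of insertion sort: shift larger elements of the sorted prefix right, then drop the key into the gap.

```python
# Insertion sort: grow a sorted prefix one element at a time.
def insertion_sort(a):
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:   # shift bigger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                 # insert key into its slot

arr = [5, 2, 8, 9, 1, 4, 7]
insertion_sort(arr)
```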
which sorting algorithm finds the minimum element in the unsorted part and moves it to the sorted part iteratively
selection sort
what is the best avg worst case time complexity for selection sort
best: O(n^2)
avg: O(n^2)
worst: O(n^2)
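A Python sketch of selection sort: scan the unsorted part for its minimum and swap it into place.

```python
# Selection sort: repeatedly move the minimum of the unsorted part forward.
def selection_sort(a):
    for i in range(len(a) - 1):
        m = i
        for j in range(i + 1, len(a)):  # find the minimum of a[i:]
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]         # extend the sorted part by one

arr = [5, 2, 8, 9, 1, 4, 7]
selection_sort(arr)
```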
which sorting algorithm repeatedly swaps adjacent elements if they are in the wrong order with the largest elements bubbling up to the end
bubble sort
what is the best avg worst case time complexity for bubble sort
best: O(n) - array is already sorted, with early termination
avg: O(n^2)
worst: O(n^2)
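A Python sketch of bubble sort with the early-termination flag that gives the O(n) best case:

```python
# Bubble sort: swap adjacent out-of-order pairs; the largest element
# "bubbles" to the end of each pass.
def bubble_sort(a):
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for j in range(end):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:        # no swaps -> already sorted, O(n) best case
            return

arr = [5, 2, 8, 9, 1, 4, 7]
bubble_sort(arr)
```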
which sorting algorithm divides the array into halves sorts each half and merges them in a divide and conquer manner
merge sort
what is the best avg worst case time complexity for merge sort
best: O(n log n)
avg: O(n log n)
worst: O(n log n)
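A Python sketch of merge sort: split in half, sort each half recursively, then merge the two sorted halves.

```python
# Merge sort: divide, recurse, then merge the sorted halves.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge step
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]        # append the leftovers

result = merge_sort([5, 2, 8, 9, 1, 4, 7])
```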
which sorting algorithm partitions the array around a pivot and recursively sorts the partitions
quick sort
what is the best avg worst case time complexity for quick sort
best: O(n log n) - balanced partitions
avg: O(n log n)
worst: O(n^2) - poor pivot selection
which sorting algorithm builds a max heap from the array and repeatedly removes the root element to sort
heap sort
what is the best avg worst case time complexity for heap sort
best: O(n log n)
avg: O(n log n)
worst: O(n log n)
which sorting algorithm counts occurrences of each unique value to sort without comparisons
counting sort
what is the best avg worst case time complexity for counting sort
best: O(n + k) - k is the range of values
avg: O(n + k)
worst: O(n + k)
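A Python sketch of counting sort: tally how many times each value in the range occurs, then emit the values in order (no comparisons between elements).

```python
# Counting sort: count occurrences per value, then write values back in order.
def counting_sort(a):
    if not a:
        return []
    lo, hi = min(a), max(a)
    counts = [0] * (hi - lo + 1)        # k = hi - lo + 1 counters
    for x in a:
        counts[x - lo] += 1
    out = []
    for offset, c in enumerate(counts): # emit each value c times
        out.extend([lo + offset] * c)
    return out

result = counting_sort([4, 2, 2, 8, 3, 3, 1])
```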
which sorting algorithm processes individual digits starting from least to most significant to sort integers
radix sort
what is the best avg worst case time complexity for radix sort
best: O(d(n + k)) - d is the number of digits, k is the base
avg: O(d(n + k))
worst: O(d(n + k))
which sorting algorithm divides the array into buckets sorts each bucket and merges them
bucket sort
what is the best avg worst case time complexity for bucket sort
best: O(n + k) - uniform distribution
avg: O(n + k)
worst: O(n^2) - poorly distributed data
which data structure stores elements in a tree format where each node has at most two children and left children are less than the parent while right children are greater
binary search tree
what is the best avg worst case time complexity for search in a binary search tree
best: O(log n) - balanced tree
avg: O(log n)
worst: O(n) - skewed tree
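A Python sketch of BST search: go left when the key is smaller than the node, right when larger, so each step discards one subtree (the example tree is made up for illustration).

```python
# Iterative BST search: each comparison discards one subtree.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def bst_search(node, key):
    while node is not None:
        if key == node.key:
            return node
        node = node.left if key < node.key else node.right
    return None                         # key not in the tree

root = Node(8, Node(3, Node(1), Node(6)), Node(10, None, Node(14)))
```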
which traversal method of a binary search tree will give nodes in ascending order if applied to a valid bst
in order traversal
if a hash table with separate chaining has n items and a load factor greater than 1 what is the average time complexity for an unsuccessful search
best: O(1) - few collisions
avg: O(1 + α), where α is the load factor
worst: O(n) - all items in one bucket
what is the process of adjusting nodes in a heap to maintain the heap property after insertion or deletion
heapify
in which situation would counting sort not be appropriate for sorting a dataset
when sorting non-integer values or a dataset with a large range of values
when performing a delete operation on a node with two children in a binary search tree what is the typical replacement node to maintain bst properties
in order successor or in order predecessor
which algorithm for finding a minimum spanning tree starts at a single vertex and repeatedly adds the smallest edge that connects a new vertex to the growing mst
prim's algorithm
which algorithm for finding a minimum spanning tree sorts all edges and repeatedly adds the smallest edge to the mst if it does not form a cycle
kruskal's algorithm
what is the time complexity of kruskal's algorithm when edges are sorted initially for a graph with e edges and v vertices
O(E log E) - the initial edge sort dominates; the union-find operations that follow are nearly linear in E
which graph traversal algorithm uses a stack data structure or recursion and explores as far down a branch as possible before backtracking
depth first search
what is the time complexity of dfs on a graph with v vertices and e edges if the graph is represented as an adjacency list
best: O(v + e)
avg: O(v + e)
worst: O(v + e)
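An iterative DFS sketch in Python using an explicit stack over an adjacency list (the sample graph is made up; pushing neighbors reversed keeps left-to-right visiting order):

```python
# Iterative DFS: explicit stack, explore as deep as possible before backtracking.
def dfs(graph, start):
    visited, stack, order = set(), [start], []
    while stack:
        v = stack.pop()                 # LIFO -> depth-first
        if v in visited:
            continue
        visited.add(v)
        order.append(v)
        stack.extend(reversed(graph[v]))  # push neighbors left-to-right
    return order

g = {'a': ['b', 'c'], 'b': ['d'], 'c': [], 'd': []}
```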
which graph traversal algorithm uses a queue data structure and explores all neighbors of a node before moving to the next level of neighbors
breadth first search
what is the time complexity of bfs on a graph with v vertices and e edges if the graph is represented as an adjacency list
best: O(v + e)
avg: O(v + e)
worst: O(v + e)
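A BFS sketch in Python using a queue over the same kind of adjacency list (the sample graph is made up for illustration):

```python
from collections import deque

# BFS: FIFO queue, visit all neighbors before moving to the next level.
def bfs(graph, start):
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        v = queue.popleft()             # FIFO -> level by level
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

g = {'a': ['b', 'c'], 'b': ['d'], 'c': [], 'd': []}
```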
what is the major difference in the graph representation requirements for dfs and bfs compared to kruskal's and prim's algorithms
dfs and bfs can work on unweighted graphs while kruskal's and prim's require weighted graphs for finding minimum spanning trees