IADS Sem 1

1

What are algorithms?

Methods or recipes for solving various problems

2

What are data structures?

Ways of storing or representing data that make it easy to manipulate

3

What is divide-conquer-combine?

Algorithm design paradigm that breaks a problem into smaller subproblems, solves them recursively, and then combines the solutions.

4

What is the pseudocode for binary search?
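The answer here was an image that did not survive. As a stand-in, a minimal iterative sketch in Java (the method name and int element type are my assumptions; a recursive four-argument variant appears at card 62):

// Iterative binary search over a sorted array: O(log n).
// Returns an index of key in A, or -1 if absent.
static int binarySearch(int[] A, int key) {
    int lo = 0, hi = A.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;        // midpoint without (lo+hi) overflow
        if (A[mid] == key) return mid;
        else if (A[mid] < key) lo = mid + 1; // key can only be in the right half
        else hi = mid - 1;                   // key can only be in the left half
    }
    return -1;                               // range empty: not found
}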

5

What does O(log n) imply?

logarithmic time: the algorithm doesn’t even read the whole input

6

What does O(n) imply?

linear time: the algorithm accesses each element of the input only a constant number of times

7

What does O(n log n) imply?

Linearithmic time: the algorithm splits the input into two pieces of similar size, solves each part recursively, and merges the solutions

8

What does O(n²) imply?

Quadratic time: the algorithm considers pairs of elements

9

What does O(n^b) imply?

Polynomial time: the algorithm considers many subsets of the input elements

10

What does O(1) imply?

Constant time: the running time does not grow with the size of the input

11

What do \omega(1), \omega(n), \omega(n^b) imply?

superconstant, superlinear, and superpolynomial time (super)

12

What do o(n), o(c^n) imply?

sublinear, subexponential time (sub)

13

What does O(c^n) imply?

exponential time

14

In words, what does the not-in-place InsertSort algorithm do?

Go through the elements of array A, STARTING AT INDEX 1, copying each into a temporary variable B. If any elements to the left of the current element are bigger than B, shift them to the right and place B in the empty slot

15

How much extra space does InsertSort require?

\Theta(1)

16

What is the pseudocode for InsertSort in-place?

[pseudocode image missing; see the Java sketch in the next card]
17

What is the Java code for insertion sort?
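The card's image is missing. A standard in-place insertion sort in Java, matching the description in card 14 (a sketch, not necessarily the course's exact code):

// In-place insertion sort: Theta(1) extra space,
// Theta(n^2) worst/average case, Theta(n) best case (already sorted).
static void insertionSort(int[] A) {
    for (int i = 1; i < A.length; i++) {  // start at index 1
        int b = A[i];                     // copy current element into temp B
        int j = i - 1;
        while (j >= 0 && A[j] > b) {      // shift bigger elements right
            A[j + 1] = A[j];
            j--;
        }
        A[j + 1] = b;                     // place B in the empty slot
    }
}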

18

How much extra space does InsertSort (in-place) require?

Requires \Theta(1) extra space

19

What is the worst-case time complexity of InsertSort in-place?

T_{worst}(n)=T_{av}(n)=\Theta(n^2)

20

What is the average-case time complexity of InsertSort in-place?

T_{worst}(n)=T_{av}(n)=\Theta(n^2)

21

What is the best-case time complexity of InsertSort in-place?

T_{best}(n)=\Theta(n)

22

Why is insertionSort preferable to selectionSort?

The best case for InsertSort (an already-sorted input) is \Theta(n), while SelectionSort takes \Theta(n^2) even in the best case

23

What is the MergeSort pseudocode?
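The pseudocode image is missing. A sketch in Java of the usual recursive split-and-merge scheme (the half-open bounds and method names are my choices):

import java.util.Arrays;

// Merge sort: Theta(n lg n) in all cases. Sorts A[lo..hi).
static void mergeSort(int[] A, int lo, int hi) {
    if (hi - lo <= 1) return;            // 0 or 1 elements: already sorted
    int mid = lo + (hi - lo) / 2;
    mergeSort(A, lo, mid);               // sort left half
    mergeSort(A, mid, hi);               // sort right half
    merge(A, lo, mid, hi);               // combine the sorted halves
}

static void merge(int[] A, int lo, int mid, int hi) {
    int[] left = Arrays.copyOfRange(A, lo, mid);
    int[] right = Arrays.copyOfRange(A, mid, hi);
    int i = 0, j = 0, k = lo;
    while (i < left.length && j < right.length)    // take the smaller head
        A[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
    while (i < left.length) A[k++] = left[i++];    // drain leftovers
    while (j < right.length) A[k++] = right[j++];
}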

24

In words, what does MergeSort do?

Split array A into two halves, sort each separately, then merge the results

25

What is the worst-case time complexity of MergeSort?

T_{worst}(n)=T_{av}(n)=T_{best}(n)=\Theta(n \lg n)

26

What is the average-case time complexity of MergeSort?

T_{worst}(n)=T_{av}(n)=T_{best}(n)=\Theta(n \lg n)

27

What is the best-case time complexity of MergeSort?

T_{worst}(n)=T_{av}(n)=T_{best}(n)=\Theta(n \lg n)

28

How much extra space does MergeSort require? [2]

Requires \Theta(n \lg n) extra space, or \Theta(n) with a better implementation

29

What is the pseudocode for recursive merge sort?

[pseudocode image missing; see the Java sketch in card 23]
30

Which is better, merge sort or insertion sort?

MergeSort is asymptotically better than InsertSort, but not in the best case

mergesort = o(insertsort), i.e. T_{MergeSort}(n) = o(T_{InsertSort}(n))

For EVERY choice of a constant c > 0, you can find a threshold N such that 0 \leq T_{MergeSort}(n) < c \cdot T_{InsertSort}(n) holds for all n \geq N

This means that, for large enough inputs, MergeSort will always run faster than insertion sort; MergeSort's running time grows more slowly as the input size increases.

31

When is insertion sort preferred to merge sort?

For smaller datasets (small n), when memory is a constraint, or when the array is already partially ordered

32

What does little o mean (where f(n) == o(g))? (equations and words)

\forall c>0. \exists N. \forall n \geq N. f(n) < cg(n)

Equivalently, \frac{g(n)}{f(n)} \rightarrow \infty as n \rightarrow \infty

For EVERY choice of a constant c > 0, you can find a threshold N such that 0 \leq f(n) < c \cdot g(n) holds for all n \geq N

i.e. g is a strict upper asymptotic bound for f: g grows strictly faster than f as the input size approaches infinity

33

What does big O mean (where f(n) == O(g))? (equations and words)

\exists c>0. \exists N. \forall n \geq N. f(n) \leq cg(n)

For SOME (at least one) choice of a constant c > 0, you can find a threshold N such that 0 \leq f(n) \leq c \cdot g(n) holds for all n \geq N

i.e. g is an upper asymptotic bound (NOT STRICT) for f: g is a limit or ceiling for the growth of f, but f may grow at the same rate or slower than g

34

what is the relationship between big O and little o?

f=o(g) \implies f=O(g), but not conversely

35

What does \omega mean (where f(n) == \omega(g) )? (equations and words)

\forall c>0. \exists N. \forall n \geq N. f(n) > cg(n)

Equivalently, f=\omega(g) \iff g=o(f)

For EVERY choice of a constant c > 0, there exists a threshold N such that f(n) > cg(n) for all n \geq N.

→ f grows STRICTLY faster than g iff g grows STRICTLY slower than f.

i.e. g is a STRICT asymptotic lower bound of f, so f grows STRICTLY faster than g for sufficiently large n: f = \omega(g)

36

What does \Omega mean (where f(n) == \Omega(g) )? (equations and words)

\exists c>0. \exists N. \forall n \geq N. f(n) \geq cg(n)

Equivalently, f=\Omega(g) \iff g=O(f)

g is an asymptotic lower bound for f. This means that f(n) grows at least as fast as g(n) for sufficiently large n.

37

What does \Theta mean (where f(n) == \Theta(g) )? (equations and words)

f = \Theta(g) \iff f=O(g) \wedge f=\Omega(g)

\exists c_1,c_2>0. \exists N. \forall n \geq N. c_1g(n) \leq f(n) \leq c_2g(n)

i.e. f(n) is both asymptotically upper and lower bounded by g(n) → g(n) is f’s tight bound, where f grows at the same rate as g(n) for sufficiently large n

38

Best/worst case and asymptotic bounds - what does big O and big Omega mean if talking about the worst and best case time complexities?

T_{worst} = O(g) - worst case is no worse than g

T_{worst} = \Omega (g) - worst case can be as bad as g

T_{best} = O(g) - best case can be as good as g

T_{best} = \Omega (g) - best case is no better than g

39

Time complexity for testing membership (checking if an element exists) in an unsorted array:

O(n)

Since the array is unsorted, you have to iterate through all elements to check if the desired element exists

40

Time complexity for testing membership (checking if an element exists) in a sorted array

O(log(n))

Binary search can be employed to efficiently locate elements in a sorted array

41

Time complexity for testing membership (checking if an element exists) in a (singly) linked list [2]

O(1) if you already have a reference to the node in question

O(n) otherwise, since you may have to traverse the list from the head to find the element

42

Time complexity for testing membership (checking if an element exists) in an unbalanced tree

O(n)

In the worst case scenario, where the tree is unbalanced and resembles a linked list, testing membership requires traversing all the nodes

43

Time complexity for testing membership (checking if an element exists) in a balanced tree

O(log(n))

Due to the balanced nature of the tree, operations like searching can be performed efficiently using techniques like binary search

44

Time complexity for adding or removing elements in an unsorted array:

O(n)

Removing an element requires a linear scan to find it, and may require shifting elements to fill the gap

45

Time complexity for adding or removing elements in a sorted array:

O(log(n))

Binary search finds the correct position for insertion or deletion in O(log n), though shifting elements may still make the full operation linear

46

Time complexity for adding or removing elements in a linked list:

O(1) - if adding or removing an element at the beginning of the linked list since it only involves adjusting a few pointers

O(n) - for positions NOT at the beginning of the linked list, you may need to traverse the list to find the insertion or deletion point

47

Time complexity for adding or removing elements in an unbalanced tree:

O(n)

If the tree is unbalanced, adding or removing elements may require traversing most or all of the nodes (in the worst case the tree degenerates into a linked list)

48

Time complexity for adding or removing elements in a balanced tree:

O(log(n))

Since balanced trees maintain their balance through rotations or other techniques, adding or removing elements can be done efficiently

49

Is a stack LIFO or FIFO?

LIFO

50

What does the stack store, and what’s a special characteristic of what the stack stores? [4]

Program variables:

  • local variables

  • function parameters

  • references to elements on the heap

All program variables are items which must have fixed size

51

Is a stack dynamic or static?

Dynamic - it grows and shrinks as variables enter and exit scope

52

What are local variables?

Variables declared within a function or block of code, which are created and stored on the stack when the function is called.

53

How are local variables and function parameters (stack-based variables) removed from the stack?

Since stack-based variables are scoped to the function or block in which they are declared, they have a limited lifetime. When the function exits or the block of code ends, the memory allocated to these variables is deallocated, and they cease to exist.

54

How are function parameters stored on and removed from the stack?

When a function is called, its parameters are passed to the function and stored on the stack. They can be accessed like local variables for the duration of the call, and are popped off the stack when the function returns.

55

Where do heap objects live once created (allocated)?

Anywhere in memory

56

What happens in many languages when heap objects become unreachable?

In garbage-collected languages (including Java, Python, Haskell), a garbage collector (memory recycler) detects this and reclaims the space

57

What are some characteristics of heap objects?

They can be of any size and may contain references to other heap objects

58

What are the three methods for modular exponentiation? (pseudocode)
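The pseudocode image is missing, so the exact three methods are an assumption: plausibly (1) naive, (2) reduce after every multiplication, (3) repeated squaring. A hedged Java sketch:

// (1) Naive: compute x^n fully, then reduce. n-1 multiplications,
// and the intermediate value overflows quickly.
static long modExpNaive(long x, int n, long m) {
    long p = 1;
    for (int i = 0; i < n; i++) p *= x;   // overflows for modest x, n
    return p % m;
}

// (2) Reduce mod m after every multiplication: still Theta(n) multiplications,
// but intermediate values stay below m^2.
static long modExpIter(long x, int n, long m) {
    long p = 1;
    for (int i = 0; i < n; i++) p = (p * x) % m;
    return p;
}

// (3) Repeated squaring: Theta(lg n) multiplications,
// using x^n = (x^2)^(n/2), times x if n is odd.
static long modExpFast(long x, int n, long m) {
    if (n == 0) return 1 % m;
    long half = modExpFast((x * x) % m, n / 2, m);
    return (n % 2 == 0) ? half : (half * (x % m)) % m;
}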

59

Given a relatively inefficient bubble sort pseudocode (image missing), what is a more efficient version?
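Both images are missing. A common improvement (a sketch; the course's exact version may differ) exits early when a pass makes no swaps, and shrinks the scanned range because the tail is already sorted:

// Improved bubble sort: best case Theta(n) on already-sorted input.
static void bubbleSort(int[] A) {
    int n = A.length;
    boolean swapped = true;
    while (swapped) {
        swapped = false;
        for (int i = 0; i < n - 1; i++) {
            if (A[i] > A[i + 1]) {                     // out of order: swap
                int tmp = A[i]; A[i] = A[i + 1]; A[i + 1] = tmp;
                swapped = true;
            }
        }
        n--;    // the largest remaining element has bubbled to the end
    }
}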

60

What are the 4 common functions and return types of finite sets of items of a given type X?
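The table image is missing; the four operations are plausibly isEmpty, contains, add, and remove (my guess at the standard set ADT, not the card's table). As a Java interface:

// Hedged reconstruction of the set ADT (original table not shown).
interface FiniteSet<X> {
    boolean isEmpty();      // does the set contain no items?
    boolean contains(X x);  // is x a member of the set?
    void add(X x);          // add x (no effect if already present)
    void remove(X x);       // remove x if present
}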

61

What are the functions and return types of dictionaries (mapping keys of type X to values of type Y)?
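Again the image is missing; a plausible dictionary ADT as a Java interface (method names are assumptions):

// Hedged reconstruction of the dictionary ADT: keys of type X, values of type Y.
interface Dictionary<X, Y> {
    Y get(X key);              // the value bound to key, or null if absent
    void put(X key, Y value);  // add or overwrite the entry for key
    void remove(X key);        // delete the entry for key, if any
    boolean containsKey(X key);
}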

62

What is the pseudocode for BinarySearch(A,key,i,j)?
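The image is missing. Matching the four-argument signature in the question, a hedged recursive sketch in Java (inclusive bounds i..j assumed; compare the iterative sketch at card 4):

// Recursive binary search over the sorted slice A[i..j].
static int binarySearch(int[] A, int key, int i, int j) {
    if (i > j) return -1;                            // empty range: not found
    int mid = i + (j - i) / 2;
    if (A[mid] == key) return mid;
    if (A[mid] < key)
        return binarySearch(A, key, mid + 1, j);     // recurse on right half
    return binarySearch(A, key, i, mid - 1);         // recurse on left half
}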

63

Explain binary search in English

Binary search is an efficient algorithm for finding an item from a sorted list of items. It works by repeatedly dividing in half the portion of the list that could contain the item, until you've narrowed down the possible locations to just one.

64

What is the point of a hash (#) table?

To handle the case where there are K potential keys (huge) but only n keys 'currently in use' (much smaller): store just the active keys in space proportional to n

65

What is the hashing method?

Use a chosen hash function # mapping the set of potential keys s to integers 0, …, m-1 (hash codes), where m \approx n, e.g. #(s) = x(s) mod m for some integer encoding x(s); then use an array A of size m, storing the entry for key s at position #(s) in A
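As a tiny illustration (hedged: x(s) here is just Java's String.hashCode, not necessarily the course's encoding):

// Map a key s to a slot in 0..m-1; floorMod keeps the result non-negative.
static int hash(String s, int m) {
    return Math.floorMod(s.hashCode(), m);   // #(s) = x(s) mod m
}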

66

What is a clash?

A collision: two distinct keys map to the same hash code in 0, …, m-1 when hashing a set

67

What are the two methods to resolve hash clashes?

1) bucket list hash tables

2) open addressing and probing

68

What is a bucket-list hash table?

A hash table where we store a list (“bucket”) of entries for each hash value, external to the hash table

69

What is the load \alpha on the hash table?

the ratio \alpha = \frac{n}{m}, which may be \leq 1 or > 1 (it is not required to stay below 1), where n = number of elements in the hash table and m = size of the hash table (number of slots or linked lists/buckets in the table)

70

What is amortised time?

A way of expressing time complexity when an algorithm is occasionally very expensive but cheap most of the time: the occasional cost is averaged over a long sequence of operations. A good example is an ArrayList, a data structure that contains an array and can be extended.
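A minimal sketch of why extendable arrays give \Theta(1) amortised add (the doubling strategy is assumed here, not stated on the card):

import java.util.Arrays;

// add() is Theta(1) amortised: an occasional call pays Theta(n) to copy
// into a doubled backing array, but n cheap adds happen between copies.
class GrowableArray {
    private int[] data = new int[1];
    private int size = 0;
    void add(int x) {
        if (size == data.length)
            data = Arrays.copyOf(data, 2 * data.length);  // rare, expensive
        data[size++] = x;                                 // common, cheap
    }
}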

71

When do we “expand-and-rehash”?

When we have chosen a desired load factor \alpha and our load factor exceeds this, we can increase the size of the hash table and rehash all the elements into the new, larger hash table

72

Why do we desire a low load factor \alpha?

It ensures that the hash table remains efficient by minimising the likelihood of collisions

73

What is the time complexity for an unsuccessful lookup, assuming that for k not in the table, #(k) is equally likely to be any of the m hash codes in the hash table?

If computing #(k) takes O(1) time, the average time for an unsuccessful lookup is \Theta(\alpha); with a fixed number of slots m, \alpha grows in proportion to n, so this becomes \Theta(n)

74

Assuming that all k keys in a hash table are equally likely to be searched for, what is the average time for a successful lookup operation?

Time complexity in the order of \Theta(\alpha), where \alpha is the load factor

75

What is open addressing?

Storing all items within the hash table itself, rather than keeping bucket lists outside the hash table (as in bucket-list hash tabling)

76

How do we deal with clashes in open addressing?

Don’t use a simple hash function #(k), but a function #(k, i) where 0 \leq i<m

77

What would be the first few choices if hashing to key k in open addressing?

  • 1st choice: #(k, 0)

  • 2nd choice: #(k, 1)

  • etc.

78

How do we insert an item e with key k into an open-addressed hash table?

probe A[#(k, 0)], A[#(k, 1)], . . . until we find a free slot A[#(k, i)], then put e there
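As a sketch (linear probing chosen for concreteness; #(k, i) = (h(k) + i) mod m is one probe sequence among several):

// Insert with open addressing + linear probing.
static void insert(String[] table, String key) {
    int m = table.length;
    int h = Math.floorMod(key.hashCode(), m);
    for (int i = 0; i < m; i++) {
        int slot = (h + i) % m;       // probe A[#(k, i)]
        if (table[slot] == null) {    // free slot found: put the item here
            table[slot] = key;
            return;
        }
    }
    throw new IllegalStateException("table full");
}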

79

How do we lookup an item with key k in an open-addressed hash table?

To look up an item with key k, probe A[#(k, 0)], A[#(k, 1)], . . . until we find either an item with key k, or a free cell (lookup failed).

80

What is the worst-case time complexity for functions lookup/insert/delete when using hash tables?

\Theta(n)

81

If we could avoid clashes entirely, what would worst-case lookup be in hash tables?

\Theta(1)

82

What does it mean in hashing when keys are “static”?

no insert or delete functions required

83

What is only worth doing in the case of static keys when hashing?

finding a perfect hash function (with NO clashes) for this set of keys

84

What are the pros of probing? [2]

  • Functions insert and lookup expect a low number of probes until the hash table is nearly full

    • n_{probes} \approx \frac{1}{1-\alpha} for an unsuccessful lookup

    • less for successful lookup

  • No need for pointers

    • Saves memory which can be ‘spent’ on increasing table size and decreasing load \alpha, leading to faster lookup for same memory

85

What are the cons of probing? [2]

  • the delete operation can be less efficient than in other data structures. It involves either:

    • leaving tombstone markers

    • rehashing the table

  • design of probing functions requires careful consideration (to optimise hash table performance)

    • trade-offs with efficiency, collision resolution, complexity

86

What are some different types of probing?

linear probing, quadratic probing, double hashing

87

Binary tree (representation of sets) - what are the properties of the nodes?

Each node x has a left and right branch, each of which may be null or a pointer to a child node,

AND all values in x's left subtree < x's value < all values in x's right subtree

88

What is the time complexity of the contains function in a tree, balanced/unbalanced?

Contains takes O(log(n)) when the tree is balanced or not too unbalanced; the worst case (a fully unbalanced tree) is \Theta(n)

89

What is the time complexity of the insert function in a tree, balanced/unbalanced?

Insert takes O(log(n)) if the tree is not too unbalanced, \Theta(n) in the worst case

90

What is the depth of a perfectly balanced tree?

d = \lfloor \lg n \rfloor

91

What is a red-black tree?

A self-balancing binary search tree in which every node is coloured either red or black. Red-black trees are guaranteed to be "not too unbalanced".

92

What is the time complexity of all operations in red-black trees? What about the complexity of re-balancing trees?

O(log(n)) for lookup, insert, delete, and rebalancing

93

What are the properties of a red-black tree?

  • Root and all (trivial) leaves are black (trivial leaves are the tiny dots/triangles at the end of the branches)

  • All paths root to leaf (nil descendants) contain same number of black nodes (b, black-length)

  • On a path root to leaf, never have two reds in a row

  • min path length is b, max is 2b-1, so path lengths are under 2\lg(n+1)+1

  • If a node is red, then its kids are black

  • Smaller elements will always be to the left, and bigger to the right, of nodes

94

What is the red-uncle rule?

When inserting a node we colour it red. If the node's parent and uncle are red, colour them black and the grandparent red. This pushes reds UP. We apply the red-uncle rule repeatedly until an endgame scenario is reached.

If uncle is red, recolour parent, grandparent and uncle.

95

How many times should we repeat the red-uncle rule?

Until we reach an endgame scenario

96

What are the endgame scenarios of red-uncle rule when inserting a node?

1: tree is legal, do nothing

Do something:

2: the root is red

3: newNode.uncle.colour == black (TRIANGLE)

4: newNode.uncle.colour == black (LINE)

97

What are the fix-up steps if after insertion of a node in the r-b tree, the root is red?

Colour it black (adds 1 to all black lengths)

98

What are the fix-up steps if the newly inserted node’s uncle is red?

Recolour the parent, grandparent, and uncle

99

What are the fix-up steps if the newly inserted node’s uncle is black (and there’s a triangle between Z, Z’s parent and Z’s grandparent)?

Rotate Z’s parent in the direction of the triangle

100

What are the fix-up steps if the newly inserted node’s uncle is black (and there’s a straight line between Z, Z’s parent and Z’s grandparent)?

Rotate the grandparent (in the "up and over" direction) and recolour the grandparent
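Cards 97-100 amount to the standard insert fix-up loop. A hedged Java-style sketch (the Node class with parent/left/right/colour fields, RED/BLACK constants, root, and rotateLeft/rotateRight helpers are all assumed, not shown in the deck):

// Red-black insert fix-up: z is the newly inserted (red) node.
void insertFixup(Node z) {
    while (z.parent != null && z.parent.colour == RED) {
        Node gp = z.parent.parent;
        boolean parentIsLeft = (z.parent == gp.left);
        Node uncle = parentIsLeft ? gp.right : gp.left;
        if (uncle != null && uncle.colour == RED) {
            // Red-uncle rule: recolour parent, uncle, grandparent; push reds up.
            z.parent.colour = BLACK;
            uncle.colour = BLACK;
            gp.colour = RED;
            z = gp;
        } else {
            if (z == (parentIsLeft ? z.parent.right : z.parent.left)) {
                // Triangle: rotate the parent in the direction of the triangle.
                z = z.parent;
                if (parentIsLeft) rotateLeft(z); else rotateRight(z);
            }
            // Line: recolour, then rotate the grandparent up and over.
            z.parent.colour = BLACK;
            z.parent.parent.colour = RED;
            if (parentIsLeft) rotateRight(z.parent.parent); else rotateLeft(z.parent.parent);
        }
    }
    root.colour = BLACK;   // a red root is simply recoloured black
}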

