DATA Exam 1


75 Terms

1
New cards
Data (Definition)
Information or knowledge represented or coded in some form suitable for better usage or processing.
2
New cards
Data Structure (Definition)
Any data representation and its associated operations.
3
New cards
Algorithm (Definition)
In mathematics and computer science, a self-contained step-by-step set of operations to be performed.
4
New cards
The relationship between Algorithms and Data Structures
They should be thought of as a unit, neither one making sense without the other.
5
New cards

Two key aspects of Analysis of Algorithms

  1. Time analysis: How many instructions does the algorithm execute?
  2. Space analysis: How much memory does the algorithm need to execute?

6
New cards
The type of running time typically focused on in algorithm analysis
Worst-case running time, as it's easier to analyze and provides a bound for all possible inputs.
7
New cards
Primitive Operations (Definition)
Basic computations performed by an algorithm, assumed to take a constant amount of time in the RAM model.
8
New cards

Sum of the first N integers (∑i)

\frac{N(N+1)}{2}. For large N, this is approximately \frac{N^2}{2}.

9
New cards

Sum of the first N terms of a geometric series (∑A^i)

\frac{A^{N}-1}{A-1} (summing A^i for i = 0 to N−1).

10
New cards

Logarithm Definition: log_X(B) = A

X^{A}=B

11
New cards
Logarithm Rule: log_a(AB)

\log_{a}A+\log_{a}B

12
New cards
Logarithm Rule: log_a(A/B)

\log_{a}A-\log_{a}B

13
New cards

Logarithm Rule: log_a(A^n)

n\log_{a}A

14
New cards

Logarithm Change of Base formula: log_A(B)

\frac{\log_2B}{\log_2A}

15
New cards

Big-Oh Notation (f(n)=O(g(n)))

f(n) ≤ g(n). Upper bound (possibly equal). f(n) grows at a rate no faster than g(n).

16
New cards

Big-Oh cheat

f(n) = O(g(n)) == f(n) ≤ g(n)

17
New cards

Formal definition of Big-Oh (f(n)=O(g(n)))

There are positive constants c and n0 such that f(n) ≤ c⋅g(n) for all n ≥ n0

18
New cards

Big-Omega Notation (f(n)=Ω(g(n)))

f(n) ≥ g(n). Lower bound (possibly equal). f(n) grows at a rate no slower than g(n).

19
New cards

Big-Omega cheat

f(n) = Ω(g(n)) == f(n) ≥ g(n)

20
New cards

Formal definition of Big-Omega (f(n)=Ω(g(n)))

There is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c⋅g(n) for all n ≥ n0.

21
New cards
Big-Theta Notation (f(n)=Θ(g(n)))
Growth rates are equal. It's the tightest description.
22
New cards

Big-Theta cheat

f(n) = Θ(g(n)) == f(n) = g(n)

23
New cards

Formal definition of Big-Theta (f(n)=Θ(g(n)))

There are constants c′>0 and c′′>0 and an integer constant n0≥1 such that c′⋅g(n) ≤ f(n) ≤ c′′⋅g(n) for n ≥ n0.

24
New cards
Little-oh Notation (f(n)=o(g(n)))
Upper bound (not equal). f(n) grows strictly slower than g(n).
25
New cards

Little-oh cheat

f(n) = o(g(n)) == f(n) < g(n)

26
New cards

Limit definition for Little-oh (f(n) = o(g(n)))

\lim_{n\to\infty}\frac{f(n)}{g(n)} = 0

27
New cards

Little-Omega cheat

f(n) = ω(g(n)) == f(n) > g(n)

28
New cards

Big-Oh Rule 1 (Addition)

If T1(N)=O(f(N)) and T2(N)=O(g(N)), then T1(N)+T2(N)=max(O(f(N)),O(g(N))). (Drop lower-order terms).

29
New cards

Big-Oh Rule 2 (Multiplication)

If T1(N)=O(f(N)) and T2(N)=O(g(N)), then T1(N)⋅T2(N)=O(f(N)⋅g(N)).

30
New cards

Big-Oh Rule for Polynomials

If T(N) is a polynomial of degree k, then T(N)=Θ(N^k). (Drop lower-order terms and constant factors).

31
New cards
Time Complexity Rule 1: For Loops
Running time = (running time of statements inside the loop) × (number of iterations).
32
New cards
Time Complexity Rule 2: Nested Loops
Analyze inside out. Running time of statements inside × the product of loop sizes. Example: nested loops of size N and M is O(N⋅M).
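The nested-loop rule can be checked empirically by counting how many times the innermost statement runs (a minimal sketch; the class and method names are illustrative):

```java
public class LoopCount {
    // For nested loops of sizes n and m, the O(1) body runs n * m times,
    // so the total work is O(N * M), matching Rule 2.
    public static long count(int n, int m) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < m; j++) {
                ops++; // constant-time body
            }
        }
        return ops;
    }
}
```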
33
New cards

Time Complexity Rule 3: Consecutive Statements

Add them, and the complexity is the maximum of the individual running times.

34
New cards

Time Complexity Rule 4: If/Else

Running time = (running time of the test) + max(running time of S1​, running time of S2​).

35
New cards
Rule for Simple Recursive Function (e.g., Factorial)
O(N) because the function calls itself once and reduces N by 1 each time until the base case.
36
New cards
Rule for Fibonacci-like Recursion (T(N)=T(N−1)+T(N−2)+C)
The number of recursive calls almost doubles with each increase in N, resulting in exponential growth. This is a very slow algorithm.
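The exponential blow-up can be observed by instrumenting the naive recursion with a call counter (a sketch; the counter is for illustration only):

```java
public class Fib {
    public static long calls = 0;

    // Naive recursion matching T(N) = T(N-1) + T(N-2) + C.
    // The recursion tree nearly doubles at each level, so the
    // number of calls grows exponentially in n.
    public static long fib(int n) {
        calls++;
        if (n <= 1) return n;
        return fib(n - 1) + fib(n - 2);
    }
}
```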
37
New cards
List ADT Operations
printList, makeEmpty, find, insert (at position), remove (at position), findKth.
38
New cards
ArrayList - Underlying implementation
A growable array implementation of List.
39
New cards
LinkedList - Underlying implementation
A doubly linked list implementation.
40
New cards
ArrayList: get() and set() complexity
O(1) (Constant Time).
41
New cards
LinkedList: get() and set() complexity
O(N) (Linear Time).
42
New cards
ArrayList: add/remove at end complexity
O(1) (Constant Time).
43
New cards
LinkedList: add/remove at end complexity
O(1) (Constant Time).
44
New cards
ArrayList: add/remove at front complexity
O(N) (Linear Time) because elements must be shifted.
45
New cards
LinkedList: add/remove at front complexity
O(1) (Constant Time).
46
New cards
Advantage of ArrayList
Most efficient if you need random access through an index without frequent insertions/removals, except at the end.
47
New cards
Advantage of LinkedList
Best if your application requires frequent insertion or deletion of elements from the beginning or middle.
48
New cards
Stack ADT definition
A list with the restriction that insertions and deletions can only happen at one end, called the top.
49
New cards
Stack fundamental scheme
Last In, First Out (LIFO).
50
New cards

Main Stack Operations

  1. Push: Insertion. 2. Pop: Deletion (removes and returns top element). 3. Top (or peek): Returns top element without removing it.

51
New cards
Time complexity for all main Array-based Stack operations (push, pop, top)
O(1) (Constant Time).
52
New cards
Application of Stack: Balancing Symbols
Used to check if every right symbol (brace, bracket, parenthesis) corresponds to its left counterpart.
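A minimal Java sketch of stack-based symbol balancing (class name is illustrative; `ArrayDeque` is used as the stack):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class Balance {
    // Push each left symbol; on a right symbol, the popped top
    // must be its matching left counterpart.
    public static boolean isBalanced(String s) {
        Deque<Character> stack = new ArrayDeque<>();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '(': case '[': case '{':
                    stack.push(c);
                    break;
                case ')': case ']': case '}':
                    if (stack.isEmpty()) return false;
                    char open = stack.pop();
                    if ((c == ')' && open != '(') ||
                        (c == ']' && open != '[') ||
                        (c == '}' && open != '{')) return false;
                    break;
                default:
                    break; // ignore non-symbol characters
            }
        }
        return stack.isEmpty(); // leftover left symbols mean unbalanced
    }
}
```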
53
New cards
Application of Stack: Evaluating Postfix Expressions (Algorithm logic)
When an operand is seen, push it onto the stack. When an operator is seen, pop two operands, apply the operator, and push the result back onto the stack.
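The postfix algorithm can be sketched for space-separated integer expressions (class name and token format are assumptions for the sketch):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class Postfix {
    // Operands are pushed; an operator pops two operands,
    // applies itself, and pushes the result back.
    public static int eval(String expr) {
        Deque<Integer> stack = new ArrayDeque<>();
        for (String tok : expr.trim().split("\\s+")) {
            switch (tok) {
                case "+": case "-": case "*": case "/":
                    int b = stack.pop(); // right operand (pushed last)
                    int a = stack.pop(); // left operand
                    stack.push(apply(tok, a, b));
                    break;
                default:
                    stack.push(Integer.parseInt(tok));
            }
        }
        return stack.pop(); // final result is the lone stack element
    }

    private static int apply(String op, int a, int b) {
        switch (op) {
            case "+": return a + b;
            case "-": return a - b;
            case "*": return a * b;
            default:  return a / b;
        }
    }
}
```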
54
New cards
Application of Stack: Infix to Postfix Conversion (Parentheses rule)
If a right parenthesis is seen, pop the stack, writing symbols to output until the left parenthesis is seen, which is popped but not output.
55
New cards
Application of Stack: Method Calls (JVM)
The Java Virtual Machine (JVM) uses a stack to keep track of active methods, pushing a frame when a method is called and popping it when the method ends.
56
New cards
How to implement Two Stacks in One Array to maximize space utilization
Have one stack grow from the left to the right and the other from the right to the left. They overflow only when their top elements are adjacent.
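One possible sketch of the two-stacks-in-one-array scheme (the field and method names are illustrative):

```java
public class TwoStacks {
    private final int[] data;
    private int top1; // next free slot for stack 1, grows rightward
    private int top2; // next free slot for stack 2, grows leftward

    public TwoStacks(int capacity) {
        data = new int[capacity];
        top1 = 0;
        top2 = capacity - 1;
    }

    // Overflow happens only when the two tops meet in the middle.
    public boolean isFull() { return top1 > top2; }

    public void push1(int x) {
        if (isFull()) throw new IllegalStateException("overflow");
        data[top1++] = x;
    }

    public void push2(int x) {
        if (isFull()) throw new IllegalStateException("overflow");
        data[top2--] = x;
    }

    public int pop1() { return data[--top1]; }
    public int pop2() { return data[++top2]; }
}
```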
57
New cards
Queue ADT definition
A data structure where insertions and deletions follow the First-In, First-Out (FIFO) scheme.
58
New cards
Queue operations: enqueue and dequeue
enqueue(o): Inserts an element at the rear of the queue. dequeue(): Removes and returns the element at the front of the queue.
59
New cards
Implementation using a fixed-size array is called a Circular Queue because...
The indices wrap around: if the back index reaches the end of the array, it moves to the beginning.
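A minimal circular-queue sketch showing the wrap-around with modular arithmetic (class name and capacity handling are illustrative):

```java
public class CircularQueue {
    private final int[] data;
    private int front = 0; // index of the front element
    private int size  = 0; // number of stored elements

    public CircularQueue(int capacity) { data = new int[capacity]; }

    // The back index wraps around the end of the array via %.
    public void enqueue(int x) {
        if (size == data.length) throw new IllegalStateException("full");
        int back = (front + size) % data.length;
        data[back] = x;
        size++;
    }

    public int dequeue() {
        if (size == 0) throw new IllegalStateException("empty");
        int x = data[front];
        front = (front + 1) % data.length;
        size--;
        return x;
    }
}
```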
60
New cards
Time complexity for all main Array-based (Circular) Queue operations (enqueue, dequeue)
O(1) (Constant Time).
61
New cards
Application of Queue: Jobs to a printer
Jobs are arranged and processed in order of arrival (FIFO).
62
New cards
Application of Queue: Round Robin Schedulers
A round robin scheduler can be implemented using a queue to repeatedly dequeue, service, and then enqueue the element.
63
New cards
How to implement a Queue using Two Stacks (enqueue operation)
Push the new element onto Stack1.
64
New cards
How to implement a Queue using Two Stacks (dequeue operation)
If Stack2 is empty, pop all elements from Stack1 and push them onto Stack2. Then, pop and return the top element from Stack2. (Each element is pushed at most twice and popped at most twice, for an O(1) amortized cost per operation).
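The two preceding cards can be sketched together in Java (class name is illustrative; `ArrayDeque` plays the role of each stack):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class QueueFromStacks {
    private final Deque<Integer> in  = new ArrayDeque<>(); // Stack1
    private final Deque<Integer> out = new ArrayDeque<>(); // Stack2

    // enqueue: push the new element onto Stack1.
    public void enqueue(int x) { in.push(x); }

    // dequeue: refill Stack2 from Stack1 only when Stack2 is empty,
    // which reverses the elements into FIFO order. Each element is
    // pushed and popped at most twice, so the amortized cost is O(1).
    public int dequeue() {
        if (out.isEmpty()) {
            while (!in.isEmpty()) out.push(in.pop());
        }
        return out.pop();
    }
}
```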
65
New cards
Binary Tree
A tree in which no node can have more than two children.
66
New cards
Binary Search Tree (BST)
A binary tree where for every node X: 1. Values in the left subtree are smaller than X. 2. Values in the right subtree are larger than X.
67
New cards
Inorder Traversal (for binary trees)
Left child, Parent, Right child. Used to retrieve elements in sorted order from a BST.
68
New cards
Preorder Traversal
Parent, Left child, Right child. Used to print a directory (parent name before children).
69
New cards
Postorder Traversal
Left child, Right child, Parent. Used for processing children before the parent, such as summing file sizes.
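The three traversal orders above can be sketched on a small binary tree (class and method names are illustrative):

```java
import java.util.List;

public class Traversals {
    static class Node {
        int value;
        Node left, right;
        Node(int value, Node left, Node right) {
            this.value = value; this.left = left; this.right = right;
        }
    }

    // Inorder: Left, Parent, Right — sorted order for a BST.
    static void inorder(Node n, List<Integer> out) {
        if (n == null) return;
        inorder(n.left, out);
        out.add(n.value);
        inorder(n.right, out);
    }

    // Preorder: Parent, Left, Right.
    static void preorder(Node n, List<Integer> out) {
        if (n == null) return;
        out.add(n.value);
        preorder(n.left, out);
        preorder(n.right, out);
    }

    // Postorder: Left, Right, Parent.
    static void postorder(Node n, List<Integer> out) {
        if (n == null) return;
        postorder(n.left, out);
        postorder(n.right, out);
        out.add(n.value);
    }
}
```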
70
New cards
AVL Tree
A Binary Search Tree with a balance condition. The height of the left and right subtree of any node can differ by at most 1.
71
New cards
Average running time for all BST operations (contains, insert, remove, etc.)
O(log N) for an average tree. For an AVL tree, all operations are guaranteed to be O(log N).
72
New cards
Worst-case height of a standard BST
Can be as large as N−1 (if inserts result in a skewed list).
73
New cards
Single Rotation (in AVL Tree)
Performed to restore balance for "left-left" or "right-right" imbalance cases.
74
New cards
Double Rotation (in AVL Tree)
Performed to restore balance for "left-right" or "right-left" imbalance cases.
75
New cards
Strategy for deleting a node with two children in a BST
Replace the node with the smallest data in its right subtree and recursively delete that node.
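The two-children deletion strategy can be sketched as follows (a minimal BST with int keys; class and method names are illustrative):

```java
public class BstDelete {
    static class Node {
        int value;
        Node left, right;
        Node(int value) { this.value = value; }
    }

    public static Node insert(Node n, int x) {
        if (n == null) return new Node(x);
        if (x < n.value) n.left = insert(n.left, x);
        else if (x > n.value) n.right = insert(n.right, x);
        return n;
    }

    // Two-children case: copy the smallest value in the right subtree
    // into this node, then recursively delete that value from the
    // right subtree. Zero/one-child cases splice the node out.
    public static Node remove(Node n, int x) {
        if (n == null) return null;
        if (x < n.value) n.left = remove(n.left, x);
        else if (x > n.value) n.right = remove(n.right, x);
        else if (n.left != null && n.right != null) {
            n.value = findMin(n.right).value;
            n.right = remove(n.right, n.value);
        } else {
            n = (n.left != null) ? n.left : n.right;
        }
        return n;
    }

    private static Node findMin(Node n) {
        while (n.left != null) n = n.left;
        return n;
    }
}
```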