C949 V4 Comprehensive Study Guide for Computer Science Concepts


1
New cards

Finiteness

An algorithm must always have a finite number of steps before it ends. When the operation is finished, it must have a defined endpoint or output and not enter an endless loop.

2
New cards

Definiteness

An algorithm needs to have exact definitions for each step. Clear and straightforward directions ensure that every step is understood and can be taken easily.

3
New cards

Input

An algorithm requires one or more inputs. The values that are first supplied to the algorithm before its processing are known as inputs. These inputs come from a predetermined range of acceptable values.

4
New cards

Output

One or more outputs must be produced by an algorithm. The output is the outcome of the algorithm after every step has been completed. The relationship between the input and the result should be clear.

5
New cards

Effectiveness

An algorithm's steps must be basic enough to be carried out in a finite amount of time using fundamental operations. Every operation in the algorithm should be feasible and practical with the resources at hand.

6
New cards

Generality

Rather than being limited to a single particular case, an algorithm should be able to solve a whole class of problems. It should offer a general solution that handles any input within a predetermined range or domain.

7
New cards

Modularity

An algorithm is modular when the problem it solves can be broken down into small, self-contained modules or steps. Decomposing a problem into such steps is essentially what defining an algorithm means.

8
New cards

Correctness

An algorithm is correct when the given inputs produce the desired output for every valid case, indicating that the algorithm was designed as specified. Verifying this is the goal of analyzing the algorithm.

9
New cards

Maintainability

The algorithm should be designed in a straightforward, structured way so that it can later be revised or redefined without requiring significant changes.

10
New cards

Functionality

Functionality means the algorithm covers the various logical steps needed to solve a real-world problem.

11
New cards

Robustness

Robustness refers to an algorithm's ability to handle unexpected, invalid, or edge-case inputs gracefully rather than failing.

12
New cards

User-friendly

An algorithm should be easy to follow; if it is difficult to understand, the designer will struggle to explain it to the programmer.

13
New cards

Simplicity

A simple algorithm is easier to understand, implement, and debug.

14
New cards

Extensibility

The algorithm should be designed so that other algorithm designers or programmers can extend and reuse it.

15
New cards

Brute Force Algorithm:

A straightforward approach that exhaustively tries all possible solutions, suitable for small problem instances but may become impractical for larger ones due to its high time complexity.
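
For instance, a minimal Python sketch of brute force on a toy pair-sum problem (the function name and data are illustrative):

def has_pair_with_sum(arr, target):
    # Exhaustively try every pair of indices
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] + arr[j] == target:
                return True
    return False  # No pair sums to the target

print(has_pair_with_sum([3, 8, 4, 1], 9))  # True (8 + 1)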

16
New cards

Recursive Algorithm:

A method that breaks a problem into smaller, similar subproblems and repeatedly applies itself to solve them until reaching a base case, making it effective for tasks with recursive structures.

17
New cards

Encryption Algorithm:

Utilized to transform data into a secure, unreadable form using cryptographic techniques, ensuring confidentiality and privacy in digital communications and transactions.
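
As a toy illustration of the transform-to-unreadable idea only (a Caesar shift; real systems use vetted cryptographic algorithms such as AES, not this):

def caesar_encrypt(text, shift):
    # Shift each letter by a fixed amount, wrapping around the alphabet
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

print(caesar_encrypt("Hello", 3))  # Khoor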

18
New cards

Backtracking Algorithm:

A trial-and-error technique used to explore potential solutions by undoing choices when they lead to an incorrect outcome, commonly employed in puzzles and optimization problems.
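
A minimal Python sketch on a toy subset-sum problem (names and data are illustrative): each number is tried, and the choice is undone when it cannot lead to the target:

def subset_sum(nums, target, i=0, chosen=None):
    if chosen is None:
        chosen = []
    if target == 0:
        return list(chosen)        # Found a combination that works
    if i == len(nums) or target < 0:
        return None                # Dead end
    chosen.append(nums[i])         # Try including nums[i]
    result = subset_sum(nums, target - nums[i], i + 1, chosen)
    if result is not None:
        return result
    chosen.pop()                   # Backtrack: undo the choice
    return subset_sum(nums, target, i + 1, chosen)

print(subset_sum([3, 5, 2, 8], 10))  # [3, 5, 2]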

19
New cards

Searching Algorithm:

Designed to find a specific target within a dataset, enabling efficient retrieval of information from sorted or unsorted collections.

20
New cards

Sorting Algorithm:

Aimed at arranging elements in a specific order, like numerical or alphabetical, to enhance data organization and retrieval.

21
New cards

Hashing Algorithm:

Converts data into a fixed-size hash value, enabling rapid data access and retrieval in hash tables, commonly used in databases and password storage.
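
A minimal Python sketch of the idea with a toy modular hash and chained buckets (Python's built-in dict does this far more efficiently in practice):

def simple_hash(key, table_size):
    # Map a string key to a bucket index in a fixed range
    return sum(ord(ch) for ch in key) % table_size

table = [[] for _ in range(8)]   # 8 buckets; lists handle collisions

def put(key, value):
    table[simple_hash(key, 8)].append((key, value))

def get(key):
    for k, v in table[simple_hash(key, 8)]:
        if k == key:
            return v
    return None

put("alice", 42)
print(get("alice"))  # 42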

22
New cards

Divide and Conquer Algorithm:

Breaks a complex problem into smaller subproblems, solves them independently, and then combines their solutions to address the original problem effectively.
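
A minimal Python sketch of the split-solve-combine pattern, using the simple task of finding a maximum (illustrative only):

def dc_max(arr, lo, hi):
    if lo == hi:
        return arr[lo]                    # Base case: single element
    mid = (lo + hi) // 2
    left = dc_max(arr, lo, mid)           # Solve the left half
    right = dc_max(arr, mid + 1, hi)      # Solve the right half
    return max(left, right)               # Combine the answers

print(dc_max([4, 9, 1, 7], 0, 3))  # 9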

23
New cards

Greedy Algorithm:

Makes locally optimal choices at each step in the hope of finding a global optimum, useful for optimization problems but may not always lead to the best solution.
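
A minimal Python sketch for making change (assumes a canonical coin system like US coins; with other denominations the greedy choice can miss the optimum, per the caveat above):

def greedy_change(amount, coins=(25, 10, 5, 1)):
    result = []
    for coin in coins:            # Coins in descending order
        while amount >= coin:     # Always take the largest coin that fits
            result.append(coin)
            amount -= coin
    return result

print(greedy_change(41))  # [25, 10, 5, 1]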

24
New cards

Dynamic Programming Algorithm:

Stores and reuses intermediate results to avoid redundant computations, enhancing the efficiency of solving complex problems.
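
A minimal Python sketch using Fibonacci numbers, where memoization stores each intermediate result so it is computed only once:

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n                        # Base cases
    return fib(n - 1) + fib(n - 2)      # Subresults come from the cache

print(fib(50))  # 12586269025, found without redundant recomputation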

25
New cards

Randomized Algorithm:

Utilizes randomness in its steps to achieve a solution, often used in situations where an approximate or probabilistic answer suffices.
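
A minimal Python sketch of a Monte Carlo method: estimating pi from random points, a case where an approximate answer suffices:

import random

def estimate_pi(samples=100_000):
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1:       # Point lands inside the quarter circle
            inside += 1
    return 4 * inside / samples      # Ratio of areas approximates pi/4

print(estimate_pi())  # Roughly 3.14; varies from run to run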

26
New cards

Recursive algorithms

Recursive algorithms are a fundamental concept in computer science, particularly in the study of data structures and algorithms. A recursive algorithm is one that solves a problem by breaking it down into smaller instances of the same problem, which it then solves in the same way. This process continues until the problem is reduced to a base case, which is solved directly without further recursion.

27
New cards

Key Concepts of Recursive Algorithms

Base Case: This is the condition under which the recursion stops. It represents the simplest instance of the problem, which can be solved directly without further recursion.

Recursive Case: This is the part of the algorithm that breaks the problem down into smaller instances of the same problem and then calls the algorithm recursively on these smaller instances.

Stack: Each recursive call is placed on the system call stack. When the base case is reached, the stack begins to unwind as each instance of the function returns its result.

28
New cards

Base Case:

This is the condition under which the recursion stops. It represents the simplest instance of the problem, which can be solved directly without further recursion.

29
New cards

Recursive Case:

This is the part of the algorithm that breaks the problem down into smaller instances of the same problem and then calls the algorithm recursively on these smaller instances.

30
New cards

Stack:

Each recursive call is placed on the system call stack. When the base case is reached, the stack begins to unwind as each instance of the function returns its result.

31
New cards

Example: Factorial Calculation

The factorial of a number n (denoted as n!) is a classic example of a recursive algorithm. The factorial is defined as:

0! = 1 (Base Case)

n! = n * (n - 1)! for n > 0 (Recursive Case)

Here's how it looks in code:

def factorial(n):
    if n == 0:  # Base Case
        return 1
    else:  # Recursive Case
        return n * factorial(n - 1)

How It Works:

Base Case: When n is 0, the function returns 1.

Recursive Case: For any other value of n, the function calls itself with n−1 and multiplies the result by n.

For example, calling factorial(3) would work as follows:

factorial(3) calls factorial(2)

factorial(2) calls factorial(1)

factorial(1) calls factorial(0)

factorial(0) returns 1, then:

factorial(1) returns 1 * 1 = 1

factorial(2) returns 2 * 1 = 2

factorial(3) returns 3 * 2 = 6

32
New cards

Advantages of Recursion

Simplicity:

Recursive solutions are often more elegant and easier to understand than their iterative counterparts.

33
New cards

Advantages of Recursion

Direct Translation:

Some problems are naturally recursive, like tree traversals, making recursion the most straightforward approach.

34
New cards

Disadvantages of Recursion

Performance:

Recursive algorithms can be less efficient due to the overhead of multiple function calls and potential stack overflow issues for deep recursion.

35
New cards

Disadvantages of Recursion

Memory Usage:

Recursion can consume more memory because each function call adds a new frame to the call stack.

36
New cards

When to Use Recursion:

When a problem can naturally be divided into similar sub-problems (e.g., tree traversal, searching algorithms like binary search).

When the recursive solution is significantly simpler or more intuitive than an iterative one.

37
New cards

Linear Search

Concept:

Linear search is the simplest search algorithm.

It works by sequentially checking each element of the array or list until the target element is found or the end of the collection is reached.

Algorithm:

Start from the first element of the array.

Compare the current element with the target element.

If they match, return the index of the element.

If they don't match, move to the next element and repeat the process.

If the target element is not found by the end of the array, return a "not found" indication.

Time Complexity: O(n), where n is the number of elements in the array. This is because in the worst case, the algorithm may need to check every element in the array.

When to Use:

When the array or list is small.

When the array is unsorted.

When simplicity is more important than performance.

Example:

def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1  # Return -1 if the element is not found

38
New cards

Binary Search

Concept:

Binary search is much more efficient than linear search but requires the array or list to be sorted.

It works by repeatedly dividing the search interval in half. If the target value is less than the middle element, the search continues in the left half, otherwise in the right half.

Algorithm:

Start with two pointers, one at the beginning (low) and one at the end (high) of the sorted array.

Find the middle element of the current interval.

Compare the middle element with the target:

If they match, return the index of the middle element.

If the target is less than the middle element, repeat the search on the left half.

If the target is greater, repeat the search on the right half.

If the interval becomes invalid (low > high), return a "not found" indication.

Time Complexity: O(log n), where n is the number of elements in the array. This logarithmic time complexity makes binary search significantly faster than linear search for large datasets.

When to Use:

When the array or list is sorted.

When the array is large and efficiency is crucial.

Example:

def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # Return -1 if the element is not found

39
New cards

Step-By-Step Guide To Figuring Out The Array Elements Corresponding To The Mid-Values In The First And Second Iterations Of A Binary Search

arr = {45, 77, 89, 90, 94, 99, 100} and key = 100

Initial Setup:

The array arr is {45, 77, 89, 90, 94, 99, 100}.

The key to find is 100.

Initialize two pointers: low (start of the array) and high (end of the array).

First Iteration:

Calculate the middle index mid using the formula: mid = (low + high) / 2, rounding down to a whole index.

Check the value at arr[mid].

Compare arr[mid] with the key:

If arr[mid] is less than key, update low to mid + 1.

If arr[mid] is greater than key, update high to mid - 1.

If arr[mid] is equal to key, you have found the key (though you won't need a second iteration in this case).

Second Iteration:

Repeat the calculation for mid with the updated low and high values.

Again, compare arr[mid] with the key and update low or high accordingly.
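
Working this through for the given array (indices 0-6):

First iteration: low = 0, high = 6, so mid = (0 + 6) / 2 = 3 and arr[3] = 90. Since 90 < 100, low becomes 4.

Second iteration: low = 4, high = 6, so mid = (4 + 6) / 2 = 5 and arr[5] = 99. Since 99 < 100, low becomes 6.

So the mid-value elements in the first and second iterations are 90 and 99. (A third iteration with low = high = 6 gives arr[6] = 100, which matches the key.)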

40
New cards

Linear Search

Time Complexity: O(n), where n is the number of elements.

41
New cards

Linear Search

Use Case: Best used when the list is small or unsorted.

42
New cards

Linear Search

Distinct Characteristics:

Simple, sequential search.

Checks each element one by one.

Works on both sorted and unsorted data.

43
New cards

Linear Search Worst, Average and Best

Best Case: O(1) — The target element is the first element.

Average Case: O(n) — The target element is somewhere in the middle or not in the array.

Worst Case: O(n) — The target element is the last element or not present.

44
New cards

Binary Search

Concept: Binary search operates on a sorted list. It repeatedly divides the search interval in half until the target element is found or the interval is empty.

45
New cards

Binary Search

Time Complexity: O(log n)

46
New cards

Binary Search

Use Case: Ideal for large, sorted datasets.

47
New cards

Binary Search

Distinct Characteristics:

Requires a sorted array.

Divides the search interval in half repeatedly.

Efficient, logarithmic time complexity.

48
New cards

Binary Search Worst, Average and Best

Best Case: O(1) — The target element is the middle element.

Average Case: O(log n) — The target element is not immediately found but is within the sorted array.

Worst Case: O(log n) — The target element is at the extreme ends or not present.

49
New cards

Interpolation Search

Concept: Similar to binary search but works on uniformly distributed data. It estimates the position of the target element based on the value.
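
A minimal Python sketch (illustrative; assumes a sorted array of numbers):

def interpolation_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high and arr[low] <= target <= arr[high]:
        if arr[low] == arr[high]:
            return low if arr[low] == target else -1
        # Probe where the value "should" be, not the midpoint
        pos = low + (target - arr[low]) * (high - low) // (arr[high] - arr[low])
        if arr[pos] == target:
            return pos
        elif arr[pos] < target:
            low = pos + 1
        else:
            high = pos - 1
    return -1  # Not found

print(interpolation_search([10, 20, 30, 40, 50], 40))  # 3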

50
New cards

Interpolation Search

Time Complexity: O(log log n) in the best case, O(n) in the worst case.

51
New cards

Interpolation Search

Use Case: Effective when the data is uniformly distributed.

52
New cards

Interpolation Search

Best Case: O(1) — The target element is exactly where the interpolation suggests.

Average Case: O(log log n) — Uniformly distributed data.

Worst Case: O(n) — Highly skewed data distribution or worst interpolation.

53
New cards

Depth-First Search (DFS) and Breadth-First Search (BFS)

Concept: Used primarily in graph and tree data structures. DFS explores as far as possible along one branch before backtracking, while BFS explores all neighbors at the present depth before moving on to nodes at the next depth level.
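
A minimal Python sketch of both traversals on a small adjacency-list graph (the graph itself is illustrative):

from collections import deque

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}

def dfs(node, visited=None):
    if visited is None:
        visited = set()
    visited.add(node)
    print(node, end=' ')             # Go as deep as possible first
    for neighbor in graph[node]:
        if neighbor not in visited:
            dfs(neighbor, visited)

def bfs(start):
    visited, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()       # Visit level by level
        print(node, end=' ')
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)

dfs('A')   # A B D C
print()
bfs('A')   # A B C D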

54
New cards

Depth-First Search (DFS) and Breadth-First Search (BFS)

Time Complexity: O(V+E), where V is the number of vertices and E is the number of edges.

55
New cards

Depth-First Search (DFS) and Breadth-First Search (BFS)

Use Case: Useful for searching nodes in graphs and trees.

56
New cards

Depth-First Search (DFS) Worst, Average and Best

Best Case: O(1) — The target node is found immediately.

Average Case: O(V+E) — Typically when all nodes and edges must be explored.

Worst Case: O(V+E) — The target node is the last one discovered.

57
New cards

Breadth-First Search (BFS)

Best Case: O(1) — The target node is the root or the first node checked.

Average Case: O(V+E) — All nodes and edges need to be explored.

Worst Case: O(V+E) — The target node is the last one explored.

58
New cards

Sorting Algorithms

Sorting algorithms organize data in a particular order (usually ascending or descending). This makes searching and other operations more efficient.

59
New cards

Bubble Sort

Use Case: Simple but inefficient for large datasets. Best used for educational purposes or small lists.

60
New cards

Bubble Sort

Distinct Characteristics:

Repeatedly swaps adjacent elements if they are in the wrong order.

Simple, but inefficient for large datasets.

"Bubbles" the largest element to the end of the list.

61
New cards

Bubble Sort

Bubble Sort Worst, Average and Best

Best Case: O(n) — The array is already sorted (with an optimized version that stops early).

Average Case: O(n^2) — Average case with random elements.

Worst Case: O(n^2) — The array is sorted in reverse order.

62
New cards

Bubble Sort

Bubble: Look for something that swaps so the result can "bubble" to the top. (Swap, Exchange)

63
New cards

Selection Sort

Use Case: Inefficient for large lists, but useful when memory writes are more expensive than comparisons.

64
New cards

Selection Sort

Distinct Characteristics:

Finds the minimum element and swaps it with the first unsorted element.

Reduces the problem size by one in each iteration.

Always performs O(n^2) comparisons, regardless of input.
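
A minimal Python sketch (illustrative):

def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):    # Find the minimum of the unsorted part
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]  # One swap per pass
    return arr

print(selection_sort([5, 1, 4, 2]))  # [1, 2, 4, 5]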

65
New cards

Selection Sort

Selection Sort Worst, Average and Best

Best Case: O(n^2) — Selection sort does not improve with better input; it is always O(n^2).

Average Case: O(n^2) — Average case with random elements.

Worst Case: O(n^2) — Selection sort is insensitive to input order.

66
New cards

Selection Sort

Selection: Look for code that repeatedly finds the minimum (or maximum) element and moves it to the beginning (or end) of the list. (Select minimum, Swap with start)

67
New cards

Insertion Sort

Use Case: Good for small or nearly sorted lists.

68
New cards

Insertion Sort

Distinct Characteristics:

Builds a sorted list one element at a time.

Efficient for small or nearly sorted datasets.

Shifts elements to make space for the current element.
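
A minimal Python sketch (illustrative):

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]                    # Next element to place
        j = i - 1
        while j >= 0 and arr[j] > key:  # Shift larger elements right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                # Insert into the open slot
    return arr

print(insertion_sort([5, 1, 4, 2]))  # [1, 2, 4, 5]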

69
New cards

Insertion Sort

Insertion Sort Worst, Average and Best

Best Case: O(n) — The array is already sorted.

Average Case: O(n^2) — Average case with random elements.

Worst Case: O(n^2) — The array is sorted in reverse order.

70
New cards

Insertion:

Look for code that builds a sorted portion of the list one element at a time by inserting each new element into its correct position within the already-sorted part. (Insert, Shift Element)

71
New cards

Merge Sort

Use Case: Efficient and stable; good for large datasets.

72
New cards

Merge Sort

Distinct Characteristics:

Divides the list into halves, sorts each half, and then merges them.

Stable and efficient for large datasets.

Requires additional space for merging.
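
A minimal Python sketch (illustrative; the extra lists are the additional space mentioned above):

def merge_sort(arr):
    if len(arr) <= 1:
        return arr                       # Base case
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])         # Sort each half
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps the sort stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:] # Append whichever half remains

print(merge_sort([5, 1, 4, 2]))  # [1, 2, 4, 5]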

73
New cards

Merge Sort Worst, Average and Best

Best Case: O(n log n) — Merge sort's time complexity is the same in all cases.

Average Case: O(n log n).

Worst Case: O(n log n).

74
New cards

Merge:

Look for something that continually splits a list in half. (Merge, Split)

75
New cards

Quicksort

Use Case: Often faster in practice than merge sort, but not stable.

76
New cards

Quicksort

Distinct Characteristics:

Selects a "pivot" element and partitions the array around it.

Recursively sorts the partitions.

Efficient, but can degrade to O(n^2) if poor pivot selection occurs.
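
A minimal Python sketch (illustrative; an out-of-place version for clarity, whereas production quicksort usually partitions in place):

def quicksort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]                 # Middle element as the pivot
    less = [x for x in arr if x < pivot]       # Partition around the pivot
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 1, 4, 2]))  # [1, 2, 4, 5]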

77
New cards

Quicksort Worst, Average and Best

Best Case: O(n log n) — The pivot splits the array into two nearly equal halves.

Average Case: O(n log n) — Average case with random pivots.

Worst Case: O(n^2) — The pivot is always the smallest or largest element, leading to unbalanced partitions.

78
New cards

Quicksort

Look for the keywords "pivot" and/or "split". (Pivot, Split)

79
New cards

Heap Sort

Use Case: Useful when memory usage is a concern as it's an in-place algorithm.

80
New cards

Heap Sort

Distinct Characteristics:

Utilizes a binary heap data structure.

Builds a max-heap and repeatedly extracts the maximum element.

Efficient and in-place, but not stable.
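
A minimal Python sketch (illustrative), building a max-heap in place and repeatedly extracting the maximum:

def heap_sort(arr):
    n = len(arr)

    def sift_down(root, end):
        # Push arr[root] down until the max-heap property holds
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1               # Pick the larger child
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):  # Build the max-heap
        sift_down(i, n)
    for end in range(n - 1, 0, -1):      # Move the max to the end, shrink the heap
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr

print(heap_sort([5, 1, 4, 2]))  # [1, 2, 4, 5]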

81
New cards

Heap Sort Worst, Average and Best

Best Case: O(n log n) — Heap sort's time complexity is the same in all cases.

Average Case: O(n log n).

Worst Case: O(n log n).

82
New cards

Heap Sort:

Look for code that uses a heap data structure to repeatedly extract the maximum (or minimum) element and rebuilds the heap. (Heapify, Extract Max, Build Heap)

83
New cards

Counting Sort:

Use Case: Efficient for sorting integers or other items with a small range of possible values.

84
New cards

Counting Sort:

Distinct Characteristics:

Non-comparative sorting.

Counts occurrences of each element and uses this information to place elements.

Efficient for small ranges of integers.
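
A minimal Python sketch (illustrative; assumes non-negative integers below a known bound k):

def counting_sort(arr, k):
    counts = [0] * k
    for x in arr:
        counts[x] += 1                 # Count occurrences of each value
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)    # Emit each value count times
    return out

print(counting_sort([3, 1, 3, 0, 2], 4))  # [0, 1, 2, 3, 3]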

85
New cards

Counting Sort Worst, Average and Best

Best Case: O(n+k) — k is the range of the input.

Average Case: O(n+k).

Worst Case: O(n+k).

86
New cards

Radix Sort

Use Case: Effective for sorting large numbers or strings with a fixed length.

87
New cards

Radix Sort

Distinct Characteristics:

Sorts numbers by processing individual digits.

Non-comparative, stable, and efficient for specific data types.

Often combined with counting sort.
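
A minimal Python sketch (illustrative; LSD order on non-negative integers, using stable per-digit buckets in place of counting sort):

def radix_sort(arr):
    digits = len(str(max(arr)))                      # Passes = digits in the largest number
    for d in range(digits):
        buckets = [[] for _ in range(10)]
        for x in arr:
            buckets[(x // 10 ** d) % 10].append(x)   # Stable bucket by the d-th digit
        arr = [x for bucket in buckets for x in bucket]
    return arr

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))  # [2, 24, 45, 66, 75, 90, 170, 802]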

88
New cards

Radix Sort Worst, Average and Best

Best Case: O(n*k) — k is the number of digits in the largest number.

Average Case: O(n*k).

Worst Case: O(n*k).

89
New cards

Radix Sort:

Look for code that sorts numbers based on their individual digits, starting from the least significant digit (LSD) or the most significant digit (MSD). (Count, Frequency, Sum)

90
New cards

Bucket Sort

Use Case: Good for uniformly distributed data.

91
New cards

Bucket Sort

Distinct Characteristics:

Distributes elements into buckets and sorts each bucket individually.

Efficient when the input is uniformly distributed.

Often combined with another sorting algorithm like insertion sort.
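
A minimal Python sketch (illustrative; assumes values uniformly distributed in [0, 1)):

def bucket_sort(arr, num_buckets=4):
    buckets = [[] for _ in range(num_buckets)]
    for x in arr:
        buckets[int(x * num_buckets)].append(x)   # Spread values across buckets
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))             # Sort each bucket individually
    return result

print(bucket_sort([0.42, 0.31, 0.91, 0.07]))  # [0.07, 0.31, 0.42, 0.91]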

92
New cards

Bucket Sort Worst, Average and Best

Best Case: O(n+k) — k is the number of buckets; assumes uniform distribution.

Average Case: O(n+k).

Worst Case: O(n^2) — All elements end up in one bucket (degenerate case).

93
New cards

Bucket

Look for something that distributes the values into "buckets" where they are individually sorted. (Bucket)

94
New cards

Shell Sort

Distinct Characteristics:

Generalization of insertion sort with a gap sequence.

Sorts elements far apart and gradually reduces the gap.

Efficient for medium-sized datasets.

Time Complexity: Depends on the gap sequence; commonly O(n^(3/2)).
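
A minimal Python sketch (illustrative), using the original Shell sequence where the gap is halved each round:

def shell_sort(arr):
    gap = len(arr) // 2
    while gap > 0:
        for i in range(gap, len(arr)):             # Gapped insertion sort
            key = arr[i]
            j = i
            while j >= gap and arr[j - gap] > key:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = key
        gap //= 2                                  # Final pass (gap = 1) is plain insertion sort
    return arr

print(shell_sort([23, 12, 1, 8, 34, 54, 2, 3]))  # [1, 2, 3, 8, 12, 23, 34, 54]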

95
New cards

Shell Sort Worst, Average and Best

Best Case: O(n log n) — Occurs when the array is already sorted or nearly sorted, especially when using a good gap sequence like the Knuth sequence.

Average Case: O(n^(3/2)) or O(n^1.5) — Highly dependent on the gap sequence used. With commonly used sequences like the Knuth sequence, the average-case complexity is approximately O(n^1.5).

Worst Case: O(n^2) — Can degrade to O(n^2), particularly with poorly chosen gap sequences like the original Shell sequence (where the gaps are halved each time).

96
New cards

Shell Sort:

Look for code that sorts elements at specific intervals and gradually reduces the interval until it performs a final insertion sort. (Gap, Interval)

97
New cards

Summary of all Searching & Sorting Algorithms:

Linear Search:

Simple, sequential; O(n).

98
New cards

Summary of all Searching & Sorting Algorithms:

Binary Search:

Sorted data, divide and conquer; O(log n).

99
New cards

Summary of all Searching & Sorting Algorithms:

Bubble Sort:

Swaps, bubbles up; O(n^2).

100
New cards

Summary of all Searching & Sorting Algorithms:

Selection Sort

Finds minimum, swaps; O(n^2).